App Frameworks • watchOS • 37:23
Move beyond architecture changes for Watch apps and explore the different ways you can use the Digital Crown to enhance your apps. Find out how to create persistent app experiences to keep your Watch app at the ready for users and learn how to use the playback and recording APIs to add audio and video to your Watch apps.
Speakers: Nathan de Vries, Chloe Chang
Transcript
[Applause]
NATHAN DE VRIES: Good morning, everyone. My name is Nathan de Vries, and shortly you'll meet my colleague Chloe Chang. We are here to present WatchKit In-Depth, Part 2. In the previous session, Part 1, we talked about the architectural differences between watchOS 1 and watchOS 2 and how they relate to your applications. In this talk we will look a little more at the new APIs enabled by that architectural change of bringing your watch app and extension to the watch.
Now, to start with, I will be talking about something that's pretty exciting: we are enabling Digital Crown experiences in your own apps, so this will be great. Next, we will talk a little bit about media playback, both for short video and audio content and for long-form music, podcasts, et cetera, in the background. This is also a great new API.
Next, we will be talking about audio recording. You can now record audio straight from your wrist and send the short clips of audio to friends, to a server, share them on the internet, and so on. And lastly, we're going to be talking a little bit about storing sensitive data on the watch. Now that your watch is fully responsible for all of the data it shows, it's important to keep in mind the sensitive data you might be storing. So let's get started.
The Digital Crown is a brand new user input mechanism that allows you to fluidly and very tactilely scroll through and pick elements on the screen using the Digital Crown. It's a really fantastic way to interact with the device. Let's take a look at some of the core interactions that are enabled by the Digital Crown.
To start with, the one you will be most familiar with is, of course, scrolling. Place your finger on the Crown, turn slowly to scroll slowly, or flick quickly to scroll with momentum. It's a really natural experience; it feels just like Multi-Touch scrolling. Next, you can use the Digital Crown to choose options.
This is a similar picker style to iOS, but on the watch you can use the Digital Crown. Here I'm selecting my age in the Activity setup UI, where I have a sheet that shows three different pickers, allowing me to have a concise and precise UI for choosing my age all in a single modal sheet.
Next, we have a different way of presenting option picking. In many cases, the options you are choosing have a more special or specific way of presenting their data. In this case, I'm setting a timer of 45 minutes, and we think that when setting a time, it makes sense to lay out the information in a way that's specific to time.
In this case, it's a 60-minute dial, so turning the Crown shows you progress as you rotate through each of those minutes. In each of your applications, you will need to find ways to present the options you are offering to the user that are relevant to the data. So keep this in mind.
Lastly, because the Crown provides smooth, precise and continuous movement, we can use it to rotate through continuous values, like volume in the Now Playing glance. So these are a few of the core interactions enabled by the Crown, and we are excited to be able to provide them to your applications too.
So, stepping back a little from the core interactions, what are some of the benefits of the Digital Crown? Well, we believe the Digital Crown is an incredibly intuitive input. When you first put on an Apple Watch, the first thing you reach for is the Digital Crown.
People love to play with it, and it's a very natural way to interact with content on the screen. Next, and very importantly, it keeps the UI visible: by not obstructing the screen with your finger, you can see what you are interacting with. So exposing interactions through the Digital Crown allows your users to clearly navigate the system.
Thirdly, it's a very precise input mechanism. If we look back at the timer example, you can't imagine trying to tune between 0 and 60 minutes of a timer using touch alone. With the Crown, you can very precisely dial in a 2-, 3- or 4-minute timer, or zip through and create a 60-minute timer very quickly, with just a few flicks of the Digital Crown.
Finally, it's playful. This is a fun experience. It's very tactile, and we think users are going to love it. So what are we exposing in watchOS 2? This one you will be very familiar with; it's the same style of picker we use elsewhere in the system, like in complication editing and Activity setup.
We call this the WKInterfacePicker, and it's available in watchOS 2 applications. We have the list style. This is great for selecting from a range of options shown as text, in a list, with a little bit of context on either side to show where you are in the selection.
Next, we have the stack style. Now, the stack style is perfect if you have a very visual, graphical way of presenting your data. It could be photographs, or it could be stickers in a messaging application. This is really great if you want to show in-place presentations of your data that you can turn through using the Digital Crown like a flip book. And finally, we have the sequence style picker. Now, this picker is really cool. I'm excited to see what you are going to be able to do with this guy.
Here, I have used it to create an emoji picking interface. So if you turn the Digital Crown, we cycle through different emotions. This can also be used to create your own sliders, knobs, meters and even games if you get creative enough. So that's the sequence style. The three styles are list, stack and sequence.
Let's talk a little bit about what we call focus styles. In addition to the three visual styles of pickers, each of these styles also supports focus styles. What does that mean? So here we have an example of another emoji picking UI where we have the opportunity to choose both a face and a hand.
It's not immediately obvious what turning the Crown would do in this case. It's not even obvious that there are two pickers on the screen. What we can do is change the default focus style of these two pickers to an outline style. What this does is it outlines the picker to show it's an interactive element, and it also clearly shows in the system green color that the selected or focused element is editable with the Digital Crown. Now, when I tap the other item on the screen, I see the focus outline changes, and I know what turning the Digital Crown will do.
There's an additional level of control that you have over the focus outline. If the items in your picker need more context to describe what's in them, we can use an outline with caption focus style. This allows you to build a UI similar to the complication editing UI where you want to show a little breakout box that clearly shows what the item is that you currently have selected.
And, again, if I move focus to the other picker, you will note that the caption moves with it. The way I have achieved this here is that each item in my picker has the same caption, but you could also have a different caption for each item in the picker.
Now, there are other useful ways to provide context while the user is interacting with the Digital Crown. One of these is the system style of showing a contextual indicator, the Digital Crown indicator. This is shown beside the Digital Crown and as the user turns the Digital Crown it gives them context on the range of values available to them for picking. It also shows how far through the values they currently are. So it's good for getting your bearings when you have a picker that doesn't necessarily show where you are in the options, just through the visuals itself.
For applications that want to show a more customized progress UI and don't want to show the little Crown indicator, we have fantastic new ways for you to build those experiences. It's best shown with an example. In this case, I have a picker in the center of the screen which allows me to tune between 0 and 100%.
So as I turn the Digital Crown, you will see the values tick up in increments of 10%. This is kind of cool, but we can't tell that the lower limit is 0 and the upper limit is 100. It's not a very visual experience, and it's not a pleasant experience. We could use the contextual indicator, but in our app we would like to show progress using a slightly different UI, one that is more relevant to our application.
Here, I have chosen to use an arc, or progress-ring, style of UI. As I turn the Digital Crown, I have coordinated the picker with another interface element in my UI, so the image updates as I turn through the picker's items. With this coordinated-image API, you can build all sorts of fantastic Digital Crown experiences.
So those are all the different knobs, bells and whistles that you can configure on a WKInterfacePicker. How do you create one? Let's jump into that right now. For those of you familiar with watchOS 1, this is a very straightforward process. In your storyboard, select an interface controller, and you can simply drag a new picker object straight from the object library into your storyboard.
Once you have a picker, you can configure its various attributes, which we have already been discussing: the style (list, stack or sequence), the focus style, and whether or not you would like the optional Digital Crown indicator to be shown. In this case, I'm using the list style, the outline-with-caption focus style, and a disabled indicator, because I'm going to provide my own, or I don't feel the context is necessary for the data I'm presenting.
Next, we jump into the interface controller code and create an IBOutlet for the picker so we can hook up our code to the object in the storyboard. This gives us a reference to the picker. Finally, we jump back to the storyboard and hook these two things up. So, just a few simple steps, similar to creating buttons and images and so on; the picker is very easy to create.
So now we have a picker, we have hooked it up, and we have built and run, and this is what we get. It's possibly the most useless picker in the world: turn the Digital Crown and you can quickly navigate through nothing. It's great! So what we need to do is configure what we call items, as you can see from the title. Now that we have a reference to the picker through the IBOutlet, we can simply call the setItems API and specify an array of WKPickerItems. These are very simple objects with a few optional properties.
Here I'm using the title property, the caption property and the accessoryImage property to define the visual style I'm presenting on the watch. Now, if I want full control over what's shown in each row of the picker, I can also use the contentImage property of each of these WKPickerItems. This gives me full control over what's shown for each item in the picker. For the stack and sequence styles, the contentImage defines what's shown, so for those two styles you will just use the contentImage property.
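As a rough sketch of what this setup might look like in the extension (names like `agePicker` and the `AgeBadge` asset are hypothetical, and WKImage is introduced just below):

```swift
import WatchKit

class AgeInterfaceController: WKInterfaceController {
    // Outlet hooked up to the picker in the storyboard.
    @IBOutlet var agePicker: WKInterfacePicker!

    override func awake(withContext context: Any?) {
        super.awake(withContext: context)

        // One WKPickerItem per selectable value. All of its properties are optional:
        // the list style uses title/caption/accessoryImage, while the stack and
        // sequence styles use contentImage instead.
        let items: [WKPickerItem] = (18...100).map { age in
            let item = WKPickerItem()
            item.title = "\(age)"
            item.caption = "Age"
            item.accessoryImage = WKImage(imageName: "AgeBadge")
            return item
        }
        agePicker.setItems(items)
    }
}
```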
New in watchOS 2, we have an API for specifying images you would like shown in your UI, for pickers and for a few other objects which you will see later. This is the WKImage class, and I will give a few examples of how you might use it.
The most common case will be that your images are already included in the application, in your asset catalog or in the bundle itself, and so you can use the WKImage(imageName:) API to load those images into the application just prior to display. All the extension needs to do is specify the name of the image.
For cases where your extension has downloaded the image from the internet, or from the companion using Watch Connectivity, you can pass the image data into the application with WKImage(imageData:) instead of turning that image data into an image yourself. This saves the unnecessary overhead of decoding the image in the extension; the data is converted into an image just prior to display.
These first two ways of creating images are the preferred ways if you already have the image data. There is no sense in creating UIImages unnecessarily in your extension, because they are displayed by the application. The last case is where you do need UIImages. Now that Core Graphics is available on the watch, you can use it to render custom effects into images to use in your WKPickerItems. So here I'm using the WKImage(image:) API, which takes a UIImage object.
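Here is a minimal sketch of all three ways of creating a WKImage; the image name, data, and rendered image are placeholders:

```swift
import WatchKit

// 1. Image already in the asset catalog or app bundle: pass only the name,
//    and the app loads it just prior to display.
let bundledImage = WKImage(imageName: "ColorSwatch")

// 2. Raw image data (e.g. downloaded with NSURLSession or received over
//    Watch Connectivity): pass the data through without decoding it in the extension.
let downloadedData = Data() // placeholder for data fetched elsewhere
let dataImage = WKImage(imageData: downloadedData)

// 3. A UIImage you created yourself, e.g. rendered with Core Graphics.
let renderedImage = UIImage() // placeholder for a custom-rendered image
let uiImage = WKImage(image: renderedImage)
```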
Now we have a picker configured with items, and turning the Digital Crown will move through those items. But it's useful for your application to know when the selected item changes, and to do this we use a similar technique to a button, a slider or a switch, by hooking up an IBAction.
Now, the IBAction for pickers in particular is a method that takes a single argument: an integer representing the index of the selected item in the items array you specified using setItems. So in this case, I am indexing into my picker items using the selected index provided to me, and I'm printing the title of that picker item.
So it's really easy to handle selected-item changes. One thing to note is that you don't want to do particularly expensive work in response to item changes. As you saw in the core interactions section, you can turn the Crown quickly to move through items quickly, so your picker action is going to be called very rapidly. If you are doing very expensive things, I suggest you hold off until you haven't received a picker action call for a little while, and then do the expensive work, like refreshing your table and so on.
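A sketch of this pattern, assuming hypothetical `items`, `label`, and `reloadExpensiveContent` members; the deferral uses a cancellable DispatchWorkItem, which is one straightforward way to wait for the Crown to settle:

```swift
import WatchKit

class PickerController: WKInterfaceController {
    @IBOutlet var label: WKInterfaceLabel!
    var items: [WKPickerItem] = []
    private var pendingWork: DispatchWorkItem?

    // Hooked up in the storyboard as the picker's action; the single Int
    // parameter is the index of the newly selected item.
    @IBAction func pickerDidChange(_ value: Int) {
        // Cheap updates are fine on every change.
        label.setText(items[value].title)

        // Defer expensive work until no new selection has arrived for a moment,
        // since this action fires rapidly while the user turns the Crown.
        pendingWork?.cancel()
        let work = DispatchWorkItem { [weak self] in
            self?.reloadExpensiveContent(for: value)
        }
        pendingWork = work
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.25, execute: work)
    }

    private func reloadExpensiveContent(for index: Int) {
        // e.g. refresh a table based on the final selection
    }
}
```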
Next we have image coordination. This is what allows you to build those custom, beautiful interactions where many parts of your interface are coordinated with the picker as its value changes. This is also very simple to set up. In this case, I have a picker in the center of the screen which shows 70%, and the outline, the progress ring, is a WKInterfaceGroup which is the parent of the picker. A very easy, very simple UI to set up.
Now, to configure this in code, all we need to do is create a set of animated images which will be managed by the picker. So here we create a progressImages variable and use the UIImage animatedImage(with:duration:) API to specify an array of progress images we have either drawn in code or already have in our asset catalog. We give it a duration of 0, because the picker will automatically manage seeking through this animation. Then, finally, on the progress interface group, which is hosting the progress ring, we call setBackgroundImage to make the animated image the background image.
Finally, we call setCoordinatedAnimations on the picker, giving it a list of all the interface objects we would like the picker to coordinate as you turn through the picker. In this case, we only have a single group, but we could also specify multiple groups, or a group and an image, and so on. This supports as many objects as you like.
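Putting those steps together, a minimal sketch (the `ring-N` asset names and outlet names are hypothetical):

```swift
import WatchKit

class ProgressController: WKInterfaceController {
    @IBOutlet var progressGroup: WKInterfaceGroup!   // parent group drawing the ring
    @IBOutlet var percentPicker: WKInterfacePicker!

    override func awake(withContext context: Any?) {
        super.awake(withContext: context)

        // One frame per step of the progress ring, drawn in code or loaded
        // from the asset catalog.
        let frames = (0...100).compactMap { UIImage(named: "ring-\($0)") }

        // Duration 0: the picker seeks through the animation as it is turned.
        progressGroup.setBackgroundImage(UIImage.animatedImage(with: frames, duration: 0))

        // Tell the picker which interface objects to keep in sync with its selection.
        percentPicker.setCoordinatedAnimations([progressGroup])
    }
}
```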
Now, as I turn through the WKInterfacePicker's items, we coordinate which image is shown in any associated groups and images. In this example, I have matched the number of items to the number of images, but I will show you in a little bit that that doesn't necessarily need to be the case. So that's how you coordinate images with a picker. Now, to finish off this section, I thought I would give you a little demo of how to do this in your application.
So let me just get that set up. So here is a little app that I created earlier. It shows each of the different styles of picker that you can create as well as the last one for a demonstration of coordination. So let's start by going to the list style picker.
In this particular case, I have created a picker with accessory images drawn as random colors using Core Graphics, and I have quite a few items here. You can see it's very fluid to scroll through these items. Every time the item changes, I'm also updating a separate interface element in my storyboard. This is a simple label, and I'm updating its text. So these kinds of things are very possible to do in response to selected-item changes.
It's really only the more expensive things that you want to defer. So let's go back to the top-level menu and into the stack style. Here I'm also using Core Graphics to render squares of random color, and here you will see that I can very smoothly and quickly flick through images in this stack-style picker. Also note that in the upper right-hand corner, as I turn the Digital Crown, the Digital Crown indicator shows my progress through this stack of items. It's really useful.
The third example is the sequence style. I thought I would show the kinds of things you can achieve with the sequence style, because the emoji case made it kind of obvious what you can do with that. Here I can create a pretty compelling experience for choosing a color using a color wheel. As I turn the Digital Crown, I'm selecting different images in the sequence, each representing a new color. So this is pretty cool, and also note that this supports momentum, so I can still flick the Crown and it will move through the values very quickly.
Finally, let's talk a little bit about coordination. I have created an example here with a left and a right picker where I have got the left picker focused. As I turn the Digital Crown, I'm moving through 12 options on the left-hand side. If I focus the right hand picker, now I'm moving through 60 items. You can see that this is maybe an approximation of the timer UI.
So one thing to note is that both pickers share the same coordinated image. My coordinated image includes 60 different progress arcs for the right-hand picker, but my left-hand picker can also use that same progress image. This is a really handy technique if you want to reduce the number of resources used across your pickers. So that's the demo.
[Applause] So, just to recap some of the things you have learned about the Digital Crown and the new experiences you can provide to your users in watchOS 2: we have three brand new customizable styles, the list, the stack and the sequence. Each one is tailored to displaying different types of information, so you should choose which one you use based on what your users are interacting with in your apps.
Each of these styles supports focus styles and the indicator, so as you turn through the items in the picker, we show you what the Crown will do by showing focus, and we also provide support for showing the standard Digital Crown indicator. And finally, for creating truly customized, compelling interfaces with very tailored graphics, you can use the image animation coordination APIs to synchronize other aspects of your interface with the picker currently being edited. And with that, I would like to hand off to Chloe, who will talk about media playback on watchOS 2. Thank you very much! [Applause]
CHLOE CHANG: Hi, everyone! I'm here to talk about more new features in watchOS 2. The first one is media playback. So let me show you what it looks like to play a simple video in the new media player in watchOS 2. First, the content is loaded and, once the user presses the play icon, it starts to play. At any point in time, the user can turn the Digital Crown to adjust the volume, and if you tap on the screen, the player controls come up.
And for audio-only content, we also have a built-in UI for that. The audio in both cases is routed to the selected output source; that could be your Apple Watch speaker or a paired Bluetooth device. Now, let me show you how to do this in code.
Let's say you are trying to present this media player from your interface controller. The first thing you do is figure out the URL for the content. Suppose you have a simple movie in your extension bundle; this is how you come up with the URL for the content. You can also use a remote URL, in which case the media player handles the download for you, and it also updates the UI with a progress indicator.
And with the URL, we can then decide how we are going to play the content. We offer a few options here. The first one is autoplay: when it's set to true, once the media player is on screen and the content is loaded, playback starts automatically. The next one I use is start time; I set it to start at the 3-second mark of the video. This is really useful if playback was previously paused and, when the user comes back to the media player, you want to resume playback from the same spot.
And the third option I use is video gravity. This key determines how the video is resized once it's in the media player. And with all of this information, you can now call presentMediaPlayerControllerWithURL. You can also specify a completion handler, which tells you how the playback ended. And that's how easily you can present a media player from your code.
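A sketch of that presentation from an interface controller; the file name and option values here are illustrative:

```swift
import WatchKit

class VideoController: WKInterfaceController {
    func playMovie() {
        // Local content from the extension bundle; a remote URL also works,
        // in which case the player downloads it and shows progress.
        guard let url = Bundle.main.url(forResource: "intro", withExtension: "mp4") else { return }

        let options: [String: Any] = [
            WKMediaPlayerControllerOptionsAutoplayKey: true,  // start once loaded
            WKMediaPlayerControllerOptionsStartTimeKey: 3.0,  // resume at the 3-second mark
            WKMediaPlayerControllerOptionsVideoGravityKey: WKVideoGravity.resizeAspect.rawValue
        ]

        presentMediaPlayerController(with: url, options: options) { didPlayToEnd, endTime, error in
            // The completion reports how playback ended and where it stopped.
            print("played to end: \(didPlayToEnd), end time: \(endTime), error: \(String(describing: error))")
        }
    }
}
```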
And in some other cases, for example in a social media app, you may want to embed a snippet of video in some kind of context. We have the object you need: in watchOS 2, you now have the WKInterfaceMovie object that you can insert into your UI. This object has a play icon overlaying a poster image that represents your content. Once the user taps the play icon, the media player I showed you at the beginning shows up automatically. You don't have to write any code to do that.
So let me show you how to do that in Xcode. In the new Xcode, you will find this movie object in your object library. You can drag and drop it into your storyboard, and you can use the Attributes inspector to customize this movie object. You can specify the video gravity or the poster image.
You can also do this in code if you want to configure it dynamically. Suppose in your code you have member variables for the content: the URL for it, and the poster image that represents the movie. You drag an IBOutlet from Interface Builder that represents the movie object you are using, and when you want to update your UI or your content changes, you can use methods like setMovieURL and setPosterImage to configure your movie object.
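For instance, a small sketch of configuring the movie object in code; the outlet and parameters are placeholders:

```swift
import WatchKit

class ClipController: WKInterfaceController {
    @IBOutlet var movie: WKInterfaceMovie!

    func showClip(at url: URL, poster: UIImage) {
        // Tapping the movie object's play icon presents the full media player
        // automatically; no extra code is needed for that.
        movie.setMovieURL(url)
        movie.setPosterImage(WKImage(image: poster))
        movie.setVideoGravity(.resizeAspectFill)
    }
}
```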
And because the display is so different on Apple Watch than on other iOS devices, we have a few specific media formats you should use for any content you are trying to play on Apple Watch. If your content works best in full-screen mode, then the recommended resolution is 208 by 260. If your content works best in a 16:9 aspect ratio, the recommended resolution is 320 by 180. If you follow this set of media specs, you can expect the best results for your playback feature.
That's how easily you can play audio and video in your app. The next feature I want to talk about is called long-form audio playback. The features I showed you so far present some UI for you, and once that UI is dismissed, playback stops. Long-form audio is different. It doesn't come with any built-in UI for your app.
That means you can create your own, and once your own UI is dismissed, playback can actually continue in the background. A common use case for this is a podcast app or a music app; the user might be listening to audio tracks while out for a run or commuting.
The audio for this feature is routed only to a paired Bluetooth device, and the playback status is updated in the Now Playing glance, so it's very convenient for the user to control playback. Now, let me show you the API. If you are familiar with the AVFoundation APIs, these will look very familiar.
Let's say you have the URL for the audio you are trying to play. You can create an asset like this. With this asset you can create the WKAudioFilePlayerItem. The player item has the presentation status for the asset, and you can use KVO to observe that status,
just like with the AVFoundation APIs. And with the player item, you can create a player and call play to play it. Super simple! For the example I mentioned, a podcast app, you can create a list of items you are trying to play, and if you want to give the user a gapless listening experience, instead of using WKAudioFilePlayer, just use WKAudioFileQueuePlayer, and that will give you exactly what you want.
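A sketch of the long-form playback flow as described here, using the watchOS 2 WKAudioFilePlayer family; the URLs are assumed to point at files the extension can read:

```swift
import WatchKit

func playEpisodes(at urls: [URL]) -> WKAudioFileQueuePlayer {
    // One player item per track; each item's status can be observed with KVO.
    let items = urls.map { url -> WKAudioFilePlayerItem in
        let asset = WKAudioFileAsset(url: url)
        return WKAudioFilePlayerItem(asset: asset)
    }

    // WKAudioFilePlayer plays a single item; the queue player plays the
    // whole list gaplessly, and playback continues in the background.
    let player = WKAudioFileQueuePlayer(items: items)
    player.play()
    return player
}
```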
Just like on other iOS devices, if you want background playback, you need to enable the audio background mode in the app's Info.plist, like this in Xcode. Super simple! Now, I would like to take a couple of moments to talk about where you should store all of this media data in your app.
As you know by now, your app on Apple Watch has two components: one is the app, the other is the extension, and when it's installed, the extension bundle sits inside the app bundle. Each component has a container for storing dynamic data, and each container can only be accessed by its own component.
So let's say you have a static image that you want to use in your app, and you want to use WKImage(imageName:) to reference it: you should put it in an asset catalog in the app bundle. If you have other resources, for example audio or video for playback, or image data that you want to directly manipulate in your code, you should put them in the extension bundle so that you can create URLs for them.
Now, in other cases, if you have dynamic content for playback, you can either use Watch Connectivity to grab that content from the iPhone, or you can download your content from the internet using NSURLSession. You cannot put this content in either container I've shown you here, because then it couldn't be accessed by both the app and the extension. So what you need to do is create an App Group, which provides a shared container for you to store this data where it can be accessed by both.
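A sketch of resolving the shared container; the App Group identifier is a placeholder and must be enabled in the entitlements of both the app and the extension:

```swift
import Foundation

func sharedMediaDirectory() -> URL? {
    // Returns nil if the App Group is not configured for this target.
    let container = FileManager.default.containerURL(
        forSecurityApplicationGroupIdentifier: "group.com.example.mediaapp")
    return container?.appendingPathComponent("Media", isDirectory: true)
}
```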
So that's all you need to know, and if you want to learn more about App Groups, please take a look at the documentation. The next feature I want to talk about is audio recording. Let's take a look at the UI and see how it works in action. Once the user presses the record button, the recording starts, and the waveform is updated. You can do this with just one function call.
In your interface controller subclass, you can just call presentAudioRecordingControllerWithOutputURL and specify the URL. By now, you know this URL should point to the shared container. Then you choose a preset; I will talk about presets in a few seconds. And then you can specify the maximum duration for this recording, which is really useful if you want to control the file size. Here, I set it to 60 seconds.
And you can also customize the action title, which shows up in the upper right-hand corner of the UI. Suppose your app provides a feature to send this recording to another person: customizing it to say 'Send' would be appropriate. Otherwise, the default says 'Done.' You can also specify a completion handler, which tells you whether the content was saved and whether there was any error.
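A sketch of that call. Note that the recorder API's exact shape changed across watchOS releases; this version uses the options-dictionary form, together with the shared-container helper from the previous sketch and the 60-second cap and 'Send' title described above:

```swift
import WatchKit

extension WKInterfaceController {
    func recordMemo() {
        // Write into the shared container so both app and extension can reach the file.
        guard let outputURL = sharedMediaDirectory()?.appendingPathComponent("memo.m4a") else { return }

        let options: [String: Any] = [
            WKAudioRecorderControllerOptionsMaximumDurationKey: 60.0, // cap the file size
            WKAudioRecorderControllerOptionsActionTitleKey: "Send"    // replaces the default "Done"
        ]

        presentAudioRecorderController(withOutputURL: outputURL,
                                       preset: .narrowBandSpeech,
                                       options: options) { didSave, error in
            // didSave reports whether the recording was kept.
            print("saved: \(didSave), error: \(String(describing: error))")
        }
    }
}
```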
Let's go back to the URL. The URL has a file name, and the extension of the file name determines which codec we use for generating the output. The file types we support are MP4 and M4A, which use the AAC codec, and WAV, which uses the LPCM codec. We support three different presets. Each of them has a different sample rate, and based on the codec you choose, the output bit rate will differ. As you would expect, the higher the quality of the output, the larger the output file.
So it depends on your use case. If you are recording for a voice memo, the narrow-band speech preset will be ideal for you. If you are trying to record something with higher quality, maybe for post-processing purposes, then you might want to consider the high-quality audio preset. With these presets, audio recording becomes really straightforward and easy to customize for your feature.
Next, I would like to talk about security. In watchOS 2, you can now use the keychain to store your user's sensitive data, and you can make that data accessible only when the device is unlocked. The Apple Watch is unlocked when the user enters the correct passcode while the device is on their wrist, and the data then remains available to you until the user takes the watch off.
This is really useful if you are creating a complication for the watch face, where you might be showing some user data that is sensitive, so it's really important to protect that data with the keychain. One thing I want you to bear in mind is that the keychain items you create on Apple Watch do not participate in keychain syncing. They are not iCloud Keychain items, even though the usage is exactly the same as on other platforms.
This is an example of creating a keychain item to store user credentials. Suppose you have a secret password and you want to store it and make it accessible only when the device is unlocked: you can set up attributes like this and call SecItemAdd to add them. That creates the keychain item.
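A minimal Swift sketch of that example; the account and password values are placeholders:

```swift
import Foundation
import Security

func storePassword(_ password: String, forAccount account: String) -> OSStatus {
    let attributes: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrAccount as String: account,
        kSecValueData as String: Data(password.utf8),
        // Readable only while the watch is unlocked (on-wrist, after passcode entry).
        kSecAttrAccessible as String: kSecAttrAccessibleWhenUnlocked
    ]
    return SecItemAdd(attributes as CFDictionary, nil) // errSecSuccess on success
}
```

To summarize our talk today: at the beginning, you heard Nathan introduce you to a few new styles of pickers.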
Each of them can be controlled by the Digital Crown, which is really fun to use. Then you heard me talk about the media player for audio and video, and about playing long-form audio in the background. For audio recording, you can now do it in just a couple of lines of code. And finally, in watchOS 2 you can use the keychain to protect your user data.