2011 • 21:03
AirPlay enables your app to wirelessly stream music, video, or even the entire screen to an HDTV via Apple TV. See how to build an app that supports AirPlay and discover entirely new possibilities for games, media apps and more. Learn how your apps can output visuals and audio to a second display wirelessly over AirPlay or through a connected cable, and understand best practices for supporting AirPlay mirroring.
Speaker: Eryk Vershen
Unlisted on Apple Developer site
Transcript
This transcript was generated using Whisper; it has known transcription errors. We are working on an improved version.
Hi, I'm Eryk Vershen, Media Technologies Evangelist. AirPlay is an exciting new technology that lets users stream your application's audio and video to high-definition displays and high-fidelity audio systems. With AirPlay mirroring, content can be displayed on a nearby Apple TV, allowing everyone in the room to see what's on your screen.
In this video, you'll learn about AirPlay audio, AirPlay video, and AirPlay mirroring, and how easy it is to support these in your application. Imagine you're listening to music with your headphones on your iPhone, and you come home, and with a few gestures on your phone, your music is now playing through your home stereo speakers.
Or imagine that you're watching a movie on your iPad 2. You come into your office where you have a TV with an Apple TV, and with a few gestures, the movie is now playing on the TV for your colleagues to see. Or imagine that you're playing a game on your iPad 2 and you come into your family room where you have a TV and an Apple TV.
You make a few gestures on your iPad, and now the game state is on your big TV, with a controller and extra information on your iPad. Well, you don't have to imagine it. It's AirPlay and it's already here. There are three parts to AirPlay: audio, video, and mirroring. I'll talk about each one in turn. Let's start with AirPlay audio.
AirPlay audio delivers audio to any AirPlay receiver: an Apple TV, an AirPort Express, or any of several third-party audio receivers. The user's experience with AirPlay should be familiar. Your controls might look like this. When the user comes in range of an AirPlay receiver, the route button will appear so that the user can choose a route.
When the user selects the route button, they will get a menu of AirPlay receivers. Let's say they choose the living room speakers. When they select it, it will be checked and the menu will be dismissed. The route button will now show blue to indicate that audio is being routed away from the iOS device, and audio will be sent to the AirPlay receiver.
So how do you support AirPlay audio? Well, there are four steps. The first step is to provide a route button. If you're using MPMoviePlayerController, then you get a route button as part of the standard controls. Otherwise, you should use MPVolumeView. This provides a volume slider and a route picker. The system handles the details.
It detects the presence of AirPlay receivers and presents the list. You can't synthesize this yourself; you don't have access to a list of AirPlay receivers. Using MPVolumeView is straightforward. You allocate and init an MPVolumeView, tell it to display the route button, and add the view to your view hierarchy. The route button is visible by default, so you don't have to set it.
You can also hide the volume slider: simply set showsVolumeSlider to NO. Now, some developers have asked about not showing a route button. That's not a good idea. Any other app can change the route behind your back, so all you're doing is making things hard for your user.
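Here's a minimal sketch of that setup (assuming ARC), where routeButtonContainer is a hypothetical placeholder view in your own layout:

```objc
// A sketch of adding a route button with MPVolumeView.
// "routeButtonContainer" is a hypothetical view in your own UI.
#import <MediaPlayer/MediaPlayer.h>

- (void)addRouteButton {
    MPVolumeView *volumeView = [[MPVolumeView alloc] initWithFrame:self.routeButtonContainer.bounds];
    volumeView.showsRouteButton = YES;   // the default; shown here only for clarity
    volumeView.showsVolumeSlider = NO;   // hide the slider if you only want the route picker
    [self.routeButtonContainer addSubview:volumeView];
}
```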
The second thing you need to do is handle remote control events. There are a number of remote control events. For example, Toggle Play/Pause from the headphones, or Next Track from the lock screen, or any of this list of events. Most of these can be delivered from the headphones using double and triple clicks.
As I've already mentioned, you can get remote control events from headphones and via the lock screen controls. They also come from the multitasking bar, from AirPlay devices, from Bluetooth devices, and from MFi accessories. With so many sources of control, it's very clear that you should support remote control events. Handling remote control is not difficult. First, your view must implement canBecomeFirstResponder. Next, when your view appears, you must tell the application object that you want to receive remote control events and become first responder.
While you are first responder, events will come into your remoteControlReceivedWithEvent: method. You need to make sure that it is a remote control event and then look at the subtype to determine the exact event. If you implement nothing else, you should implement Toggle Play/Pause. This is the most common event. When your view goes away, you must be a good citizen by telling the application object that you no longer want remote control events and resigning first responder.
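A sketch of those steps, assuming the code lives in the UIViewController subclass that owns your playback; togglePlayPause and skipToNextTrack are hypothetical methods standing in for your own playback calls:

```objc
// Remote control handling in the view controller that owns playback (assuming ARC).
- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (void)viewWillDisappear:(BOOL)animated {
    // Be a good citizen: stop receiving events when the view goes away.
    [self resignFirstResponder];
    [[UIApplication sharedApplication] endReceivingRemoteControlEvents];
    [super viewWillDisappear:animated];
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type != UIEventTypeRemoteControl) {
        return;
    }
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlTogglePlayPause:
            [self togglePlayPause];    // hypothetical
            break;
        case UIEventSubtypeRemoteControlNextTrack:
            [self skipToNextTrack];    // hypothetical
            break;
        default:
            break;
    }
}
```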
The third thing you need to do is provide Now Playing info. Imagine the user is sending audio from your app to the Apple TV. If you do not provide Now Playing info, then the user will see something like this, that is, nothing. Instead, they should see something like this: artwork, title, album, and so on. You make this possible with Now Playing info. Now Playing info is also provided on the lock screen and in the multitasking bar.
To set Now Playing info, you use the MPNowPlayingInfoCenter object. It's a singleton object that you use to communicate with the system. It's not tied to any particular way of delivering media. It works with MPMoviePlayerController, AVPlayer, Audio Queues, Audio Units, and so on. It's another reason to support remote control events.
The information is only used if your app is receiving remote control events. To support this, you need to get together the metadata about your audio, and you want to provide as much metadata as you can. To pass it, all you do is create an NSDictionary. The keys identify artist, title, artwork, and so on.
Then you ask MPNowPlayingInfoCenter for the default center object and simply assign your dictionary as the value of the nowPlayingInfo property. You should, of course, change this each time the song changes. Here's a list of the available metadata: title, artist, artwork, genre, and so on. I want to mention elapsed playback time in particular. You don't need to repeatedly set this. From the time and playback rate, the system knows how to update this value.
You should reset it only when you've done a seek to some other point in the media or when the song changes. The system controls how your info is displayed. You have no control over the formatting. The information is also available to Bluetooth accessories via the Audio/Video Remote Control Profile and to MFi accessories.
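Here's a minimal sketch of setting that dictionary, assuming song is a hypothetical model object carrying your metadata:

```objc
// Publishing Now Playing info from your own model object (assuming ARC).
#import <MediaPlayer/MediaPlayer.h>

- (void)updateNowPlayingInfoForSong:(Song *)song {
    MPMediaItemArtwork *artwork = [[MPMediaItemArtwork alloc] initWithImage:song.coverImage];
    NSDictionary *info = [NSDictionary dictionaryWithObjectsAndKeys:
        song.title, MPMediaItemPropertyTitle,
        song.artist, MPMediaItemPropertyArtist,
        song.albumTitle, MPMediaItemPropertyAlbumTitle,
        artwork, MPMediaItemPropertyArtwork,
        [NSNumber numberWithDouble:song.duration], MPMediaItemPropertyPlaybackDuration,
        [NSNumber numberWithDouble:0.0], MPNowPlayingInfoPropertyElapsedPlaybackTime,
        [NSNumber numberWithDouble:1.0], MPNowPlayingInfoPropertyPlaybackRate,
        nil];
    [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = info;
}
```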
The fourth step is to support background playback, if it makes any sense for your app. If the primary thing your application does is play audio, then users expect that audio to continue to play in the background. And in iOS 5.0, your app goes into the background when the screen locks. There are two main things you have to do to support background playback. You need to add the App plays audio item to the Required background modes (UIBackgroundModes) key in your Info.plist.
And you need to set your audio session category to Playback. If you are unfamiliar with audio sessions, you should watch the presentation from WWDC 2011, Audio Session Management for iOS.
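A sketch of the code half of that, to be called early in launch; the other half, adding the App plays audio item to Required background modes, is an Info.plist edit rather than code:

```objc
// Setting the audio session category to Playback for background audio.
#import <AVFoundation/AVFoundation.h>

- (void)configureAudioSessionForPlayback {
    AVAudioSession *session = [AVAudioSession sharedInstance];
    NSError *error = nil;
    if (![session setCategory:AVAudioSessionCategoryPlayback error:&error]) {
        NSLog(@"Could not set audio session category: %@", error);
    }
    if (![session setActive:YES error:&error]) {
        NSLog(@"Could not activate audio session: %@", error);
    }
}
```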
So, four steps to supporting AirPlay audio: adding a route button, handling remote control events, providing Now Playing info, and supporting background playback. Now we turn to AirPlay video. When we use AirPlay video, we are sending a file to the Apple TV. But if that file is coming from the network, the Apple TV may be fetching it independently. However, even in that case, the iOS device is still made aware of the state of the playback. The user experience for AirPlay video is very similar to that of AirPlay audio. The typical video playback controls look like this. When an AirPlay receiver is in range, the system makes the route button visible so that the user can choose a route.
When the user clicks the route button, they get a menu of possible destinations. Notice that now some of the selections have a TV icon instead of a speaker icon. The user chooses one, say the Family Room Apple TV. The new selection is now checked and the user returns to the regular screen. The route button has turned blue again to indicate that content is routed away from the device. And now the video no longer shows on the screen because it's routed to the Apple TV.
[Transcript missing]
The fourth thing you need to do is not leave the user in the dark. What do I mean by that? When the user redirects video to AirPlay, the video stops playing in whatever view you had on screen. That is, it goes black. This isn't the best experience.
What you need to do is give the user a gentle hint that they've sent their movie off somewhere else. Let them know where to look. If you're using MPMoviePlayerController to display your video, then you don't need to do anything. The framework does it for you. But if you're using AVPlayer, the framework can't do it for you, so you have to do it yourself. You do this by key-value observing the airPlayVideoActive property on the AVPlayer object.
When observeValueForKeyPath:ofObject:change:context: is called, you should already be on the main queue, at least if you did your addObserver on the main thread, which is what you should have done. You can then update your UI based on the current value of the airPlayVideoActive property. So those are the four steps for supporting AirPlay video: provide a route button, handle remote control events, turn it on, and don't leave the user in the dark.
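A sketch of that observation, assuming player is your AVPlayer and airPlayOverlayView is a hypothetical view that tells the user where the video went:

```objc
// Observing airPlayVideoActive (the iOS 5 property name) on AVPlayer.
static void *AirPlayVideoActiveContext = &AirPlayVideoActiveContext;

- (void)startObservingAirPlay {
    // Add the observer on the main thread so the callback arrives on the main queue.
    [self.player addObserver:self
                  forKeyPath:@"airPlayVideoActive"
                     options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                     context:AirPlayVideoActiveContext];
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (context == AirPlayVideoActiveContext) {
        // Show a hint while video is routed to AirPlay; hide it otherwise.
        self.airPlayOverlayView.hidden = !self.player.airPlayVideoActive;
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}
```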
Last, let's turn to AirPlay mirroring. We've been able to support a second display via a hard cable for a while, but AirPlay mirroring allows us to do away with that cable and get a mirrored display wirelessly from the iPad 2 or the iPhone 4S. But it's not just the home screen that gets mirrored. Every app is mirrored.
And apps don't have to restrict themselves to mirroring. For example, in Keynote, you can have the presentation on the TV and the presenter display on the iPad. But we can do the same thing with games. Put our game state on the TV and keep the controls in our hand. And this need not be restricted to racing games. A pass-around game could do the same sort of thing.
The user experience for mirroring is similar to what we've seen before, except that now the user goes into the multitasking bar. If an AirPlay device is in range, then the route button will be displayed. The user sees a list of possible devices and can select an Apple TV.
When they do so, they also get a mirroring button. If the user selects that, then when they tap outside the menu, the route button will once again turn blue to indicate that content is being routed away from the iOS device. And when they return to the home screen, the status bar will be blue to remind them that content is redirected, and there's also a small route icon on the right side of the status bar. AirPlay mirroring gets you a wireless second display with an iPhone 4S or an iPad 2. When you connect either of these devices via a cable, you also get mirroring, provided you're using a digital adapter. With older devices, the iPhone 4, iPad, and fourth-generation iPod touch, you get a second display without mirroring. And on any device, you get a non-mirrored second display when you're using an older analog cable, such as component or composite.
Once again, there are four steps to supporting AirPlay mirroring. The first step is detecting the other screen. In your application:didFinishLaunchingWithOptions: method, you need to do three things. First of all, you need to add an observer for the UIScreenDidConnectNotification. You need to add an observer for the UIScreenDidDisconnectNotification.
And because you don't get any notifications for screens that are present when you start up, you need to also check your initial setup. Now notice that all three of these call exactly the same method. Here's the method we call. In this method, we first count the number of screens. Then, if there is exactly one, we set up our views for a single screen. If there are exactly two, we make a window, associate it with the second screen, and set up the view properly.
If we have more than two, we do nothing. Why did I write it this way? There is a rare case where you can temporarily have three screens. The best thing you can do is ignore the third screen's arrival and fix things when the system drops you back to two screens.
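Here's a minimal sketch of that pattern in an app delegate; externalWindow is a strong property you'd declare yourself, and setupForSingleScreen and setupExternalViewsInWindow: are hypothetical methods standing in for your own view setup:

```objc
// Screen detection: observe connect/disconnect and handle the initial state.
- (BOOL)application:(UIApplication *)application
        didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    NSNotificationCenter *center = [NSNotificationCenter defaultCenter];
    [center addObserver:self selector:@selector(screenConfigurationChanged:)
                   name:UIScreenDidConnectNotification object:nil];
    [center addObserver:self selector:@selector(screenConfigurationChanged:)
                   name:UIScreenDidDisconnectNotification object:nil];
    [self screenConfigurationChanged:nil];   // no notification for screens present at launch
    return YES;
}

- (void)screenConfigurationChanged:(NSNotification *)note {
    NSArray *screens = [UIScreen screens];
    if ([screens count] == 1) {
        self.externalWindow = nil;
        [self setupForSingleScreen];
    } else if ([screens count] == 2) {
        UIScreen *external = [screens objectAtIndex:1];
        self.externalWindow = [[UIWindow alloc] initWithFrame:external.bounds];
        self.externalWindow.screen = external;
        self.externalWindow.hidden = NO;
        [self setupExternalViewsInWindow:self.externalWindow];
    }
    // More than two screens is a rare, transient state: do nothing and wait
    // for the system to drop back to two.
}
```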
The second step is choosing what goes where. You need to think seriously about what you're putting on each screen. Where's the user's attention? Is there more than one user? Where are they each looking? What should they see? Also, performance might be an issue if you have a 1080p display. It might be that your app can't handle that much data at a decent frame rate.
The third step is to manage orientation. You should already be doing this right, which means implementing shouldAutorotateToInterfaceOrientation:. If you do implement rotation yourself, you need to make sure you do not apply it to the second screen. The fourth step is selecting the screen mode, or resolution. We have a number of ways of connecting a display. We can do it from a number of different iOS devices, and we can connect it to various sizes of TV.
Your resolution might be 720p, or 1080p, or 1024x768, or 800x600, or 640x480, or even NTSC or PAL. How do you choose which mode, which resolution to use? Use the preferred mode. The preferred mode is the optimal resolution for the display. It has the best aspect ratio, the best quality. It's determined by the system, and it's chosen automatically, so you don't need to do anything to get the preferred mode. It doesn't require a hardware mode switch, so you avoid a screen flash, you launch faster, and you don't use the TV scaler.
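If you do want to look at what's available, here's a sketch of inspecting a screen's modes with UIScreen; the external screen's currentMode already defaults to the preferred mode, so the assignment at the end is normally unnecessary and is shown only for illustration:

```objc
// Inspecting the available modes and the preferred mode of an external screen.
- (void)logModesForScreen:(UIScreen *)screen {
    for (UIScreenMode *mode in screen.availableModes) {
        NSLog(@"Available mode: %.0f x %.0f", mode.size.width, mode.size.height);
    }
    NSLog(@"Preferred mode: %.0f x %.0f",
          screen.preferredMode.size.width, screen.preferredMode.size.height);
    screen.currentMode = screen.preferredMode;   // usually redundant
}
```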
Having said all that, there are cases where you may want or need to use a mode other than the preferred mode. Let's look at this. If you use the preferred mode, this is good. Optimal quality and no mode switch. But you might not be able to achieve your performance goals on a large display. So you might render at a lower resolution and use layer scaling. You're still in the preferred mode, but the layer scaling does take some time and may introduce artifacts.
So you might choose a specific mode. If you do this, you may still have a display that doesn't support the mode, so you'll need to test a lot more, and in any case, you'll have a mode switch, and you'll probably be going through the TV's scaler. So those are the four steps to supporting AirPlay mirroring: detect the other screen, choose what goes where, manage orientation, and select the screen mode.
And those are the same steps that you need to do to support any external screen, whether AirPlay mirrored or via direct cable. Except, to support external screens, there's actually a fifth step, which is to handle overscan. So what's overscan? Let's take a look at a game image. Let's say we have various status items we want to put around the edge, such as your health and whatnot. With an overscan display, that can look like this.
Why is it chopping off parts of the image? Well, early CRT TVs had trouble being consistent about scanning and they could chop off images like this due to hardware issues. TV production reacted by defining a title safe area, which was the portion of your screen that you could guarantee was going to be visible on any TV. So you can solve overscanning by scaling your image so that it fits within the title safe area. But then you have black bars around the edges because not all TVs need the most pessimistic amount of overscanning.
Instead, what you want is for the image to go to the edge of the screen, but not lose any of your important information. So in iOS 5.0, we added overscan compensation. This knows about the amount of overscan needed with various devices and allows you three different ways of handling it: scale, inset bounds, and inset application frame. Let's use a 720p display as an example. If the display doesn't have overscan, then the compensation is not there.
The option you set doesn't make any difference. If it is overscanned, then the first choice is scale. This is the default choice. You get a 720p frame buffer and the system scales that buffer before sending it to the device. So you don't lose anything, but you're paying a cost for that scaling.
The second choice is Inset Bounds. Now this will give you an oddly sized frame buffer, but all the pixels will be visible. Note the size I've mentioned here is illustrative. You may get different sizes for different devices because different devices have different amounts of overscan. Now this choice avoids scaling, but it may be problematic for your app to deal with these odd frame buffer sizes.
The last choice, Inset Application Frame, gives you a 720p frame buffer, but it sets the application frame property to the title safe area. Now you can write all over that 720p frame buffer, and you only need to pay attention to the application frame when you want something to definitely be visible.
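Here's a sketch of selecting one of those strategies on the external UIScreen using iOS 5's overscanCompensation API; the applicationFrame read at the end only matters for the Inset Application Frame case:

```objc
// Choosing an overscan compensation strategy for an external screen.
- (void)configureOverscanForScreen:(UIScreen *)externalScreen {
    // Scale is the default: a full-size frame buffer that the system scales
    // down before sending it to the display.
    externalScreen.overscanCompensation = UIScreenOverscanCompensationScale;

    // Alternatives described above:
    // externalScreen.overscanCompensation = UIScreenOverscanCompensationInsetBounds;
    // externalScreen.overscanCompensation = UIScreenOverscanCompensationInsetApplicationFrame;

    // With Inset Application Frame, draw into the full bounds but keep anything
    // important inside the application frame (the title-safe area).
    CGRect safeArea = externalScreen.applicationFrame;
    NSLog(@"Title-safe area: %@", NSStringFromCGRect(safeArea));
}
```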
So, five steps: detect the other screen, choose what goes where, manage orientation, select screen mode, and handle overscan. This supports any external screen, whether connected via AirPlay mirroring or via a cable. So there you go. As you can see, adding AirPlay to your application is very straightforward. People are really going to want it, and it's a great way to differentiate your app from others. I can't wait to hear about the cool things that you're planning to do with AirPlay.