Video hosted by Apple at devstreaming-cdn.apple.com


WWDC11 • Session 400

Graphics, Media, and Games Kickoff

Graphics, Media, and Games • iOS, OS X • 54:40

iOS and Mac OS X deliver an amazing lineup of technologies for developing cutting-edge games, innovative graphical applications, and platform-optimized audio and video experiences. Join your fellow developers in kicking off the Graphics, Media, and Games sessions of WWDC 2011 and gain key insights into the powerful capabilities you'll use to create great apps.

Speakers: Meriko Borogrove, Geoff Stahl, John Stauffer

Unlisted on Apple Developer site

Downloads from Apple

HD Video (1.41 GB)

Transcript

This transcript was generated using Whisper; it has known transcription errors. We are working on an improved version.

Welcome and thank you for coming to the Graphics, Media, and Games Kickoff. iOS and Mac OS give you a wealth of graphics, media, and game technologies to use in your application. And today we're hoping to give you a little bit of insight into some of the sessions and the technologies that we're going to be presenting this week at WWDC.

So right now, we're going to focus in a little bit on OpenGL and OpenGL ES, our industry standard 3D graphics API for accessing the power of the GPU. We're going to talk about OpenAL, the industry standard for building realistic 3D spatial audio environments. Then we're going to talk about Game Center. Game Center is Apple's social gaming network.

And AV Foundation, the framework you should use for all of the media editing needs of your application. Then Core Image, our powerful image processing library. And then we're going to touch on AirPlay, which we think gives you a unique opportunity to stream audio and video from your application. So let's get started. OpenGL is an industry standard available on Mac OS X, and on iOS we give you OpenGL ES, both powerful APIs for accessing the GPU and leveraging the 3D capabilities of the processor.

Graphics processors today are fundamentally built around shaders, the ability to write shaders and to run them on the graphics processor. Shaders take geometry and textures as inputs. They use those inputs to compute the visual effect that you're after. This is what is at the heart of the amazing visual effects you see in the applications today. And to get this session started, I'm going to bring Geoff Stahl up, and he's going to show us what's possible on the iPad using shaders. Thank you, John.

What you're seeing is the latest game, Shadowgun, from Madfinger Games, built on top of Unity Technologies, running on iOS, iPad 2, 60 frames per second. This is a beautiful game.

[Transcript missing]

We have a great reflectivity mask to give these specular highlights, all computed on the GPU. In fact, every single pixel you see is computed on the GPU.

All of this is real time, 60 frames per second. We have a cloth simulation: the flag itself is a cloth simulation that's running on a vertex shader, and the wires are animated with vertex shaders as well. Over here, you have this great HDR effect. That HDR effect and those god rays are all optimized shaders.

Fantastic visuals. Remember, this is on an iPad 2, on the GPU. This is not just a technology demo. This is a full game level. This has your enemies, it has AI, pathfinding. All those things are integrated here. It's not just the graphics technologies. Now let's take a look at another part of the demonstration. If we look at the character himself, look at a very, very realistic character. In fact, the character is run on shaders that are BRDFs.

BRDFs are bidirectional reflectance distribution functions. Basically what they do is they define the way a material looks. For example, if you look at the top of the shoulder, you see there's cloth. That cloth has a different look than the buckle or his skin. These functions can be encoded and run on the GPU to give you a fantastic look for your application. Now let's watch as we go into this green area.

What you're seeing here, you're seeing the character as it picks up this fantastic green glow in the simulation or in the game. What that is, is that's called a light probe. And this is, we believe, the first time that anyone's used light probes in a mobile game. Light probes are usually used, reserved for things like cinema, where Avatar used them to great effect, or console type of games. In this case, what light probes are, you take spherical samples around your game level. In this case, there's about 300 light probes on this game level.

Then in real time, you take the coefficients from those light probes, linearly interpolate them on the GPU in a shader, and get this lighting effect. And now watch as he transitions out of the green, into the dark, entering the elevator, and back into this green area. An amazing effect that really brings the character into this 3D world.

You can see some more HDR effects, fantastic visuals, a fully animated world. Madfinger Games built this with Unity technologies, running on iOS at 60 frames per second, with stencil shadows for the fans and more god rays: a real tour de force in graphics, running on mobile hardware. This really sets a new high bar for mobile graphics today. Thank you.

Thank you, Geoff. So shaders are critically important for you to be able to build the amazing visual effects you see. It's absolutely incredible what's possible. And with Lion and iOS 5, shaders are everywhere: everywhere Lion and iOS 5 run, you have access to the power of shaders. So there's no reason for you not to use them now.

So as I said, the graphics processor has fundamentally evolved over the years to be about shaders. You write shaders and you run them on the GPU. And getting those shaders optimal, and getting the syntax of those shaders correct, can be challenging. So in iOS 5, we've introduced a series of tools to help you do that.

The first tool is the OpenGL ES Performance Detective. The OpenGL ES Performance Detective is an expert application that automatically analyzes your use of OpenGL, both in terms of performance and correctness, and gives you suggestions about how you can correct those problems and optimize your use of OpenGL.

The second tool is OpenGL Analyzer Instrument. It's an Xcode instrument that gives you that next level of information about the performance characteristics of your application and its use of OpenGL, giving you that fine detail of information that you can then act upon to improve the performance of your application.

Third is OpenGL ES Debugger, and we've seen the demonstration from the Xcode team of OpenGL ES Debugger. It's a full-featured tool, giving you frame-by-frame access to information. You can step through your program code, your OpenGL code, accessing state information, buffer information, giving you all the detailed information you'll need to debug the hardest OpenGL problems you may have.

And all of these tools are available right within Xcode, making it so you can stay right in Xcode and use the OpenGL tools to debug and optimize your OpenGL implementation within your application development cycle. So to go into a demonstration of some of these tools, I'm going to bring Geoff on stage and we're going to look at these.

What we're going to do is we're going to take an application that we designed to be able to show the graphics power of the iPad 2 and some interactive lighting. So if we go to the iPad here, We can see where we stand. And this is not quite what I was looking for. It's a little dark, a little stuttery, and the lights aren't really interacting with the world. Let's use our tools and see if we can fix that.

So I'm going to switch over to the Mac and to Xcode. It's showing the standard instrument state on the top, but what we want to focus on is the area below here, which is really important. This is our expert area. What you'll notice is that it's giving you things where we've analyzed your API use of OpenGL, and it tells you some of the things that we think could be really important.

So you can look at these things and you can find redundant calls. And one of those things we talked about here is shader compilation. It's telling me that I'm compiling shaders. It turns out, if you see, I have 105 frames and 105 shader compilations. So I'm compiling my shaders every single frame. Probably not what you want to do, so that's a really interesting area to look at. Since this is integrated with Xcode, what we can do, we'll stop here. We'll use this arrow key. It brings us up the stack trace.

Go here, and it shows us the area where the instrument found that problem. Click this icon right in this area right here, which is go back to Xcode, and it'll open the file right in Xcode. It turns out that when we were setting up this demo, we added some extra code that did a light reset every time, which is not needed for the way we're running the demo here in the house. So I'll remove that, and we'll try and run that again. We're going to do a build and run. And we are going to be launching this momentarily.

Okay, this is not looking very good. We will try one more time and then we'll be done. Okay, now if we switch to the iPad. Can we switch to the iPad, please? You need to switch to the secondary iPad. So what you'll see is in a second when the secondary iPad comes up, you'll see the simulation is now running very smoothly, but we still don't have the right lighting. So we fixed our performance problem.

So the Performance Detective told us basically where the problem is: the state problem, the shader compilation. We used the instrument to drill down deeper. You could see that there was a redundant shader compilation there. We removed that, and we get the smooth-running application. We're going to switch back to Xcode, stopping our simulation, and we noticed there was a lighting problem with it. That's the other machine. There you go.

Here we go. Now we're on the right machine. Okay. What we've added, as you saw in the last session, is the ability to take a snapshot. The interesting thing here, again, is it's working on the iPad itself, and we've taken the snapshot in OpenGL. It's capturing the frame, and we're going to show you how you can use some of those markers that were introduced in the last session, and you can find kind of a detailed lighting problem.

I'm not sure why it's taking so long to capture the frame, but we're going to move on with the talk. If you come down to the lab, and to the OpenGL tools sessions later this week, we'll show you more details on how to use the OpenGL debugger. Thank you.

Thank you, Geoff. As you can see, we have a powerful set of tools. Oh, that must have been the other demo. So, new in iOS 5, we're also making it easier for you to use OpenGL in your application. We're offering a new framework called GLKit. GLKit gives you a set of services making the use of OpenGL easier for you. The first is UIKit integration, giving you some high-level Objective-C APIs that get you started with creating OpenGL views and an OpenGL context in your application.

Second is a math library, giving you over 175 math routines commonly used in 3D applications, like linear algebra and matrix operations. And these math routines are optimized for iOS, giving you high-performance math routines that you can use. Next, easy texture loading routines: one function call to load a texture from a file or memory directly into an OpenGL texture, again limiting the amount of code you have to write.
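As a rough sketch of what that looks like in practice (shown here in modern Swift, which postdates this session; GLKTextureLoader and the GLKMatrix4 routines are the GLKit pieces being described):

    import GLKit

    // Minimal sketch: load a texture in one call and build a model-view-projection
    // matrix with GLKit's optimized math routines.
    func loadTextureAndMVP(from url: URL, aspect: Float) throws -> (GLKTextureInfo, GLKMatrix4) {
        // One call loads an image file straight into an OpenGL ES texture.
        let texture = try GLKTextureLoader.texture(withContentsOf: url, options: nil)

        // Common 3D math: perspective projection, look-at, and matrix multiply.
        let projection = GLKMatrix4MakePerspective(GLKMathDegreesToRadians(65), aspect, 0.1, 100)
        let modelView = GLKMatrix4MakeLookAt(0, 0, 5, 0, 0, 0, 0, 1, 0)
        return (texture, GLKMatrix4Multiply(projection, modelView))
    }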

And an effects library, to help you start using OpenGL shaders. Because again, OpenGL shaders are how you're going to get access to the real capabilities of the GPU. We've also extended OpenGL ES for iOS 5, giving you the ability to do HDR rendering with float buffers, improved access to floating-point textures, a new technique for soft-edge shadows, occlusion queries for high-performance occlusion querying in detailed and complex 3D scenes, and more flexibility around shader objects.

We've also made improvements in Lion. In Lion, we've introduced the concept of profiles. What profiles allow you to do is focus your use of the OpenGL API on the modern subset of the API. OpenGL has evolved over the years and has taken on a number of API calls. The Core Profile lets you focus on the segment that's modern, efficient, and shader-focused. So I encourage you to learn about the Core Profile when you're writing OpenGL programs for Mac OS X.

Now, a companion of OpenGL is OpenCL. OpenCL is our high-performance data parallel computing API. It's also a programming interface built off of a C-like language, giving you more flexibility in how you can write programs and execute them on the GPU. It's good for operations like physics operations and basic math operations or complex math operations. Combined with the tight integration and high-performance sharing of data between OpenGL and OpenCL, you can pick and choose between which API is right for the mathematical or rendering effect you're trying to achieve.

So that's OpenGL for Mac OS X and OpenGL ES for iOS. Next, I want to talk about OpenAL. OpenAL is the industry standard for building realistic 3D spatial audio effects into your application. It's modeled after OpenGL, making it a perfect companion for your OpenGL applications, and making it great for games. So we encourage you to look at OpenAL for your 3D spatial audio effects.

Let's talk a little bit more about what spatial audio is. Spatial audio is about having point sounds in 3D space and a listener. The sound the listener hears is based on the position and distance of those sounds relative to the listener. So as a sound moves around the listener, it shifts between the left and right ears and gives a spatial audio effect.
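A minimal sketch of that model using the OpenAL C API, called here from Swift (the source ID is assumed to come from alGenSources elsewhere):

    import OpenAL

    // Minimal sketch: the listener sits at the origin; a source's position
    // determines which ear hears it and how loud it sounds.
    func placeSound(_ source: ALuint, atX x: Float, y: Float, z: Float) {
        alListener3f(AL_POSITION, 0, 0, 0)
        alSource3f(source, AL_POSITION, x, y, z)
    }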

So OpenAL gives you the ability to simulate these spatial audio environments. And we've taken OpenAL and extended it. We've extended it to include reverb. Reverb is where these point-source sounds reflect off of objects and obstructions within the environment. We've also extended it with obstructions.

Obstructions are objects or walls and such that are between the point-source sound and the listener, muffling the high-frequency sounds but letting the reverb through. And we've added occlusion. Occlusion is where the sound source and the listener are in different spaces or different rooms. This muffles both the high-frequency sounds and the reverb.

So combined with these three extensions, we've enhanced the ability for you to build even more realistic 3D spatial audio environments for your games and applications. OpenAL is built on top of Core Audio. Core Audio is our professional-grade audio processing framework. Core Audio gives you access to things like MIDI devices and a full feature set for building the most complex audio editing applications and music applications.

So that's OpenAL, the industry standard for 3D spatial audio, modeled after OpenGL, and we really encourage you to use OpenAL for your games.

[Transcript missing]

We introduced Game Center last year, last September, and now we have over 50 million registered users, making it an incredible success. There are basically three components to Game Center. There's the application, which gets installed with iOS; it's where players will go to build friend networks and look at leaderboards. It's also where they will initiate multiplayer games.

There's Game Kit, the framework, which is a set of APIs that you'll use to incorporate Game Center functionality into your application. And then there's the network service segment of it, where players' high scores and leaderboards are stored. It's also the service that provides auto-matching capabilities for Game Center.

So let's talk a little bit more about the features available that you can start leveraging in your game. First is friends. Friends are at the heart of the social gaming experience. Friends like to play against each other and look at achievements and leaderboards. Leaderboards are where players post high scores and get bragging rights, increasing their engagement and excitement around your game.

There's achievements. Achievements are something significant or difficult a player will earn in your game. They enjoy finding them, they enjoy earning new achievements, and it just raises the excitement level that you can generate with your game. Then there's multiplayer. Multiplayer, of course, is the ultimate challenge where two players can play against each other.

and VoiceChat. VoiceChat enables two players, while they're playing a multiplayer game, to talk to each other, further enhancing the multiplayer excitement and enjoyment that a user will have while playing multiplayer. And we enable customization, so when you use Game Center, you can customize leaderboards and achievements to match the style of your game.

So new for iOS 5 is turn-based multiplayer. We're all familiar with this kind of game, which is like a card game or a board game, where a player takes a turn, and then the next player, and then the next, and so on. And the turns go around sequentially between the players.

So let's look a little bit closer at how turn-based multiplayer works in iOS 5. What happens is a player takes their turn, and the game data goes up to Game Center. The next player is notified, they run the application, and the data comes down to their game. They take their turn, the next player is notified, the data goes to them, and so on.

Now, a really neat part about turn-based gameplay is that Game Center will auto-match players into empty seats. So a game could actually be started, and as it proceeds along, it'll auto-match a new player into the game. It'll be their turn immediately, they can take their turn, and then the game will proceed. So auto-matching is a fantastic feature of Game Center turn-based play.

And you have control over the order in which the turns will occur. It can go forwards, backwards, you can skip players; it can proceed in any order that you wish. So you have complete control over the order in which players will take their turns. And a player can have up to 15 sessions of your game running at a time. So in one session, it might be their turn; in another, it'll be someone else's turn. But they can have up to 15 sessions of your game running.
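A rough sketch of ending a turn with GameKit (modern Swift API names, which postdate this session; the gameState payload and the choice of next participants are assumptions for illustration, and the order of the participant array is how you control whose turn comes next):

    import GameKit

    // Minimal sketch: hand the match data to Game Center and pass play
    // to the remaining participants, in the order you choose.
    func endTurn(in match: GKTurnBasedMatch, with gameState: Data) {
        let next = match.participants.filter { $0 != match.currentParticipant }
        match.endTurn(withNextParticipants: next,
                      turnTimeout: GKTurnTimeoutDefault,
                      match: gameState) { error in
            if let error = error { print("Failed to end turn: \(error)") }
        }
    }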

So to preview this a little bit, I'm going to bring up Meriko, and she's going to run an application so you can see this for real. So I'm a big fan of turn-based gameplay, and I'm a big fan of word games. So the guys wrote an application called Word for Word, and I'd like to show it to you.

The first thing I'd like to do is give you a look at the user interface for the new turn-based multiplayer gaming. So this is what you're going to get when you call a GKTurnBasedMatchmakerViewController (really long word). This is what your users will see.
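For reference, a minimal sketch of presenting that view controller (modern Swift; the presenting view controller is assumed to also act as the delegate):

    import GameKit
    import UIKit

    // Minimal sketch: configure a match request and present the built-in
    // turn-based matchmaking UI.
    func showMatchmaker(from host: UIViewController & GKTurnBasedMatchmakerViewControllerDelegate) {
        let request = GKMatchRequest()
        request.minPlayers = 2
        request.maxPlayers = 4   // turn-based matches can have up to 16 players

        let matchmaker = GKTurnBasedMatchmakerViewController(matchRequest: request)
        matchmaker.turnBasedMatchmakerDelegate = host
        host.present(matchmaker, animated: true)
    }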

Right. So most importantly, your turns that you need to play are right here at the front of your game. I've got a couple of turns to play. I have some friends. They're lagging on their turns. New to iOS 5, you can actually rate your game from directly inside the game. So we're going to go ahead and give our game five stars. I think it's great.

I played it just a little bit. So in order to start a new match, I'd like to show you that you can tap the plus button up in the corner. You can see me right up at the top. You can see how many players can play this game. Games support up to 16 players, and this number comes from how many players you specify in your code.

You can use this button here to add players or remove players. If you'd like to invite one of your Game Center friends, you can tap on Invite Friend, and you're presented with a list of your Game Center friends. New in iOS 5 is the ability to have a photo attached to your user. So this is a really good way for me to see if this is actually Corey or not.

This Play Now button in the upper right-hand corner is also pretty powerful. What this lets you do is, if you don't want to invite specific friends, you can say Auto Match, and Game Center will find your friends and automatically match you with game players who are good to play with. So what I'd like to do is go ahead and play a turn.

Now, let's see, we're playing a little bit of a self-referential game here. What I'd like to point out is this game is entirely implemented inside of UIKit, and it's running at 60 frames per second. What that lets me do is have a very interactive game where I feel like I'm actually touching the environment that I'm in. I'm picking up those tiles, laying them down, and I'll play my turn.

So, given that this is in UIKit, in iOS 5, you can customize even more of UIKit, which is really powerful. This allows you to get the look you want in your game and still have the feeling of our buttons and our behavior, so your users are going to understand how to play your game.

I asked our guys to go ahead and make a menu that fit in with the look and feel of this game, and you can see that this fits into my gameplay pretty nicely. You can start an online match up at the top. You can also do pass and play. You might only have one iPad on an airplane or something.

You can still see the games where it's your turn and go and take a turn there. We're also using the Game Center services to pull down your achievements from the server. We're sorting those in an order that we prefer. We're going ahead and putting the completed, the earned achievements at the top, and we're putting the in-progress achievements down in the bottom.

You're looking at a whole lot of UI table views here, and in iOS 5 with the new Interface Builder, you can now do table view cell prototyping directly inside of Interface Builder, which is going to let you iterate rapidly on your user interface. We think this is pretty powerful too.

I'd also like to show you what we did with leaderboards. In addition to making it look like your game, you can also go ahead and add functionality. Right here we have our friend's leaderboard. You can see our friend's pictures. Again, when I'm looking at Lyricist, it's good to know that that's really her.

But our designer really felt like you should be able to start a game from inside of the leaderboard. Maybe I want to knock Lyricist down a peg, maybe I want an easy game, so I'll play with Hans Sutter. But you can launch that game here without going back and starting a new game and looking for your friend again.

The last thing I'd like to point out are these buttons. If you look at the challenge button, the back button, the refresh button, these little lozenges, we're using a feature in iOS 5 in UIKit called UIAppearance. UIAppearance is very powerful because it lets you configure the look and feel for an entire class of buttons or class of controls.

You can either tint our controls or you can replace them with artwork of your own in order to have a completely customized look and feel. Doing all of these buttons throughout the whole game was one line of code for our developers, which is pretty cool. So that's Word for Word. Thanks, John.
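That "one line" plausibly looks something like this (modern Swift; the tint color is a stand-in):

    import UIKit

    // Minimal sketch: the appearance proxy restyles every bar button item
    // in the app at once, instead of configuring each instance.
    func applyGameTheme() {
        UIBarButtonItem.appearance().tintColor = .systemGreen
    }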

Thank you, Meriko. So let's look at some of the other new features in iOS 5 and Game Center. There are achievement banners: when a player earns an achievement, they'll get notified with a banner. Custom invite sounds, where you can build and incorporate custom sounds associated with your application when a player receives an invite.

In-game ratings, so your players can rate your game right inside the game title, making it easier for players to get your ratings out there. Achievement leaderboards, raising the visibility of achievements. Game recommendations: players will receive recommendations on games based on what they play. And friend recommendations, helping players build their friend networks. And now players can associate a photo that they want to share with other players.

So that's Game Center, Apple's social gaming network. It increases player engagement, raises visibility of your games. We highly encourage you to start taking advantage of all the features of Game Center in your game. So next, we're going to talk about AV Foundation. AV Foundation is the framework you should be using for all of your time-based media needs in your application. We've had AV Foundation on iOS for some time, and we now have brought AV Foundation to Lion. So you can use the same powerful API in iOS and Mac OS for all of your time-based media needs. AV Foundation has four fundamental operations: playback, edit, capture, and export.

And the beauty of the way we've designed AV Foundation, it's designed around some intuitive, easy-to-use, abstract objects. One example of that is AV Asset. And AV Asset abstracts your media types. So whether it be media on local storage on device, or whether it's coming over the network, or whether it's a live HTTP stream, it's all an AV Asset. This is just one example of how we've tried to design the API such that it's easy to use, so that you'll be able to incorporate all of the unique functionality and powerful functionality into your application.

So let's talk a little bit about playback. Playback starts with an AV asset. You attach an AV player object to it to control the playback sequence: start, stop, fast forward. And then to present it, you use an AV presentation object to present it on screen and out the speaker.
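A minimal sketch of that playback chain (modern Swift; here the presentation object is an AVPlayerLayer, and the URL can point at a local file or a network stream):

    import AVFoundation

    // Minimal sketch: asset -> player item -> player -> layer.
    func makePlayerLayer(for url: URL) -> AVPlayerLayer {
        let asset = AVURLAsset(url: url)            // local file, network file, or HTTP live stream
        let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
        let layer = AVPlayerLayer(player: player)   // Core Animation layer that presents the video
        player.play()                               // start; pause(), seek(to:), and rate cover the rest
        return layer
    }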

It's that simple. But AV Foundation is a very powerful API, giving you flexibility and capabilities to use it in unique ways in your application. For example, you can use AV Foundation to build your own custom user interface for the playback experience. AV Foundation does not have a built-in user interface. To get the default user interface, you would have to go up to UI kit. But when you're at the AV Foundation layer, you would provide your own user interface.

We give you all the data you need to build a complete and full-featured playback experience. The user interface elements you provide will be composited by Core Animation, giving you a high-performance user interface. And of course, AV Foundation is an asynchronous API, ensuring that the user interface will remain responsive even under the most demanding playback decoding sequences.

Now new in Lion and iOS 5 are media options. Media options allow you to have alternate audio tracks and video tracks, so you can support things like alternate languages, subtitles, and closed captions. We also give you the ability to introspect chapter information for titles, play duration, and artwork. The combination of these gives you the ability to build complete, professional-grade user interfaces into your application.

Let's move on to editing. Editing starts with a series of assets. You're able to take clips out of those assets, sequence them together, and combine them in an AV composition object. You can provide transitions between those sequences, such as wipes and blurs, and you can overlay audio tracks. Again, we've provided an intuitive, easy-to-use model that you can use to build this editing capability into your application. But it's also a complete, professional-grade interface that allows you to support the most demanding editing problems.
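A minimal sketch of sequencing two clips into a composition (modern Swift; the five-second clip length is arbitrary, and transitions and audio overlays would be layered on top of this):

    import AVFoundation

    // Minimal sketch: splice the first five seconds of two assets back to back.
    func makeSequence(of first: AVAsset, then second: AVAsset) throws -> AVMutableComposition {
        let composition = AVMutableComposition()
        let clip = CMTimeRange(start: .zero, duration: CMTime(seconds: 5, preferredTimescale: 600))
        try composition.insertTimeRange(clip, of: first, at: .zero)
        try composition.insertTimeRange(clip, of: second, at: composition.duration)
        return composition
    }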

Let's talk about capture. Most devices today come with a camera, and AV Foundation is how you get access to those cameras. AV Foundation gives you control over the camera, such as focus, exposure, and the flash, allowing you to set the camera up to get the video and photos that you're after. In Lion, we've introduced a new framework called Core Media I/O. Core Media I/O allows you to write device drivers for your own capture devices, integrating them right into the AV Foundation pipeline and making them accessible to anybody who uses AV Foundation.
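A minimal sketch of a capture session with one of those camera controls (modern Swift; error handling and canAdd checks are trimmed for brevity):

    import AVFoundation

    // Minimal sketch: camera in, video frames out, with continuous autofocus
    // configured through the device object.
    func makeCaptureSession() throws -> AVCaptureSession {
        let session = AVCaptureSession()
        guard let camera = AVCaptureDevice.default(for: .video) else { return session }

        try camera.lockForConfiguration()
        if camera.isFocusModeSupported(.continuousAutoFocus) {
            camera.focusMode = .continuousAutoFocus
        }
        camera.unlockForConfiguration()

        session.addInput(try AVCaptureDeviceInput(device: camera))
        session.addOutput(AVCaptureVideoDataOutput())   // per-frame access for further processing
        session.startRunning()
        return session
    }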

So that's capture. And then lastly is export. The way we've made export easy is we've defined a series of presets. These presets are optimized for classes of devices, so you can simply use a preset when you are exporting or transcoding out to a file. You can use those presets to ensure that the resulting file, the resulting video and audio, will give the optimal experience during playback for that family of devices. So we've tried to make it easy for you to get the right export properties into the file.
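A minimal sketch of an export using one of those presets (modern Swift; the 720p preset and MP4 output type are examples, not a recommendation):

    import AVFoundation

    // Minimal sketch: transcode an asset with a device-class preset.
    func export(_ asset: AVAsset, to url: URL) {
        guard let session = AVAssetExportSession(asset: asset,
                                                 presetName: AVAssetExportPreset1280x720) else { return }
        session.outputURL = url
        session.outputFileType = .mp4
        session.exportAsynchronously {
            print("Export finished with status \(session.status.rawValue)")
        }
    }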

AV Foundation is based on some of the best industry codecs: H.264 for video, ensuring the best quality whether you're targeting HD resolution or something smaller, and AAC for scaling the bit rate and quality of your audio. Two of the best codecs in the world. And on Mac OS X, we provide ProRes encoding for integration into professional workflows.

Now, we talked about the pipelines for playback, edit, and capture, but we also allow you to get access to those frames as they're being captured and do additional processing on them before they get displayed. So you can use OpenGL or the CPU to do additional processing, and we've had that in AV Foundation. Well, new for Lion and iOS 5, we now allow you to get frame-level access to the video and the stills on export, allowing you to use Core Image, OpenGL, and the CPU to modify, edit, and enhance the photos and video on export.

So that's AV Foundation. AV Foundation has a wealth of classes. We'll have four sessions this week going into the details of AV Foundation. And we encourage you to learn all the details of how you can take advantage of it. So AV Foundation, it's the framework that you should be using for your time-based media needs in your application. Next is Core Image. Core Image is our powerful image processing framework. We've had Core Image on Lion, on Mac OS X for some time, and now we've brought it to iOS. So we now are giving you the same powerful image processing framework on iOS.

Let's look a little bit closer at how Core Image works. With Core Image, you can take a still image or a frame out of a video and apply an effect to it. In this case, I'm showing a sepia filter effect, and that filter operates on each individual pixel in the image. Now, Core Image allows you to chain filters together.

And you can build simple or very complex filter chains. What Core Image will do is, when possible, take those filters, coalesce them together, and in real time recompile them into an optimal filter that you can target at the CPU or GPU, ensuring maximum performance no matter how complex a filter chain you're building.
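A minimal sketch of a two-filter chain (modern Swift; the filter names and intensities are just examples of the kind of chain Core Image will coalesce):

    import CoreImage

    // Minimal sketch: sepia into vignette; Core Image fuses the chain into
    // one optimized pass when it renders on the CPU or GPU.
    func stylize(_ input: CIImage) -> CIImage? {
        guard
            let sepia = CIFilter(name: "CISepiaTone",
                                 parameters: [kCIInputImageKey: input, kCIInputIntensityKey: 0.8]),
            let sepiaImage = sepia.outputImage,
            let vignette = CIFilter(name: "CIVignette",
                                    parameters: [kCIInputImageKey: sepiaImage, kCIInputIntensityKey: 1.0])
        else { return nil }
        return vignette.outputImage
    }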

So on iOS 5, we've brought a number of powerful filters for you to use built right into Core Image. These filters are targeted towards photographic effects, like color controls, crop, straighten, affine transform, shadow adjust. These are available for you to build your filter chains and to process your photos.

We've also introduced a series of APIs. What these APIs do is build filter chains for you, to give you a more complex filtering operation automatically. The first one is auto-enhance. Auto-enhance gives you that same one-touch enhancement that we built right into the iOS 5 Photos app. You now have access to that same capability right through Core Image.

We also have face detection. With an API call, face detection will identify the rectangles in the image that contain faces, allowing you to focus your image processing based on the information returned by those rectangles. Also, red-eye reduction: one function call to achieve the same red-eye reduction that we offer in the iOS 5 Photos application.
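A minimal sketch of both calls (modern Swift; autoAdjustmentFilters returns the enhancement chain, which includes red-eye correction, and CIDetector returns the face rectangles):

    import CoreImage

    // Minimal sketch: apply the auto-enhance filter chain, then find faces.
    func enhanceAndFindFaces(in image: CIImage) -> (CIImage, [CGRect]) {
        var output = image
        for filter in image.autoAdjustmentFilters() {
            filter.setValue(output, forKey: kCIInputImageKey)
            output = filter.outputImage ?? output
        }

        let detector = CIDetector(ofType: CIDetectorTypeFace, context: nil,
                                  options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
        let faces = detector?.features(in: image).map { $0.bounds } ?? []
        return (output, faces)
    }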

So that's Core Image, the same powerful image processing framework, brought to iOS 5 with a set of powerful filters that you can use for your photo editing needs. Next, I want to talk about AirPlay. AirPlay allows you to stream audio to an Apple TV, as well as to third-party devices that support the AirPlay protocol. AirPlay also allows you to stream video to an Apple TV, HD resolution video. And with iPad 2, we introduced an audio/video cable that allowed you to mirror the display. With AirPlay, we can now mirror the display wirelessly.

This gives you an HD resolution wireless mirroring through AirPlay. And the interesting opportunity for you developers, I want to point out, is that-- yeah. So we've integrated AirPlay in such a way that that wireless display behaves the same as though somebody plugged that cable in. So it looks like a second display to you.

And if you have already started using the second display as a feature in your application, where the display shows the out-the-window view like this, and the handheld shows other information like course information and acts as a steering wheel, it will just work that same way with AirPlay video. And I think that poses some unique opportunities for you to leverage this capability in your application, integrating you into the family room environment. So to demonstrate this, I'm going to bring Geoff on stage and we're going to do a demo of this.

Thanks, John. So first thing we'll do is let's go through AirPlay and show you how you can enable that. We bring up the multitasking bar. We'll quit that app. We'll slide over to the Now Playing controls, and you notice there's that AirPlay icon right there. We tap on the AirPlay icon, and you see we have the iPad and the Great Room Apple TV.

If I select the Great Room Apple TV, we now have that new control for mirroring of the iPad 2. What you have here is you have your AV assets. So if you're playing audio or you're playing a video, that's going to your Great Room Apple TV. If I turn mirroring on, we mirror. And I think we need to bring up the Apple TV at this point.

There we go. So now let's demonstrate that again so you actually can see that we do have the Apple TV hooked up there. So if I go back and select back to iPad, we have our Apple TV. Of course, Apple TV remembers your settings as you would expect. So now let's look at the Real Racing example and show you what they did to integrate the controls in front of the player while putting the play experience up on the Apple TV. Let's launch Real Racing.

If you notice, the status bar on top was blue, and that indicates you're actually in AirPlay mode. Okay, so I have my controls up on this side. You have the car. And what we can do is hop right into a quick race.

That's my track. I want to do a single lap there. That looks good. Tap on the track into racing. So now I'm going to have the status, what the track looks like, what the upcoming corners are, where I am in the race in front of me, and everyone in the living room and the great room can enjoy the race itself. And of course, controlling directly with your iPad.

There we go. Not starting off very well. I'm in last place. We're seeing if we can pass a few cars here. Sliding by on the right. Right in front of me, I can see exactly what's going on, and AirPlay makes it an awesome experience: great control here, and great work with the visuals over AirPlay. I've been hanging around on the track, so I'll hand it back to John. Thank you very much.

Thank you, Geoff. So AirPlay, we think, is incredible, and we think it gives you some interesting opportunities for how to stream audio and video. Now, AirPlay is not a framework. AirPlay is a system service, and control of it is distributed through other frameworks. For instance, Media Player is where the user interface for routing of video and audio lives.

UIScreen is the class you'll use to get access to the second display and drive it independently. And AV Foundation gives you control over the streaming characteristics you want for your video and audio. So I encourage you to go to the AirPlay sessions, learn how to take advantage of this, and learn which controls and APIs you should be using to get the right behaviors for your application. So that's AirPlay: wireless streaming of video and audio, a great opportunity for you to integrate with wireless displays and audio devices.
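A minimal sketch of driving that second display through UIScreen (modern, pre-scene-based UIKit in Swift; the root view controller is a placeholder for your game's external view):

    import UIKit

    // Minimal sketch: when an AirPlay or cabled display connects, give it
    // its own window while the handheld keeps its own controls UI.
    final class ExternalDisplayManager {
        private var externalWindow: UIWindow?          // keep a strong reference while connected
        private var observers: [NSObjectProtocol] = []

        init() {
            let center = NotificationCenter.default
            observers.append(center.addObserver(forName: UIScreen.didConnectNotification,
                                                object: nil, queue: .main) { [weak self] note in
                guard let screen = note.object as? UIScreen else { return }
                let window = UIWindow(frame: screen.bounds)
                window.screen = screen                          // route the window to the external display
                window.rootViewController = UIViewController()  // e.g. the out-the-window race view
                window.isHidden = false
                self?.externalWindow = window
            })
            observers.append(center.addObserver(forName: UIScreen.didDisconnectNotification,
                                                object: nil, queue: .main) { [weak self] _ in
                self?.externalWindow = nil
            })
        }
    }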

So when we build these technologies at Apple, we put a lot of thought into how they can play together, how people, you developers, may want to use them together to build unique solutions, use them in innovative ways we haven't thought about. So we put a lot of time into high-performance integration of these technologies, allowing you to mix and match them.

And to show that off, we put together a little demo, because we thought it would be worthwhile to give one last punch in this session. I'm going to bring Meriko up on stage to show off a demonstration. Thanks, John.

If you guys bring up the Apple TV for me, that would be great. In the 1920s, the surrealists invented a parlor game, called the Exquisite Corpse. And the way they would play Exquisite Corpse is they'd sit around, somebody would write a sentence or so, and fold over the paper, so that you could just see the last word.

They'd pass it to their friend, and that person would take that word and key off of it and write another sentence or two. Fold over the paper, leaving just the last word, and then they'd continue on from there. They created crazy stories that had a really interesting thread running through them.

So over the last five or six days, we challenged our engineers to do that with the technologies that you've seen in this kickoff today. And I'd love to show you our story. So the first thing I'm going to do is bring up AirPlay, because I think 720p streaming is awesome.

Okay, so the first thing that they wrote was a fluid dynamic simulation. It's written in GLSL with a huge series of shaders. It has density and acceleration being injected into the simulation by my touch. The color is keying off of the mass of the particles in the plasma. It's quite computationally intensive. I also think it's quite beautiful.

The first thing that they decided to key off was that density. And what we've done is we've brought up an audio track, and we're controlling the complexity of the audio track on the density of the simulation. So you can just hear a percussion line with a little plasma. You can hear the bass come in.

I really start kind of playing, bring up a guitar, and if I really get crazy, you can get a second guitar line in. So what we're doing here is I have a movie file with four synced audio tracks, one for each instrument. I'm using AV Foundation to play them back and to mix based on the density of the plasma in the simulation. I'm using the pan and the volume controls to do that. And that movie is my AV asset that John was talking to you about.

So the next thing that we thought about injecting in was some extra mass. And to do that, we started taking some input off the front-facing camera. See, I can continue running along here. One of the great things about a shader is that you can change the look and feel of it very simply with parameters once you've got your shader written. I kind of like this ghost look.

The Core Video APIs have a bunch of new performance enhancements in iOS 5 that are fantastic. You can take your CV pixel buffers and you can send them straight to a GL texture. The way we're doing this is we're taking that frame and we're processing it with an edge map that's a GLSL shader.

And then we're taking that edge map and using it to inject mass into the simulation. The next thing we thought about is that maybe we could add some acceleration in. And we've got these great Core Motion APIs. So we keyed off of acceleration, and we're measuring the acceleration of the iPad.

Adding acceleration to the particles. And that's just really surreal. You can still see me. So the last thing that we keyed off of is the entire scene. We wrapped it up and we put it on a texture. And if you've been in a media and graphics State of the Union or a Kickoff before, you know what we do with textures, which is we map them on 3D objects. You can see.

We used GLKit to bring up this scene, and we used all three of the performance and debugging tools from earlier; they were successful there, and we optimized the performance to keep the frame rate up here. So I guess what I'd really like to do is recap the storyline for you. What we've got is a full fluid dynamics simulation running on the GPU. It has density and acceleration being injected by my fingers.

It has mass being injected by camera frames that are being processed again on the GPU, mapped to the surfaces of this cube, along with a real-time video preview being processed to black and white, all on a spinning cube in 3D space, streaming at 720p to my Apple TV, all from my iPad 2.

And the whole team is excited to share their exquisite story. Back to you, John. JOHN STAUFFER: I'm glad to hear you clap. It's amazing what you can do on iPads nowadays, and with iOS and Mac OS, because that is an incredible amount of computing going on there to drive those simulations. It's absolutely incredible.

So this week, we have 22 sessions covering the Graphics, Media, and Games technologies. And hopefully, we gave you a little bit of an insight about what you may want to learn and go dive deep into and integrate into your application. We also have 25 labs. These labs are great opportunities for you to go and talk to the engineers that build these technologies, getting help from them, asking them the most detailed and difficult questions about how you should be using and taking advantage of these technologies. So hopefully, we kicked off your week in a useful manner. And thank you for coming, and welcome to WWDC.