
WWDC09 • Session 300

Graphics and Media State of the Union

iPhone • 1:07:47

Apple's Graphics and Media frameworks bring sweeping advances to developers with an incredible array of technologies for rich graphics, GPU computation, cutting-edge game development, and platform-optimized audio and video experiences. Learn how you can harness these capabilities in this overview session filled with in-depth information and captivating technology demonstrations.

Speakers: John Stauffer, Geoff Stahl, David Gohara, Tim Bienz, KC Estenson, Tracy Pesin, Meriko Borogrove, Graeme Devine

Unlisted on Apple Developer site

Downloads from Apple

SD Video (344 MB)

Transcript

This transcript has potential transcription errors. We are working on an improved version.

Ladies and gentlemen, please welcome John Stauffer, Senior Director of Graphics and Media Engineering.

[ Applause ]

Welcome, and thank you for coming to the Graphics and Media State of the Union. So you've been hearing a lot today about Mac OS X Snow Leopard and iPhone OS 3.0. Two great platforms for you to develop your applications on.

But a platform is more than just software. A platform is about the hardware and the software, and all the unique features that the hardware and the software bring together for you to use to build the innovative, great applications that we see you making today. So when we were building iPhone OS, we were able to take all the years of experience that we've been building into Mac OS X and bring a lot of those technologies and a lot of that experience over to the iPhone. And what that enabled us to do is leverage all that shared code, all that software, and all those years of experience in making the iPhone a great platform.

And when we brought these technologies to the iPhone, we took them there, we optimized them, and then we tuned them to work great with the iPhone. And in doing so, we were able to learn quite a bit about how to make our technologies better. And when we learned all those things about building that platform on the iPhone, we took that technology and started bringing it back to Mac OS X. So these two platforms share a lot of technologies, and a lot of learning has gone on on both sides, which is making both platforms better.

So today we're going to be talking about the graphics and media technologies and how we have built them for iPhone OS and Mac OS X. Whether you're using OpenGL on Mac OS X to access all of the graphics capabilities of the GPU, or OpenAL to leverage all of the audio processing capabilities of an iPhone, our goal when we're building the graphics and media technologies is to provide you with technologies that have been optimized for the platform and that give you access to all the unique hardware capabilities available on that platform.

When we build these technologies, we're trying to plan for the future. So when you're looking at our technologies this week, when you're hearing about them, know that we are always asking ourselves what we can do to make them better for you, to give you a solid platform that you can rely on and keep building on.

It's this process that allows us to build a stable base for the innovations we're seeing in the kinds of applications you're making. It's also what enabled us, from the first day we launched the iPhone SDK, to provide you with a stable set of technologies that we've seen you use over the last year to bring all of the great applications to the iPhone App Store. We've seen some incredible applications. An example of this is Brushes. Brushes is an iPhone application where you use your finger to paint. It's based on Quartz, a Mac OS X technology that we brought to the iPhone.

We optimized it on the iPhone. And combined with the touch screen of an iPod touch or iPhone, this innovative application was able to be made. And with that, just last week, Jorge Colombo took that application and, on his iPhone, painted the cover of last week's New Yorker. We find that pretty incredible, that the cover of the New Yorker was painted on an iPhone.

So this is an example of using the software features of the platform and the hardware features together to build an innovative application. We think this is a great example, and we really like seeing these kinds of combinations of technology brought together. There are 20 categories of applications on the iPhone App Store, and games, by far, represent the largest category. Games are interesting for us graphics and media people, because they stress the graphics and media technologies to their limits.

And that helps us learn how to make those technologies better for the future. It helps us learn how to optimize them so that we can provide you with better, more optimal technologies. So today we're going to talk about graphics and media on the iPhone and Mac OS X, talk about some of the shared technologies on those platforms, and also point out some of the technologies unique to each.

And then we're going to talk about the iPhone and the unique capabilities of iPhone OS 3.0, and try to give you some insight into the technology and the capabilities that we're bringing to that platform. We're also going to pay special attention to games, because we think games are a big part of what makes the iPhone. So with that, I'm going to invite up Geoff Stahl. Geoff is going to start the next section of the presentation, to talk about graphics.

[ Applause ]

Thanks John. I'm going to talk about three key technologies that allow you to access the power of the graphics processing unit. And we'll start at the foundation with OpenGL. OpenGL is a hardware abstraction layer built on top of the GPU. It has evolved over a number of years from a configurable fixed-function graphics pipeline for 2D and 3D graphics to today's fully programmable pipeline, which unlocks the power of the most modern GPUs, providing dynamic lighting, real-time depth of field, and immersive environments like this example from the latest game from id Software. All possible because of the power of OpenGL as a foundational technology for graphics. In Mac OS X we recognize that, and we built a number of technologies on top of it.

So we have Core Image, Core Animation, Core Video, even our window server and Quartz, all accelerated by the graphics processing unit using OpenGL. These allow you to program either to OpenGL as a foundation technology or to pick one of these higher-level abstractions to optimize your interface to our software.

And of course all these technologies, from the kernel graphics drivers all the way through these high-level APIs, are native 64-bit for Snow Leopard. When we looked at iPhone and iPhone OS and needed a foundational graphics technology, it made absolute sense to look at OpenGL ES, OpenGL for Embedded Systems. It's based on OpenGL, providing, again, a configurable fixed-function graphics pipeline for 2D and 3D graphics that allows you to do some amazing things. And you picked it up too. We've seen some amazing things, and these are just a few of them.

We've seen the projected shadows in MotionX Poker, or the realistic textures and real-time lighting in Zen Bound, taking a very simple concept and making it immersive. Or the almost infinitely scalable world that Google Earth provides. These are all powered by OpenGL ES 1.1 on the iPhone. But there's one thing we've heard from the time we released the SDK a year ago that you all wanted. You wanted access to fully programmable hardware and a fully programmable graphics pipeline. Well, today we're providing that with OpenGL ES 2.0.

[ Applause ]

So what is OpenGL ES 2.0, and how is it different from OpenGL ES 1.1? Well, it is based on desktop OpenGL 2.0, for mobile devices. It provides access to the programmable hardware through the GLSL shading language, allowing you to write vertex shaders and fragment shaders.

Those are very small pieces of code that run directly on the GPU, giving you really full access to the power of the graphics processing unit. And of course it allows very efficient GPU access on your mobile device. Here's an example. Instead of writing multiple lines of OpenGL API calls and a multipass algorithm to do some kind of color correction, we can collapse that down into a single line of a fragment shader. This is really powerful.
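As a rough sketch of what such a shader can look like (this is not the code from the slides, and the uniform names are invented for illustration), here is a fragment shader whose single assignment applies a color correction as a matrix multiply, along with the standard OpenGL ES 2.0 host calls that compile it:

    #include <OpenGLES/ES2/gl.h>

    /* Hypothetical sketch: the whole multipass color correction
       collapses into the single assignment inside main(). */
    static const char *kFragSrc =
        "precision mediump float;\n"
        "uniform sampler2D tex;      /* source image */\n"
        "uniform mat4 colorMatrix;   /* the color correction */\n"
        "varying vec2 vTexCoord;\n"
        "void main() {\n"
        "    gl_FragColor = colorMatrix * texture2D(tex, vTexCoord);\n"
        "}\n";

    GLuint makeColorShader(void)
    {
        GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(shader, 1, &kFragSrc, NULL);
        glCompileShader(shader); /* compiled at runtime, runs per fragment on the GPU */
        return shader;
    }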

It gives you more direct access to that GPU. So instead of worrying about state and how to configure the graphics pipeline, you're thinking about exactly what kind of code and commands you want to execute on the GPU. And as an example of this I want to bring Alex Eddy up on stage and look at a demo we worked on with our partners at Imagination Technologies, which shows the power of fragment shaders on the GPU.

So what this demo is, is a skybox, basically. It's a kind of immersive environment you're in, and we have these windows. Each window contains a fragment shader, a program running on the GPU that directly operates on the fragments in the scene. And you can see with some of these fragments, with this distortion, how you can write small pieces of code to directly manipulate the pixels on the screen.

Remember, all of this is on the GPU. There's no CPU work involved in actually manipulating the pixels on the screen. You can do things like, here, this great edge filter. So you know, we've seen some color effects, but this one is actually using the color of the pixels to detect the edges in the scene.

You can imagine what you can do with this kind of power. Here's a more artistic-looking shader. So it's not limited to just, you know, adjusting saturation, making a black-and-white image, or doing edge detection; you can do many kinds of artistic effects. And let's take a look at some of the things we can do with fragment shaders on the GPU interacting with a 3D scene; it's not just limited to a 2D background. Fully interacting with that geometry, with the shader here doing the glass distortion. So this is running on the iPhone 3GS, with the fragment shading fully on the GPU.

Thank you, Alex.

[ Applause ]

So on iPhone 3GS, the fastest and most powerful iPhone we've ever built, we provide OpenGL ES 2.0 to allow you full access to the programmable hardware in that device. We also provide, of course, OpenGL ES 1.1. This allows you as app designers to determine how you want to target your next application. And combine this with the power of the smart App Store, which, using the device capabilities declared in your application bundle, ensures that your users get the correct version of your application for whatever device they're on.
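For illustration, that declaration lives in the application's Info.plist. A hypothetical app that requires the programmable pipeline would mark itself like this, and the App Store then offers it only to devices that report the opengles-2 capability:

    <key>UIRequiredDeviceCapabilities</key>
    <array>
        <string>opengles-2</string>
    </array>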

Really powerful technology and integration there. So that's OpenGL and OpenGL ES, providing the graphics foundation technologies, unlocking the power of the GPU for both iPhone OS and Mac OS X. But you don't always want to be at that low level, dealing with points, lines, and vertices; sometimes you want a higher-level abstraction.

You want to look at that user experience. We have Core Animation, one of the things I mentioned at the very beginning. This is a higher-level abstraction on Mac OS X and iPhone OS, and what it does is automatic animation of 2D planes in space.

It's accelerated by the GPU. You can move to this higher level abstraction, optimize your software interface without giving up that power of the GPU. In fact, Core Animation is so powerful that it is the basis for the user experience on iPhone. From cover flow to the iTunes video player. All of this user experience is driven through Core Animation utilizing the GPU on the iPhone.

And of course, you recognize this. Whether it is Aki Mahjong with its tiled, scalable user experience making it very easy to navigate a complicated game. Or RoamBi, which allows you to create custom business charts with your critical data, and take it with you, keeping it in your pocket. Both of these user interfaces, outstanding examples of using Core Animation as a higher level abstraction and creating great applications.

So it was only natural, with Core Animation created on the iPhone, that we would look at it for Mac OS X. And we brought it to Mac OS X. We've talked in the past about Time Machine and the use of Core Animation there, but Top Sites, which we showed this morning, was talked about as a great technology in Safari 4. Again, built on Core Animation.

The designers and the engineers can now concentrate on that user experience rather than having to worry about what mipmap level they have, or how big the textures are relative to the hardware's capabilities. Core Animation abstracts that for you. This morning we showed Dock Exposé. It takes your cluttered desktop and allows you to easily highlight the application you're interested in, show its windows, and pick the window with the content you're actually looking for. Again, powered by Core Animation. So instead of rewriting an entire animation system, they rewrote the way they use Core Animation, relying on those optimized fundamentals, concentrating on a great user experience, and allowing the designers and graphic artists to not worry about those low-level details.

And we haven't sat still with Core Animation. Core Animation, for Snow Leopard and for iPhone OS 3.0, has a number of new features, including particle systems and Bezier paths, and improved efficiency on the GPU. And we've brought those APIs even closer together, making them nearly identical. So the technology you learn in one place you can use in another.

The synergy there is fantastic. Core Animation, created for the iPhone, brought to Mac OS X, allows us to do the heavy lifting for you so you can concentrate on user experience. So I want to take a little change of direction here. We introduced OpenCL, the Open Computing Language, last year. The last two areas of technology that I talked about were about unlocking graphics technologies on the GPU.

OpenCL is about compute technology. It's about taking the GPU beyond graphics. That's key. The idea here is that the GPU's extreme power is not just for graphics processing. So what is OpenCL? OpenCL is a very thin runtime layer that allows you to build kernels written in OpenCL C, a C-like language based on C99 with some vector intrinsics, layered on top of an industrial-strength runtime compiler that provides automatic optimization for whatever platform you're on. We also have IEEE 754-based math. And this is critical.
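To give a flavor of the language (a generic illustration, not the Galaxies code), here is a trivial OpenCL C kernel that scales a buffer of floats, with the host calls that compile it at runtime and launch it over n work items:

    #include <OpenCL/opencl.h>  /* Apple's OpenCL framework on Snow Leopard */

    /* The kernel source: each work item handles one element. */
    static const char *kSrc =
        "__kernel void scale(__global float *data, float k) {\n"
        "    size_t i = get_global_id(0);\n"
        "    data[i] *= k;\n"
        "}\n";

    void runScale(cl_context ctx, cl_command_queue q, cl_mem buf, size_t n, float k)
    {
        /* Runtime compilation: the same source is optimized for
           whatever device backs the context. Error handling omitted. */
        cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, NULL, NULL);
        clBuildProgram(prog, 0, NULL, NULL, NULL, NULL);
        cl_kernel kern = clCreateKernel(prog, "scale", NULL);
        clSetKernelArg(kern, 0, sizeof(buf), &buf);
        clSetKernelArg(kern, 1, sizeof(k), &k);
        clEnqueueNDRangeKernel(q, kern, 1, NULL, &n, NULL, 0, NULL, NULL);
    }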

With some previous GPGPU implementations, what we saw was that as you moved from device to device or vendor to vendor, you could get differences in your calculations. One of OpenCL's primary design tenets is accurate, repeatable math, allowing you to really concentrate on your algorithm rather than on what device you're on. So, OpenCL. Obviously it runs on the GPU; it's designed for that.

It also runs on the CPU. Putting that together, most importantly, you can run your algorithms on the GPU and the CPU, unlocking the full power of that desktop system. So what are we talking about? Last year we did the Galaxies demo. The Galaxies demo is very interesting because it's both a demonstration and a real-world application.

We have processing that's done on the GPU or on the CPU. We combine those results for display, updating the data set and continuing that through every frame of the simulation. So it's a real world task, and we used this to show some real world performance last year. Let's see where we were.

An 8-core CPU: about 70 gigaflops, 70 billion floating-point operations per second. Bringing that to the GPU last year: about 200 gigaflops, 200 billion floating-point operations per second. And like I said, even more importantly, combining that together, the full system reached almost a quarter of a teraflop, a quarter of a trillion floating-point operations per second. So that's pretty darn good. We got great power out of the system. What I want to do is take a demo and show you where we've come in the last year, working with the software and working with the hardware.

So this is the Galaxies demo, the same algorithms as last year. And right now we're running on a multicore CPU; we're running on the CPU in this Mac Pro. And now we're seeing about 100 gigaflops. So it's improved since last year, unlocking the full potential of the CPU. Let's move this to the GPU and see where we end up. So now, with the GPU, we've surpassed a quarter of a teraflop.

We're at 360 gigaflops, 360 billion floating-point operations per second. But what's more interesting, what's more important, is to be able to take that GPU, a couple of GPUs, combine them with the CPU, and see what we get. A CPU and two GPUs: over a teraflop. Over a trillion floating-point operations per second of real-world, measured application performance. This is truly amazing. A teraflop of processing power underneath your desk.

[ Applause ]

So let's put those new results on our little graph here: 8-core CPU, GPU, and full system. Now we're seeing the CPU got better, the GPUs moving even farther up, almost toward 400 gigaflops, and the full system combining CPU and GPU, the same algorithm hosted on both using OpenCL, at a teraflop of processing power under your desk. So what do you use that power for? Well, I want to invite up David Gohara from Washington University School of Medicine to talk about his experience using OpenCL. David?

Thank you.

[ Applause ]

It's a pleasure to be here this afternoon. I'd like to start off with an example of a calculation we've been able to accelerate using OpenCL. What you're looking at is a typical biological molecule that we might be interested in studying. For this demo, the specifics of the calculation aren't important, but what we typically need to evaluate are the electrostatic and chemical properties of these molecules. And how those properties are important for molecular interactions, and that's what we're visualizing on screen right here.

It's important also to consider that when we do these calculations, we typically have to perform them tens to hundreds of thousands of times. So speed is important. However, we can parallelize this calculation, and when we do this on 16 threads we get about an 11x speedup, which is really quite good. But it also raises the question: can we get more out of this by running it on the GPU? And it turns out we can. So can we see that? And maybe one more time.

Pretty awesome, huh? So --

[ Applause ]

So what you can see is that by moving the calculation over to the GPU, we're certainly getting a dramatic performance increase over 16 threads; but when you compare this to the performance of the code on a single CPU, we're talking about a 175x speedup, which is phenomenal.

There are two things I'd like to point out, though, about this calculation. The first is that the code we're running on the GPU is identical to the code we're running on the CPU. It's quite literally a copy and paste. The second thing, which is more important, is that the results I'm showing you here on the GPU are numerically identical to what we get on the CPU. For us that is far more important, because while speed is nice, accuracy is the ultimate goal. And with OpenCL we seem to be able to achieve both.
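That portability follows from how OpenCL selects devices. As a minimal sketch (not the presenter's code), the only thing that changes between a CPU run and a GPU run is the device-type flag used to build the context; the kernel source is untouched:

    #include <OpenCL/opencl.h>

    /* Pass CL_DEVICE_TYPE_CPU or CL_DEVICE_TYPE_GPU; everything
       downstream, including the kernel source, stays the same. */
    cl_context makeContext(cl_device_type type)
    {
        cl_device_id dev;
        clGetDeviceIDs(NULL, type, 1, &dev, NULL);
        return clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    }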

However, when we do this calculation it's typically a part of a larger set of computations. An example of that is shown here, where we might be looking at the interactions between hundreds or thousands of molecules. And how those interactions are important, for example, in the development of drug delivery systems, which is what we're seeing here.

So in this animation, what you're looking at are the results of a series of molecular dynamics and electrostatics calculations of a drug loading into a drug delivery system. This represents about 150 nanoseconds of atomic motion, so it's fast. But calculating it required about one month of wall-clock time on a 16-CPU cluster. Now, it's important for me to point out that behind all the pretty pictures you're seeing here there's real science, real data; this is the real deal.

And so that's important to keep in mind. The last thing I'd like to point out with regard to OpenCL is that OpenCL really makes programming for the GPU far more accessible to people like myself. Prior to about January of this year, I had never programmed anything to run on a GPU.

And within about a month and a half, I had significant portions of our electrostatics code running efficiently on the GPU. And I think this speaks very highly of the underlying technology and its implementation. So with that, I'd like to turn the presentation back over to Geoff, and I thank you all for your time.

[ Applause ]

Thank you David. Absolutely amazing work. But the two examples we've shown are the hard-science examples. We have a gravitational calculation with tens of thousands of bodies all interacting with each other, an N-squared kind of problem, and then we have the electrostatics of the medical research. Is that all we can use OpenCL for? Do you have to have a hard-science problem? Absolutely not.

OpenCL can be used for any application. Identify the data-intensive, computationally intensive parts of your application, host them in OpenCL C, write a kernel that executes on the GPU, the CPU, or the entire system, and you can really unlock the power of your platform, your desktop machine.

So that's OpenCL. Last year at WWDC we introduced OpenCL, and we stated our intention of bringing it to the Khronos standards body. Well, I have good news. In December 2008, working with our industry partners, OpenCL was ratified as the open industry standard for parallel computing on the GPU. And even better, in your developer preview you have the first conformant OpenCL implementation available to you today. So we can't wait to see what you can do with it.

[ Applause ]

So I've talked to you about a number of technologies. OpenGL and OpenGL ES, on iPhone OS and Mac OS X: foundational graphics technologies allowing you to unlock the power of the graphics processing unit. Core Animation, on iPhone OS and Mac OS X, allowing you that higher-level abstraction: concentrate on the user experience, and let us do the heavy lifting. And finally, OpenCL: a new technology allowing you to unlock the computing power of your entire system and take the GPU beyond graphics. Thanks very much, and I'll turn the presentation over to Tim Bienz to talk about media.

[ Applause ]

Thanks, Geoff. QuickTime is a very successful technology. It pioneered video and media on the desktop. Throughout its history it has repeatedly pushed forward the state of the art. You use it in many of your applications, and it is central to Apple's consumer applications, professional applications, and to iTunes.

One of the really amazing things about QuickTime is that its architecture has lasted almost 20 years now. And if you think about the dramatic changes in platforms that have occurred during that time, hardware and software changes, that's almost unbelievable. We're undergoing another dramatic revolution in platforms as you've seen, with the emergence of mobile platforms, the iPhone, and as Phil mentioned this morning, the increased predominance of notebook computers.

And with that transition that's going on we see that the needs for a media system are changing. In particular, to be successful in the future we believe a media system needs to be portable. It needs to be able to be implemented on small handheld machines, as well as large multicore multi-CPU machines. It needs to be modular.

It needs to be able to implement a small, simple playback solution, perhaps for a handheld device, as well as full-blown, professional-level video support for large machines. And finally, it needs to be efficient. That means the media system needs to take advantage of whatever hardware is on the machine it is running on at that time.

Fortunately, we have experience implementing just such a media technology, that's the video playback stack we've implemented for the iPhone over the past couple years. And so what we are doing in Snow Leopard is taking all that we learned from the iPhone and bringing that back to the desktop as the beginnings of QuickTime X. QuickTime X is a modern framework for media on Mac OS X.

In Snow Leopard, QuickTime will support playback of modern media. In addition, QuickTime will support capture and export to Apple devices; that means iPhone, iPod, and Apple TV. And finally, QuickTime X in Snow Leopard will support an entirely new streaming technology. Of course, if we're building a world-class media system for the future, we need to build it on the best technologies we have on the platform.

And so we're building QuickTime X on top of Core Audio for professional low latency audio. On top of Core Video, for hardware accelerated video. And on top of Core Animation, which you heard Geoff talk about, to provide GPU accelerated compositing. Everything we're doing in QuickTime X, 64-bit native.

So you get the performance advantages and the large address space from that. We are also using ColorSync throughout QuickTime X to provide consistent and correct color for playback, capture, and export. You access QuickTime X using the same QTKit Cocoa APIs that you've been using to access QuickTime for years now. In terms of formats, we're focusing QuickTime X on modern technologies and modern formats, such as H.264 for video and AAC for audio. Let's take a minute and look at each of these.

For Snow Leopard we're implementing HE-AAC audio. This provides very high-quality audio at low data rates, data rates below 64 kilobits per second, for example. And this is really important for applications such as internet radio that need to provide the best-quality audio at very low data rates.

It does this through a technology called spectral band replication, which encodes the low and middle frequencies of the data, but does not directly encode the high frequencies. Instead, the high frequencies are reconstructed at playback time from the low and mid frequencies, plus a small amount of control data that's embedded in the file. HE-AAC support is available both on iPhone OS 3.0 and on Snow Leopard. Let's turn our attention to video and H.264.

What's new in Snow Leopard is that your application can access the hardware accelerated H.264 decoders present on most of the machines that we're currently shipping. What this means for applications is that they can support high quality HD playback of rich, immersive HD content, regardless of the CPU they're running on.

In addition, it allows your code and your applications to be more efficient, because the work of decoding the video is offloaded from the CPU to the GPU, leaving the CPU free to do other work. In Snow Leopard we're doing not only framework-level work; we're also happy to introduce a completely new, streamlined QuickTime Player. This new QuickTime Player leverages all of the technologies we're introducing for QuickTime X.

Playback, capture, export, and streaming. Now, the new QuickTime Player, which you've seen a little bit of this morning, provides a very streamlined playback experience. In addition, it allows you to record audio and video, and to record your screen. It provides simple trim editing to let you set in and out points on your clips. It allows you to convert your content for Apple devices: iPhone, iPod, Apple TV.

It allows you to convert it to formats and bitrates suitable for the web. It allows you to convert your data to HD formats such as 720p and 1080p. And finally, you can directly share your content from the new QuickTime Player to iTunes, MobileMe Gallery, and YouTube.

Let's take a quick look at the QuickTime Player. So I'm going to go through many of the same things that you saw this morning with Craig, and I'm going to take a little more time to do it. So we'll open up a piece of content and start it playing. Now, this is playing using the hardware H.264 decoder provided by QuickTime X. You've seen the HUD already this morning.

And the HUD is implemented using Core Animation. Once we're done setting up playback here, when we move the mouse out of the window, all the controls in the HUD fade away, leaving your content front and center so you can concentrate purely on the content. In addition to supporting a windowed mode, we support full screen. And in full screen we support the original aspect ratio, or you can zoom in and fill the entire area of your display if you prefer. I happen to prefer the original aspect ratio, so let's go back to that.

Let's just stop for a couple of seconds and watch this.

[ Music ]

That really looks fantastic. So you saw a little about the trim; let's talk about that again. And I should also mention the scaling in full screen is done using the GPU, so the CPU is offloaded from doing that. Let's go ahead and get ready to trim this content.

We'll select trim here. And I know there's a clip in here I want. Now I can either select the clip or get in the area visually, as you can see here with the thumbnails. I'll show you a little trick here. If you hold down the option key, instead of seeing the thumbnails you actually see the audio waveforms. And depending on what your content is, that can sometimes be useful as a cue. Here the video frames happen to work better.

So I'll go back to using those. So I know there's a segment right about here someplace -- yeah, right about there. What I want to do is put it over on my Apple TV in the living room and show it to my friends. So for demo purposes I'll pick a very short segment here. That looks about right. And we'll click the trim button to trim it. And now we'll just preview that. It's only a few seconds long.

In fact, I selected a little too short; it's only 2 seconds long. There you go. And now what I can do is take this and select Share to iTunes. What will happen when I do this is the new QuickTime Player will use QuickTime X's new export capabilities to convert this into a format for iTunes and share it directly into my iTunes library. Let's go ahead and do that. It asks what kind of device capabilities or compatibility I'd like. Well, I want to show this on my Apple TV, so I'm going to click on that one. Now let me go ahead and share.

Now it's only a couple of seconds, it should convert really quickly. And it's going -- now it's moving into my iTunes library. And there it actually is. Along with telling me I need a new version of iTunes. We'll just play this back and make sure it's the clip we picked. Sure enough, that's exactly the one I wanted. Couldn't be easier. That's the new QuickTime player.

[ Applause ]

So all of the features we've shown you in QuickTime Player are available as part of Snow Leopard. Let's change topics and talk a little bit about HTTP live streaming. We believe it is very important to support playback not only of preauthored or stored content, but of live content as well. But we want to do it right. And that places a bunch of constraints on the solution, which helped drive the technology we've chosen. In particular, we want a solution that's built on web standards.

Because we want you to be able to post your content very easily to existing servers, or deploy it easily on commercial content delivery networks such as Akamai. So we selected HTTP. We want the format to be firewall-friendly, so that your end users don't have to worry about configuring anything to get the streaming video through their firewalls, NATs, whatever they may have.

Again, HTTP works really well for this. We definitely want a technology that's suitable for use by commercial broadcasters. We'd like a whole range of clients and developers to be able to use this, and certainly commercial broadcast is one of the important cases. And so this does two things for us. It means we want a format that's directly compatible with commercial hardware encoders, such as those from Envivio or Inlet Technologies. And secondly, we want to support encryption, and we support AES encryption.

So content providers for whom protection is an issue can protect their content. Finally, we want a technology that dynamically adapts to network conditions. This is particularly important on mobile devices such as iPhone, where the network conditions can change dramatically during the course of playback. And so we provided a solution that allows you to post multiple versions of your content at different bitrates, and during playback the client dynamically selects the best one at any given time. This allows your customers to always see the best-quality video, regardless of what the network conditions are. And of course we're focusing on modern codecs: H.264 for video and AAC for audio.
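To give a feel for the format (a hand-written illustration based on the published draft, with made-up URLs, not a real stream), the client fetches a variant playlist that lists the bitrate alternatives:

    #EXTM3U
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=240000
    http://example.com/low/prog.m3u8
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=640000
    http://example.com/high/prog.m3u8

And each variant is an ordinary media playlist that simply names the segments in order:

    #EXTM3U
    #EXT-X-TARGETDURATION:10
    #EXTINF:10,
    segment0.ts
    #EXTINF:10,
    segment1.ts
    #EXT-X-ENDLIST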

Those streams are wrapped up in a broadcast-friendly stream format. We've taken everything that we're doing here and we're publishing it as an IETF draft. This allows you to either use our implementation or, if for some reason you need or want to do your own, have all the information you need to do that. Let's see how a solution would work for a commercial broadcaster. First, as I said, you need a commercial hardware encoder, such as those from Envivio or Inlet Technologies.

The output from that can be fed directly into a small segmenter application that we're shipping as part of Snow Leopard. The purpose of this application is to take the stream coming out of the encoder, chop it up into short segments, perhaps 10 or 20 seconds each, and store those as files that can then be uploaded directly to your HTTP server or to a content delivery network. That's all you have to do.

From there, software on the client, either a Snow Leopard machine or an iPhone OS 3.0 device, can directly access it and view it on either of those platforms. So that sounds great in principle, and it looks really simple, but how well does it work in practice? Well, to talk about that I have two developers that I will bring up on stage. First I'd like to bring up KC Estenson from CNN to talk about their experience in using this technology for live news broadcast.

KC?

[ Applause ]

Thanks, Tim. CNN reaches over 2 billion people worldwide every day. And I am pleased to announce today that CNN is coming to the iPhone. We are really excited about the app; as you all know, the iPhone is an incredible, incredible device. And what we set out to do, as sort of a core design principle behind our app, is to bring order to the chaos of the hundreds if not thousands of events that happen around the world every single day. As Tim mentioned, we are a commercial broadcaster, and we serve on CNN.com alone over 100 million videos every single month.

So we need something that can scale. So with our application we are going to bring live, streaming video to the iPhone, which our head of product development has up here -- what's great about this is how easy Apple made it for us to do. We spent, you know, I think [Inaudible] will show you the app in a second.

But we spent months working on the user interface of the application itself. But really when we went in to build the app it took us two weeks to actually build the application, as you can see here. And when we actually went to set up the HTTP streaming protocol it literally stood up in a matter of hours.

So you can see the clean design; being able to organize all these videos and photos and the news of the day is something we're really, really excited about. Also, with the HTTP streaming protocol and the 3.0 SDK, we're able to push live video directly from inside the app, which is something we're so thrilled about. So with this application, in the next couple of months, coming to an iPhone or an iPod touch near you will be the global power of CNN in the palm of your hand.

Thank you very much.

[ Applause ]

Thanks, KC. It's fantastic to see what you've been able to do so quickly and to hear how easy it was to set up. I very much look forward to being able to download the app from the App Store. Next, I'd like to invite up Tracy Pesin from MLB Advanced Media to talk about what MLB is doing with this technology in the area of sports broadcasting, particularly baseball.

Tracy?

Hi, I'm here with developer Jeremy Shaner to demonstrate new functionality we're incorporating into MLB.com At Bat 2009, leveraging new features in iPhone OS 3.0. The iPhone platform had already enabled us to create an unmatched experience of following a live baseball game on a mobile device. We can quickly check the state of any game in progress.

We can listen to a live audio broadcast. We can follow real-time pitch-by-pitch data, and we can watch video highlights. Now, ever since MLB.com started streaming full-length live baseball games over the Internet in 2002, we've wanted to bring the same experience to a mobile device, and iPhone OS is helping us do that. As we select a feed, the first thing the application does is use Core Location and the new MapKit API to determine and display our location, and let us know if we're in an area where we can access the content.

And here you see a live game in progress. As you know, the player uses adaptive bitrate streaming, which means MLB can just provide several versions of the content at varying bitrates, and the player automatically chooses the best one for our connection. And the data is served over HTTP, which really simplified setup at our CDN and across the whole delivery platform. The player also makes it easy to encrypt the data if needed. So we are very excited to be bringing this new experience to baseball fans, and happy that iPhone OS 3.0 is helping us do so.

Thank you.

[ Applause ]

Thanks, Tracy. It's great to see that addition to your app; I look forward to being able to download it, and I'm very happy to hear that the encryption and the dynamic bitrate switching are working well for you. So with HTTP live streaming, we think we have a great technology here. I hope the couple of demos you've seen today give you an idea of the kinds of things that are possible with this.

We're really excited to see what you're able to do with this over the next year and think it will bring a wide range of content both to Snow Leopard and to the iPhone. And finally, QuickTime X. We are thrilled to be able to ship the first version of QuickTime X in Snow Leopard. QuickTime X provides a solid modern framework for media on which both we and you can innovate in Snow Leopard and beyond, and I very much look forward to that. Thank you, and with that I'd like to turn it over to Meriko Borogove to talk about the iPhone.

[ Applause ]

Thank you Tim. I am so excited to be here talking to you about iPod touch and iPhone today. You've been hearing all day about the runaway success that this operating system and platform have been for us, and I think this is in large part due to you, your applications, and the App Store. We've been throwing around crazy numbers: 40 million iPod touch and iPhone users in the market, a billion applications downloaded to date, 50,000 applications in the App Store. These numbers are crazy. They're kind of hard for me to comprehend on a day-to-day basis.

So I like to think about what kinds of applications you're putting out. John touched on this earlier when he told you that games is far and away the most populous category in the App Store today. But when you start looking at the other categories, education, music, photography, sports, social networking, you're making a ton of applications that rely on graphics and media technology at their heart.

When you think about that, most of the applications are using our technologies in graphics and media. This is really exciting to me. Geoff talked to you about our graphics stack in iPhone OS 3.0 and on the iPhone 3GS. Tim Bienz did a great job telling you about our HTTP live streaming technologies.

I'd like to talk a little bit about the camera and imaging. I'd like to talk about our audio stack on iPhone and iPod touch. And I'm really excited to be talking about gaming. So let's start with the camera. John told you that you're making great drawing applications. The photographs and drawings that your users are making are all over mainstream print media. They're on the cover of the New Yorker, they're in newspapers; I've even seen museum exhibits of iPhone photographs.

You're making some great photography applications that process images. This is an application called Camera Kit. It emulates classic film and film-processing techniques right on your iPhone. So you can use cross processing, you can look at vignetting and push-pull processing, you can even emulate sepia-tone film right on your iPhone. This is pretty cool. Another application: Color Splash. This is really fun.

You can use the touch interface to grow your photo incredibly large, so that your finger becomes a nearly infinitely precise pointing device for outlining and coloring your photos. You wind up with things that really pop. This took me three or four minutes to create a couple of days ago on my iPhone. Finally, we have the iPhone 3GS and the new camera.

We are so proud of this camera and the software that is driving it. As you heard this morning from Phil, it takes beautiful 3-megapixel images with automatic focus. We have touch-to-focus with automatic exposure, to make sure that you're getting the correct exposure for the object you're focusing on.

Whether that be the house, or the flowers in front. We have great low-light performance and, my favorite, beautiful automatic macro, down to about 10 centimeters. The good news for you, as Phil mentioned: if you're a still-image or camera developer on the iPhone, you get the advantages of this new camera for free. You don't have to change your application at all. Your users are going to have bigger, more beautiful photographs to work with right inside your application.

As an old QuickTime person, I'm thrilled to be up here talking about video. We've been waiting a long time for this. Much as with our camera applications, we are looking forward to you making great video applications on iPhone. So what we've done is open up that API, much like our still-image API. You have access to our recording user interface; your users are going to get the same experience they do in our camera application.

They can use automatic focus and touch-to-focus up to the point where they record, at which point we lock focus for the duration of the recording. You'll get beautiful 30-frame-per-second H.264 video and high-quality AAC audio. When your user is done recording, you're given a pointer to a file that you can process, store in your application, or upload to the net as you like.

We've also updated our image picker controllers to include video. If you make no changes, you'll still have your plain still images in the camera roll when you expose it to your users. If you opt into video, you can get videos mixed in with your still images, as we do in our camera application; or, if you're a video-focused application, you can get just videos. We think this is pretty cool.

We've also given you access to our new trim user interface, so your users can edit video right on their phones, just as you can choose to use our crop interface with still images today. And that's cameras and imaging. I can't wait to see what kind of videos your users are going to be posting to the Internet over the next year or so. So let's turn an ear to audio. Sound is at the center of the user experience on iPhone.

The behavior is so intuitive it feels completely natural. And that's because the user always gets to choose where they hear their video -- their audio. I'm kind of hung up on video today -- where they hear their audio, when they hear their audio, and how loud it might be.

Now, we don't ever ask them about that, so the system can get fairly complicated. We have four discrete inputs, and we have six outputs, including one that's new in iPhone OS 3.0, which is A2DP streaming music. You're going to be able to stream music wirelessly to your favorite A2DP headphones, headsets, speakers, car kits, those sorts of things.

[ Applause ]

So on top of that you throw in your ringer-switch behavior, and contextual volumes, per application and per route. And things can get pretty complicated. The good news is that we've solved the problem, and you can too. And the key to solving that problem is something we call audio session. Audio session is absolutely your friend.

It's going to shelter your application from the environment around it, the user's whims, and new features like A2DP streaming. So what do you do; how do you take advantage of audio session? The point of audio session is for you to tell us the purpose of your audio, which allows us, in turn, to give you the correct behavior given the environment at hand.

So how do you do that? Any time you want to play or record some audio, simply open up an audio session and tell us what kind of audio you're playing. Are you a streaming-music application? Are you an immersive game? Are you a recording application like Shazam? Are you a voice-over-IP application that requires low-latency, full-duplex audio?

No problem; tell us what you're doing and we'll take care of the behavior. Last year you told us that you wanted a little more flexibility in audio session: you would like to choose whether your audio interrupts music when you play it, just like a ringtone does, or mixes in, like a key click or an SMS sound. In OS 3.0 we opened up those options inside of your friendly audio session. So we encourage you to use them. If you take nothing else away from my audio section today: call audio session, it's your friend. It will help you, I promise.
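As a minimal sketch of what that call looks like (assuming a streaming-music app; this is the C Audio Session API from Audio Toolbox, with error handling omitted):

    #include <AudioToolbox/AudioToolbox.h>

    static void onInterruption(void *userData, UInt32 state)
    {
        /* Pause on interruption begin, resume on end; body omitted. */
    }

    void startAudio(void)
    {
        AudioSessionInitialize(NULL, NULL, onInterruption, NULL);

        /* Tell the system the purpose of our audio: we're a
           music player, so use the media playback category. */
        UInt32 category = kAudioSessionCategory_MediaPlayback;
        AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                                sizeof(category), &category);

        AudioSessionSetActive(true);
    }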

So I'd like to take you on a quick tour through our audio APIs. We have four families of audio APIs; we'll run through them pretty quickly. The first is Core Audio's Audio Toolbox and Audio Units. This is a technology that we have on Mac OS X; our developers know and love it, most of the time. We use it for low-latency, professional, direct access to the audio hardware in the audio stack.

Our friends at Smule have used this to great effect. They made a great audio engine where they turn your iPhone into a wind instrument, like the Ocarina. Most recently, they put out Leaf Trombone, which not only is a wind instrument, but it's an educational instrument. And collaborative music instrument. They're sort of crazy at Smule, they've actually formed an iPhone only band called MoPhO -- with a Ph -- they have their debut performance on Wednesday in the lunch session.

I encourage you to check them out; I'll certainly be there. Now, most of the time you love the Audio Toolbox. But some of you really like Cocoa Touch, and you asked very loudly for some Cocoa Touch interfaces to our Audio Toolbox. We've given those to you. We have the AV Foundation classes; they give you playback, recording, and even a Cocoa Touch interface to audio session, lest you forget to tell us what kind of audio you're playing. Applications like Ruben & Lullaby love the low-configuration, point-and-go experience of this.

They're using AV Foundation to make sure that you get an immersive experience with this beautiful storytelling, where all of the interaction is driven entirely by music and touch. It's pretty cool. We have OpenAL. OpenAL is our 3D positional audio system. It's an open standard; we have it on Mac OS X, there are implementations on Windows, and it's been on iPhone OS since day one. OpenAL is a great companion to OpenGL.

It's great for games. It uses localized source positioning and distance filtering to give you a three-dimensional aural experience. The next time you play Zen Bound or Super Monkey Ball, I encourage you to plug some headphones in. You'll find that you're in the middle of the game, surrounded by all of the game audio. I think it's pretty cool.
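As a minimal sketch of that positioning (assuming a sound buffer has already been loaded; setup and error handling trimmed for brevity), placing a source off to the listener's left takes just a few OpenAL calls:

    #include <OpenAL/al.h>
    #include <OpenAL/alc.h>

    void playPositioned(ALuint buffer)
    {
        ALCdevice  *dev = alcOpenDevice(NULL);
        ALCcontext *ctx = alcCreateContext(dev, NULL);
        alcMakeContextCurrent(ctx);

        alListener3f(AL_POSITION, 0.0f, 0.0f, 0.0f);      /* the player */

        ALuint src;
        alGenSources(1, &src);
        alSourcei(src, AL_BUFFER, buffer);
        alSource3f(src, AL_POSITION, -2.0f, 0.0f, -1.0f); /* off to the left */
        alSourcePlay(src);  /* mixed with distance attenuation by OpenAL */
    }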

Finally, you want your users to have access to their music from directly inside your application. We've enabled this in iPhone OS 3.0; you have two ways to access your user's music. One is through a standard user interface based on our iPod application on iPod touch. The other is direct programmatic access to the music. You can access things by album, artist, playlist, or track; you can even get the album artwork if you'd like to display that from inside your application.

So that's iPod library access. To tie up the audio portion, there's this great application called Bloom, and I'd like Graeme Devine from our iPhone Game Technologies team to come up and play us a little bit of music. Graeme has some fans, huh?

[ Applause ]

So last fall, Brian Eno, the father of ambient music, got together with Peter Chilvers and put out this application. It's based entirely on OpenAL with a full 3D mixer, and it's using OpenGL to draw these dots. I'm just going to let us listen for a moment.

[ Music ]

What this application is doing is letting any of you have an interactive generative ambient music experience. It's turning all of us into musicians, and we think it's just lovely. Thanks Graeme.

[ Applause ]

So that's audio. So I am over the moon to be talking to you about gaming today.

This is really close to my heart. A year ago we gave you an SDK full of awesome goodies: 3D audio, 3D graphics, innovative input methods, gorgeous full-screen video, on world-class hardware that at its heart is portable and always connected, with full-time access to your applications. We think this is a great story for game developers. You do too. Neil Young, who you saw this morning during the keynote, tells us that one of the great things iPhone is doing is changing the relationship between the people making games and the people buying them.

And if you'll forgive me, I just want to sit and talk about this for a minute, because we're at an inflection point when it comes to the game industry. And I think this is really interesting. And it's all about the audience. There's an untapped audience that you, our developers, have found, and you've absolutely nailed it. And the game industry has been chasing after them, in my opinion unsuccessfully, for some time. I think this is really cool.

The question is, how did we do it? Well, I think it starts with the App Store. And how it starts with the App Store is: you make a great game, you send it to the App Store, and it gets posted. It goes up in the new games. I see the game, I go pick it up, it's fun.

I show it to Graeme at lunch. He thinks it's a great game too, so he buys it on the spot without getting up from the lunch table. He takes it home, he shows his daughter, his daughter buys it on the spot and takes it to school. Meanwhile, I've gone home -- you're getting the idea.

This is kind of a chain reaction. Enough people start buying your application and it makes it to the popular list. It gets to the popular list, and even more people see it. The velocity of this just takes off. It's pretty cool. If you want another view on this, you should hit the second floor. Have you seen the application wall that's down there? Yeah? There are 20,000 applications arranged by color. And every time an application is bought, it comes to the front, lights up, and wiggles.

If you stand in front of it for a few minutes, it's a little bit Blade Runner, but it's pretty amazing. Things are being bought a whole lot faster than you'd think. So this is all great. But you took it one step further. As soon as you started releasing games, you started putting up leaderboards; you implemented email systems so I can taunt my family and my friends and tell them what I'm playing and that they should try to beat me. You implemented Facebook Connect.

You're posting your scores directly from within your game to Facebook. This is kind of cool. I read the other day that every posting of one of these scores to a Facebook wall, with a link back to the App Store, gets about 140 views within a day. Pretty cool. You're also building Twitter notification systems.

So this is great. You've got a whole bunch of social interactions, so your games are spreading. There were a bunch of announcements last week by some big companies. They're really excited about the social networking aspect of gaming, and they've been implementing Facebook access and Twitter access. And I think this is profound. Because, you guys: the big guys in the industry are chasing you. It's pretty cool. So we say that there's an app for that, but I put forth that there's a game for everyone. We just have to make them.

It doesn't matter if you'd like to play a 2D game or a 3D game. If you like cute twitch games, or fast and intense car racing games. If you prefer playing with words to blocks and ropes and physics. Whether you like to master the beat or be the master of your own universe.

If you're a big guy or a little guy, there's a game for you, or there's a game to be made. John Carmack at id Software tells us that right now the iPhone is the best platform in the world for a small team to make an impact with. I absolutely agree with him. Graeme, if you'd like to come back up for a moment.

So to this end: a small development company in Barcelona called Digital Legends, who you might remember from the keynote last year, when they brought us a game called Kroll, has been working on a fantastic new game. They brought a demo by last week, and I'd like Graeme to show it to you.

So what you're seeing here is an OpenGL ES 1.1 game with OpenAL 3D positional audio. It's visually just stunning. What you're seeing here is a bunch of particle effects. If you pay attention to the smoke and the destruction, these are all particle effects written with OpenGL ES 1.1. As the cars come by, pay attention to the reflections on the cars.

Those are all done with environment texture mapping. You can see that the cars are reflecting off each other. If you look underneath the cars, the shadows are being generated in real time and reflect the environment; as they come under the bridge, watch the shadows. It's pretty neat. They've been playing with iPhone OS 3.0 features as well. Graeme, if you'd play us a tune.

[ Music ]

Fast cars, fast cars, man, pick something else.

[ Music ]

Much better.

[ Music ]

So finally, to wrap this up, I just want to talk about the speedometer you're seeing in the HUD. What that's showing you is that Digital Legends is utilizing current hardware in a great way. We're talking about 400,000 to 500,000 polys per second, which is absolutely achievable. You can get great effects with today's hardware in OpenGL ES 1.1. Thanks Graeme.

[ Applause ]

The guys at Digital Legends are really excited about iPhone OS 3.0. They found me just before this session and were looking at me going, oh my God, GL ES 2.0, we have to use it. So I look forward to seeing this game in both its 1.1 and its 2.0 incarnations. So, focusing on 3.0 for just a moment, we have a bunch of great features in 3.0 for game developers. The first one is the smart App Store.

Geoff told you a little bit about that, and really the key here is that as long as you declare the capabilities your application needs, we will always download the correct copy for your user's hardware. So please, feel free to use the new features and the new hardware; we'll take care of your users. Parental controls: Scott talked about that this morning. Some of your games are not really appropriate for my six-year-old niece; parental controls can take care of that. We have the Push Notification system.

This is going to be a great service for inviting someone to a game, for telling them that it's their move, for notifying them of something happening within the game world while they're out in real life. We have In App Purchase. As Scott was saying, you might use this for levels and asset downloads directly in your application; I'm sure you'll find some clever uses for it as well. We talked about iPod music library access, and we talked about OpenGL ES 2.0. So I'd like to talk for just a moment about Game Kit. We're pretty excited about Game Kit. Game Kit has three parts.

The first is an automatic discovery service for peer-to-peer connections. This is implemented using our desktop technology, the zero-configuration Bonjour networking technology that we brought over from Mac OS X, now implemented on top of Bluetooth for iPhone and iPod touch. The great thing about this is that your users will never see a Bluetooth pairing dialog.

Ever seen one of those, when you hook up a headset? Yeah, you're not going to have to see that when you want to play a game, which we think is pretty cool. We also wanted to give you a standard user interface for when you're inviting someone to a game. So we set that up; we call it the Peer Picker.

There's a standard UI for inviting someone, and for accepting or declining. Maybe they're a friend, maybe they're a foe, who knows; you might want to play a game. Finally, we built a voice chat engine into Game Kit. We've taken what we learned making iChat AV for Mac OS X and applied it to iPhone OS.

What this is, is connection setup; we'll pass voice data over Bluetooth or the Internet. We'll use standard servers, like Jabber servers; we interoperate well there. Finally, there's a world-class echo cancellation and mixing service inside the voice chat engine, so the voice chat audio will seamlessly integrate with your game's audio. We think this will make it really easy to implement voice chat in your games.

We look forward to trash-talking with you the next time we want to play a nice game of chess. So, Geoff kind of challenged us a little bit with his OpenGL ES demos. They were awfully pretty, and they showed you to great effect what you can do with a programmable shader. So we challenged Graeme: we gave him a team of about five people and just a few days, and they put together this game that we like to call Shock.

So the first thing to notice about this game is that everything you're seeing on screen is being computed, generated, and rendered entirely on the GPU. Leaving your CPU free for AI, for voice chat, for networking, for anything else you might want to use it for in your game.

The audio is full 3D OpenAL audio. We have blocks on the ends that are being attenuated on a per-pixel basis, depending on the state of the ball. The smoke that you're seeing is a fluid dynamics simulation that's being affected by velocity and density injected by the paddles and the ball. Again, all being computed and rendered entirely on the GPU. One of the great things about a shader is that once you have one you like, you can change its appearance pretty easily by changing parameters.

I particularly like the fire effect. Can make all kinds of cool lighting effects as well, using shaders. The thing to notice in the next two screens is that there are no geometries being drawn in the background in this game. That's entirely lightly effects. It takes about 10 lines of GLSL code to implement them.

Pretty neat, huh? Mostly, I think the game is a lot of fun.

[ Applause ]

Thanks, Graeme. So to wrap up, it's our first birthday with the SDK. I can't think of a better present than all of the applications that I have in my pocket, if I were allowed to have my phone in here while I'm up on stage. I hope that you guys like the birthday present that we've given you, the iPhone 3G S. It's the fastest, most powerful iPhone yet, and I can't wait to see what kind of applications and games you make in the next year.

That, and back to John.

[ Applause ]

Well, we talked about some of the new and important graphics and media technologies available on iPhone and Mac OS X Snow Leopard. And so hopefully this has given you some ideas that you can then use to plan your week and decide what technologies you want to go and learn more about. I think these are two really great platforms, we're really excited about the technologies. I hope you are too. So thank you for joining us, and have a great week.