Media • 55:48
Apple's Graphics and Media frameworks bring sweeping advances to developers with an incredible array of 2D, 3D, audio, and video technologies for both iPhone OS and Mac OS X. Whether you are developing a media-rich mobile application or a cutting-edge handheld game for iPhone, building the ultimate graphics application or a content production pipeline for Mac OS X, or designing an application that scales to both platforms, come to this session filled with in-depth information and captivating technology demonstrations.
Speakers: John Stauffer, Tim Bienz, Gilles Drieu
Unlisted on Apple Developer site
Transcript
This transcript was generated using Whisper and has known transcription errors. We are working on an improved version.
So thank you, and welcome to the Graphics and Media State of the Union. And thanks for staying around for the last session of the day. So past WWDCs have been all about Mac OS X, the platform. And this year is really exciting because we have a new platform: the iPhone OS platform. And many of the sessions you're going to be going to this week will be talking about both of these platforms.
And for this session, one of the themes we're going to emphasize is the importance of scalability. And a great example of scalability is display sizes. You want your graphics to look good on everything from the iPhone all the way up to this huge display behind us. So we're going to be talking about that as a theme of the technologies that we're delivering on iPhone OS and Mac OS X.
So there are other important variables: CPU differences, GPU differences. They vary in power, performance, storage size, and memory, and you have to be efficient with all of these. And with mobile users becoming more and more a part of the product line and of what we're selling to people, we want your applications to be as efficient as possible and give your users the best mobile experience. So you want to be efficient with battery.
So Apple has a wealth of technologies. And today, we're going to focus in on a few key graphics and media technologies. And importantly, we're going to talk about how Apple has taken these technologies and scaled them across the iPhone OS and the Mac OS. Many of them are common, and then there's a few that are going to be different. And we're going to talk about those and focus on these key technologies. So we're going to talk about graphics, we're going to talk about media, and then we're going to talk about getting access to that great graphics and media technology through the web.
So, let's start with graphics. At the heart of graphics is the GPU. And GPUs have been getting incredibly fast over the years. As many of you saw in Bertrand's session, this chart, GPUs have been growing in power and performance, and they're just an incredible asset to use for graphics and media. But an interesting trend that has been happening along that time has been their features.
Back in 2000, 2002, GPUs were largely fixed function devices. And the way I look at that is that they were largely designed for a specific API, meaning they provided the features necessary to accelerate and rasterize for a particular API. And then came along shaders, the ability to write programs and to upload them onto the GPU.
And this enabled a whole new array of possibilities of what you could do. And we started seeing all kinds of new technologies and new imaging applications, what you could accelerate. But recently, GPUs started taking on some new features. And we look at those as compute features, features that started making the GPU look more and more like the CPU.
So what do we mean by compute features? The first thing is precision. IEEE floating-point mathematics, the ability to accelerate high-precision, IEEE-compliant mathematics on the GPU. And this is incredibly important for many types of algorithms you're going to want to run if you're going to target it for the GPU. The second being flow control. And what I mean by flow control is the ability to manage tasks on the GPU to be able to control their execution.
And third, memory access: having a better, more flexible ability to read and write to memory. And together, these three groups of features are what we call the compute capabilities that GPUs are taking on. And with that, as you've seen, we introduced a new technology called OpenCL. OpenCL stands for Open Computing Language. So, let's talk a little bit about that.
So OpenCL is designed to leverage the compute features of the GPU. But it's not just for the GPU. It's for the CPU and the GPU. It's optimized for both. It has an easy programming model. That was really, really important when we designed it. We wanted it to be as simple as possible. We made it integrated with OpenGL such that they played well together. They complement each other.
So let's talk about how we have made it leverage the compute features of the GPU. Traditional shader languages allowed you to read from multiple locations and then write to a single location. With OpenCL, the language allows you to read from multiple locations and write to multiple locations. And this is what I meant by more flexible memory access. This is what will give you greater flexibility in what kinds of problems and algorithms you can accelerate on the GPU. This is a key feature.
Another feature is what we call atomic operations. And what that means is the ability to increment or decrement a value in memory in a way that guarantees you get the right result. Because you have many tasks running, many work queues trying to access that memory, you need to be able to synchronize changes to memory and guarantee that the increment or decrement of that value is going to be correct. So this is important to a whole series of algorithms, and we've found, as we write OpenCL algorithms, that we use this quite a bit.
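To make that concrete, here is a minimal sketch of what an atomic increment might look like in OpenCL C; the kernel, buffer names, and the histogram use case are illustrative, not from the session, and the built-in atom_inc comes from the base-atomics extension (later OpenCL versions spell it atomic_inc).

```c
/* Hypothetical OpenCL C kernel: build a histogram with atomic increments.
 * Many work-items may hit the same bin at once; atom_inc guarantees the
 * read-modify-write is not lost. */
#pragma OPENCL EXTENSION cl_khr_global_int32_base_atomics : enable

__kernel void histogram(__global const uchar *values,
                        __global int *bins)
{
    size_t i = get_global_id(0);     /* one work-item per input value   */
    atom_inc(&bins[values[i]]);      /* synchronized increment of a bin */
}
```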
So, flow control. I mentioned that a little bit. We offer a feature in OpenCL, the language, called barriers. And what barriers mean is that all your tasks will proceed to a certain point and stop. They will all synchronize, like cars coming up to a stoplight. Once they all get there, then they'll be allowed to continue. So this gives you some control over the execution of the tasks that are running on the GPU. So let's talk about how we optimize it for the CPU and GPU.
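Before we get to that, a quick sketch of the stoplight behavior in kernel code; the names and the work-group reduction are invented for illustration, but the barrier call is the standard OpenCL built-in.

```c
/* Hypothetical kernel: each work-item stages a value in local memory, then
 * every work-item in the group waits at the barrier before work-item 0
 * reads the staged values back out. */
__kernel void group_sum(__global const float *input,
                        __global float *partial_sums,
                        __local  float *scratch)
{
    size_t lid = get_local_id(0);
    size_t gid = get_global_id(0);

    scratch[lid] = input[gid];        /* everyone writes its slot        */
    barrier(CLK_LOCAL_MEM_FENCE);     /* the stoplight: all stop here    */

    if (lid == 0) {                   /* only after everyone has arrived */
        float sum = 0.0f;
        for (size_t i = 0; i < get_local_size(0); i++)
            sum += scratch[i];
        partial_sums[get_group_id(0)] = sum;
    }
}
```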
First, I want to look a little bit at kind of what, how we integrate this into your application. So the first thing you have to do when you integrate this into your application, you have to find the compute-intensive chunk of code in your application. And you take that and you write it into an OpenCL kernel.
You take that along with the data, and you send it to the OpenCL framework. And the OpenCL framework will take that kernel, runtime compile it, and then send that data down to the compute device with the kernel that has been runtime compiled. Now, the compute device can be CPUs and GPUs.
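In code, that hand-off might look roughly like the following host-side sketch using the standard OpenCL C API; error handling is omitted, the kernel name and data are placeholders, and it assumes a context with a single device.

```c
/* Sketch: hand OpenCL a kernel as source text, let it runtime-compile for
 * the compute device we picked, ship the data down, and enqueue the work. */
#include <OpenCL/opencl.h>

void run_kernel(const char *kernel_source, float *data, size_t n)
{
    cl_int err;

    /* Ask for a GPU context; CL_DEVICE_TYPE_CPU would target the CPU instead. */
    cl_context ctx = clCreateContextFromType(NULL, CL_DEVICE_TYPE_GPU,
                                             NULL, NULL, &err);
    cl_device_id dev;   /* assumes a single device in the context */
    clGetContextInfo(ctx, CL_CONTEXT_DEVICES, sizeof(dev), &dev, NULL);
    cl_command_queue queue = clCreateCommandQueue(ctx, dev, 0, &err);

    /* Runtime-compile the kernel for this device. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kernel_source, NULL, &err);
    clBuildProgram(prog, 0, NULL, NULL, NULL, NULL);
    cl_kernel kern = clCreateKernel(prog, "my_kernel", &err);

    /* Ship the data down with the kernel and run one work-item per element. */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                n * sizeof(float), data, &err);
    clSetKernelArg(kern, 0, sizeof(buf), &buf);
    clEnqueueNDRangeKernel(queue, kern, 1, NULL, &n, NULL, 0, NULL, NULL);

    /* Read the results back and wait for completion. */
    clEnqueueReadBuffer(queue, buf, CL_TRUE, 0, n * sizeof(float), data,
                        0, NULL, NULL);
    clFinish(queue);
}
```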
Now, a compute device can actually be quite complex. It can have a hierarchy of memory. It can literally have hundreds of compute units. But OpenCL gives you a programming model for talking to this hardware that makes it easy. It abstracts that complexity away so that you can get at that computational horsepower. So, let's look at a demo. Now, this demo is one that you may have seen in Bertrand's session, but I thought it was so good, I had to show it again for those who may not have seen it.
Now, what this demo shows is taking a complex problem like a gravitational simulation where you have 16,000 stars, each one feeling the gravitational effects of the other. So it's an N-squared problem. And what we're doing right now is we have taken that code and we've just compiled it like you normally would, and we're running it on the CPU. So a single core, and we're running it, and it's getting two gigaflops of compute power.
So gigaflops stands for billions of floating-point operations per second. Now, what we do, as I just showed in the diagram, is we take that code and we move it to OpenCL as a kernel, and OpenCL runtime compiles it and optimizes it for the compute device that you're targeting.
Now, the kernel language in OpenCL targets the vector instructions of the CPU. And by doing that, we've gone from 2 gigaflops to 11 gigaflops. Now, OpenCL is built on top of Grand Central Dispatch. And what that allows us to do is very easily tell OpenCL to run this across all the CPU cores. So by doing that, now we've gone from 2 gigaflops to 67. So now we're running across all eight cores of the system, and we're running with the SSE instruction set native to these CPUs.
So OpenCL runs across CPUs and GPUs. I can take that same piece of code now and have it execute on the GPU. And by doing that, we go to 105 gigaflops. So that's a pretty incredible performance boost. And further, what we can do is run across CPUs and GPUs at the same time. So I can have it run across both.
And that takes us up to about 240 gigaflops. And remember, we started off at 2 gigaflops. Now we're up at 240. And we did this without writing any assembly, while still writing C code. So here's the demo I've shown, which is simulating the collision between the Andromeda and Milky Way galaxies. This is actually real data that we got. And this is simulating Earth, the blue dot. And I'll let it run a little longer so you can see that things don't go well for Earth.
So let's go back to slides. Okay, so easy programming model. What do I mean by that? So the kernel language in OpenCL is just C, based off of C99, plain vanilla C, so that's really easy. I mentioned it's runtime compiled and optimized. We have real compiler technology underneath OpenCL that gives you optimal code, whether you're targeting CPUs or GPUs.
So we took C99 and we added some data types. We added vectors for getting access to the vector instructions of the compute device, whether it be a CPU or GPU. We added image data types, making it easy to use for imaging or graphics. And we added address qualifiers, giving you some access, some control over the memory hierarchy of a device.
We also provide a full math lib, so the same math functions you're used to when you're programming C normally. The same IEEE compliant math.h. So let's look at what a compute kernel looks like. It just looks like C. I've highlighted the things that are not traditional C in orange. Let me zoom in a little bit so we can get a look at that. For the most part, it just looks like C.
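The actual kernel on that slide isn't captured in the transcript, but a representative sketch (with invented names) shows the non-C additions he's describing: the __kernel and __global qualifiers, the float4 vector type, and the built-in work-item functions.

```c
/* Representative sketch of a compute kernel, not the one from the slide.
 * Everything is plain C99 except the additions:
 *   __kernel / __global  - entry-point and address-space qualifiers
 *   float4               - vector type mapped to the device's vector units
 *   get_global_id()      - built-in that identifies this work-item        */
__kernel void scale_positions(__global const float4 *in,
                              __global float4 *out,
                              float scale)
{
    size_t i = get_global_id(0);
    out[i] = in[i] * scale;        /* vector math, IEEE floating point */
}
```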
And this is basically what I was just showing you in the demo. Okay, so as I said, it's fully integrated with OpenGL. And let's look at that a little bit closer. So traditionally, OpenGL takes two types of inputs. It takes textures and it takes geometry. So your application traditionally will be taking static textures and static geometry and passing it to the OpenGL framework, and then OpenGL will be using that to render using the GPU.
Now, what OpenCL allows you to do is you can take that and have a compute kernel with your data, and you can use that to feed OpenCL and have OpenCL dynamically generate texture and vertex data. So you can have a scene that is dynamic and interactive by using OpenCL as the front end for OpenGL, and it can all be accelerated either on CPU or GPU. So I'm gonna show you a demo of doing that.
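In API terms, that front-end arrangement might look roughly like this sketch: a GL vertex buffer is shared with OpenCL, a kernel updates the vertices, and OpenGL draws the result. It assumes a CL context and queue already created to share with the current GL context, and the kernel and buffer names are hypothetical.

```c
/* Hypothetical CL/GL sharing sketch: wrap a GL vertex buffer object as a CL
 * memory object, animate the vertices with a kernel each frame, then have
 * OpenGL draw from the same buffer. */
#include <OpenCL/opencl.h>
#include <OpenGL/gl.h>

cl_mem wrap_vbo(cl_context ctx, GLuint vbo)
{
    cl_int err;
    return clCreateFromGLBuffer(ctx, CL_MEM_READ_WRITE, vbo, &err);
}

void animate_and_draw(cl_command_queue queue, cl_kernel kern,
                      cl_mem shared_vbo, size_t vertex_count, GLuint vbo)
{
    /* Let OpenCL take ownership, update the vertices, then hand them back. */
    clEnqueueAcquireGLObjects(queue, 1, &shared_vbo, 0, NULL, NULL);
    clSetKernelArg(kern, 0, sizeof(shared_vbo), &shared_vbo);
    clEnqueueNDRangeKernel(queue, kern, 1, NULL, &vertex_count, NULL,
                           0, NULL, NULL);
    clEnqueueReleaseGLObjects(queue, 1, &shared_vbo, 0, NULL, NULL);
    clFinish(queue);

    /* OpenGL now draws the freshly computed geometry. */
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glVertexPointer(4, GL_FLOAT, 0, 0);
    glEnableClientState(GL_VERTEX_ARRAY);
    glDrawArrays(GL_POINTS, 0, (GLsizei)vertex_count);
}
```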
So, what I have here is I have a bunny, and he's sitting in a nicely rendered room. And traditionally, OpenGL has been good at rendering, right? So this looks like something you could do with OpenGL. But what is really interesting about this is that I'm using OpenCL to generate this fluid-like looking bunny.
And I can actually prove that he's fluid because I can melt him into a puddle on the floor. So what we're doing is we're actually using OpenCL to simulate fluid and make it interactive, make it an interactive part of this environment. So what I can do now is I can actually interact with this fluid.
So this is a fluid simulation running on the GPU. So that bunny was actually a liquid that we held in a little container, and we released the container, and he flowed into this puddle. So this shows you how you can use OpenCL to simulate real physical effects and accelerate all of that on the GPU. So let's go back to slides.
So OpenCL runs on all Snow Leopard platforms on the CPU. And it runs GPU-accelerated on select MacBook Pro, iMac, and Mac Pro models. So we talked a little bit about the details of OpenCL. What else can it do? Well, just to go over some things, it's really good for graphics and imaging.
It's good for scientific visualization, physics, you name it. Really, any data-intensive algorithm, you can use OpenCL to accelerate and to optimize. So we think it's going to be really good for getting access to all that computing horsepower on the system. Now, OpenCL is an Apple-developed technology and Apple worked in conjunction with industry leaders to define the OpenCL specification.
And Apple is taking that specification as a proposed open standard to Khronos. So we think that making OpenCL an industry standard is going to be great. We think it's going to be fantastic for you, developers, to be able to access this across all the platforms.
So that's OpenCL. It leverages the GPU, it's optimized for the CPU and GPU, it has an easy programming model so you can make use of it without a lot of work, and it's fully integrated with OpenGL. And it's new in Snow Leopard, on the Snow Leopard seed disc that you can get.
[Transcript missing]
So Core Animation is at the heart of the iPhone user experience. It is what drives the iPhone's interactivity. The UI on the iPhone is built directly on top of the Core Animation that is on the iPhone. So let's look at some examples. So Cover Flow. Cover Flow is built with Core Animation. It's a good example of an immersive, interactive user experience. You can use it for heads-up displays over media and other content. So what I want to show is a demo that we showed a couple of years ago at WWDC, when we introduced Core Animation for Leopard.
So what we have here is a Core Animation application where every tile is a Core Animation layer. And these layers can be animated, and animated in 3D space. The animation is being automatically controlled by Core Animation. It's what makes these things move smoothly through space. You just set the beginning and end states, and it animates through all the positions needed to get the smooth animation.
So one of the things that we are showing here is really pushing the limits of Core Animation. This is over 300 layers. And this is very similar to the demo we showed two years ago. And what we want to show here is that all the power that we've been giving you through Core Animation on the desktop is available on the phone. So you can make great animations.
So Core Animation is also on the desktop, on Mac OS X. And new for Snow Leopard, we've added a particle systems engine, transform layers, and gradient layers, so there are some new ways to create new layer behaviors. So that's Core Animation, and we think Core Animation is a really key technology for building immersive user experiences. So let's talk about OpenGL. OpenGL was traditionally a desktop technology. And recently, a smaller version of it has come into existence called OpenGL ES, and that's what we have on the iPhone.
So let's talk a little bit more about OpenGL ES. So OpenGL ES is a scaled version of its big brother. It's a simplified API. But it's an industry standard, just like the desktop version of OpenGL. And on the iPhone, we deliver OpenGL ES 1.1. It's GPU accelerated with a lightning-fast GPU, as you saw in the games in the keynote. This thing is going to be great for building really great graphics.
So Core Animation provides the layers that OpenGL ES renders into. What that means is that when you render with OpenGL ES into a Core Animation layer, you can do all the great things that we were showing with Core Animation with the content OpenGL ES rendered into that layer. So it's integrated into the heart of the display system on the iPhone.
So let's do a little comparison between OpenGL ES on iPhone and OpenGL on the desktop. First, we have all the great optimized paths of OpenGL, vertex arrays, multi-texturing. The simplified API part of that is that we've removed the APIs that are less efficient, like immediate mode paths for passing geometry into OpenGL ES.
It has depth buffers, stencil buffers, and multi-sampling, the same types of buffers you're used to on the desktop. It does not have shaders. It's a fixed function API. So let's look at a demo of this in action. So this is a demo that some of the engineers at Apple wrote to kind of test and to really push the limits of the iPhone. This is a little game, which I'm traditionally pretty bad at.
Okay, so OpenGL ES is really great on the phone. It's going to enable all kinds of great immersive high-performance graphics. So, let's talk about optimizing for the iPhone. Memory is going to be limited. On the iPhone, you have 24 megabytes of memory for use for textures and for surfaces. To help optimize for that, we've enabled compressed textures. We highly encourage you to compress all your texture data. That'll give you a large savings in how much texture space you're using with your content.
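As a hedged example of that, uploading PVRTC-compressed texture data through the GL_IMG_texture_compression_pvrtc extension looks roughly like this; it assumes the data was compressed offline (for instance with the texturetool that ships with the SDK) into a power-of-two, square image at least 8x8, and error checking is omitted.

```c
/* Sketch: upload 4-bit-per-pixel PVRTC data that was compressed offline. */
#include <OpenGLES/ES1/gl.h>
#include <OpenGLES/ES1/glext.h>

void upload_pvrtc4(const void *data, GLsizei width, GLsizei height)
{
    /* 4 bits per pixel, so the payload is width * height / 2 bytes. */
    GLsizei image_size = width * height / 2;

    glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                           GL_COMPRESSED_RGB_PVRTC_4BPPV1_IMG,
                           width, height, 0, image_size, data);
}
```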
And OpenGL ES is enabled in the iPhone Simulator, letting you write and test all the same OpenGL ES code on the simulator before you put it on the device to see how it really performs. Now, when you put it on the iPhone, we have Instruments. Instruments gives you real-time tools for profiling and optimizing for iPhone. They're the same types of tools that you're used to on the desktop, such as the OpenGL Driver Monitor. The same types of great tools are built right into Instruments. So that's OpenGL and OpenGL ES.
Now let's talk about color. So Apple has been a leader in color management and color technology. And I want to talk a little bit about what we're doing to make that even better on Mac OS X. So, display gamma. When Mac OS was first introduced, a display gamma of 1.8 was chosen. Since then, the web has come along and chosen a default display gamma of 2.2, and media has chosen a gamma of 2.2. So we've decided that it's time to switch Mac OS X to gamma 2.2.
So we're going to be doing that in Snow Leopard. And in fact, in the seed you have, we've already switched it. So now the question is, what do you do? What does that mean for you? Well, with some caveats, it means nothing; you don't have to do anything if you're already using modern APIs and your application is tagging your media and graphics. If you're not doing that, we encourage you to move to modern APIs and tag all your content, because we want Mac OS X to be a color-managed system from top to bottom.
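For instance, tagging a bitmap you generate yourself is a one-liner with Core Graphics; this sketch (not from the session) assumes you already have an untagged CGImageRef in hand and simply attaches an explicit color space so the system can color-manage it.

```c
/* Sketch: attach an explicit color space to an otherwise untagged image so
 * ColorSync can manage it correctly after the gamma change. */
#include <ApplicationServices/ApplicationServices.h>

CGImageRef tag_with_srgb(CGImageRef untagged)
{
    CGColorSpaceRef srgb = CGColorSpaceCreateWithName(kCGColorSpaceSRGB);
    CGImageRef tagged = CGImageCreateCopyWithColorSpace(untagged, srgb);
    CGColorSpaceRelease(srgb);
    return tagged;   /* caller releases with CGImageRelease */
}
```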
So now let's talk about what's new for Snow Leopard. What are we adding to our color management? First, we're adding the ability to tag windows with a ColorSync profile. Second, we're adding 64-bit and 128-bit window depths, so you'll be able to get 64 and 128 bits per pixel into a window now. And what this means is we're driving color management and deeper pixels deeper into Mac OS X, making it an even better platform for color.
So that's ColorSync. So we've talked about OpenCL, we've talked about Core Animation, we've talked about OpenGL, and we've talked about color. And that's the graphics that we want to talk about today. So now let's talk about media. And to do that, I'm going to invite Tim Bienz up to cover media.
Hi, John. Good afternoon, everyone. I'm Tim. As you heard John talk about iPhone and desktop Mac OS X and scaling technologies between them, most of the scaling you heard about was bringing technologies from the desktop down to iPhone. However, it's also possible for technologies to scale the other way, from iPhone back to Mac OS X. We'd like to take a look today at QuickTime and Core Audio, which provide examples of scaling in both directions.
Let's start with QuickTime. To do that, let's go back two years to the graphics and media state of the union here at WWDC. At that time, we told you the following. Look at the first point on the slide here: using our high-level Cocoa APIs is the best way to ensure compatibility with the future. Secondly, we were undertaking a bunch of work in our low-level engine to modernize it. And although we couldn't tell you at the time what the reason was, there was a reason, and the reason was iPhone.
The desktop code base we had was not really suitable for scaling down to iPhone, and so we had to undertake development of a new code base for use on iPhone. In doing so, we had three guiding principles. First of all, whatever we did needed to be portable. We knew it had to run on the iPhone processor. In addition, we knew it had to run on the Intel processors in the desktop. Secondly, whatever we did had to be modular. We knew we needed a system where we could pick and choose pieces and features to deploy on individual products.
We couldn't have a system where we're sort of all or nothing. Third, whatever we did had to be efficient and performant. And this means a lot of different things in a lot of different contexts. But for battery-powered systems, which as you know are becoming more and more important, one of the extremely important attributes of efficiency is battery lifetime.
So, we implemented based on these three principles, and hopefully you've all had a chance to enjoy the really great video quality we have on iPhone, be it displaying content from the web, or from the iTunes Store, or from any other source. We're really happy with the way the video work on the iPhone turned out. As you heard Bertrand talking about earlier today, we're bringing what we learned from that work back to the desktop in Snow Leopard as the beginning of QuickTime X.
Now, QuickTime X is a really exciting and important announcement for us here at WWDC this year. It represents a significant ongoing technology investment by Apple to provide a modern foundation for media. Let's take a look at three areas of focus for QuickTime X. First of all, seamless Mac OS X integration. Secondly, a focus on modern codecs. And third, high-level APIs.
Let's begin by looking at seamless Mac OS X integration. Several examples of this: first, 64-bit. The new code base is fully 64-bit native. Secondly, ColorSync. Whatever we do in QuickTime X is fully color managed throughout by ColorSync. Third, Core Animation. You heard John talking about the importance of Core Animation on the desktop and the centrality of Core Animation on the iPhone. We've recently done a lot of work in the rendering pipeline of QTKit, our Objective-C layer, to better integrate with Core Animation. And in fact, you can now easily do some things that used to be really hard to do. So, let's talk about another side of integration.
Sometimes for integration, it's better just to let other parts of the system do what they do best. The first example is still image support. Mac OS X has a robust framework for doing still image support, and that's Image I/O. So rather than doing image support in QuickTime X, you should use Image I/O for your still image support. Secondly, let's talk about interactivity.
Apple has been working over the past years on web technologies for interactivity. And as you heard in last year's graphics and media keynote, and as you'll hear a little bit more in the state of the union following mine, you can really use web technologies to do some really great interactivity. And so we believe that interactivity does not belong within the media system. Instead, you should be using web technologies such as HTML and JavaScript to do interactivity.
Let's talk about modern codecs. As you know, Apple is in the business of selling content. This has three interesting implications. First of all, it means that we really need high-quality video and audio codecs in order to provide the kind of content that we need to sell.
[Transcript missing]
Secondly, it means that we work with our content providers at movie studios and music labels, and that has the effect that we get a lot of feedback from them as to the quality of the content, so we are able to improve the quality of the codecs significantly from that. In addition, we receive an awfully broad range of content from our content providers, and that also feeds back into improving the quality of our codecs. Third, it means that we have an interest in having codecs that can last for many years.
Neither we nor you are interested in having to re-encode your content periodically just to update to the latest standards. Another thing that's true, as you know, is that Apple has a very broad hardware product line, all the way from iPods and iPhones up to high-end desktop machines. We need audio and video codecs that can span this entire range of products.
The answer to this is standards, and standards are central to media. In particular, there are a few standards that are of critical importance to us. First is the MPEG-4 file format, which is based on the QuickTime file format. Second is the AAC audio codec. Third, the H.264 video codec. In addition, there are other standards, such as closed captions, that are quite important to us.
Let's take a look at audio for a moment. AAC does not, as many people seem to believe, stand for Apple Audio Codec. It really stands for Advanced Audio Coding, and it is the premier audio codec in the MPEG-4 standards. It's a variable bit rate codec, which means it is very efficient and only uses the minimum number of bits necessary to encode the complexity of the local region of audio you're encoding.
There are many uses for AAC, all the way from iTunes Plus, which is the approximately 256 kilobit per second audio we sell through the iTunes Store. We've had really positive feedback on the quality of the content in iTunes Plus. In addition, there's a low delay mode for AAC, which is very useful for real-time communications. Another example is multi-channel audio. AAC supports multi-channel audio to allow you to do surround sound.
On the video side, H.264 goes by many names. In addition to H.264, it's also known as MPEG-4 Part 10, AVC, and JVT. The reason for this is primarily historical. The initial work on H.264 started in a number of places, none of which seem to have been able to afford to buy a vowel from what I can tell, and was later unified to produce the final standard.
H.264 is an extremely robust and rich standard. And to understand what this means, it's useful to go back and look at a somewhat earlier standard that was less rich but still a moderately rich standard, and that's MPEG-2. If you look at what happened over the history of MPEG-2, as implementations took more and more advantage of the full breadth of the standard, it was possible to encode video of the same quality at lower and lower bit rates as time went on. We've seen the same thing happening with H.264 already. We expect it to continue happening, and I expect it to happen to a degree even more than it did in MPEG-2 because of the extreme richness of the H.264 standard.
In addition, H.264 can be used across a wide range of display sizes, all the way from small QCIF 176x144 sizes appropriate for EDGE networks on mobile devices, all the way up to full HD sizes useful on the desktop. With the introduction today of the iPhone 3G and the already existing EDGE support, it's useful to focus on this and just take a look at a couple of pieces of content and what they look like on the phone, so let's go do that. So as you can see, H.264 is usable down to very low bit rates. However, if you're able to allocate additional bits, for example for a 3G network, you'll be able to get significantly better quality. So, let's talk about high-level APIs.
And for us, primarily, this means Cocoa APIs, and specifically QTKit. What we're doing in Snow Leopard is having QTKit actually bridge across QuickTime X and QuickTime 7. Bertrand talked about this a little in his talk earlier this afternoon. What this means for you is that you can use the same APIs you're already using for playing back content with QuickTime 7 to take advantage of the additional efficiencies, color correction, and other modern features we're introducing in QuickTime X. In order to do this, you do need to opt into the QuickTime X code path. It's actually really easy to do that in your application.
We strongly encourage you to do that so you can take advantage of all the modern features we're adding in QuickTime X, as well as the ongoing tuning we're going to be doing in that code path. Along with our focus on Objective-C APIs, we are deprecating QTJava in Snow Leopard.
So, QuickTime X: three areas of focus, seamless Mac OS X integration, modern codecs, and a focus on high-level APIs. We're shipping the beginnings of QuickTime X in Snow Leopard. That includes support for playback of H.264 and AAC content. Why playback? Because it's what most of your applications and most of our customers do. Why H.264 and AAC? Because those are the codecs that play back across the entire range of products we have, and they're what we're focusing on. Let's turn our attention to audio for a moment: Core Audio. Let's again look at iPhone.
iPhone has an extremely complex audio environment. In addition to a number of audio sources on iPhone, there are a number of places, headphones, Bluetooth, and so on, where content can be played out. We needed technologies that could support the very complex audio environment on iPhone. It turns out we already had them on the desktop. Core Audio and OpenAL were just what we needed to support the iPhone. Let's take a look.
On Mac OS X, on the desktop, we support OpenAL, a technology you heard about earlier in Bertrand's talk for rendering sound in three dimensions; it shares a coordinate space with OpenGL. In addition, we have system sounds, which support key clicks and very simple sounds. Audio File and Audio Queue are very useful for doing audio playback and recording. And finally, audio units, which allow you to get at low-level things such as mixing and other attributes. On the phone, we've brought all of these things across.
In addition, we've implemented a new API, Audio Session, used for managing the complex audio environment on the phone. And this allows you, in your applications, to provide the same kind of seamless audio experience that we provide in our own applications. One feature I should mention that's shared between the phone and desktop, because it's been very much requested, is the phone does support the ability to do full duplex audio, both inputting and outputting audio simultaneously, which is really useful for real-time communications.
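To make the Audio Session idea concrete, a minimal setup might look like the following C sketch; the category choice and the empty interruption listener are illustrative, and error handling is omitted.

```c
/* Sketch: tell the system how this app intends to use audio, then activate
 * the session. */
#include <AudioToolbox/AudioToolbox.h>

static void interruption_listener(void *client_data, UInt32 interruption_state)
{
    /* Pause or resume playback here when a phone call comes in or ends. */
}

void configure_audio_session(void)
{
    AudioSessionInitialize(NULL, NULL, interruption_listener, NULL);

    /* Media playback category: behave like a media app for routing and mixing. */
    UInt32 category = kAudioSessionCategory_MediaPlayback;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                            sizeof(category), &category);

    AudioSessionSetActive(true);
}
```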
So that's Core Audio. Putting everything together, as you just heard, the way we've scaled from the desktop to iPhone on the audio side is to bring the Core Audio and OpenAL technologies down to iPhone, so that you can leverage the knowledge you already have on the desktop in implementing great applications on the iPhone.
QuickTime X is a really important and extremely exciting announcement for us here. QuickTime X represents a significant ongoing investment from us to implement a modern framework for media. With that, I'd like to turn it over to Gilles Drieu, who will be talking about how to use graphics and media on the web. Gilles? Thank you.
Thank you, Tim. Good afternoon, everyone. All right. So you just heard from Tim and John about our latest and greatest graphics and media technologies. I'm here today to tell you why they are so essential on the web. Today, we'll see how web standards help shape graphics and media, and how these same graphics and media can be leveraged on iPhone. First, web standards. Over time, graphics content has become as commonly used as text on the web. But really, today, it is not just about standalone graphics anymore. It is about integrating them into rich and dynamic experiences.
And that means defining a rich presentation for your content. And you, as web designers and developers, are very familiar with that concept. With CSS, you know how to style your HTML web pages. CSS stands for Cascading Style Sheets, and CSS was standardized over 10 years ago by the W3C. It has become a very powerful tool, and it's supported in all major browsers. But we think that more can be done. Apple has been working with the W3C on a brand new set of rich, compelling features for graphics: CSS Visual Effects.
CSS Visual Effects, such as the ones you can see on the screen here, have been designed to take full advantage of our graphics stacks on modern platforms. Now, you can polish your design with masks or gradients or reflections, and you can add life to your content with transforms, transitions, or animations.
And I'll show you what it really can do for you in a demo in a couple of minutes. Now, that's about graphics as standards. What about media content? As you all know, audio and video are everywhere on the web today. And traditionally, we've been using plug-ins to handle that type of media. But a natural evolution would be to integrate our media within these rich experiences in a very seamless way, as seamlessly as you integrate images, for instance. Well, for images, we have an image tag.
So we believe it is time to make audio and video first-class citizens in your web pages. How about an audio and a video tag? Apple has been very actively working with the W3C and the WHATWG working group on a new set of media elements for audio and video. And it is now part of the HTML5 specification draft for you.
OK, so audio and video tags. Well, they are page elements. As such, they are styleable with CSS and scriptable with JavaScript, like any other element. They work beautifully with H.264 and AAC, our state-of-the-art video and audio codecs that Tim mentioned earlier, but they also offer a very elegant fallback mechanism.
In the current shipping version of Safari 3.1, we implemented the audio and video tag on top of the QuickTime technology. We also implemented the CSS visual effects and specifically transforms and transitions. Because Safari is built on top of our graphics and media stacks, respectively Quartz and QuickTime, you get the absolutely best performance and quality. That was about web standards and graphics and media. Now you know what you can do with them. Now we're going to talk about some of the innovations we've been working on on iPhone.
As you know, less than a year ago, we introduced iPhone, and with it, a full-fledged web experience. And as you heard, it's been an incredible success. And you guys have been awesome. Like really, in less than a year, you implemented over 1,700 web applications for iPhone, for Safari on iPhone.
and your applications are awesome too. Like, they are dynamic and interactive and they take advantage of the full large touchscreen. We love them and we believe we can help you do even better with more access to graphics and media features. OK, so as you heard from John and team, our graphics and media features are scalable and at the core of iPhone. So we thought about it, and we decided to integrate Safari with the same graphics and media stacks that are currently used and leveraged by native applications.
First, we implemented CSS Visual Effects. And specifically, we focused on transforms, transitions, and animations, because we know that these are the ones you need to build your next-generation rich user interfaces. And as you heard in a previous session, in Dashcode you can very easily leverage the same technologies in the tool.
As you heard from John, though, we also have a beautiful compositing system on iPhone, and it's called Core Animation. We took that same technology, Core Animation, and we built it right inside Safari on iPhone. That gives us access to the full power of GPU acceleration, which opens the door for 3D transforms, for instance, which you can control through the same CSS transforms we talked about earlier. You still have 2D transforms, which are also GPU accelerated, and now you also have 3D transforms.
So let me show you what it really means for you. What you see on the screen is an example of an animated scene of falling leaves. And this has been implemented with traditional JavaScript techniques. And as you can see, it looks pretty good. Now, side by side. On the right side, you see the same animated scene leveraging the GPU acceleration and CSS visual effects.
As you can see, there's a pretty substantial performance improvement here. As a matter of fact, almost ten times faster. Not only that, but it took us 100 lines of JavaScript for the first version, and now it takes us ten lines of CSS for the exact same animated scene. What does it really mean for you web developers and designers? Well, it means that you can not only benefit from the maximum performance on a device, but also gain very, very much in productivity. That's what it means for you.
Okay. So we talked about the new graphics features in Safari for iPhone. Now, on iPhone, we have a really seamless playback system. It's very easy to embed your content, as you know. It leverages H.264 and AAC beautifully. It's high performance, built right in. You guys told us you love it. But as usual, you want more. And you want more control over your media. And we listened to you. And for this reason, we implemented the JavaScript methods and DOM events that will help you control your media playback.
These APIs are compatible with the ones that have been used for many years with the QuickTime plug-in in desktop browsers. That's right: all the content you've been working on for the past 10 years, you can leverage on iPhone today, the exact same content. It's compatible. But we didn't stop there. There's also a need to trigger full-screen playback from your custom web applications, so we implemented APIs that let you trigger playback from your custom UIs as well. All right.
So, there are two things I'd like you to remember today. First, web standards are the backbone of the web, and we at Apple are very committed to evolving them further, in particular when it comes to graphics and media. Second, on Mac OS X, we have great graphics and media technologies. And as you heard, they scale beautifully onto iPhone.
And we expose them for you in Safari on iPhone. That means your web graphics become fast and dynamic, and your web video is simple and very immersive. So check them out this week, you know, and start tuning up your web apps. I'm still waiting for you. I would love to show you these demos because that's a bit hard to visualize without them. Okay. Okay. All right. No demos. As you can tell, I'm really happy right now. Because I was really excited to show you these demos. So that's really unfortunate here. Okay. So that was about web. And now I'd like to turn it back to John.
Thank you, Gilles. So just before the show, some of my engineers came to me. They had this really fun demo that shows the ability of OpenCL to do some really fun stuff, and we decided we had to show it. So to do that, I'm going to invite Geoff Stahl and Nick Burns on stage, and we're going to take a look at this demo.
So this demo will take a few minutes to look at something that we can do with OpenCL on the GPU. And if we switch to the demo machine, please. There we go. Okay. Now you've been waiting for this. Let's show you what we can do with OpenCL on the GPU.
What you have here is an instruction-level emulation of an Apple II 6502 processor running on OpenCL. And of course, with an Apple II, you get full Applesoft BASIC support. But we're graphics folks, so we thought we'd do something a little bit more immersive: 3D graphics circa 1982.
Outstanding 4-bit graphics. So, this is obviously one of the most recognized games of the Apple II era, but it is interesting to take a second to talk about exactly how this was done. This is an OpenCL kernel: the emulator, an instruction-level emulator, running completely in OpenCL, about 2,300 lines of code.
The data stream input is the processor state, the memory, and the disk representation, and this OpenCL kernel iterates on that to produce the frames. We then use a custom-written back-end OpenGL fragment shader to take the exciting 4-bit graphics and convert them for the big screen here. And we'll see what luck Nick has with the princess.
I don't think this is going to go very well. Oh, that was not good. In any case, you know, when you have one Apple II, that's great, but how about having maybe 64 Apple IIs? So this is 64 Apple IIs running on top of OpenCL on the GPU. Thank you very much.
So we've talked about graphics, and we've talked about media, and then we've talked about how to get access to all that great technology through the web. And that was our demo. Okay, so Apple has spent a lot of time taking all these great technologies and scaling them across both iPhone OS and Mac OS X, and we've optimized them for those platforms so that you can build great applications on top of this technology.
And Apple has a wealth of other technologies that are going to be covered this week at WWDC. We encourage you to go and see all of them and learn about all the great technologies on both iPhone OS and Mac OS X. We have labs downstairs. We have the Graphics and Media Lab with engineers there all week who will answer all the questions you have about these technologies. And we have an iPhone Lab downstairs with engineers there all week long to answer any questions. So thank you for staying, and welcome to WWDC.