
WWDC02 • Session 500

Graphics & Imaging Overview

Digital Media • 1:00:34

This overview of the exceptional 2D and 3D graphics technologies in Mac OS X provides an introduction to other graphics and imaging sessions. Find out the latest information on Quartz 2D, OpenGL, ColorSync, printing, and Image Capture, as well as the latest Quartz Compositor developments.

Speaker: Peter Graffagnino

Unlisted on Apple Developer site

Transcript

This transcript was generated using Whisper; it has known transcription errors. We are working on an improved version.

Good morning everyone. I'm Travis Brown, the graphics and imaging evangelist, and I'd like to welcome you to Session 500, which is a graphics and imaging overview. Hopefully you all had the opportunity yesterday to see an exciting new announcement that we made, Quartz Extreme. Quartz Extreme is a fantastic new technology that leverages the power of Mac OS X to take our compositing window model, the Quartz Compositor, and drive that in conjunction with our really excellent OpenGL implementation to do something that's industry leading. In fact, we're two years ahead of the competition, delivering a fully composited, GPU-leveraging windowing system, and a lot of the tools and techniques that we use to do that are available to you as developers.

The interesting thing is the Quartz Compositor is not the only big announcement we have. All throughout the graphics technology stack in Jaguar, we have new developments, and that's what the purpose of this session is: to communicate the new things that we've been up to over the past year, and how we've taken the power of Mac OS X and harnessed it to do new and interesting things.

So what I'd like to do is have you welcome Peter Graffagnino, the Director of Graphics and Imaging Engineering, to the stage, and he'll take you through the presentation. Thank you. Thanks, Travis.

Hey everybody, welcome to WWDC. Hope you're going to have a great conference. We've got a lot of interesting stuff in the graphics areas, as Travis said, so please attend all the sessions I'm going to talk about here and point you to, and enjoy the demos we have for you today.

The basic agenda is to do a loop through all the technologies we have in terms of graphics and imaging on the platform. That's Quartz 2D, OpenGL, the Quartz Compositor with the new accelerated implementation called Quartz Extreme, ColorSync, Image Capture, and printing. I'm going to do a brief overview of each technology. Some of it might be a little bit of a review, but I think we have some new people here this year who haven't seen it before. Then I'm going to focus on what's new in Jaguar, and also show you some demos.

So let's get started. Architecture diagram, you've all seen this for a few years. The graphics stack sits right above the Core OS and below the Frameworks layer. And the three primary interactive technologies we have, APIs you can call to get bits on the screen, are Quartz, the 2D graphics library. For 3D, we have OpenGL. And for video and multimedia, we have QuickTime.

So first I'll talk a little bit about Quartz. Quartz 2D, as we call it, is our 2D imaging model that's based on the industry-standard PostScript and PDF imaging model. This imaging model has probably printed every page you've read in the last 15 or 20 years, and it's really been industry proven. It's got a very robust model for fonts, line art, graphics, and sampled images. And what Quartz 2D really is is just a straightforward C library that's an implementation of that imaging model. So there's no real language built into Quartz 2D. It's really just an immediate-mode graphics API.

What we did do was we added PDF read and write capability to the library, because we wanted a metafile format. In fact, we kind of arrived at the API by working backwards from the metafile: knowing we wanted to record and play back 2D as PDF, then what kind of imaging we wanted to have, and then what kind of C API we would want on top of that that would be relatively straightforward to use.
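To make that record-and-play-back idea concrete, here is a minimal sketch of the PDF round trip using public Core Graphics calls. The function names follow Apple's headers as I know them, error handling is omitted, and the page geometry is an illustrative assumption, not sample code from the session:

```c
#include <ApplicationServices/ApplicationServices.h>

/* Record drawing into a PDF "metafile", then play it back elsewhere. */
static void RecordAndPlayBack(CFURLRef pdfURL, CGContextRef screenCtx)
{
    CGRect mediaBox = CGRectMake(0, 0, 612, 792);      /* US Letter, in points */

    /* Record: every Quartz 2D call made here is captured as PDF. */
    CGContextRef pdfCtx = CGPDFContextCreateWithURL(pdfURL, &mediaBox, NULL);
    CGContextBeginPage(pdfCtx, &mediaBox);
    CGContextSetRGBFillColor(pdfCtx, 0.2f, 0.4f, 0.8f, 1.0f);
    CGContextFillRect(pdfCtx, CGRectMake(72, 72, 144, 144));
    CGContextEndPage(pdfCtx);
    CGContextRelease(pdfCtx);

    /* Play back: reopen the PDF and draw page 1 into another context. */
    CGPDFDocumentRef doc = CGPDFDocumentCreateWithURL(pdfURL);
    CGContextDrawPDFDocument(screenCtx, mediaBox, doc, 1);
    CGPDFDocumentRelease(doc);
}
```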

The other thing we knew is we wanted to use a lot of nicely anti-aliased content in the user interface, have alpha-composited icons, have nice-looking text. And so we knew it had to be really fast.

So we have a really fast anti-aliasing algorithm in there that's pretty state of the art, and we're pretty proud of that. The other thing I didn't mention last year, but I think is important to understand about Quartz 2D, is it has the concept of destination alpha, which basically means that it records the coverage information as it's drawing. So this is pretty powerful.

Not only do you get the RGB value for every pixel you draw, you get a coverage value. So for example, you can clear a canvas to a clear color and record in the alpha just the bits that are shown. So you can create, for example, an overlay pretty easily using the destination alpha.
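A minimal sketch of the overlay trick being described, assuming a plain bitmap context; the sizes and the drawing are my own illustration:

```c
#include <ApplicationServices/ApplicationServices.h>
#include <stdlib.h>

/* Destination alpha: the context records coverage as you draw, so a
   canvas cleared to transparent ends up with alpha only where marks
   landed, which is exactly what an overlay needs. */
static CGContextRef CreateOverlayContext(size_t w, size_t h)
{
    CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
    void *bits = calloc(h, w * 4);
    CGContextRef ctx = CGBitmapContextCreate(bits, w, h, 8, w * 4, rgb,
                                             kCGImageAlphaPremultipliedFirst);
    CGColorSpaceRelease(rgb);

    /* Start fully transparent: RGB = 0, coverage (alpha) = 0. */
    CGContextClearRect(ctx, CGRectMake(0, 0, w, h));

    /* Anything drawn now carries its own anti-aliased coverage. */
    CGContextSetRGBFillColor(ctx, 1.0f, 0.0f, 0.0f, 1.0f);
    CGContextFillRect(ctx, CGRectMake(10, 10, 80, 40));
    return ctx;   /* composite this buffer over anything using its alpha */
}
```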

The other thing that's pretty important about a 2D graphics library is the font support. And fortunately, Apple's been at this for a while, and we have a really good type system called Apple Type System, which is built in the Quartz 2D. So for all of the PDF font handling, Apple Type System handles both TrueType, and we also have a Type 1 scaler that we got from Adobe. So whether it's TrueType, Type 1, or whatever, we can render it with Quartz 2D.

And finally, another thing Apple's been at for a while is color management, and we have ColorSync built in. The color processing model of PDF is pretty similar to the ICC model that has been a standard for a while. And so Quartz 2D, when it has to do color calculations, just builds a color world and uses ColorSync to do that.

So what's new in Quartz 2D? Well, there are some new features. There's full PDF 1.3 support, which includes gradients and patterns. We, in fact, have API for linear and radial gradients, and we have patterns. These are full vector patterns, not like the QuickDraw patterns you might have been familiar with; basically arbitrary graphics can be, you know, stepped and repeated through the page.
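For a flavor of the gradient API, here is a hedged sketch of an axial (linear) gradient through the CGShading/CGFunction route. I'm assuming the current CGFloat-based callback signatures (early headers used float), and the colors are illustrative:

```c
#include <ApplicationServices/ApplicationServices.h>

/* Evaluate the gradient: blend toward transparent as t runs 0..1. */
static void evaluate(void *info, const CGFloat *in, CGFloat *out)
{
    CGFloat t = in[0];
    out[0] = t;            /* R */
    out[1] = t;            /* G */
    out[2] = 1.0;          /* B */
    out[3] = 1.0 - t;      /* A */
}

static void DrawAxialGradient(CGContextRef ctx)
{
    static const CGFloat domain[2] = { 0, 1 };
    static const CGFloat range[8]  = { 0,1, 0,1, 0,1, 0,1 };
    static const CGFunctionCallbacks cb = { 0, evaluate, NULL };

    CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
    CGFunctionRef fn = CGFunctionCreate(NULL, 1, domain, 4, range, &cb);
    CGShadingRef shading = CGShadingCreateAxial(rgb,
                                                CGPointMake(0, 0),
                                                CGPointMake(200, 0),
                                                fn, false, false);
    CGContextDrawShading(ctx, shading);

    CGShadingRelease(shading);
    CGFunctionRelease(fn);
    CGColorSpaceRelease(rgb);
}
```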

We have transparency in PDF. We don't have the full 1.4 transparency model yet, but the alpha and alpha image calls that we've had in Quartz 2D for a while now get recorded and played back using the 1.4 transparency operators. We also have PDF/X-3 support. PDF/X-3 is a graphic arts standard that's emerging for file exchange in graphic arts, based on the PDF 1.3 spec. You have to add a little extra metadata to the file to get it to work, but we have that coming in Jaguar.

The other important point I wanted to make: if you went to the keynote yesterday, you heard Avi kind of touch on using native services in your applications. And I think a prime example of this is Quartz 2D. We've actually made it pretty easy from a Carbon application to intermix your QuickDraw and Quartz 2D calls. We don't necessarily expect every large Carbon application to rip out all of their QuickDraw, but for certain areas it might make sense.

And one example of this is Excel and what the guys at Microsoft did with the Excel charting engine. So, for example, here's a graphic the way you might see it in OS 9 Excel. You'll notice you have jaggies on the rotated text, and kind of a straightforward-looking graphic. But then with Quartz 2D, they were able to add transparency. And I apologize for the scaling of the slide. PowerPoint didn't do a great job on that. Probably not using Quartz 2D yet.

The rotation of the text you can see is much nicer. In fact, you know, when this prints, the rotated text shows up in the file as real fonts rather than bitmaps. And so all of that is very nice and gets them better printing as well. So they took their chart engine, which is obviously not all of Office, and were able to bring that over to Quartz 2D pretty reasonably. So anyway, keep that in mind. There's a couple of things we've done to make Quartz 2D a little easier coming from the Carbon world and QuickDraw world.

We've added PICT rendering with Quartz 2D. This has actually been in for a little while. But we basically take the same logic that's in the LaserWriter 8 driver on OS 9, if you're familiar with that, which basically takes QuickDraw calls and converts them to the PostScript imaging model. And we've converted that to take the QuickDraw calls and make Core Graphics calls, or Quartz 2D calls, to enable the playback of PICT through a Quartz 2D context.

And so that gives you the WYSIWYG fonts and allows you to convert PICT to PDF and all those features. So that's, you know, if you're used to PICT, that's one thing you can do. One thing I don't have on the slide that I should mention is EPS support as well. You can also get EPS through Core Graphics.

So if that's one of the reasons why you feel you need to stay with QuickDraw, we do have API in Quartz to handle embedded EPS that will pass through to a PostScript printer. QuickDraw text rendering with Quartz 2D is another thing we've added.

This is the ability to take DrawText calls and have them call Quartz 2D's rendering facilities to get anti-aliased text through QuickDraw. There's a couple of modes it can run in. It can run in a metric-compatible mode with OS 9 if you can't afford your layout to change. In that case, you know, the text doesn't look as great as it could if you turned that mode off and just used the natural metrics of the font. And in some situations you're able to do that, and that's the way you get the best text.
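A sketch of what flipping that switch might look like from a Carbon app. The QDSwapTextFlags call and the flag constants are as I recall them from Jaguar-era QuickDraw headers, so verify the names against your SDK:

```c
#include <ApplicationServices/ApplicationServices.h>

/* Route QuickDraw's DrawText through Quartz 2D anti-aliasing.
   Flag names assumed from the Jaguar-era headers; illustrative only. */
static void EnableCGTextForQuickDraw(Boolean layoutCanChange)
{
    UInt32 flags = kQDUseCGTextRendering;          /* anti-aliased glyphs  */
    if (layoutCanChange)
        flags |= kQDUseCGTextMetrics;              /* natural font metrics */
    /* else: omit the metrics flag to stay metric-compatible with OS 9 */

    (void) QDSwapTextFlags(flags);                 /* returns previous flags */
}
```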

So we have some sessions on Quartz 2D. In fact, later this afternoon, I think it's in this room, there'll be Quartz 2D and PDF, to kind of give you an update on what's new in Quartz 2D. If you want to hear more about ColorSync, there's Session 509 on Wednesday. And finally, the Graphics and Imaging Performance Tuning session on Friday at 3:30 is going to be a pretty good one. A lot of tips and techniques on how to get your application to run faster and take advantage of Quartz 2D and the graphics system.

So with that, I'm going to move on to OpenGL. OpenGL is our 3D graphics library. It's, again, an industry standard technology. It's been around for a long time. It was originally the GL library on the SGI machines. It's been kind of an open standard for maybe about 10 years.

We work pretty hard with our partners, our OEMs, NVIDIA and ATI, to get really good drivers into the system. And those guys are really pushing the state of the art in terms of what can be done with graphics hardware. And actually, I have a demo of some of the latest stuff, if I can get demo 2 up.

So this is a Wolfman demo from NVIDIA running on the GeForce 4 Ti card we have in here. And you can see this is all using OpenGL and some of the latest extensions that NVIDIA added, which we've put in Jaguar as well, to allow per-pixel shading. I think this guy's about 100,000 polygons, 100,000 triangles. He's got self-shadowed fur. If I can figure out how to stop him here. Let's see, shift.

[Transcript missing]

Can I get back to the slide machine, please? So, obviously, one of the big reasons why we have GL is for great game support, and this is 4x4 Evolution 2, which has some really great graphics. But it's more than just about games.

I mean, we feel pretty strongly that there are a lot of applications out there that want to use 3D that, you know, are not just, you know, take over the screen immersive experiences. So there's a lot of packages out there. I won't go through the whole list here, but a lot of people are kind of jumping on OS X because they're realizing the combination of kind of Unix and OpenGL, which, you know, has been a traditional market for them, is actually here on OS X.

And it makes it pretty easy for them to bring things over. So if I get the demo machine back again... I'll bring up the TweakWaves demo, which I don't know if you saw at the Macworld session in January. But this is a company in San Francisco. They do special effects. They specialize in physical simulations of pretty hard phenomena for the special effects industry.

So they have a good wave simulator. And what they wanted to do, they have this engine that's all just straight-ahead, pretty portable Unix C code. And what they did is they spent a little bit of time with Cocoa and put a straightforward user interface on top of it, where, you know, they have some pretty basic controls here. You can change the behavior of the model, crank down the resolution, the crest sharpness, things like that.

They have an appearance thing where you can have an environment map and throw in some geometry there. But the nice thing about this is they have, you know, this physics engine that is very sophisticated. And you can imagine taking an application like this and wrapping it up and kind of, you know, if you're trying to bid on a job or something, you say, "Here's what I can do. Here's all the controls.

You know, you can go tweak my algorithm and tell me if, you know, you can get a good effect for your film." And he doesn't have to, you know, sit over the shoulder with the art director or whatever to kind of get the effect. He can just say, you know, "Tell me the settings you like and I'll go run you some frames." So I think that's a pretty powerful story to have, you know, just an engine code written in straightforward C wrapped in a Cocoa user interface. And if you went to the session yesterday, I think you saw an IB demo where they actually turned this into an interface builder palette and ran it in Photoshop as a plug-in. So pretty impressive. Let me go back to slides. Thanks.

So let me spend a few minutes on Mac OS X OpenGL and talk about the architecture there. It's really a state-of-the-art architecture. We approached the whole problem as more of an OS resource management problem rather than kind of a dedicated, hardwired game console architecture, because we knew we wanted to host a lot of applications. We knew we wanted to accelerate the whole desktop with OpenGL, so we had to be very careful about our resource optimizations, make sure when an application wanted everything it could get out of the GPU, everything else paged off and it could get full access to it.

When there are a bunch of applications going, they share it pretty reasonably. So there's a lot of optimizations in the resource virtualization. There's data flow optimizations as well to try to get textures to the screen as fast as possible. And with some of the Quartz Extreme technology that you've seen, it's really a great tool to bring to people who are used to doing overlays.

The other important thing is that Apple co-develops the drivers. We work with NVIDIA and ATI on their drivers. We're in constant contact with their engineering teams. We have the source code. We can make changes, and that allows us to do one-stop shopping for developers. If developers need a certain path tuned through the system, we can do it once, and it'll work across all the hardware, which is really nice. And so we can respond very quickly to your requests.

If you need a new feature, for example, it takes a few years to get things in the hardware sometimes, but if there's some application you think would be really cool, we can talk to the vendors. And since we write them checks, we can say, "Hey, could you throw this in the chip?" The other thing that's important is it allows us to maintain consistency across the product line.

I mean, we do support vendor-specific extensions. We don't take an extremely hard line on that, because we realize there's a lot of innovation in things like pixel programming and stuff like that. But we do want to maintain things as consistently as possible. So for example, non-power-of-two texturing: NVIDIA has an extension, ATI has an extension. We just did one, and we made it work on everything.

The other thing is, since we work closely with the vendors and write them big checks, is that we have a lot of visibility into what's going on with 3D graphics hardware. So we can help push your requests, as I was saying, and sometimes we can find a unique solution if you have a problem. Since we can kind of see the whole stack, we can understand exactly where the best place to meet your need would be. And we're also starting to take more of a leadership role in the OpenGL ARB.

The ARB is the Architecture Review Board for OpenGL, where all of the standard is set. And Apple has been extremely vocal recently with trying to get the vendors to converge on a vertex programming specification. And we think that's going along really well. In fact, that's on your CD.

So what's new in OpenGL for Jaguar? Well, there's a whole lot. There's programmable shaders, so we have programming at the vertex and the pixel level. A lot of system integration features. We've got real great improvements in texture upload performance. Lots of new extensions and some amazing tools. And a lot of the work that we did for OpenGL, we had to do for the Quartz Extreme effort.

And we kind of made the bet that, hey, you know, if we're going to try to run the windowing system through OpenGL, I'll bet all the performance optimizations we have to do to OpenGL are just going to make OpenGL even better. And that really turned out to be true.

So for example, programmable shaders. We have vertex shading based on the ARB vertex program proposed standard. Pixel and texture shading, the vendors have not quite converged on that yet, so we're currently using the vendor-specific extensions. On the NVIDIA side, there's Texture Shader 1, 2, 3, and register combiners. And on the ATI side, I think it's actually ATI, not ATIX, fragment program.
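To give a flavor of the vertex side, here is a minimal sketch that loads an ARB vertex program which just transforms the vertex and passes the color through. This is my illustration of the proposed standard, assuming the ARB entry points are resolved, not anything shown in the session:

```c
#include <OpenGL/gl.h>
#include <OpenGL/glext.h>
#include <string.h>

/* Load a trivial ARB_vertex_program: transform position, copy color. */
static GLuint LoadVertexProgram(void)
{
    static const char src[] =
        "!!ARBvp1.0\n"
        "TEMP pos;\n"
        "DP4 pos.x, state.matrix.mvp.row[0], vertex.position;\n"
        "DP4 pos.y, state.matrix.mvp.row[1], vertex.position;\n"
        "DP4 pos.z, state.matrix.mvp.row[2], vertex.position;\n"
        "DP4 pos.w, state.matrix.mvp.row[3], vertex.position;\n"
        "MOV result.position, pos;\n"
        "MOV result.color, vertex.color;\n"
        "END\n";

    GLuint prog;
    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_VERTEX_PROGRAM_ARB, prog);
    glProgramStringARB(GL_VERTEX_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei) strlen(src), src);
    glEnable(GL_VERTEX_PROGRAM_ARB);
    return prog;
}
```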

We have a great new tool for Vertex programs called the OpenGL Shader Builder. It's kind of a mini-IDE for developing little shading programs. I've got on the left my program window where I can talk about the-- enter code and have it syntax-checked in real time. I've got a preview of what I'm doing.

I've got a complete register dump of the state of the vertex processor. We can actually run it in software emulation mode, so you can actually examine registers as you step through the code. We've got documentation for the various commands. You can, you know, click on a command and see a little syntax of how to use it. So it's a real powerful tool. If you go to the session on vertex programming, I think you'll be pretty impressed with that.

System integration-- again, a lot of this work came from doing the windowing system on top of GL. So, Quartz on a texture. We've got really high-quality 2D anti-aliasing in Quartz, and it matches up well with the ARGB texture formats that we have optimized in our OpenGL. So you can basically get Quartz to draw into an ARGB texture, and that's a pre-multiplied texture, which you can then use in a scene with the appropriate blending modes, and you get really nice text in your OpenGL, and it's very straightforward.
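A rough sketch of that round trip: draw with Quartz into a premultiplied bitmap, then hand the same bytes to GL. The pixel-format pairing is the commonly cited Apple fast path; the text and sizes are illustrative assumptions:

```c
#include <ApplicationServices/ApplicationServices.h>
#include <OpenGL/gl.h>
#include <OpenGL/glext.h>
#include <stdlib.h>

/* Render 2D with Quartz into premultiplied ARGB, upload as a texture. */
static GLuint QuartzOnATexture(size_t w, size_t h)
{
    CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
    void *bits = calloc(h, w * 4);
    CGContextRef ctx = CGBitmapContextCreate(bits, w, h, 8, w * 4, rgb,
                                             kCGImageAlphaPremultipliedFirst);
    CGColorSpaceRelease(rgb);

    CGContextClearRect(ctx, CGRectMake(0, 0, w, h));   /* transparent */
    CGContextSelectFont(ctx, "Helvetica", 24.0f, kCGEncodingMacRoman);
    CGContextSetRGBFillColor(ctx, 1, 1, 1, 1);
    CGContextShowTextAtPoint(ctx, 10, 10, "Hello, GL", 9);

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei) w, (GLsizei) h, 0,
                 GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, bits);

    /* Premultiplied alpha wants (1, 1-srcAlpha), not the usual srcAlpha. */
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
    CGContextRelease(ctx);
    return tex;
}
```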

The other important thing, obviously, was video, if we're gonna be running the windowing system through OpenGL. So we have non-power-of-two textures. We have YUV texture formats, a variety of formats required there. The other thing, now that the windowing system is on top of GL as well, is it allows us to do things like overlays in a much more natural way, rather than having, you know, separate planes of the frame buffer. You actually have just another off-screen surface in video memory that you can render and update at whatever frequency you want, and you can render your GL underneath at whatever frequency you want, and the windowing system will just keep compositing those every frame update for you.

And that allows, you know, pretty interesting new heads-up user interfaces for 3D apps, I think, where, you know, you're not constrained by an overlay system where you have to give, you know, concrete per-pixel ownership. You could have a translucent user interface, like a drag selection over 3D content, that sort of thing. So we think that'll be a good thing to take advantage of.

Texture upload performance, we put a lot of work into this. We have direct DMA from all of the native kind of Mac texture formats with no per-pixel CPU involvement. What this means is when you call glTexImage2D or something like that, GL does not actually copy the data. If you put it in client texture mode, it'll just refer to the data and then dynamically map it into AGP when it needs to use it.

Now this obviously introduces a synchronization issue. If you're going to go and touch that texture, you need to know that the hardware is done with it. So we have a fence primitive that you can insert after that texture, and when the pipe clears that texture, you can wait on that fence, then go on with your work and update the texture. So these things were particularly important in doing the Quartz Extreme work.
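Here is roughly what that looks like in code, a sketch assuming the GL_APPLE_client_storage and GL_APPLE_fence extensions are available; it is my illustration, not the session's sample:

```c
#include <OpenGL/gl.h>
#include <OpenGL/glext.h>

/* Zero-copy upload: client storage makes GL refer to our buffer instead
   of copying it; a fence tells us when the GPU is done with it. */
static GLuint fence;

static void UploadByReference(GLuint tex, GLsizei w, GLsizei h, void *pixels)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glPixelStorei(GL_UNPACK_CLIENT_STORAGE_APPLE, GL_TRUE);  /* no copy */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                 GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, pixels);

    glGenFencesAPPLE(1, &fence);
    glSetFenceAPPLE(fence);          /* mark this point in the pipe */
}

static void ModifyPixelsSafely(void)
{
    glFinishFenceAPPLE(fence);       /* block until the GPU is past it */
    /* ...now it is safe to rewrite the pixel buffer and re-upload... */
}
```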

And here's the, I think, pretty accurate list of 30 or more extensions that we're adding in Jaguar. You can see there's a combination of Apple extensions, where we tried to synthesize things that were out there already. There's ARB extensions from the Architecture Review Board. Then there's some multi-vendor extensions, those are the EXTs.

And then there's some proprietary extensions, a bunch of things from ATI and from NVIDIA for doing all the real low-level pixel programming stuff. So anyway, we think people are going to be pretty excited about that. We had an OpenGL early bird session on Sunday, and people were pretty blown away by the new features there, so great.

OpenGL Profiler is another really interesting tool that we have. It's kind of a performance analysis console that allows you to attach basically to anything that's running, any OpenGL application that's running. You can attach to it. You can get a complete list of the OpenGL calls and their frequencies, how often it's being made. You can get a call dump of all the arguments to all the calls. You can get a hardware performance monitor. You can trace VRAM used. You can trace hardware wait time on various queues.

You can trace, I don't know, command buffer wait time. You know, there's about 30 different little items you can measure. And you can attach to applications and get all these great statistics. And we've had some developers in and gave them a sneak peek at this, and people are able to get incredible results. I mean, the low-hanging fruit out there on some of the apps is pretty amazing. So I won't embarrass anyone.

The great thing about this tool, too, is, you know, there are some performance profilers out there that some of the hardware vendors do, which are quite good. But the issue there is then you've got to learn, you know, ATI's tool set or NVIDIA's tool set to be able to do that.

And, you know, they don't always have the time to port those things to the Mac. But now we have one central tool that you can use in your apps. And whether you're running on ATI hardware or NVIDIA hardware, you can see where you're waiting. So we think that's a pretty nice thing.

Great. Keep it up. OpenGL guys are in the audience, I like that. So we got a bunch of sessions on OpenGL going on. We've got the programmability session late today in this room, I guess. We've got two sessions tomorrow on what we call integrated graphics. And these are all things in the area of doing video on the texture, doing 2D on the texture.

In the 506 session, we're gonna, in fact, show you how to build kind of your own little Quartz Extreme using all the stuff we learned and all the extensions for doing really fast layer handling in OpenGL. And that's gonna be sample code that we just give away. And in 513, there's an advanced 3D session. I think we're gonna get a guest speaker for that. Is that confirmed? Shout it out? No? Yes? Maybe. Well, go to that session. It promises to be good.

The OpenGL performance optimization session is going to talk you through some of the common pitfalls in OpenGL programming and also look at the tools, the Profiler, and show you how to use that. So the last sort of drawing API that we have is QuickTime. I'm not really going to talk a lot about QuickTime. You saw some of the demos in the keynote. With QuickTime 6 and MPEG-4, obviously, there's really big news. There's a whole track tomorrow on QuickTime in this room, and there's also some sessions on Friday, so don't forget about those.

Okay, so those are kind of all the APIs you can use to get data to the screen. There's another important piece of technology on OS X, which is the windowing system itself. And that's what we call the Quartz Compositor. We talked about this last year, but the basic notion is you've got application content from arbitrary drawing APIs, and the Quartz Compositor is responsible for presenting those on the display.

And so the nice thing about this is it realizes the orthogonality between the composition of the desktop and a graphics API. So all graphics APIs are peers, whether they're hardware accelerated or rendered in software. We get the pixels to the screen in the fastest way possible and present them to the user. So we think the Compositor architecture could outlast any graphics API that's out there. It gives us a lot of flexibility moving forward. If a new graphics API comes along, we can just add it in and it'll get composited.

So this borrows some principles that have been known in computer graphics for a while, called digital image compositing. In a paper in 1984, Porter and Duff introduced the alpha channel and alpha compositing. And back in those days, you had one program that could do spheres and another program that could do terrain. And if you wanted to do a scene, you didn't want to run them both all the time.

So you needed a way to kind of mix the content. And, you know, that's sort of analogous to, you know, fast forward 20 years. What we're doing today is we've got applications. We don't want them all redrawing every screen update. We want to be able to recomposite the display as one element is changing. So we're just applying those techniques in real time with the windowing system.
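For reference, the Porter-Duff "over" operator at the heart of this is tiny: with premultiplied colors, layering src over dst is one multiply-add per channel. A minimal sketch, my illustration rather than Apple's code:

```c
/* Porter-Duff "over" on premultiplied RGBA: out = src + (1 - src.a) * dst */
typedef struct { float r, g, b, a; } Pixel;   /* premultiplied RGBA */

static Pixel over(Pixel src, Pixel dst)
{
    float k = 1.0f - src.a;
    Pixel out = {
        src.r + k * dst.r,
        src.g + k * dst.g,
        src.b + k * dst.b,
        src.a + k * dst.a
    };
    return out;
}
```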

So that allows us to do, you know, a windowing system like this where we've got QuickTime. We have the Wolf demo you saw. We've got the Translator. We've got the transparent Terminal with the volume control all being composited together. You know, obviously the great-looking Aqua icons and all that stuff, clock in the corner.

You see that CPU monitor's up pretty high. What are we gonna do about that? So just to review the model: applications draw to buffers, and then the buffers are composited together onto the screen. So, as you saw with the CPU meter there, the model can be computationally expensive in some cases if there's a lot of transparency, particularly if you're trying to blend, say, the volume control with video or DVD or something like that.

And we also get issues with people who are concerned that we've taken away the frame buffer. But we haven't been able to tell the whole story of why we're going in this direction until this conference. So, we've been working on Quartz Extreme for a long time, and the whole driver architecture on OS X is based around it.

So, it's kind of nice to be able to finally, you know, tell you guys what we're up to. Which is, we really see GPUs kind of taking over the presentation aspect of the desktop. And so, we went and we implemented the Quartz compositing logic on top of OpenGL. The nice thing about that is that it completely removes the transparency tax for video and 3D. So, you can have layers of terminal on top of 3D and video, and everything just works.

It frees up the CPU to do useful work while the screen composite's going on. And it allows us to kind of showcase the GPU and the user interface. So, if you have, like, some screen animation, maybe it works a little better on a higher-end machine. Maybe there's some little flourish to it that we put in for our high-end users. But we really think that we're going to get a lot of advantages out of something like that. And as I said, since it's all available to you, you guys can do that too.

So, with that, let's do a demo. If I can get the demo, demo two. So kind of the first thing to see is the translucent terminal demo. Oh, these are great. That was a great demo. Let me get my CPU monitor going here. One of the things I have is a little frame counter.

So this just kind of shows you the frame rate we get on the display. So I've got about three layers. It subtracts out itself, so that's why it sits at zero. We've got about three layers of terminal going on here, and you can see what happens when I start to drag a window.

[Transcript missing]

Add a few more layers and there it is. So, that's pretty cool. Thanks. But wait.

So one of the things that we weren't able to do before Jaguar was be able to really have 3D take its proper place in this scene. I mean, you always had 3D with a black rectangle around it, and it was just blasting to the hardware. But now, with Quartz Extreme, we can just put 3D in a layer, and then add-- you can see that gear is actually transparent, and it's animating, and it's in its own layer in the system. And you can see I'm getting pretty good frame rate.

So the 3D is somehow able to get over 60. I'll have to ask the 3D guys how they pulled that one off. Here's a couple more little widgets we can get going here. And we're pretty much pegged at 70 frames a second here. So, and they're all on their own layers, as you would expect.

What was I going to do next? Let's bring up another 3D widget on the desktop, which is the NVIDIA Chameleon. So the guys went in and they took the NVIDIA Chameleon demo and kind of stripped out the vine he walks on. I don't know if you guys have seen this demo. And made him render on a clear background. And then Quartz Extreme comes in and composites him in. So he's actually transparent. You know, he's running with, you know, full vertex shading and all of that stuff, just getting blended into the desktop, you know.

[Transcript missing]

We were talking to the guy who works on the screensavers, Mike Trent, about the Quartz Extreme, and he's like, "You know, I could take the screensavers and just run them in the background because everything's getting composited. So what's the window number to get between the desktop and the icons?" And so we're like, "Eh, well..." So we told Mike.

[Transcript missing]

Cruising along about 60 frames a second, everything's pretty interactive. This is a GeForce 4 Ti in here, so it's not breaking much of a sweat yet. I think if you go to Ken Dyke's session later, you'll see where it breaks down. This thing's got 10 gigabytes per second of memory bandwidth on the video card, so that's probably a number you haven't really dealt with before. So I think that's enough on that demo. Great, thanks.

So a few things I wanted to talk about on kind of a different level than you might see in the kind of keynote demo of this kind of stuff, but kind of why architecturally we're moving towards this fully accelerated desktop. And it kind of goes back to just first principles of computer architecture, looking at programmed I/O versus DMA. You know, in the programmed I/O model, traditionally the CPU, you know, maps in some registers and pushes on data to send it to a device.

The problem is that the mismatch, you know, between the compute capacity of a, say, gigahertz CPU and a 100 megahertz I/O bus makes that pretty inefficient. So traditionally what people have done is they've moved to a DMA-based I/O model, where you tell the device you want it to take its commands or its data from a certain area, and you tell it to go, and then it just directly pulls the memory to the device, and you can check when it's done or get an interrupt or however the driver architecture wants to do it. And that allows the CPU to proceed.

And do something else while the I/O is occurring. Because I/O is really basically a pretty straightforward thing, and there's no reason to, you know, spend your CPU doing it. So if you think about it, the CPU drawing in the frame buffer, and the reason we took it away, is really just programmed I/O. That's not a very efficient way to do things.

And the cards have plenty of bandwidth to pull the data out of system memory while you're doing something else. And you can get much higher frame rates DMA-ing your content up to the screen than you can trying to reach out over the bus and touch pixels anyway.

Another interesting kind of thing that's going on in the industry is the difference between CPUs and GPUs. CPUs typically have a higher clock rate, but they do less work because they can't be parallelized as much. You know, you think of them hitting little 4K pages and taking little sips of water out of the memory system, whereas a GPU is just sort of gulping along, chunking through, you know, gigabytes per second.

The other thing is people in the industry have kind of quipped that this is Moore's Law cubed, which is the fact that if you look at recent history, GPU performance is doubling about every six months versus the standard rule of thumb for CPUs, which is 18 months. So that's kind of a cubed factor, 3X in the exponent there. The other thing is to look at the transistor counts. Since graphics is such a parallel problem, you can just duplicate out the same pipeline and do more and more pixels at once. And so that's an extra degree of freedom that the CPU guys don't have. So for example, the GeForce 4 Ti has 63 million transistors in it compared to 10 million in a G4. So it's pretty impressive what's going on. And obviously, you know, again, it's not just for games. We think there's a lot of things that we all can do with it, both in the desktop user experience and in your applications, to take advantage of all that hardware.

So a block diagram of Quartz Extreme. Basically what happens is the application draws its content, whether it's software rendered or hardware rendered, into some kind of buffer, whether it's a window backing store in system memory or a surface in video memory. And then the Quartz Compositor programs the GPU with OpenGL to just pull all of that data up into the GPU and composite it onto the display.

So for surfaces that are in system memory, like the window backing store that's happening over the AGP bus, for surfaces that may already have the gears rendered in it, it's just turning around and treating that GL buffer as a texture to feed into the final composite. We're going to talk more about this in the Quartz Extreme session later today.

So the other thing that's interesting to point out is Apple is leading the industry here, but everyone else is going to do this because it's sort of the natural evolution of windowing systems. And there's been lots of sort of hacks out there to try to do transparent menus and little flourishes in the UI that need this kind of performance. But we really wanted to just do it right and get it done with so that we could just ride the headroom of the GPU.

And we think it is kind of an inflection point, to use a fancy word, in platform graphics architecture: now that GPUs are all kind of doing similar architectures in terms of DMA-ing both command data and texture data, we think we can just treat them a lot more like we treat regular I/O devices, and really just kind of use traditional operating system techniques to manage them and bring out their performance to its fullest potential.

And the great thing about all of this is that the advances we've made in OpenGL to kind of make all of this work and be well-behaved are directly usable to you if you're an OpenGL app. All of the extensions we use to run the windowing system are in the headers, so you can use them.

So there's a session this afternoon that talks about if you want to leverage the windowing system, for example, to do transparent overlays and just understand the model in more detail of what we're doing, there's the Exploring the Quartz Compositor session in Hall 2 this afternoon. If you want to understand, well, how did they actually implement Quartz Extreme and what are the GL extensions I need if I want to blend three layers of video, then you go to 506, which is the Integrated Graphics 2 talk, where Ken Dyke's going to walk you through a whole example of how to do all those layer calculations.

So that's it for OpenGL, the Quartz Compositor, and all of the interactive APIs. There are some other technologies that I want to cover here as well. These are not technologies that necessarily get pixels on the screen, but there are other things in the graphics and imaging area, and there are some exciting new opportunities going on. The first one is ColorSync.

ColorSync is our standard color calculation engine based on the ICC standard for color matching, which Apple helped introduce. And you can read more about it at color.org. But it basically is a framework for color calculations that represents device transforms as profiles. And you can string a bunch of transforms together into a pipeline chain and then send pixels through the chain.

And ColorSync will concatenate all of those and create what many times turns out to be a big 3D lookup table to get f of RGB equals RGB out. It's built into Quartz 2D, so all the color handling, as I mentioned before, goes through ColorSync. It's also been successfully used in the print industry for a long time. And it's emerging in the film and video markets. There's been some activity in the ICC.

There, people from that industry have come and said, well, here are some special things about the real dark surround you get in the theater, and some new techniques for the color workflow in film production. And I think that there's a lot of synergy there that could be pretty interesting. So what's new in ColorSync: we have a Velocity Engine implementation of the color matching engine. We have ICC 4, which is the latest profile spec.

We've added a bunch of convenience color spaces to Quartz 2D, so that if you want to get, for example, the user's default RGB color space, you can make a straightforward call into CG, get that color space, and start drawing with that. And the last thing, which is kind of interesting and just sort of emerging now, is the ability to do real-time color correction.

Because of the GeForce 4 Ti, which admittedly is right now a high-end card, they have a feature that allows you to do 3D dependent texture reads. And what that means is you can give a 3D texture to the hardware and have it use RGB as indices into a 3D table, and then produce an RGBA value out the other end. So obviously, since that's a lot of what color management calculations have to do, I think there's some really interesting opportunities here. And we're going to have a demo in the ColorSync session of actually taking a profile, converting it to a 3D dependent texture, and rendering it in hardware.
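To make the table idea concrete, here is a toy CPU-side sketch of an RGB-in, RGBA-out 3D lookup, the same indexing a dependent texture read performs in hardware. The grid size and nearest-neighbor sampling are my illustrative assumptions; real engines, and the GPU, interpolate between grid points:

```c
/* Toy RGB-in, RGBA-out 3D lookup table. */
#define N 32                                /* grid points per axis */

typedef struct { unsigned char r, g, b; } RGB8;
typedef struct { unsigned char r, g, b, a; } RGBA8;

static RGBA8 cube[N][N][N];   /* built from the concatenated profiles,
                                 green-screen keys, hue tweaks, etc.  */

static RGBA8 lookup(RGB8 in)
{
    int ri = (in.r * (N - 1) + 127) / 255;  /* scale 0..255 to grid */
    int gi = (in.g * (N - 1) + 127) / 255;
    int bi = (in.b * (N - 1) + 127) / 255;
    return cube[ri][gi][bi];                /* RGBA out; alpha = matte */
}
```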

So I thought what I would do here is recreate the keynote demo, which uses, if I can get the demo machine, basically the same, let's see. Where did I start off that frame counter? So I'm going to go into a little more detail than Richard Kerris did in the keynote, because it's kind of a fun story behind this demo. So we knew we were going to be showing

[Transcript missing]

So that's what's going on there.

The other thing, we were also like, what about the stage? You know, what do we do about all that stuff? And the guy said, oh, there's this thing called garbage mats, which are these things you paint and they composite in to get rid of things you don't want. So that's just a static image that you have to render through. So we went and did that. And again, the hardware, that's pretty easy. See, I can tune out that green a little bit more.

And then for another piece of the color cube, although we do the green extraction and the matte extraction and this color correction all in the same table, because we just throw it all together into one big RGB to RGBA transformation, we made like a little brown hue adjustment to try to get these guys fit into the floor. These layers also scale, so I can kind of scale them back there.

This layer is kind of funny. We got this layer another three gigabytes later. It's like, "Okay, so this is a CG train rendered with film noise in green screen." We're like, "Why couldn't they have just rendered an alpha channel from the renderer instead of putting noise in the alpha channel?" I guess this is the way film guys work.

So the unfortunate thing about that is you can't, you can see how the greens are all over the map here, and in fact in the corner you can't really get rid of the, I'm pointing at this, you can't really get rid of all the green the way we have the tolerances set up, but, you know, someone who spent more than a day trying to get this to happen could probably do a better job.

So you can see how that works. They also put film noise in it. You can see the little pops as the train goes by. That's all rendered in the CG layer. I mean, I could drag more layers in here too, but one thing I wanted to do that they didn't do in the keynote is another little...

[Transcript missing]

What I'm going to do is do a print preview of the Word document. And what that does is create a PDF file that then is opened up in preview.

The nice thing about Word is it doesn't actually draw the white page. It just leaves it clear. So what I've actually got on this PDF file is just the text. And so what I can do with that is just drag it in on a layer over the content. That's pretty fun. It's a little jerky, and I'm not sure why, but we can fix it in post, as they say. Let's see, what else did I have? I think that's it for that demo.

So the ColorSync session is Session 509 on Wednesday. You can go there and hear about how ColorSync fits into OS X and also another peek at how you might be able to use some hardware to do some color correction. Another important technology we have is Image Capture. Image Capture you're probably familiar with if you've ever used iPhoto.

It's the application that basically manages all the cameras attached to a computer. We introduced this last year, and we knew the iPhoto team was working on their application, so we worked pretty closely with them and all the camera vendors to get camera modules for a lot of the consumer-level cameras.

The way the architecture works, basically we have an input architecture, which is a camera module. It flows into an Image Capture framework, which can run as a daemon. Then your application attaches to the daemon, and it can enumerate all the cameras that are attached. It can ask the cameras for pictures. The Image Capture framework isolates you from the details of whether the camera might be a PTP, which is a USB camera standard, or some cameras are vendor-specific proprietary protocols.

Some cameras are just mass storage devices, and they look like disks. But all of those are kind of hidden from you if you just ask Image Capture, "Hey, are any cameras attached?" "Yes, there's 30 images in it." "Okay, give me image 5." That kind of API, pretty straightforward. But the new thing that we've been working on is scanning.

So we've got scanner modules which fit in, in an analogous way to the camera modules, into the Image Capture architecture. So again, you write a module for Image Capture, it can recognize it, and you can pull it into your application. One of the nice things we're doing is we're integrating with TWAIN. So, yeah, go ahead. No TWAIN.

TWAIN is a good standard a lot of vendors have adopted for developing their scanner modules. And we have kind of a bridge between TWAIN and the Image Capture framework that allows you to examine any TWAIN devices on the system as well. We've also worked with the TWAIN folks on the DSM.

If you go to their website, they have one that's CFM-only. We've been working on a fully native version, so you can call it from Cocoa, and it works across the board. And so that'll be coming out soon. In fact, I think a copy of it's on your CD.

We also have a basic scanner UI. Just like for image capture, we have the basic Image Capture panel that comes up when you attach a camera. And we have FireWire camera support; there are some new professional cameras coming online which have FireWire interfaces. Here's a picture of the scanner UI. I mean, it's pretty basic, just meant for, like, a consumer who plugs in his scanner and wants to get a picture off of it. You can see we got a little translucent cropping rectangle. You do a preview scan and a full scan. Pretty straightforward. Not a lot of bells and whistles. But, you know, something my mom could use kind of thing.

So if you want to hear about that, go to the Image Capture session. That's on Friday at 2, session 515. The last thing I'd like to cover is printing. So there are some changes in printing, but let me set the stage a little bit here. So in the beginning, QuickDraw did it all. The Mac was revolutionary in the sense that QuickDraw was the same API. It unified paper and display and had nice-looking fonts and did the whole... You know, even just the fact that it was black text on white, you know, not all computers were doing that then.

Just to try to get that simulation of paper really nice with the graphical user interface. But the reality was, as people tried to really use it... the publishing industry saw it and said, you know, that's really cool, we can do new applications. But the reality was QuickDraw kind of fell down a little bit in terms of what it was capable of doing. It didn't have outline fonts, didn't have a very sophisticated processing model behind it.

And so Adobe invented PostScript. And what happened is, you know, QuickDraw and the platform graphics architecture kind of got sidestepped by PostScript. And we've been living with this kind of combination world where, you know, you can do some things on a QuickDraw printer and you can do some things on a PostScript printer. But the real professional imaging model is this, you know, total bypass shunt where people are just sending PostScript to PostScript. And then people who buy inkjet printers don't get nice outline fonts; they get jaggies and all of those problems.

But, you know, PostScript was a great invention and, you know, really proved that this is an imaging model that, you know, is good enough for professionals and can describe all the pages. So really one way to look at what we've done with Quartz 2D is kind of go back to that and say, well, you know, let's just have a 2D API that's kind of good enough to describe every page ever printed and use that and go back to the original idea of having a unified imaging model.

And so that kind of tells you what our thinking was behind the printing architecture on OS X. And so what happens is, an application, a Carbon application in this example, Internet Explorer, calls QuickDraw to draw its pages. The QuickDraw commands are converted to Core Graphics calls, or Quartz 2D calls.

And those calls are recorded into a PDF file. That PDF file flows through what we call a spooling system to the back end of the print architecture, where the PDF file is decoded. It's either converted into PostScript for PostScript printers, or rendered for an inkjet printer using our Quartz 2D rasterizer.

So what we've done in Jaguar is completely replace the spooling system. So we have a new spooling system that's based on open source, so we think it's a really great addition to what we have and allows us to do some pretty new features. But don't worry, it's binary compatible with your apps and with the drivers that are out there, too. So we've kind of ripped out most of the kernel of the data flow processing, but we've kept compatibility.

And this allows us to do a couple new features that we're really excited about. Printer sharing with IPP, which is a better replacement to the old USB printer sharing we had on OS 9. And the Cocoa Print Center is another little thing we put on there. We took the old Print Center, rewrote it in Cocoa, shrunk the code size by about a factor of six, and it actually looks nicer and works better. So that was a good thing.

What are we using for a spooling system? We're using something called the Common Unix Printing System, or CUPS. It uses printer sharing via IPP. There's a book out about it. You can read it. You can go to CUPS.org. IPP is a kind of dialect of HTTP. It uses a different port.

But basically, the way to think about it is you're kind of doing a PUT of your print data, and the server on the other side is taking it. So it's a replacement for kind of the LPD socket wire protocol. But on the flip side, as far as the Unix user is concerned, CUPS also comes with a full set of LPR and LP emulation tools. So finally, you can LPR to that inkjet that's sitting right next to you, which you couldn't do in 10.1.

And if you like LP, you can do that, too. And so it's a really flexible architecture. If you go to this session and hear about it, at the back end is a really nice kind of MIME conversion table where you can take source formats and convert them to destination formats. And the spooling architecture figures out, you know, how to get your source document to a format that the printer can understand. It's been picked up by several Linux distributions.
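For a sense of what the client side of that architecture looks like, here is a hedged sketch using the long-standing CUPS C API, cupsGetDests and cupsPrintFile. The file path and job title are placeholders of mine, not from the session:

```c
#include <cups/cups.h>
#include <stdio.h>

/* Enumerate every visible print queue (shared IPP queues included)
   and submit a PDF to the first one. */
int main(void)
{
    cups_dest_t *dests;
    int n = cupsGetDests(&dests);
    int i;

    for (i = 0; i < n; i++)
        printf("%s%s\n", dests[i].name,
               dests[i].is_default ? " (default)" : "");

    if (n > 0)
        cupsPrintFile(dests[0].name, "/tmp/page.pdf", "demo job", 0, NULL);

    cupsFreeDests(n, dests);
    return 0;
}
```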

And people have been really successful using CUPS in a lot of environments, and we're pretty happy to be working with those guys on the OS X version. So with that, I'm going to do the ever popular printing demo, rarely done on stage. My lovely assistant Rich will come up and help. Rich Blanchi.

So if I could get demo four. So what I have here is a PowerBook. You can see it's not attached to anything except power and video for you guys. But it's on a wireless network; we've got an AirPort card in there. These two computers are on a private, you know, little AirPort connection. And what I have here is an iMac data sheet. And you'll see, oh, by the way, this is the new Preview. You can see we've got little page thumbnails now. Anyway.

If I bring up the print dialog, I don't have any printers because I'm not attached to anything. There's nothing on the network right now. And so Rich's machine is not attached to his printer. He's going to attach the printer. And what's going to happen is that machine is going to get the USB hot plug event, understand that there's a printer attached. He's got printer sharing enabled, so it's going to send a broadcast packet out on the network. And sometime in the next 30 seconds, DeskJet 840 is going to show up here.

Now the real hard part is can I actually print? We have an HP printer here. Yep, it's starting up. So we'll leave that to print. It's a little bit of an older printer. No, we're on a private network. Stay away. Let me get back to the slides for a sec.

So let me tell you about the printing sessions that are coming up. There's Printing in Mac OS X, where you'll hear about the printing APIs we have and get an architecture update. That's on Thursday. Tomorrow, there's a Darwin printing session. We've invited the guy who wrote CUPS, Michael Sweet, to come and talk to you about CUPS. And all of CUPS is live in the Darwin repository, right, Rich? Starting now.

So you guys can go see what we've done with the spooling system and the few minor changes we had to make to make it work on OS X. And what do you got there? You got half a page or so?

[Transcript missing]

So the other piece of feedback we got from all you guys was documentation.

So we're trying really hard at this WWDC to tell you about all the documentation we've done. We know there's still more work to do, but in the area of general graphics technologies, there are thousands of pages of new documentation. I'm sure there's still more you want, but there's a bunch of revised books, a bunch of Q&As and samples, some new tech notes, and there'll be more showing up after the conference.

We've got a bunch of new code samples and a full set of OpenGL man pages. And don't forget, a lot of the technologies in our area, like PDF, PostScript, OpenGL, and CUPS, are things that you can just go buy books on and read, and that's really good too, so you can understand what's going on there.

So with that, I'll just point you at our feedback session, which is this Friday, the last thing of the conference. I'm sure you all are going to hang around to give us lots of feedback. We'll be there Friday at 5 in room J1. And that's all I have. Thanks a lot for your attention, and have a great conference. Thank you.