Graphics and Imaging • 58:36
Apple's Keynote application redefined "presentation graphics" by fully leveraging the power of Mac OS X's graphics stack. View this session to gain insight into how next generation applications, like Keynote, take full advantage of Quartz 2D, OpenGL and other platform technologies.
Speakers: Travis Brown, Brad Vaughan
Unlisted on Apple Developer site
Transcript
This transcript was generated using Whisper; it may contain transcription errors.
I'm here to announce a sort of special session that we have planned for you today, which is Session 202, Technology, Magic, and Keynote. And we're actually going to do something a little bit different for WWDC. What we're going to be doing is basically talking about an application, one of our applications, Keynote. Many of you are probably aware that in January of this year we launched our own presentation product, which uses a lot of Mac OS X technology to do some incredible work in terms of bringing really high quality, high production values, and ease of use to the act of doing presentations. As an evangelist at Apple, I do a lot of presentations, and I can tell you I've enjoyed using this application for every presentation I've done since January. So we wanted to give the Keynote team an opportunity to tell developers about how they made Keynote happen, and particularly which technologies they adopted, because I think it's a very, very interesting story. The really interesting thing here is that we're going to be talking about an application rather than a technology. At WWDC, there are 175 sessions.
Many of those are going to be talking about new technologies, technologies that have been improved for Panther, and they get really specific, down at the low level of particular technologies. One of the things I encounter often in working with developers is that it's great to see the technology, the API, in a presentation, but in many cases it's difficult to see how that technology actually unfolds in a real, living application. So by talking about an application, we can show you, essentially, technology in action.
The technology actually doing what it's intended to do and adding value to what a user would be doing with the software. Another point is that there's a tremendous opportunity in the technology portfolio that is Mac OS X for different technologies to synergize. For example, you'll see in Keynote during the course of the presentation how the type system and, say, Quartz 2D, our 2D drawing API, go together to create incredible typography, or how Quartz and OpenGL can play together for interesting transitions and just great-looking application content. The key thing we wanted to do is use an application as an example to articulate the possibilities that are behind every technology available in Mac OS X. We really hit the reset button on technology when we moved to Mac OS X, because we brought together a lot of brand new technologies that aren't familiar to much of the developer community. In many cases you have your existing code base, and it's tough to see what the horizons of those technologies offer you and your users.
So why do we want to talk about Keynote? The main reason is that Keynote is very platform compelling. It's an application that uses a lot of technologies, and at the same time it's really simple and really powerful. It's kind of like Mac OS X: Mac OS X has the power of Unix at the core, and it also has a very simple expression in the Aqua user interface, and that combination is fantastic. One of the reasons Keynote is able to do this is because it uses the core technology that's built into the operating system. The Keynote team was able to think about how to make a great app, a great app that adds value to what the user is attempting to do with it, rather than having to worry about architecting graphics engines or architecting 3D transition effects with their own engine. And that's a very important point moving forward, as you engage the platform and develop your own platform-compelling Mac OS X applications: the engineering that Apple does every year to improve the operating system, what advantage does that give you? If you adopt that technology, what can you do somewhere else inside your application to add value to your user experience, because you don't have to worry about the drudgery of inventing your own technology portfolio? And a lot of the technologies that Keynote uses are very important technologies for Apple: Cocoa; Quartz 2D, our 2D drawing API; QuickTime, our multimedia architecture; and obviously OpenGL, which we spend a lot of time talking about, and which is the gateway to accessing the power of the GPU. And then we're going to talk a little bit about XML, which is a way to contain document information. The key point is that these technologies are available for you to use as well.
One of the other things we wanted to do is let you learn a little bit from the development cycle of an application that was brought up with all these new Mac OS X specific technologies. Part of what the Keynote presentation is going to communicate, in addition to which technologies are in Keynote, is the areas where, hey, the technology wasn't a perfect fit, and we had to do some working around, we had to do something special to make it work for the application. Many of those cases are significant for you as developers as well, as you adopt those technologies. So it's my pleasure to invite Brad Vaughan, the Keynote engineering manager, to the stage, who's going to take you through the presentation. Thank you.
Thanks, Travis. Brad Vaughan, I manage the Keynote engineering team. And we'll do a quick demo of the application, in case you're not familiar with-- actually, can I get a show-- who's used the app? And I know you've all seen it, because you're at WWDC, you're looking at Keynote all day, right?
But first I want to emphasize the two things that Travis touched on, which I want to make the themes for my talk here today. First, the things that make Keynote great are the technologies that are available in OS X; I'll talk about those in depth. And the second point is that in some cases the high-level APIs, the out-of-the-box functionality, aren't exactly what you need. In fact, in some cases I'll show you code that explains how we worked around those issues, how we extended the available frameworks, et cetera. So let's take a look at the app.
So the first thing, this is Keynote. First thing you see is what we call the theme chooser. You have out of the box 12 beautifully designed themes that are created by Apple designers, each of which contain nicely coordinated background graphics, fonts, colors, shadows, et cetera. So let's create a new document with this leather book theme, for example.
Create the document. The first thing you see is on our main window, it's very simple. This is Aqua. This is standard UI controls that everybody's used to, something that's easy for users to get into and quickly manipulate their own documents. Over here we have... what's called our slide navigator. So I can easily kind of move around my slides, organize my slideshow like this. It's very simple.
Got other views, we can edit outlines if you've got kind of a text oriented presentation, but let's take a look at this one. I can change masters. Let me show you a feature we call alignment guides. So this is a bad example, let me show you here. So you can see our alignment guides a little better. And you notice these little yellow lines snap.
These are actually static alignment guides. When I create new objects on my slide, I get dynamic alignment guides, so I can easily manipulate my graphics and I get a new alignment guide so I can center these guys. If you do a lot of graphically rich presentations, this is very nice.
Couple other features, let me see, make another blank. Make a new slide and show you our charts. So charts and tables are built into the application. You don't have to go out to an external utility or another app to create a chart. So I just click my chart button there. I've got a data editor. Can go in and edit my data live, updates my histogram.
So that's kind of the gross control of my chart. I can move my legend around, resize, et cetera. Over here is my chart inspector. So I get kind of more fine-grained control, and I can pick from several varieties of chart. Here's a pie chart. As you see, the content that's created, when I create text or I create a chart or create a table, it's all nicely integrated with the template, the style that I've chosen, which in this case is leather book. So it's really kind of hard to make a presentation that's ugly. You have to do some work. So yeah, so I've got, you know.
A lot of control over how I manipulate my chart there. So let's take a look at another feature, tables. It's as easy as clicking the table button. Over here it's automatically switched to my table inspector. Let's make a table of two rows and I'll put some content in there. Let's see, I can't have all caps.
Try a few shapes in there. This will invariably be in the wrong column, but I'll put it in there: researcher's microscope. I missed one. And as I resize my columns, it's very fast; the graphics smoothly scale, and text wraps. Now let's do a little build. As I do my actual presentation, as I'm showing the slides, I can build the content onto the slide, and we've got several options for that. So I'll pick my table. And this is my build inspector.
I can pick, for instance, a flip build, we call it. Here's my little preview. This would build the entire table all at once. I can also deliver it, say, by row or by column. So let's take a look at what it looks like if I build this guy on by column.
One, two, three columns, very smooth. Finally, well, let me show you next to last, our export formats. We support QuickTime export, so you can generate a QuickTime movie that contains all the transitions, 3D builds, etc. Ship it off to a user on another platform, and they can share it. We also support PowerPoint export and PDF.
So let me show you finally a sampling of some of our 3D transitions. This is a simple kind of fade transition, a push, a move. This is the set of 2D transitions that we provide out of the box: a wipe transition, and a pivot. Finally, a drop, so you drop one slide over the other. Getting into the 3D transitions, that's called twirl. This is a large mosaic. And most of these transitions you can control, no, all of the transitions, you can control their speed. Several you can control their direction; perhaps they come from the center, they move out or in. There's a flip, a 3D flip, a horizontal 3D flip, and finally the cube, everybody's favorite. So that's Keynote.
We want to talk about the technologies, the things that make Keynote great. First we'll talk about application frameworks. Keynote is built on Cocoa, and we'll talk about why we think Cocoa is a great basis for your applications. One of the things Cocoa provides is very simple, high-level access to the text technologies that make OS X great, things you get from ATSUI, Quartz, et cetera. We'll talk about Quartz: Keynote really excels at creating graphically rich presentations, and we get all that power from Quartz. I'll show you some code that we use to manage images in Quartz. Obviously OpenGL: the examples I just showed you, the great 2D and 3D effects, use OpenGL. We'll talk about some of the kit support for OpenGL as well. And the rich media support that you get from QuickTime, which is not just about playing movies on slides.
We also use QuickTime to generate QuickTime-compatible output. Finally, we use XML as our file format. We're really excited about that; there's been some community support for our format, and I'll talk about the operating system support that's there for XML. So first, application frameworks. When we sat down to decide how we were going to start Keynote, the main thing we wanted to do was emphasize the power that's in OS X, the power behind the graphical technologies that are in there, like OpenGL, QuickTime, and Quartz. So we wanted first-class integration with those technologies from this framework.
In choosing a high-level framework, we want to be able to get the maximum out of each of these technologies. We want to be able to grab powerful components from the kit, to use a lot of these things out of the box, so we don't have to reinvent the wheel for things like UI components, et cetera. But we want the components to be flexible and extensible enough that we can tweak a few parameters and get the look and feel and the capabilities we want, without having to extend everything. And we want to be able to extend them and override certain behaviors too.
We want great performance. We don't want layers and layers of software between our application and the core components, the core technologies in OS X that we want to use. For instance, OpenGL: we want to make sure we have very fast access to OpenGL. And ease of development: we started Keynote with a small team and a limited amount of time, as I'm sure developers in the audience are familiar with. So it's important to have the best tools, the best frameworks available, to get our app to market. For us, Cocoa was a great choice. Cocoa provides high-level access to, again, the graphic capabilities, the graphic technologies that come with OS X: Quartz, OpenGL, and QuickTime.
With Cocoa, you get the standard, flexible UI controls that everybody's used to in the OS X operating system, the Aqua user interface. And as Aqua progresses, as new releases of Cocoa come out, we reap the benefits. If you saw the Scott Forstall talk yesterday, the tab view is improving, the switch view is improving, so we're going to get those benefits. NSImage with Cocoa gives you access to not only the native formats that NSImage can decode and display, but also compatibility with all the image formats that QuickTime can display. There are complex views like NSOpenGLView; as you saw in my little build inspector, I was getting a little preview, and that's an NSOpenGLView for rendering OpenGL content right within a Cocoa application. NSMovieView does the same thing for QuickTime, rich media in a window. Another thing you get with Cocoa is the NSDocument architecture. This makes it easy to provide your users the kind of features they expect from a modern application, like undo, multiple documents in a running app, and multiple windows per document.
Cocoa provides macros and classes that make it very easy to localize your app, to provide your app in multiple languages. And with Cocoa, you're using Mach-O, so you've got great compatibility with the runtime throughout the system. You've also got NSBundle, the ability to load executable code and resources into your running app. In Keynote, all our transitions, all our builds, our 3D builds, and I think our exporters and our inspectors, are all loadable bundles. So in theory, people can load new ones in.
For ease of development, we love Objective-C. Objective-C is an object-oriented language with a very powerful dynamic runtime, but it's simple enough: it's got a small surface area of language-specific features. So if you take a Java programmer or an ANSI C programmer, it's a very small learning curve to bring them up to speed on Objective-C. And this slide says Project Builder; it should just say tools, right? Because it's not just Project Builder, now it's Xcode, and we're looking forward to using Xcode quite a bit. Project Builder is a full-featured integrated development environment with features like indexed code, syntax coloring, built-in documentation, and source code management integration. And there are the other tools, like MallocDebug and Sampler, things that let you ship a high-quality application.
So why we chose Cocoa. First class integration with the graphical technologies that we really wanted to highlight in OS X really was gonna make Keynote a great application. Powerful components, things you can grab, use out of the box, they're flexible enough that you can tweak them to your desires, and extendable enough that you can tweak them even further. Best of breed tools like Project Builder and now Xcode.
So let's talk about specific features that Cocoa provides, like Cocoa Text. With OS X, you really get capabilities from several layers of the operating system. From Quartz, anti-aliased text, something we've all come to expect now. ATSUI, Apple Type Services for Unicode Imaging, provides Unicode, multiple input handlers, styled text, and the typographic features that I'll show in a minute. And what you get from Cocoa are the application-level features like undo and key bindings, plus things we don't use so much in Keynote, like formatted text input. So let's look very quickly at some of the text capabilities in Keynote. Starting with a blank slide, I just click and create a new text object.
Bring up the font panel. You've got access to all the fonts available in the system, TrueType, PostScript, etc. I don't have my slider, so I'm not showing exactly how smoothly it can resize, but you know. It's anti-alias, razor sharp, looks great. Let me go to my text inspector here in Keynote. So I have control over simple things like alignment, color.
I also can manipulate the character spacing. This is all built into Cocoa, no code required here. And you have fine control over character spacing within a certain range of text. And designers love this capability. So if I go to my next slide, I've got some multi-line text. I can control the line spacing, negative line spacing.
And I want to talk about ligatures. So this is an example of a ligature, which is just an elegant way of combining characters that's built into these fonts. This is the Zapfino font. So T and H are actually, it's one glyph, one character glyph. When I type T-H, it combines it.
But you can also manipulate those ligatures, and this is built into Cocoa. So if I turn off the ligature on that combination, it changes the look. As I said, this is Zapfino. I wanted to show you something that's just kind of cool: Z-A. In Zapfino, the word Zapfino is one ligature. Looks very hot. Of course, if I turn it off, I think it looks... a little boring. This is all in Cocoa. This is all free.
So again, advanced features with Cocoa Text: kerning, character spacing, Unicode character sets; we've got some kanji here in the really nice Hiragino font; and ligatures. There was one feature we needed in Keynote, and essentially you've already seen it, that wasn't supported in Cocoa Text, and that's this kind of cascaded style sheet. We wanted to be able to let the user pick a theme, pick a master slide, which has a certain design, and they get those design attributes.
They make a change, then they may pick another theme. We want to preserve the style changes that the user made. Let me show you how that would work. So the user creates a new show, They use the white theme. Type a little text. Make an attribute change. So they've gone from-- plain text to bold. Finished their presentation, well, white's just not quite flashy enough for me. Let's go to the crayon theme. What we've done is, sure, we've changed the background. We've added some graphics. We've also changed the font of their text, added a little shadow in this case. But we've preserved the attribute changes that the user made, so we've still got bolded text in the last word.
Another capability this gives the user is they can kind of revert. All additional changes, all overrides that they've made can be reverted out. We'll just go back to the standard style sheet. So continuing from the last example, here we are on the crayon theme. The user goes to menu and says reapply master to selection.
and the overrides go away; we go back to the standard master, or cascaded style sheet, set of attributes. Cocoa Text doesn't really support this. Here's a kind of simplified diagram of the architecture of the Cocoa Text system. You have NSTextView, which is what you type in; it controls undo and key bindings and all that. NSTextContainer controls where the text is drawn. NSLayoutManager manages putting the glyphs in different positions.
And NSTextStorage is the data store behind your text. This is the set of attributed runs of text that the user sees. Now, when a user types into NSTextView, or they go to the menu and say bold, what happens is the text view calls setAttributes:range: on the NSTextStorage. So in the case where the user made text bold, it would say the weight is bold and the range is characters 7 through 15, or whatever it would be.
That wasn't going to quite do it for us. That's a flattened set of attributes; we needed to be able to say which attributes existed on the master and which ones were created by the user. So how we did this: we extended NSTextStorage by creating a subclass. NSTextStorage is called a semi-concrete class; when you make a subclass of NSTextStorage, you're expected to provide your own data storage. In this case, it's on the bottom: the attributed string, which is just an NSMutableAttributedString. That contains all the attributes that the user sees, both inherited from the style sheet and added by the user. What we also added was this master attributes dictionary. This is the set of attributes that existed when the user created this text, so in the last example it would be the Gill Sans font at size 14 or something.
When you create a subclass of NSTextStorage, again, it's a semi-concrete class: you have to provide the data storage, and you have to override certain methods. This is the list of methods you have to override. string returns just a plain string with no attributes. attributesAtIndex:effectiveRange: tells me, for instance, that at index 7 in the last example, I've got 10 characters of text that are bold. replaceCharactersInRange:withString: is what happens when the user types or text is pasted in. And setAttributes:range:, which I covered on the last slide, is just applying attributes to a range of characters. What we added in our subclass: we have an initializer that just takes the set of styles that exist on the master slide. We have a method that returns whether an override exists; that's used for menu enabling, menu validation. Does this menu item need to be turned on? Is there an override? Can the user revert this string? Reapply master attributes is what happens when the user picks that menu item; the way that's implemented is basically that we take the dictionary that exists and apply it to the entire range of the string. Set master attributes is kind of the money method here. It's called when the user changes the master slide, changes the cascaded style sheet. It looks at the existing set of master attributes in the string and compares them to the new set. So we say, for each run of text, for each effective range of styled text: does an attribute exist in the old master? If so, let's replace it with the attribute in the new master. That preserves the stuff that's been changed, the stuff that didn't exist in the old master attribute list. Thank you.
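That set-master-attributes merge can be sketched in plain C. This is not Keynote's code (which lives in an Objective-C NSTextStorage subclass); it just models the rule described above, with invented names like `attr_set` and `master_swap`: a value that still matches the old master was inherited, so it follows the new master; a value that differs is a user override and is left alone.

```c
#include <assert.h>
#include <string.h>

#define MAX_ATTRS 8

/* A tiny stand-in for an attribute dictionary: parallel key/value arrays. */
typedef struct {
    const char *keys[MAX_ATTRS];
    const char *values[MAX_ATTRS];
    int count;
} attr_set;

static const char *attr_lookup(const attr_set *s, const char *key) {
    for (int i = 0; i < s->count; i++)
        if (strcmp(s->keys[i], key) == 0)
            return s->values[i];
    return NULL;
}

/* For each attribute on the run: if its value still matches the old
   master (it was inherited), take the new master's value; otherwise the
   user overrode it, so preserve it. */
static void master_swap(attr_set *run,
                        const attr_set *old_master,
                        const attr_set *new_master) {
    for (int i = 0; i < run->count; i++) {
        const char *old_v = attr_lookup(old_master, run->keys[i]);
        const char *new_v = attr_lookup(new_master, run->keys[i]);
        if (old_v && new_v && strcmp(run->values[i], old_v) == 0)
            run->values[i] = new_v;   /* inherited: follow the new theme */
        /* else: user override, kept as-is */
    }
}
```

In the earlier demo, the bold the user applied survives the theme change while the inherited font follows the new master, which is exactly what the `strcmp` against the old master's value decides.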
So Cocoa Text, you have a great set of features, typography, Unicode text, out of the box. It just works with Cocoa and the set of kind of application level undo and key bindings. Minimal amount of coding to get at those features. And they're all supported in the UI. You know, the standard menus for cut, paste, manage kerning, ligatures, fonts, et cetera, just work out of the box. But as we all know, after watching some of these slides, great presentations are not built on text alone.
All the graphical features you get in Keynote come from Quartz 2D. A very quick overview of what you get with Quartz 2D: path-based, vector-based drawing, so we draw our shapes with paths and they resize perfectly; control of stroke and dash settings; bitmap image rendering. It's based on PDF, so you actually have control over drawing PDF documents, or sections of PDF documents, or PDF clip art, within your application. Transforms, which I'll show in the demo: rotation, translation, scaling, shearing, et cetera. And transparency, awesome alpha blending, which again I'll show. Let's take a look.
There's a blank slate; I'll just throw a couple of shapes on, and these guys come on with an image fill. Let me look at my graphics inspector. I can change to, say, a gradient fill. This is supported by Quartz: I can change my colors, change the angle, make it semi-opaque. It's very fast. Add a stroke, various dash styles, opacity here. Change the order. And here are some images. These images are very high quality, but they rotate very fast. Change opacity. Performance is fantastic.
So again, Quartz features path-based drawing and PDF drawing. Here we've got a shape with a standard color and stroke, and an image where I've now added a gradient fill and a shadow. One thing I didn't mention about shadows: what we wanted in Keynote was shadows a lot like what you get on the menus and windows in OS X, and Quartz didn't really have that, so we had to add some code to do it ourselves. The good news is that in Panther the shadow APIs are available, so you hopefully won't have to do quite as much as we did. Again, back to the features: anti-aliased text, image drawing, alpha blending, and great performance.
With Quartz 2D, you have a PDF-based imaging model: the same imaging model that sends information to the display, that draws what you see, is what's sent to your printer drivers for raster and PostScript output. Plus ColorSync, color matching both to the display and to your output device. And on compatible hardware, Quartz Extreme for GPU-accelerated window compositing.
Now, what do we need in Keynote that we didn't get from Quartz? We wanted professional quality image manipulation. As you may know, if you heard the Keynote at Macworld San Francisco, it was designed for Steve Jobs, and he's using these very, very high-quality images, multi-megabyte, and we can't sit around and wait while we're rotating and resizing these images. So manipulation has to be fast, but we also have to quickly move. It's a presentation app. We have to quickly go back and forth through slides.
Well, with Quartz, high-quality resize is built in. Here's an example of how you do it in Cocoa: you just grab the graphics context and set the image interpolation to NSImageInterpolationHigh, and Quartz goes off and does the more sophisticated scaling. If you're not using Cocoa, you can still do this in Core Graphics: grab the graphics port and use a different flag. It's available.
Now, talking about image manipulation: when a user rotates or moves an image around, we want the interaction to be very, very fast. So what we do is cache a smaller representation of that image. If the user is using a 50-megabyte TIFF file, we'll actually save a smaller version while the user is resizing, for instance. Now, this is a kind of abbreviated UML. Our image view is what the user actually sees and manipulates on what we call the slide canvas. The image model's instance variable is called myImage; that's the canonical image that the user brought in from the Finder or pasted, so we're keeping track of the original image. But in many cases, and I'll show you where we create this guy, we have a cached image that we use to render instead, and it's much faster. Okay.
So on our canvas, if the user mouses down, we determine whether an image was selected and tell it to dynamically resize. Within that method, when the image view is told to dynamically resize, it determines whether the model has changed enough that it's actually worth doing. So we say, if this image is, say in this case, 90% scaled, then we'll create a cached image, we'll just make a copy, sock it away. Subsequent, on subsequent renders, we'll just render the cached image, and this actually makes a big difference in interaction performance.
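The "has the model changed enough" test can be sketched like this. This is not Keynote's code; the `should_rebuild_cache` name and the 10% threshold are assumptions for illustration of the decision described above: keep rendering the cached copy until the on-screen scale drifts far enough from the scale the cache was built at.

```c
#include <assert.h>
#include <stdbool.h>

/* Return true when the cached representation should be rebuilt: either
   there is no cache yet, or the current scale differs from the scale
   the cache was built at by more than 10% in either direction. */
static bool should_rebuild_cache(bool has_cache,
                                 double cached_scale,
                                 double current_scale) {
    if (!has_cache)
        return true;
    double ratio = current_scale / cached_scale;
    return ratio < 0.9 || ratio > 1.0 / 0.9;
}
```

During a drag, most mouse-moved events fall inside the threshold, so the expensive downsample happens rarely while the cheap cached render happens every frame.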
Another challenge: creating multi-page PDF. As you see in this example, all these slides, I think with the exception of the first one, use the same background image. The last two use the same piece of clip art, this calendar. The one and the eight are actually text, so it looks pretty good. The charts in this example use an image fill, a kind of repeating image. When we generate PDF, we want to make sure we're not sending these images out to our PDF stream multiple times. How do we do that? Well, PDF supports this optimization: you write the image once and then use an XObject reference, and PDF will go back and read it out of the original stream. CGImageRef supports this capability. Let me show you how we create these CGImages.
Again, back to our UML model. The image model contains the original NSImage that we've instantiated from the Finder or pulled in off the clipboard or whatever. We also keep track of a CGImageRef, which uses the same data. As long as we use that CGImageRef to render, we'll get this optimization in our PDF output. To create a CGImageRef, you have to create a CGDataProvider. If you have an NSBitmapImageRep, which imageRep in this example is, you just pull the bitmap data out, tell the data provider what the depth is and how big the image is, and create the provider. Then hand that off to CGImageCreate. Again, you tell CGImageCreate how big the image is, actually how big you want to render it, the depth, and a couple of other flags that control the color space and what the alpha value is; these are just the default flags here.
And when you want to draw that CGImage, the CGImage variable I've got here, which is a CGImageRef, you just draw it to the current graphics context and tell it how big you want to draw. As long as you draw it to your PDF output context, you're going to get this optimized output.
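The effect of that CGImageRef reuse can be modeled without any Core Graphics at all. This sketch is not how CGPDFContext works internally; it just illustrates the XObject optimization described above: the first draw of an image writes its payload into the stream, and every subsequent draw of the same image emits only a cheap reference. All names here are invented for the illustration.

```c
#include <assert.h>
#include <stddef.h>

#define MAX_IMAGES 16

/* A toy stand-in for a PDF output stream that remembers which images
   have already had their bytes written. */
typedef struct {
    const void *seen[MAX_IMAGES]; /* identities of images already written */
    int count;
    int bytes_written;            /* full image payloads emitted */
    int refs_written;             /* cheap XObject-style references emitted */
} pdf_stream;

/* "Draw" an image of the given size: write the payload the first time,
   emit only a reference on every later draw of the same image. */
static void pdf_draw_image(pdf_stream *s, const void *image, int size) {
    for (int i = 0; i < s->count; i++) {
        if (s->seen[i] == image) {
            s->refs_written++;    /* already in the stream: reference it */
            return;
        }
    }
    if (s->count < MAX_IMAGES)
        s->seen[s->count++] = image;
    s->bytes_written += size;     /* first use: write the real bytes */
}
```

For the slide deck in the example, the shared background is written once no matter how many slides use it, which is why the resulting PDF stays small.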
So to summarize Quartz, rich 2D capabilities, path-based drawing, PDF model, all available to you. Layered APIs, if it turns out you're using Cocoa or whatever, Java, and the APIs aren't available, you've always got core graphics, which is the Quartz kind of standard layer of APIs. And phenomenal performance with Quartz Extreme.
Now, that was one of our 3D transitions. All the editing on a keynote slide canvas is Quartz, but how do we get this core graphics, these Quartz objects, up to the screen in OpenGL so that we can do these kind of cinematic effects? So there's, again, the 3D effect. We've also got the 2D effects, so there's a drop. And we can not only apply these effects to a slide transition from one slide to another, but also objects that are going onto the slide. So there's a 3D text build, a little flip. We can also build objects off the slide. So here's a scale. Okay.
As you saw when I was creating my builds earlier, I got a little preview in my inspector. That's an NSOpenGLView, which is available in Cocoa: basically, drop this view into your Cocoa application and off you go. It's actually a really good way to get started with OpenGL programming. In this case, what the user sees in this preview is exactly what they're going to get when they go to full-screen mode.
How do we get the Quartz 2D images and text and all that, the path-based shapes, into OpenGL? We use this abstraction that's in the center here called a textured rectangle. So I'll walk you through how we go from something like an NSImage: we grab data from the image, apply GL operations to it, and render it as a textured rectangle. Let's look at the code for that. The interface to this textured rectangle: it's not just one kind of blob that we're manipulating in OpenGL. In order to get maximum compatibility with the hardware that's available, in some cases you want to chop your textures up on power-of-two boundaries. So this textured rectangle abstracts the fact that there's this array of smaller textures that we may have created. And we also track the target, the thing that we're going to actually render the textures to.
The way you initialize this guy, in this case, you can create it with a bitmap; there are several initializers, and it's a much bigger API than this, obviously. And when you want to draw, there's an API for drawing given all four corners. If your modelview matrix isn't transformed, you actually just want to specify exactly where the texture is going to draw, with opacity. Thank you.
Now, in NSBitmapImageRep, the data is actually RGBA. OpenGL will render this stuff, but the most efficient way to do it is using ARGB. So this is just a very simple method that swaps the high-order and low-order bytes. It's very fast, and now with ARGB we've got the highest-performance manipulation of the textures on the graphics card. Now we want to draw the image. This first page here shows how we turn on the opacity support: we pass GL_BLEND to glEnable,
pass some flags to glBlendFunc, and apply the opacity using glColor4f. I'm actually not the OpenGL expert, but there's a session on OpenGL optimization where they address why it's important that the image is premultiplied. In order to render the textures, as I said, we've got this abstraction that hides the fact that this thing's been chopped up into several textures on power-of-two boundaries. So we walk through the texture array and blit it all out: glTexCoord2f and glVertex2f to draw at the corners based on the offset from the original texture. So we've chopped it up, and we're keeping track of where the offsets are. Then we turn off opacity by disabling blending, and release the target.
So with OpenGL, we're not just using it to do our flips. The GPU is available, and we found a good way to use it to accelerate 2D effects, to accelerate Quartz 2D output. With Cocoa, you have high-level pieces like NSOpenGLView to do previews and get started with OpenGL. And there are several levels of APIs available with OpenGL that are all there in OS X: we use CGL in Keynote to manage the display, and GL for the lower-level operations.
So the last graphical technology we'll talk about is QuickTime, and I've got a little demo here. This is a movie that's set to loop, and I've applied a 50% transparency, I think, to this guy. And I'll build a few additional movies on. Now, as each of these movies comes on, it's set to a certain poster frame. This is the frame we want to display when you first see the object on the slide.
I think you can see you've got an excellent frame rate while the builds are happening, and this opacity, and this is really the power of Quartz Extreme at work here. So with QuickTime, you're really leveraging all the advantages that are there within the system. You've got all the image formats, and those are available through NSImage in Cocoa. All the sound formats: the sounds that we use in Keynote, we kind of treat them as movies, we just play a sound. And, obviously, full-motion video. And when I say you're leveraging the power of QuickTime: updates to QuickTime come out, they support new image formats or movie formats, and that just works. QuickTime 6.3 came out a few weeks ago, and suddenly Keynote supports exporting 3GPP movies. So we can export a 3GPP movie of slides, and it's like a little slideshow preview on your cell phone.
The image file types that are available on your system depend on what codecs and what version of QuickTime are installed, but you can ask NSImage what image file types are supported. This returns a list of file extensions and HFS file types. So again, the image formats are available, integrated into NSImage and Cocoa. You have NSMovieView, which you can use to preview a movie, control the playback and speed, and scrub through it. And there's a very flexible, extremely flexible core API for not only showing movies and grabbing content from your media, but also exporting QuickTime movies.
So this is kind of just a simple look at how you can use NSMovieView to build a small, simple version of QuickTime Player. You call start: to start the motion, set the rate if you want to fast-forward or reverse through the movie, and you can control volume and mute the movie.
What we wanted in Keynote, as I pointed out: these movies I was building on are all set to a certain poster frame. So in this example, in the inspector, the top slider controls the poster frame that the user wants to see when the movie comes on. So you can move this guy back and forth and you're scrubbing through the entire movie. So let me show you how you do that using NSMovieView.
The slider sends an action, takePosterFrameFrom:, with the slider as the sender. The slider value is a percentage: zero is the start, and 100% is all the way through to the end of the movie. We determine the floating-point value from the slider (0.5 would be halfway), determine the time offset within the movie that that percentage corresponds to, and then set it on this subclass of NSMovie that you'd create: you'd say setPosterTime: with that offset, and tell the movie to display that offset, obviously, as the user scrubs through.
I do believe I missed a slide: the implementation of those methods. Well, yeah, actually, I did miss a slide. Within the implementation of setPosterTime:, you would use the QuickTime API SetMoviePosterTime, which takes a QuickTime movie and a time value to set the poster frame. So that's a case where you're mixing the high-level APIs of NSMovieView and the lower-level APIs of QuickTime. So to summarize QuickTime: phenomenal compatibility with all these image formats, media formats, sounds, et cetera; Cocoa integration with NSMovieView and NSImage to support the image formats; and a rich, comprehensive set of APIs to control media playback and export.
Finally, XML. We love XML. It's really created kind of a community already around Keynote: there are applications available that manipulate and use the Keynote format. We think XML is a great way to encourage innovation, get other parties involved in supporting your app, and encourage use of your application. And one of the things that's not that obvious is that it's really good for development: you can validate the set of documents that you generate.
You can generate these documents before your app's complete. Once you've got your data model and your XML schema or document type definition ready, generate a set of documents that your app ought to be able to handle. It's a very simple way to do some testing. And validate the output: make sure that your archiving code is actually generating documents that conform to the standard that you've specified for your XML.
What do you get in OS X? Core Foundation has several layers of XML support. If you're just using NSCoder (NSCoder, if you're not familiar, is kind of the standard way of serializing objects in Cocoa), you can tell it to use an XML format. The standard format's binary; if you pass the property list XML format flag, then the output will be XML, and that property list format is documented, I think on ADC someplace. Within Core Foundation you have a high-level parser: you just hand it a document or a stream of XML text, and it hands back a structure called a CFXMLTree. You can then walk through the elements, data, and attributes of each node and construct your data model that way, or just manipulate it, search for data, et cetera. There's also a low-level parser, which is a SAX-type parser: as elements and attributes are encountered, callbacks that you define are invoked, so your code is called as the document is parsed, and you create your data model that way.
Well, for Keynote, we use the low-level parser: we encounter a slide, and we go off and create a slide. But when we generate XML, we actually just use a mutable string. We generate this NSMutableString on the fly, and this is kind of the class that we use to do that. It's called a DOM writer, a document object model writer. So you start an element.
You say, "I want to start an element," something that's a slide or a bullet point; you end it; you write attributes, like what color it uses; you control formatting with indent, increase and decrease indent; and you can also pass in this structure, this XML element similar to CFXMLElement, to write an entire subtree all the way out to the string.
So our file format is open. The schema is available in TechNote 2067. There's already third-party developer support: FileMaker has an application, and a library of code, that generates Keynote documents; 4D has the same thing. We think there are a lot of possible applications. You can obviously pull data off the web or from your enterprise data store and generate kiosks, presentations with great 3D transitions. Migrate your legacy data into our tables or builds. I'm sorry, our tables or charts. And I'm actually going to give you a demo of the 4D presentation builder. What it does is the user interacts with this UI to create a query, it generates the XML, and the output is a presentation. I'll show it to you.
So here's the UI. They've set up this kind of sample database of properties. I think they include the code that generates our format, but this is just a sample database of, for instance, real estate properties. The user can go pick a buyer. So the idea is I'm a real estate agent, and I'm going to edit the attributes of this buyer: what they want to spend, where they want to live. And then create a new presentation. So it tells me it's going to show all the properties at this price and this zip code, and I want to open the new presentation. So the document's created, Keynote opens, and I'll just play it. So now I'm a real estate agent with this very simple, simply created little kiosk that I can show my client, who's been sitting there with me for three minutes. So you get the idea. Yeah, yeah.
The slides, please. Thanks. So XML: it's great for building a community around your application, and it's really great for development. Validate your documents; generate your documents before your app's even ready to generate them itself. And there's support in Core Foundation for several levels of XML usage. So with Keynote, as we developed the app, we found that Cocoa was the best framework for us. We made great use of the Cocoa text capabilities, which make it very simple to get at the very powerful technology that's available.
We use Quartz for creating beautiful, graphically rich presentations. OpenGL for 3D cinematic effects as well as 2D effects. QuickTime, we generate QuickTime movies. We display QuickTime movies. We have great control over the QuickTime movies that are in the presentation. And XML, again, an open file format that you can really build support around.
So what can you get out of this? All these technologies are available; it's a modern operating system, so make use of the technologies. If one way of getting at the feature set you want doesn't work, there's probably another set of APIs, and they're all extensible. There are first-rate tools available on the operating system. Take the people you've got in your organization. I look forward to seeing your apps. Thanks a lot.
So what I'd like to do is just go through the roadmap of the remaining graphics and imaging related sessions. Starting tomorrow morning, Wednesday, in the Marina, we have the Image Capture update, where we're going to talk about the APIs in the system that handle both digital cameras and scanners. So if your application needs to import or bring graphics from the real world into the virtual world inside your computer, this is the architecture and the APIs you're going to want to know about.
Later in the day on Wednesday, we have Vertex Programming with OpenGL. A big theme for the graphics and imaging track this year at Apple is leveraging the GPU to do exciting things, and with vertex programming we're going to talk about how to use the GPU's ability to transform geometry to do interesting geometric transformations, because there are a lot of interesting effects and a lot of complex geometry that you can generate using the GPU. Then we're going to go into a ColorSync update, where we'll talk about a lot of the new developments in ColorSync. And again this year, as evidenced by information in the QuickTime overview session this morning, there's been a behavior change inside the system with regard to color management. In the newer APIs inside the system, we've been building color management automatically into the platform (for example, Quartz 2D and Cocoa are very color-management aware), but a lot of Carbon apps that leverage things like QuickTime image importers were not handling profiles correctly.
So we're continuing to build out the support for ColorSync by now making QuickTime more ColorSync aware, and this session will talk about that, amongst other things relating to color management. Then we're going to have a session on Quartz 2D in depth. If you liked what you saw in this presentation with regard to the graphic capabilities of Keynote, or if, for example, you're currently using QuickDraw as the key API in your application,
this is number 207, Thursday in the morning; it's going to be where you want to go to learn about Quartz 2D. And then we're going to have a very interesting session, 208, which is sort of the companion session to vertex programming: fragment programming. This is doing essentially per-pixel operations, filters if you will, using the GPU. This is a very, very interesting session; I have to say it's probably my favorite session in the graphics and imaging track, just because the capability of doing per-pixel operations on the GPU is so unlimited right now, and with the advent of programmable hardware it's a really interesting time in computer graphics. So that's going to be Thursday at 10:30 a.m. Thank you.
Then another key session: if you have an OpenGL application and you want to learn how to optimize it, or you're a beginning OpenGL programmer and you want to learn how to do the right things the first time, you want to come to 209, OpenGL Optimizations. This is really going to focus on bringing you a lot of information about the fast paths that we put into Mac OS X's OpenGL, because our OpenGL stack is optimized and tuned in certain ways that, once you put your application on those fast paths, you're going to get unbelievable performance.
So that's 209, on Thursday. And then we have 210, the Mac OS X printing update, which is going to cover a lot of the new developments in Mac OS X printing. We're actually going to talk about one exciting one, which is the system's new ability to convert PostScript into PDF, a new Panther feature. That has significant ramifications for people in printing and imaging, so it's an excellent session to attend.
Then we have 211, which is Introduction to Quartz Services. This is our first session on Friday, and it's going to talk about the parts of the Quartz architecture that don't actually draw anything: the parts that control and manage displays. So this is an important session if you have a full-screen application, or an application that needs to find out what the display environment is, because there's a lot of API there that developers have had trouble finding and understanding how to use, and we're going to talk about it in this session. And then in 212 we have our hardware partner, ATI, coming this year to talk about the latest techniques for doing interesting visual effects using programmability and their product, the Radeon 9700. So that's session 212, Cutting Edge OpenGL Techniques.
We have a partially related session: if you're an application developer, you might want to check out the Mac OS X accessibility session, where we're going to be talking about how to make your applications accessible to users with disabilities. It's an important session for anyone who develops an application to attend, because there's a large audience out there whom, through simple changes in your application, you may be able to reach: new customers who happen to have a disability. Then finally, we have the main opportunity for you guys to give us your feedback and let us know what you thought of the sessions and what you want to see inside the OS. Essentially, let us know what we need to do to continue to evolve Mac OS X as the most innovative and graphically powerful operating system on the planet. That's the feedback forum, which is in our traditional slot, Friday at 5:00 p.m. So, thank you very much for attending this session. Appreciate it.