WWDC Index does not host video files

If you have access to video files, you can configure a URL pattern to be used in a video player.

URL pattern

Use any of these variables in your URL pattern; the pattern is stored in your browser's local storage.

$id: ID of session, e.g. wwdc2000-304
$eventId: ID of event, e.g. wwdc2000
$eventContentId: ID of session without event part, e.g. 304
$eventShortId: Shortened ID of event, e.g. wwdc00
$year: Year of session, e.g. 2000
$extension: Extension of original filename, e.g. mov
$filenameAlmostEvery: Filename from the "(Almost) Every..." gist: ...
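As a rough illustration, pattern expansion of this kind can be sketched with Python's string.Template, whose `$name` placeholders match the variable syntax above. The variable values are the examples from the list; the pattern URL and host are made-up placeholders, not a real video source:

```python
from string import Template

# Example variable values for session wwdc2000-304 (from the list above).
variables = {
    "id": "wwdc2000-304",
    "eventId": "wwdc2000",
    "eventContentId": "304",
    "eventShortId": "wwdc00",
    "year": "2000",
    "extension": "mov",
}

# Hypothetical pattern; substitute your own video host.
pattern = "https://media.example.com/$year/$id.$extension"
url = Template(pattern).substitute(variables)
print(url)  # https://media.example.com/2000/wwdc2000-304.mov
```

Template resolves each `$name` against the full identifier, so `$eventId` and `$eventContentId` don't collide.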

WWDC00 • Session 304

QuickTime: Interactivity

Digital Media • 1:20:30

Not only is QuickTime a multiplatform media engine, but it also has the ability to add interactivity to your media. Learn to enhance the overall user experience of your product using SMIL, Flash, wired sprites, and QuickTime VR.

Speakers: Kaz Ohta, Eric Blanpied, Sean Allen, Bryce Wolfson

Unlisted on Apple Developer site

Transcript

This transcript was generated using Whisper, and it has known transcription errors. We are working on an improved version.

My name is Kaz Ohta. Myself and a few more people will talk about the interactive aspects of QuickTime in this session. I was told that there is simultaneous translation going into Japanese, which is my primary language. So I'm kind of curious how many of you actually understand Japanese. Just raise your hand.

Quite a few. Well, I guess I'm going to try speaking Japanese. Ohayou gozaimasu. Nihongo de ikimashou ka? (Good morning. Shall we go in Japanese?) Well, I don't think the translation goes the other way around, so I'm going to speak in English. Anyway, I'd like to spend a few minutes giving a high-level overview of interactivity.

So in this WWDC, the interactivity has already gotten a lot of exposure. How many of you went to the keynote on Monday? Good. Where Steve Jobs showed a demo of our new fully immersive panorama projection engine. How about the QuickTime overview session on Monday afternoon? Very good. Where Tim Schaaf talked about interactivity and Jim Batson showed a few very interesting demos.

And I'm sure those of you who went to the software technology and scripting session yesterday were amazed with the stuff Matthew Peterson put together, which actually pushed the interactivity technology to the extreme. I'm sure nobody in the QuickTime engineering team had imagined that you could do that much with QuickTime.

So I think you already have a pretty good idea of what a QuickTime movie can do to deliver an interactive user experience. So what we're going to do in this session is pick a few specific features, take a closer look, and then talk about how you can take advantage of them.

So what we're going to talk about is the kinds of user experience you can provide through the interactivity features. And we also are going to talk about some of the tools and techniques you can use. Some of these things are already available to you, so you can try them out today. And some are things we're working on right now. Of course, we try hard so that these new things become available to you as soon as possible, but don't try them at home yet.

So for those of you who are already familiar with some of the details of QuickTime, these are kinds of things we call interactivity. And how those pieces fit together can be shown conceptually as in this diagram. We have the foundation of QuickTime Core at the very bottom. And we have a few services which provide additional interactivity to your media.

And we have a set of standard interactive media like sprite, flash, VR object, and VR panorama. And on top of all of these, we have the movie controller, which works as the interface to the user. By having the standard movie controller here, we can provide consistent user experience across various kinds of media.

And I think you learned some of the techniques you can use to support streaming movies better in your playback application; Kevin Kofun did an excellent summary of tips and techniques for your application. And the important thing is that many of the things that make streaming movies work better also make interactive movies work better. So please remember that.

So what is interactivity? Here's how we define it. It is not driven by time, but driven by events generated by the user. A good example is a slideshow like this. It goes forward and backward based on your action through the keyboard or mouse, or a remote control device like this.
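The event-driven model described here can be sketched as a tiny state machine: the slide index changes only in response to user events, never with the passage of time. A conceptual illustration, not QuickTime code:

```python
class Slideshow:
    """Advances only on user events, never on a clock."""

    def __init__(self, slide_count):
        self.slide_count = slide_count
        self.index = 0  # current slide

    def handle_event(self, event):
        # Map user events (key press, remote button) to navigation,
        # clamping at the first and last slide.
        if event == "forward":
            self.index = min(self.index + 1, self.slide_count - 1)
        elif event == "backward":
            self.index = max(self.index - 1, 0)
        return self.index

show = Slideshow(slide_count=10)
show.handle_event("forward")
show.handle_event("forward")
show.handle_event("backward")
print(show.index)  # 1
```

Nothing moves between events; that is the contrast with time-driven playback.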

[Transcript missing]

The key element here is that the media react in response to your action in real time. A good example is a QuickTime VR panorama, where you can change the view in panorama based on your action. One more important thing I'd like to talk about is the ability to navigate through a media composition. One of the strengths of QuickTime is to be able to combine various kinds of media into a single composition or single container. And in that case, it is a natural consequence that you would like to have control over combined media.

In that case, the way those media are composed is specific to a presentation. The presentation has to provide a way to control these media. This is where the interactivity comes into play. You can think of a presentation which has a couple of video clips, each of which has its own controller. Movie embedding is something you can use in order to provide this kind of user experience. We're going to talk about movie embedding later in this session.

Now, I'd like to spend a few more minutes talking about the levels of interactivity, which will help us classify the broad range of interactive user experiences and set the right focus across them. Examples of media with the lowest level of interactivity are things like traditional TV broadcasting and cinema.

You basically sit back and relax in front of your screen, and all you do is push one of a few control buttons. And I think you would want to have a large cup of soda and popcorn. I guess you'd call it the couch potato style, is that right? Well, I think that in this case you would need potato chips instead of popcorn, but that doesn't really matter.

Surfing the net would require a little bit more interactivity. Nothing would happen if you were sitting stationary in front of your computer. You locate the links on the web page and then click them in order to get things going. I think that requires a little bit more calories, so I guess you want to have a couple of pieces of pizza in addition.

And things that require the highest level of interactivity and engagement are video games. I think it is likely that you are leaning forward towards your screen and tightly holding your control device. I don't think you have time to bite something in the middle of the game. So this may be a kind of starving experience.

So where are the QuickTime focuses? We think our focus is mid to high level of interactivity. Our goal is to provide the level of interactivity which goes beyond the traditional web browsing based on hypertext. We'd like to have that level of interactivity with more dynamic kinds of media. And we want your experience not to be so starving.

And the other thing is the media composition. As I mentioned, combining multiple kinds of media into a single container is one of the strengths of QuickTime. And we continue to focus on that. And you can add some sort of interactive behavior to your media object through wired actions. And I think this is another big advantage of QuickTime. And of course, the primary means of sharing your media content is the Internet. So we continue to work on this area so that you can provide more interactive experience through the internet.

So what exactly you'll learn in this session? We're going to talk about a couple of the interactive features you can have access to today. The first thing is some of the tricks and techniques you can use within QuickTime VR and Flash. And the other thing is the movie embedding.

In which you can do some interesting things combining your media. And then we're going to talk about a couple of new things we are working on right now. The first one is the extension to our Panorama projection engine, for which Steve showed a demo on Monday's keynote. And the other thing is the extensions to Word Actions, which helps your movie to communicate with the server. We're going to show some demos using WebObjects on this topic.

So I'd like to have Eric Blanpied come up and talk about some of the techniques you can use in VR and Flash. Eric? Thank you, Kaz. Good morning. So I'm a content guy on the QuickTime team working on interactive stuff. Today I'm going to be talking to you about things that we can do already, stuff that you can go home and work with.

First off, existing QuickTime VR panoramas and object movies. Since we released QuickTime VR, the main thing that we've stressed is the ability to do these 360-degree views from the inside and from the outside: we refer to them as scenes or panoramas when you're looking from the inside, and as objects when you're looking at a thing from the outside in.

And lately we've been sort of trying to point out that on the web with all of the product work going on, thinking of objects and products is a really good approach. Of course, panoramas or scenes can be of things that are products for sale, such as houses and all. But generally speaking, the stuff that you're selling with this sort of interactivity would be... products you can hold, or, well, cars you can't hold, but there's a lot of work done with cars as well.

The other major things that we use in the interactive parts of QuickTime are sprite movies, which consist of mainly bitmap artwork, or sometimes vector artwork as well, that winds up getting downloaded and used and transformed and so forth, as opposed to having multiple frames in a movie. And flash tracks, which involve Macromedia's excellent graphics engine, allowing us to do vector-based artwork that's quite sophisticated and composite in interesting ways with other QuickTime media types.

And then finally, we build wired movies using any and all of these technologies, adding actions to the various different elements, adding URL links and other sorts of things, adding deep interactivity in many ways, the kind of stuff that a few years ago we really had to use dedicated authoring environments and playback environments for. But now, with QuickTime's interactivity, it can happen anywhere.

A big deal about VR movies on the web has been their size; people have complained about it, especially with the objects. And a big issue with a lot of it is actually more perceived speed than actual speed. We've got various ways of adding preview tracks and reordering the data in the file to address that as it comes down.

As it comes down, you get to see a really meaningful representation of the scene or the object right away. With the panoramas, we've got preview tracks, so you get a low resolution view of the whole thing. And then the tiles come in, alternating in front of your main view so that very quickly you get to see what's there. And hotspots are active almost immediately as well.

With objects, we can reorder the views so that as the object comes down, the first four views are the front, back, left, and right, so you can see all sides of the object pretty much immediately. Then as more data is downloaded, the panning gradually becomes smoother: you see more and more incremental views in between.
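The reordering described here, front, back, left, and right first, then progressively finer in-between views, can be sketched as a breadth-first bisection of the ring of views. A conceptual illustration of the download ordering idea (cleanest when the view count is a power of two), not the actual file-format logic:

```python
def progressive_view_order(n):
    """Order n ring views so coarse coverage comes first.

    Starts with view 0 (the front), then repeatedly halves the
    step, yielding front/back/left/right before the finer
    in-between views fill the gaps.
    """
    order = []
    seen = set()
    step = n
    while step >= 1:
        for i in range(0, n, step):
            if i not in seen:
                seen.add(i)
                order.append(i)
        step //= 2
    return order

print(progressive_view_order(8))  # [0, 4, 2, 6, 1, 3, 5, 7]
```

With 8 views, the first four downloaded are 0, 4, 2, 6: front, back, and the two sides; the remaining views only smooth the pan.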

So these are ways in which a slightly larger size file can wind up giving you a much higher quality experience for the user. And the sort of thing that you can only do with the photorealistic renderings or photographic media that we make up VR movies with, as opposed to any kind of a 3D sort of VR kind of approach. And then there's a lot of interesting things going on with adding value to VR movies.

We can add links to the hotspots that can be simple URL links, or we can do more sophisticated things with links and frames and so forth. People are adding sound and maps to help navigate, and just building them into more complicated presentations, either using HTML and various sorts of techniques like that, or by using the QuickTime interactivity that we've got to build movies that are sophisticated and can play right in the QuickTime player or anywhere else that QuickTime plays.

So, I guess I just talked about the fast start exporter a bit. It's a free component from Apple, which is available in our developer tools section. Drop it in your system folder and use it with QuickTime Player Pro to export your VR movie, whether it's an object or a pano, as long as it's a single node, and get those fast start benefits I was mentioning. For multi-node movies, you can take a large multi-node movie and run it through a third-party tool such as Deliverator or Converter, which are from VR Tools, to provide the same sort of preview tracks and so forth.

In my mind, that sort of defeats some of the power of the internet and all, because still the user's got to download the entire movie to view it. And if it's a multi-node file and they don't choose to visit every node, they waste a lot of download time.

So it's simple to link the multi-node movies as single, separate nodes using URL links within your page. Then users only download the nodes they actually visit, they get the hotspots pretty much immediately upon seeing a preview track, and you can have a pretty responsive internet experience there. So let's go over to this demo machine and look at a bit of this stuff.
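The linking approach described above amounts to generating one plain URL link per single-node movie. A minimal sketch; the node names, file layout, and host are hypothetical:

```python
# Hypothetical single-node files exported from one multi-node scene.
nodes = ["lobby", "hallway", "roof"]

def node_links(nodes, base_url="https://example.com/vr"):
    """Build one HTML anchor per single-node VR movie file."""
    return [
        f'<a href="{base_url}/{name}.mov">{name}</a>'
        for name in nodes
    ]

for link in node_links(nodes):
    print(link)
```

Each link fetches only that node's movie, so the user never pays for nodes they skip.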

[Transcript missing]

Davo Photographic does, for instance; he's put a VR object right in here, a rather unusual VR object, into his main VR page. You may recognize his work if you've spent time looking out at the various sites on the web, such as the... and this PT Cruiser by Chrysler and all. He's done a somewhat non-standard view here.

Let's look at one that's more of a normal rotation. This one isn't standard either. Okay, so he's done some interesting crane work here to make it rotate around. But these are all using the same technology; they're just using it with a different kind of media than has been used in the past.

That involves an object as the starting point. And then he's got some panoramic views in here as well. So we come around, we can check out the engine, come around, zoom in on the interior, and then all the way around it.

[Transcript missing]

VR is obviously excellent for architectural work. Here he's done it without the preview, so we saw the

[Transcript missing]

extend the experience by really building it into a more complete interactive sort of thing. I'm going to a site done by Dennis Glicksman, a French VR author.

So this one, well, we were on a T1, or T3 or whatever it is, so it was hard to tell, but there was a preview track there for a moment; the tiles loaded over it so quickly. Let's pop to another view. And he's got a map over here with JavaScript kind of stuff going on to determine what node you're going to see.

So this is a good example of using QuickTime VR with existing HTML approaches to make for a much more interactive experience, which gives a lot more information to the user than just the panorama on its own. From there, to sort of go up the path a little bit into more

[Transcript missing]

and other engineers have been involved in the development of QuickTime VR.

There's some directional sound here that they added, probably with the Squamish tools. Squamish Media makes a tool called SoundsaVR for adding directional and ambient sounds to VR movies. And down here they have another QuickTime movie. This whole rectangular region at the bottom is another QuickTime movie, which is interacting with the panorama above, which can be changed either through... I think there's some hotspot links in here.

And the movie down below doesn't change, but it continues to interact with it. The compass is there, the controls, and the map, which we can click on to... I believe we can click on the map to change nodes. Maybe not. So again, we're not in a separate player, but we've used multiple QuickTime movies talking to each other with intermovie communication to really up the interactive experience and make it more than just "let's look at a movie there."
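Intermovie communication of this sort, a panorama broadcasting its pan angle and a compass movie reacting to it, can be modeled as simple message passing between two movie objects. A conceptual sketch only; the class and message names are made up, not the wired-action API:

```python
class CompassMovie:
    """Listens for pan-angle messages from another movie."""

    def __init__(self):
        self.heading = 0.0

    def receive(self, message, value):
        if message == "pan_changed":
            self.heading = value % 360.0

class PanoramaMovie:
    """Broadcasts its pan angle to registered listener movies."""

    def __init__(self):
        self.pan = 0.0
        self.listeners = []

    def pan_to(self, angle):
        self.pan = angle % 360.0
        for movie in self.listeners:
            movie.receive("pan_changed", self.pan)

pano = PanoramaMovie()
compass = CompassMovie()
pano.listeners.append(compass)
pano.pan_to(90.0)
print(compass.heading)  # 90.0
```

The compass never touches the panorama's internals; it only reacts to messages, which is the point of intermovie communication.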

Another good example of that is by a Swedish developer; there's a lot of European stuff going on that's pretty fascinating here.

The VR movie is embedded into a sprite track which has got the meters for the sound and rollover information, and again a controller here. You click on info and get some rollover information. Right, we can click on these people and get information about who they are.

It doesn't zoom on them. The movie on the right, the compass, is giving us feedback similar to the one we just saw. So this is an even more complicated example of intermovie communication. And then they've got some other things like...

It's all animated. I believe they're building their stuff mainly with Live Stage Pro, which you guys are probably hearing a certain amount about. So let's take a quick view of that sort of thing. Let's go a little further here. This one's from a German developer, Tilman Hample.

and he is building his interactive experience with... This is all one big QuickTime movie. It's launching in the browser, but it could just as well be playing in the player. And he's got a number of segments of movies that jump from one to the next. They're loading different tracks at different points here.

Here we've got a VR movie that's masked. You'll notice the corners are rounded. He's got a track on top with an alpha channel providing for a non-rectangular frame. That's something I think is really unique to what we can do, is the ability to layer up the tracks and do that sort of masking. Not only can we break out of the 4:3 window that people seem to think video has to be in, but we can make it non-rectangular as well.

Composition would be much more interesting. The sprites over here allow for panning and so forth. Then we can navigate the different parts of the hotel this way. I believe that we go outside here and he's got linear movies of beach activities and so forth, which are also going to play framed like that.

This is clearly a high bandwidth demo. So the arrows don't do anything because the frame has changed the same. But the idea is that you've got an interactive environment here that is really the sort of thing that a couple years ago you'd be building in director or something. Finally, a last one, also by Tilman Hample of Actum Film in Germany. This is a simple demo, but it shows something that I like to show about just combining these different media types. We start out with a walkthrough into the store.

We get to the end and he uses an action at the end of the movie to tell it to load a different movie. It's a panorama of the interior of the store. We've got an object in front of us that has a hotspot we can click on once it's loaded here.

We click on that object and the object movie is there composited into a frame of the panorama so that you get the object in place as opposed to being a real jarring place where you got out of it all and then there's a way to jump back into the store and pan around there. Alright, so maybe we'll go back to the slides now.

So, a bit about Flash. A few of those movies had some Flash elements, but mainly that was bitmap work. Macromedia's done an excellent job of developing a really first-class graphics engine for doing vector artwork. This is great because of the small download footprint. Also, so many of the creative artists out there have a lot of work and skill with vector artwork programs like Illustrator and Freehand that they can leverage into doing work in Flash.

And it's even becoming a bigger deal with Adobe releasing Live Motion, which will allow people familiar with the Adobe tools to leverage that into doing motion graphics. And the motion graphics people who use After Effects and all can leverage that into doing Flash work as well. So I think Flash is just going to keep going. We've announced support of the Flash 4 format, and we're going to just keep working with it.

We've also done a fair amount of work to make Flash movies composite really nicely with QuickTime. All the transparency that you get in a Flash movie is there in the Flash track when you've imported it into QuickTime. And if you layer it with other QuickTime media, you can set the transparency to alpha; you get all that selective and partial transparency in your composition there, and you can have things play through it and all.

So there's a lot of work you can do with the animation there. And again, I think that's an exciting thing about the Adobe Live Motion thing: the power of the work Adobe's done with After Effects in developing animation tools will be available now as an alternative approach to the Flash authoring tools. So there's going to be two good tools for doing animation with Flash.

Also, it scales beautifully, and you can build big interfaces with QuickTime using Flash elements that don't take any more time to download than if they were scaled small. So you can quit having to think of your thing as going in a small window. You can have a nice big window, with maybe just the video kept small; you've got to worry about the bandwidth because you're going for modem speed. But that doesn't mean that your presentation has to be small.

And then, Flash has a fair amount of scripting in it. There are various actions that allow you to do various different things with Flash elements. And that stuff is carried across in QuickTime: when you bring a Flash movie with actions into QuickTime, those get mapped to QuickTime's own actions, and you get the same interactivity there. And you can get into combining it with other media types and adding further interactivity to the Flash track as well.

So, as you may have been hearing, I'm big on combining the media types. I have a multimedia background and doing the CD-ROM work and all. And I'm really excited to be able to really make these things as complex as possible and use the elements that you need, use the technologies you need for the elements that they make sense for. So we've got the sprites that I was talking about that are great for having buttons and bitmap kind of things or doing stuff that is best done in bitmap artwork.

We've got VR for doing, QuickTime VR for doing the scenes and the objects. And Flash, providing a good scalable animation tool that's vector-based. And then all these other kinds of QuickTime media as well. Like the streaming media, whether it's fast start streaming or the RTSP streaming, whether it's coming off a regular HTTP server or a QuickTime streaming server. And then text and other things. All of these media can very nicely go together in QuickTime.

There's plenty of ways to control these different elements once they get put together with interactivity. It's very possible to use QuickTime now to create very custom players which will play with a very minimal amount of the branding from QuickTime, which a lot of content developers really want. They want to have their identity be the thing that gets put forward.

And pretty much once you finish seeing the QuickTime Q logo load, you can take over the presentation from there and have it be your thing, the way you want it to be. And then use the QuickTime wiring to make these into wired movies and make them really do a whole lot. So let's look at a little bit of that.

Let's see. So the Swedish company Berserk, which I showed earlier, has done something which is perhaps not going to be seen as incredibly flashy; it's not like an MTV kind of demo. This is one where the city council of a small town decided to use the web to broadcast their council meetings, and they hired the company to build an interface for this.

So they're using movies off a streaming server; this is archived, but it was done live. And the track on the left here with the text tracks allows you to index around within the presentation. The agenda is shown here. There's information about the different parties involved. They've got interactive control of the movie in a straightforward way over here.

And there's a setup for commentary here. The users, upon clicking this bit of text here, I guess it says click here, launch into a form for sending an email to the presentation as it's going on, to discuss it and add their input. So this was a very interactive use of QuickTime to get some work done. Then we'll look at a couple of things.

So this is a couple of Flash tracks in a VR movie. We can pan around here and view this stuff. We've got Flash composited over the VR movie, with animation going and all, and we've got interactive buttons over here, which are in the Flash track, which send actions to the VR movie to change to a different view.

This is just sort of a simple art example, but you could get much more sophisticated than this. There's nothing going on here but sending actions from the Flash track to the VR movie. That could easily go the other way, where we could do things based on pan angle or hotspot clicks or whatever to make stuff happen back in the Flash track, to change what you're seeing there or give information about the view. This consists of primarily three tracks: the background, which is Flash; the VR movie layered on top of that; and then another Flash track on top, which is set to alpha.

If you want to just put together a movie with a QuickTime track on top of a Flash track, you can very easily do that in the Flash 4 authoring tool, importing a QuickTime movie and working with it there, and then using Flash's publish settings to publish this as QuickTime. That will make a Flash track and QuickTime tracks with whatever QuickTime you had imported.

If you want to get into this sort of thing where you're layering it up, then you need to assemble it somewhere else. You can do that in Live Stage Pro, which allows you to bring in the VR tracks, the Flash tracks, and so on, layer them up, set their graphics modes, and add things like the actions on these buttons, which change the node of the VR movie at some point. This is a simple demo.

So that's a simple sort of player. I think we'll go from here back to the slides, and I'll talk about another way that we can get more sophisticated with our players. The thing that happened there was we had to have the media being played in the player really be an integral part of the presentation. We had to author it all together and get the rates all to match and so forth.

In 4.1 we added movie tracks through the movie media handler, which allow you to make a new kind of track in the movie which uses for its media another movie. And these movies can be linked together without their time bases being slaved. In other words, we have separate clocks finally.

And that was something that in the past always meant going to separate authoring and runtime environments where you could have multiple movies. And the other thing about this is referenced media: it doesn't have to be built into the movie. It can be referenced in a number of ways. So we'll have a demo of that.

This movie was shown earlier in one of the QuickTime sessions as an example of putting something into iShell to get a non-rectangular mask. But it was just dropped in there as a movie. The movie itself has quite a lot going on with it. We've got the movie region, quite obviously, has got a movie track in it, which we can control.

Fast Forward, Stop, Play, Step Forward. I just downloaded these movies before the presentation, threw them in the folder here, and named them Content 1, Content 2. I can have up to eight movies here with the names set up with those sort of names, and they get controlled sequentially by this sprite here, which chooses what track you're getting. It's one movie track, but it's a number of different references.

This control determines which reference we're picking, and we'll get the same one over and over again for the rest of the clicks, because I only put two movies in. We also made this player something where you can change the interface, just for fun. So that's an example of the kind of player that you can build.

I think it would be pretty hard to build anywhere else. And you can deliver it in QuickTime. Obviously this one is non-rectangular. You would want to have them these days in a more integral sort of appearance to make it work better, but that's how that can work. So if we go back to the slides now.

The final thing I want to talk about is something that a lot of content developers have been worried about for some time, which was the ability to reliably detect whether QuickTime was installed on a machine in the browser before they gave the user the content. We've incrementally added a number of pieces.

With JavaScript, people can do a fair amount of testing. You've always been able to in some of the browsers, in the Netscape browsers and so forth.

[Transcript missing]

Hi there, I'm Sean. I'm going to be talking about two things today. The first thing is embedded movies, the new feature in QuickTime 4.1.

There's a lot of things you can do with embedded movies that aren't maybe obvious. QuickTime has always supported multiple spatial tracks, so a lot of people ask, "Don't you already do multiple movies at once?" But the time bases there are all synchronized to the same clock; that would be one major difference. Here are some of the advantages of embedded movies.

One thing is that it's sort of like frames in a web page. With frames, you can download just part of a web page; with embedded movies, you can bring in just part of a movie, a sub-movie, basically. That also gives you less redrawing: you're not redrawing the whole movie, so you can swap things in with less redrawing happening.

It also lets you factor out sort of reusable components. Like you can create movie controllers that have custom looks and reuse them with any movie that you want to. I mentioned multiple clocks, so you can have, say, Eric just showed an example with a movie being loaded and controlled within the same movie.

There are also some technical implementation things going on with the movie media handler that make it nice to actually go between two different movies in a row. So the movie track is what is used to implement this. The movie media handler is a new media handler in QuickTime 4.1.

Basically, each sample defines properties. Since movies are complex, with spatial, audio, and timing properties, there are properties in the movie media handler that you can set to deal with all these things. Another thing about the movie media handler is that it has a list of data references. So basically one sample in a movie track can have a whole list of movies, and you can just tell it, "Okay, I want to use this movie at this time." That's how you can swap things in. And it's actually dynamic as well.

So if we take a look at a sample movie here: the whole slide is one movie. It's got four tracks. The first track is a movie track with three samples, so it could reference three different movies. And then we've got a background picture, a sprite track, and a Flash track.

The yellow timeline shows what's actually alive or active. So we see that you have multiple movie samples, in a way that's sort of like a HyperCard page or something. At the current time, only the background picture and the first movie are alive. If we go to the next time, the first movie's gone, but the second movie sample activates, and in addition, now we can have sprites and QuickTime VR. Of course, these things can be layered and arranged on the page however you want in the movie.

Of course, at the third time, the sprites are gone and we have Flash. So if we kind of zoom in on the movie layout here, we see that the first movie sample is actually referencing a movie that has a video and an audio track. Now notice there are two timelines, because these don't have to be slaved together.

You can actually have them independent, so that sub-movie can have its own clock and can be controlled via wired actions if you want to start and stop it, change the rate, and so on. You can opt to actually slave the time bases if that's what you want to do.

So a movie sample, as I mentioned, has multiple data references. So here we have a movie sample that's pre-authored to have three different movies that it can reference just by changing a property. But it's also dynamic, so we can add another movie. We'll be showing some demos later of how this can be useful.

But at runtime, you can actually add or replace any data reference in its list, and then you can say, "Hey, set the current movie to that." Since all the movies are data references, they can even be pointing to the same media if you want; it doesn't have to be all loaded into memory.
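The data-reference mechanics described here are easy to picture as a tiny model: a movie sample holds a list of references plus a current index, and a wired action can add to the list or switch the index at runtime. This is a purely illustrative Python sketch; the class and method names are invented and are not the QuickTime API.

```python
class MovieSample:
    """Toy model: one movie-track sample with a list of data references."""

    def __init__(self, refs):
        self.refs = list(refs)      # pre-authored data references (URLs)
        self.current = 0            # index of the movie currently shown

    def add_ref(self, url):
        """Dynamic case: add a brand-new movie reference at runtime."""
        self.refs.append(url)
        return len(self.refs) - 1

    def set_current(self, index):
        """Wired-action case: 'set the current movie to that.'"""
        self.current = index

    @property
    def current_movie(self):
        return self.refs[self.current]

sample = MovieSample(["intro.mov", "menu.mov", "credits.mov"])
idx = sample.add_ref("http://example.com/new.mov")   # hypothetical URL
sample.set_current(idx)
print(sample.current_movie)   # -> http://example.com/new.mov
```

The point of the list is exactly what the talk says: nothing is loaded until a reference is made current, so a sample can be pre-authored with many possible movies cheaply.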

So the movie media handler has some settings that you might want to check out if you're building movies like this. Basically, you can opt to slave properties: the time, and the duration. The sample's already got a duration, and the movie's got a duration, so do you scale? It's like Add Scaled in QuickTime Player.

Audio properties can be slaved, like the volume or the balance. The graphics mode can be slaved, so it gets set to the parent movie's. And then there are some settings for spatial dimensions as well, and I'll be showing you some examples of that later. So now, if we go over to the demo machine.

So first of all, I'm just going to show you-- here's a simple movie that just has color patches. And these are the sizes of the movie tracks in this movie. And then we're going to take our mascot here. And we're going to basically see what happens to him when we reference him as a sub-movie using various scaling modes.

So I can close those down; I'll leave that one up. First of all, the scroll mode is in some ways what you'd expect the most. It basically does not scale the nested movie, but does clip it. So here we see that going on. Second mode.

This one actually just stretches the movie to fill the entire thing. So he's very full down here, and not so full over here. There's sample code out on the website right now that you can check out that lets you build embedded movies and actually play with all these different settings. So, meet: basically, we see that the movie gets stretched until it meets an edge. And slice is fairly similar, except it will stretch until it fills the whole region and gets cut off.
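The scaling modes demonstrated here behave much like familiar fit/cover policies. A rough sketch of the geometry, with invented function and mode names (the actual QuickTime constants and API differ):

```python
def scaled_rect(movie_w, movie_h, region_w, region_h, mode):
    """Return the (width, height) a nested movie is drawn at inside a region.

    Mode names follow the talk loosely: 'scroll' clips without scaling,
    'fill' stretches both axes, 'meet' scales uniformly until one edge
    meets the region, 'slice' scales uniformly until the region is
    covered (the overflow gets clipped).
    """
    if mode == "scroll":               # no scaling; region clips the movie
        return movie_w, movie_h
    if mode == "fill":                 # non-uniform stretch to fill
        return region_w, region_h
    sx, sy = region_w / movie_w, region_h / movie_h
    if mode == "meet":
        s = min(sx, sy)                # stop when first edge is reached
    elif mode == "slice":
        s = max(sx, sy)                # keep going until region is covered
    else:
        raise ValueError(f"unknown mode: {mode}")
    return movie_w * s, movie_h * s

print(scaled_rect(100, 50, 200, 200, "meet"))   # -> (200.0, 100.0)
print(scaled_rect(100, 50, 200, 200, "slice"))  # -> (400.0, 200.0)
```

This is why the mascot looks stretched in fill mode but keeps his proportions in meet and slice.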

[Transcript missing]

If we have a Flash movie and a separate video movie: people for a long time have wanted to have video and an animation together, and the problem was always, "Well, we don't have non-slaved time bases." Now that we do, we can combine these two into one movie, so both are being referenced. Notice that we have control over the outer movie, which is where the Flash movie is. The Flash movie can actually control the embedded movie. Of course, you can start and stop the outer movie's clock too.

So in Flash we support all the Flash actions right now. But to do this type of thing, we also support, on Flash buttons, all of the QuickTime wired actions. So this is using wired actions to actually tell the nested movie to play and pause. Okay, now, one last demo here.

So we've got--this is actually a sprite movie, and we can drag around the pieces to assemble the model. So I'm going to create Smubble here. You're probably wondering, what does this have to do with movies in movies, since this is a sprite track and actually, down here, a text track? It's a good question.

So what I've added to Smubble is a "take a picture of Smubble" button. What this is doing, in the wired movie, is asking each sprite for its position and passing those positions to, in this case, a HyperCard CGI that's running over here. And what we get back is not a sprite movie, actually. In fact, it's a movie with a lot of sub-movies in it; each sub-movie is a picture. And he's got a funny ear there, so let's take another picture. Maybe a side view here.

Step on his hat. Yeah, there we go. So anyway. This kind of thing is sort of interesting because the file that's coming back is a SMIL file, so it's really small. All the media is actually on the server. So you could imagine doing this kind of thing with iCards or something like that, where it's a very interactive client experience with...

[Transcript missing]

So, we briefly take a, I don't know if we briefly crash.

[Transcript missing]

Movie in movie: how do you create these things? You can actually use SMIL, like I just mentioned; what was being returned there from the server was, in fact, a SMIL file. You can author them in a program like LiveStage Pro, or build your own application; there's sample code on the website.

So there are definitely some advantages to using the movie media handler, and I've basically covered most of them. We're running low on time, so I'm going to come back later and talk about some new stuff. But first, Bryce Wolfson will be up here to talk about some new stuff.

Good morning. I'm here to talk to you about something you might have seen already. It showed up in our keynote, panoramic projection extension. So far, what we've had for use is traditional cylindrical panoramas, as we introduced several years ago. It provides a nice immersive photographic experience, but you don't get to look all the way up and all the way down.

It's hard to explain this, so I actually want to go ahead and go straight to demo machine one. For those of you who might not have made a 10:00 AM keynote on Monday but managed to come to a 9:00 AM session, I suppose there's some chance you haven't seen this.

So we have some movies that were taken out in Las Vegas. Well, I shouldn't tell you that. But here's the Eiffel Tower, by the way. It's Las Vegas. That's the Phil Schiller surprise. And I'd sure like to be able to look up there, but of course that seems to be a problem here.

Now we can. And we can even get up and see the moon, and get a little dizzy if we want. So that's really the fundamental introduction to cubes and what they mean: they take us where we haven't been before. Can we go back to slides, please? So what is this? It's a new rendering engine. It provides fully immersive views, so we can look all the way up and down. And in fact, as I'll show you a little bit later, we can do more than that.

And, very importantly in some cases, there aren't artificial bump stops at the top and the bottom; I'll show you that a little bit later too. It can be a little jarring when you're in an immersive experience to be looking up, and maybe there isn't anything interesting to see, but you run into an artificial limitation.

We use a cubic texture map, as I mentioned. I'll show you some of the details of that in a little bit as well. And we want to sort of make a point that this is an addition to cylindrical panoramas. There are several reasons that this isn't actually a replacement.

In many, many ways, it's just the same. It's exactly what you're used to as far as user experience and several other factors. But there are a few things that are different. So, just like cylindrical panoramas, we've worked really hard to maintain very similar playback speed and quality for comparable content.

Hotspots, just as you have seen them now. Multinodes, you can do links between nodes within the same movie. Use URLs in the browser, trigger JavaScript, anything like that, and wired hotspots as well, which allow you to trigger many of the interactive things that have been shown elsewhere in the session.

We have fast start previews, which gives you a better user experience on the net. About 10% of the file comes down and you have some version that you can interact with, typically. We provide tilt constraints, which I'll also show you in a little bit. Sometimes you don't have all of the content, for various reasons, usually relating to capture. And the file format is amazingly similar. As a result, many tools interact with cubic panoramas just the same way they do with cylindrical panoramas. If they don't do anything that's really specific to the cylindrical format, it works out really well. It just works.

A couple of exceptions, as I noted before, starting with the second one. First, actually, you can look straight up and down; that's sort of the underlying theme here. Also, there are real benefits in file size, especially in areas such as the automotive industry and architecture, and there are a few other examples as well.

People have been working really hard to use QuickTime VR to do high field of view, where you can look very far up and very far down. But you run into some real issues with file size. Cylinders don't like to go beyond about 100 degrees before every little bit more starts to cost you a lot. You wind up with a lot of detail up and a lot of detail down, where it's not all that interesting necessarily, and it's just buying you a few more degrees. So if I can go back to demo, show you a few of these things.

First, just to take a look at the format, this is just a cylinder exterior view, and we can take a look inside, and this is the cylindrical experience as many people have seen it already. Hotspots takes us right back outside. Similar effect with a cube. This is different content.

This is what's fundamentally going on: there are six surfaces involved. And when you hop inside, you get the all-the-way-up, all-the-way-down effect. And again, we have hotspots. It's a multi-node movie. You can mix and match cylinders and cubes in the same movie as it makes sense.

Fast start. We have a 100 megabit network up here between machines, so trying to scale down for bandwidth is a little interesting. So I have a tool here called Deliverator, from VR Tools. It's not specific to VR; it's actually a very useful tool for a variety of reasons. It knows how to understand how your file is laid out and deal with that, and set it up for optimal fast start. Many of the interactive media types, if not all of them at this point, don't really deal with streaming well, because streaming is inherently time-based and interactivity is not.

So fast start remains very important to us. Here I can say, pretend I'm on a dual ISDN connection and show me what this would look like coming down. And this is the grid. In this case it happens to look very much like a holodeck. And you can watch the tiles coming in.

This is what you would get if you don't put a preview in, if you happen to be looking at this over a network connection. In this case we have some interesting content up here, which doesn't all come in on the first tile, and if you don't have a fast start preview, it isn't all there immediately. There's an option, which I'll mention in a minute, to actually change where the faces are located.

Fast start previews help solve that problem. So in this case, I went ahead and put a fast start preview on it. This one is just for good visual effect; you might not choose to do this colorized edge detection. Oh, I'm sorry, this one's the low-res one, same thing.

So as the tiles come in, if we look here, we'll be able to see a little bit of a delineation between lower-resolution information here and higher-resolution down here. And this one shows the effect much more prominently; this was the edge detection one, colorized. So again, this provides a neat visual effect until tiles actually start coming in, but when they do, it's a little bit jarring.

And there's one more thing that we've worked on as well, which is backwards compatibility back to QuickTime 4.0. Not everybody updates immediately; that's just how it goes, as much as we might like otherwise. So rather than forcing people to have to go get a new QuickTime, you can actually open these movies up in a cylindrical viewer, and they behave like cylindrical movies.

In some cases, the content doesn't really show it. There aren't any real visual effects in many cases that are very obvious. Sometimes it's a little more obvious. Here, the top of this building, for example, this is supposed to be round. But the effect is much better than "Can't play this movie, go get a new QuickTime." Of course, you can make people do that if that's what you want. Go back to the slides, please.

So, the texture map: six faces of a cube, ordinary 90 degree by 90 degree perspective views that you can get out of any renderer. Many renderers today support cylinder output; this is even simpler, because these are just standard photographs. Smaller memory and file footprints. The smaller memory footprint is nice as well for high field of view, because with cylinders, when you go up high, you're eating memory as well. And as I noted, cylinders do still have their uses.

Primarily, they're backwards compatible much further back, they're proven, and there's a lot more tool support for things that do care about cylinders. And sometimes the ceiling isn't interesting, or a high field of view isn't needed, such as in outdoor scenes. So this is just a simple side-by-side illustration: an example where, again, maybe all the way up isn't interesting in an outdoor scene. This is a file size comparison, and what we see happen here is that cubes stay constant: they've got six faces for a given quality of source image, a given horizon size. This is 2K pixels around.

And what happens is we get a crossover at about 100 degrees field of view, and by 135 degrees, a cylinder is double the size. 135 isn't actually all that high. And at 160, which some of the automotive content uses, we're up to almost five times the file size of an equivalent cube. And that file size goes somewhere: you have detail, but it's all the way up at the top and all the way down at the bottom. And if you're zooming in on the carpet on the bottom of a car, that's not nearly as interesting, perhaps, as the dash.
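The crossover figures quoted here fall out of simple geometry: a cylinder 2048 pixels around has a height that grows with tan(FOV/2), while a cube is always six 512 by 512 faces. A quick back-of-the-envelope check in Python (the formula is standard cylindrical-projection geometry, not something stated in the session):

```python
import math

CIRC = 2048                      # pixels around the horizon ("2K around")

def cylinder_pixels(fov_deg):
    """Total pixels in a cylinder covering a vertical field of view."""
    radius = CIRC / (2 * math.pi)                       # in pixels
    height = 2 * radius * math.tan(math.radians(fov_deg / 2))
    return CIRC * height

def cube_pixels():
    """Six square faces; each 90-degree face spans a quarter turn."""
    face = CIRC // 4                                    # 512 pixels
    return 6 * face * face

for fov in (100, 135, 160):
    ratio = cylinder_pixels(fov) / cube_pixels()
    print(f"{fov:3d} deg FOV: cylinder is {ratio:.1f}x the cube")
# prints ratios of roughly 1.0, 2.0, and 4.8
```

Those ratios line up with the talk's numbers: break-even near 100 degrees, double at 135, and almost five times at 160.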

And what's going on here: this is a cross-section of a cylinder on the left and a cube on the right. You can see that on the cylinder, the first 125 degrees takes the same number of pixels as the next 25. Or, to look at it a different way, 25 degrees at the horizon uses far fewer pixels. And I just want to have one more demo here.

So here's a cubic panorama taken in LA's Union Station. And it's really nice. Good stuff to look at as we come around. Nice ceiling. But the floor here is pretty nice too, which is interesting. But, oh my, there's a tripod here. And that's just a fundamental inherent issue with doing photographic content is you have to be there. Trying to hide the photographer in an immersive environment is hard. Not too bad when you can take it in sections, but there's some fixed equipment that just stays there.

In this particular case, because the floor is interesting, some creative Photoshop work takes care of the problem and the floor gets reconstructed. Or perhaps take a picture just straight down and use a little bit of work to get that in. There are some capture techniques that are on the horizon perhaps that don't have these issues.

Not really addressing capture at this point because we're just looking at how to deal with some of these issues in runtime. Here's another one. This is a lobby where this was taken with a Panascan camera. It doesn't quite get all the way down, so we have a bit of a hole in the bottom.

And this maybe isn't interesting; it's the stairs. You could Photoshop in some carpet, whatever; some people put in a logo. Or perhaps there isn't anything really beneficial in coming down here. So we do have tilt limits, just like what you would expect with a cylindrical panorama, where we can limit the tilt down.

In fact, I cheated: the first cylinder I showed you, followed by the cube, was actually a cube with limits on top and bottom to make it look like a cylinder. In this case, there are some other things we can do as well, because cubes do have all the information all the way up. We can actually go a little farther, and the world gets upside down, which helps get rid of the stop up here.

And then, last, just to look briefly at the bump stops issue again. You saw this movie a little bit earlier with Eric. This is one where maybe there isn't anything interesting to see, but you just run into an arbitrary limit at the top, and again at the bottom. You're not really missing anything: the floor, maybe a little bit of the center console. So take a look here.

This is large and high resolution, so it takes a moment to load. This one still has a bottom stop, which maybe isn't such a big deal, but when you're looking up, you don't run into a barrier at all. Again, nothing interesting up there, and in this particular case there's not only no boundary at the top; in a car, again, it's just a center console, but there's nothing to stop you at the bottom either. Back to slides, please.

So, just real quick, since we're running a little bit long here. Making cubic nodes: the images, as I talked about, are 90 degree field of view measured through the centers, so the edge pixels are shared. The CD you got should have the file format information on it, which should cover much of this. Just like a cylinder, there's a pano sample, but there's some backwards compatibility information in there so that older versions of VR can read it. There's a new field where you say, "Hey, this is a cube," and new view atoms, which actually contain the data describing where things are.

And optionally where the individual faces are, which allows you to rearrange things if that's interesting, such as for where the primary content was in that one panorama. The standard face ordering, clockwise, is just like cylinders, which allows us to have that backwards compatibility, and it can easily be changed, for example, to a diamond face orientation. So, in summary, I'm going to go ahead and hand this back over to Sean, and he'll talk to you a little bit more.

Okay, we can finish the restart; we don't have to show that part. Let's go to the slides really quick. Okay, I've got 12 minutes of material and I'm going to put it into three; let's see if we can do that. Right now we have wired variables, and all they can do is basically hang off of a sprite track. So people have wanted more than that.

We basically want to do more in the future. So we've been working on some stuff that will allow us to have named variables attached to any sprite or movie, handle event parameters, have structures, and even local variables. So how are we going to do this? We're going to actually introduce an API, QTList. It's going to be a toolbox API, and it's going to be scriptable.

So we're going to be able to build these hierarchical lists and import and export XML, which means we're going to be able to deal with XML on the client side to some degree. These lists are going to live on movies and tracks, so they're always available from wired scripting there. And when you're in an event handler, you're going to be able to access event parameters such as mouse.x or something like that.

They're basically like XML nodes that are going to be living in the movie. So here the movie's calling movie_get_element, and we're going to have some string. You can actually just say my_color.red to go down the hierarchy of the XML; again, mouse.x or mouse.y. Anyway, QTLists will be useful for other things besides just variables and structures and so on. We're going to be able to use them to exchange data from a client to a server. Right now, when we say go to URL, we can pass a little bit of information, up to 512 characters, I guess.
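The dotted-path lookup sketched here (my_color.red, mouse.x) is easy to picture as walking an XML tree one named child at a time. QTList itself is described as a C toolbox API; this Python stand-in, with an invented get_element helper and made-up variable values, is purely illustrative:

```python
import xml.etree.ElementTree as ET

# Hypothetical movie-level variables, modeled as XML nodes.
doc = ET.fromstring(
    "<vars>"
    "  <my_color><red>255</red><green>64</green><blue>0</blue></my_color>"
    "  <mouse><x>120</x><y>45</y></mouse>"
    "</vars>"
)

def get_element(root, path):
    """Resolve a dotted path like 'my_color.red' against an XML tree."""
    node = root
    for name in path.split("."):
        node = node.find(name)
        if node is None:
            raise KeyError(path)
    return node.text

print(get_element(doc, "my_color.red"))  # -> 255
print(get_element(doc, "mouse.x"))       # -> 120
```

Each dot just descends one level in the hierarchy, which is why the same mechanism works for structures, event parameters, and imported XML alike.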

And what you get back is always a movie. So this is going to let you actually say, "Hey, here's some information; I want information back." If you've read about RPC or something like that, it's basically passing XML in an HTTP request and getting back XML. So we'll be able to support that kind of thing as well. I'm just going to go to the demo now.

So what we've done is set up a WebObjects server with media; in a database we have some movies, some images, some audio. We've defined basically two API routines, if you want to call them that. One is searchMedia: you can pass the type, and even a substring to search, and WebObjects will handle that. What it actually returns is XML that the client movie can take a look at and use. So we just launch Netscape here.

What I'm going to do is just paste in a URL which is going to be handled by WebObjects. So if I view source on what came back: I just did a searchMedia for media type video, and from the database it has returned all this information to the movie. The client movie can actually get to all of that through wired scripting. So we can find out that here's a URL, here's an ID, a duration for each element, and a title.
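From the description, each searchMedia result carries a URL, an ID, a duration, and a title. The exact XML shape isn't shown in the session, so the element and attribute names below are invented for illustration of what the client side might do with such a response:

```python
import xml.etree.ElementTree as ET

# A made-up searchMedia response; only the *kinds* of fields (id, url,
# duration, title) come from the talk, not this layout.
response = """\
<results>
  <media id="3" type="video" duration="12.5"
         url="http://example.com/media/3.mov" title="Flower"/>
  <media id="7" type="video" duration="40.0"
         url="http://example.com/media/7.mov" title="Union Station"/>
</results>"""

# Walk the results the way a wired script would walk a QTList.
for item in ET.fromstring(response).iter("media"):
    print(item.get("id"), item.get("title"), item.get("url"))
```

In the demo, the equivalent of that loop runs inside the movie via wired scripting, pulling the URL out to feed a nested movie.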

Let's try the other API routine we defined, buildSmil. What this is going to do is build a SMIL file, and we have a simple little encoding here. Here it's just a single element. So if I execute that, WebObjects returns a SMIL file, and it's just a single little video clip here; I just asked for media ID 3. So let's do one slightly more complex, where we have audio in parallel.

So here WebObjects is returning a little slideshow with several pictures and some audio, and it's got timing information. The only thing we passed was that little encoding. Originally the idea was that you can say, "Hey, what do you have?", do a search or something like that, and then construct a movie. It only had three images, so it's already gone. So what we did was put together this crazy client-side movie; it's not quite finished, but it's exercising some of these new features. The idea was that it would be a media browser, so you can browse for movies, audio, or images.
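A buildSmil-style routine is straightforward to imagine: timed images in a sequence, wrapped in a parallel group with the audio so they play together. A hedged sketch; the URLs and default duration are made up, and this is just one plausible shape for the SMIL that QuickTime can open as a movie, not the actual server code from the demo:

```python
def build_smil(image_urls, audio_url, seconds_per_image=3):
    """Emit a tiny SMIL document: a timed slideshow with parallel audio."""
    imgs = "\n".join(
        f'        <img src="{u}" dur="{seconds_per_image}s"/>'
        for u in image_urls
    )
    return f"""<smil>
  <body>
    <par>
      <seq>
{imgs}
      </seq>
      <audio src="{audio_url}"/>
    </par>
  </body>
</smil>"""

print(build_smil(
    ["http://example.com/1.jpg", "http://example.com/2.jpg"],
    "http://example.com/track.aif",
))
```

Because the file only holds references, it stays tiny; all the media remains on the server, which is exactly the property Sean points out.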

And so if we browse for images, say with the letter F, we find this flower. So that actually got returned through the list, and what it returned was the URL. Then we did a set: we have four nested movies here, and on the first nested movie we did an add URL, using what we got out of the XML that was returned. And now we can actually see that movie.

If we want to, we can search for maybe a different one. You'll also notice we're typing in text fields here, which is something you'll be able to do in the future, not now. And then basically I hooked up a "build it" button, which right now is just building a canned movie that should load in.

It's taking a long time, so I'll go back to slides. Oh, there it is. So basically, really briefly, the exchange list action that we will have lets you, in one HTTP request-response loop, ask for information: you actually send XML information and get XML information back. So I think I should stop now. Thanks.

Well, it looks like this session was a little bit busy, but I just wanted to make some closing remarks. The interactive stuff is tightly integrated as part of QuickTime, so you always have access to it, and I strongly encourage you to try some of these things out. And we're going to have a feedback session this afternoon; I strongly encourage you to come to that session, as we'd like to hear some of your feedback. Thanks again, and enjoy the rest of the conference.