WWDC07 • Session 410

Introduction to FxPlug Development for Final Cut Studio

Graphics and Imaging • 50:41

The FxPlug SDK lets you create image-processing plug-ins for Final Cut Studio, leveraging OpenGL, Cocoa, and Objective-C. Begin an in-depth exploration of the FxPlug SDK. Learn to rapidly create FxPlug plug-ins such as effects, generators, and transitions using the built-in Xcode templates. Get an understanding of how best to target both Motion and Final Cut Pro with a single plug-in. Watch the demos to get a better understanding of the real-world FxPlug plug-ins currently available.

Speakers: Dave Howell, Vijay Sundaram

Unlisted on Apple Developer site

Transcript

This transcript has potential transcription errors. We are working on an improved version.

I'm Dave Howell. I work on FxPlug on the Pro Apps team at Apple. Since we introduced FxPlug at NAB, the National Association of Broadcasters conference, two years ago, a lot of people have started writing FxPlugs, and I'm really glad to see the number that have been coming out and the quality of them, from camera manufacturers to more traditional filter and transition developers. At this session we're going to talk about what the FxPlug SDK is, how you can write an FxPlug, and some new developments in FxPlug, both inside Final Cut Pro and in the FxPlug SDK itself with the new revision, 1.2.1.

Now, FxPlug is a new plug-in architecture that we created so that we could add new features, rather than just hosting plug-ins written with other architectures. At first the host application that supported FxPlug was Motion 2.0; then last year we introduced FxPlug support in Final Cut Pro. The FxPlugs that you write are based on Objective-C. You can use C in your rendering functions, and a lot of people do. You can use OpenGL and other Apple frameworks.

One really attractive thing about writing FxPlugs is the large user base: there are 800,000 registered users of Motion and Final Cut, and they can all use FxPlugs. Another thing is that the effects you write as FxPlugs can have really great performance, and that's due to our use of OpenGL.

It's also due to universal binaries. And you can have custom UI, both in the inspector window in the apps, where you can make a control that looks however you want it to, and with on-screen controls, where you draw controls directly onto the canvas; those run in Motion and are based on OpenGL.

In the last version, last year, we had RGBA and introduced YUV-plus-alpha pixels, and floating-point pixels have been in there from the start, which is a great way to get really good quality. Finally, in this latest release, Motion 3, we've added 3D to the application, so we've added a little bit of 3D interface to the FxPlug SDK so you can access the camera and layer transforms.

So the first thing I want to talk about is how you write an FxPlug plug-in. There are five basic steps, and I know it's going to sound ridiculously simple, but of course there's a lot more to these than these three words each. You instantiate a template in Xcode to make an FxPlug filter, transition, or generator.

You create unique IDs so that we can tell one plug-in from another, and then, of course, you customize the source code, which is the real work. You build and install, and you test. We'll go through each one of these steps in more depth now. When you select an Xcode template, you get the list, including the three that we've added: FxPlug filter, generator, and transition. In this example we're selecting the filter.

You give it a name, and what you'll see is a very simple project that has two source files and some resources. It's just a .m and a .h; this is for the filter. For the generator you'll see .mm files, since we've used Objective-C++, but it will look very similar to this. The Info.plist in there is interesting, too; we'll show you a little bit about that.

Now, diving into the Info.plist, the first thing you'll want to do is create the unique IDs. You'll find comments in the Info.plist, but you can also just look for a UUID, and you'll find three of them. One of them is the ID for the group that your plug-in belongs to: the category name like Blurs, or your company name, or wherever you want it to go. You'll generate a UUID, which you can do in Terminal with uuidgen, but I'll also show you a way to automate that, which makes it much easier so you can use a keystroke.

That group UUID appears twice: there's the place where you define the group, and later, where you define your plug-in, you state which group it belongs to, so you'll copy that UUID and put it in the plug-in descriptor. Then there's a UUID for the plug-in itself, and you give that a separate unique ID.

So here's how you can create a script that goes into Xcode so you can have a keystroke. In this case you can see it says @$U, which means Command-Shift-U. You select one of those UUIDs and press this keystroke, if you have this installed, and it will replace the UUID from the template with yours. Don't bother to write it down; it's all in the SDK, in the documentation.

So, again, you'll see the comments in there; just look for those and make the change in the UUID. Then, customizing the source code: there are two main things. There's a parameter list that the samples define, and you'll change that to be your own parameters, and you'll change the render method. Those are the very basics, and there's a lot more that you can do in a plug-in, but that's what's going to create your first effect. As for render methods, you can write one or two: you can do CPU rendering or GPU rendering, or both.

Then, when you build your project, you'll want to install the product into Library/Plug-Ins/FxPlug, in either the root domain or your user domain. You also want to be able to debug, and so there are a few different things that we do. One of them is to change the build location to write directly into the target folder by making a script command.

What I usually do is just go into Terminal like this and make a symbolic link that points to the build products directory, and then test in Motion and Final Cut. In the case where we built this MyFilter example, it'll show up in the browsers of either app, as shown there.
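The symbolic-link trick might look like this; the project path and plug-in name are made up for illustration:

```shell
# Link the installed plug-in location to Xcode's build products
# directory, so every build is immediately visible to the host apps.
mkdir -p "$HOME/Library/Plug-Ins/FxPlug"
ln -sfn "$HOME/Projects/MyFilter/build/Release/MyFilter.fxplug" \
        "$HOME/Library/Plug-Ins/FxPlug/MyFilter.fxplug"
```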

Now I'd like to introduce Vijay Sundaram to talk a little bit about how FxPlugs work in Final Cut and what's new in that space. Vijay?

  • Thank you, Dave. Hello, everyone.
  • ( Applause ) Is that for me or for Dave? Thank you, and welcome to the Introduction to FxPlug session. We introduced support for FxPlug within Final Cut Pro at WWDC 2006, and since then we have continued to make a number of enhancements to the application as we continue to support new APIs in the FxPlug API model.

As Dave mentioned, a number of developers have now started using this, so I thought we'd first go through some of the salient features that we've introduced inside of Final Cut Pro for the 1.2.1 version of the API. First off is hidden, disabled, and not-animatable parameter support. This will allow you to create new and different controls for your plug-ins. Support for parameter groups basically allows you to create your plug-ins with controls that can be grouped together in logical units, giving a seamless interface to your customers.

The FxParameterSettingAPI allows you to create presets for your plug-ins. So, for example, if you have a pop-up menu for one of your controls and the user makes a selection, you can prepopulate some of the other controls, enable or disable them, and provide default values to them. Parameter sampling at arbitrary times allows the plug-in to get keyframed parameter values at arbitrary times, so you can support keyframing inside your plug-in.

Because Final Cut Studio comes as a box, and Final Cut Pro and Motion are in the same box, there is a very high probability that your plug-in will be used in both applications. There are some caveats to remember when you target your plug-in for each of these applications, and I thought I'd take a little time and go over some of the things to remember, especially when you're targeting Final Cut Pro.

On-screen controls. On-screen controls are a very powerful technology inside of Motion, but unfortunately they're not supported in Final Cut Pro. So if you're making your plug-ins with a number of on-screen controls, please remember to provide equivalent controls and parameters inside of Final Cut Pro, so that when your user switches between these two applications, they will still be able to use your plug-in and get the full benefit from it.

Software rendering is preferred for Final Cut Pro, and there are a couple of reasons here. Motion prefers hardware rendering, and Final Cut Pro prefers software rendering. The reason is that Final Cut Pro users tend to move between tower machines and MacBook Pros, so if you are targeting the GPU and relying on certain features in the GPU, when they move their projects from those high-end machines down to the laptops, they may not have equivalent functionality in the GPU.

Also, because Final Cut Pro will use the resulting images for other effects, or if an external monitor is connected, we will read the pixels back from the GPU, which can affect performance and not give the same fluidity that the user will be expecting from your plug-in. So, if possible, it's better for you to create two paths: software and hardware.

Native YUV support is important inside of Final Cut Pro because Final Cut Pro is basically a video editing engine. R408, which is our 8-bit YUV pixel format, is the native format for preview: when the user hits playback, that's the format that we use. R4FL is a 32-bit float YUV format. It's preferable to support these two formats for two reasons. One is performance: you don't want to convert between the different color spaces. Two is to prevent clamping when you go between these color spaces, keeping fidelity and quality and not losing your superwhites and superblacks.

There are two modes in Final Cut Pro: Safe RT and Unlimited RT. When I get to my demo I'll show you what this means so you can see it in action. Safe RT guarantees that we will always play back without dropping frames. This is easy for us to do for internal plug-ins, because we profile those plug-ins and can then show the user whether a segment is playable in real time by giving them either a red or green bar over that segment. But this is not possible for third-party effects. So how can you take advantage of something similar? The user can switch from Safe RT to Unlimited RT.

When they do this, they get the same functionality as an internal plug-in, except that we do not guarantee that there will be no dropped frames. But this is a very useful mode for the user to work in with your plug-in, because they can change the parameters, get instant feedback, and even play it back and get a preview. So, to support playback in real time, the first two points would be something to consider, so that you can give them the same functionality when they switch from Safe RT to Unlimited RT.

Finally, pay attention to pixel aspect ratio. There's a bigger thing to remember here: one of the interesting technologies inside of Final Cut is what we call Dynamic RT. The concept of Dynamic RT basically says that we will try to keep playback constant at the best quality possible.

Depending on the runtime conditions, we will scale back the quality of the image. The way we do this is we basically send a hint down to the codec and ask it to go from high quality down to, say, medium quality, and if resources become available, go back up to high quality. The implementation detail is left up to the codec, so the codec can give you back a scaled image or a single field. So you should look at the parameters that get passed down to you, in renderInfo.scale and the image dimensions, to make sure that you are not always expecting a certain behavior, but that your plug-in adapts to the values that get passed down to you and puts out an output frame that is a full frame.

Another part of this is that we can send you an interlaced single field, unlike Motion, which will actually line-double and give you a full frame. This is a thing to remember when you are targeting these two applications. And then pixel aspect ratio becomes important because, if you're being passed a single field, you will get a pixel aspect ratio of, for example in the case of DV, 0.4, whereas if you are being passed a full frame, you will get a pixel aspect ratio of 0.8.

It's something to consider when you are targeting Final Cut Pro. And finally, as Dave mentioned: test, test, test. Test in both applications to make sure that your plug-in works correctly in both and that the resulting output is correct. I'd like to switch over to my demo. If I could switch to the demo machine, please.

Sorry, it gets dry out here. There are a couple of things I want to show you in the demo. First of all, the sample plug-ins that come with the FxPlug SDK. The first plug-in is what we call SimpleMatte, and it basically shows you how you can create custom UI parameters for your controls.

So, if you follow Dave's steps and do the right thing, you will see that your plug-in shows up under certain groups. There are basically three types of plug-ins that you can create: filters, transitions, and generators. All three are supported inside of Final Cut Pro, and your company name will probably be the group; in this case, it's Examples.

So in this particular case, you apply the SimpleMatte plug-in to the clip. You see there's an NSView that's been created here, and the user can paint around it and create a matte effect in your plug-in. So it's a good example for understanding the best way to create custom UI controls.

One point, quickly, before I go to my next plug-in: this is what I was talking about with respect to Safe RT and Unlimited RT. As you can see over here, if I switch this from Unlimited RT down to Safe RT, that particular filter segment went red, which means it's not real-time, because we didn't profile it, and this is what's going to happen to your third-party plug-in.

If the user switches from Safe RT to Unlimited RT, it will go orange, and they can actually play it back and get equivalent performance, as though it were in Safe RT, although we don't guarantee that you won't drop frames in playback. As you can see, below that is Dynamic RT with its three modes. For your testing, you can actually force the mode to high, medium, or low, and make sure that all three of these modes are supported correctly, so that when the user is in dynamic mode and it switches during runtime, the right things will take place with respect to your plug-in.

Now, since last year, when we introduced FxPlug inside of Final Cut Pro, we have seen a number of users actually move to the Mac platform, and even people who've written other filters who are interested in targeting Final Cut Pro using the FxPlug API.

To learn the FxPlug APIs, you can take advantage of a lot of interesting technologies within the Mac operating system itself. There's Quartz Composer, Core Image, Image I/O: a number of very powerful technologies that you can use as your image-processing engine, so that you can start playing with and learning the FxPlug APIs without necessarily having to port any of your previous plug-ins from another platform. The example over here is a simple Core Image filter. I will switch this one out and apply this filter.

It exposes one single parameter, which is exposure. Core Image actually provides a number of interesting filters for you, and all I've done over here is pick one of them, the exposure filter, and I'm asking Core Image to do the rendering for me. I'm exposing the parameter up to Final Cut, in this case through the exposure parameter, and as you can see, when I scrub, Core Image is actually doing the image processing for me.

So you can use technologies inside of the Mac operating system to, one, quickly learn the FxPlug APIs, and two, probably even use those technologies as part of your plug-in. To demonstrate my point, we've actually had the good fortune of getting Noise Industries to provide us with some of their plug-ins, and I'd like to demonstrate a Noise Industries filter.

So, if you go up to Effects and Video Filters, you can see that there are a number of filters that they provide which can be used inside of Final Cut Pro. Let me apply Sobel Edge. As you can see, it applies the edge-detection filter, and it works great.

What's interesting about this: let's say your users typically apply the Sobel Edge filter and then want to do a two-tone color effect. Typically they would apply this first filter and then apply the second filter. So how can we change this workflow? How can you make it a little bit different? This is the problem that Noise Industries has solved, using technologies inside of the Mac operating system and exposing them through FxPlug. So I quit Final Cut Pro here and bring up the FxFactory application. As you can see, this is the FxFactory Pro Pack, and you can open the Pro Pack and take a look at all the plug-ins.

As you can see, all the groups are over here, and under Stylize you see Sobel Edge, and you can see all the parameters that it exposes inside of Final Cut Pro. We were talking about how we can add a two-tone color effect to this without, you know, writing another piece of code. This particular plug-in actually uses Quartz Compositions, and if you hit the Edit button, it brings up Quartz Composer.

Quartz Composer is a technology inside of the Mac operating system, and it's available in every single version of the operating system. You can use Quartz Composer to create node-based compositions, where you basically create patches, wire them up, and create interesting images.

So in this particular case, if you want to take a look at what the Sobel Edge filter looks like, you can see that there's an input. This input would typically come from Final Cut Pro or Motion. It goes through a clamp, has the edge detection applied, and gets cropped, and the output is sent out, comes back to Final Cut Pro or Motion, and gets rendered. Our initial objective was to add a two-tone color effect, so how do we do this? The same Core Image that I was talking about in the earlier demo actually provides you with a patch called False Color.

Very simply put, we drag it across, and as you can see it's got two parameters, Color 1 and Color 2, which affect the colors of the image. You can rewire this by breaking the connection over here (oh, actually not here, but over here, to the image) and then adding this connection back down here, and as you can see it actually applied a two-tone color effect.

Now, that's interesting, because you've been able to quickly add a new color effect. But how do you expose the parameters Color 1 and Color 2, so that inside of Motion and Final Cut Pro you can do the same thing? Quartz Composer actually allows you to publish your inputs and outputs. So in this particular case I will go ahead and publish the input Color 1, call it Color 1, and publish the input Color 2 and call it Color 2, and that's about it. Now I save this and apply the changes over here, and as you can see, FxFactory has updated and shows you the two parameters, Color 1 and Color 2.

Great. Now we quit this, save the filter, bring up Final Cut Pro (takes a minute), and we look for the filter, under Stylize, Sobel Edge, and there you go: you've now been able to add two new parameters and a filter to your plug-in without any programming. So this is a very powerful technology, and you should be able to take advantage of things like this when you create your plug-ins and give your users new and different ways to create effects for themselves. That concludes my demo. I shall return it back to Dave, and I hope you guys have a great session. Thank you.

( Applause )

  • Thanks, Vijay. Now let's actually dive into the FxPlug SDK a little bit and talk about the APIs. An FxPlug is a form of ProPlug. If you went to the Aperture Export SDK session earlier -
  • I think it was yesterday, you learned about another form of ProPlug, the Aperture exporter ProPlug, and ProPlug in turn is a flavor of the NSBundle architecture from Apple.

NSBundle is a pretty generic way of wrapping up a plug-in; ProPlug is a little bit more specific, and FxPlug is an instance of it. The way that ProPlug works is that plug-in classes that you implement conform to protocols defined by the SDK. So, for example, FxFilter is the protocol that you would conform to.

You'd implement all the methods in that protocol in order to implement a plug-in, and the host application, Motion or Final Cut, provides objects that conform to host API protocols. So between the two, the app and the plug-in are able to communicate.

Within the plug-in manager framework, which is where ProPlugs are hosted, we use Objective-C protocols. Most of you who've gone to sessions here know about those, but some of you who are plug-in developers, and not necessarily Mac programmers from way back, would just need to know that they're similar to C++ mix-in inheritance or Java interfaces. One thing a little more arcane that we make use of is that protocols are able to inherit from parent protocols. For example, if you have a parent protocol that defines method one, and a child protocol that inherits from the parent but defines method two, you end up with the child protocol implicitly defining both methods. So, if you are going to conform to the child protocol, you need to implement both of those methods.
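The parent/child protocol relationship can be sketched in plain Objective-C; these protocol and method names are invented for illustration, not taken from the SDK:

```objc
#import <Foundation/Foundation.h>

// A parent protocol declaring one method.
@protocol ParentProtocol
- (void)methodOne;
@end

// The child inherits from the parent, so it implicitly declares
// both methodOne and methodTwo.
@protocol ChildProtocol <ParentProtocol>
- (void)methodTwo;
@end

// A class conforming to ChildProtocol must implement both methods,
// or the compiler will warn about the missing one.
@interface MyEffect : NSObject <ChildProtocol>
@end

@implementation MyEffect
- (void)methodOne {}
- (void)methodTwo {}
@end
```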

Now, when you define your parameters in an FxPlug, you do that by implementing one method that is part of the FxBaseEffect protocol, which is the parent protocol for the filter, generator, and transition. One of the methods in that protocol is addParameters, and in your addParameters method you define each of your parameters one by one, which adds them to the list, top to bottom. You use methods in a host API called the FxParameterCreationAPI to tell the application to add each parameter to your plug-in.

In this case we add a float slider and a point parameter. We'll talk now about each of the different parameter types. There's a floating-point slider; an integer slider, which is the same but truncated; a checkbox or toggle button; an angle slider, which can go beyond 360 degrees and goes counterclockwise; an RGB color; and an ARGB color.
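A sketch of what such an addParameters method might look like. The selector spellings and the _apiManager ivar are approximations of the FxPlug 1.x headers, so treat them as assumptions and check the SDK for exact signatures:

```objc
// Sketch only: selector spellings approximate the FxPlug 1.x
// FxParameterCreationAPI and may not match the headers exactly.
- (BOOL)addParameters
{
    id<FxParameterCreationAPI> parmsApi =
        [_apiManager apiForProtocol:@protocol(FxParameterCreationAPI)];
    if (parmsApi == nil)
        return NO;

    // A float slider, parameter ID 1, default 0.5, range 0..1.
    [parmsApi addFloatSliderWithName:@"Amount"
                              parmId:1
                        defaultValue:0.5
                        parameterMin:0.0
                        parameterMax:1.0
                           sliderMin:0.0
                           sliderMax:1.0
                               delta:0.01
                           parmFlags:kFxParameterFlag_DEFAULT];

    // A 2D point parameter, parameter ID 2, defaulting to the center.
    [parmsApi addPointParameterWithName:@"Center"
                                 parmId:2
                               defaultX:0.5
                               defaultY:0.5
                              parmFlags:kFxParameterFlag_DEFAULT];
    return YES;
}
```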

I'll just note that the alpha-RGB color is not currently supported in Final Cut, but if you need the alpha channel in a color control, you can implement that by adding a slider for the alpha in your parameter list when you sense that the host is Final Cut (we have an API for telling you what the host is), or you could use a custom control if you want. 2D points are interesting because they're sort of a composite parameter type: they have the X and Y parts. There are a couple of others that are composite, including the RGB and RGBA colors.

The 2D point is also interesting because it implicitly defines an on-screen control, which I'll show you in a moment. There's a pop-up menu, where you define strings for each menu item; an image well, into which a user can drag an image or other piece of media; and a custom control. Custom controls I'll talk about in more detail, but a custom control is basically any NSView that you assign to it. Then there's a group parameter, which contains other parameters, and a couple of more esoteric ones, the histogram and the gradient.

When you get values from a parameter, you use the FxParameterRetrievalAPI, another host API protocol, and you can see there are methods with names like getFloatValue and getIntValue and so on. Similarly, there's an FxParameterSettingAPI for setting the values of parameters. You'll use this one less often, but you might use it, for example, if one parameter's state affects the values of other parameters.

You might have a checkbox that says to limit the values of some slider to the range zero to one hundred. When you're notified that it's been checked, through a parameter-changed method, you can then clamp the slider's value to that range.

Now, when you make custom parameter UI, you define an NSView. You can assign it to a custom parameter that has some type you define, or you can assign it to any of the standard parameters. So if you don't like our interface for a point parameter, or want to do something more interesting with it, you can make it look the way you want.

The custom view you define will be placed in the inspector, and it is resizable, so you should make it respond correctly when resized. In theory it's resizable, but that doesn't mean the host applications will actually let you resize it. You can use any subclass of NSView, and you can make it inside Interface Builder or create it programmatically in your code.

In order to assign an NSView to a parameter, custom or standard, you use the normal method for creating the parameter, but you set one flag, called kFxParameterFlag_CUSTOM_UI. When that flag is set, the host application will call a method that you implement as part of the FxCustomParameterViewHost protocol. That method is createViewForParm:, and in it you simply create an NSView and return it; you can retrieve it from your nib, or you can create it right there, as you see in the example.
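A minimal sketch of that method; MyControlView and kMyCustomParmId are hypothetical names, and the selector spelling should be checked against the SDK headers:

```objc
// Return the NSView the host should embed in the inspector for the
// parameter with the custom-UI flag set. Sketch only.
- (NSView *)createViewForParm:(UInt32)parmId
{
    if (parmId == kMyCustomParmId)   // hypothetical parameter ID constant
    {
        // Built programmatically here; loading from a nib works too.
        MyControlView *view =
            [[MyControlView alloc] initWithFrame:NSMakeRect(0, 0, 200, 100)];
        return [view autorelease];   // the FxPlug 1.x era predates ARC
    }
    return nil;
}
```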

When you get an event, you get it the same way any other NSView would get it. You override the methods in NSView and NSResponder, and you get mouse button events, scroll wheel events, tablet angle and eraser events, or whatever it is that you want to handle.

You can have contextual menus as well, so that when somebody right-clicks on part of your view, you can bring up a pop-up menu. The controller for your view can also change parameter values, as I mentioned, and you can change the state of other parameters, so you can hide other parameters. Your checkbox that affects other parameters might be one that hides a whole group, and so, in the method that tells you that a parameter value has changed (in this case the checkbox's value), you can hide or show other parameters. When you change a value or state, or access parameters in general, you need to use the FxCustomParameterActionAPI.

The action API gives you two methods that let you enter and exit the mode you need to be in to access parameters: startAction and endAction. Very simple. Also, because here you're not called by one of our methods but by the OS, you need to find out what the current time is, because whenever you access parameters you say, give me the parameter value at this time. So you call an action API method called currentTime to get that.
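Putting start action, end action, and current time together, a custom view's event handler might look roughly like this. Selector spellings, kMyParmId, and the _apiManager ivar are assumptions; check the SDK headers:

```objc
// Sketch: set a parameter from a mouse event in a custom view.
- (void)mouseDown:(NSEvent *)event
{
    id<FxCustomParameterActionAPI> actionApi =
        [_apiManager apiForProtocol:@protocol(FxCustomParameterActionAPI)];
    id<FxParameterSettingAPI> setApi =
        [_apiManager apiForProtocol:@protocol(FxParameterSettingAPI)];

    [actionApi startAction:self];            // enter parameter-access mode

    // We were called by the OS, not the host, so ask for the time.
    double when = [actionApi currentTime];
    [setApi setFloatValue:1.0 toParm:kMyParmId atTime:when];

    [actionApi endAction:self];              // leave parameter-access mode
}
```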

Now, in addition to custom parameter UI, there are also custom parameter types, and a custom parameter can be of any type that conforms to NSCoding. I'll talk about that in a moment, but when you add a custom parameter to your list of parameters, you use the addCustomParameterWithName: method and give it a default value, which is an instance of the class that you're using for your values, and you also set the not-animatable flag in addition to the custom-UI flag.

The not-animatable flag is needed because our apps don't support keyframing, or animating, the values of custom parameters. Some plug-ins have simulated this by making a custom view with its own timeline, but it's a bit of work. In general, we don't explicitly support animatable custom parameters.

Another thing is that the class that a custom parameter's values belong to, as I said, has to conform to NSCoding. In previous versions of our host applications you had to use keyed coding, which was a little more complex (not much), but you needed to do that or else your custom parameters wouldn't actually work. That's changed now. You can use any class that implements NSCoding: an NSArray conforms to NSCoding as long as all the members of the array conform to NSCoding, and NSString conforms as well.

In the example plug-ins that we have, SimpleMatte and SimplePaint, we've defined a class that uses keyed coding and introduces a whole new class just for an array of points. As an exercise for the reader, you could throw away that whole class and just use NSArray, but you would still need to use the more complicated method if you wanted to support older host applications.
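A value class conforming to NSCoding with keyed coding might be sketched like this; PointList is an invented name, not the class from the SDK samples:

```objc
#import <Foundation/Foundation.h>

// A minimal keyed-coding value class for a custom parameter.
// Keyed coding also works in the older host applications.
@interface PointList : NSObject <NSCoding>
{
    NSArray *_points;   // NSValue-wrapped points
}
@end

@implementation PointList
- (id)initWithCoder:(NSCoder *)coder
{
    self = [super init];
    if (self != nil)
        _points = [[coder decodeObjectForKey:@"points"] retain];
    return self;
}

- (void)encodeWithCoder:(NSCoder *)coder
{
    [coder encodeObject:_points forKey:@"points"];
}

- (void)dealloc
{
    [_points release];
    [super dealloc];
}
@end
```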

Now, when you render, you render into an FxImage. The images that you get through an image well are also FxImages, and those that you get as an input to either an FxFilter or an FxTransition are FxImages. That's the base class; the two child classes we have are FxTexture and FxBitmap. FxTexture corresponds to an OpenGL p-buffer, and FxBitmap is a RAM-based bitmap image.

The pixel formats that we support are alpha-with-RGB, and there are three forms of that: the 8-bit integer ARGB; the 16-bit float ARGB, which is only used on the GPU (so if you're writing a software-only render method, you don't need to worry about that format; just support the 8-bit and the 32-bit); and the 32-bit float ARGB. You may also do YUV.

The byte ordering is defined by a method in the FxImage class, so you can find out what the ordering of the components is. YUV, or YCbCr, images come in two formats. One of them is the R408 8-bit form, which is defined on our website; the link is in the FxPlug documentation, but it's in one of the old Ice Floe letters, number 19, which defines R408 as well as a bunch of other interesting things. The other is the 32-bit float version of that, which is the same thing but with float values instead of 8-bit, and the big thing there is that the floating-point values can go out of range, so that you don't have clamping. It's optional to support the YUV formats; you'll get better performance in Final Cut in some cases if you do support them. An interesting thing about them is that they use premultiplied alpha in the YUV formats, as well as in the RGB formats.

Now, when you're using textures, or pbuffers, as inputs and outputs, you just ask for the coordinate system from the texture itself. Our FxTextures are a very thin wrapper around OpenGL textures: you can do the normal OpenGL operations with them, you bind and enable them, and you can get their texture ID. Again, they're premultiplied, just as our bitmaps are.

We also support retiming of parameters as well as images. With parameters it's easy: whenever you get or set the value of a parameter, you pass a time, so retiming is explicitly supported. We added a new protocol for getting images at a specified time, the FxTemporalImageAPI, and in the more recent version we've added the FxTemporalTransitionImageAPI, which lets you get transition inputs at some other time.

We also added, in FxPlug 1.2, methods for getting information about the timing of your effect, about the clip it's applied to, and about the timeline that it's on; that's the FxTimingAPI. We'll talk more about that in the advanced session this afternoon, but it's a way of finding out the start time and duration of these different entities, as well as the frame rate, in and out points, and so on.

When you use the retiming protocols to get images at some time, you can get them in four forms: filtered or unfiltered, and as textures or bitmaps. So we have four very similar methods for getting images, one for each combination, and the temporal API for getting transition images is similar.

Now, our on-screen controls are implemented as another plug-in. They're a way of drawing custom controls in the canvas. On-screen controls, like our custom parameters, aren't keyframed, and they're not even associated with parameters. An on-screen control is just a control from whose methods you can set parameter values inside your plug-in.

So, in general, it's a way of drawing custom controls in the canvas. A simple instance of an on-screen control is the built-in point parameter, and there are slightly more complex examples you can see; one of them is in the kaleidoscope filter, where there's an angle control, an arc control.

On-screen controls are Motion-only, and as Vijay pointed out, you should provide alternate methods for controlling the parameters that are driven by the on-screen controls, so that when Final Cut hosts your plug-in you'll still be able to control those parameters. You draw your on-screen controls using OpenGL, and they're composited directly over the window in the canvas. You can use different drawing coordinates depending on what you're drawing; one or another might make more sense. There are object, window, and document coordinate systems. You should use anti-aliasing.

You get the usual mouse and keyboard events, but not through NSEvents or NSResponder. We have our own methods for letting you know that an event has happened in an on-screen control, because we sense the region that was clicked on, the part of your control, using OpenGL, and you can set parameter values based on the events that you see.

So, in order for us to know what parts of your controls have been clicked on, you need to draw your controls a second time using the OpenGL select mode. Using that mode, you define a number, an index, for the part that you're about to draw. You draw that part without anti-aliasing and without any textures, but other than that just as you draw it normally; it's probably the same method. You give each part a different ID, and when we see that a click happens, OpenGL tells us which part was clicked on. So this is very fast.
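The select-mode pass described above ends with OpenGL handing back a buffer of hit records, one per part that was under the click. Walking that buffer is plain arithmetic, so here is a sketch in C, using a simulated buffer so no GL context is needed. The record layout (name count, minimum depth, maximum depth, then the names) is standard OpenGL selection behavior; the function itself is our own illustration, not SDK code.

```c
/* Parse an OpenGL selection buffer and return the name (part ID) of
 * the frontmost hit, or -1 if nothing was hit. Each hit record is:
 * number of names on the name stack, min depth, max depth, then that
 * many names. 'hits' is the count glRenderMode(GL_RENDER) would have
 * returned after select-mode drawing. */
static long frontmost_hit(const unsigned int *buf, int hits)
{
    long best = -1;
    unsigned int bestZ = 0xFFFFFFFFu;
    for (int i = 0; i < hits; i++) {
        unsigned int count = *buf++;  /* names in this record        */
        unsigned int zmin  = *buf++;  /* nearest depth for this part */
        buf++;                        /* skip the max depth          */
        if (count > 0 && zmin < bestZ) {
            bestZ = zmin;
            best  = buf[count - 1];   /* innermost name = part ID    */
        }
        buf += count;                 /* advance past the names      */
    }
    return best;
}
```

Picking the record with the smallest minimum depth is what makes the nearest part of the control win when several overlap under the cursor.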

In FxPlug 1.2.1 we've introduced a lot of new concepts, and I'm going to group the additions in 1.2 together with 1.2.1, since they've both been added since the last WWDC. They correspond with the Motion 3 and Final Cut Pro 6 applications and with a future version of an application, so some of these things are not yet defined or supported, but I'll let you know which ones those are. When the FxPlug 1.2.1 SDK is downloadable, which will be soon, all of this will be clear.

The FxVersioningAPI is new. This is really useful for a plug-in that has already shipped a 1.0 version and now wants to release a 2.0. In some cases these changes are easy: if you just add a new parameter, the host applications will pretty much do the right thing.

That is, if your user creates a project using your 1.0 version, saves it, and then opens it up again in version 2.0, we need to know what to do. If you've just added a new parameter, the host application will see that there are no values saved for it, take the default value the plug-in defined when it created the parameter, and just add that default, unanimated, for each new parameter. A deleted or removed parameter is similarly pretty easy to handle. What's harder for us to handle is if you've changed the way you render. Maybe you've fixed a bug; maybe the output of your render was too bright.

So in 2.0 you've fixed that and darkened it a bit. But some users of the bright version may have worked around that bug by applying a darken filter, and what you don't want to have happen is for them to open their projects with your new plug-in version and see an extra-dark result.

So we have the FxVersioningAPI, which lets you find out the version that a project was created in. As you save subsequently, the version stays the same; projects still carry the old version. We've added a 3D API for getting the camera and layer transforms and focal lengths in Motion, and the FxTimingAPI I mentioned. The FxProgressAPI lets you report progress on a long render periodically and lets you sense when the user has canceled a long render. And we've added more examples to the FxPlug SDK and also improved the support in the host applications.

Okay. So, the FxVersioningAPI: you make this work by defining a new key in your Info.plist file, called version, and you can give it any index you want. It might be an index starting with 1; it might be something else. As I said, we remember the version of your plug-in that was used when your project was created, and you can find out the version of a project that's been opened by calling the version-at-creation method in the FxVersioningAPI.

This lets you handle backward compatibility. You might add a control letting the user decide whether they want to render the old way or the new way, or do some special thing that you think is right. We don't handle that automatically; we leave it up to you, because in some cases it might make sense and in some it might not.
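The brightness-bug scenario above can be sketched in a few lines of C. Everything here is invented for illustration: the version constants, the brightness values, and the idea that the creation version arrives as a plain integer (in practice it would come from the versioning API's version-at-creation call).

```c
/* Hypothetical version-aware rendering, not SDK code. Version 1 of
 * this imaginary plug-in rendered too bright; projects created with
 * it keep the old look instead of silently changing. */
enum { kPluginVersion1 = 1, kPluginVersion2 = 2 };

static float render_brightness(int projectCreationVersion)
{
    if (projectCreationVersion < kPluginVersion2)
        return 1.2f;   /* reproduce the old, too-bright output */
    return 1.0f;       /* corrected behavior for newer projects */
}
```

A plug-in could equally expose this choice as a checkbox parameter so the user decides which behavior they want.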

The 3D API is Motion-only and has four methods. The camera-transform-at-time method returns a 4x4 3D camera transform matrix, and the layer-transform-at-time method tells you the transform that's been applied to the layer to which your plug-in is applied. You can also find the camera's focal length, and you can query whether a layer is 3D.

The timing API has timing information about a clip, including the image inputs for a filter or transition plug-in and for image wells; you can find out the start time, duration, and field order. For effect timing, you can get the in and out points of the clip to which your plug-in has been applied, the start time and duration like you can for a clip, and the frame rate. There are also time-conversion methods for going between these different time scales.
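A sketch of the kind of arithmetic behind those time-scale conversions: mapping a frame index to seconds and back at a given frame rate. These helpers are hypothetical stand-ins, not the actual FxTimingAPI methods.

```c
/* Hypothetical time-scale conversion helpers (not SDK code):
 * frames to seconds and back at a given frame rate. */
static double frame_to_seconds(long frame, double fps)
{
    return (double)frame / fps;
}

static long seconds_to_frame(double seconds, double fps)
{
    return (long)(seconds * fps + 0.5);   /* round to the nearest frame */
}
```

Rounding to the nearest frame matters at non-integer rates like 29.97 fps, where a straight truncation can land one frame early.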

The progress API is really only useful for a plug-in that does a fairly slow render, but if yours does, it's essential. The main thing you use is the method that tells you the user has canceled; you just check it periodically to find out whether the user has tried to cancel your render. You don't need to return an error when you've been canceled; just abort and return.
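The check-periodically-and-abort pattern looks roughly like this in C. The callback type is a hypothetical stand-in for the host's cancellation query, and the example callbacks are invented for illustration.

```c
/* Sketch of the cancellation pattern (not SDK code): a slow render
 * polls a "has the user canceled?" callback every so often and simply
 * stops, without returning an error. */
typedef int (*CancelCheck)(void *hostContext);

static int render_rows(int totalRows, CancelCheck canceled, void *hostContext)
{
    int row;
    for (row = 0; row < totalRows; row++) {
        /* ... render one row of the output image here ... */
        if (row % 16 == 0 && canceled(hostContext))
            break;                    /* abort quietly, no error code */
    }
    return row;                       /* number of rows actually rendered */
}

/* Example callbacks: never cancel, or cancel on the third check. */
static int never_cancel(void *ctx) { (void)ctx; return 0; }
static int cancel_on_third_check(void *ctx)
{
    int *checks = ctx;
    return (*checks)++ >= 2;
}
```

Checking only every 16 rows keeps the polling overhead negligible while still responding to the user within a fraction of the render.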

There's another method for updating your progress and, in theory, putting an interesting number in the progress bar. That's not yet supported, but it's good to implement just in case. FxPlug 1.2.1 introduces four new examples. Previously, in 1.2 and earlier, we had SimpleMatte and SimplePaint, showing you how to implement a custom control view and an on-screen control, but that's not enough. So we've added one showing how to use the progress and cancellation methods in the FxProgressAPI; it's called Slow Solid Color.

It's a very simple plug-in that just renders very slowly. The options dialog example is, again, a very simple one that just puts a custom push button in the parameter list; when the user clicks it, it brings up a panel with information. And there's a directional blur example, which is more complicated and shows a lot of different concepts: in particular, pbuffers, how to handle a machine's capabilities for different kinds of OpenGL features, and something called effect helpers, which has all sorts of useful goodies.

We'll talk about the directional blur example at the advanced session this afternoon, so I encourage you to come and hear about that. There's also a scrolling rich text example which, again, covers lots of different concepts and is a great reference when you're writing a plug-in that uses concepts that weren't defined before but are now: it shows you how to handle pixel aspect ratio, gives you a control to turn correct handling of pixel aspect on or off, does the same thing for field rendering, and shows you how to hide and show parameters dynamically, among other things.

For more information about the FxPlug SDK, we have a mailing list, and I encourage everybody to sign up for it. It's the best way to talk with other plug-in developers as well as with Apple engineering. It's called pro-apps-dev, and you'll see it listed in the standard Apple mailing list sign-up.

We also have a mail group within Apple, but just a few Apple engineers read it. If you've got something confidential that you want to talk about, you can use that address, but you'll really get the best results from pro-apps-dev; the pressure of having everybody else see that we've been asked a question is really good for getting us to answer quickly.

The FxPlug SDK is available from connect.apple.com; just sign in, look in applications, and you'll see FxPlug. When the new version is up there, you'll see it in the what's new or new editions section. There's also other documentation, including something for porting your plug-ins from other plug-in architectures, up on the attendee URL.

We have a lab open today at the Graphics and Media Lab downstairs from 2:00 to 6:15. However, at the beginning of that we also have a session about advanced FxPlug development, so if you want to talk about FxPlugs, don't go to the lab looking for us; we'll be up here, in this same room. If you want to talk about the Aperture export SDK or XML workflows in Final Cut, then the lab is the place to be, but we'll also be down there after this session. The advanced session is from 2:00 to 3:15 in this room.

So, to summarize, we've added a bunch of new features in FxPlug 1.2 and 1.2.1. Again, they are retiming, 3D in Motion, progress and cancellation, plug-in versioning, timing improvements, improved consistency of plug-in hosting, bug fixes and enhancements, and the examples that we talked about. So, please join the mailing list.