Graphics and Media • 52:57
The FxPlug SDK lets you create hardware-accelerated plug-ins for Final Cut Studio image processing. Join us for an in-depth exploration of this SDK to learn about new FxPlug features and new opportunities for plug-in developers.
Speakers: Dave Howell, Vijay Sundaram, Gabriele de Simone
Unlisted on Apple Developer site
Transcript
This transcript was generated using Whisper; it has known transcription errors. We are working on an improved version.
I'm Dave Howell, an engineer working on FxPlug in the Pro Apps department at Apple. We're going to talk today about the FxPlug SDK, which we introduced at WWDC last year and at NAB before that. We're going to talk about how you write FxPlugs and introduce some new features in the new version that's coming out, FxPlug 1.1. Now, the FxPlug SDK is a plug-in architecture specifically for making image processing and visual effects plug-ins. Oh, that's so much better.
The Apple Pro Apps team put together FxPlug and first introduced it in Motion 2.0 last year, and soon we'll add support in Final Cut also. We're actually going to show you a little bit of what that's going to look like. The first reason we developed FxPlug as a new plug-in architecture last year was to be able to define the feature set ourselves, and to add new features to a plug-in spec designed specifically for our own applications.
It also enables you as plug-in developers to take advantage of the fact that this is running on one platform, so you can use Mac OS technologies like Core Image, Quartz Composer, QuickTime, Interface Builder, and so on. In fact, for custom UI, you can use NSViews and Cocoa to simplify the user interface.
Another reason was that we wanted to add hardware acceleration. FxPlug was the first plug-in architecture to add hardware acceleration, so you can run effects on OpenGL, or accelerated by Core Image or Quartz Composer. And when new hardware comes out, it lets us take advantage of things like universal binaries quickly.
Now, why would you want to write an FxPlug as a plug-in developer? First of all, and maybe the biggest reason for some, is that there are over 600,000 registered users of the apps that host FxPlugs. Another one is that your plug-ins can really shine in FxPlug because of the hardware acceleration.
Some of the plug-ins that we demoed last year had been plug-ins in other architectures before, and when they were ported to FxPlug, they got a several-times speed improvement in rendering. Another thing, as I mentioned, is that your custom UI can be done in Cocoa, which will speed up your development time.
On-screen controls in Motion can be done with OpenGL. We'll talk a little bit about differences between Final Cut and Motion, but one that I'll mention here is that on-screen controls won't be in the next version of Final Cut; they are still supported in Motion, and they're accelerated by OpenGL.
And as I mentioned, universal binaries. Without support for universal binaries, you would have to run a host app in Rosetta to support a plug-in, and you just wouldn't want to see that performance in a pro app. Another feature that we've added that I think is compelling for plug-in developers is support for YUV plus alpha. These are the Y'CbCr-with-alpha formats that Final Cut uses.
We had RGBA and ARGB in 1.0, but this is new. I've indicated in yellow some of the things that are new features. Floating point pixels we've already had, but now we have floating point YUV as well.
I'd like to have Vijay Sundaram come up and demo the forthcoming version of Final Cut and show you what FxPlugs will look like in there. He's a technical lead on the Final Cut team. Over to Vijay. Thank you very much, Dave, and welcome everybody to the session. I'm going to give you a quick demonstration on a future release of Final Cut Pro, but first I want to talk a little bit about a few things you should be aware of when writing FxPlugs for Final Cut Pro.
First off, as Dave mentioned, there's software and hardware support. Now, it's important that a plug-in allow for both hardware and software rendering, because our users typically work on the high-end towers as well as the MacBook Pros. So if you're taking advantage of a high-end GPU on the towers, you also want to give them a software solution so they can use the plug-ins on a system like the MacBook Pro. 32-bit float support is being introduced for Final Cut Pro in the YUV space.
This is the native space for Final Cut Pro, so there's no need for you to convert from YUV to RGB, and it helps performance, which makes your plug-ins more responsive for users. On-screen controls have been there for Motion, but they are not available for Final Cut Pro.
So if you have a lot of plug-ins that use on-screen controls, you want to make sure that you also use the controls that are inherent inside of FCP, to give users the equivalent experience whether they use the plug-in in the Motion app or in Final Cut Pro. Pixel aspect ratio is very important.
This is important because Final Cut Pro users tend to use a lot of different media, and therefore a lot of different pixel aspect ratios. It's very important for your plug-ins to respect this value as it comes in to you, so that the output looks correct to the user inside of Final Cut Pro. As I said, native YUV support is new. There are two flavors of YUV that we have.
For 8-bit, we use R408, and for 32-bit float, we use R4FL. More information about these two pixel formats is actually available right now, and Dave will tell you in his presentation where you can get a link to what these formats look like.
And finally, I was talking about support for software-based FxPlugs. We've introduced FxBitmaps with row bytes instead of just packed data, so that it's easy for you to get the data and use it natively instead of converting and unpacking it.
So having said that, I'm going to show you a quick demonstration on my machine over here of FxPlugs running inside of Final Cut Pro. So could we switch machines, please? Okay, so what happens when you first install your plug-in? Your plug-in will actually show up inside of the Effects tab.
Typically, if you have a filter like this one, where it says Examples would be your company name, and all of your filters would show up inside of that bin. Remember that we not only support filters, we also support generators and transitions. So there are two more types of plug-ins that you can create.
This particular example is a simple one; it's actually a 1.0 plug-in. What I'm trying to show you here is that we have backward compatibility with plug-ins that were built with FxPlug 1.0. So if you already have plug-ins, what you want to do when the future version comes out is to test them thoroughly and make sure they actually work and fit within the FCP workflow that a user is accustomed to. So in this particular case, I'm going to show you that it brings up the filter's controls.
This plug-in also shows you that we support custom controls. This is important because you can now add custom controls that are specific to your plug-ins and have them supported within Final Cut Pro. And this plug-in actually works: if you were to take the 1.0 version and compile it, you'd be able to bring it inside of Final Cut Pro and see that it works.
Also, as I was saying, this is a very simple filter we're using here. It actually uses Core Image to change the luma values, and it works in native YUV, so there's no conversion from RGB to YUV. That gives you a lot of performance, and the user experience gets better because the plug-in is more responsive.
Okay, so... The other thing I wanted to show you is that, to demonstrate this, we've actually ported some of our Motion FxPlugs into Final Cut Pro. I'm going to pick a very simple example here, like Compound Blur, and remove this filter here.
So anybody who's used Motion before will probably be familiar with this plug-in. And as you can see, it just automatically shows up inside of Final Cut Pro. These plug-ins that were developed with the FxPlug 1.0 SDK in Motion now automatically come up inside of Final Cut Pro. So that's pretty much my demo as far as Final Cut Pro is concerned. And I thought I'd show you a quick thing and my editorial prowess over here where I've put everything together so that you can see all this work.
Can I get the audio too, please? So I'm sure you guys are all excited to learn more about this API and how you can create plug-ins for Final Cut Pro and Motion. And remember, these plug-ins now have two hosts inside of the Final Cut Studio suite. And as Dave mentioned, there are 600,000-plus registered users, so you have a big customer base that you can now reach. Thanks a lot, Dave. I'll give it back to him.
So we'll talk a little bit about how to write an FxPlug plug-in. What I'll do right now is just go over the five simple steps. You go into Xcode, say New Project, and choose one of our FxPlug templates from the list of Xcode templates. Then you create unique IDs so that your plug-in can be differentiated from other installed plug-ins. Then customize the source code, which I'll gloss over for the moment because we'll spend a lot of time on that later.
Then install your plug-in and test it in Motion and Final Cut. For the first of those steps: in the standard Apple plug-ins section of the Xcode templates, you'll see a project template for FxPlug Filter, one for FxPlug Generator, and now with 1.1, we're adding FxPlug Transition as well. So just choose one of those and give it a name.
and you'll see your project. Now, what I have here is a pretty simple example. The FxPlug filter example just has two source files, a .h and a .m file, plus an Info.plist, which is what describes the plug-ins so that the host applications can enumerate plug-ins without actually having to load their code.
And then of course localizable strings are in there as well. And that's really it for the samples. There's not much source code in them either, so you can really get started and hook in your own code. If you're trepidatious about going to Objective-C, the fact is that the templates do the main bulk of the Objective-C work you need, and you can call your own C or C++ functions directly from there.
The next step is to choose unique IDs. This is sort of a wonky little detail, but don't forget it, because the first time you create a plug-in from one of these templates and don't change the UUID, it may load. But the second time, you'll have two plug-ins with the same ID, they'll conflict, all hell will break loose, and you don't want to see it.
So inside your Info.plist, there are a couple of places where the UUIDs show up. One of them is the group, which is the category that your plug-in falls into. In your plug-in bundle, you get to define a list of groups and a list of plug-ins.
If you're just doing one, you'll just have a single group in here, and it just defines a name and a UUID. So you replace that UUID with a new one. You do that typically by going into Terminal, typing uuidgen, and copying the result and pasting it in here.
I'll show you a shortcut for that in a minute. And then that UUID that you just created and pasted in there, you'll paste in again for the plug-in that belongs to the group you just defined. So for each plug-in you have here, you'll have a name, a class name, a display name, the group UUID, and then the plug-in UUID. So you also need to modify the plug-in's UUID.
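The shape of those Info.plist entries is roughly like this (a sketch: the exact key names come from the template's own Info.plist, and the UUID strings are placeholders you would replace with fresh uuidgen output):

```xml
<key>groups</key>
<array>
    <dict>
        <key>name</key>
        <string>My Filters</string>
        <key>uuid</key>
        <string>PASTE-FRESH-uuidgen-OUTPUT-HERE</string>
    </dict>
</array>
<key>plugins</key>
<array>
    <dict>
        <key>displayName</key>
        <string>My Filter</string>
        <key>className</key>
        <string>MyFilter</string>
        <key>group</key>
        <string>SAME-UUID-AS-THE-GROUP-ABOVE</string>
        <key>uuid</key>
        <string>A-SECOND-FRESH-uuidgen-OUTPUT</string>
    </dict>
</array>
```

The point to notice is that each plug-in entry carries two UUIDs: its own, and the UUID of the group it belongs to, which is why both need updating.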
Here's a tip for doing that. This is described in the FxPlug SDK overview, so you don't have to memorize it. But if you create an Xcode user script and install it, this one that's shown here lets me just press Command-Shift-U. So I select one of the UUIDs, I press this keystroke, and it generates a UUID right there by running the uuidgen command. It's a great time-saver.
The other thing is that this step is pretty easy, because when you look at the Info.plist, there's a comment above each UUID saying, don't forget to change this. You can see that some of our developers ran into this and were asking me questions, and that's why I go over it in such detail. In customizing the source code, there are two simple things you have to do: edit the parameter list, where you declare your parameters, and edit your render method or methods. Your rendering can be CPU-only or GPU-only.
You can do both, and I highly recommend doing both, especially now that Final Cut support is coming, because some rendering paths will normally go in hardware, some will be faster in software, and the app will decide which one's best. If you're playing out to video, for example, the app needs to recompress a frame, so it has to read back a texture from the card, and that takes time; so it will opt to do software rendering most of the time, unless you say you only do hardware rendering, in which case it will do it that way. But try to implement both, because you'll get better performance in some circumstances, your users will be happier, you'll make more money, and everything will be good.
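In FxPlug 1.x terms, supporting both paths means your render method has to cope with being handed either bitmaps or textures. A rough sketch (class and method names follow the FxPlug SDK as described in this session, and the two helper selectors are hypothetical):

```objc
// Sketch only: one render entry point that dispatches to a software
// (FxBitmap) or hardware (FxTexture) implementation depending on what
// kind of FxImage the host passes in.
- (BOOL)renderOutput:(FxImage *)output
           withInput:(FxImage *)input
            withInfo:(FxRenderInfo)renderInfo
{
    if ([output isKindOfClass:[FxBitmap class]]) {
        // Software path: walk the pixels in main memory.
        return [self softwareRender:(FxBitmap *)output
                              input:(FxBitmap *)input];
    }
    // Hardware path: draw into the output texture with OpenGL.
    return [self hardwareRender:(FxTexture *)output
                          input:(FxTexture *)input];
}
```

A plug-in that only implements one path still works, but as the talk notes, the host will pick software rendering in situations like playback to video, so implementing both keeps you fast everywhere.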
Then just build your plug-in. You'll need to either copy it each time into one of these locations — /Library/Plug-Ins/FxPlug, or Library/Plug-Ins/FxPlug in your home directory — or do what I usually do, and install a symbolic link there pointing at the build product. I install the symbolic link once, and then every time I build, the host picks up the change.
You'll still need to relaunch the host app after building your plugin, and be sure to test in both Motion and Final Cut, and if you have an old version of one or a newer version of one, have them all and test on everything because there will be differences.
Since introducing FxPlug, we've had quite a number of developers begin to write plug-ins, and some of them have released them, and some of them announced them and are going to release them. And I want to bring up one particular one that I think typifies both the power of FxPlug and its custom UIs and the kind of complex things that you can do.
And it also represents an opportunity for plug-in developers to write plug-ins that plug into their sort of plug-in architecture that sits on top of FxPlug, or below it. Anyway, here's Gabriele de Simone from Noise Industries to talk about the FxFactory FxPlug. Thank you, Dave.
Thank you. Good morning, everyone. So what I'm going to show you this morning is our new product under development for FxPlug. And as you've just learned, it'll be working the same in both Motion and Final Cut Pro. So let me start off with a couple of plug-ins. I have my little sequence here.
So, as you've just learned from Vijay, the filters will appear in the same place where you're used to seeing your current plug-ins that you may be shipping for Final Cut Pro. So, in our case, we've created a little category called Noise Industries. These are some of our test plug-ins. This is called Watercolor.
I'm going to just play with a couple of filter parameters to show you the kind of funky effects that we can do. So you've seen plug-ins before, obviously; if you're a member of this crowd, you've probably shipped plug-ins for either Final Cut Pro or other hosts. This is another one called Threshold Posterized. Since you're used to seeing plug-ins, I'm going to jump straight to the beef of the product, which is actually a separate application called FxFactory.
And this is sort of new, unlike anything you've seen in this market. What you see here in the main window is actually a list of effects packs. The little boxes on the left just represent a bunch of plug-ins; there could be one, there could be hundreds in a pack. And the best part about the product is that you can actually inspect these plug-ins. So I'm going to double-click it, or click the button.
There you go. By opening that FxPlug, what you're seeing is its contents, which is plug-ins. There's a category; you can go in and change the category name for localization purposes, or change the names of the plug-ins. But where things get really interesting is that everything you would normally be able to do only in code, you can do via this UI, and sort of play around with the plug-ins.
So what you see here is the basic information about the plug-in: name, author, and copyright information. I'm going to skip the rendering part and get back to it in just a minute. The next tab is the parameters. These are the parameters that will appear in the UI in Final Cut Pro and Motion.
This list is not static. In other words, I can play around with the parameters that you would see in the host. For instance, the Watercolor filter that I just showed you has this list of parameters. And as you can see, you can tweak the default values of the parameters, the default ranges for the sliders, or create new UI widgets.
But the reason this is very powerful is that FxFactory is different from any other plug-in you've seen before for another reason. The usual development method for shipping plug-ins has been that, as you've seen, you create an Xcode template, add in your code, compile it, and ship it, which is, in our point of view, rather slow. So what we've done is leverage some really amazing technologies that are part of OS X.
So what you'll see here in the rendering portion is an Edit button. By clicking it — I'm going to do it for the Watercolor filter, for instance — it brings up a Quartz composition. Each one of our plug-ins renders through a Quartz composition. You might have seen Quartz Composer in many of the other sessions here at WWDC.
For people who are not familiar with it, this is a node-based compositing engine. These are called patches, and by feeding various kinds of inputs, generating outputs, and passing them on to the next patch, you basically have your rendering pipeline. As you can see, the watercolor effect is a medium-complexity plug-in; there are some color controls and hue adjustments, all the way to the rendering phase.
So in the limited time that we have, I'm just going to show you a few tricks. So another plug-in that we have is called the comic book plug-in. And suppose that I wanted to use that plug-in, but I just don't like certain parts about it. Well, normally, you're out of luck. If you're the developer, you can code a new plug-in. If you're a user, you're forced to just wait for the next update.
In the case of FxFactory, you can just come in here and say, please duplicate this plug-in. It creates a Comic Book 2 plug-in. And just by clicking Edit, I can see that the Comic Book plug-in happens to do some half-toning; it uses a Dot Screen Core Image unit. So I can come in here and say, show me the half-toning effects that you have, and I can drag in a Circular Screen, for instance.
I'm going to pipe the outputs and inputs into the right places, and I can just remove that Dot Screen. And just to make it customizable by the user, I'm going to publish certain inputs. This is terminology that, unless you're familiar with Quartz Composer, may be a bit new to you. When you publish one of the inputs, it means that input will be available to the user, to the host application. So I publish just the center, and we're going to publish the width and the sharpness.
And it's as simple as this. I'm going to click Save, Quit, apply my changes. And what you see here now is that the UI has sort of changed because I've obviously messed around with the plug-in parameters. So I'm going to-- I've created a few. I've added, obviously, a new center parameter.
I'm going to make them into a point parameter. And via this UI, you can set where the default location of that parameter will be. And you can see that the FxFactory actually created automatically the other two parameters for me. So just by clicking Save and restarting Final Cut Pro, we're going to test our new plug-in.
This shows the tremendous opportunity, really, of Quartz Composer as an OS-level technology, one that incorporates Core Image and JavaScript. There are all sorts of new features coming in Leopard that you might want to look at. So this is the sequence again. If you look at the effects palette, under our category, you'll see that there is now a Comic Book 2 plug-in. So let me just drag it over our clip.
And although it might be a little hard for you to see, it's actually using a circular screen for the half-toning instead of the previous Core Image effect. How hard is it to create plug-ins with this procedure? I've shown you how you can take an existing one just by clicking the Edit button. Well, as effects developers, what you will find yourself doing is taking existing compositions and creating plug-ins. We obviously don't have the time to create a new one from scratch, but I've taken a few of the compositions that you might have seen around at WWDC.
This is the jumping Peter Graffagnino composition. And because this is going to be the most useful plug-in you've ever seen, I'm just going to make a new Final Cut Pro plug-in out of it. It's obviously going to be a generator. I'm going to delete that background color and just save it, and perhaps publish one parameter — color — just to show you the automatic plug-in creation. The way it works is you just click the New Plug-in button.
Click, and choose your composition. It guesses automatically whether it should be a generator or something else based on the composition parameters. And this is the default value for the only parameter this plug-in has. I also downloaded something off the internet: this is a Fireworks plug-in.
It's a generator. And obviously in this case, if we're planning to use it inside Motion or Final Cut Pro, we might want to customize the background color, both here and from Motion, so that you can overlay it over an existing track. And this is actually... ah, there you go. We'll do it from FxFactory. So I'm going to create a new plug-in with Fireworks: Fireworks 2.
I'm just going to hit Save and again restart Final Cut Pro, and you will see that we've just created another plug-in. This is a very powerful concept, because you may be used to development schedules that include the coding, learning the SDK, and all the testing. If you're interested in the product, you should start thinking of your plug-in in terms of intellectual property.
The intellectual property of your effects isn't really in the parameter UI, or in loading and unloading your plug-ins, or in learning the SDK-specific details. So I'm going to show you in here — oh, it's a generator, so we'll look under the Generators category. And there it is, the jumping Peter generator.
There it is. This is obviously a very useful plug-in, so we're planning to sell it. To go back to my discussion about intellectual property: what FxFactory allows you to do is concentrate on the aspects of your plug-ins that are really specific to what you're delivering. You can create a Quartz composition, you can write new Core Image units, and deliver to the user only the component that's really important — and with FxFactory, create a plug-in really, really fast.
What's even better: with Leopard, Quartz Composer is introducing plug-ins. So within a Quartz Composer composition, you can have more custom code, not just Core Image units, and have your development schedule virtually halved, or turned from months into days. All right. Thank you for your attention. Back to Dave. Back to slides, please.
Okay, so what's new in FxPlug 1.1? There are a few things. There are some new protocols that we've introduced, which all have the suffix _v2 on them, and we'll talk about them a little later. There's a new parent protocol called FxBaseEffect: now that we're adding transitions, I've decided to take all of the common methods that both the filters and generators were defining and put them in a parent protocol. This actually simplifies things a little bit for your development and for understanding the SDK. And a plug-in can now return a properties dictionary that describes its requirements and capabilities.
For transitions, we've added the SMPTE wipe code equivalents, so that when we're writing an EDL, or edit decision list, we can write down the number for the wipe code that you've specified for your transition. And we've added an FxTransition protocol. As I said, we have universal binary support, which actually came out in FxPlug 1.0.3, but that was since last WWDC, so I thought I'd bring it up. We've added string parameters, which were mysteriously missing from 1.0, and added some new bitmap features that I alluded to before — the YUV and floating point YUV — and also row bytes.
Now, when you write a plug-in with the 1.1 SDK, it should work in both older hosts, meaning Motion 2, and newer hosts, unless you're using some of the new features; then it will only work in a host that actually supports them. You'll be able to query dynamically whether features are available and fall back to different behavior if they're not, or, if you use dynamic registration, you can opt for your plug-in not to be registered at all when a feature is missing. You may require a certain host API, and it might not be there. But existing plug-ins that you wrote with FxPlug 1.0 will continue to work in new hosts as well.
But you should still test them. As for the _v2 protocols: these are child protocols, and they inherit all of the methods defined by the parent protocol. So where before we had an FxParameterRetrievalAPI, we've added an FxParameterRetrievalAPI_v2 that inherits all the old methods and adds one more, which gets a string parameter value.
So the trick here is that if you don't need string parameters, use the 1.0 protocol, so that you'll still run in an old host. If you do need string parameters, use the new one: you request a host API, and if it's not there, you can choose not to use a string parameter. Maybe you can use a number, depending on what you're doing, or maybe a custom parameter.
And one thing to remember: when you require an _v2 protocol, remember to list it in your Info.plist. There's a list of all the host APIs that you need; it's the protocol names. Remember to add this so that we'll know not to load you if the protocol is missing. That's if you require the _v2 protocol — put it in your plist. And as I said, use the base protocols if you don't need the new features.
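Putting those points together, the query-and-fall-back pattern might look like this (a sketch: `apiManager` is the host API accessor the host hands your plug-in, `kTextParmID` is a hypothetical parameter ID, and the getter selector is an approximation of the one the session describes, so check the SDK headers):

```objc
// Prefer the _v2 protocol, but degrade gracefully on an older host.
id<FxParameterRetrievalAPI_v2> v2API =
    [apiManager apiForProtocol:@protocol(FxParameterRetrievalAPI_v2)];
if (v2API != nil) {
    NSString *text = nil;
    [v2API getStringParameterValue:&text fromParm:kTextParmID];
    // ... render using the string ...
} else {
    // Older host (e.g. Motion 2): fall back to the 1.0 protocol and a
    // numeric or custom parameter instead of the string parameter.
    id<FxParameterRetrievalAPI> v1API =
        [apiManager apiForProtocol:@protocol(FxParameterRetrievalAPI)];
    // ... render without the string ...
}
```

Note this dynamic check is the alternative to listing the _v2 protocol as a hard requirement in the Info.plist: require it in the plist and the host won't load you at all when it's missing; query at runtime and you can keep working with reduced behavior.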
Now, the FxBaseEffect parent protocol defines the common methods that had been in filters and generators before. There's one that says whether your plug-in varies over time; one that you use to add your own parameters; one where you get notified when a parameter value changes, if you care; and one where you're asked to return your properties. The generator, filter, and transition protocols all derive from that base effect.
And here's the properties dictionary that I mentioned. There are a few keys that we have in there, and all the keys are optional. If they're missing, they imply a default value that's documented in the headers. So, for example, there's one that says "may remap time," and if that key is missing, we'll assume that you may remap time, which means that we have to be ready.
We can't assume that the frames you're going to operate on are the ones at the time when you're going to render your output. In some cases, you won't need to look at these at all, but in other cases — things like preserves alpha, may remap time, and pixel independent — you can get better performance by setting these flags.
As I said before, the SMPTE wipe codes are only relevant for transitions, and they're in the properties dictionary. So I wanted to put up an example of a properties method; it shows that you just build a dictionary with objects for all these keys and return it. A 1.0 FxPlug won't implement this method, but that's fine. If you build a plug-in that you wrote against the FxPlug 1.0 spec, you'll see a warning about this method not being implemented, but it'll actually still work.
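A sketch of what such a properties method might look like (the kFxPropertyKey_* constant names here are paraphrased from the talk, not copied from the SDK headers, so treat them as placeholders):

```objc
// Sketch only: build a dictionary of capability flags and return it.
// The key constants below are illustrative stand-ins; the real names
// are declared in the FxPlug 1.1 headers.
- (NSDictionary *)properties
{
    return [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithBool:NO],  kFxPropertyKey_MayRemapTime,
        [NSNumber numberWithBool:YES], kFxPropertyKey_PixelIndependent,
        [NSNumber numberWithBool:YES], kFxPropertyKey_PreservesAlpha,
        nil];
}
```

Declaring that you do not remap time, for example, tells the host it only needs to hand you frames at the current render time, which is one of the performance wins described above.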
But do implement your properties dictionary. The transition protocol is similar to FxFilter, but obviously has two inputs instead of one — an A and a B input — and also a time fraction that goes between 0 and 1. At 0, input A is used, and at 1.0, input B is used, by convention. The new plug-ins will be universal binaries: I set up the Xcode templates so that the debug version, the development build, gives you a native build for whichever machine you're on, and the deployment build is a universal binary.
And string parameters: there are three new methods for string parameters. If you happen to be working around the lack of a string parameter by using a custom parameter in 1.0, you'll probably be thrilled that we've added these, but I'm just going to go right past them. The new bitmap features: the primary one is the two formats, R408 and R4FL. If you're familiar with these, great. If you're not, they're documented in one of the Ice Floe articles; I'll have a link to that later on.
The link is to Ice Floe number 19, where they talk about the YCbCr support that was added to QuickTime a long time ago. Also, we added row-bytes support. Previously, row bytes was always the width times the size of the pixels in bytes, but now that can vary, which is great for performance. You'll never get bitmaps with any of these new features unless you set the appropriate values in your properties dictionary saying that you need them. That's for backward compatibility with old plug-ins.
Now, back to the foundations of what an FxPlug is. An FxPlug is a ProPlug, and a ProPlug is our own flavor of NSBundle, defined by its conformance to a plug-in protocol. The other protocols involved in a ProPlug are host API protocols; these are the equivalent of a callback suite in another architecture.
What you do if you want to call back into the host is request a host API object — that's what we call an object that conforms to a host API protocol. If you get it, you can call the methods on it. If it's not available — because, say, you're running on an older host — then you have to fall back to other behavior.
Now, ProPlug host API access, registration, and so on are all handled by the PluginManager framework. There are only a couple of methods you need to worry about, and one of those you'll never use unless you're doing dynamic registration. So basically, just look at the FxPlug SDK overview, and it'll tell you what you need to know about this.
Most of you, probably almost all of you, are by now familiar with Objective-C protocols. But just in case you aren't (maybe you've been doing plug-ins for a while and haven't had to get into that), they're similar to C++ mix-in inheritance or to Java interfaces. A really simple one is shown here; it's a protocol that's actually from our SDK, and it just defines one method.
And what that means is that if you define a class that conforms to this protocol, you have to implement this method. That's all there is to it. I alluded to inheritance of protocols. Like classes, a protocol can inherit methods from a parent protocol. Here's an example of a parent protocol defining method 1 and a child protocol that inherits method 1 but also defines a method 2.
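Since protocols are being compared to C++ mix-ins and Java interfaces, the same shape can be sketched in plain C with structs of function pointers: a child "protocol" embeds the parent's table, so conforming to the child also means supplying method 1. This is only an analogy for the inheritance relationship, not how the Objective-C runtime actually works; all names here are invented.

```c
/* A parent "protocol" declares method1; the child embeds the parent's
 * table and adds method2, so conforming to Child supplies both. */
typedef struct { int (*method1)(void); } Parent;
typedef struct { Parent parent; int (*method2)(void); } Child;

static int my_method1(void) { return 1; }
static int my_method2(void) { return 2; }

/* A "class" conforming to the child protocol fills in both slots. */
static const Child kConformer = { { my_method1 }, my_method2 };
```

In Objective-C the compiler enforces the same thing: declaring conformance to the child protocol obligates you to implement both the inherited method and the new one.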
We'll talk a little bit about the APIs in the FxPlug SDK, not in extreme depth, but enough to give you a feel for them and an overall overview. You define your parameters by implementing an add-parameters method. Here's an example of one that just adds a floating-point slider and a 2D point parameter. What we do is request the FxParameterCreation API object from the host, and if it's there, we add the parameters.
And here, really quickly, are the types of parameters that we support: a floating-point slider; an integer slider, the same thing but quantized to integral values; a toggle button or checkbox; an angle slider; two flavors of color, RGB and ARGB; and a 2D point, which is an interesting one because it implies a sort of custom on-screen control. Whenever you have a 2D point, you'll have a little crosshair at the default value for the point, but you can also control it with other controls, depending on the application you're in.
There are pop-up menus, which let you define an array of Unicode strings for the menu items, and an image well, which is the way that your plug-in can refer to other tracks or imported media. The user can drag a piece of media onto it, or drag a clip from another track onto the image well, or it can be a still image. And if none of those work for you, you can always define a custom parameter and draw any crazy stuff you want in there.
Some people will use those just for drawing their logo, and if somebody clicks on it, it'll bring up a dialog panel; but others will do much more creative things with them, like color mixers or whatever. The group parameter won't be in the next version of Final Cut, but it's in Motion. And we have a couple of optional types that happen to be supported in Motion and in Final Cut, but who knows, some host might not support them. The two of them are histogram
and gradient. The gradient's pretty useful. The histogram is fairly specific to the Motion histogram FxPlug, but I think the gradient is useful; it's a 1D gradient. When you get the values for the parameters, you use the FxParameterRetrieval API, and there's a method for each type of parameter; you just call those to get the values. And similarly for setting, there's the FxParameterSetting API, and you call the appropriate methods.
I'll talk a little bit about custom parameter UI. Any parameter can have a custom UI; it doesn't have to be a custom parameter type. It could be a checkbox, and instead of the standard checkbox, you might want to define your own more beautiful control. A custom parameter UI can be any subclass of NSView. You can use Interface Builder to create it, so that it'll scale with proportional sizing and all that, and to ease development; or you can make it programmatically from scratch.
In your custom parameter UI's NSView subclass, you'll get events just like any other NSView. And if you want to add a pop-up menu in there, a contextual menu, you either call setMenu: or implement menuForEvent:, just like you would with any other NSView. You'll find this very familiar if you've been doing Cocoa programming.
And of course, you can get key, mouse, scroll wheel events, pen pressure and angle and eraser information and all that stuff. And if you get an event, you'll probably want to change a value on some parameter or change state or change whether or not a parameter is hidden or something interesting. And to do that, you need to use this action API, the custom parameter action API.
Which is pretty simple. You call the start action method before accessing any parameters, and the end action method afterward. That just makes sure that the host application has everything in the state it needs for you to get to your parameter values.
Now, the way that you assign a custom parameter UI is, when you create the parameter, you have a set of flags that you can turn on (they're just bit fields), and you set the custom UI flag. Then, if you've implemented the custom parameter view host protocol, which you should, it simply asks you to return an object of your NSView subclass.
One other method that's in the action API is the current time method. You'll use this not just for custom parameter UI, but also for on-screen controls. And there may be other instances where you need the current time and you aren't handed it. In most cases, like when it's time to render, you'll be given the current time, so you won't have an issue with that. But if you're doing something non-standard, communicating with another application, or opening up a separate window or something, you'll need it.
And you do need to call start action and end action around getting that. Another thing to remember: if you implement the custom parameter view host protocol, don't forget to add it to the list of protocols that you implement. So in this case, we have a generator that conforms to FxGenerator and to the view host protocol.
Now, custom parameter types, which are of course distinct from custom parameter UI: you can have a custom parameter type with no UI, perhaps a hidden one that doesn't appear at all, or a custom UI with a standard parameter type. When you make a custom parameter type, you define a class and create a default value that's an instance of that class.
I'm using the MyData class for this; make an instance of it your default value, and from that, the host app will know what the class of your objects is. There are some restrictions. At this point, we don't support animation of custom parameter types. Their value isn't linearly interpolated between keyframes; it stays flat until the next keyframe, like a stair-step function. So you should set the not-animatable flag. Although, even if you don't set that, we'll know not to animate it.
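That hold behavior can be sketched in a few lines of C: the value at time t is simply the value of the last keyframe at or before t, with no blending between keyframes. This is an illustrative helper with invented names, not SDK API:

```c
#include <stddef.h>

/* Stair-step (hold) lookup: return the value of the most recent
 * keyframe at or before time t; before the first keyframe, clamp
 * to it. times[] must be sorted ascending. Illustrative only. */
static double held_value(const double *times, const double *values,
                         size_t count, double t)
{
    double result = values[0];
    for (size_t i = 1; i < count; i++) {
        if (times[i] <= t)
            result = values[i];
        else
            break;
    }
    return result;
}
```

Compare this with a standard float slider, where the host interpolates linearly between keyframe values; a custom type holds each value flat until the next keyframe.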
But who knows what will happen in the future, so you should set that. And the second thing is that the object needs to conform to the NSCoding protocol, which means you have to implement two methods, encodeWithCoder: and initWithCoder:. That's your way of serializing and deserializing custom data types, custom objects.
The other thing is that you have to do keyed coding in your NSCoding implementation. It's possible to do NSCoding another way, but you need to do it this way, which means you use encodeObject:forKey: and decodeObjectForKey:. That's just an implementation detail of how we support this. Here's an example of that; again, this is in the FxPlug SDK overview.
Now, rendering. There are two types of image, and they both inherit from the FxImage base class. You won't create an FxImage directly. You'll either create an FxBitmap or an FxTexture. And a bitmap, of course, is a RAM-based thing where you can poke the pixels in software, and FxTexture is an OpenGL pbuffer.
FxBitmap in FxPlug 1.0 had 8-bit ARGB, 16-bit, and 32-bit, the 32-bit being float. We've now added the 8-bit integer R408 and the floating-point R4FL. Here's the link to that, but if you just search for Ice Floe 19, you'll find it too. All of our bitmaps use premultiplied alpha. And now, if you've set the flag in your properties saying that you support it, they may have interesting row bytes too.
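Premultiplied alpha means every color channel has already been scaled by the pixel's alpha. A minimal sketch of premultiplying one 8-bit ARGB pixel (illustrative, not SDK code):

```c
#include <stdint.h>

/* Premultiply one 8-bit ARGB pixel in place: scale each color channel
 * by alpha / 255, rounding to nearest. Channel order here is A, R, G, B. */
static void premultiply_argb8(uint8_t px[4])
{
    unsigned a = px[0];
    for (int c = 1; c < 4; c++)
        px[c] = (uint8_t)((px[c] * a + 127) / 255);
}
```

If your effect generates straight (unpremultiplied) color, convert it before handing pixels back to the host, or edges will composite incorrectly.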
FxTextures are, again, pbuffers, and you'll always need to ask for the texture coordinates. They're really just thin wrappers for OpenGL textures, so you can do the things you could do with other textures: you can bind and enable them, and you can get the texture ID and do OpenGL operations on it. And those, like the bitmaps, use premultiplied alpha.
Now, I won't go deeply into retiming, but basically, this applies when you're told to render in a filter. We only have this in filters, not in generators or transitions; it wouldn't make any sense in a generator, and in transitions, well, it's just not there. What it lets you do is request an input image at some time other than the rendered output time.
You don't need to do anything special for parameter types other than images. Images are special because they have to be loaded, so we ask for those specifically and use a host protocol to get them. For the other parameter types, when you get a parameter value, you say get-value-for-parameter-at-time, so the time is already explicit there. Now, we have four methods for getting a filtered or unfiltered image at some time.
A filtered or unfiltered image, meaning that if you're the fourth filter applied to some input, you can get it without those other three or with those other three applied. If you want the raw pixels, you can get them too. And you can get your input image as a bitmap or as a texture.
Now, on-screen controls. As I mentioned, the point parameter is sort of a trivial example of an on-screen control, but you can draw anything in your on-screen control. You can completely obscure the canvas and draw your own canvas that does something interesting. You can do a split screen, where you use the image for half and a modified image for the other half. You can do all kinds of crazy stuff in there. A fairly simple but pretty example is the Motion kaleidoscope filter, which draws an angle control, an arc control, as an on-screen control, all with OpenGL primitives. And for the moment, on-screen controls are only in Motion.
Now, when you draw your on-screen control, you can assume, first of all, that you're operating on a texture, because you're using OpenGL; and that the image has already been rendered and you're rendering on top of it, compositing over the top of the frame and the canvas. And you can choose your drawing coordinates; you can switch between object, window, and document coordinates, which are documented in the overview. And use OpenGL anti-aliasing to make nice edges.
For on-screen controls, we have our own methods for getting events, not NSEvent. They're very simple: just mouse and keyboard events, not the tablet pressure and so on. You can set parameters based on these inputs, and you should use the start action and end action methods before and after modifying parameter state or values.
For selections on the on-screen controls, we use an OpenGL feature that you might or might not be familiar with: GL select mode. What you can do is implement one drawing method for both rendering and for defining your selection areas. If you draw a part of your control while you're in select mode, that creates a selectable shape.
And when the user clicks in one of those, the host app (Motion, specifically) will use OpenGL to figure out which shape was clicked in for its hit testing. But you don't load textures for this, and you don't need to do anti-aliasing, because each pixel is either on or off for hit testing. If your control has different parts, you draw each with a different selection ID, and that's how Motion differentiates the parts.
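GL select mode itself records named hits during a pick render, but the same idea can be modeled in software: each control part stamps its selection ID into an ID buffer, and hit testing just reads the ID under the click. This ID-buffer version is an illustrative stand-in for what the host does with OpenGL, with invented names throughout:

```c
#include <stdint.h>
#include <string.h>

enum { OSC_W = 16, OSC_H = 16 };

/* Software stand-in for select-mode hit testing: each part of the
 * control "draws" its shape tagged with a selection ID (0 means
 * nothing selectable there). */
static void draw_rect_id(uint8_t idbuf[OSC_H][OSC_W], int x0, int y0,
                         int x1, int y1, uint8_t sel_id)
{
    for (int y = y0; y <= y1; y++)
        for (int x = x0; x <= x1; x++)
            idbuf[y][x] = sel_id;
}

/* Hit testing reads back the ID under the click; no textures or
 * anti-aliasing needed, since each pixel is simply on or off. */
static uint8_t hit_test(uint8_t idbuf[OSC_H][OSC_W], int x, int y)
{
    return idbuf[y][x];
}
```

The payoff, as in the real select-mode scheme, is that one drawing routine serves both rendering and picking: the shapes you draw are exactly the shapes the user can click.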
To summarize: Final Cut Pro support for FxPlugs is coming. That's probably the big news; we mentioned it last year too, but now it's even closer than it was. And the biggest new features are transitions, the new bitmap features, and string parameters.
And you should join the Pro Apps Dev mail list. In the past, we've had people just send to a group address at Apple that some of the Motion engineers, Final Cut engineers, and FxPlug people look at. But that's private, so none of the other developers can see your question or the answers. If you want private communication, that's still the best way to go. But we want to try to open things up with this mail list.
Just go to the Apple mail list server and sign yourself up. You can read other people's responses, and you can search them later to see if a question you're about to ask has already been answered. I think that'll help get some information flow going. And do optimize for both the software and hardware rendering paths, and test in each of the different host applications.
For any additional information, as I said, there's the SDK mail group for private communications and the mail list for group-wide discussion, and you can contact me or the groups. The documentation is up on the web. Also, if you go to Apple Developer Connection, search the downloads and watch for the announcement of FxPlug SDK 1.1. It will be up there real soon now.