Apple Applications • 41:57
In this session we will introduce FxPlug, the new SDK for quickly creating hardware accelerated filters and generators for the Apple Production Suite. We will demonstrate building and running a filter in Xcode using FxPlug templates, and show how to use Apple's Core Image framework to write hardware accelerated plug-ins on Tiger. We will also show off the results from several developers that have already created new FxPlug plug-ins, and we'll demonstrate them in Motion.
Speakers: Dave Howell, Pete Warden
Unlisted on Apple Developer site
Transcript
This transcript was generated using Whisper, so it may contain transcription errors.
Hello. Welcome to the FxPlug overview session. My name is Dave Howell and I'm an FxPlug engineer. We're going to talk today about the FxPlug spec. You'll learn what the FxPlug spec is, what the SDK is, and why we developed it, and we'll talk about how to write your own FxPlug plug-ins. First, FxPlug is a new plug-in architecture that we've introduced. It was created by the Apple Pro Apps team, and the first host app, released just last month, is Motion 2.0.
Now, we developed the new plug-in architecture for quite a few reasons. There are other ways to write plug-ins that will run in Motion, but we wanted to add features that we think plug-in developers need to really make their image processing algorithms and effects shine. Some of those are things that we have in our own apps, like the dashboard in Motion, and taking advantage of the fact that Motion renders its canvas in OpenGL, so you can render custom on-screen controls right onto the canvas in OpenGL. Then there's hardware acceleration: there isn't another image processing effects plug-in architecture that lets you do hardware accelerated plug-ins. And we wanted you to be able to take advantage of Core Image, OpenGL itself, Core Graphics, and Quartz Extreme.
Another thing is that most plug-in architectures are cross-platform, and a disadvantage of that is that the people designing them need to use the least common denominator between the different platforms the plug-ins will run on. We didn't have to do that; we were able to leverage the strengths of the Apple platform — all the things you've heard at the other sessions here at the conference, like first-class adoption of OpenGL, Core Image filters, and creating interfaces using AppKit, Interface Builder, and Objective-C itself. Now, why did we go and use Objective-C? Well, we wanted — both in our design of the architecture and in your plug-ins — all of the elegance of C++ without the problems you get with successive versions of the GCC compiler, like the application binary interface changing from one version to another, or the fragile base class problem that C++ presents. Objective-C helps with that. And of course we wanted you to be able to access AppKit and Core Image in particular, and to use Interface Builder, which integrates nicely with the classes in your controllers and views when you're doing custom parameter UI.
So, reasons you might want to write FxPlug plug-ins — port any plug-ins you may have to the FxPlug SDK, or develop new plug-ins with it. One is that even though at the moment we're only supporting FxPlug in Motion, we plan to support it in Final Cut, and between the two apps, over a third of a million users are waiting to buy your plug-in. Another is that if you've already got a plug-in that runs in software-only mode, you can use OpenGL and Core Image filters to accelerate it and get really dramatic results.
And another thing is that the custom UI we're supporting lets you use all of the AppKit, NSView, and NSResponder code that you would put into an app of your own, in a plug-in. And finally, with the on-screen controls I mentioned, you can render controls onto the canvas surface itself to let people directly manipulate your points and shapes and outlines or motion paths or whatever you have in your plug-in, without having to present them separately in an inspector. All of that is supported in FxPlug. So to get right into it, we want to show some things that have been done with the FxPlug spec before coming back and talking about some of the details of writing your own. And to start that off, we have Ben Syverson from DV Garage. Hello. Hi. So I'm just going to show off dvMatte here and talk a little bit about the process of porting a plug-in — which was an After Effects plug-in — to Motion. Okay, so let me just play this and get this loaded up into RAM. Okay, so this is our base plate. Let's go ahead and turn on dvMatte Blast. Actually, what I'm going to do is delete this from the stack and re-add it so you can see how easy it is. All right, okay, hang on.
Okay, now what we do is just pick a high color — this is all pretty even — and pick a low color off the screen, which might be down here. Switch this over, and there you go. And this is HDV footage, running pretty close to real time here. If we turn this off, which we don't need, we can get up to 30 frames a second on HDV footage, keying in real time, which is just something you would not be able to do in software, and probably something you'd have had to spend about $200,000 on about six months ago. So let me show you some of the features of dvMatte. Switch this over to Primatte RT, which is the built-in real-time keyer in Motion. If we zoom in, you can see some of these artifacts on the side of her face and down by the shoulder. If I switch back to dvMatte, you should be able to see it's doing it pretty clean. So yeah, I came to FxPlug without any OpenGL programming experience. You make these ARB fragment programs, which are sort of pseudo-assembly for the graphics card, and I had no experience with that either — I had done strictly software-only image processing. And I had no Cocoa experience either.
And within a couple of days, I was able to get basically this version of dvMatte going, just by figuring out how you make the mental mapping from going pixel to pixel to processes that take in entire images. So yeah, it was a really fun experiment, and obviously I'm very pleased with the results. This same frame takes anywhere from two to four seconds to render in the software version of dvMatte. Now it's going at 30 frames a second, and as a developer, it's just so gratifying to see your stuff running that quickly. You have to spend a little bit more time developing, but once you get over that hump, you realize this is where you want to be.
So it's very exciting. And now I'm going to pass you off to Pete Warden, the graphics wizard from Apple's Motion team. Yeah, thanks, Ben. [Applause] So I'm just going to give you a little idea of the variety of things you can actually do using FxPlug. Since we're opening up the whole of OpenGL, I'm going to show you another plug-in from another third-party developer, somebody who's been active in other image processing apps: Boris has been porting over their filters to the FxPlug architecture. And you can see here — anybody who's got any OpenGL experience will know that it's very straightforward to do this sort of 3D rendering as long as you've got an OpenGL context. You can see the speed; we're running at 30 frames a second. And doing these sort of 3D effects running in Motion — because we're just giving you an OpenGL context to render into, you can use all of the OpenGL rendering capabilities. So any 3D filters, any 3D effects that you can imagine doing using OpenGL, you can do as FxPlug filters. And I'm really excited to see some of the stuff people are going to be coming out with, and some of the stuff we've been seeing already. Okay, I'll pass back to Dave with the slides. Okay, so you've seen what you can do, and now we'll talk about how you go about it.
There are five basic steps in writing your own FxPlug. At the very top level, the first is choosing an Xcode template: to make this an easy transition, we've shipped two Xcode templates that make it easy to get started, one for a filter and one for a generator. It's not very much code to start out with, so it's really a matter of taking your code and putting it in there. Next, you need to edit the unique IDs in the Info.plist. If you don't do that, one plug-in that you write will probably conflict with another. So you need to follow that step; it's an easy thing to forget to do. And you'll need to customize the source code, of course. If you've got an existing C function, you can just call that as a first pass; ideally, you'd implement the OpenGL path as well.
And the other part of customizing your source code, of course, is to edit the parameter list and the render methods — there are one or two render methods; you may implement a hardware path or a software path, or both. And finally, you build, install, and test in the Motion application. So to step you through that: you go to one of these two templates, FxPlug Filter or FxPlug Generator, and choose that.
Change the name, and that will change the name of the project and the resulting plug-in, all that kind of thing. Your project will open with just two source files: a header and an Objective-C file. You can use Objective-C++, of course, if you want — one of the examples does use Objective-C++; it uses a templatized method for the software rendering. And of course, there's the Info.plist.
Now, editing the unique IDs is a bit tedious, because you have to go to the Terminal and run uuidgen to make a UUID. I recommend creating an Xcode user script. This isn't installed by the FxPlug SDK installer, but it's pretty easy to do, and the documentation explains how: it's a file containing text that you can, I believe, copy out of the documentation. It basically just issues the uuidgen command and pastes the result into your text. So you select one of the existing UUIDs — in this case it's set up to be Command-Shift-U — and it will type in a new UUID for you.
So when you open up the Info.plist, you need to change two UUIDs. One is the ID of the plug-in itself, and the other is the ID of the group the plug-in falls into. Normally you would use the name of your company or product line as the group, or it could be Blurs or Stylize or something like that. You define which group your plug-in will appear in with a dictionary inside the Info.plist. So in addition to editing the UUID of the group, you also need to edit it for the plug-in. Now, there are two UUIDs in the plug-in entry: one identifies the plug-in itself, and the other points back to the group, so you need to use a copy of the group UUID there.
Fortunately, there are comments right in the Info.plist that step you through that. Customizing the source code, of course, is very simple — it takes about six seconds, as you can see, taking the filter and editing it. It could be easier. No, seriously: we'll go into the editing in more detail, but if you have an existing algorithm, the first pass is really straightforward if it's just a C function that renders or filters a frame. The next step, moving to OpenGL, could be impossible or difficult or very easy, depending on your algorithm.
For the next step, building and installing your plug-in, I have another tip for the installation, which you're probably using already if you're writing plug-ins now: just create a symlink in the FxPlug folder that points to the place where your built products go, and do that once using a line like this. This is also in the documentation. The plug-ins go into /Library/Plug-Ins/FxPlug, or in your home directory, ~/Library/Plug-Ins/FxPlug, and that's where Motion will look for them. And finally, the next time you launch Motion after building, assuming that link is in place, you'll be able to test your plug-in, step through it, put breakpoints in, and so on.
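A sketch of that one-time setup, assuming a hypothetical project whose built product lands at `~/Projects/MyPlugin/build/Release/MyPlugin.fxplug` (substitute your own paths):

```shell
# Make sure the per-user FxPlug plug-in folder exists
mkdir -p "$HOME/Library/Plug-Ins/FxPlug"

# Link the built plug-in out of the Xcode build folder;
# Motion follows the symlink at launch, so rebuilds are picked up automatically
ln -sfn "$HOME/Projects/MyPlugin/build/Release/MyPlugin.fxplug" \
        "$HOME/Library/Plug-Ins/FxPlug/MyPlugin.fxplug"
```

After this, every build in Xcode is immediately visible to Motion with no copy step.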
Now, to the details of the step I kind of glossed over: editing your source code. An FxPlug is a type of ProPlug, and ProPlug is a generic plug-in architecture we've developed in Pro Apps that is an extension of CFBundle and NSBundle. We developed it so that we could support dynamic as well as static plug-in registration, and add support for host APIs that can be retrieved from a plug-in. This is similar to a callback suite in some of the legacy plug-in specs. A host API might be available in one host app but not in another, so this lets you request an API that you might be able to use. ProPlug helps with that; it has support for retrieving host APIs.
The way ProPlug is designed, most of what we do is with protocols: in a ProPlug architecture, we define protocols that you implement, and protocols that a host app implements. FxPlug itself defines two basic plug-in protocols, one for a filter and one for a generator, and you implement a class that conforms to one of those protocols. On the host side, there are a few different host app protocols: some for creating parameter lists, one for retrieving or setting parameter values, and so on. We'll talk more about those.
Now, ProPlug is defined in the PluginManager framework, which is very simple — there are only, I believe, two protocols defined in it. One of them is for dynamic registration. If you're just going to put your plug-ins into a bundle, which is the normal course, you don't need to use the dynamic registration protocol at all; you can simply ignore it. You might use dynamic registration if you wanted to check certain conditions before deciding whether to load plug-ins, or if you wanted to load only some of the plug-ins in a bundle that has multiple plug-ins. And of course, the PluginManager defines a protocol for retrieving host APIs, in the ProAPIAccessing header file.
So static registration is done in the Info.plist. This way, the plug-in manager doesn't have to load your plug-in at all to decide whether it's a valid plug-in that can be hosted by a given app — everything's defined in the Info.plist, so it just has to load that, parse it, and figure it out. The Info.plist for a ProPlug plug-in is the same as for a CFBundle, with a few added things.
And I've listed those added things here. There's a Boolean stating whether or not the plug-in bundle is dynamically registered — in this case, it's not. It also defines a list of plug-in groups and a list of plug-ins. For a single plug-in in a single bundle, you just have one group defined and one plug-in defined, but in the generic case it's an array of dictionaries that define groups and an array of dictionaries that define plug-ins. Finally, you have an array of dictionaries that define the protocols you expect in the host application itself.
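To make the shape of that concrete, here is a heavily abbreviated sketch of the added entries. The key names and UUIDs below are illustrative placeholders — copy the real key names from the template's Info.plist rather than from this sketch:

```xml
<key>ProPlugDynamicRegistration</key>
<false/>
<key>ProPlugPlugInGroupList</key>
<array>
    <dict>
        <key>name</key>
        <string>My Company</string>
        <key>uuid</key>
        <string>AAAAAAAA-0000-0000-0000-000000000001</string>
    </dict>
</array>
<key>ProPlugPlugInList</key>
<array>
    <dict>
        <key>name</key>
        <string>My Filter</string>
        <key>uuid</key>
        <string>AAAAAAAA-0000-0000-0000-000000000002</string>
        <!-- a copy of the group's UUID, pointing back at it -->
        <key>group</key>
        <string>AAAAAAAA-0000-0000-0000-000000000001</string>
    </dict>
</array>
```

Note how the plug-in entry carries two UUIDs: its own, and a copy of its group's.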
If you're doing dynamic registration, there are five different methods that your principal class conforms to in order to register plug-ins. In most instances, you won't have to worry about that. So, host APIs. To get a host API, you first need to get the API manager.
When your plug-in's principal class is initialized, it will be initialized with an initWithAPIManager: method, which passes in an object conforming to the ProAPIAccessing protocol. You just retain that, and you can use it later to retrieve the protocols you need from the host.
Once you have the API manager around, you can use its apiForProtocol: method — give it a protocol and get back the object implementing the API you want to use. In this case, we're getting the one for retrieving values of parameters, and we then call a method, getFloatValue, which is one of the methods defined by that protocol. You'll see more of that as the session goes on. Now, the FxPlug SDK — once you understand the ProPlug part — each of the things I've listed here is a header. There's one for FxFilter and one for FxGenerator, which are the primary protocols you conform to to implement a plug-in. The next things, which you might or might not use, are on-screen controls and custom parameter UI.
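In Objective-C, those two steps — retaining the API manager at init time and later asking it for a host protocol — might be sketched like this. The selector spellings, parameter ID constant, and exact getFloatValue signature are illustrative; check the SDK headers for the real ones:

```objc
- (id)initWithAPIManager:(id)apiManager
{
    if ((self = [super init])) {
        // Retain the manager so we can ask it for host APIs later.
        _apiManager = [apiManager retain];
    }
    return self;
}

- (void)readRadius
{
    // Ask the host for its parameter-retrieval API...
    id<FxParameterRetrievalAPI> getter =
        [_apiManager apiForProtocol:@protocol(FxParameterRetrievalAPI)];

    // ...and, if this host provides it, read the current value
    // of a (hypothetical) slider parameter.
    if (getter != nil) {
        double radius = 0.0;
        [getter getFloatValue:&radius fromParm:kRadiusParamID atTime:0.0];
    }
}
```

The nil check matters: a host API may exist in one host app but not another, which is the whole point of requesting it at run time.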
Among the host APIs you might want to retrieve and use, you'll always use the parameter API headers for creating, setting, and getting parameter values. You might use the optional parameter API if you're doing certain kinds of parameters — histograms or gradients, specifically. If you do on-screen controls, you'd get the host API for that. You might want to do retiming, in which case you'd use the temporal image API. There's a layer info API, which is very simple, just for getting position information about an input. And in addition to all those protocols, we also define a class. So: creating parameters.
Your plug-in needs to implement a method called addParameters, which is defined in both the FxFilter and FxGenerator protocols. It will be called when your plug-in is instantiated, and it uses the FxParameterCreationAPI to add each parameter. There's a method for each kind of parameter you can add, and you can see here we're adding a floating point slider and then a point parameter. I'll go through each of the parameter types you can create very briefly. Here's the floating point slider.
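An addParameters implementation along those lines might look like this sketch — the selector spellings, parameter ID constants, and flag name are illustrative stand-ins for the real SDK declarations, which the template shows exactly:

```objc
- (BOOL)addParameters
{
    id<FxParameterCreationAPI> paramAPI =
        [_apiManager apiForProtocol:@protocol(FxParameterCreationAPI)];
    if (paramAPI == nil)
        return NO;

    // A floating-point slider: name, ID, default, min/max for both
    // the value and the visible slider range, plus parameter flags.
    [paramAPI addFloatSliderWithName:@"Amount"
                              parmId:kAmountParamID
                        defaultValue:0.5
                        parameterMin:0.0
                        parameterMax:1.0
                           sliderMin:0.0
                           sliderMax:1.0
                               delta:0.01
                           parmFlags:kFxParameterFlag_DEFAULT];

    // A 2D point parameter, which gets an on-screen control for free.
    [paramAPI addPointParameterWithName:@"Center"
                                 parmId:kCenterParamID
                               defaultX:0.5
                               defaultY:0.5
                              parmFlags:kFxParameterFlag_DEFAULT];
    return YES;
}
```

The flags argument is where the visibility, Dashboard, and custom-UI characteristics mentioned below get set.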
And this is the way it looks in the Inspector. Of course, you can also animate most of these in the Timeline and see them in the Dashboard. This long method here is what you use to add a floating point slider to your parameter list. You can add an integer slider, which is the same thing, obviously, but for integers. You can add a checkbox, which we call a toggle button.
And an angle slider, which lets you define, like the other ones, a minimum and maximum angle, the default value, and so on. I haven't mentioned this yet: there's a parameter flags value you can pass in to set characteristics of each parameter — whether it's visible, whether it's enabled, whether it's saved when a project is saved, whether it's suppressed or included in the Dashboard, whether it has custom UI, and a few other flags like that. You can add an RGB color, or an RGB color with alpha.
And a 2D point — here it's shown open. The 2D point is an interesting one because it's automatically given an on-screen control. And of course a pop-up menu: in this case, the default value is an index, and the menu entries are an array of strings for the menu items. You can also create an image well that a user can drag images or movies into.
Now, the most interesting one is the custom parameter. Two things you should keep distinct are custom parameters and custom parameter UI. You can create a custom parameter with no UI at all — it can be completely hidden, and it might have its values set by external conditions, or by an on-screen control.
But it doesn't need to have a custom UI. I think in most cases it would, but you can think of a lot of uses for a custom parameter with no UI at all. In this case, you can see that because there's no explicit support for a text-type parameter, you can create a custom parameter that holds text, and then you completely control how it works, because you're creating the NSText-based edit view and its controller yourself. You can respond to the methods in that view and its controller however you want: you can limit the length, you can add features for capitalizing or not capitalizing, or whatever.
Next we have the group parameter, which is not really a parameter, but it does have a name. It's the way you create a hierarchy of parameters in your addParameters method. A group is created by a startParameterSubGroup and endParameterSubGroup pair: you surround all the parameters in your group with those two calls, and you automatically get a nested set of parameters. Here you can see one called Groove Settings that has a single slider in it.
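Bracketing parameters into a group can be sketched in the same illustrative style as before (selector spellings and constants are stand-ins for the real SDK declarations):

```objc
// Everything added between the start and end calls nests
// under the "Groove Settings" group in the Inspector.
[paramAPI startParameterSubGroup:@"Groove Settings"
                          parmId:kGroupParamID
                       parmFlags:kFxParameterFlag_DEFAULT];

[paramAPI addFloatSliderWithName:@"Groove"
                          parmId:kGrooveParamID
                    defaultValue:0.5
                    parameterMin:0.0
                    parameterMax:1.0
                       sliderMin:0.0
                       sliderMax:1.0
                           delta:0.01
                       parmFlags:kFxParameterFlag_DEFAULT];

[paramAPI endParameterSubGroup];
```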
And I'm going to gloss over the gradients and histograms, because we'll talk about them more in the FxPlug in-depth session tomorrow — but you can create a gradient or a histogram. Now, to get the value of a parameter, you use the FxParameterRetrievalAPI. You go to the API manager and request that protocol, then call a method to get the parameter value. You might do this in your render method, or in response to an action in one of your custom views, or to an on-screen control.
And likewise, setting values is done with the FxParameterSettingAPI, and there's one method for each kind of parameter. Thank you. Now we want to talk about how you do rendering, and I'm going to hand off again to Pete Warden from the Motion team. Thanks, Dave. Okay, well, I'm actually one of the engineers who's worked on a whole bunch of the filters we use internally in Motion. What I'm going to be talking about now is OpenGL rendering specifically. The CPU-based rendering will be very familiar to anybody who's ever had to deal with a bitmap, but the OpenGL rendering is a bit of a new world — it's really not something that anybody else is doing, so there's an awful lot to talk about as far as the details, and a lot of those I'll be going into tomorrow in the FxPlug in-depth talk. But I just want to give you a whistle-stop tour of the fundamentals of how the rendering works in OpenGL with FxPlug. Before I do that, I want to do a little bit of evangelization for doing these OpenGL FxPlugs. It really is a hell of a lot of fun. We've done 60 or 70 in Motion, and just seeing the speed increases — as Ben's example showed, going from one frame every two or three seconds up to 30 frames a second — really is something astonishing to see. But what I think is really exciting is the kind of new possibilities that having OpenGL rendering in there opens up. Could everybody who's actually an OpenGL programmer, who knows some OpenGL, put their hands up? Just so I get an idea. Sweet. Now, I'm sure a lot of you have your favorite little demo effects — maybe a couple of screens of code that really do something that looks cool — but you can't write a game just to put that effect in it. You can put demos out on the web, and we do see an awful lot of cool stuff out there.
But there are going to be 350,000 paying customers who may well get very excited by seeing those effects. They aren't people who necessarily look at all the demo stuff, so even stuff you might not think much of yourself, people get very excited to see. So there really are some very strong reasons to be writing these FxPlugs, and we're already seeing an awful lot of really cool stuff coming out of the third-party developers. So I just really want to encourage you to think hard about doing FxPlugs. Okay. How do you write a filter that runs on the GPU? The real cornerstone of everything we're doing in Motion is the ARB fragment program extension.
As you can see, we've got an example here. It's a pseudo-assembly sort of language that runs per pixel on the GPU. It's supported on all the graphics cards that ship with the G5s, and by pretty much most recent Apple hardware. Increasingly, most people out there with Macs will be able to run this, and that's going to increase as time goes on.
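For a sense of what that pseudo-assembly looks like, here is a tiny, hypothetical ARB fragment program (not from the SDK) that samples the input texture and scales its color by a value the plug-in passes in:

```
!!ARBfp1.0
# gain is a value the plug-in supplies via program.local[0]
PARAM gain = program.local[0];
TEMP  src;
# Sample the input texture at this fragment's coordinate
TEX   src, fragment.texcoord[0], texture[0], RECT;
# Scale the color and write it to the output
MUL   result.color, src, gain;
END
```

Every output pixel runs this program once, which is why a handful of instructions like these can replace an entire per-pixel C loop.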
So you write one of these per-pixel pixel shaders, and then you use the GL calls to bind it. Then all you do — we pass you an OpenGL context — is draw a quad with that pixel shader bound to it, and with the input texture that we give you also bound, so that the pixel shader operates on that texture, does its processing on the pixels we pass in to you, and you end up drawing the result into the output. It really is very, very simple. I think the example template is maybe a screen of code for the actual render function, and anybody who's familiar with OpenGL should find it very easy. There's no viewport setting up or anything else like that that you need to do.
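In outline, the body of a hardware render method is just this. The FxTexture accessors and coordinate variables here are illustrative stand-ins — check the FxTexture header for the real method names:

```objc
// Bind the compiled ARB fragment program; it runs once per output pixel.
glEnable(GL_FRAGMENT_PROGRAM_ARB);
glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, _fragmentProgramId);

// Enable and bind the input FxTexture, just as you would a GL texture.
[inputTexture enable];
[inputTexture bind];

// Always ask the FxTexture for its coordinates rather than assuming
// (0,0)-(width,height); the host renders at varying resolutions.
// Draw one quad; the fragment program fills in every covered pixel.
glBegin(GL_QUADS);
glTexCoord2f(texLeft,  texBottom); glVertex2f(left,  bottom);
glTexCoord2f(texRight, texBottom); glVertex2f(right, bottom);
glTexCoord2f(texRight, texTop);    glVertex2f(right, top);
glTexCoord2f(texLeft,  texTop);    glVertex2f(left,  top);
glEnd();

[inputTexture disable];
glDisable(GL_FRAGMENT_PROGRAM_ARB);
```

No viewport or projection setup appears here because the host has already configured the context before calling you.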
And I'm just going to briefly cover the different classes we give you for passing the image information back and forth. FxImage is the fundamental building block. All it's really got in it is getWidth, getHeight, and some information on the depth and the fundamental image attributes. Then we have FxTexture, which builds on top of that, and it really isn't very much above and beyond an OpenGL texture.
It has the texture coordinates that you need to use, and not really that much else — it isn't a big utility class. The RAM-based version of that is FxBitmap, and anybody who's ever used a CPU-based bitmap class will feel very comfortable with it: you can get the data and manipulate it, and that's about it.
Now, we support various different pixel formats, all with alpha, all ARGB. That was a deliberate decision on our part — we really didn't want to force people to support many different pixel formats; we wanted to keep as few code paths as possible. So we support the 8-bit, 16-bit, and 32-bit float formats, and all of our bitmaps come through with premultiplied alpha. Our whole pipeline works using premultiplied alpha.
FxTexture, again, like I said, has the texture coordinates contained as part of the structure, and you pull those back when you want to actually use the texture to draw. Now, one thing to be careful of: often the origin is in one corner, and it's tempting to just use the width and height as the texture coordinates and skip explicitly asking for them.
But that gets dangerous, because we do various things, especially when we're rendering at different resolutions — for example, for a preview, or if the user has decided they want faster rendering by cutting down the resolution. So it's very important: it will appear to work at first if you don't actually get the coordinates, but you really need to ask for them explicitly if you're going to have something robust. Apart from that, all you do is call the bind and enable methods on these FxTextures, just as you would for a GL texture — and you can even get the ID of the texture if you want to do things more explicitly than that. And as I said, all of our pipeline is premultiplied alpha.
So I'm going to pass it back to Dave. Before we go on to the more-information section, I just wanted to point out that, of course, if you've been to the Core Image sessions, you've seen that you can write an Image Unit that can be used as a plug-in and is available in Motion. And they're great. If you want to deploy a plug-in that works across applications from iMovie to Motion, that's a great way to go — unless you need some of the specific features that are in FxPlug but not in Core Image, such as the on-screen controls, the custom UI, Dashboard access, or dynamic registration.
To get the FxPlug SDK, there's information on the WWDC website, and there's a download that will point you to the Apple Developer Connection: go to connect.apple.com, log in, go to Downloads, look under Apple Applications, and there's the FxPlug SDK 1.0. Download that; it contains an installer and a documentation folder. The documentation folder has something called the FxPlug Overview, and also HTML documentation generated by HeaderDoc, which is not always the easiest thing to read, but it's pretty complete. So start with the overview; the HeaderDoc material is a great reference later.
The installer will give you the Xcode templates and two examples. One shows how to write a filter that uses — I'm sorry — custom parameter UI, and the other is a generator that shows you how to do on-screen controls. Each of those uses a single custom parameter type, which is just a list of points; it lets you draw a path. One of them lets you draw the path in a simple view in the Inspector, and the other lets you draw it on the canvas. Each of them is a pretty awful example of how to write a paint program, but maybe you could start with it and give us something great.
In addition to the examples, of course, there's the framework and the FxPlug protocols and so on. Now, there are two related sessions. If you can come out tomorrow, we have the in-depth session at 10:30 a.m., where we'll talk more about the issues we didn't go into in much depth today.
And there's also a brown bag lunch after that. We encourage you to grab a lunch downstairs and head up to meet us on the third floor, in the Pro Audio and Pro Video connection room. We'll just hang out and talk about FxPlug and what you can do.
Just as an example of some of the interesting OpenGL stuff we can do, I'm going to show off a couple of generator effects — some concrete examples of what I was talking about, things our users are really interested in, and I think some of you can come up with stuff that will hopefully be even cooler than this. For example, here's a little water caustics generator that's done entirely using OpenGL. It's never been something that's been solved very well in the software space, but using the OpenGL rendering engine, doing polygonal rendering, it actually turned out not to be too tricky. Another example: in Motion we call this Clouds, but what it actually is is a Perlin noise generator, written using fragment programs and running, as you can see, at 30 frames a second at 720 by 480 — it really does blow away the speed of the competing stuff out there. And if you've got any similar cool stuff, I really think our users will be very excited and willing to pay money to get hold of it. Plus, I just really want to see it.
So finally, if you have any other questions — if we could go back to the slides — you can email Patrick Collins, who's the technology manager for the Apple Applications track, at patrick at apple.com. Or better yet, write to the Pro App SDK address. There's only one; some people got the wrong address before and their email was sent somewhere else — but it's [email protected]. A few of us who are involved in the FxPlug project get those emails and will respond. We don't have a mailing list set up for this just yet; we're starting with this.