Apple Applications • 41:57
In this session we will introduce FxPlug, the new SDK for quickly creating hardware accelerated filters and generators for the Apple Production Suite. We will demonstrate building and running a filter in Xcode using FxPlug templates, and show how to use Apple's Core Image framework to write hardware accelerated plug-ins on Tiger. We will also show off the results from several developers that have already created new FxPlug plug-ins, and we'll demonstrate them in Motion.
Speakers: Dave Howell, Pete Warden
Unlisted on Apple Developer site
Transcript
This transcript was generated using Whisper and has known transcription errors. We are working on an improved version.
Hello, welcome to the FxPlug Overview session. My name is Dave Howell, and I'm an FxPlug engineer. We're going to talk today about the FxPlug spec. You'll learn about what the FxPlug spec is, what the SDK is, and why we developed it, and we'll talk about how to write your own FxPlug plug-ins. First, FxPlug is a new plug-in architecture that we've introduced. It was created by the Apple Pro Apps team, and the first host app, released just last month, is Motion 2.0.
Now, we developed a new plugin architecture for quite a few reasons. There are other ways to write plugins that will run in Motion, but we wanted to add features that we think plugin developers need to really make their image processing algorithms and effects shine. Some of those are unique things that we have in our own apps, like the dashboard in Motion, and taking advantage of things like the fact that the Motion app renders its canvas in OpenGL.
So, you can render custom on-screen controls right onto the canvas in OpenGL. And hardware acceleration: there isn't another image processing effects plug-in architecture that lets you do hardware-accelerated plug-ins. We wanted you to be able to take advantage of Core Image, OpenGL itself, Core Graphics, and Quartz Extreme.
Another thing is that most plugin architectures are cross-platform, and a disadvantage of that is that the people designing them need to use the least common denominator between the different platforms that the plugins will run on. We didn't have to do that; we were able to leverage strengths of the Apple platform. All the things that you've heard at the other sessions here at the conference: things like first-class adoption of OpenGL, Core Image filters, and creating interfaces using AppKit and Interface Builder and Objective-C itself.
Now, why did we go and use Objective-C? We wanted, both in our design of the architecture and in support of your plug-ins, all of the elegance of C++ without the problems that you get with subsequent versions of the GCC compiler, like the application binary interface changing from one version to another, or the fragile base class problem that C++ presents. Objective-C helps with that. And of course we wanted you to be able to access AppKit and Core Image in particular, and to use Interface Builder, which integrates nicely with the classes in your controllers and views when you're doing custom parameter UI.
So, reasons that you might want to write FxPlug plug-ins, and port any plug-ins you may have to the FxPlug SDK or develop new plug-ins in our SDK. One is that even though at the moment we're only supporting FxPlug in Motion, we plan to support it in Final Cut.
And between the two apps, over a third of a million users are waiting to buy your plug-in. Another thing is that if you've already got a plug-in that's running in software only mode, you can use OpenGL and use Core Image Filters to accelerate your plug-in and get really dramatic results.
Another thing is that the custom UI that we're supporting lets you use all of AppKit, NSView, and NSResponder. And finally, the on-screen controls that I mentioned: you can render controls onto the canvas surface itself to let people directly manipulate your points and shapes and outlines or motion paths or whatever you have in your plug-in, without having to present them separately in an inspector.
So that's all supported in FxPlug. So to get right into it, we want to show some things that have been done with the FxPlug spec before coming back and talking about some of the details of writing your own. And to start that out, we have Ben Syverson from DV Garage. Ben Syverson: Hello. Hi. So I'm just going to show off dvMatte here and talk a little bit about the process of porting a plug-in, which was an After Effects plug-in, to Motion.
Okay. So let me just play this and get this loaded up into RAM. Okay, so this is our base plate. Let's go ahead and turn on dvMatte Blast. Actually, what I'm going to do is delete this from the stack and add it again so you can see how easy it is. Alright, okay, hang on.
Okay, now what we do is just pick a high color. This is all pretty even. Pick a low color of the screen, which might be down here. Switch this over, and there you go. And this is, you know, HDV footage, this is running in, well, pretty close to real time here.
If we turn this off, which we don't need, we can get up to, you know, 30 frames a second keying HDV footage in real time, which is just something you would not be able to do in software, and probably something you'd have had to spend about $200,000 on about six months ago.
So yeah, let me show you some of the features of dvMatte. Switch this over to Primatte RT, which is a built-in real-time keyer in Motion. And if we zoom in, you can kind of see some of these artifacts on the side of her face, and sort of down by the shoulder. If I switch back to dvMatte, you should be able to see it's doing it pretty clean.
So, yeah, I came to FxPlug without any OpenGL programming experience. You make these ARB fragment programs, which are sort of pseudo-assembly for the graphics card, and I had no experience with that either. I had done strictly software-only image processing.
I also had no Cocoa experience. But within a couple of days, I was able to get basically this version of dvMatte going, just by figuring out how you make that mental mapping from going pixel to pixel to doing processes that take in entire images.
So yeah, it was a really fun experiment, and obviously I'm very pleased with the results. This same frame rendering in the software version of dvMatte takes anywhere from two to four seconds to render. So now it's going at 30 frames a second, which, as a developer, is just so gratifying, to see your stuff running that quickly.
You have to spend a little bit more time developing, but once you get over that hump, you realize this is where you want to be. It's very exciting, and now I'm going to pass you off to Pete Warden, the graphics wizard from Apple's Motion team. Pete Warden: Thanks, Ben.
So I'm just going to give you a little idea of some of the variety of things that you can actually do using FxPlug, since we are just opening up the whole of OpenGL. I'm going to show you another plug-in from a third-party developer, somebody who's been active in other image processing apps: Boris has been porting their filters over to the FxPlug architecture.
So you can see here: anybody who's got any OpenGL experience will know that this sort of 3D rendering is very straightforward to do, as long as you've got an OpenGL context. You can see the speed; we're running at 30 frames a second. And because we're just giving you an OpenGL context to render into, you can use all of the OpenGL rendering capabilities to do these sorts of 3D effects running in Motion.
And so you can get these sort of 3D filters and any 3D effects that you can imagine really doing using OpenGL, you can do as FxPlug filters. And I'm really excited to be seeing some of the stuff that people are going to be coming out with, and some of the stuff that we've been seeing already. Okay, I'll pass back to Dave for the slides. Okay, so you've seen what you can do, and now we'll talk about how you go about it.
There are five basic steps in writing your own FxPlug. At the very top level, the first is choosing an Xcode template. To make this an easy transition, we've shipped two Xcode templates that make it easy to get started: one is a filter, and one's a generator. And it's not very much code to start out with, so it's really a matter of taking your code and putting it in there. The next thing you need to do is edit the unique IDs in the Info.plist.
If you don't do that, one plugin that you write will conflict with another one, so you need to follow that step. It's an easy thing to forget to do. And you'll need to customize the source code, of course. If you've got an existing C function, you can just call that as a first pass. Ideally, of course, you'd implement an OpenGL path as well.
The other part of customizing your source code is to edit the parameter list. Then there's the render method that you change; there are one or two render methods, since you may implement a hardware path or a software path, or both. And finally, you build and install and test in the Motion application. So, to step you through that: you go to one of these two templates, FxPlug Filter or FxPlug Generator, and choose that.
So, we'll start with the project. We'll change the name, and that will change the name of the project and the resulting plug-in, and all that kind of thing.
[Transcript missing]
Now, editing the unique IDs is a bit tedious, because you have to go to the Terminal and type uuidgen to make a UUID.
So, I recommend creating an Xcode script. This isn't installed by the FxPlug SDK installer, but it's pretty easy to do, and the documentation explains how to do it. It's a file that contains text that you can, I believe, copy out of the documentation, and it basically just issues the uuidgen command and pastes the result into your text. So you select one of the existing UUIDs (in this case, it's set up to be Command-Shift-U), and it will type in a new UUID for you.
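As a rough sketch, the script body can be as small as a single line; this assumes the stock uuidgen tool that ships with the system, and the actual script in the documentation may differ:

    #!/bin/sh
    # Xcode user script: replace the selected text with a freshly
    # generated UUID (set the script's output to replace the selection).
    uuidgen | tr -d '\n'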
So when you open up the Info.plist, you need to change two UUIDs. One is the ID of the plugin itself, and the other is the ID of the group that the plugin falls into. Normally you would use the name of your company or product line as the group, or it could be something like Blurs or Stylize.
And you define what group your plugin will appear in with a dictionary inside the Info.plist. So in addition to editing the UUID of the group, you also need to edit the one for the plugin. Now, there are two UUIDs in the plugin's entry: one of them identifies the plugin itself, and the other points back to the group, so you need to use a copy of the group UUID there.
Fortunately, there are comments right in the Info.plist that step you through that. Customizing the source code, of course, is very simple. It takes about six seconds, as you can see: taking the filter and editing it. What could be easier? Seriously, we'll go into the editing in more detail, but if you have an existing algorithm, the first pass is really straightforward, if it's just a C function that renders a frame or that filters a frame. The next step of moving to OpenGL, of course, depending on your algorithm, could be impossible or difficult or very easy.
The next step is building and installing your plug-in. I have another tip for doing the installation, which you're probably using in your own plug-ins if you're writing plug-ins now, which is to create a symlink that points from the FxPlug folder into the place where your built products go, and just do that once using a line like this. This is also in the documentation.
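For example, something along these lines; the project name and build path here are hypothetical, so adjust them for your own setup:

    # Link the built plug-in into the folder Motion scans at launch.
    ln -s ~/Projects/MyFilter/build/MyFilter.fxplug \
          ~/Library/Plug-Ins/FxPlug/MyFilter.fxplug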
But the plug-ins go into /Library/Plug-Ins/FxPlug, or Library/Plug-Ins/FxPlug in your home directory. That's where Motion will look for them. And finally, the next time you launch Motion after building, assuming that link is in place, you'll be able to test your plug-in and step through it and put breakpoints in, and so on.
Now, to the details of that one step that I kind of glossed over about editing your source code. An FxPlug is a type of ProPlug. And ProPlug is a generic plug-in architecture that we've developed in ProApps that is an extension of CFBundle and NSBundle. And we developed this so that we could support dynamic as well as static plug-in registration and add support for host APIs that can be retrieved from a plug-in.
This is similar to a callback suite in some of the legacy plug-in specs. And the host APIs might be available in one host app but not in another one, so this lets you request an API that you might be able to use. ProPlug helps you with that. It has some support for retrieving host APIs.
The way ProPlug is designed, most of what we do is with protocols. We define protocols that you implement, and we define protocols that a host app implements in a ProPlug architecture. And so FxPlug itself defines two basic plug-in protocols, one for a filter and one for a generator; you would implement a class that conforms to those protocols. And the host app provides a few different host-app protocols: some for creating parameter lists, one for retrieving or setting parameter values, and so on. We'll talk more about those.
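As a minimal sketch of the plug-in side in Objective-C; the umbrella header name here is an assumption, and the real method signatures should be taken from the SDK headers rather than from this sketch:

    #import <Cocoa/Cocoa.h>
    #import <FxPlug/FxPlugSDK.h>   // hypothetical header name; see the SDK

    // A filter's principal class conforms to the FxFilter protocol.
    @interface MyFilter : NSObject <FxFilter>
    {
        id _apiManager;   // host API manager, kept for later use
    }
    @end

    @implementation MyFilter

    // The host creates the plug-in and hands it the API manager.
    - (id)initWithAPIManager:(id)apiManager
    {
        if ((self = [super init]))
            _apiManager = apiManager;
        return self;
    }

    @end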
ProPlug is defined in the Plugin Manager framework. The Plugin Manager framework is very simple. There are only two protocols defined in it. One of them is for doing dynamic registration. If you're just going to put your plug-ins into a bundle, which is the normal course, you don't need to use the dynamic registration protocol at all. You can simply ignore it.
You might use dynamic registration if you wanted to look for certain conditions before deciding whether to load plug-ins, or if you wanted to only load some of the plug-ins in your bundle, that kind of thing, when you have multiple plug-ins in a bundle. And of course the plugin manager defines a protocol for retrieving host APIs, in the PROAPIAccessing header file.
[Transcript missing]
[Transcript missing]
So, host APIs. To get a host API you need to first get the API manager, and when your plug-in's principal class is initialized, it will be initialized with an initWithAPIManager method.
And that passes in an object that conforms to the PROAPIAccessing protocol. So once you have the API manager around, you can then use the apiForProtocol: method, give it the name of the protocol, and get back the API object that you want to use. In this case, we're getting the one for retrieving values of parameters. And we then call a method, getFloatValue, which is one of the methods defined by this protocol. You'll see more of that as the session goes on.
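Put together as code, that sequence might look like the sketch below. The apiForProtocol: and getFloatValue selectors are the ones named in the session, but the full argument lists, the kRadiusID parameter ID, and the time variable are assumptions; check the parameter API header for the real signatures:

    // Ask the API manager for the host's parameter-retrieval API.
    id<FxParameterRetrievalAPI> getAPI =
        [_apiManager apiForProtocol:@protocol(FxParameterRetrievalAPI)];

    // Read a float parameter's value; kRadiusID and 'time' are
    // hypothetical stand-ins for your parameter ID and render time.
    double radius = 0.0;
    [getAPI getFloatValue:&radius fromParm:kRadiusID atTime:time];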
Now, the FxPlug SDK: once you understand the ProPlug bit, the FxPlug SDK itself is pretty simple. Each of these things I've listed here is a header, and there's one for FxFilter and one for FxGenerator, which are the primary protocols that you would conform to to implement a plug-in. The next thing that you might or might not use is on-screen controls and custom parameter UI.
And then there are the host APIs that you might want to retrieve and use. You'll always use the parameter API headers for creating, setting, and getting parameter values. You might use the optional parameter API if you're doing certain kinds of parameters, specifically histograms or gradients. If you do on-screen controls, you would get the host API for that. And you might want to do re-timing, in which case you would use the temporal image API. There's a layer info API, which is very simple, just for getting position information about an input. And in addition to all those protocols, we also define a class. So: creating parameters.
Your plugin needs to implement a method called addParameters; that's defined in both the FxFilter and FxGenerator protocols, and it will be called when your plugin is instantiated. It will use the FxParameterCreationAPI to add each parameter, and there's a method for each different kind of parameter that you would add. You can see here we're adding a floating point slider and then a point parameter; a rough sketch of a whole addParameters method follows below. After that, I'll go through each of the types of parameters that you can create very briefly.
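Here's the sketch; the creation-call selectors, the parameter IDs, and the flags constant are approximations of the SDK's names, so consult the FxParameterCreationAPI header for the real signatures:

    - (BOOL)addParameters
    {
        id<FxParameterCreationAPI> parmAPI =
            [_apiManager apiForProtocol:@protocol(FxParameterCreationAPI)];
        if (parmAPI == nil)
            return NO;

        // A floating-point slider (argument names are approximate).
        [parmAPI addFloatSliderWithName:@"Radius"
                                 parmId:kRadiusID
                           defaultValue:10.0
                           parameterMin:0.0
                           parameterMax:100.0
                              sliderMin:0.0
                              sliderMax:100.0
                                  delta:1.0
                              parmFlags:kParmFlagsDefault];

        // A 2D point; point parameters get on-screen controls for free.
        [parmAPI addPointParameterWithName:@"Center"
                                    parmId:kCenterID
                                  defaultX:0.5
                                  defaultY:0.5
                                 parmFlags:kParmFlagsDefault];
        return YES;
    }

With that shape in mind, the first of the built-in types is the floating point slider.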
This is the way it looks in the inspector. Of course, you can also animate most of these in the timeline as well, and see them in the dashboard. So this long method here is what you use to add a floating point slider to your parameter list. You can add an integer slider, which is the same thing obviously, but integers. You can add a checkbox, which we call a toggle button.
And an angle slider, which lets you define, like the other ones, a minimum and maximum angle and the default value and so on. I haven't mentioned this yet: there's a parameter-flags value that you can pass in to set characteristics of each parameter, like whether it's visible, whether it's enabled, whether it's saved or not when a project is saved, whether it's suppressed or included in the dashboard, whether it has custom UI, and a few other flags like that. You can add an RGB color or an RGB color with alpha.
And a 2D point, shown open here. The 2D point is an interesting one because it is automatically given on-screen controls. And of course a pop-up menu; in this case the default value is an index, and the menu entries are an array of strings for the menu items. And you can create an image well that a user can drag images or movies into.
Now the most interesting one is the custom parameter. Two things you should keep distinct are custom parameters and custom parameter UI. You can create a custom parameter with no UI at all. It can be completely hidden, and it might have its values set by external conditions, or by an on-screen control.
But it doesn't need to have a custom UI. I think in most cases it would, but you can think of a lot of uses for a custom parameter with no UI at all. In this case, you can see because there's no explicit support for a text-type parameter, you can create a custom parameter that does text.
And then you can completely control how it works, because you're creating the NSTextView yourself. So you can respond to the methods in that view and its controller however you want. You can limit the length, you can add features for capitalizing or not capitalizing, or whatever.
Next we have the group parameter, which is not really a parameter, but it does have a name. It's the way you create a hierarchy of parameters in your addParameters method. And the group
[Transcript missing]
And I'm going to gloss over the gradients and histograms because we'll talk about them more in the FxPlug in-depth session tomorrow. But you can create a gradient or a histogram.
Now, to get the value of a parameter, you use the FxParameterRetrievalAPI. So you go to the API manager and request that API, and then call a method to get a parameter value. You might do this in your render method, or you might do it in response to an action in one of your custom views, or to an on-screen control.
And likewise, setting values is done through the FxParameterSettingAPI, and there's one method for each kind of parameter. Now we want to talk about how you do rendering, and I'm going to hand off again to Pete Warden from the Motion team. Thanks, Dave. Okay, well, I'm actually one of the engineers who's worked on a whole bunch of the filters that are in Motion. So what I'm going to be talking about now is OpenGL rendering specifically. The CPU-based rendering will be very familiar to anybody who's ever had to deal with a bitmap.
But the OpenGL rendering is a bit of a new world. It's really not something that anybody else is doing, so there's an awful lot to talk about as far as the details, and a lot of those I'll be going into tomorrow in the FxPlug in-depth talk. For now I just want to give you a whistle-stop tour of the fundamentals of how the rendering works in OpenGL with FxPlug. But just before I do that, I want to do a little bit of evangelization for doing these OpenGL FxPlugs. It really is a hell of a lot of fun.
We've done 60 or 70 in Motion, and just seeing the speed increases, as Ben's example shows, going from like one frame every two or three seconds up to 30 frames a second really is something astonishing to see. But also, what I think is really exciting is the kind of new possibilities that actually having OpenGL rendering being in there opens up.
Could everybody who's actually an OpenGL programmer who knows some OpenGL put their hands up, just so I get an idea? Now, if you imagine, I'm sure a lot of you guys have got your favorite little effects or demo effects or just little, you know, sort of maybe a couple of screens of code that really does something that looks cool.
But you can't write a game just to put that effect in there. You can put demos out on the web, and we do see an awful lot of cool stuff out there. But there are going to be 350,000 paying customers.
Who may well get very excited by seeing those effects. They aren't people who necessarily look at all the demo stuff. So even stuff that you might not think much of yourself, people get very excited to see. So there really are some very strong reasons to be writing these FxPlug effects.
And we're already seeing an awful lot of really cool stuff coming out of the third-party developers. So I just really want to encourage you to think hard about doing FxPlug effects. Okay. So how do you write a filter that runs on the GPU? The real cornerstone of everything that we're doing in Motion is the ARB fragment program extension.
As you can see, we've just got an example here. It's a pseudo-assembly language that runs per pixel on the GPU. It's supported on all the graphics cards that ship with the G5s, and by most Apple hardware. Most people out there with Macs will be able to run this, and that's only going to increase as time goes on.
So you write one of these per-pixel shaders, and then you just use the GL calls to bind it. We pass you an OpenGL context, and you draw a quad with that pixel shader bound to it, and with the input texture that we give you also bound, so that the pixel shader operates on that texture, does its processing on those pixels we pass in to you, and draws the results into the output.
It really is very, very simple. You can have, I think the example template is maybe a screen of code for the actual render function. And anybody who's familiar with OpenGL should really find it very easy. There's no viewport setting up or anything else like that that you need to do.
I'm just going to briefly cover the different classes that we give you for passing the image information back and forth. FxImage is the fundamental building block of this. All it's really got in it is get-width, get-height, some information on the depth, and the fundamental image attributes. And then we have FxTexture, which just builds on top of that, and it really isn't very much above and beyond an OpenGL texture.
It has the texture coordinates in there that you need to use, and not really that much else. It isn't a big utility class. And the RAM-based version of that is FxBitmap. Anybody who's ever used a CPU-based bitmap class will feel very comfortable with this: you can get the data, and you can manipulate it, and yeah, that's about it.
Now, we support various different pixel formats, all with alpha, all ARGB. That was a deliberate decision on our part. We really didn't want to force people to support many different pixel formats; we wanted to keep as few code paths as possible. We have the 8, 16, and 32-bit float formats that we support. All of our bitmaps actually come through as premultiplied alpha; our whole pipeline works using premultiplied alpha.
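To make the premultiplied point concrete, here's a sketch of a trivial CPU pass over 8-bit ARGB data, as it might appear inside a software render method. The dataPtr, width, and height accessors are assumptions about FxBitmap's API, and any row padding is ignored for brevity:

    #include <stdint.h>

    // Halve the color channels (a simple darken). Because the data is
    // premultiplied, color channels never exceed alpha, so halving
    // R, G, and B while leaving A alone preserves that invariant.
    uint8_t *src = (uint8_t *)[inputBitmap dataPtr];   // accessor name assumed
    uint8_t *dst = (uint8_t *)[outputBitmap dataPtr];
    size_t count = [inputBitmap width] * [inputBitmap height] * 4;
    for (size_t i = 0; i < count; i += 4) {
        dst[i]     = src[i];          // A
        dst[i + 1] = src[i + 1] / 2;  // R (premultiplied)
        dst[i + 2] = src[i + 2] / 2;  // G
        dst[i + 3] = src[i + 3] / 2;  // B
    }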
FxTexture again, like I said, has the texture coordinates contained as part of the structure, and you pull those back when you want to actually use that texture to draw. Now, one thing to be careful of: often the origin is in one corner, and you can just use the width and height as the texture coordinates, skipping explicitly asking for them.
But that gets kind of dangerous, because we do various things, especially when we're rendering at different resolutions, for example for previews, or if the user's decided they want faster renderings. It will appear to work at first if you don't actually get the coordinates, but you really need to ask for them explicitly if you're going to have something robust.
All you do is just call the bind and enable functions on these FxTextures, just like you would for a GL texture. And actually, you can even get the ID of the texture if you want to do stuff more explicitly than that. And these textures, as I said: all of our pipeline is premultiplied alpha.
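Putting those pieces together, a hardware render might look like the sketch below. The render method signature and the FxTexture coordinate accessors are approximations (the real names live in the FxFilter and FxTexture headers), and the fragment program is just a trivial tint:

    #import <OpenGL/gl.h>
    #import <OpenGL/glext.h>
    #include <string.h>

    // A tiny ARB fragment program: fetch the input texel (rectangle
    // texture) and modulate it by a constant bluish tint.
    static const char *kFragProg =
        "!!ARBfp1.0\n"
        "TEMP texel;\n"
        "TEX texel, fragment.texcoord[0], texture[0], RECT;\n"
        "MUL result.color, texel, {0.5, 0.5, 1.0, 1.0};\n"
        "END\n";

    // In the filter's @implementation. Signature is an approximation
    // of FxFilter's hardware render method.
    - (BOOL)renderOutput:(FxTexture *)output
               withInput:(FxTexture *)input
                withInfo:(FxRenderInfo)renderInfo
    {
        // Compile and enable the fragment program (error checks omitted).
        GLuint prog;
        glGenProgramsARB(1, &prog);
        glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
        glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB,
                           GL_PROGRAM_FORMAT_ASCII_ARB,
                           (GLsizei)strlen(kFragProg), kFragProg);
        glEnable(GL_FRAGMENT_PROGRAM_ARB);

        // Bind the input, and ask both textures for their real
        // coordinates; never assume the origin or the extent.
        [input bind];
        [input enable];
        GLfloat sl, sr, sb, st, dl, dr, db, dt;
        [input getTextureCoords:&sl right:&sr bottom:&sb top:&st];   // assumed accessor
        [output getTextureCoords:&dl right:&dr bottom:&db top:&dt];  // assumed accessor

        // Draw one quad; the shader runs once per output pixel.
        glBegin(GL_QUADS);
        glTexCoord2f(sl, sb); glVertex2f(dl, db);
        glTexCoord2f(sr, sb); glVertex2f(dr, db);
        glTexCoord2f(sr, st); glVertex2f(dr, dt);
        glTexCoord2f(sl, st); glVertex2f(dl, dt);
        glEnd();

        glDisable(GL_FRAGMENT_PROGRAM_ARB);
        [input disable];
        glDeleteProgramsARB(1, &prog);
        return YES;
    }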
So, I'm going to pass you back to Dave. So, before we go on to the more-information section, I just wanted to point out that, of course, if you've been to the Core Image sessions, you've seen that you can write an image unit that can be used as a plug-in and is available in Motion, and they're great. If you want to deploy a plug-in that works across applications from iMovie to Motion, it's a great way to go, unless you need some of the specific features that are in FxPlug but not in Core Image, such as the on-screen controls or the custom UI.
[Transcript missing]
So to get the FxPlug SDK, there's information on the WWDC website, and there's a download, but it will point you to the Developer Connection: go to connect.apple.com, log in, go to Downloads, and look under Apple Applications; there is an FxPlug SDK 1.0.
Just download that and it contains an installer and a documentation folder. The documentation folder has something called the FxPlug Overview, and also HTML text documentation that was generated by HeaderDoc, which is not always the easiest thing to read, but it's pretty complete. So start with the overview and then the HeaderDoc stuff is a great reference later.
The installer will give you the Xcode templates and two examples, one of which shows how to write a filter that uses custom parameter UI, and the other of which is a generator that shows you how to do on-screen controls. Each of those uses a single custom parameter type, which is just a list of points.
It lets you draw a path. One of them lets you draw the path in a simple view in the inspector, and the other one lets you draw it on-screen on the canvas. And each of them is a pretty awful example of how to write a paint program, but maybe you could start with it and give us something great.
In addition to the examples, of course, there's the framework and the FxPlug protocols and so on. Now, there are two related sessions. If you can come out tomorrow, we have at 10:30 a.m. the in-depth session where we'll talk more about the issues that we didn't go into too much depth about today. There's also a brown bag lunch after that that we encourage you to grab a lunch downstairs and head up and meet us on the third floor in the Pro Audio and Pro Video connection room. We'll just hang out and talk about FxPlug and what you can do.
Just as an example of some of the interesting OpenGL stuff that we can actually do, I'm just going to show off a couple of generator effects, to give some concrete examples of what I was talking about, the kind of thing our users are really interested in. I think some of you guys can come out with some stuff that will hopefully be even cooler than this. For example, here's...
So, we have a water caustics generator that's done entirely using OpenGL. It's never been something that's been solved very well in the software space, but using the OpenGL rendering engine and doing polygonal rendering to get this, it actually turned out not to be too tricky.
And another example is... In Motion, we call this Clouds, but what this actually is, is a Perlin noise generator, just written using fragment programs,
[Transcript missing]
So finally, if you have any other questions, if we could go back to the slides. Any questions, you can email Patrick Collins, who's the technology manager for this Apple applications track, at [email protected].
Or better yet, write to the ProApp SDK address. Note it's ProApp SDK, not ProApps SDK; there were some people that got the wrong address before and their email went somewhere else. It's ProApp SDK, at group.apple.com. A few of us who are involved in the FxPlug project get those emails and will respond. We don't have a mailing list set up for this just yet; we're starting with this.