General • 50:52
Apple's Final Cut Pro 4 includes two new helper applications, LiveType and Soundtrack. LiveType is a titling tool that supports LiveFonts, Apple's new 32-bit, fully animated font format, as well as animated textures, objects, and templates. Soundtrack is a music creation application that matches and arranges music clips in real time, utilizing Soundtrack Loops - a new AIF format that contains metadata for instrument, genre, and mood. Both applications offer content creators a new market for their products. This session discusses the new applications, and the process for developing optimized graphics and audio content for Final Cut Pro users.
Speakers: Brett Halle, Tom Langmacher, Dave Howell, Xander Soren
Unlisted on Apple Developer site
Transcript
This transcript was generated using Whisper and has known transcription errors. We are working on an improved version.
Good afternoon. Welcome to the Creating Content for Soundtrack and LiveType session. Soundtrack and LiveType are a couple of our applications that are part of our professional application products available from Apple. Specifically, Soundtrack and LiveType are part of the Final Cut Pro 4 bundle. We'll be talking today about the process for actually creating content for these applications as one of the numerous developer opportunities that are available to help contribute to the professional application set, which also include things like plug-ins and workflow opportunities and driver development and such.
This session is going to focus specifically on creating content for these two apps. We're going to start by talking a bit about LiveType, and we'll get into Soundtrack a little bit later. Soundtrack is the music creation application that's, again, part of Final Cut Pro. But to start, let's get into LiveType.
LiveType, as I mentioned, is an application that's bundled as part of Final Cut Pro 4. It is basically our pro video titling application. What's unusual about LiveType is its incredible capacity for doing animation and manipulating very sophisticated media. What we're going to talk about today is the content that LiveType uses, how you can actually create content to enhance the application, and the opportunities you have for selling into this particular customer base. There's a number of different kinds of content supported by LiveType. There are things we call active files, which are kind of the high-level conceptual container model for this content, built up of different kinds of content. Specifically, textures, which are, if you will, motion or animated backgrounds, typically.
Objects, which are similarly packaged but also incorporate an alpha channel capability. Basically, they would be used in conjunction with other material in the background, or might be merged into some other video or other material you may have. And LiveFonts, which is a new technology that we've introduced: the concept of having fonts that are actually fully animated movies, where each glyph can have animation capability. The motion can be, as you'll see, made up of some very amazing footage or material to basically present text in a very, very new way.
And another form of content that LiveType incorporates is effects, which is the ability to then take all of this content that we've talked about and do very amazing things in terms of animation. Move it around on the screen, make things fade, come into play, et cetera. And lastly, we have templates, which are basically all of this material that's combined together into a form that can be packaged up so that users can actually build projects with it. And to kind of get into this in a lot more depth, I'm going to invite up Tom Langmacher, who's the product designer for LiveType, and he'll give a brief demo and actually get into what this all means.
Check, check, check. Glad you all could be here today. I'm always excited to talk about LiveType. This is the world's only application that will type 32-bit animated fonts. It's a really exciting concept that Apple's developing here. I like to think of LiveType and LiveFonts as analogous to what system fonts were for the desktop publishing industry in the '80s. LiveFonts are, for digital media content creation and delivery, the new 21st century font standard. And Apple's bringing this technology forward and making it available for the masses. And we have an excellent opportunity here to create content for it.
Just by owning LiveType, you have the ability to create LiveFonts: create your own animations, compile them into a LiveFont, and deliver them to the media-hungry public. And we have a great opportunity to create a new version of LiveFonts that's royalty free, which is a very exciting, brand new emerging market that we'd love to have you all be a part of.
Right here is the canvas. This is where you do your composing for the text. This is the inspector. This is where you type in characters and also create and influence the attributes of the characters. And in the media browser, this is where the content comes from. Things like Brett was talking about textures, objects, effects, and LiveFonts. And then you have the timeline, and that's where you can influence the timing for your composition. So let me start by typing something out here.
The first thing you notice that's truly unique about this application is that every one of these characters is drawn as an independent layer. You have the ability to grab and compose, rotate, scale, and essentially control all the attributes available to you from the Inspector on a character-by-character basis. That makes it very pleasurable to use. Not only are these independent layers, but they're contained in a single track.
This single track is represented down here on the timeline. Imagine having this many movies or animations, eight different layers on a typical animation program. It becomes very difficult to, for instance, control color and your timing considerations. Whereas we give you this all on a single track. Imagine if it were a sentence. You'd end up with dozens and dozens of layers.
[Transcript missing]
Now, any of these attributes that are available to me in the inspector, things like opacity, blur, scale, offset, rotate, all of these I can apply on a character-by-character basis. LiveFonts can essentially be created from anything. Take this bevel, for instance; it's created in a 3D application. You can do something with a hand-drawn font, for instance. Things you can create in 3D or 2D applications; this one is created in a particle generator.
"Things that you can shoot with a video camera, real-world style fonts, stop frame, high resolution film cameras, for instance, could be LiveFonts." Once I choose my font from the media browser, I'm happy with it. I can demonstrate how this works. Every one of these characters are individual animations that are being drawn character by character. But let's say I want to control the timing of this. Right now they're all triggering one at a time. I have a timing tab that gives me the ability to set when these animations are triggered. I can choose to trigger them randomly.
"I can sequence them from left to right or right to left. I can control the speed of it. I can also loop them. I can do hold first frame, hold last frame. Things that would be a bit complex to do in any other application if you're dealing with all these layers at once." Now in addition to LiveFonts, we also give you the ability to use system fonts.
And something really great came out of our development as we were working with system fonts. You also have the ability to control these system fonts on an individual character basis, so that your composition is fun and easy. You can grab these one at a time and control their attributes. Now, when it comes time to animate these, the same with LiveFonts as with system fonts, we offer this new concept of effects, which are essentially modules with keyframes. Let me just pick an effect here. Lots to choose from.
It's a simple matter of double-clicking. This effect is applied to a track, and that effect stays with the track. You can apply as many effects as you like. And just by simply clicking on it and hitting play, I've created a fairly complex animation. And obviously I can swap out my language however I like.
And it's just as easy to create effects from scratch. Let me demonstrate that. So if I add a new effect, let me scoot this first effect over just a little bit. As an effect comes in from scratch, I'm working with two keyframes to begin with, a beginning and an ending keyframe. And in the Effects tab, you'll see that I've got this new effect, and I can call it anything I want.
So if I grab any character on here and start applying attributes to that keyframe, let's start with maybe an offset and a scale. Let me rotate it a little bit. I can even give it a Bezier motion path. As you see on the wireframe, it gives me real-time updates of what I'm doing as I go along. I think I want to give this an opacity change as well, so that it starts with zero opacity and fades in to full opacity.
Now what's interesting, here, you see in the Effects tab, I've just applied four parameters to this keyframe. That's a very unique concept. In other applications, you would see these broken down into four discrete tracks down in the timeline. And then I would have to control each of those four and control where they occur in relation to each other, and it gets awfully complex very quickly. With this application, it's all contained in a single keyframe. Now I could break these out into various effects if I want to, if I want that level of control, but for most cases, I don't want to do that. I want this nice, clean timeline down here.
I have all of these parameters available to me. I can do things. I can interpolate over time with shadow and glow and outline, extrusion, blur, color, you name it. And the same kind of control that I have with LiveFonts in terms of timing, I can also do with effects. So I can randomly trigger those effects with each character, or sequence it over time, loop it. So let me just set a bit of a random here.
"And there, with just a few clicks, I've created a unique animation. And I can choose, if I like that effect that I just built, I can choose to save that. Just come under save, save the effect, put it in my own folder, call it what I want, give it a description, and then it'll show up next time I need it in the effects browser. So it's really handy if I'm starting on another project, that effect that I built days ago, I can bring up and apply it at any time.
Now, I like to think of this as a content delivery system. The media browser delivers effects, objects, textures, and LiveFonts. As Brett was saying, textures are background elements meant to be used as either background or perhaps foreground layered elements. But you can also put these textures inside of the text itself.
Lots to choose from, but the idea is that this is the best delivery mechanism for the end user, because while you can buy canned animations and stock animation, this gives you the ability to create your own layers, your own unique style. You can colorize the textures before you apply your final touches to it. You can also apply objects. And those are the ones with alpha channels that are meant to be in between layers or on top of layers. And of course you control that layer order.
So the end result ends up being something very specifically unique to your own project. And it's easy to save this out and then apply it in a template. Just let that render for a minute. OK, so my color coordination may not be the best, but you get the idea.
So I've got all these elements at play here, and I can move them around with keyframe animation through these effect modules. So the combination of all this content is all nicely put together in a template. If I were to save this project, I can then apply this through the template browser, call that up at any time, and then swap out my new text with this template.
So it's an excellent way to deliver to the customer how you intended these LiveFonts to be used and how your additional content, such as objects and textures, interplays. You can create a number of different styles, categories, and themes, which greatly reduces the amount of time that an animator or a video editor would need in order to create these from scratch. But our primary purpose for being here is to talk about LiveFonts and the production thereof. So I'm going to ask Dave Howell, the lead engineer for LiveType, to come up. And he's going to get into more detail about how you make LiveFonts. Thanks, Tom.
This monitor isn't on. Is that intentional? As Brett mentioned before, the three main types of LiveType content are active files, effects, and templates. And of active files, there are three types. There are textures, objects, and the LiveFont. And I'll go into how you make each one of these things.
First of all, active files are actually a file pair. There's an active font proxy file and an active font data file, AFP and AFD. And the reason there are two types is that some users may not install all the content that you deliver. They may not install the data. A data file can be pretty large. It can be 100 megabytes if you've got a font, say, with a couple hundred characters in it and a few seconds, 30 frames a second, high resolution.
So they're separated out, and they can be installed independently. The proxy files are relatively compact. They're compressed well, and they contain all of the parameters that define a live font, although they don't contain all of the compressed frames. They only contain one compressed frame for each glyph. Thanks.
So the simplest form of an active file is a texture. A texture is basically a movie, a 24-bit-per-pixel movie. So anybody who has a library of texture movies can repurpose these and make LiveType content from them. And that would be a viable product in itself, just a collection of textures.
It would enhance the application quite a bit. The textures are high resolution, they're full frame rate, and they're typically used for background tracks, as you saw Tom demonstrate, and also for texture mattes. So, given a character, you can texturize that and basically matte the texture with that character. Here's an example of a texture.
Now, an object is, again, it comes from a QuickTime movie. You can use any QuickTime movie with an alpha channel to make an object. And the main difference between that and a texture is that it contains an alpha channel. The alpha channel can be straight or it can be pre-multiplied against white or black.
And again, it's high resolution, full frame rate. And you typically use an object for lower thirds. You can use it for the bar across the bottom. You can use it as an accent, like pixie dust. You can use it for a matte, say a frame around your video, and for any special effects like this.
Now, LiveFont is the new font format that we've developed. And it's basically a collection of objects, one for each glyph in a font. So for a very simple one, you might just have the Roman alphabet, and you may have just capital letters. And for your own use, that might be sufficient. For the ones that we ship, we have a full collection of 127 Mac Roman glyphs.
And they're similar to objects in the rest of their parameters. You build a LiveFont by using a font script. It's a simple command language that we developed for this, just a way of specifying all the parameters in a way that's easy to replicate and edit in a text editor. It's not something that the normal user would use, although it ships with LiveType. We have a font maker tool that's built into LiveType that you use to build LiveFonts.
So I'll just quickly give you a feel for what the font script looks like. Although I won't go into detail on really any of it, I just want you to see one to see what's involved. A font script has a few commands that specify the names of the source files and the destination files. Also, you can specify the name of the disc that you're going to distribute this on, so that if a user doesn't have it installed, LiveType will say please insert the disc, in this case Demo Disk, to install it.
You tell the font script what the flavor is. You can have, again, an active font, which is a LiveFont, or you can have texture or object as the argument there. You tell it the alpha type, which is straight, white, or black. You can give a description, which shows up in the template browser. And these two commands you see here, the lower left and center, are things that you'll measure out in a graphics application. You'll find those two points as offsets from the lower left-hand corner of each source movie.
Now, the next part is some spacing, some horizontal spacing. There's the width of a space; that's just given because there is no actual source movie for a space character, typically. You have timing information in these two. There's whether or not the font can loop, which is just true or false, zero or one.
And then there is a count of frames for intro frames, loop, and end. And when you extend the length of a LiveFont in duration on the timeline, we loop the loop portion here. So the intro frames play first, then you loop some number of times, and then go to the end frames. You specify the compression quality of the RGB and alpha channels, which are compressed separately to give you finer control over that. And the source frame rate for the movies.
Aspect ratio. You may have taken them from a DV camera and have 0.9 pixel aspect, or you may have 1.0 if they came from a 3D rendering app or Photoshop or another app. And you also have a bunch of default settings that are stored with the LiveFont, and those are gone into in some detail in the user manual.
And finally, for each glyph, we let you specify the source movie, the characters that will be mapped to that movie, and some numbers that I'll go into in a little detail. Here's an example. We have a source movie for the letter A here, and the letter A comes from A.movie.
And the next two parameters are the advance width from that glyph until the next glyph when rendered out on a track, and the proxy frame index. So you can tell which proxy frame will be used if the data file has not been installed yet.
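Putting the commands Dave describes together, a hypothetical font script might look something like this. The command names, values, and layout here are reconstructed from the description in this session, so the shipping syntax may well differ:

```text
; Hypothetical sketch only -- command names follow the session's
; description; the actual font script syntax may differ.
destination  MyFont
disc         "Demo Disk"
flavor       activefont          ; or: texture, object
alphatype    straight            ; or: white, black
description  "A hand-drawn animated font"
lowerleft    40 60               ; measured in each source movie
center       150 150
spacewidth   90                  ; no source movie exists for a space
loop         1                   ; 0 = no loop, 1 = can loop
frames       10 20 10            ; intro, loop, and end frame counts
quality      90 100              ; RGB / alpha compression quality
framerate    30
aspect       1.0
glyph  A.movie  "Aa"  120  5     ; source, characters, advance, proxy frame
```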
There are also some optional glyph parameters that you don't need to put into your script unless they're different from the values found in the lower left and center commands for your font script. So each glyph can have its own values for those. Now, effects are pretty much the same as the effects that you build when you're running the app for your own use.
There's some tricks to preparing an effect for release to the public, like making sure that the default timing values are going to show up right for any duration of track. So if you've got something that's a fade out, you'll want it to have a start time that's from the end, not from the beginning. So it'll show up always at the end of the track.
And for effects, Tom went pretty much into the capabilities of those, so I'll skip over that, but here's an example of what one does. And templates, again, Tom went over those. A template is just a saved project file with a description. It's something that the user finds in the template browser, selects, edits the text of, and renders out his own movie from.
So there's an example of one of the templates just simply with the text changed and re-rendered. Also, when you're building live fonts, effects, and templates, you're going to need a thumbnail movie that the user will see inside the template or effect or live font browser. And the thumbnail movies are 160 by 120 movies.
They're compressed however you want, just .mov files with the name specified from the font script. And when you're building those, you should use a template of your own. You can make a thumbnail template that's just 160 by 120; it makes it easy to build that as an example of what one might look like. Finally, the user manual says exactly where these things should be installed. You'll probably want to ship an installer that puts these in the right place. And on to design tips from Tom, who will give you some tips on creating these things.
Now remember these are movies, these are animations. And so essentially all we're doing is putting these animations into a folder with this font script that Dave was talking about, and then encrypting them into a single big data file. Now it also has a proxy file, as Dave was referring to. This is what you use to compose on the screen and when you're ready to render you use this big data file.
But the point being, because you've got a lot of characters that you're creating movies for, optimization is key. That's something that you want to focus on. So the number of characters is an issue for you when you're planning. Apple ships 127 characters. We're striving to support English, Spanish, German, and French, so we include all the special characters for accents and umlauts and so on.
Maximum point size. You're starting with an animation, so the very largest size you work with will be scaled down from there. Obviously you don't want to scale above that, so that will affect your final file size and the size of the animation. We recommend somewhere between 200 and 500 points, depending on what you're shooting for, for file size.
The number of frames that you use and the frame rate. Now this depends on the style and what you're working on. Typically for broadcast you'll want to work in 30 frames a second, but that's not always the case. If it's something like the cool font that you saw up there with the wiggle, you can get away with 10, 12, 15 frames a second. Again, this will lower the file size. And as far as the number of frames, that's the kind of thing that you want to optimize as much as possible. I don't recommend going beyond 90 frames for your maximum animation length.
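To make those trade-offs concrete, here's a rough back-of-the-envelope calculation in Python. This is my own sketch, not anything from LiveType; the numbers are purely illustrative:

```python
def raw_frames_size_mb(glyphs, frames, width, height, bytes_per_pixel=4):
    """Rough uncompressed size, in MB, of a LiveFont's source frames,
    assuming 32-bit pixels (8 bits each for red, green, blue, alpha)."""
    return glyphs * frames * width * height * bytes_per_pixel / 2**20

# 127 glyphs at 90 frames of 300x300 pixels is already close to 4 GB of
# raw source material before compression, which is why glyph count,
# frame count, frame rate, and point size all matter so much.
full_font = raw_frames_size_mb(glyphs=127, frames=90, width=300, height=300)
```

Compression brings the shipped data file down dramatically, but every frame you trim at the source shrinks both the build time and the final file.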
Okay, so when you start, it's essentially a matter of deciding what project size, what resolution, you're going to work in. Let's say it's 900 by 900. You need to keep that same resolution for every single glyph. It's important to start with that lower left point that Dave was referring to and register all the characters to that lower left point, so that you have a common baseline and a common kerning point for every character.
Now, you can override those in the glyph command, but it's much easier if you register them to begin with. So you take a big character like a W, make sure that it fits your project size, and establish your lower left point from there. For characters with lower extenders, like a lowercase g, you make sure that the lower left point accommodates that extender.
Looping: does your animation loop? Characters should loop. We handle looping in two ways. We do something really interesting called segment looping. Here's an animation; this is called the TV font. These characters have three segments to them. First they come up, then they sit there and have an animated staticky screen for a period of time, and then they go down.
So there are three segments to this animation. When you build this, you can define in the font script where your segment loop is, where the center portion is, so that when the user uses this and he wants this TV set to stay up for an extended period of time and he sets the loop value, then it will only loop that center segment portion. So it's a really intelligent way to loop. And then there's also just a full loop. Here's an example of that where the beginning frame matches the ending frame. Simple loop.
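The segment-looping idea can be sketched as simple frame-index arithmetic. This is a hypothetical helper to illustrate the scheme described above, not LiveType code:

```python
def expand_segments(intro, loop, end, total):
    """Expand a glyph animation with intro/loop/end segment frame counts
    to roughly `total` frames by repeating only the middle loop segment,
    in the spirit of LiveType's segment looping."""
    reps = max(1, (total - intro - end) // loop)
    seq = list(range(intro))                             # intro plays once
    for _ in range(reps):
        seq.extend(range(intro, intro + loop))           # loop repeats
    seq.extend(range(intro + loop, intro + loop + end))  # end plays once
    return seq
```

With an intro of 2 frames, a loop of 3, and an end of 2, stretching to 10 frames repeats only the middle segment twice while the intro and end each play once.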
And of course, it's important to discover who your target audience is. Broadcast video is the way it's packaged right now, but it's also great for multimedia and web. I think people are finally discovering, or just now discovering, since it's a new application, that this works in Keynote. And Keynote might mean lower resolution, depending on where you're going with it. Broadcast is typically 30 frames a second and high resolution.
You can also go to web, and of course, web, you want to keep it down to 10, 12, 15 frames a second. And also print is a possibility. I've made live fonts before that are 500 point, very high res, single frame that amount to 4 megs when I'm done for the entire font, which is pretty cool.
And then of course the style of the font you use. LiveFonts are a new animal, and it's important to recognize how they should be used. I don't want to see pages and pages of LiveFonts being used. I want to see them used as a metaphor for a statement, or at least that's the way I think they should be used.
They can be combined with system fonts and then just use them sparingly in order to make that push, that point for the audience. And here's just some examples of the metaphors that I have in mind. So with that, I think that concludes what we're able to talk about today in this short period of time. And I hope you get out there and make LiveFonts. Thanks for being here.
Xander, Xander? Hi, I'm Xander Soren. I am the product manager for Soundtrack, and we're going to get up here and make a little bit of noise for you. So Soundtrack is part of Final Cut Pro 4, and it allows you to create original royalty-free music. And it does this in a really cool way, because it lets you use pre-recorded musical performances. And what that means is that you don't necessarily have to be a musician.
This is great for video editors, because they can take somebody else's performances, and they can combine them together. The thing about other people's performances is that they weren't necessarily recorded to sound good together. Like, one could have been a drum beat recorded in LA at a slower tempo.
Another thing could have been a piano recorded in New York. So what Soundtrack does is, in real time, it will match them together. So all of a sudden, you don't need to know about all these technical things. You get to just go and say, you know, I want some drums here. I want a saxophone. So again, it's all done in real time, and it supports a variety of really popular file formats, including AIFF and WAV, which are uncompressed and don't have any metadata.
So there's no way for it to really know what the tempo or the key is. So Soundtrack is actually able to infer what the tempo of these loops is, and is able to guess, most of the time, really accurately, and combine tempos of AIFF and WAV files. Now, there is also a file format called ACID.
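One plausible way to infer tempo from an untagged AIFF or WAV loop is to assume the loop spans a whole, usually power-of-two, number of beats. This sketch is only a guess at the idea, not Soundtrack's actual algorithm:

```python
def candidate_tempos(duration_seconds, low=60.0, high=200.0):
    """Tempos (in BPM) consistent with a loop of the given length,
    assuming the loop contains a whole power-of-two number of beats
    and that musically plausible tempos fall in [low, high]."""
    candidates, beats = [], 1
    while beats <= 64:
        bpm = beats * 60.0 / duration_seconds
        if low <= bpm <= high:
            candidates.append(bpm)
        beats *= 2
    return candidates

# A 4-second loop is most plausibly 8 beats at 120 BPM, or 4 at 60.
guesses = candidate_tempos(4.0)
```

Restricting to a plausible BPM range is what lets a guess like this be "really accurate most of the time" even without metadata.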
Now, the ACID file format also gives you a couple pieces of metadata that make it easier to match loops, and that is the key as well as the tempo. So Soundtrack recognizes ACID files, and there are lots of different libraries of hundreds of ACID files available on the market. For example, for $60, you can basically buy Mick Fleetwood and have Mick Fleetwood playing royalty-free on your session, which is really, really cool. But what we're here to talk to you about today is a brand new file format called Apple Loops.
And I'll tell you a little bit about what they are and how you can create them. There is, we think, a great market for creating Apple Loops for Soundtrack. So Apple Loops are based on AIFF, and they add some metadata to it. This is used not only for matching the loops, but also for searching them.
Because one of the really cool things that Soundtrack lets you do is it lets you find files really quickly. And that becomes important. Soundtrack ships with about 4,000 loops, and it's very easy to add additional loops with third-party libraries. So it'll be quite common for people to have 40,000 loops. To be able to find them is really, really important.
So the chunks that are included in an AIFF to make an Apple Loop are author and copyright information to assign ownership, and the beats, which is used to infer what the tempo is, as well as the time signature. Since Soundtrack supports different time signatures, that's embedded within the file.
The musical key, as well as the scale type. If you look at other file formats like ACID, it'll have the actual key, but there's nothing that tells you the scale type. Let's say a piano part was in A minor, and then you have a guitar solo that was in A major. They both kind of match to that key, but they don't sound good together, and a lot of non-musicians don't know why that is. So we've taken it to the next step, and we've added major and minor information so you can filter down and things just sound better for you. A few other important things that we've added, which really bring out the power of the search engine, are genre, instrument, and then a whole bunch of these mood descriptors.
So you can, and we'll kind of go over the different mood descriptors, but if you say that you want an instrument to be defined as relaxed and acoustic, and maybe it has some processing on it, we have a bunch of descriptors which allow you to assign value to that.
And finally, transient markers. What's happening in the application is all your audio is being stretched. It's either being sped up or it's slowed down, and that involves samples and bits being taken out of or added to the audio file. So you want to do that in the right place.
You want to make sure that you are taking samples out from a place that doesn't have a lot of active musical information, and transient markers allow you to protect those areas; you'll be able to see really clearly in your waveform which areas you want to avoid. So with that, why don't I give you a little demo of Soundtrack so you can see the loops in action, and then we'll actually go ahead and show you how you create the loops.
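Since Apple Loops are standard AIFF files underneath, the metadata described above rides in ordinary IFF-style chunks alongside the standard COMM and SSND chunks. A minimal chunk walk, in Python, shows the container layout; the custom chunk ID in the test below is illustrative, not one of the actual Apple Loops chunk IDs:

```python
import struct

def list_aiff_chunks(data: bytes):
    """Return (chunk_id, payload_size) pairs for every chunk inside an
    AIFF FORM container.

    AIFF is an IFF-style format: a 'FORM' header and total size, a
    4-byte form type ('AIFF'), then a sequence of chunks, each a
    4-character ASCII ID followed by a big-endian 32-bit payload size,
    with payloads padded to an even byte boundary."""
    if data[:4] != b'FORM' or data[8:12] != b'AIFF':
        raise ValueError('not an AIFF file')
    end = 8 + struct.unpack('>I', data[4:8])[0]
    chunks, pos = [], 12
    while pos + 8 <= end:
        cid = data[pos:pos + 4].decode('ascii')
        size = struct.unpack('>I', data[pos + 4:pos + 8])[0]
        chunks.append((cid, size))
        pos += 8 + size + (size & 1)  # skip payload plus any pad byte
    return chunks
```

Because unknown chunks are simply skipped by readers that don't understand them, an Apple Loop still plays as a plain AIFF file in applications that ignore the extra metadata.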
Okay, so we're in Soundtrack here, and you can see in this area here we have the Media Manager. Can we switch to the other machine? There's the media manager that I was talking about. Okay, so the first thing we want to do is, this is part of the Final Cut Pro package, so a lot of people are working with video primarily to start out, and then you add music to your video.
So what I do is I go into this directory structure, and I have basically a shortcut to my home directory, and here's a QuickTime file that I drag in. And this is kind of the old way of going and finding your file. Now, that's fine with my video in this case.
I'm actually going to create a little cycle region here so I can get the video playing in the background. But if I've got 40,000 loops, and I want to go and find a specific drum that I'm looking for, this is not the way you want to do it.
So we built a search engine, which makes it really, really easy, and it leverages all these tags that are in the loops. So you can see we have a bunch of keyword buttons here, and this one's kind of a grab bag of a bunch of different instruments here.
You can see like cinematic rock and blues, urban, some different genres, and then a few different descriptors. And again, that's just one page of many different, ways that you can access keywords, including a custom page so you can define your own. But we'll stick with this kind of assortment. And in this case, I want to find a drum.
So I click on drum, and you'll notice I get about 950 drums. We have almost 1,000 drums that just ship with the application. That's still a lot to go through. So fortunately, I can use these keywords in combination to find exactly what I'm looking for. In this case, I will say that I want something that's fairly relaxed.
So I'll combine that, and now I've got drums and relaxed, and a much more manageable list. But I want something that has a little bit more of a modern feel. So I know if I type the word dance, now I have this refined search field. What refined search lets you do is type in a word that will actually be found either in the file name or the entire directory path.
So it's another really powerful way of zeroing in on the exact content that you want. In this case, I'll type the word dance, and I get a whole bunch of electronic club dance beats. Now, the cool thing is I can just click on one of these and preview it. Some audio coming in there. And I'm kind of previewing them against my composition.
If you just wanted to kind of make a mental note here, you can see the tempo and all these tags are identified. It's a drum, so it doesn't have any key information. So we have a 130 BPM drum groove going here, and I'll just drag that into the composition.
Now, because this is a loop, I can just drag out kind of as much or as little as I want, and you can see the length of the loop is pretty easily defined by this little indentation. And I've just filled that to my video, and now I have drums going through the entire composition.
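The arithmetic behind fitting a loop to a timeline is simple: a loop's natural length in seconds is its beat count divided by the tempo. A quick sketch (the function name is mine, not anything in Soundtrack):

```python
def loop_duration_seconds(num_beats, bpm):
    """Natural length of a loop: beats divided by beats-per-minute, in seconds."""
    return num_beats / bpm * 60.0

# An 8-beat loop at 130 BPM runs just under 3.7 seconds, so filling a
# 60-second video means tiling the loop a little over 16 times.
```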
But it's also very easy to add additional instruments, even, again, ones that weren't designed to sound good together. So I'll actually click off of "Relaxed," and in this case, I want something that is acoustic. So I'll make it acoustic, and in this case, I'll type the word "funk" in my refined search field, and I have a bunch of these Funkmaster kits.
Now you notice that these tempos are 110, which is completely different than the 130 that we first heard. So I'll drag that in, and you'll hear that they're played in sync. The other thing you notice is that the icons are different. Because the musical instrument is one of the tags, Soundtrack knows to assign the appropriate icon: in this case an acoustic drum kit, where the earlier one was more of an electronic beat.
There are a lot of different instruments, and these are all brought up by the tag. So let me quickly get a couple more instruments in here so we can see some of the other power of the app. We'll go into synthesizers, and I want to work in a minor key, so I want to make sure everything fits together with minor loops. I'll go ahead and play this composition. Let's try a couple different sounds.
Okay, so here's a little synthesizer that's in the key of C, and I'll drop that in. I'll just get one more instrument going here. I will click on Cinematic, in addition to the synths, and now here's one in G. So I'll bring that in. And it still fits.
So again, you don't have to know anything about music or what keys work or what tempos work. Soundtrack kind of does that for you, and it does it all in real time. The other really amazing thing is, because it's all in real time, you have this unprecedented flexibility to be able to say, well, I want to hear all this stuff faster. So you just take the tempo slider, and now it's all faster. I can take it all and just make it slower.
You can hear that even though it's substantially slower (now I'm down to like 90 beats per minute, when some of the loops were at 130), it's preserving the audio quality, and it's doing that because of the transient marker assignments that I'll be showing you in a little bit. And the final thing I'll show you here is that all the keys are being matched to a project key of A, and I can very easily change it all to, say, another key, then change it again and back up, and it does that all in real time, instantly, to all the loops.
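Conceptually, matching a loop to the project comes down to two numbers: a time-stretch ratio for tempo, and a transposition in semitones for key, picking the shorter direction around the twelve pitch classes. This sketch shows only that arithmetic; the names are mine, and Soundtrack's actual engine also uses transient markers to preserve quality while stretching:

```python
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def semitone_shift(loop_key, project_key):
    """Smallest transposition (in semitones) taking loop_key to project_key."""
    diff = (NOTES.index(project_key) - NOTES.index(loop_key)) % 12
    return diff - 12 if diff > 6 else diff  # prefer shifting down past a tritone

def stretch_ratio(loop_bpm, project_bpm):
    """Playback-speed factor applied so the loop lands on the project tempo."""
    return project_bpm / loop_bpm
```

So a loop in C pulled into a project in A gets shifted down 3 semitones rather than up 9, and a 130 BPM loop in a 90 BPM project slows to about 69% speed.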
So that's just a little bit of an overview of how Soundtrack handles all these Apple Loops, and now I'll show you how to go ahead and create your own. Okay, so creating Apple Loops. A few things we're going to cover are the actual recording and editing of them, the tagging and adding of transient markers, and then some things you can do with naming and directory structure. And then finally, just some quick things you can do to test your loops and make sure that you did the job right.
So when it comes to recording, Soundtrack supports really, really high resolution audio, up to 24-bit, 96 kilohertz resolution. Most of the content out there is at CD quality, which is at 16-bit, 44.1. So there's an opportunity to deliver really, really high quality audio to people that are looking for high quality audio loops.
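To put those resolutions in perspective, the uncompressed PCM data rate scales with both sample rate and bit depth; a small sketch (the helper is mine, not part of any Apple API):

```python
def pcm_bytes_per_second(sample_rate_hz, bit_depth, channels=2):
    """Uncompressed PCM data rate for a given audio resolution."""
    return sample_rate_hz * (bit_depth // 8) * channels

# 24-bit/96 kHz stereo: 576,000 bytes/s, roughly 3.3x the
# 176,400 bytes/s of 16-bit/44.1 kHz CD-quality audio.
```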
Like most other production environments, audio follows the garbage in, garbage out model: the audio you get at the end is only as good as what you put in. So the quality of your connectors, and the whole input chain, becomes really important; if you've got a fan blowing in the room, that's going to make it into your final mix. So make sure that you have as good an input chain as you can from the beginning. And that carries into whether your instruments are in tune or not. Most of the content out there is tuned to A440.
So you want to make sure that you're in tune with the rest of the world. And even over a session, guitars fall out of tune really quickly. So it's good to keep up on that. The other thing is, when you develop your content, it's important to play to a really, really steady click or a steady drum beat that doesn't have a lot of sway and a lot of excessive feel.
Because you want to keep things fairly on the beat, because you're not necessarily trying to match the feel to the things that you're working on. You're trying to match the feel of your content to the rest of the world. And in loop-based music, most of it is fairly straight. Now you can add feel later, and there are things you can do in your project.
But for the most part, you want to make sure things are really, really steady. And finally, it really pays to be organized and to keep a recording log of all your performances. If you're sitting there and playing a guitar, and you have 30 different takes, and they're in a variety of keys, it's so helpful to know that when you actually have to go and tag them and define what those keys are.
So in terms of editing, you're going to have a long track, probably 30 different guitar takes, and you want to start cropping and trimming it down into the final loops. There are a lot of great waveform editors out there: there's one in Logic, there's Spark, and Final Cut Pro 4 includes an application called Peak Express from BIAS, which is also a fantastic way of trimming down loops.
So what you want to do is find that perfect looping region, and you establish that. And the best kind of test is to close your eyes and try some different tweaks and get that start and end point. And you really want to make it so when you close your eyes and you kind of tap out at tempo, you just don't hear where the break is.
You kind of forget where the beginning and end were, and you'll know at that point you're fairly successful in creating a good loop. But the second part of that is you want to make sure that the beginning and end are happening at a zero crossing, where the waveform actually crosses the zero line of the x-axis. If you have a start that's at the top of the waveform and an end that's at zero, you're going to have pops and clicks.
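Finding a nearby zero crossing is easy to automate. This little search (mine, not from any editor's API) scans outward from a chosen edit point for the nearest sample where the signal changes sign:

```python
def nearest_zero_crossing(samples, start):
    """Index nearest `start` where the signal hits zero or changes sign."""
    for offset in range(len(samples)):
        for i in (start - offset, start + offset):
            if 0 < i < len(samples):
                prev, cur = samples[i - 1], samples[i]
                if prev == 0 or (prev < 0) != (cur < 0):
                    return i
    return start  # no crossing found; leave the edit point alone
```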
The reality is that sometimes you get that perfect timing and the zero crossings aren't where you want them. That's where you'd want to use a manual, destructive edit: you can do a fade in at the beginning and a fade out at the end, and that'll force the loop to begin and end at a zero crossing. And the last thing is, since you're probably recording a lot of different instruments, maybe at different volume levels, a safe last step is to do a normalization so that everything you have is at a consistent volume level. Okay.
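Those two destructive steps, edge fades to force zero crossings and peak normalization for a consistent level, can be sketched like this (a minimal illustration with linear fades; real editors offer gentler fade curves and RMS-based normalization):

```python
def fade_and_normalize(samples, fade_len=64, peak=0.98):
    """Linear fade-in/out so the loop starts and ends at zero,
    then peak-normalize so every loop sits at a consistent level."""
    out = list(samples)
    n = min(fade_len, len(out))
    for i in range(n):
        gain = i / n                     # 0.0 at the outer edge, rising inward
        out[i] *= gain                   # fade in
        out[len(out) - 1 - i] *= gain    # fade out
    top = max(abs(s) for s in out) or 1.0
    return [s * peak / top for s in out]
```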
So, tagging. To assign the metadata to the AIFF file, look at all these things that we have to do: author, copyright, beats and time signature, musical key, scale type, genre, instrument, mood descriptors, the transient markers, and ultimately the file name. That's a lot of work. So how are we going to do this? Well, we created an application for you to make this really, really easy, called the Soundtrack Loop Utility.
And we built this in conjunction with a lot of people that were doing this for years, that were professionally doing content production. So we learned a lot of things about what people are looking for when they're tagging their loops. They need to be able to batch convert a lot of files at a time, or possibly even run the entire thing off of a keyboard instead of a mouse.
There are a lot of content developers who are getting carpal tunnel. So we built an application that makes it really, really easy to work in that kind of stressful production environment. So I will jump in and give you a quick demo of the Soundtrack Loop Utility so you can see this in action. Thank you.
Okay, so the first thing the Soundtrack Loop Utility asks me for is what files I want to work with. And as I mentioned before, you can work with single or multiple files. In this case, to start out, I'm just going to go ahead and select this whole list of files. I've got these different organs, and I can go ahead and preview them.
So in this case I want to work on this one jazz guitar riff. I've selected that, and now I have the ability to assign all the different parameters to it. Well, I know it's a loop, so I'm checking Loop. A one-shot would be an Apple Loops file that doesn't get stretched.
So if you have a cymbal hit or a spoken word, there are times when you don't want it to actually stretch like a loop, so you enable that as a one-shot. In this case it's a loop, and it's eight beats. The key, I happen to know, is D, so you assign that there, and it's also major.
Then you can assign the time signature; you can see there's some author and copyright information, and there's even a comment field, so you can make some internal notes, or if you just want to say "for more information, go to my website," that kind of thing. Then we have the ability to assign genre. We put in some really, really basic big buckets for genre.
We could all argue about the 30 different sub-genres of electronic music, so we basically kept it fairly broad (rock, blues, electronic, jazz) just to accommodate the largest genres. In this case that was a jazz loop and it was an electric guitar, so I click on Guitar and I have a whole bunch of different choices here. And then you can see we have all these different descriptors that we can assign. I'm actually going to go ahead and turn on keyboard tagging, because, again, this whole application can be driven from the keyboard. And I can listen to the loop.
So when I listen to that, it's a single instrument, it's not an ensemble, and I'm going ahead and I'm just typing keys. It's a part, not a fill. It's electric, it's dry, it's clean, it's kind of cheerful, and it's relaxed, and it grooves, and it's melodic. So really, really easy to just go ahead and assign those, and then I would just save out the file, and then all that information would be embedded, and we have an Apple Loop as a tagged AIF.
Another thing that a lot of content developers need to do is work with the content they already have, because they want to add all this additional rich data to their existing AIFF or ACID files. So we allow you to select, let's say, a whole bunch of ACIDized WAV files, and I've just batch-selected the whole thing.
As you can see, the key actually came in from the loops, as well as all the other copyright information. So it's very easy to say, okay, the scale type for these is minor, and it's an organ in this case. Then I'd go ahead and save that out, and I'd create a whole new set of AIFF files.
One thing I want to do is actually go back to this guitar riff and show you how transient markers are assigned. So, clicking on the Transients tab, I'm going to intentionally go to kind of a wrong setting here: I'm going to switch to transient markers happening at whole notes, and I'll play this loop through.
You can kind of hear there's some weird delay happening, and the more I stretch this loop, the more weird stuff happens in there. That's because the transients are not in the right place. So I want to make the transient divisions go to quarter notes, because it seems like there's more happening there, and it seems like it could maybe even benefit from eighth notes. That's too many. Let's go with quarter notes.
And then I have a sensitivity fader, which helps line things up. You can see the transients are now popping to the beginning of the major transient events, where I don't want the stretching to occur. So most of it is pretty automatic, and you can use the sensitivity to zero in on where you want to go.
And I can see this one's a little bit off, so I can move it. Maybe I want to add a point right here and drop in a couple. It's always good when you have a trail to add quarter notes in between. And we'll get one more in here and hear what that sounds like.
So basically it's playing a lot faster than its original tempo, which was this. But because these transient markers were properly defined, I now have a good-sounding loop that stretches well. So that's the Soundtrack Loop Utility in a nutshell, and we'll go back and wrap up with a couple more production tips.
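As a rough model of what the division setting does: the markers start out as an even grid derived from the tempo (one marker per quarter note, eighth note, and so on), and the sensitivity control then snaps them toward detected attacks. A sketch of just the grid part (names and behavior are my simplification, not the utility's actual algorithm):

```python
def transient_grid(num_beats, bpm, sample_rate=44100, division=1):
    """Evenly spaced transient-marker positions, in samples.
    division=1 places one marker per quarter-note beat, 2 per eighth, etc."""
    samples_per_marker = 60.0 / bpm * sample_rate / division
    return [int(round(k * samples_per_marker))
            for k in range(int(num_beats * division))]
```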
So when it comes to naming and directory structure, the thing that we really encourage: this is your opportunity to be more descriptive than you might be inclined to be. If you have a whole bunch of guitars and you name them Guitar 01, Guitar 02, you're going to have the same file names as thousands of other loops on the market. Down Home Delta Blues Guitar would be a much, much better name. The other thing is that the refined search field uses not only the file name but the whole directory path.
So if I had a folder called Xander Soren, and inside of that I had Delta Blues, or Strumming within that, I could type any of those words into the refined search field in Soundtrack, and those loops would come up. So that's, again, an opportunity to provide users with additional ways of finding your loops.
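The refined search behavior described here amounts to a case-insensitive substring match over the full path, which is easy to picture in code (a sketch of the described behavior, not Soundtrack's actual implementation):

```python
def refined_search(paths, term):
    """Keep loops whose file name or any enclosing folder contains `term`."""
    term = term.lower()
    return [p for p in paths if term in p.lower()]

loops = [
    "Xander Soren/Delta Blues/Strumming Guitar 01.aiff",
    "Loops/Electro Club Beat 01.aiff",
]
# "blues", "strumming", or "soren" all surface the first loop.
```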
So finally, you've created these loops using the Soundtrack Loop Utility. The best way of testing a loop to see if it works is to drop it back into Soundtrack. One of the first things you'll see right away is whether an instrument icon pops up; if it does, you know you're at least kind of going in the right direction. So checking the icon is the first telltale thing.
And then you probably want to make sure that the rhythm works with the other content that's in your project. So we recommend having a really, really straight and steady drumbeat, like an electronic drum. If you're finding the loop is either just not looping right or is drifting back and forth, it probably needs a little bit of attention.
And then finally, since you have the ability to assign key and then major and minor, it's very helpful to have, let's say, a major and a minor organ or a pad or some kind of a loop that you can drag in and then test your loops out against and see if they sound good.
And then the final thing to test against is the additional info field. In that Media Manager you saw, at the bottom there's a disclosure triangle that gives you even more information about the loop, and you'll see all the tag and copyright information within that. So then you'll know if the file received it all correctly.
So, getting started: you all look really anxious to get out there and start recording and creating your own Apple Loops. So we've developed an SDK for you. It includes the Soundtrack Loop Utility, documentation for that app, and a separate document that goes much more deeply into all the topics I talked about today. Again, we worked with content producers to learn what was important to them and what you can do to really optimize your loops. The SDK also includes a few different sample loops.
There are examples that you can go through, and each loop type has different ways or strategies of assigning transient markers to make sure it sounds really good. And all of that, I'm told, is going to be available by the end of the week on the Developer Connection site. So we hope you all go out there and start making some great Apple Loops for us.