WWDC03 • Session 710

Preprocessing Principles

QuickTime • 1:19:24

Preprocessing is widely considered the secret to how to make excellent web video. This session teaches you the general principles for how to pick appropriate cropping, scaling, noise reduction, and image adjustment parameters for optimal quality, whatever your data rate.

Speakers: Glenn Bulycz, Ben Waggoner

Unlisted on Apple Developer site

Transcript

This transcript was generated using Whisper and has known transcription errors. We are working on an improved version.

So because this is such a different year for us, mixing the QuickTime Live and the WWDC stuff together, before I essentially start the class I just want to get a sense from folks what angle we're coming from, so I can frantically rewrite my keynote presentation in 30 seconds.

So how many people are coming from the content world primarily? And from the programming, software engineering world? Okay, about 50-50? Perfect. Nothing like no audience overlap to make it all exciting. So for the folks coming from the content world: what do you want to learn today? What do you want to know about? Just shout it out.

Bringing video into the computer, so basically the capture process, how to go from tape into the computer. Good. Always a challenging issue, but getting better. Yeah. Analog to digital. OK. Great. All these things, and I haven't even run a slide up for you yet. It's good when I ask these things. Yeah.

Currency of the tools. Good. Yeah, so my plan is I've got about a 45-minute presentation, half an hour, 45 minutes of talking, and then I'm just going to do demos, and we'll let an audience vote decide what demos we're going to see. So make sure you have your favorite tool in mind, and we'll just take a look at what people are interested in, what scenarios you want to have us improve on. Keeping on my feet is always good, not knowing what I'm doing in advance. So, good. For the software folks, what do software people want to know from me? That was a frightening thought. Is anyone building software that involves preprocessing? Or have that in mind?

Great, OK. Sounds good. Yeah, so actually it was not in the syllabus, but we'll be looking at Compressor today, by the way, I should mention. I got my NFR copy of that at the last minute, so I haven't actually used it much yet. So we'll be learning Compressor together a little bit. But it's pretty easy to use so far. Actually, I compressed an entire file in it during an earlier presentation, thanks to the batteries.

Cool. So, sounds good. So, how are we on time? We've got three more minutes. I just got bored, so I started talking here. I was too nervous. I read all my email. What am I going to do up here? So, is anybody having big preprocessing headaches? Something that's just kicking your butt, or problems you're having that are causing pain? Yeah.

There are so many rules of thumb on this kind of stuff that you need a lot more thumbs than we each have. If you're trying it lots of times to see which works best, that means you're doing the right thing. I encoded my first file with MacroMind Director Accelerator in 1989. So this is my 14th anniversary of doing this stuff, and I still encode every file I do about three times. You just get a little bit closer every time. So it's always a good idea.

It's healthy to continue to experiment. If you're not sure what's the best way to do it, try it both ways and see what happens. One of the nice things about fast computers is you can just do a big batch and try a lot of different alternatives. The proof's ultimately in the pudding. All that really counts in the end is the pixels you get on the user screen. So whatever looks and sounds best is the right solution. Having that kind of exploratory approach is a great one, I think.

There's so many weird little things you can do. I'm not going to be talking about compression formats; I'm going to be talking about codec tips today. Primarily, everything that happens from, let me see, your capture file on your hard drive to what you actually hand off to the compressor. It's the middle part we're talking about today. And that's certainly enough for 90 minutes. But yeah, shall we? It's time to formally start now. So thanks for indulging me.

Oh, and definitely when you're asking questions from now on, because we are taping, make sure you come up to the microphones. But please feel free to ask questions during the presentation. All right, thanks for coming everybody. This is Preprocessing Principles and a special hello to our friends listening in Nihongo and on ADC TV or on the DVD.

Ben doesn't really need an introduction per se in the video world, or in the digital video world. So given his last five minutes of casual discussion, we're really happy to have him here. Many of us wish that there was a "do what I mean" button in the compression tools, and that everything would come out perfectly, and that we'd get bytes per second when we really were getting bits per second. So we hope that Ben offers some good guidance on how to get really good video even before it hits the compression tool. And I'll hand it over to you, Ben, and then we'll have a Q&A afterwards.

And I would ask that if when we do the Q&A, you do that with the microphone, either raise your hand and somebody will bring a mic to you or there's a mic in the center of the room. That way the question can be understood clearly and translated well. So thanks, Ben. Off to you. Great. Thanks.

All right. So I'm going to have my Steve moment. I have a remote mouse here. First time here, I've got Keynote and a remote mouse. I feel like a real boy for once. This is good. No more right arrow key in PowerPoint for me. No siree. So, that said. This is me. My name is Ben and I'm a codec nerd. So, yeah. So what we're talking about today is preprocessing. And we're going to look at a few tools, talk about the theory of it, try to bring this all together.

So we'll talk about some ideas and actually show you how to apply those things in some real-world projects. And the focus today, this is all meant to be useful stuff. If you're compressing video, I hope I can answer a question you've had, and it will save you some time and work. I'll be helping you get better results and lower your aspirin budget. That's always the goal. So we have an agenda.

What is it? Why does it matter? How do we do it? And primarily today we're going to talk about web delivery. DVD encoding is actually quite a bit easier. I'll mention it here and there. But the focus today is delivering on the web, on cell phones, all that kind of stuff. But feel free to ask specific DVD questions as they come up. Fortunately, 90% of the time DVD is pretty easy, with a couple of caveats I'll get to.

So, preprocessing. Preprocessing is pretty much everything that happens between your source frame of video and your compressor frame of video. So, you know, any codec, you've got your capture your stuff, it's in Blackmagic codec, or it's in DV, or it's in animation. You've got a bunch of rectangles to start with, and then in the end you're going to hand a bunch of rectangles off to the codec, which is going to make it your bitstream. So preprocessing is all about taking one rectangle and then making a better rectangle out of it for the codec. Rectangle to rectangle doesn't sound too hard, if only.

For me, I find that you generally know the bitrate you want in a project. You can kind of key that in, you know the right frame rate, a lot of that kind of stuff. Preprocessing is the part of compression that's the most artistic, the most crafty. It's the one I wind up spending the vast majority of my time on.

You know, I mean, even when you're doing a big project, you know the specs of it. You've got different kinds of source coming in. You often wind up tweaking each clip a bunch of times, and the filters and all that kind of stuff. And we're getting tools to make this better and better and more automated, which is nice. But, you know, it's always the part that I wind up sweating.

You know, it's all there, just knock it over a couple pixels here and there and do it over and over again. Fortunately, there's actually a new Compressor app I'm pretty excited about. It's got some pretty good features for actually seeing your effects in real time, which pays off a lot.

So, why does it matter? We're really trying to maximize what I call bang for the bit: compression efficiency. We're trying to turn every bit of bandwidth we have available into information that serves our communication goals. We're not compressing video for fun, we're compressing video to try to communicate. And by preprocessing, we're trying to give the codec the best input we can, so it can deliver the best possible end result: no distracting artifacts, as much information as possible. Every bit and pixel counts.

This is a little illustration here. This is actually courtesy of DreamWorks: they seeded me with the trailer for a movie called Biker Boyz, which you've probably never seen. I've never seen it, but the trailer is kind of cool. And people always do like cool movies like Terminator 2 for the demos.

And the problem with a good movie like that or Finding Nemo is you watch the trailer and get excited about it. By choosing a movie no one watched or cared about, you can actually look at the video frames and not be distracted by it being funny or anything. This is a typical interlaced frame. It's a little bit jumbly here just because of the scaling in Keynote, but you get the idea.

Typical frame, you pause on it, you see with any kind of motion you get those lines like that. And then if we preprocess the frame, it'll look like that. So not a huge difference like this, you know, a little bit. But when we compress these two, we get a pretty big difference here.

If I just export at 800 kilobits, MPEG-4 from QuickTime Player, we wind up with this. And I think you can see okay in the audience here, it looks pretty bad. All those sharp little lines really kill the codec. It's confused, it's messed up. Is it a mountain or a motorcycle? It's really hard to tell what's going on. Same frame, same data rate, preprocessed, and we wind up with a much clearer image.

So even if they don't look, even if before you compress it, you don't see that big a difference. When you actually compress it, it really pays off. By just getting rid of data that isn't there, by eliminating the noise, you're trying to maximize the signal. And there's a lot of different kinds of that. Also a couple things is, this frame was, what's going on? Okay, here we go.

Anyway, so let me just start off with the first steps here. I'm going to walk through typically the filter chain that most tools use, the order you're going to apply the filters in the tools, conceptually in the order I'm going to do. First thing, de-interlacing. Example I gave you there. Traditional video, you shoot in interlaced mode like most video cameras are going to do. All your even lines and all your odd lines are going to be captured half of a frame apart in duration. So in NTSC, how many people are coming from a PAL country?

Okay, so a 60th of a second apart in NTSC and a 50th of a second apart in PAL. I will try to give PAL examples. PAL is actually a much, much better source format for doing video with. You've got 25 frames a second progressive, you've got 576 lines. I mean, I've thought about moving to England and doing only PAL video compression many times. But here we are in NTSC and we'll make do. So just bear with us, PAL people. We suffer more than you know.

Okay, computer video obviously is just drawn top to bottom, progressive scan. So not normally a big deal. Various codecs and our DVD player apps can often deal with the interlacing and show it progressive on the screen, and it looks okay. However, I see this a lot: people export into a web video format and leave the two fields in there.

So basically you wind up with a frame where every other line doesn't match wherever there's motion. And that is just a total killer, because the things that really hurt codecs, that take a lot of bits to encode, are motion and detail. And a lot of horizontal lines where each pixel is completely different from the one before and after it is almost the most difficult thing you can imagine encoding. So when you have interlaced artifacts like that in a frame, all the bits get spent trying to draw those little horizontal lines that don't really matter. And those are bits that don't get spent on drawing, say, the face of the person in the frame.

Also you wind up with these double images of moving objects. You throw a baseball, you'll see two baseballs that are kind of translucent over the background, because they're getting blended together. It doesn't look good at all. So if you have progressive content, you're great, but if you have interlaced, you actually have to deal with it one fashion or another.

A lot of tools, traditionally it was called the basic method or whatever, it eliminates one of the two fields. Okay, even and odd lines don't match, throw away all the odd ones. You got even lines left. The big problem with that is you've thrown away half your image data. If you throw away half your lines, you had 480 lines in NTSC originally, you've got 240 left.

Okay, well, that's gonna be an issue. You're throwing away half of your image data before you started. The preferable method to use is what's called adaptive deinterlacing. Most tools support adaptive deinterlacing these days. They can call it different names, but the basic idea is find the parts of the images that move where you get the interlacing, deinterlace those, parts of the images that aren't moving where you don't have the artifacts, leave those alone, so you get higher resolution in those areas. You know, pretty smart little thing.
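The adaptive idea he's describing can be sketched in a few lines of Python. This is a toy illustration, not any shipping tool's algorithm; the function name, the 0-255 grayscale frames, and the fixed threshold are all my assumptions:

```python
def adaptive_deinterlace(frame, threshold=24):
    """Toy adaptive deinterlacer for a grayscale frame (rows of 0-255 ints).

    Where a bottom-field line disagrees with the average of its top-field
    neighbours by more than `threshold`, we call it motion and interpolate;
    where the fields agree, we keep the line for full vertical resolution.
    """
    out = [row[:] for row in frame]
    for y in range(1, len(frame) - 1, 2):            # bottom-field lines
        for x in range(len(frame[y])):
            neighbour_avg = (frame[y - 1][x] + frame[y + 1][x]) // 2
            if abs(frame[y][x] - neighbour_avg) > threshold:
                out[y][x] = neighbour_avg            # moving area: interpolate
            # else: static area, keep the original pixel
    return out
```

Real deinterlacers do this per block with motion estimation and smarter filters, but the trade-off is the one he describes: interpolate where there's motion, preserve detail where there isn't.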

And it pays off hugely. For slow-moving content, it almost doubles your resolution. You do have to watch out: some tools will just guess wrong as to what's moving and what's not, and you'll get little glitches here and there. 99% of the time it's not a big deal. It's almost always worth doing, just keep your eye out for places where it could have some problems, but it hugely pays off.

A special case of this is inverse telecine. How many folks deal with a lot of film content or 24p content? Like movie trailers, TV commercials, primetime dramas, all that? A few of you. Okay. So, when you're working with that kind of content, and I'll show you some samples of that in a few minutes here.

So, you take the 24p source. 24p gets slowed down 0.1%. So, it's 23.976, and then it gets transferred to the video running at 29.97. Math doesn't really matter. Basically, that effect is you take your 24 frames per second of video, and you spread it out on the 60 fields of video, and it doesn't really go very evenly.

So, you wind up with this pattern where the first frame of film becomes three fields of video, the next frame of film becomes two fields, then three, then two, then three, then two. Hence the phrase 3:2 pulldown.
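That cadence is easy to see if you write it out. A quick sketch (the names are mine) spreading four film frames across ten fields, then pairing fields back into video frames:

```python
def pulldown_32(film_frames):
    """Spread film frames across fields in the 3, 2, 3, 2, ... cadence."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

fields = pulldown_32(list("ABCD"))        # -> A A A B B C C C D D
# Pair fields into video frames: frames that mix two film frames are the
# ones that look interlaced when you step through in a player.
video_frames = [(fields[i], fields[i + 1]) for i in range(0, len(fields), 2)]
# -> (A,A) (A,B) (B,C) (C,C) (D,D): three progressive, two interlaced
```

Four film frames become five video frames, which is exactly the 24-to-30 ratio, and that three-progressive, two-interlaced pattern is what you look for when stepping frame by frame.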

And the way you detect this, and we'll look at this later on, is when you're going through a file, just go through frame by frame. In QuickTime Player, just use the right arrow key, go through, and you're going to see a pattern of three frames that are progressive, and then two frames that are interlaced, and it'll repeat.

So, just like find a long shot with some motion, key through there, like, okay, if you see, if all the frames are interlaced, it's interlaced video, deinterlaced. If you see the pattern of three frames that are progressive and two that are interlaced, you're going to do an inverse telecine and do a happy jig. That's actually a nice thing to have available.

So, basically, inverse telecine looks at the pattern, figures out which fields came from which original frames, and reassembles them. And that's really good for a couple reasons. One, we've lost no data, as we do with deinterlacing. You wind up with your full frames; you get your 480 lines back in NTSC. And also, you can actually restore the original frame rate. You're actually at 24 frames a second.

So, instead of having to encode, like, 30 frames a second, or even 60 if you're doing the fields, you can actually get the original 24 frames of video out of it. And that means for every frame that was shot in the camera, you're encoding a frame of video, but you're not spending any bits on frames that didn't really exist, these sort of phantom frames. It always bugs me when I see people encoding video who take film-shot stuff and deinterlace it, instead of doing an inverse telecine, and encode it at 15 frames a second.

And so you wind up with this really kind of jittery motion, because you have, like, two of the film frames, and then one will be missing, and then one, and then two. And what should be a really smooth experience winds up not being. By doing inverse telecine and delivering at 24 frames a second, you actually can deliver a smoother motion experience than television would have had of that same thing. Because even the 3:2 gives you a little bit of temporal anomaly there. So, like, a horizontal pan can actually look smoother and better in high data rate web video.

It can even look smoother than it would have on television. You PAL guys are lucky, because none of this garbage happens for you. The way that film gets transferred to PAL is that the 24 frames a second get sped up 4% to make 25 frames a second, and it's transferred progressive. So PAL film source becomes 25 frames a second progressive, and you just have to turn off deinterlacing and you're good to go. It's a wonderful thing. So many problems go away that way.

I hope you appreciate how fortunate you are. But of course it's all because you have 50 hertz power, which means your electricity kills you more easily. So I guess there's a body count associated with it. Yeah. It's amazing that things work out that way. You have more electrocution deaths, but you don't have to do inverse telecine.

Okay. Any questions so far? If you have questions, just come to the microphone and ask. Right there, thanks. I'm trying to be a hard-ass on the microphone thing and see if that works. So you would either, if it was shot on... OK, so the three scenarios.

One, it was shot interlaced on video: just do an adaptive deinterlace. Scenario two, it was shot progressive: do nothing at all. Scenario three, it was shot on film or a 24p camera and telecine was applied: then you'll apply an inverse telecine. So which of those three is it?

It's pretty easy to tell; just go through frame by frame. If you find five frames of video with motion and step through those, that's all you need to see to know what to do. We'll look at some samples later on showing what that looks like. But it's pretty straightforward.

And of course, 24p cameras are becoming more popular. Final Cut 4 obviously has support for that. So it's actually kind of nice: you can just pull in 24p and work 24p natively and not even worry about this stuff. Okay. Is there another question over there? Or are you just wandering in? That's fine. We're good. So, cropping. We don't see all the edges of a TV signal on television.

There's stuff around the edges. It's part of the video signal we don't see. That's fine. Computers, of course, get every last pixel. You're kind of annoyed if you bought your LCD monitor and there was stuff you couldn't see on it. The menu bar doesn't show up. People get all peevish.

So, the problem is stuff that's part of the video signal that is fine on television, and fine on DVD because DVD has the same safe area, but that you don't want to be seeing on the computer. Typically little horizontal lines at the top, or edge blanking, that kind of stuff. A little bit of letterboxing. You want to take that out, for a couple reasons.

One is, I mentioned sharp lines are hard to encode. A letterbox matte is a sharp line. And if you've ever seen a DVD or low data rate video with any kind of letterboxing, you'll often see a kind of shimmer below where the black bar starts, compared to other parts of the video.

That's because it didn't fall on a macroblock boundary. Details aren't important there, but that line, even though it's just a big area of black, winds up messing up a lot of codecs, especially MPEG-2 and MPEG-4. So, you want to crop out everything that you can. The other thing is, you want to make sure that the edge of the video is video. I like to go a couple pixels in from the edge, just to make sure nothing's in there. So every pixel is something you want to see on the screen when you're going to computer video.

For DVD, you don't want to crop this way, because the stuff at the edges stays in the safe area and doesn't really matter there. But on DVD, you do want to mat out the edges. So if there's some kind of noise up there at the top, draw a black area over it; just blank it out.

You don't want a signal in there. And here's a trick for you DVD guys. If you mat along the edges, you're going to get a little bit of black. If you mat along 8x8, or especially 16x16 blocks, that will align with the macroblock structure of MPEG-2.

And if you align that, you won't get that weird distortion effect at the edge of the letterbox. If you're doing letterboxing, try to make sure it's aligned with the 16x16 blocks of the video. So, if you start at pixel zero, mat from 0 to 15, or 0 to 31, or 0 to 47, in those increments, and it'll be a much cleaner effect. We'll do that later on.
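The rounding he's describing is just ceiling-to-a-multiple. A one-line sketch (the helper name is mine):

```python
def mat_height(raw_lines, block=16):
    """Round a letterbox mat up to the next macroblock multiple."""
    return -(-raw_lines // block) * block    # ceiling division

# A 60-line black bar grows to 64 lines, so the matte edge lands exactly
# on the 16x16 macroblock grid instead of cutting through a block.
```

The same rounding applies to side mats; anything that keeps the black-to-picture boundary on the macroblock grid avoids the edge distortion.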

Actually, Compressor makes this really nice. It's got a letterboxing and a cropping filter, so you can knock stuff out. Okay. And we'll look at some samples later on, which are pretty bad. It's okay to cut out some of the noise. You don't have to crop an even amount from top and bottom. VHS especially often has a lot of junk at the bottom but is okay at the top. Grab the rectangle that's the right rectangle. Don't worry about being centered or not.

And another technique: if you're going to a really low resolution, like below 320x240, especially on phones, because video is composed aiming for the safe area, no one's going to stick critical information into the edges of the screen. There's about a 10% boundary around the video. There could be stuff to look at there, but nothing important is going to be there. So on a very low resolution device...

You can crop into that area a fair amount and make your foreground objects larger. Like if it's an interview piece, the head will be bigger. You won't see the set behind them, but you don't care. You can read the lips better, because you get more pixels on what matters.

This is kind of an After Effects screenshot here showing where the action safe and the title safe areas are. Imagine you're encoding this video: by cropping in to title safe, his face will be a lot bigger than if you go all the way out to the edges.

It's a little bit dark in here, but there's some black edges around the screen that you probably can't see in that display there. They're over here. They're pretty cool. I have a much more frightening example of bad VHS source later on to look at. That's Oboe Addy, the drummer, by the way. Anyone from Portland knows who he is.

Scaling. Scaling is basically changing the shape of the rectangle. By deinterlacing, we basically, one way or the other, made ourselves a progressive source rectangle. By cropping, we defined what subset of that source rectangle we want to process. Then scaling says what output rectangle we want. Your tool is going to take your cropped area and scale it into your output rectangle. That's all scaling is.

We do it in Photoshop with Image Size all the time. Pretty straightforward operation. But there are a couple subtleties that matter a lot. One thing that has been biting lots of people lately is the whole non-square pixel thing. Pretty much all the production standards of standard definition are non-square pixel. With your 720-wide PAL or NTSC, 4:3 or 16:9, the pixels in your video signal are not square. So, NTSC is 720x480, PAL is 720x576.

And that's the resolution whether it's 4:3 or 16:9; it's the same one. And when you play it back on TV, it shows as 4:3 or 16:9. It looks right because the TV knows that. But I see a lot of this on the web: someone takes a 720x480 file and goes, oh, I'm going to compress it down.

Okay, I'll cut it in half. I'll make it 360x240. And you can pretty much say right now: no video ever has a 3:2 aspect ratio, which is what that is. If you find yourself encoding to a web format at 360x240, and I'm sure everyone's done it at least once, you're doing it wrong.

That's your clue. You want to make sure you're matching the aspect ratio of your source. If you have a 4:3 source, after cropping, you want to be delivering at a 4:3 resolution with a web codec. 16:9, same thing. So I'll give you an example with 360x240 and 720x480; a lot of numbers here. It doesn't work. If you're 4:3, 320x240 is a 4:3 aspect ratio. That'll be right. Your circles will be circles, not ovals. Okay.

And the real problem with 360x240 is it makes everyone 10% fatter. And if you're working with actors, you're just getting in big trouble that way. Just know that doesn't go down well. For 16:9, 432x240 is a matching number.
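The arithmetic behind those numbers: pick an output height, multiply by the display aspect ratio, and round the width up to a codec-friendly multiple of 16. A sketch; the function name and the mod-16 rounding choice are my assumptions:

```python
import math
from fractions import Fraction

def square_pixel_width(aspect_w, aspect_h, out_h, multiple=16):
    """Square-pixel output width for a display aspect ratio, rounded up
    to a macroblock-friendly multiple."""
    exact = Fraction(aspect_w, aspect_h) * out_h
    return math.ceil(exact / multiple) * multiple

print(square_pixel_width(4, 3, 240))    # 320  -> 320x240 for 4:3
print(square_pixel_width(16, 9, 240))   # 432  -> 432x240 for 16:9
```

Note that 432 is slightly wider than a true 16:9 at 240 lines (426.7); rounding up to the macroblock grid is the usual compromise.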

So you want to make sure your output aspect ratio, your output frame size, matches the aspect ratio of the cropped source. If you have a letterboxed movie, like a 1.85:1 or 2.35:1, and you're cropping out all the letterboxing, you can get quite wide. If you have a CinemaScope kind of thing, you might wind up at 640x272 and crazy sizes like that.

And that's great because, in terms of your bandwidth requirements, your processor requirements, all that kind of stuff, it's basically the total number of pixels per second you have to worry about. So if you can do 640x272 instead of, say, 400x300, that looks great, because 640 feels full screen. And the fact that it's short just looks like you're doing cool letterboxing.

You're able to give the user what feels like a real full screen experience while you're actually not using all that many pixels. That's great. You know, you want to, again, it's all about bang for the bit. If you can achieve a good communication experience by using different aspect ratio, wonderful.

For web production, I always encourage people to think about shooting 16:9 or whatever. I always joke that I want to do a rock climbing video and shoot 16:9 sideways. Because with web formats, you can deliver that if you want to. Although if you're using Indeo 3.2, by the way, it will crash if you encode in portrait mode. But every other codec works fine.

I'm a codec nerd, what can I say? Kids today don't know these things. They don't know their heritage. Okay, another issue is scaling algorithms. Most tools today use bicubic, but some of the older tools have other scaling modes. This matters a lot. Make sure you're not using something that uses an old nearest-neighbor mode.

Premiere 5 had one where if you didn't manually turn on a certain checkbox, the scaling quality was terrible. So just be warned about that. Most tools are getting pretty good at this these days. Okay, so the big goals for scaling: you want to make sure you wind up in the right aspect ratio for your output.

Also, we want to make sure we're trying to not scale up. And scaling and cropping have an interesting relationship. Typically, the higher your output resolution, the less you're going to crop. I gave the example of doing a really small resolution. You want to do a very aggressive crop to try to get the foreground objects to be bigger.

But you never want to scale up. So just a little bit of math here. If you're cropping 10% and you're doing a straight deinterlace, or it's video with high motion like sports, where the adaptive deinterlace doesn't do much good, you can wind up with only 216 lines of source. So even at 320x240, even though you're scaling down horizontally, you're scaling up vertically. And you want to reduce or eliminate any scaling up, because that always introduces artifacts.

You always want to shrink down, or not shrink at all if possible. So if you're going to 320x240 or higher in NTSC, you want to crop as little as possible. PAL users, 384x288 is the resolution where that really matters for you; you get a little more flexibility with your higher source resolution. And again, using adaptive deinterlacing or inverse telecine gives you effectively a lot more source pixels, so it's less of an issue, especially the inverse telecine when that's available.
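You can sanity-check whether a crop-plus-deinterlace is secretly scaling up by working the line counts directly. A throwaway helper of mine, not from any tool:

```python
def effective_source_lines(src_lines=480, crop_fraction=0.10,
                           straight_deinterlace=True):
    """Lines of real picture information left after cropping and,
    optionally, throwing away one field."""
    lines = src_lines * (1 - crop_fraction)
    if straight_deinterlace:        # discarding a field halves vertical detail
        lines /= 2
    return int(lines)

# Crop 10% off 480-line NTSC and discard a field: 216 real lines remain,
# so a 240-line output is already scaling UP vertically.
print(effective_source_lines())                             # 216
print(effective_source_lines(straight_deinterlace=False))   # 432
```

Adaptive deinterlacing and inverse telecine keep the field you'd otherwise throw away, which is why they leave you so much more room before any upscaling kicks in.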

OK. Luma adjustment, and we'll look at this for a while. How many people have been doing video compression for more than two years? OK. So Apple's made luma work well now. It's been very confusing, because things used to work really weird, and we're kind of in a transitional period where not every tool has got the whole story of luma processing right. So luma basically is brightness. Typically, in the computer world we think of black as being 0 and white as being 255.

But in the video world, black is actually 16, and white is actually 235. And it used to be that codecs didn't do the right thing for you, so you had to manually add contrast when going from a video format to a web format in order to make blacks black and whites white. And so we've got these rules of thumb. How many people here automatically add plus-27 contrast when doing video compression?

A couple of you there? OK. Good thing is you don't have to do that anymore. But they never really told you that. So as long as you're using codecs that can properly deal with color mapping, and now you're going from, say, Apple motion JPEG or DV to MPEG-4, for example, they will automatically, behind the scenes, do all the color space conversion for you. So you don't want to add that extra contrast. You want to add the contrast you need to make the image look good, but you're not going to add any extra.
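The level mapping itself is a simple linear stretch, which is why it's so easy for a tool (or a manual contrast tweak) to apply it twice. A sketch of the studio-to-full-range conversion; the helper name is mine:

```python
def video_to_full_range(y):
    """Expand studio-range luma (16-235) to full range (0-255), clamped."""
    expanded = (y - 16) * 255 / (235 - 16)
    return min(255, max(0, round(expanded)))

print(video_to_full_range(16))    # 0:   video black -> computer black
print(video_to_full_range(235))   # 255: video white -> computer white
```

Apply that mapping once and blacks are black; apply it twice, once by the codec and once by a manual contrast boost, and you crush the shadows and blow out the highlights, which is the failure mode being warned about here.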

Some codecs aren't quite doing that now, so this is kind of a messy, weird era right now. But for the most part, things are getting better. You don't have to do that a lot. So I mainly mention this because, for the people who have automatically been adding contrast and also doing gamma correction for a long time, that's less important now. Another issue is the whole gamma thing. A Mac will show the RGB value of 127, which is the middle of the range, as a lot brighter than a Windows machine will.

But the good thing is new codecs, especially MPEG-4, DV, MotionJPEG, will actually compensate for that. They will actually figure out the local gamma and will draw the image right on Macs and PCs. I've actually seen a lot of people who had the long struggle to learn how to raise the gamma to make Windows video look right, and they finally got over that hump.

And then now, if you do that with a modern codec, QuickTime will do it for you again on playback, and you'll wind up with it totally way too bright on Windows. So, unfortunately, the only real solution is: test your codecs, encode with them, look at it before and after, and make sure you're actually using the right luma processing mode for that.

Most modern tools, you know, Compressor, Cleaner 6, get this right. Cleaner 5, yeah. That's one of the things: you take a setting from Cleaner 5 that worked perfectly, you take it into Cleaner 6, it's going to fail completely in terms of luma processing, because they changed the internal model.

Which, you know, wasn't really in the manual. It's kind of an important little aside there that, oh, by the way, how we process video is completely different now. It's better now. It's faster, higher quality, and all that kind of stuff. But it changes a lot. There's also a creative element to luma. You know, one of my creative goals is, for any frame that's going to be black, with black titles or credits or whatever, I want to have the black pixels all be actually mathematically black.

Because on a TV, you know, you've got a little bit of noise in the black, it kind of goes down pretty easy. TVs are pretty lousy display devices. But on a computer screen, where you actually have big, flat rectangles of black to compare it to, if you have a little bit of analog noise in there, so your black's a little too bright, it just looks terrible.

And also, it's hard to compress, because you've just got a lot of stuff going on there. Whereas a big 16 by 16 block of the number zero, over and over again, is easy to compress. That block will not take many bits.

The bits you don't spend on that get spent on your actual content, the stuff you're trying to communicate. In the case of text, that actually matters a lot, because sharp little small-point text can be hard to encode. Making the rest of the frame as simple as possible, making everything really mathematically black, will make your text look better with fewer artifacts.

So my typical

[Transcript missing]

Contrast, this is, like I said, plus 27 used to be the rule of thumb. I've been quoted in a thousand articles I wrote saying, the key to making great web video is to add contrast plus 27. And now I'm here to say, well, that was true then, but now it's not anymore.

So it'll be a different rule tomorrow. But I mean, thankfully it's all getting fixed, but it's awfully confusing right now. Contrast is also, if you've got bad analog sources, useful as a video filter; contrast can often help. It's getting your whites white and your blacks black, but leaving the middle of the range relatively untouched.

Okay, gamma. Gamma has been a big source of pain for years, as I mentioned before. So gamma basically is kind of the inverse of contrast. Contrast affects the extremes; the closer to the middle, the less effect it has. Gamma is the reverse. Gamma controls the brightness in the middle of the range, but it leaves black and white alone.

And gamma is a useful filter in its own right. And if you've got a video that seems a little bit dark, raising brightness is not a good way to make a video look brighter because it's going to raise your black floor and you're going to have washed out blacks.

Raising gamma is how you make a video look brighter, because it makes your midtones brighter, which is what you care about, but leaves your black and your white alone, assuming they were already right. So my typical workflow is, if a tool does it in the right order, which is brightness, contrast, gamma, I'll get my brightness and contrast right first, get them nailed, so my black point and my white point are right.

Then I'll use the gamma filter to get the midtones looking right. Some tools do gamma first. That makes it a lot more difficult because things are in the wrong order then. So I mentioned the platform difference before. The good news is now with modern codecs, it's kind of going away, automatically corrected for it, but not in all codecs. So, ugly.
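Here's a sketch of that brightness-then-contrast-then-gamma ordering, with toy formulas of my own choosing just to show which part of the range each control touches (not any particular tool's math):

```python
def adjust_luma(y, brightness=0.0, contrast=1.0, gamma=1.0):
    """Apply adjustments in the order the speaker prefers:
    brightness, then contrast, then gamma."""
    y = y + brightness                         # shifts everything, including the black floor
    y = (y - 128.0) * contrast + 128.0         # stretches the extremes around mid-gray
    y = max(0.0, min(255.0, y))                # clamp to legal range
    y = 255.0 * (y / 255.0) ** (1.0 / gamma)   # reshapes midtones; 0 and 255 stay put
    return round(y)

# Raising gamma brightens the midtones without washing out black:
print(adjust_luma(128, gamma=1.2))    # > 128
print(adjust_luma(0, gamma=1.2))      # still 0
# Raising brightness instead lifts the black floor -- the washed-out look:
print(adjust_luma(0, brightness=10))  # 10
```

That last pair is the whole argument: gamma fixes a dark-looking clip, brightness just ruins your blacks.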

Okay, so luma processing is kind of complex. Any questions so far on the image processing kind of stuff? I will show you some samples of this later on. Hopefully I can contextualize it a little bit more, but everyone with me so far? Or too shy to admit it otherwise?

Okay. This is a little bit confusing, and we will look at some samples of this. Noise reduction. Noise reduction is, you know, filters to try to remove noise. I mentioned before, compression is all about trying to communicate, and we're trying to, you know, we're trying to suck an elephant through a swizzle stick here.

We don't have enough bandwidth to do a good job, you know. I would say one of the goals of compression is you're trying to achieve balanced mediocrity. You never have enough bits to actually do it perfect, but you're trying to find the right balance of imperfect that gives you the best results for what you're trying to accomplish. Balanced mediocrity. Noise reduction is one of our tools to achieve that, appropriate mediocrity. We're trying to, like, filter out stuff that's noise, especially like video grain.

Video grain is a real killer because it's different in every frame. It's basically a random pattern of distortions of the video that changes every frame. So it looks like motion. The codecs can go crazy trying to make sure they draw this little pixel here that moved over there, but it wasn't the same pixel. It's just noise. And codecs internally try to filter some of it out, but it often helps to do some to begin with.

All noise reduction algorithms will cause some blurring. The best ones do a good job of only blurring things you want blurred, leaving things that are actually signal alone. Some things that are called noise filters are just blurs, just a Gaussian blur. And really anything more than a very subtle Gaussian blur winds up hurting a lot more than it helps.

A few tools do what's called temporal processing, which actually compares the frames and does some stuff there. In general, I say be very wary of noise reduction. If you have clean source, no noise reduction will typically be required. Really it's where you have analog source that has problems where noise reduction can pay off.
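A minimal illustration of the temporal idea, on flat lists of luma values (a toy of my own, nothing like a shipping filter): where a pixel barely changed between frames, assume the change is grain and average it away; where it changed a lot, assume real motion and leave it alone.

```python
def temporal_denoise(prev, cur, threshold=8, blend=0.5):
    """Per-pixel temporal filter on two consecutive frames.

    Small frame-to-frame differences are treated as grain and
    averaged toward the previous frame; large differences are
    treated as real motion and passed through untouched.
    """
    out = []
    for p, c in zip(prev, cur):
        if abs(c - p) <= threshold:
            out.append(round(p * blend + c * (1 - blend)))
        else:
            out.append(c)
    return out

prev = [100, 100, 100, 200]
cur  = [103,  98, 100,  40]   # small jitter on the first three, a real change on the last
print(temporal_denoise(prev, cur))  # jitter averaged toward 100, the 40 kept
```

The threshold is exactly where these filters go wrong: set it too high and real motion gets smeared, which is why the speaker says to be wary.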

And the uglier, more horrible your source, the more it can help. The problem is, the uglier, more horrible your source is, it'll never look good. You're just doing shades of terrible, unfortunately. So I've got a sample later. I have the infamous ugly.mov VHS source file we'll look at in a little bit. And you're going to see me do a very happy jig about Compressor, because Compressor has introduced the feature I've wanted for so long, which is chroma-channel-only noise reduction, which is a good thing.

Okay, let's talk a little bit about audio preprocessing. Typically, audio comes in cleaner, so you're going to wind up doing less work with audio. And also, computer audio versus analog audio is a lot less different. Typically, there are a couple filters we're going to do. We're going to do audio normalization, which is: find the loudest point in the whole audio, then raise or lower the volume of it overall, so your peak is a little bit below the maximum available.

Typically, minus 3 dB is a good one for most digital formats. We don't go all the way to the maximum possible in 16-bit audio space, because some codecs do some rounding things and you can wind up with digital peaks that way. So you need to leave a little bit of headroom before you go into the codec. But it's just a good idea to have that be pretty consistent. Most things like system beeps on the computer are going to be about minus 3 dB. And it's really important to have the loud things in your video match the other loud things the computer does.

Because if you don't do that, what happens is you have a quiet video. And the user turns the volume up and turns the volume up and leans over and it's kind of hissy and turns it up some more. And then finally hears the video and kind of listens to it.

And then they get mail. Mail shows up. And this thunderous bing sound happens and deafens their cats. And there's a lawsuit and everyone's sad. So you don't want people to have deaf cats calling you up and yelling at you. Just make it loud. You don't have to do a whole lot of crazy compressor limiter kind of stuff. But minus 3 dB for your peak and everyone will be happy. Yeah. Do the microphone. Yes, I'm being a hard ass about the microphone. Sorry. Thanks for reminding me to be a hard ass.
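The minus 3 dB normalization works out like this (samples assumed as floats in -1.0 to 1.0; a sketch, not any tool's implementation):

```python
def normalize(samples, target_db=-3.0):
    """Scale so the loudest peak lands at target_db dBFS,
    leaving a little headroom below digital full scale."""
    peak = max(abs(s) for s in samples)
    gain = (10 ** (target_db / 20.0)) / peak   # -3 dBFS is about 0.708 of full scale
    return [s * gain for s in samples]

out = normalize([0.02, -0.25, 0.5, -0.1])
print(round(max(abs(s) for s in out), 3))   # 0.708
```

One pass over the file to find the peak, one multiply per sample; that's all normalization is.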

18, 20 dB, if you need to normalize up that much, won't raising the noise floor play havoc with the audio side of the codec? Well, sucky audio is going to remain equivalently sucky however loud it gets. Yeah, if you've got audio that's really quiet with a high noise floor, you've got bigger problems. Yeah, go get Pro Tools and spend some time with it. Well, I don't know, fire your sound guy, I don't know.

Yeah, that's going to be a problem. Codecs don't deal well with noise. They try to filter it out. Speech codecs, actually, like AMR or Pure Voice, are so tuned to voice, they can actually filter out some kinds of noise, which is actually kind of cool. I've a couple times actually used the Qualcomm Pure Voice codec as a noise removal feature, because there was speech.

I just didn't know how to encode, and it wasn't speech, so it just left out the noise. But yeah, if you've just got a high... Yeah, it's just a problem. It'd be a problem with what you were going to do. You know, it's not... Nothing particularly about compression makes it a dramatically worse problem than you already had.

Other audio processing. You know, a compressor/limiter is good. Typically, in most computer environments, if it's properly well-mixed audio, you're not going to need to do that kind of stuff. I mean, most computer speakers are pretty good. You know, you don't want to mix it for some kind of crazy AM radio thing, because then for people who have good speakers it sounds bizarre. 3GPP phones, obviously, those are pretty small speakers, or maybe headphones. If you know someone's going to watch on a phone, you probably do want to do some compressor/limiter stuff.

You know, typical things, notch filters, noise removal, if there's that kind of stuff, you know, it's what you would do to clean up audio that you were going to do anyways, you would apply to compression. It's somewhat more useful in compression just because it'll encode a little bit better, but still, it's, you know, if something sounded too bad to listen to before you're going to compress, it's not going to get any better for the most part, but typically won't get a lot worse.

[Transcript missing]

Before you go to the demo, you were talking about preprocessing. Could you suggest something about preprocessing in a live situation? What do you do there? Because here you are using Cleaner. Yeah, preprocessing in a live situation. Well, you're not really, I mean, yeah, you're processing in that case. You can't really do it pre, but yeah.

In the sense that before the encoder, right? Yeah, I mean, a lot of it's going to, well, definitely one thing: if you're shooting something live, you want to shoot in progressive mode, so you don't have any interlacing you need to deal with. I mean, one issue you can kind of see, sometimes if you're shooting with a DV camera, and, you know, there's black bars on the left and right, that's part of the video signal. Hopefully, you're using a software tool that lets you crop the video signal.

If it lets you do that, you'll crop out to the, you know, to the active image area. If you have a tool that doesn't do that, you cry a little cry, I guess. And move on, but, yeah, I mean, you know, shoot, when you're doing it live, you have the flexibility that you can actually just get a bunch of stuff right, you can control the whole experience.

The hard part of preprocessing is when you get stuff you didn't author that someone shot 20 years ago before they even knew about compression, and then you have to try to make it work. To the extent, you can control the entire process, you can make it work really well. So, yeah, I mean, shoot progressive, use a tool that supports native cropping, you know, do good live audio production like you normally would, it's going to be just fine for that.

It should be okay. Yeah, it's not, the hard parts for doing compression for live is actually a lot of things like camera motion. I mean, you know, you don't want to shoot, you know, handheld and you get all kinds of blockiness, you know, and really just trying to control the production environment is really where it pays off the most. So, question? Yeah, can we go back to, just real quick.

So you start out 720 by 480 for the say, DV video, and want to serve it on the web somewhere in the 320 by 240. It doesn't sound like you're a big 4 by 3 fan always. Well, that's what your source is. I mean, if you're producing, if you're shooting your own stuff, you can do whatever you want to. If it's 4 by 3, deliver us 4 by 3 for sure.

Let's say if it's your own stuff. So it sounds like what you're saying, if I understood you correctly, is go ahead and crop to get rid of any stuff on the edges there. And then when you scale down, what proportion, if you're serving on the web, if it's your project, what proportion do you think looks good and do you like a lot? In actual pixels. What kind of bandwidth are you looking at?

like 200 kilobit? Yeah, yeah. I mean 320 by 240 is kind of my good default. I mean that's what I start with. I'll encode it. I'll look at it. If we shot on progressive, you have a lot more stuff to work with. So you might bump it up a little bit or bump it down a little bit to be on the format. But 320 by 240 is generally a good starting point.

So you would crop to 640 by 480 and then... Ah, I know. 720 by 480 anamorphic. So if you have a clean aperture 720 by 480 going to 320 by 240, you don't crop at all. You shrink 720 to 320 and 480 to 240. Yeah, I see people getting in trouble with that. They crop 64 pixels left and right. Yeah, yeah.

Right. Because you're going from a non-square pixel environment to a square pixel environment. You're actually correcting for that by just pure scale and not a crop. So you're going to do cropping to get rid of the stuff, but you're not going to crop correct for the aspect ratio. Great, great.

Okay, there we go. I'll save you hours right there. There we go. This whole WWDC just paid for itself right there, hopefully. So any more questions before I move on to demo land? All right, let's... Can we get the They told me to make sure I have my energy saver off. So that's my own darn fault. There we go.

Does that look good? Are we at 1024? Okay. So in the interest of being a show-off, I'm going to let you guys tell me what you want to see done, and we're going to do it. So we've got Compressor, we've got Squeeze, we've got Cleaner, we've got Premiere, we've got After Effects. Kind of the tools I'll often use for preprocessing.

I've got Film Source stuff, I've got PAL, I've got NTSC Source stuff, I've got ugly-looking VHS stuff. So we've got time for a couple different ones. So someone who's got a scenario that's kicking their butt, what do you want to see? You want to see how to make an Apple movie trailer? Like the QuickTime.com movie trailers? Is that a good one? All right. Okay, you want to see it in After Effects, you want to see it in Cleaner, you want to see it in Squeeze, what do you want to see it in? Cleaner, you got it.

Well, that's a clear goal there. OK. So let me show you-- do you want to see the Biker Boys trailer I did before, or do you want to see the Sinbad movie trailer? I know. It's horrible. We'll do the Biker Boys one. It's already had the output. So is that? I don't have that. I don't work for Apple. I know. I got DreamWorks. I don't work for them either, but they give me stuff.

[Transcript missing]

So who doesn't have QuickTime Player Pro?

Anyone? QuickTime Player Pro is the best $30 you're going to spend a year. Just think of it as a cheap magazine subscription. QuickTime Player Pro pays for itself in eight hours. So you're going to see lots of cool QuickTime Player Pro tricks here. Which isn't the purpose of this class, but it's like a quarter of my lifestyle. So this is a source file here. This is a typical Motion JPEG source file. It's 486 lines, not 480, like what you get off of CineWave or Kona cards, something like that.

But it's motion JPEG and not uncompressed because I only have a 40 gig hard drive. So a couple things here. So it's a preview. You can see we've got some blanking here on the left and right, and we're letterboxed. That's a 1.85 to 1 aspect ratio. Let me just find a good scene with some motion in it.

I mentioned before the pattern. Let me just find the start of the shot here. Blah, blah, blah, blah, blah, cut. There we go. Okay. So starting off here, I'm going to go frame by frame. So you've got progressive, interlaced, interlaced, progressive, progressive, progressive, interlaced, interlaced, progressive, progressive, progressive, interlaced, interlaced. So we see we've got the typical pattern of three progressive frames followed by two interlaced frames. We know it's telecine. So this is an inverse telecine project. So pretty easy. Good.
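That frame-by-frame cadence check can be sketched as a tiny detector. Here `comb_flags` is one boolean per frame, True where a frame shows interlace combing; this is my own toy, not how any real inverse telecine tool works:

```python
def looks_like_telecine(comb_flags):
    """True if the flags match a 3:2 pulldown cadence at any phase.

    3:2 pulldown produces a repeating 5-frame pattern of
    3 progressive frames followed by 2 interlaced ones.
    """
    n = len(comb_flags)
    if n < 5:
        return False
    for offset in range(5):   # the cadence can start at any of 5 phases
        pattern = [(i + offset) % 5 >= 3 for i in range(n)]
        if pattern == list(comb_flags):
            return True
    return False

# The cadence the speaker steps through frame by frame:
flags = [False, False, False, True, True] * 3
print(looks_like_telecine(flags))        # True
print(looks_like_telecine([True] * 15))  # False: combing everywhere is plain interlace
```

As he says, five frames is all you need to identify the right deinterlacing mode.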

It's really as easy as that. You only need five seconds. I mean, you only need five frames of a video to find out what the right de-interlacing mode for it is. And that pays off so much. A couple of things looking at this thing. It's moderately noisy. Well, no, you can't really see it very well. There's some noise in the heel here.

This is pretty clean; the noise in here is from film, honestly. Here's a very interlaced frame right there. That's what happens when you have a frame that falls in the middle of a hard cut. Because when you go through the pulldown, you might wind up with the two different fields being from two different frames entirely. So here, that isn't a transparency effect or something like that. That isn't a cross-dissolve.

That's just two different frames on either side of a hard cut that is part of the telecine process wound up as part of the same frame even though they weren't originally. So that would be, that's a very hard frame to compress. So we definitely want to get rid of that kind of stuff. Okay. So that's the source. Cleaner is a pretty slick tool for pre-processing for a lot of this kind of stuff.

The biggest problem right now is the inverse telecine algorithm has been broken since version 5.0.2. It got a little bit better in version 6, but you can still find some cases where it'll reverse the order of frames around a cut and that kind of stuff. So you've got to carefully QA inverse telecine stuff done from Cleaner. But it's definitely kind of the easiest tool to visualize this stuff in.

I actually keep a copy of Cleaner 4 on an old G4 lying around just to do inverse telecine-y stuff sometimes, when I find things that expose the bug. So, and Cleaner has this great thing called the Project Window, which we'll do here.

[Transcript missing]

But, you know, Cleaner is smart enough to go, okay, it's actually four by three.

So it's going to compress it on the horizontal. So you see the current aspect ratio. First thing I'm going to do is run a crop. So I mentioned before, this is a 1.85 to 1 aspect ratio. Just look at these things a lot if you kind of learn. Whoops. Yeah, there's a bunch of numbers.

Most films are going to be 1.85 to 1 or 2.1 to 1 or 2.35 to 1. You have a question? Microphone.

[Transcript missing]

I can basically, it'll draw me a box and it'll constrain to that aspect ratio. And you can kind of do it here and kind of center it.

It's a really nice cropping dialog. I always like this feature of Cleaner. You're able to go in there and you can kind of scan through and make sure you're not getting out there. You know, and, yeah, that's about right. And getting a little bit of the imagery off is not a big deal.

And also, you can just do it unconstrained. Rule of thumb, you know, and if I just grab it like this, it might get a little bit distorted. Rule of thumb is that any aspect ratio distortion less than 5% is not going to be noticeable by the end users.

So, you know, don't worry about getting exactly 1.85 to 1. If you crop 1.81 to 1 and you scale it to that, you know, a few pixels off in terms of scaling is not going to be noticeable in terms of looking distorted. So, as long as you keep it under 5%, it's going to look generally pretty good. Okay, so that's what we'll do here. It's an interlaced video. Okay. So, let's make up a setting here. Let's come up with a scenario. Oh, wait, we're doing Apple-style web video stuff. So. Okay. Let me make a new setting here. Ben fakes Apple trailer.

QuickTime, compress, skip all that stuff. So, it's cropped. So, with the high-resolution trailers, sometimes they do the big ones up to 640 by 480. You can specify this thing so it's the same as cropped source here, 640 by 480. So, kind of an annoying bug here is it doesn't actually do the math right.

When you use the "round to divisible by" values. So what I wind up doing all the time is going over to the Calculator and actually doing the math myself, because I know Cleaner tends to get it wrong. So, 640 divided by 1.85 comes out to about 346.

When you're scaling, most codecs work better when both resolutions are divisible by 16. So, geeky little thing, that's the way compression works. If you choose a value that's not divisible by 16, what it winds up doing is it winds up actually encoding another 16 pixels and then not showing some of them to you.

Which winds up being more work that you never get to see. So I always round to the nearest 16. So the way I do that is I do the math: 640 divided by 1.85 is about 346. I'll divide that by 16; okay, 21.6, the nearest whole number is 22. Times 16, which is going to be 352. So 352 is actually the correct value. And there we go. So let me just do the real-time preview here.
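That calculator dance, as a one-liner you can reuse (the mod-16 rule is from the talk; the helper name is mine):

```python
def mod16(value):
    """Round a pixel dimension to the nearest multiple of 16,
    since codecs encode in 16x16 macroblocks and pad anything else."""
    return int(round(value / 16.0)) * 16

# 640 wide at 1.85:1 -> 640 / 1.85 is about 346, which rounds to 352
print(mod16(640 / 1.85))   # 352
```

Same steps he does by hand: divide by 16, round to the nearest whole number, multiply back up.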

So this is-- oh, look at that. I didn't quite crop it enough. That's good. It's real time preview mode. Is anyone here not familiar with Cleaner? Are there people here? Any people? OK. Well, stop me if I do anything that's too weird, but I'll kind of assume you have a basic idea of what's going on here. OK. So that's a little too much.

Okay, there we go. So I'll make sure there's no black. I'm going to see if there's some filter in my default. Cleaner doesn't really like to be on a big monitor, so it's a little bit cramped here. So, by default it's going to do a deinterlace. And so we have frames with interlacing on them like, what do we got here? What's going on? There we go.

We'll get this after the inter... so it'll do a deinterlace by default, which does an okay thing like this. But really what we obviously want to do is inverse telecine, so we'll set that here in Cleaner. It's a little bit cleaner. You probably can't see it from back there, but it gives us more resolution and gives us a higher-quality output.

Sharpen Filter. I recommend never using a sharpen filter. A sharpen filter can make it look better before you compress it, but sharpening adds noise. It adds complexity to the image. And even though sharpen can make it look better before it hits the codec, at any kind of reasonable web data rate, it's going to look worse after having applied the sharpen filter than if you just left it alone.

So I always turn sharpen off. It's a little bit noisy. The mild adaptive noise reduce is a good default filter. It's not going to hurt it too much. It's not going to help a whole lot, but it helps a little bit, and it's not going to be too distracting.

So, by default we get a bunch of default settings here, which are totally the wrong ones. I don't know. The default image processing ones were kind of designed for Cleaner 5 and kind of ported over without acknowledging the whole image change thing. So never use the default image processing filters in Cleaner, because they're all wrong in Cleaner 6. Every last one of them.

Uh, yeah. And you're at the microphone? Thank you. I'm right here, just for you. Yeah, you're good. No, we're... How much of a difference would it make if your source material was not telecine? Like, I mean, if you can actually get it before they telecine it, how much of a difference? How significant?

Okay, so basically if I had like 24p source frames, um... It won't make that much difference in that aspect. Obviously if it's like... The question I'm getting is like, you know, my, my, my, you know, Cineon files straight off the machine versus like a beta SP tape. You know, obviously, you know, that'll matter more. But the inverse telecine, if my choice between like a 720x486 series of TGA files and a 720x486, you know, Kona file, pretty much the same, the same net effect. Not gonna matter that much either way.

The workflow's obviously better for the... Yeah, it's gonna save you time, as far as... Oh yeah, you don't have to do this filter, you get smaller files, you get about a 20% reduction in data rate of the source. So it's a very nice thing, but it's not gonna have much of a quality impact.

Okay, so we got this guy here. I've seen this movie, I've watched this movie like 87 billion times. I have no idea what the movie's about really. It's got some very cool motorcycles. I guess it's Fast and the Furious on two wheels or something like that. I never saw Fast and the Furious either. I have two small children now, so whatever hipness I once had is long gone. Cars or something. Okay, so... So, this is generally pretty good.

So, I mean, for a clean source like this, I don't actually need to add any filtering at all. I might just go for... The one place I'll often typically look at is a frame that's got some... I know it's all black on it, like... Get some credits frames typically, have some stuff. It doesn't really have all black on it, though.

Oh, the key to a great trailer is you edit out this rating graphic here, which almost always looks bad, and you actually get an EPS file of it from the MPAA, and then just render that into it. So you have no analog noise in that start part. Save a lot of bits right there. And so we kind of look here and see whether this is all the way black. And actually one thing that Apple did that makes me happy is DigitalColor Meter. This is a great little utility.

It comes with a Mac OS X. So you can kind of go through here and actually look at the color values. This is great. Like, is this actually all the way black? I don't care. Okay. This is actually pretty much all the way black here. There are a few things here and there, but that's flat enough. All I'm going to do here is I can tune in an actual brightness of minus one. And that should... The one thing about Cleaner is the preview window does not show up when it's not the foreground app.

and the way... There we go. It seems like a really kind of tweaky, kind of goofy stuff. That's because it is. I mean, you could really spend all day going, well, I don't know. Let's try one off. It's going to make a big difference. Really not that much. Plus, it's got some gradients here. Yeah, that's good enough. If I could just turn these off, it's really not going to matter that much. Anything less than about five is really not going to pay off that much. And another thing, if you've seen Inverse Telecine, yeah.

The white and black restore filters. Basically, we get two values, amount and smoothness. So luma's encoded in a range from zero for all-the-way black up to white at 255. What amount does is say, basically, if I set it to 19, for example, every pixel that has a luminance value of 19 or below gets turned to exactly zero.

So zero through 19 becomes zero, and a 20 stays at 20. And then what smoothness does is it basically fills in a transition range to interpolate. So instead of 19 going to zero while 20 stays at 20, the values just above the threshold get expanded into a smooth gradient back down toward zero. So you don't have a boundary.

If you can see right here, there's a very sharp edge now between these two. Because stuff that was kind of near black turned all the way black, so we kind of lost our gradient there. I don't know if you can see it all the way in the audience on this projector. Can you see the edge there at all? Okay, let me crank it up here. Okay, or if you're a truly bold criminal, you get like that.
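Per-pixel, the amount/smoothness behavior he's describing looks something like this (my reconstruction of the mapping, using his example threshold of 19 and a made-up smoothness of 6):

```python
def black_restore(y, amount, smoothness):
    """Sketch of an amount+smoothness black restore.

    Everything at or below `amount` snaps to pure black; the band just
    above it is stretched back down to zero so there's no hard edge;
    everything above the band is untouched.
    """
    if y <= amount:
        return 0
    band_top = amount + smoothness
    if y < band_top:
        # remap the (amount, band_top) band linearly onto (0, band_top)
        return round((y - amount) * band_top / smoothness)
    return y

print(black_restore(19, 19, 6))   # 0: at the threshold, snapped to black
print(black_restore(22, 19, 6))   # in the band, remapped smoothly downward
print(black_restore(40, 19, 6))   # above the band, unchanged
```

With smoothness set to zero you get the hard edge he's pointing at on screen; the band is what hides it.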

BlackRestore is one of those filters that, on those video clips where you really need it because they're very noisy, it doesn't work very well because it's so noisy you can't do it enough. It doesn't mess it up. It's frustrating. There's this narrow range of stuff that's bad enough to need it, but not so bad that it actually works.

I don't know the solution to that problem. It's just the way it is. Cleaner's version of this filter is better than a lot of the other ones because it specifies both the amount and also the smoothness values. It lets you kind of have a transition gradient. It makes it not quite so obvious. Useful feature. Okay. And another thing when you're doing inverse telecine: you definitely want to make sure, when you're doing that in Cleaner, that the output frame rate is going to be 23.976.

Cleaner and most other tools don't support temporal resampling. So even though the source is 24, it got slowed down by 0.1% as part of the telecine process to match the 30 to 29.97 ratio. So you have to do 23.976 and not 24. If you do 24, you actually wind up with a repeated frame every so often. Not normally a big deal, but you might as well avoid it.

So, and that's how you would preprocess this. Oh, and I turned on the audio. I normalized to a... And Cleaner doesn't measure in dB. Cleaner measures it with a slider. About 90% is about the same as minus 3 dB. So that's how you do the preprocessing. And I completely ignored the encoder settings for all that. That was just the filter settings.

Cleaner is a very efficient workflow. Cleaner 6 is great for getting preprocessing done. The really hardcore tool is After Effects. If you want to spend 10 times as long to make it 10% better, After Effects is your tool of choice. But for most stuff, if you can do it in Cleaner, it's nice to get it done in Cleaner.

Yeah? Actually, in that same concept, or same thought, what's your opinion of Apple's new Compressor, and how does it compare to Cleaner? Apple's new Compressor, let's look at it, let's do a Compressor walkthrough. Okay, that's a good one. I'll show you. Shall we do the same clip in Compressor? I love this. So I've only had Compressor now for about 48 hours. I had the flu for some of those hours. So we'll be learning this together.

Compressor. So Compressor is bundled with Final Cut 4, but it is a usable standalone app. You can export to Compressor straight from Final Cut, which is a very nice integration, or you can just launch it as its own app and work that way, which is kind of cool.

So its workflow is, I mean, I think it's kind of reminiscent of Cleaner in the good ways, which is cool. It gives you a little plus button here to grab a source file, so here's the Baker Boys clip. We'll pick a preview. Now, it ships with no QuickTime presets; it can do QuickTime export, but its presets are either MPEG-2 or MPEG-4.

[Transcript missing]

A couple of neat things about it. So, as I was saying, you have this kind of Cleaner-like before-and-after slider, which is kind of cool. You've got this neat integrated crop thing here, so I can just grab these crop edges, and it's actually live-updating, telling me how my current settings are going to affect the output compression. Which I haven't really set up yet, but it's just so cool. This program really takes advantage of the power available on a modern G4: Mac OS X, multi-threading and all that.

And those live update features are just really awesome, compared to having to change your settings, hit the update key, change your settings, hit the update key. The rapid turnaround in terms of trying things out is really kind of nice. So, okay.

And we set some settings here. It's pretty light in terms of filters, but for the most part it's okay. One neat feature is you can actually change the order the filters go in, so I can just drag this one up to run first. And here's a cool thing with de-interlacing: the mode you want is actually Sharp. Sharp means adaptive de-interlace. I don't know why it's called Sharp.

That's what it is; that's the one you want to use. A great tragedy of Final Cut is that it doesn't really have good inverse telecine functionality. There is this thing called Cinema Tools, which is now bundled with it; I'll show you that in just a second. Which is awesome if you have the right kind of source, meaning a source where you actually know what the cadence is, where the 3:2 pattern lies.

What you would normally do in Final Cut is go in here and use the Reverse Telecine feature. It actually will conform it out to 24.0 if you want, and all this, but unfortunately you have to know the cadence.

Either you get a telecine log telling you what the field order is, or you have to guess. So basically your only option is: okay, I've got a total of 10 combinations of the AA, BB, BC, CD, DD style pattern, I'll just try all 10 and see what happens.

If your first frames of video have motion in them, you can just single-frame through and say, okay, this is the AA, BB, BC, CD, DD pattern starting here. Unfortunately, like most video out there, this particular video's first five seconds are identical, so I can't see any motion, so I can't actually detect it.
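The cadence he's single-framing through can be simulated in a few lines; this sketch is purely illustrative (Python, field order ignored, function name my own):

```python
def pulldown_23(film_frames):
    """Simulate 2:3 pulldown: alternate film frames contribute two
    fields, then three, so four film frames become five video frames."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    # Pair consecutive fields into interlaced video frames.
    return [(fields[i], fields[i + 1]) for i in range(0, len(fields) - 1, 2)]

# Film frames A, B, C, D come out as AA, BB, BC, CD, DD: three
# clean video frames and two mixed ones per five-frame cycle, which
# is the pattern inverse telecine has to locate and undo.
```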

So it's kind of a pain. This particular project is kind of annoying in Compressor: I can only encode it as video, because I can't use inverse telecine with Compressor in this situation. I talked to the Final Cut Pro team a little bit about this, and for their market, it makes sense. They expect that if you're doing 24p, you're either coming in with a telecine log or you have actual 24p source.

So they're not expecting you to take 24p-produced content that's been through analog and captured. But I get a lot of that stuff, unfortunately, so it doesn't work out; it's kind of a pain for me. Other kind of cool things: it's got this totally awesome noise removal filter, which I was going to show. Well, I have a problem.

I had a clip rigged up for that one. Okay. Oops. Where did I put it? There we go. This is a great thing: you can apply it to all channels, or to just the chroma channels. This is great for VHS source; I'll show you a sample of this in a second. VHS source is much noisier in chroma than it is in luma, so you can apply this noise reduction filter to just the chroma channels. It'll make them all blurry, but our eyes aren't very attuned to color, so it makes the chroma look way better without actually making the video look that dull. That's just the best thing ever.
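The chroma-only trick works because luma and chroma are separable; here's a toy sketch of the idea on a single scanline, using BT.601 coefficients and a simple box blur. The function names are my own illustration, nothing from Compressor:

```python
def rgb_to_ycbcr(r, g, b):
    # BT.601 luma plus scaled color-difference chroma
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return y, 0.564 * (b - y), 0.713 * (r - y)

def ycbcr_to_rgb(y, cb, cr):
    r = y + cr / 0.713
    b = y + cb / 0.564
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b

def box_blur(values, radius=1):
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - radius), min(len(values), i + radius + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

def denoise_chroma(scanline):
    """Blur only Cb/Cr across a scanline of (r, g, b) pixels; luma is
    left untouched, so detail survives while color noise smooths out."""
    ys, cbs, crs = zip(*(rgb_to_ycbcr(*px) for px in scanline))
    cbs, crs = box_blur(list(cbs)), box_blur(list(crs))
    return [ycbcr_to_rgb(y, cb, cr) for y, cb, cr in zip(ys, cbs, crs)]
```

Because the round trip through YCbCr is exact, the luma channel of the output matches the input; only the color is softened.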

Well, that may be an exaggeration, but I've been asking for that feature forever, and the Final Cut guys stepped up to the plate. When I request a feature for five years and someone does it, I've got to give them props. Other things: it's got the whole three-point color wheel of Final Cut in here, just in kind of a weird UI. You can actually do highlights, mid-tones, and shadows, and change your things, and all this.

And these are all live in terms of preview, so as I drag, it previews. And it's actually showing it as it is compressed; not just an approximation like Cleaner does, it actually does a good job of guessing what the actual compressed output is going to look like.

Cleaner will do previews, but it doesn't do the data rate right. It compresses as if the current frame is going to be a key frame, so you wind up with only about a tenth as many bits for that frame as a real key frame would get. It just winds up not being a very good preview.
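A rough way to see why that kind of preview misleads: at a fixed data rate the average per-frame budget is small, but a real encoder spends several frames' worth of bits on a key frame. The numbers here are purely illustrative:

```python
def per_frame_bits(bitrate_kbps, fps):
    """Average bit budget per frame at a constant data rate."""
    return bitrate_kbps * 1000 / fps

# At 1 Mbps and 30 fps the average frame gets about 33 kbit, but an
# encoder might spend something like ten frames' worth on a key frame.
# A preview that compresses the current frame as a key frame on only
# the average budget therefore looks far blockier than the real output.
average = per_frame_bits(1000, 30)
plausible_keyframe_budget = 10 * average  # hypothetical ratio
```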

Another thing: there are a couple of different modes here. Okay, so, encoder, yeah, gamma correction. You can also do letterboxing: you can crop out the bad stuff, and if you're going to DVD, you can actually add your letterboxing back in. You'd use that to basically matte out any kind of noise in the blanking areas. A little tiny feature, but really handy, in my opinion.
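The matting arithmetic is simple enough to do by hand; here's a sketch assuming square pixels, with a function and numbers of my own invention:

```python
def letterbox_bars(crop_w, crop_h, out_w, out_h):
    """Scale a cropped picture to fill out_w, then split the leftover
    height into equal black bars above and below."""
    scaled_h = round(crop_h * out_w / crop_w)
    if scaled_h > out_h:
        raise ValueError("picture is taller than the target frame")
    bar = (out_h - scaled_h) // 2
    return scaled_h, bar

# A roughly 2.35:1 crop matted into a 640x480 frame comes out as a
# 272-pixel-tall picture with 104-pixel bars top and bottom.
```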

[Transcript missing]

and others. So, anyway. And you can batch things up and send output to a folder, all this kind of stuff. It's got a really neat infrastructure. Unfortunately, it only does QuickTime and MPEG-2 and MPEG-4 today, so I don't see myself doing a lot of encoding with it, because there are important features missing; like you can't do two-pass encoding with QuickTime or MPEG-4, which is pretty critical for quality, and that kind of stuff.

So, I mean, I think it's a very exciting 1.0, and I really look forward to seeing what Apple can do to incorporate other technologies into QuickTime. A better MPEG-4 codec is something I'm certainly looking forward to, and hopefully an AVC codec is going to be a really awesome tool.

It's designed for nice productivity; the preview thing works well, you can do batches, and you can go back and hit History, which will tell you what kind of batches you've done in the past, and you can find your files and all that kind of stuff. I mean, they really sweated some details on this.

It is not AppleScriptable. I know. I talked to Sal. Maybe soon; keep asking him for it. I mean, everything should be AppleScriptable, all the way, all the time. I know. And this is not; none of Final Cut 4 is AppleScriptable. You can fake it with that whole UI scripting thing, scripting the buttons through the accessibility stuff, but you've got more time than I do if you do. So anyways. This isn't meant to be a Compressor demo; this is preprocessing. Its preprocessing is totally awesome, except it doesn't have an audio normalization filter.

It doesn't have inverse telecine. If it had those, that would be all the better. But it gets lots of the really hard things right, so that makes me happy, too. Let me show you that thing I mentioned before about the temporal noise reduction filter. I mentioned VHS Ugly; I think we'll agree this is an aptly named file. So this is my... you see all the colors there? No? OK.

Uh, yeah, you know, it's okay. So does that look bad? You'll have to believe me there. This is typical of VHS: you get all this kind of tearing at the bottom of the picture. It's hard to encode and it looks stupid too, so you want to take it out. And the colors are bad, and the bricks aren't really green, and whatever.

When I was working on my book, I called up my editor and said, hey, do you have any really bad source I can capture? And he's like, yes. So this is like a third-generation LP-mode dub of something. When I actually got it, I found that I've got a professional VHS deck as part of my editing facility, and it can't even play LP tapes.

I had to go haul out my old consumer deck from my upstairs media closet; I hadn't even played a VHS tape in years. So this is the worst video I've ever seen. Which is what I asked for, so I can't complain.

Really, I was like, wow. Yeah, you took me a little literally there, didn't you? So, anyway, let me find a good frame. Do you see how she's got some color noise in her black shirt there at all? Okay, there we go. We'll make this happen. Okay, so we're going to add a preset. Just so it's pretty obvious, we'll do an MPEG-2 high-quality 60-minute encode. Okay.

So, filters, noise removal. Well, there we go. You see the color noise right there? Good. So I can do noise removal, and I could do a really intense pass that's kind of blurry, like this. But I can also do just purely the chroma channels, and you can see it removes all that chroma noise and makes the color a little more accurate there. You can't see it, can you?

Okay, sorry. Color precision on a trade show projector is always a battle. Okay, so yeah, I love that filter; I'm so happy to see it. I should talk about that letterboxing filter, because I was talking about it before, and I can do some good stuff with it. Scale, center... matte, there we go.

[Transcript missing]

which runs fine on Virtual PC, by the way, if I can bring it back into the Mac world. Surprisingly swift if you turn off the preview mode. Okay. Good. So how are we doing on time here? All right. Sounds good. So what else do people still want to see? What's awesome that you haven't seen yet? Who's got a problem I haven't solved for them yet?

I'm not going to be the hard ass. I'm sorry. I'm overcompensating. Let's see some Squeeze. Let's see some Squeeze. Squeeze has got a pretty slick preprocessing UI. Its preview isn't that great, but if you know what you want to do, it does the job quite simply. But the problem is it's hard to tell if it's doing the right thing or not. So here I'll just do the same darn video over and over again, because such is my life.

Every now and then my wife will tell me, I cannot hear that trailer coming from the basement ever again, you must get a new clip. That's when I get new samples. I was working with this Smash Mouth music video for Why Can't We Be Friends; I used it for like two years. That's when she said:

Either the trailer goes or I do. I thought about it for a while, but... the trailer went. Okay, so Squeeze has just one preprocessing dialog; all this stuff goes on in there. It has inverse telecine. Like Cleaner, it requires you to specify what your field order is, which I always think is lame for a tool to expect people to know. This kind of thing is in the file; QuickTime has a tag for it. Don't ask the user that question; you can figure it out on your own.

Also, don't call it field dominance, because field dominance is something else; it's field order. But anyways, in this case it actually is a lower-field-first source file; I just happen to know. It's got a little noise reduction thing, which is fine. It's got this preview module that's kind of hard to work with; this is a high-resolution source file. I've actually been using Squeeze for... And it's kind of hard to find the crop dialog, too.

And you can't really do the aspect ratio thing. So I get kind of frustrated, because I know I've got so many more pixels here than I can look at, so it's kind of hard to know if you actually got it exactly right. And the little handle you grab is kind of hard to get hold of. Little things like that, but it does the job. There we go, something like that. But because I'm only at like 1/3 resolution, I can't really tell if I got it all.

[Transcript missing]

Yeah, so that's pretty much it for preprocessing. It's kind of weird, because you specify how it's processed here, and then it's actually in your output setting where you define what your output frame size is. It's got one kind of cool thing it does, which is that if you type in a weird aspect ratio, say you want 640 by 480 but you cropped to some different ratio, it'll actually add letterboxing to make it right.

I want to turn that feature off, because like I said, if I get it off by a couple percent, I'm not worried about it, and I don't want it to add like a three-pixel black line at the top and bottom of my video, because that's going to make it hard to compress. So I'm almost always going to turn this off and just do the math and make sure that things match up right. So, 640 by 352 like before. So there you go.
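Doing the math yourself amounts to matching the crop's aspect ratio and rounding to a codec-friendly multiple; a hypothetical helper of my own, just to show the idea:

```python
def matched_height(crop_w, crop_h, out_w=640, mod=16):
    """Pick an output height that matches the crop's aspect ratio,
    rounded to a multiple of `mod`, instead of letting the tool pad
    the difference with thin black bars."""
    exact = out_w * crop_h / crop_w
    return max(mod, round(exact / mod) * mod)
```

A 4:3 crop at 640 wide comes out to 480 exactly; for odd crops, being a couple percent off after rounding is fine, and beats hard-to-compress black edges.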

Unlike Cleaner, where you customize the filter settings on a per-output basis, Squeeze does it on a per-input basis. That's better in most cases, when you're doing the exact same thing everywhere; it's kind of a pain in Cleaner to have it constantly replicating the same preprocessing settings over like five output settings, where if you get one of them wrong, you have to go fix it five more times. But when you actually want to customize for each output, Cleaner's per-output control is a little handier. So, okay, that's Squeeze for you. Thank you.