
WWDC03 • Session 710

Preprocessing Principles

QuickTime • 1:19:24

Preprocessing is widely considered the secret to how to make excellent web video. This session teaches you the general principles for how to pick appropriate cropping, scaling, noise reduction, and image adjustment parameters for optimal quality, whatever your data rate.

Speakers: Glenn Bulycz, Ben Waggoner

Unlisted on Apple Developer site

Transcript

This transcript was generated using Whisper; it may contain transcription errors.

So because this is such a different year for us, mixing the QuickTime Live and the WWDC stuff together, before I essentially start the class I just want to get a sense from folks what angle we're coming from, so I can frantically rewrite my keynote presentation in 30 seconds. So how many people are coming from the content world primarily? And from the programming, software, engineering world?

OK, about 50-50? Perfect. Nothing like no audience overlap to make it all exciting. So for the folks coming from the content world: what do you want to learn today? What do you want to know about? Just shout it out. Thank you. Bringing video into the computer, so basically the capture process, how to go from tape into the computer. Good. Always a challenging issue, but getting better. Yeah. Thank you. Analog to digital, okay. Great, all these things, and I haven't even run a slide up for you yet. It's good when I ask these things. Yeah?

Currency of the tools, good. Yeah, so my plan is I've got about a 45-minute presentation, half an hour, 45 minutes of talking. And then I'm just gonna do demos, and we'll let an audience vote decide what demos we're gonna see. So make sure you have your favorite tool in mind and we'll just take a look at what people are interested in, what scenarios you wanna see. Just to keep me on my feet, it's always good to not know what I'm doing in advance, so good. For the software folks, what do software people wanna know from me? I was afraid of that. Is anyone building software that involves pre-processing? Or has that in mind?

Great, OK. Sounds good. Yeah, so actually it was not in the syllabus, but we'll be looking at Compressor today, by the way, I should mention. I got my NFR copy of that at the last minute, so I haven't actually used it much yet. We'll be learning Compressor together a little bit, but it's pretty easy to use so far. I actually compressed an entire file in it during an earlier presentation, thanks to the batteries.

Cool, so sounds good. So is it time? We've got three more minutes. I just got bored, so I started talking here. I'm too nervous; I read all my email, what am I gonna do up here? So is anybody having a big pre-processing headache, something that's just kicking your butt, or problems you're having, something like that that's just causing pain? Yeah.

There are so many rules of thumb that you need a lot more arms than we each have on this kind of stuff. If you're doing it lots of times to see which works best, that means you're doing the right thing. I encoded my first file with MacroMind Director Accelerator in 1989, so it's virtually my 14th anniversary of doing this stuff, and I still encode every file about three times. It's like just getting a little bit closer every time. So it's always a good idea, it's healthy, to continue to experiment: if you're not sure what's the best way to do it, try it both ways and see what happens. The nice thing about fast computers is you can just do a big batch and try a lot of different alternatives, and the proof's ultimately in the pudding. All that really counts in the end is the pixels you get on the user's screen. So whatever looks and sounds best is the right solution. Having that kind of exploratory approach is a great one, I think. But yeah, there are so many weird little things you can do. I'm not going to be talking about compression formats and codec tips today. Primarily, everything that happens from your video, let me see, from your capture file on your hard drive to what you actually hand off to the compressor, is the middle part we're talking about today, and that's certainly enough for 90 minutes by any means. So shall we? It's time to formally start now, so thanks for indulging me. There we go. Oh, and definitely when you're asking questions from now on, because we are taping, make sure you come up to the microphones. But please feel free to ask questions during the presentation. Thank you.

All right, thanks for coming, everybody. This is Pre-Processing Principles, and a special hello to our friends listening in Nihongo and on ADC TV or on the DVD. Ben doesn't really need an introduction, per se, in the video world or in the digital video world, and given his last five minutes of casual discussion, we're really happy to have him here. Many of us wish that there was a Do What I Mean button in the compression tools and that everything would come out perfectly, and that we'd get bytes per second when we really were getting bits per second. So we hope that Ben offers some good guidance on how to get really good video even before the video hits the compression tool. And I'll hand it over to you, Ben, and then we'll have a Q&A afterwards.

And I would ask that when we do the Q&A, you do that with the microphone: either raise your hand and somebody will bring a mic to you, or there's a mic in the center of the room. That way the question can be understood clearly and translated well. So thanks. Ben, off to you. Great, thanks. All right, so I'm going to have my Steve moment. I have a remote mouse here for the first time. I've got Keynote and a remote mouse. I feel like a real boy for once. This is good.

No more right arrow key in PowerPoint for me. No siree. So we said that, and this is me. And my name is Ben, and I'm a codec nerd. So, yeah. So what we're talking about today is pre-processing. And we're going to look at a few tools, talk about the theory of it, try to wrap this all together. So talk about some ideas and actually show you how to apply those things in some real-world projects. And the focus of today, this is all supposed to be useful stuff.

If you're compressing video, I hope for everyone here today I can answer a question you've had, and it will save you some time and work. I'll be helping you get better results and lower your aspirin budget. That's always the goal. So, we have an agenda: what is it, why does it matter, and how do we do it.

And primarily today we're talking about web delivery. DVD encoding is actually quite a bit easier; I'll mention it here and there, but the focus today is delivering on the web, on cell phones, all that kind of stuff. But feel free to ask DVD-specific questions as they come up. Fortunately, 90% of the time DVD is pretty easy, with a couple caveats I'll get to.

So, preprocessing. Preprocessing is pretty much everything that happens between your source frame of video and your compressed frame of video. With any codec, you've captured your stuff, it's in the Blackmagic codec, or it's in DV, or it's in Animation; you've got a bunch of rectangles to start with, and in the end you're going to hand a bunch of rectangles off to the codec, which is going to make your bitstream. So preprocessing is all about taking one rectangle and making a better rectangle out of it for the codec.

So rectangle to rectangle doesn't sound too hard; if only. For me, I find you generally know the bitrate you want in a project. You kind of key that in, you know the right frame rate, a lot of that kind of stuff. Pre-processing is the part of compression that's the most artistic, the most crafty. It's the one I wind up spending the vast majority of my time on. You may be doing a big project, you know the specs of it, you've got different kinds of source coming in, and you often wind up tweaking each clip a bunch of times, and the filters and all that kind of stuff. The tools to make this better and more automated are improving, which is nice. But it's always the part that I wind up sweating. You just knock it over a couple pixels here and there and do it over and over again. Fortunately, there's actually a new Compressor app I'm pretty excited about. It's got some pretty good features for actually seeing your effects in real time, which pays off a lot.

So, why does it matter? We're really trying to maximize what I call "bang for the bit," compression efficiency. We're trying to turn every bit of bandwidth we have available into information that serves our communication goals. We're not compressing video for fun. We're compressing video to try to communicate. And by preprocessing, we're trying to give the codec the best input we can, so it can deliver the best possible end result.

You know, no distracting artifacts, as much information as possible. Every bit and pixel counts. Just a little illustration here. This is actually courtesy of DreamWorks; they seeded me with the trailer for a movie called Biker Boyz, which you've probably never seen. I've never seen it, but the trailer's kind of cool. People always use cool movies like Terminator 2 for demos, and the problem with a good movie like that, or Finding Nemo, is you watch the trailer and get excited about it. By choosing a movie no one watched or cared about, you can actually look at the video frames and not be distracted by it being funny or anything. This is a typical interlaced frame. It's a little bit jumbly here just because of the scaling in Keynote, but you get the idea. Typical frame: you pause on it, and with any kind of motion you see those lines like that. And then if we preprocess the frame, it'll look like that. So not a huge difference, a little bit. But when we compress these two, we get a pretty big difference. If I just export at 800 kilobits in MPEG-4 from QuickTime Player and look for that frame, we end up with this. And I think you can see, even from the audience, it looks pretty bad. All those sharp little lines really kill the codec; it's confused, it's messed up. Is it a mountain or a motorcycle? It's really hard to tell what's going on. Same frame, same data rate, pre-processed: you end up with a much clearer image. So even if before you compress you don't see that big a difference, when you actually compress it, it really pays off, by just getting rid of data that isn't really there. By eliminating the noise, you're trying to maximize the signal. And there are a lot of different kinds of that. Also, a couple of things is this frame was-- what's going on?

Anyway, let me just start off with the first steps here. I'm going to walk through the filter chain that most tools use, the order you're going to apply the filters in the tools, conceptually in the order I'm going to cover them. First thing: de-interlacing, the example I gave you there. Traditional video, you shoot in interlaced mode like most video cameras do. All your even lines and all your odd lines are captured half a frame apart in time. So in NTSC... how many people are coming from a PAL country?

Okay, so a 60th of a second apart in NTSC and a 50th of a second apart in PAL. I will try to give PAL examples. PAL is actually a much, much better source format for doing video with: you've got 25 frames a second progressive, you've got 576 lines. I've thought about moving to England and doing only PAL video compression many times. But here we are in NTSC, and we'll make do. So just bear with us, PAL people. We suffer more than you know. Okay, computer video obviously is just drawn top to bottom, progressive scan. So it's not normally a big deal. Various players like our DVD player apps can often deal with the interlacing and show it progressive on the screen, and it looks okay. However, I see this a lot: people export into a web video format and leave the two fields in there. So basically you wind up with a frame where every other line doesn't match wherever there's motion.

That is just a total killer, because the things that really hurt codecs, that take a lot of bits to encode, are motion and detail. And a lot of horizontal lines where each pixel is completely different from the one before and after is almost the most difficult thing you can imagine encoding. So when you have interlaced artifacts like that in a frame, all the bits get spent trying to draw those little horizontal lines that don't really matter. And those are bits that don't get spent on drawing, say, the face, or the person in the frame. And it happens a lot. Also, you get these double images of moving objects. You throw a baseball, you'll see two baseballs translucent over the background because the fields are getting blended together. It doesn't look good at all. So if you have progressive content, you're great, but if you have interlaced, you actually have to deal with it in one fashion or another. A lot of tools traditionally do what's called the basic method or whatever: it eliminates one of the two fields. Okay, even and odd lines don't match? Throw away all the odd ones; you've got even lines left. The big problem with that is you've thrown away half your image data. If you throw away half your lines, you had 480 lines in NTSC originally and you've got 240 left. Okay, well, that's going to be an issue.

You've thrown away half of your image data before you started. The preferable method is what's called adaptive deinterlacing. Most tools support adaptive deinterlacing these days. They call it different names, but the basic idea is: find the parts of the image that move, where you get the interlacing, and deinterlace those. Parts of the image that aren't moving, where you don't have the artifacts, leave alone, so you get higher resolution in those areas. Pretty smart little thing, and it pays off hugely. For slow-moving content, it almost doubles your resolution.

You've got to watch out: some tools will just guess wrong as to what's moving and what's not, and you'll get little glitches here and there. 99% of the time it's not a big deal. It's almost always worth doing. Just keep your eye out for places where it could have some problems. But it hugely pays off.
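
The adaptive approach described here can be sketched in a few lines. This is a simplified, hypothetical illustration (real tools use more sophisticated motion detection, and the function name and threshold value are invented for the example):

```python
def adaptive_deinterlace(frame, threshold=12):
    """Deinterlace only where the two fields disagree (i.e. where there's motion).

    frame: list of rows of luma values; even rows are one field, odd rows the other.
    threshold: luma difference beyond which a pixel counts as "moving"
               (an arbitrary value chosen for this sketch).
    """
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for y in range(1, h - 1, 2):              # walk the odd-field lines
        for x in range(w):
            # Interpolate from the even-field lines above and below.
            interp = (frame[y - 1][x] + frame[y + 1][x]) / 2
            if abs(frame[y][x] - interp) > threshold:
                out[y][x] = interp            # moving: replace (deinterlace)
            # else: static, keep the original line and its full resolution
    return out
```

Static regions pass through untouched, which is why adaptive deinterlacing nearly doubles the delivered resolution for slow-moving content compared with simply dropping a field.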

A special case of this is inverse telecine. How many folks deal with a lot of film content or 24p content, like movie trailers, TV commercials, primetime dramas, all that? A few of you. Okay. So when you're working with that kind of content, and I'll show you some samples of that in a few minutes here.

You take the 24p source. 24p gets slowed down 0.1%, so it's 23.976, and then it gets transferred to video running at 29.97. The math doesn't really matter. The basic net effect is you take your 24 frames per second of film and you spread it out over the 60 fields of video, and it doesn't go very evenly.

So you wind up with this pattern where the first frame of film becomes three fields of video, the next frame of film becomes two fields, three then two, three then two, hence the phrase 3:2 pulldown. And the way you detect this, and we'll look at this later on, is when you're going through a file, just go through frame by frame in QuickTime Player, just the right arrow key, and you're gonna see a pattern of three frames that are progressive and then two frames that are interlaced, and it'll repeat. So just find a long shot with some motion and key through there. If all the frames are interlaced, it's interlaced video: deinterlace. If none of the frames are interlaced, it's progressive: leave it alone and do a little happy jig. If you see the pattern of three frames that are progressive and two that are interlaced, you're gonna do an inverse telecine and do a happy jig, 'cause that's actually a nice thing to have available.

So basically, inverse telecine looks at the pattern, figures out, okay, these fields came from these original frames, and reassembles them. And that's really good for a couple reasons. One, we've lost no data, as we do with deinterlacing. You wind up with your full frames, so you get 480 lines back in NTSC. And also, you can actually restore the original frame rate. You're actually at 24 frames a second. So instead of having to encode 30 frames a second, or even 60 if you're dealing with the fields, you can actually get the original 24 frames of video out of it. And that means for every frame that was shot in the camera, you're doing a frame of video, but you're not spending any bits on frames that didn't really exist. No bits get spent on those phantom frames. It always bugs me when I see people encoding video.

They take film-shot stuff, and they encode it. They deinterlaced it instead of doing an inverse telecine, and they encoded it at 15 frames a second. And so you wind up with this really kind of jittery motion, because you have two of the film frames, and then one will be missing, and then one and then two, and what should be a really smooth experience winds up not being. By doing inverse telecine and delivering at 24 frames a second, you can actually deliver a smoother motion experience than television would have had of that same thing, 'cause even the 3:2 gives you a little bit of temporal anomaly. So a horizontal pan can actually look smoother and better in high-data-rate web video than it even would have on television. You PAL guys are lucky, because none of this garbage happens with you. The way that film gets transferred to PAL is the 24 frames a second gets sped up 4%, it's made 25 frames a second, and it's transferred progressive. So PAL film source becomes 25 frames a second progressive. You just have to turn off deinterlacing and you're good to go. It's a wonderful thing. So many problems go away that way. So I hope you appreciate how fortunate you are. But of course it's all because you have 50 hertz power, which means your electricity kills you more. So I guess there's a body count associated with it. Things work out that way. You know, you have more electrocution deaths because you don't have to have inverse telecine.
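
The 3:2 cadence described above is easy to see in code. Here is a toy model (function names invented for the illustration) that spreads film frames across fields and pairs them back into video frames:

```python
def telecine_fields(film_frames):
    """3:2 pulldown: film frames alternately contribute 3 fields, then 2,
    so four film frames (A B C D) become ten fields = five video frames."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

def to_video_frames(fields):
    """Pair consecutive fields into video frames. A frame whose two fields
    came from different film frames shows interlaced 'combing'."""
    return [(fields[i], fields[i + 1]) for i in range(0, len(fields), 2)]

fields = telecine_fields(list("ABCD"))   # -> A A A B B C C C D D
frames = to_video_frames(fields)
# -> [('A','A'), ('A','B'), ('B','C'), ('C','C'), ('D','D')]
# Three clean frames and two mixed ones per cycle: exactly the pattern you
# see stepping through with the right-arrow key, and what inverse telecine
# undoes to recover the original 24 film frames.
```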

Okay. Any questions so far? If you have questions, just come to the microphone and ask right there. Thanks. I'm trying to be a hard-ass on the microphone thing and see if that works. So, OK, the three scenarios. One, it was shot interlaced on video: I'm just going to do an adaptive deinterlace. Scenario two, it was shot progressive: nothing at all. Scenario three, it was shot on film or a 24p camera, and inverse telecine was applied. I'm sorry, telecine was applied.

They don't apply an inverse telecine. So which of those three is it? It's pretty easy to just go through frame by frame. If you find five frames of video with motion, step through those; that's all you need to see to know what to do. We'll look at some samples later on showing what that looks like.

But it's pretty straightforward. And of course, 24p cameras are becoming more popular. Final Cut 4, obviously, has support for that. So it's actually kind of nice: you can just pull it in 24p and work 24p natively and not even worry about this stuff. OK. Is there another question over there? Or are you just wandering in? That's fine. We're good. So, cropping. We don't see all the edges of a TV signal on television. There's stuff around the edges.

It's part of the video signal we don't see. That's fine. Computers, of course, get every last pixel. You'd be kind of annoyed if you bought an LCD monitor and there was stuff you couldn't see on it. The menu bar doesn't show up; people get all peevish. So the problem is, stuff that's part of the video signal that is fine to go on television, and kind of fine to go on DVD because DVD has the same safe area, you don't want to be seeing on the computer. Typically, little horizontal lines at the top, or edge blanking, that kind of stuff.

A little bit of light letterboxing. You want to take that out, for a couple reasons. One is, I mentioned sharp lines are hard to encode. A letterboxing matte is a sharp line, and if you've ever seen a DVD or low-data-rate video, you'll often see it where there's any kind of letterboxing.

There's kind of a shimmer where the black bar starts, compared to other parts of the video. That's because it didn't fall on a macroblock boundary. The details aren't important there, but that line, even though it's just a big area of black, winds up messing up a lot of codecs, especially MPEG-2 and MPEG-4. So you want to crop out everything that isn't picture. I like to go a couple pixels in from the edge of the video, just to make sure nothing's in there, so every pixel is something you want to see on the screen when you're going to computer video. For DVD, you don't want to crop, because the stuff at the edges falls outside the safe area and doesn't really show up anyway. But on DVD, you do want to matte out the edges. So if there's some kind of noise up there at the top, draw a black area and just blank it out. You don't want a signal in there. And here's a trick for you DVD guys: if you matte along 8x8, or especially 16x16 blocks, that will align with the macroblock structure of MPEG-2.

And if you align that, you won't get that weird distortion effect at the edge of the letterbox. If you're doing letterboxing, try to make sure it's aligned with the 16x16 blocks in the video. So if you're at pixel zero, matte from zero to 15, or from zero to 31, or zero to 47, in those increments, and it'll be a much cleaner effect. We'll do that later on. Actually, Compressor makes this really nice. It's got a letterboxing and a cropping filter, so you can knock stuff out. Okay.
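
The macroblock-alignment trick boils down to snapping the matte boundary to a multiple of 16. A tiny helper (hypothetical, just to show the arithmetic):

```python
def align_to_macroblock(edge, block=16):
    """Snap a letterbox matte boundary down to the nearest macroblock edge,
    so the black bar's hard line lands exactly on an MPEG-2 16x16 block
    boundary instead of cutting through the middle of one."""
    return (edge // block) * block

# A bar that would start at line 70 gets snapped to line 64 (a multiple of 16),
# keeping the sharp letterbox edge out of the interior of any macroblock.
```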

Okay. And we'll look at some samples later on, which are pretty bad. It's okay to cut out some of the noise. You don't have to crop an even amount from top and bottom. VHS especially often has a lot of junk at the bottom but is okay at the top. Grab the rectangle that's the right rectangle.

Don't worry about being centered or not; just grab that. Another technique, if you're going to a really low resolution, like below 320x240, especially on phones: because video is composed aiming for the safe area, no one's going to stick critical information into the edges of the screen. So there's about a 10% boundary around the video where there could be stuff to look at, but nothing important is going to be there. So if you're going to a very low resolution device...

You can crop into that area a fair amount and make your foreground objects larger. So in an interview piece like this, the head will be bigger. You won't see the set behind them, but you don't care; you can read the lips better because you get more pixels. It all matters. This is a little After Effects screenshot here, showing where the action-safe and title-safe areas are. Imagine you're encoding this video: by cropping in to title safe, his face will be a lot bigger than if you go all the way out to the edges.

It's a little bit dark in here, but there are some black edges around the screen that you probably can't see on that display there. They're over here; they're pretty cool. I have a much more frightening example of bad VHS source later on. That's Obo Addy, the drummer, by the way. Anyone from Portland knows who he is.

Scaling. Scaling is basically changing the shape of the rectangle. By deinterlacing, we made a progressive source rectangle one way or the other. By cropping, we defined what subset of that source rectangle we want to process. Scaling is saying what output rectangle we want, and then your tool takes your cropped area and scales it into your output rectangle. That's all scaling is. We do it in Photoshop with Image Size all the time. Pretty straightforward operation. But there are a couple subtleties that matter a lot. One thing that has been biting lots of people lately is the whole non-square pixel thing. Pretty much all the production standards of standard definition are non-square pixel. In your 720-wide PAL or NTSC, 4:3 or 16:9, the pixels in your video signal are not square. NTSC is 720x480, PAL is 720x576, and that's the resolution whether it's 4:3 or 16:9; it's the same one. When you play it back on TV, it shows as 4:3 or 16:9, and it looks good because the TV knows that. But I see a lot of this on the web: someone takes a 720x480 file and goes, oh, I'm going to compress it down, I'll cut it in half, I'll make it 360x240. And you can pretty much say right now, no video source ever has a 3:2 aspect ratio, which is what that is. If you find yourself encoding to a web format at 360x240, and I'm sure everyone's done it at least once, you're doing it wrong, so that's your clue. You want to make sure you're matching the aspect ratio of your source. If you have a 4:3 source, after cropping, you want to be delivering in a 4:3 resolution with a web codec. 16:9, same thing.

So I'll give you an example with 360x240 and 720x480. A lot of numbers here, so bear with me. It doesn't work. If you're 4:3, 320x240 is a 4:3 aspect ratio. That'll be right. Your circles will be circles, not ovals.

And the real problem with 360x240 is it makes everyone 10% fatter. And if you're working with actors, you're just getting in big trouble that way. Just know that doesn't go down well. For 16:9, 432x240 is a matching number.

So you want to make sure your output frame size matches the aspect ratio of the cropped source. If you have a letterboxed movie, like a 1.85:1 or 2.35:1, and you're cropping out all the letterboxing, you can get quite wide. With a CinemaScope kind of thing, you might wind up at 640x272 and crazy things like that. And that's great, because in terms of your bandwidth requirements, your processor requirements, all that kind of stuff, it's basically the total pixels per second you have to worry about. So if you can do 640x272 instead of 400x300, that looks great, because 640 feels full screen, and it's just short; it looks like you're doing cool letterboxing. You're able to give the user what feels like a real full-screen experience while you're actually not using all that many pixels. Again, it's all about bang for the bit. If you can achieve a good communication experience by using a different aspect ratio, do it. I always encourage people who are doing web production to think about shooting 16:9 or whatever. I always joke that I want to do a rock climbing video and shoot 16:9 sideways. With web formats you can deliver that if you want to. As long as you're not using Indeo 3.2, by the way, which would crash if you were in portrait mode. Every other codec works fine.
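
The arithmetic behind "match the display aspect ratio, not the stored resolution" is simple. A sketch (the function name is mine) that picks a square-pixel output height for a chosen width:

```python
from fractions import Fraction

def square_pixel_size(width, display_aspect):
    """Given an output width and the source's *display* aspect ratio
    (what you see on TV, not the 720x480 storage size), return the
    square-pixel frame size that keeps circles circular."""
    return width, round(width / display_aspect)

# NTSC DV is stored 720x480 but displays as 4:3, so a half-size web encode
# is 320x240, not 360x240 (360x240 is 3:2 and makes everyone 10% fatter).
assert square_pixel_size(320, Fraction(4, 3)) == (320, 240)
assert square_pixel_size(640, Fraction(16, 9)) == (640, 360)
```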

I'm a codec nerd, what can I say? Kids today don't know these things. They don't know their heritage. Okay, another issue is scaling algorithms. Most tools today use bicubic, but some of the older tools have other scaling modes. This matters a lot. Make sure you're not using something with an old nearest-neighbor mode. Premiere 5 had one where, if you didn't manually turn on a checkbox, the scaling quality was terrible. So just be warned about that. Most tools are getting pretty good at this these days. Okay, so the big goal of scaling: you want to make sure you wind up in the right aspect ratio for your output.

Also, we want to make sure we're not scaling up. Scaling and cropping have an interesting relationship. Typically, the higher your output resolution, the less you're going to crop. I gave the example of going to a really small resolution, where you want a very aggressive crop to get the foreground objects to be bigger. But you never want to scale up. So just a little bit of math here. If you're cropping 10% and you're doing a straight deinterlace, or it's video with high motion like sports video where the adaptive deinterlace doesn't do much good, you can wind up with only about 216 lines of source. So even at 320x240, even though you're scaling down horizontally, you're scaling up vertically. And you want to reduce or eliminate any scaling up, because that always introduces artifacts. You always want to shrink down, or not shrink at all, if possible. So if you're going to 320x240 or higher in NTSC, you want to crop as little as possible. PAL users, you've got 384x288 as the resolution where that really matters; you get a little more flexibility with your higher source resolution. And again, using adaptive deinterlacing or inverse telecine gives you effectively a lot more source pixels, so it's less of an issue, especially inverse telecine when that's available.
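
The line-budget math here can be written out as a quick check. This is a hypothetical helper illustrating the arithmetic, not any tool's API:

```python
def effective_source_lines(source_lines, crop_fraction, field_dropped):
    """How many lines of real picture survive cropping and (optionally)
    a field-dropping deinterlace. If the output height exceeds this
    number, you are scaling up and will introduce artifacts."""
    lines = source_lines * (1 - crop_fraction)
    if field_dropped:
        lines /= 2      # a straight deinterlace throws away one field
    return round(lines)

# NTSC with a 10% crop and a straight (non-adaptive) deinterlace:
assert effective_source_lines(480, 0.10, True) == 216   # fewer than 240 lines
# The same crop with inverse telecine, which keeps every line:
assert effective_source_lines(480, 0.10, False) == 432
```

So a 320x240 encode from that heavily cropped, field-dropped source would upscale vertically (216 to 240), which is exactly the situation to avoid.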

OK. Luma adjustment, and we'll look at this for a while. How many people have been doing video compression for more than two years? OK. So Apple's made luma work well now, but it's been very confusing, because things used to work really weird, and we're in a kind of transitional period where not every tool has the whole story on getting luma processing right. Luma is basically brightness. Typically, in the computer world, we think of black as being 0 and white as being 255. But in the video world, black is actually 16 and white is actually 235. And it used to be codecs didn't do the right thing for you, so you had to manually add contrast when going from a video format to a web format in order to make blacks black and whites white. And so we've got these rules of thumb. How many people here automatically add plus-27 contrast when they're doing video compression? A couple of you? OK. The good thing is you don't have to do that anymore, but they never really told you that. As long as you're using codecs that properly deal with color mapping (and going from, say, Apple Motion JPEG or DV to MPEG-4, for example, they will automatically do all the color space conversion for you behind the scenes), you don't want to add that extra contrast. You want to add the contrast you need to make the image look good, but you're not going to add any extra.

Some codecs aren't quite doing that yet, so it's kind of a messy, weird era right now, but for the most part things are getting better. You don't have to do that anymore. I mainly mention this for the people who have been automatically adding contrast, and also doing gamma correction, for a long time; that's less important now. Another issue is the whole gamma thing: a Mac will show the RGB value of 127, the middle of the range, a lot brighter than a Windows machine will. But the good thing is new codecs, especially MPEG-4, DV, and Motion JPEG, will actually compensate for that. They will figure out the local gamma and draw the image right on Macs and PCs. I've seen a lot of people who had a long struggle to learn to raise the gamma to make Windows video look right, and they finally got over that hump. And now, if you do that with a modern codec, QuickTime will do it for you again on playback.

You'll wind up with it way too bright on Windows. So the only real solution is test your codec, encode with it, look at it before and after, and make sure you're actually using the right luma processing mode for that. Most modern tools get this right: Compressor, Cleaner 6. Cleaner 5, though--that's one of the things. You take a setting from Cleaner 5 that worked perfectly, you bring it into Cleaner 6, and it's going to fail completely in terms of luma processing, because they changed the internal model, which wasn't really in the manual. It's kind of an important little aside there: oh, by the way, how we process video is completely different now. It's better now, it's faster, higher quality, all that kind of stuff, but it changes a lot. There's also a creative element to luma. One of my creative goals is that any frame that's going to be black, with black titles or credits or whatever, I want the black pixels to all be actually mathematically black. Because on a TV, you've got a little bit of noise in the black and it goes down pretty easy--TVs are pretty lousy display devices. But on a computer screen, where you actually have big, flat rectangles of black to compare it to, if you have a little bit of analog noise in there, sometimes your black is a little too bright, and it just looks terrible. And it's also hard to compress, because you've just got a lot of stuff going on there. Whereas a 16 by 16 block of the number zero over and over again is easy to compress--that block will not take many bits. The bits you don't spend on that get spent on the actual content you're trying to communicate. In the case of text, that matters a lot, because sharp, small point text can be hard to encode. Making the rest of the frame as simple as possible, making everything that should be black mathematically black, will make your text look better with fewer artifacts.
So my typical, I'm going to go back to the last slide.

My typical rule for brightness is: I am never going to raise brightness, because that raises my black floor and brings the noise up with it. I will often take brightness down, just because there's a little bit of random noise in the black; I'll take it down a little bit. For digital sources--you're using a Kona card, or DV, or whatever that was actually shot digitally--black is really going to be mathematically black and you're okay. But with analog sources, even high quality analog sources like Beta SP, you're going to have a little bit of that random noise in there.

Take brightness down minus five; it helps a lot. It's often good to mix contrast and brightness together. So if you're reducing your brightness, add some contrast to keep your whites from getting too dark. Typically, for a pretty clean source, I might do contrast up by five units, brightness down by five units, just to move my black range a little bit down but keep my whites pretty constant. Maybe toss a little bit of gamma in there if it's a little too dark in the midtones. And you can spend hours doing this, and we will soon. Well, not too long, but you'll get the idea.

Contrast: like I said, +27 used to be the rule of thumb. I've been quoted in a thousand articles I wrote saying, "The key to making great web video is to add contrast +27." And now I'm here to say, well, that was true then, but it's not anymore. So it'll be a different rule tomorrow. Thankfully it's all getting fixed, but it's awfully confusing right now. Contrast is also useful as a video filter in its own right: if you've got bad analog sources, contrast can often help. It gets your whites white and your blacks black, but leaves the middle of the range relatively untouched.

Okay, gamma. Gamma has been a big source of pain for years, as I mentioned before. Gamma is kind of the inverse of contrast. Contrast affects the extremes; the closer you get to the middle of the range, the less effect it has. Gamma is the reverse: gamma controls the brightness in the middle of the range but leaves black and white alone. And gamma is a useful filter in its own right. If you've got a video that seems a little bit dark, raising brightness is not a good way to make it look brighter, because it's going to raise your black floor and you're going to have washed-out blacks. Raising gamma is how you make video look brighter, because it makes your midtones brighter--which is what you care about--but leaves your black and your white alone, assuming they were already right. So my typical workflow, if a tool does it in the right order, which is brightness, contrast, gamma: I'll get my brightness and contrast nailed so my black point and white point are right, then I'll use the gamma filter to get the midtones looking right.
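As a rough sketch of that distinction, here's a standard power-law gamma adjustment on an 8-bit value -- note the endpoints stay fixed while the midtones move (the exact curve any given tool applies is an assumption here):

```python
def adjust_gamma(value, gamma):
    """Power-law gamma on a 0-255 value: gamma > 1 brightens the
    midtones, but 0 stays 0 and 255 stays 255 -- unlike a brightness
    offset, which would shift the black floor up too."""
    normalized = value / 255.0
    return round(255.0 * normalized ** (1.0 / gamma))

print(adjust_gamma(0, 1.2))    # black stays 0
print(adjust_gamma(255, 1.2))  # white stays 255
print(adjust_gamma(128, 1.2))  # midtone comes up brighter than 128
```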

Some tools do gamma first; that makes it a lot more difficult, because things happen in the wrong order. So I mentioned the platform difference before. The good news is that with modern codecs it's kind of going away--it's automatically corrected for--but not in all codecs. So, ugly. Okay, so luma processing is kind of complex. Any questions so far on the image processing kind of stuff? I will show you some samples of this later on.

Hopefully I can contextualize it a little bit more, but is everyone with me so far? Or too shy to admit otherwise? Okay. This is a little bit confusing, and we will look at some samples of it. Noise reduction. Noise reduction filters try to remove noise. I mentioned before, compression is all about trying to communicate, and we're trying to suck an elephant through a swizzle stick here. We don't have enough bandwidth to do a good job. Let's say one of the goals of compression is that you're trying to achieve balanced mediocrity.

You never have enough bits to actually do it perfectly, but you're trying to find the right balance of imperfect that gives you the best results for what you're trying to accomplish. Balanced mediocrity. Noise reduction is one of our tools to achieve that appropriate mediocrity. We're trying to filter out stuff that's noise, especially video grain. Video grain is a real killer because it's different in every frame. It's basically a random pattern of distortions of the video that's different from frame to frame. So it looks like motion; the codecs can go crazy trying to make sure they draw this little pixel here that moved over there, but it wasn't the same pixel. It's just noise.

And codecs internally try to filter some of that out, but it often helps to do some to begin with. All noise reduction algorithms will cause some blurring. The best ones do a good job of only blurring things you want blurred, leaving things that are actually signal alone. Some things that are called noise filters are just blurs--just a Gaussian blur. And really anything more than a very subtle Gaussian blur winds up hurting a lot more than it helps.

A few tools do what's called temporal processing, which actually compares frames against each other. In general I say be very wary of noise reduction. If you have clean source, no noise reduction will typically be required. It's really where you have analog source with problems that noise reduction can pay off. And the uglier, more horrible your source, the more it can help. The problem is, the uglier and more horrible your source is, it'll never look good. You're just doing shades of terrible, unfortunately. So I've got a sample later--I have the infamous ugly VHS source file we'll look at in a little bit. And you're going to see me do a very happy jig about Compressor, because Compressor has introduced the feature I've wanted for so long, which is chroma-channel-only noise reduction, which is a good thing.

Okay, let's talk a little bit about audio preprocessing. Typically, audio comes in cleaner, so you're going to wind up doing less work with audio. And also, computer audio versus analog audio is a lot less different. Typically there are a couple of filters we're going to do. We're going to do audio normalization, which is: find the loudest point in the whole audio, and raise or lower the volume overall so your peak is a little bit below the maximum available. Typically, minus 3 dB is a good target for most digital formats. We don't go all the way to the maximum possible of the 16-bit audio space because some codecs do some rounding things and you can wind up with clipped digital peaks that way. So you need to leave a little bit of headroom before you go into the codec. And it's just a good idea to have that be pretty consistent. Most things like system beeps on the computer are going to be about minus 3 dB. And it's really important to have the loud things in your video match the other loud things the computer does. Because if you don't do that, what happens is you have a quiet video, and the user turns the volume up and turns the volume up and leans over and turns it up some more. And then finally hears the video and listens to it. And then mail shows up, and this thunderous bing sound happens and deafens their cat. And there's a lawsuit and everyone's sad. So you don't want people with deaf cats calling you up and yelling at you. Just make it loud. You don't have to do a whole lot of crazy compressor-limiter kind of stuff. But minus 3 dB for your peak and everyone will be happy. Yeah?
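The normalization step works out to a simple peak scan and gain. A sketch, assuming float samples in the -1.0 to 1.0 range:

```python
def normalize_gain(samples, target_db=-3.0):
    """Return the gain that brings the loudest sample to the target
    peak level (default -3 dBFS), leaving headroom for codec rounding
    as described above. Samples are floats in -1.0..1.0."""
    peak = max(abs(s) for s in samples)
    target_linear = 10 ** (target_db / 20.0)  # -3 dB is about 0.708
    return target_linear / peak

quiet_clip = [0.1, -0.25, 0.2, 0.05]          # peak is only 0.25
gain = normalize_gain(quiet_clip)
louder = [s * gain for s in quiet_clip]
print(round(max(abs(s) for s in louder), 3))  # 0.708, i.e. -3 dBFS
```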

Microphone, yeah, do the microphone, yes. I'm being a hard-ass about the microphone, sorry. Thanks for reminding me to be a hard-ass. "If you've got a low signal level that's down 18, 20 dB, and you normalize it up that much, won't raising the noise floor play havoc with the audio side of the codec?"

Yeah. Well, sucky audio is going to remain equivalently sucky at whatever loudness it ends up at. If you've got audio that's really quiet with a high noise floor, you've got bigger problems. Go get Pro Tools and spend some time with it, or, I don't know, fire your sound guy. Yeah, that's going to be a problem. Codecs don't deal well with noise; they try to filter it out. Speech codecs, actually, like AMR or PureVoice, are so tuned to voice they can actually filter out some kinds of noise, which is kind of cool. I've a couple of times actually used the Qualcomm PureVoice codec as a noise removal feature, because there was speech with noise, and anything that wasn't speech it couldn't encode, so it just left out the noise. But yeah, if you've just got a high noise floor, it's just a problem. It'll be a problem with whatever you're going to do. Nothing about compression in particular makes it a dramatically worse problem than you already had.

Other audio processing. A compressor-limiter is good. But typically, in most computer environments, with properly well-mixed audio, you're not going to need to do that kind of stuff. Most computer speakers are pretty good. You don't want to mix it for some kind of crazy AM-radio thing, because then for people with good speakers it sounds bizarre. 3GPP phones, obviously, are a different story--those speakers are pretty small, though people still use headphones. If you expect someone's going to watch on a phone, you probably do want to do some compressor-limiter stuff. Typical things--notch filters, noise removal, that kind of stuff--whatever you would do to clean up audio anyway, you would apply before compression. It's somewhat more useful for compression just because it'll encode a little bit better. But still, if something sounded too bad to listen to before you compressed it, it's not going to get any better for the most part. Typically, it won't get a lot worse either.

All right. Yeah. Microphone. "Okay, before you go to the demo--you were talking about preprocessing. Could you suggest something about preprocessing in a live situation? What do you do? Because here you are using Cleaner." Yeah, preprocessing in a live situation. Well, you're not really--I mean, yeah, it is preprocessing in that case.

You can't really do it "pre," but yeah--in the sense of before the encoder, right? Definitely one thing: if you're shooting something live, you want to shoot in progressive mode, so you don't have any interlacing you need to deal with. One issue you can kind of see: if you're shooting with a DV camera and there are black bars on the left and right, that's part of the video signal, so hopefully you're using a software tool that lets you crop. If it lets you do that, you'll crop down to the active image area. If you have a tool that doesn't do that, you cry a little cry, I guess, and move on. But when you're doing it live, you have the flexibility that you can actually just get a bunch of stuff right.

You can control the whole experience. The hard part of preprocessing is when you get stuff you didn't author, that someone shot 20 years ago before they even knew about compression, and then you have to try to make it work. To the extent you can control the entire process, you can make it work really well. So yeah: shoot progressive, use a tool that supports native cropping, do good live audio production like you normally would, and it's going to be just fine for that. The hard part of doing compression for live is actually a lot of things like camera motion. You don't want to shoot handheld and get all kinds of blockiness; really, just trying to control the production environment is where it pays off the most. So, question? "Yeah, can we go back to scaling for a second, just real quick. So you start out 720 by 480 for, say, DV video."

"And you want to serve it on the web somewhere around 320 by 240. It doesn't sound like you're always a big 4 by 3 fan." Well, that's what your source is. I mean, if you're shooting your own stuff, you can do whatever you want to. If it's 4 by 3, deliver it as 4 by 3 for sure. Sure, sure.

"Let's say it's your own stuff. So it sounds like what you're saying, if I understood you correctly, is go ahead and crop to get rid of any stuff on the edges there. And then when you scale down for serving on the web, what proportion do you think looks good, in actual pixels?" What kind of bandwidth are you looking at?

"Like, 200 kilobits?" Yeah. I mean, 320 by 240 is kind of my good default. That's what I start with. I'll encode it, I'll look at it. If you shot progressive, you have a lot more stuff to work with, so you might bump it up or down a little bit depending on the format. But 320 by 240 is generally a good starting point. "So you would crop 640 by 480 and then..." Ah, no. 720 by 480, anamorphic. So if you have a clean-aperture 720 by 480 going to 320 by 240, you don't crop at all. You shrink 720 to 320 and 480 to 240.

Yeah, I see people getting in trouble with that--they crop 64 pixels off the left and right. Because you're going from a non-square-pixel environment to a square-pixel environment, you're actually correcting for that by pure scaling, not by a crop. So you're going to crop to get rid of junk, but you're not going to crop for the aspect ratio. "Great, great. Good, thanks." There we go. I'll save you hours right there. WWDC just paid for itself, hopefully. So, any more questions before I move on to demo land? All right, can we get the demo machine here? OK, no? Am I asleep?
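That "scale, don't crop" point reduces to a quick sanity check -- a sketch computing how far an output frame deviates from the source's display aspect ratio (the talk's rule of thumb is that under 5% is unnoticeable):

```python
def aspect_distortion(source_dar, out_width, out_height):
    """Fraction by which the output frame's aspect ratio deviates
    from the source display aspect ratio. Under ~0.05 (5%) is the
    rule of thumb for being unnoticeable to viewers."""
    out_dar = out_width / out_height
    return abs(out_dar - source_dar) / source_dar

# DV NTSC: 720x480 non-square pixels displaying as 4:3. A straight
# scale to square-pixel 320x240 keeps the 4:3 shape exactly:
print(aspect_distortion(4 / 3, 320, 240))  # 0.0 -- no distortion

# Treating 720x480 as square pixels is the classic mistake:
print(round(aspect_distortion(4 / 3, 720, 480), 3))  # 0.125 -- 12.5%, visible
```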

And they told me to make sure I have my energy saver off. So that's my own darn fault. There we go. Does that look good? Are we 1024? Okay. So in the interest of being a show-off, I'm going to let you guys tell me what you want to see done, and we're going to do it. So we've got compressor, we've got squeeze, we've got cleaner, we've got Premiere, we've got After Effects, kind of the tools I'll often use for preprocessing. I've got film source stuff, I've got PAL, I've got NTSC source stuff, I've got ugly looking VHS stuff.

So we've got time for a couple different ones. So someone who's got a scenario that's kicking their butt, what do you want to see? You want to see how to make an Apple movie trailer? Like the QuickTime.com movie trailers? Is that a good one? All right. Okay, you want to see it in After Effects, you want to see it in Cleaner, you want to see it in Squeeze, what do you want to see it in?

Well, that's a clear goal there. OK, so let me show you-- do you want to see the Biker Boys trailer I did before, or do you want to see the Sinbad movie trailer? I know, it's horrible. We'll do the Biker Boys one. This has already had the output. Huh?

"So is that...?" I don't have that. I don't work for Apple. I know. I've got DreamWorks as my... I don't work for them either, but they give me stuff. Finding Nemo wasn't out yet, so I had to do these slides ages ago. So let me just show you the source file here. This is the idea we're talking about. So who doesn't have QuickTime Player Pro? Anyone? QuickTime Player Pro is the best $30 you're going to spend all year. Just think of it as a cheap magazine subscription. QuickTime Player Pro pays for itself in eight hours. So you're going to see lots of cool QuickTime Player Pro tricks here--which isn't the purpose of this class, but it's like a quarter of my lifestyle. OK. So this is the source file here. It's a typical Motion JPEG source file. It's 486 lines, not 480; this is what you get off a Cinewave or a Kona card, something like that. But it's Motion JPEG and not uncompressed because I only have a 40 gig hard drive. So a couple things here. In the preview, we can see we've got some blanking here on the left and right, and we're letterboxed. That's a 1.85 to 1 aspect ratio.

Let me just find a good scene with some motion in it--I mentioned the pattern before. Let me find the start of the shot here. Blah, blah, blah, cut, there we go, okay. So, starting at the top of this shot, I'm going to go frame by frame here. So you've got progressive, interlaced, interlaced, progressive, progressive, progressive, interlaced, interlaced, progressive, progressive, progressive, interlaced, interlaced. So we see we've got the typical pattern of three progressive frames followed by two interlaced. We know it's telecined.
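That frame-by-frame eyeball test can be sketched as a toy cadence check. This is just an illustration of the 3:2 pulldown pattern, not a real detector -- real tools measure field differences in the pixels rather than taking interlace flags as input:

```python
def looks_like_telecine(frame_flags):
    """Check a run of frames for the classic 3:2 pulldown signature:
    three progressive frames followed by two interlaced, repeating.
    frame_flags is a list of booleans, True = interlaced."""
    pattern = [False, False, False, True, True]
    # the cadence can start at any offset, so try all five rotations
    for offset in range(5):
        rotated = pattern[offset:] + pattern[:offset]
        if all(frame_flags[i] == rotated[i % 5]
               for i in range(len(frame_flags))):
            return True
    return False

# P P P I I P P P I I -> telecined film, run inverse telecine
print(looks_like_telecine([False, False, False, True, True] * 2))  # True
# every frame interlaced -> native video, deinterlace instead
print(looks_like_telecine([True] * 10))  # False
```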

So this is an inverse telecine project. Pretty easy. And it really is that easy--you only need about five frames of a video to find out what the right deinterlacing mode is. That pays off so much. A couple of things looking at this: it's moderately noisy. You can't really see it very well; there's some noise in here, but this is pretty clean--honestly, the noise in here is from the film.

Here's a very interlaced frame right there. That's what happens when you have a frame that lands in the middle of a hard cut. When you go through a pulldown, you might wind up with the two different fields being from two different frames entirely. So here, that isn't a transparency effect or something like that, and it isn't a cross-dissolve. That's just two different frames on either side of a hard cut that, as part of the telecine process, wound up in the same frame even though they weren't together originally.

So that's a very hard frame to compress. We definitely want to get rid of that kind of stuff. Okay, so that's the source. Cleaner is a pretty slick tool for preprocessing a lot of this kind of stuff. The biggest problem right now is the inverse telecine algorithm has been broken since version 5.0.2.

It got a little bit better in version 6, but you can still find cases where it'll reverse the order of frames around a cut and that kind of stuff. So you've got to carefully QA inverse telecine work done from Cleaner. But it's definitely kind of the easiest tool to visualize this stuff in. I actually keep a copy of Cleaner 4 on an old G4 lying around just to do inverse-telecine-y stuff sometimes, when I find things that expose the bug. And Cleaner has this great thing called the Project Window, which we'll use here.

There we go. So, and basically this is kind of where you define everything about your source. And we can do some things here. We can say, okay, this is 4 by 3 source. Some tools basically default to showing you in raw pixel modes. This is actually the 720 by 480 right there.

But Cleaner is smart enough to go, OK, it's actually 4 by 3, so it's going to compress it on the horizontal. So you see the correct aspect ratio. First thing I'm going to do is a crop. So I mentioned before, this is a 1.85 to 1 aspect ratio--you just look at these things a lot and you kind of learn. Whoops. Yeah, there's a bunch of numbers. Most films are going to be 1.85 to 1, or 2.1 to 1, or 2.35 to 1. You have a question? Microphone.

"Gotcha. Almost. Why has Cleaner never provided that aspect ratio as a preset, when it's such a common source--like my 1.85?" I don't know. I wrote a bug report on it. The Cleaner UI has not been substantially changed for three years, so they haven't added any of that kind of stuff in a long time. Yes, it would be obvious. I've requested that feature many times and I haven't got it. So anyway: crop, aspect ratio, 2.35 to 1. It'll basically draw me a box and constrain to that aspect ratio. And then you can kind of position it here, and center it.

It's a really nice cropping dialog; I've always liked this feature of Cleaner. You're able to go in there and scan through and make sure you're not cutting anything important off. And, yeah, that's about right--cropping a little bit of the imagery off is not a big deal. You can also do it unconstrained. If I just grab it like this, it might get a little bit distorted. Rule of thumb: any aspect ratio distortion less than 5% is not going to be noticeable by end users. So don't worry about getting exactly 1.85 to 1. If you crop 1.81 to 1 and scale to that, being a few pixels off in the scaling is not going to look distorted. As long as you keep it under 5%, it's going to look pretty good. Okay, so that's what we'll do here. Let's analyze the video. Okay, let's make up a setting here. Let's come up with a scenario. Oh, wait, we're doing Apple-style web video stuff.

So, okay. I'm going to make a new setting here: "Ben fakes Apple trailer." QuickTime output, compress--we'll skip all that stuff. So, it's cropped. For the high-resolution ones they sometimes go big, up to 640 by 480. You can specify the output so it's the same shape as the cropped source here. And 640 by 480. A kind of annoying bug here is it doesn't actually do the math right.

It rounds to its own divisible-by values. So what I wind up doing all the time is going over to the Calculator and actually doing the math myself, because Cleaner tends to get it wrong. So it's 640 divided by 1.85, which comes out to about 346.

When you're scaling, most codecs work better when both dimensions are divisible by 16. Geeky little thing; that's just the way compression works. If you choose a value that's not divisible by 16, what the codec winds up doing is encoding another 16 pixels and then not showing some of them to you, which winds up being more work that you never get to see. So I always round to the nearest 16. So here's the way I do that: I do the math.

So 640 divided by 1.85 is that number. I'll divide that by 16. Okay, 21.6--the nearest whole number is 22, times 16. We're going to set it to 352, which is actually the correct value. And there we go. So let me just do the real-time preview here. Oh, hit Apply there. Okay.
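That calculator routine is easy to script -- the same arithmetic from the walkthrough, assuming you simply want the nearest multiple of 16:

```python
def mod16_height(width, aspect_ratio):
    """Height for a given output width and source aspect ratio,
    rounded to the nearest multiple of 16, since codecs encode in
    16x16 blocks and prefer frame sizes divisible by 16."""
    exact = width / aspect_ratio          # e.g. 640 / 1.85 = 345.9
    return round(exact / 16) * 16         # 21.6 -> 22 -> 352

print(mod16_height(640, 1.85))   # 352, the value used in the demo
print(mod16_height(320, 4 / 3))  # 240, already divisible by 16
```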

So this is-- oh, look at that. I didn't quite crop it enough. That's good. It's real time preview mode. Is anyone here not familiar with Cleaner? Are there people here? OK. Stop me if I do anything that's too weird, but I'll kind of assume you have a basic idea of what's going on here. OK, so that's a little too much.

OK, there we go. So I want to make sure there's no black. I'm going to see if there's some filter in my default. Cleaner doesn't really like to be on this resolution. It really wants to be on a big monitor, so it's a little bit cramped here. Okay, image. So by default, it's going to do a deinterlace. And so we have frames with interlacing on them, like-- what do we got here? What's going on? There we go.

By default it'll blend the fields, which does an OK thing, like this. But really what we obviously want to do is an inverse telecine--in Cleaner it's called Telecine. And it's a little bit cleaner. You probably can't see it from back there, but it gives us more resolution and a higher quality output.

Sharpen filter: I recommend never using a sharpen filter. A sharpen filter can make it look better before you compress it, but sharpening adds noise and complexity to the image, and even though sharpen can make it look better before it hits the codec, at any kind of reasonable web data rate it's going to look worse after having applied the sharpen filter than if you had just left it alone. So I always turn sharpen off. This source is a little bit noisy, so the mild adaptive noise reduce is a good default filter. It's not going to hurt too much, it's not going to help a whole lot, but it helps a little bit and it's not going to be too distracting.

By default we get a bunch of default settings here which are totally the wrong ones. I don't know--the default image processing settings were kind of designed for Cleaner 5 and ported over without acknowledging the whole image change thing. So never use the default image processing filters in Cleaner, because they're all wrong in Cleaner 6. Every last one of them.

Yeah, and you're at the microphone. Thank you. "How much of a difference would it make if your source material was not telecined? I mean, if you can actually get it before they telecine it, how significant is the difference?" Okay, so basically, if I had 24p source frames, it won't make that much difference in that respect. Obviously, if the question is my Cineon files straight off the machine versus a Beta SP tape, that will matter more. But for the inverse telecine, if my choice is between a 720 by 486 series of TGA files and a 720 by 486 Kona file, it's pretty much the same net effect. It's not going to matter that much either way. The workflow is obviously better: it's going to save you time--you don't have to do this filter--and you get smaller files. You get about a 20% reduction in the data rate of the source. So it's a very nice thing, but it's not going to have much of a quality impact.

Okay, so we've got this guy here. I always use this movie--I've watched this movie like 87 billion times, and I have no idea what the movie's about, really. It's got some very cool motorcycles. I guess it's Fast and the Furious on two wheels or something like that. I never saw Fast and the Furious either.

I have two small children now, so whatever hipness I once had is long gone. Cars or something. Okay. So this is generally pretty good. For a clean source like this, I don't actually need to add any filtering at all. The one place I'll typically look is a frame that's got some black on it--I know there's all black in it somewhere.

Credits frames typically have some stuff like that. This one doesn't really have all black on it, though. Oh--the key to a great trailer is you edit out this rating graphic here, which almost always looks bad, get an EPS file of it from the MPAA, and render that into it instead. So you have no analog noise in that start part.

It saves a lot of bits right there. And so we kind of look here and see whether this is all the way black. Actually, one thing Apple did that makes me happy is Digital Color Meter. It's a great little utility, and it comes with Mac OS X.

So you can go through here and actually look at the color values. This is great--is this actually all the way black? Okay, this is actually pretty much all the way black here. There are a few things here and there, but that's flat enough. All I'm going to do here is dial in a brightness of minus one. And that should... okay. The one thing about Cleaner is the preview window does not show up when it's not the foreground app.

And--yeah, there we go. And this seems like really tweaky, goofy stuff. That's because it is. I mean, you could really spend all day going, oh, I don't know, let's try one more off. And is it going to make a big difference? Really not that much. Plus it's got some gradients here. Yeah, okay, that's good enough. And if I just turn these off, it's really not going to matter that much. Anything less than about five units is really not going to pay off. And another thing, if you're using inverse telecine-- Yeah?

"Black restore--wouldn't you use that?" Oh, yeah, the white and black restore filters. Basically, we get two values: amount and smoothness. Luma is encoded in a range from 0 for all-the-way black up to 255 for white. What amount does is say, basically: if I set 19, for example, every pixel with a luma value of 19 or below gets turned to exactly 0. So 0 through 19 becomes 0, and a 20 stays a 20. And then what smoothness does is give you a range to interpolate over, so the values just above the threshold get stretched back down toward zero and you don't have a hard boundary between the restored black and the rest. Without it, you can see right here there's a very sharp edge now between these two.

Because stuff that was near black turned all the way black, so we kind of lost our gradient there. I don't know if you can see it from the audience on this projector. Can you see the edge there at all? Okay, let me crank it up here. Okay, if you crank it all the way up to be truly bold, you get something like that.

And that's stupid, so you wouldn't do that. Black restore is one of those filters where, on the video clips where you really need it because they're very noisy, it doesn't work very well--because it's so noisy, you can't apply enough of it without it messing things up. So it's frustrating. There's this narrow range of stuff that's bad enough to need it, but not so bad that it actually works.

I don't know the solution to that problem. It's just the way it is. Cleaner's version of this filter is better than a lot of the other ones because it specifies both the amount and also the smoothness values. It lets you kind of have a transition gradient. It makes it not quite so obvious.
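A toy model of the amount/smoothness behavior being described -- the exact curve Cleaner's filter applies is an assumption here:

```python
def black_restore(y, amount, smoothness):
    """Toy black-restore: luma at or below `amount` is forced to 0;
    the band from `amount` up to `amount + smoothness` is stretched
    to fill 0..(amount + smoothness), giving a transition gradient
    instead of a hard edge; everything brighter is left untouched."""
    knee = amount + smoothness
    if y <= amount:
        return 0
    if y < knee:
        return round((y - amount) * knee / smoothness)
    return y

print(black_restore(19, 19, 6))  # noisy near-black -> exactly 0
print(black_restore(25, 19, 6))  # top of the transition -> stays 25
print(black_restore(80, 19, 6))  # normal shadow detail -> stays 80
```

With smoothness set to 0 you get the hard cutoff and the visible edge shown in the demo; the interpolation band is what hides it.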

A useful feature. Okay. Another thing with inverse telecine: you definitely want to make sure, when you're doing it in Cleaner, that the output frame rate is 23.976. Cleaner and most other tools don't support temporal resampling. So even though the source was shot at 24, it got slowed down by 0.1% as part of the telecine process to match the 30-to-29.97 ratio. So you have to use 23.976 and not 24. If you do 24, you actually wind up with a repeated frame every so often. Not normally a big deal, but you might as well avoid it. Okay. So that's how you would preprocess this. Oh, and I turned on the audio: I normalized it. And Cleaner doesn't measure in dB; Cleaner measures it as a percentage.
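The 0.1% slowdown and the cost of mislabeling it work out like this -- just the arithmetic behind the 23.976-not-24 rule:

```python
# NTSC telecine slows film by 0.1% (the same 1000/1001 ratio that
# turns 30 fps into 29.97), so 24 fps film comes back out at:
true_rate = 24 * 1000 / 1001
print(round(true_rate, 3))  # 23.976

# Labeling the inverse-telecined output as exactly 24 fps mistimes
# it by one frame in every 1001 -- hence the periodic repeated frame:
drift = 24 / true_rate
print(round(drift, 3))      # 1.001
```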

The slider at about 90% is about the same as minus 3 dB. So that's how you do the preprocessing. And I completely ignored the encoder settings for all that; that was just the filter settings, not the codec stuff. So Cleaner 6 is a very efficient workflow for getting this kind of preprocessing done. On really hard projects, a lot of times it's After Effects, which is: if you want to spend ten times as long to make it 10% better, After Effects is your tool of choice. But for most stuff, if you can do it in Cleaner, it's nice to get it done in Cleaner.
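Cleaner's slider is just a percentage, but the peak normalization underneath it is simple math. A sketch, with my own names, assuming floating-point samples where 1.0 is full scale:

```python
def normalize_peak(samples, target_db=-3.0):
    """Scale the clip so its loudest sample sits target_db below full scale
    (0 dBFS). -3 dB works out to about 70.8% of full scale."""
    peak = max(abs(s) for s in samples)
    if peak == 0.0:
        return list(samples)                  # silence: nothing to scale
    target = 10 ** (target_db / 20.0)         # dB -> linear amplitude
    gain = target / peak
    return [s * gain for s in samples]
```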

Yeah? Actually, on that same thought: what's your opinion of Apple's new Compressor, and how does it compare to Cleaner? Apple's new Compressor... let's look at it, let's do a Compressor walkthrough. Okay, that's a good one. I'll show you. Let's do the same clip in Compressor and you can compare how it's different.

I love this. So I've only had Compressor now for about 48 hours, and I had the flu for some of those hours, so we'll be learning this together. Compressor is bundled with Final Cut Pro 4, but it's usable as a standalone app. You can either export to Compressor straight from Final Cut, which is a very nice integration, or you can just launch it as its own app, which is kind of cool. So its workflow is reminiscent of Cleaner in the good ways. Which is cool. It gives you a little plus button here to grab a source file, so let's grab our clip.

So we'll pick a preset. It ships with no QuickTime presets: it can do QuickTime export, but its presets are all either MPEG-2 or MPEG-4. So this is NTSC material. A couple of neat things about it. So I assign this here. It's got this kind of Cleaner-like before-and-after slider, which is cool. It's got this neat integrated crop thing here, so I can just grab these crop edges, and it's live-updating, telling me how my current settings are going to affect the output compression, which I haven't really set up yet, but it's just so cool. This program really takes advantage of the power available on a modern G4: Mac OS X, multi-threading, and all that. The live-update features are just really awesome, compared to having to change your settings, hit the Update key, change your settings, hit the Update key. The rapid turnaround in terms of trying things out is really nice. So, okay. We set some settings here. It's pretty light in terms of filters.

For the most part, it's okay. One neat feature is you can actually change the order the filters go in: I can drag this up here so the de-interlace happens first. And here's a cool thing: the de-interlace mode you want is actually called "sharp." Sharp means adaptive de-interlace. I don't know why it's called sharp, but that's what it is, and that's the one you want to use. A great tragedy of Final Cut is that it doesn't really have good inverse telecine functionality. There is this thing called Cinema Tools, which is now bundled with it. I'll show you that in just a second. It's awesome if you have the right kind of source: a source where you actually know what the cadence is, where the 3:2 pattern lies.
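For the curious, "adaptive" de-interlace means deciding per pixel whether to keep both fields (static areas, full vertical resolution) or interpolate one away (moving areas, no combing). A toy version of that idea, my own sketch rather than Compressor's actual algorithm:

```python
def adaptive_deinterlace(frame, threshold=12):
    """frame: rows of luma values; even rows are field one, odd rows field two.
    Where an odd-field pixel disagrees strongly with the lines above and
    below it (inter-field motion -> combing), replace it with their average;
    where it agrees, keep it, preserving full vertical detail."""
    out = [row[:] for row in frame]
    for y in range(1, len(frame) - 1, 2):            # second-field lines
        for x in range(len(frame[y])):
            neighbor_avg = (frame[y - 1][x] + frame[y + 1][x]) / 2
            if abs(frame[y][x] - neighbor_avg) > threshold:
                out[y][x] = round(neighbor_avg)      # motion: interpolate
    return out
```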

So what you would normally do is go through here in Cinema Tools, and among the things you can do is the reverse telecine feature. I can even conform it out to 24.0 if I want, and all this. But unfortunately, you have to actually know the cadence: either you get a telecine log telling you where the pattern falls, or you have to guess. Basically, your only option then is: okay, I've got a total of 10 possible phases of the 3:2 cadence, so I'll just try all 10 and see what happens. If your first frames of video have motion in them, you can single-frame through and say, okay, this is the pattern. Unfortunately, like most video out there, the first five seconds of this particular clip are identical frames, so I can't see any motion, so I can't actually detect it. So it's kind of a pain. This particular project I can only encode as video, because I can't use inverse telecine with Compressor in this situation. I've talked to the Final Cut Pro team a little bit about this. For their market it makes sense: they expect that if you're doing 24p, you're either coming in with a telecine log or you have actual 24p source. They're not expecting you to take 24p-produced content that's been through analog and recaptured. But I get a lot of that stuff, unfortunately.
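The "try all ten and see" approach he describes can be sketched mechanically. In classic 2:3 pulldown (the AA BB BC CD DD field pattern), two of every ten fields are exact repeats at fixed offsets, so you can score each candidate phase by how small the differences are at its predicted repeat positions. This is my own illustration, not what Cinema Tools does (it reads the cadence from a telecine log), and it fails exactly the way he describes: with no motion, every phase scores the same.

```python
def field_diff(f1, f2):
    """Sum of absolute luma differences between two fields (flat lists)."""
    return sum(abs(a - b) for a, b in zip(f1, f2))

def detect_pulldown_phase(fields):
    """fields: the interleaved field sequence of telecined 29.97i video.
    With the AA BB BC CD DD cadence, fields (2,4) and (7,9) of each
    10-field group are exact repeats. The phase whose predicted repeat
    positions differ the least is the best guess."""
    repeat_pairs = [(2, 4), (7, 9)]
    best_phase, best_score = 0, float("inf")
    for phase in range(10):
        score = 0
        for group in range(phase, len(fields) - 9, 10):
            for a, b in repeat_pairs:
                score += field_diff(fields[group + a], fields[group + b])
        if score < best_score:
            best_phase, best_score = phase, score
    return best_phase
```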

So it doesn't work out; it's kind of a pain for me. Other cool things: it's got this totally awesome noise removal filter. I think I promised a little jig for that one. Okay. This is a filter you can apply to all channels or to just the chroma channels, and that is great for VHS source.

I'll show you a sample of this in a second. VHS source is much noisier in chroma than it is in luma, so you can apply this noise reduction filter to just the chroma channels. It'll make them all blurry, but our eyes aren't very attuned to fine color detail, so it makes the chroma channels look way better without actually making the video look dull. It's just the best thing ever.
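In planar Y'CbCr terms, what the filter is doing is roughly this. A hypothetical sketch, with a crude box blur standing in for whatever Compressor actually uses:

```python
def denoise_chroma_only(y_plane, cb_plane, cr_plane, radius=2):
    """Blur only the Cb/Cr planes, leaving luma untouched. Eyes resolve far
    less color detail than brightness, so the smeared chroma hides VHS color
    noise without visibly softening the picture."""
    def blur_row(row):
        out = []
        for i in range(len(row)):
            lo, hi = max(0, i - radius), min(len(row), i + radius + 1)
            out.append(sum(row[lo:hi]) / (hi - lo))   # 1-D box average
        return out
    blur_plane = lambda plane: [blur_row(r) for r in plane]
    return y_plane, blur_plane(cb_plane), blur_plane(cr_plane)
```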

Well, that may be an exaggeration. But I've been asking for that feature forever, and they're the first ones to actually do it. When I request a feature for five years and someone does it, I've got to give them props. Other new things: the whole three-way color corrector from Final Cut is all in here. It's just kind of a weird UI for it, but you can actually do highlights, mid-tones, and shadows, and change your settings, and these are all live in the preview.

If I'm doing this right, I should be able to just drag and see the preview update, and it's actually showing the video as it will be compressed. Not just the rough preview thing like Cleaner does; it actually does a good job of guessing what the compressed output is going to look like.

Cleaner will do previews, but it doesn't get the data rate right: it compresses as if the current frame is going to be a key frame, so you wind up getting only about a tenth as many bits per frame as you really will, and it just winds up not being a very good preview. Another thing you can do: a couple of modes here, where was I, okay, so encoder, yeah, gamma correction. You can also do letterboxing. You can crop out the bad stuff, and if you're going to DVD, you can add your letterboxing back in. You'd use that to matte out any kind of noise in the blanking areas, which is a tiny feature, but really handy in my opinion.

You can scale, and you can matte things out. So anyway, that's what I'd plan on in this case. You can do a watermark and add text over it. And they have a black restore filter. It doesn't give you the smoothness control, so you wind up with those really sharp edges, but it's in there. And you can set up batches and destinations and have it drop output into a folder, all this kind of stuff. It's got a really neat infrastructure.

Unfortunately, it only does QuickTime, MPEG-2, and MPEG-4 today. So I don't see myself doing a lot of encoding with it yet, because there are important features missing. Like, you can't do two-pass encoding with QuickTime or MPEG-4, which is pretty critical for quality. But I think it's a very exciting 1.0, and I really look forward to seeing what Apple does as they incorporate other technologies into QuickTime. A better MPEG-4 codec is something I'm certainly looking forward to, and hopefully an AVC codec. This could be a really awesome tool. It's designed for nice productivity.

The preview thing works well. You can do batches. You can go back and hit History, and it will tell you what batches you've done in the past, and you can find your files, all this kind of stuff. They really sweated the details on this. It is not AppleScriptable, though. I know. I talked to Sal. Maybe soon; keep asking him for it. Everything should be AppleScriptable, all the way, all the time, and this is not. None of Final Cut 4 is AppleScriptable. You can fake it with that whole UI scripting thing, scripting the buttons through the accessibility layer, but you've got more time than I do if you do that. Anyway, this isn't a Compressor demo. This is preprocessing.

Its preprocessing is totally awesome, except it doesn't have an audio normalization filter and it doesn't have inverse telecine. If it had those, it would be all the better. But it got lots of the really hard things right, so that makes me happy too. Let me show you that thing I mentioned before about the temporal noise reduction filter. I mentioned this clip called VHS Ugly; I think we'll agree it's an aptly named file. So, you see all the colors there? No? Okay.

Yeah, you know, it's okay. So, does that look bad? Can you believe me there? This is typical of VHS: you get all this tearing at the bottom of the frame, which is hard to encode and looks stupid too, so you want to take it out. And the colors are bad, the bricks aren't really green, and whatever. When I was working on my book, I called my editor and said, "Hey, do you have any really bad source I can capture?" And he's like, "Yes." So this is like a third-generation LP-mode dub of something. When I actually got it, I found I couldn't even play it: I've got a professional VHS deck as part of my editing facility, and it can't play LP tapes. I had to go haul out my old consumer deck from my upstairs media closet; I hadn't even played a VHS tape in years. I could barely even capture this video. So this is the worst video I've ever seen. Which is what I asked for, so I can't complain, really. I was like, wow, yeah, you took me literally there, didn't you? So anyway, let me find a good spot. Do you see how she's got some color noise in her black shirt there at all? Okay, there we go. We'll make this happen. So we're going to add a preset. Just so it's pretty obvious, we'll do an MPEG-2 high-quality 60-minute encode. Okay. That's why it's cool.

So, filters, noise removal. There we go. You see the color noise right there? Good. So I can do noise removal, and I could do a really intense pass that gets kind of blurry, like this. But I can also do just purely the chroma channels. So it removes all that chroma noise and makes the colors a little more accurate there. You can see nothing, can you?

Okay, that's all right. Color precision on a trade show projector is always a battle. So yeah, I love that filter; I'm so happy to see it. I should show that letterboxing filter, because I was talking about it before and I can do some good stuff with it. Scale... center... matte, there we go.

You could, yeah, but that mix of problems is pretty rare. Probably not: it would be weird to have a clip that needs both inverse telecine and really aggressive chroma noise removal. Typically source is either really bad or really good, and film source isn't usually that bad. There are so many times where I'll use After Effects to make a master file and then use other tools to encode; I do that pretty frequently. Just the Levels filter, the whole Photoshop levels thing where you see your black point, white point, and gamma, is such a wonderful filter. I wind up using it all the time. So I use After Effects just to get the levels controlled; you can visualize the luminance so much better. But pretty much, I'll typically use one tool for preprocessing and another tool for encoding if I'm going to use two different tools. In a lot of cases, if I'm doing a big complex job: I did this job last week where I did a bunch of different formats. I made a master file in After Effects and used three different compression tools, the optimum one for each format. I used Cleaner 6 for the QuickTime file, another encoder for the MPEG-4 file, and Windows Media Encoder for the Windows Media tests. Which runs fine on Virtual PC, by the way, if I can bring it back into the Mac world. Surprisingly swift if you turn off the preview mode.

Okay. Good. So how are we doing on time here? All right, sounds good. So what else do people still want to see? What's awesome that you haven't seen yet? Who's got a problem I haven't solved for them yet?

Microphone? Oh, sorry. I'm not going to be the hard-ass; I'm overcompensating. Let's see some Squeeze? Let's see some Squeeze. Squeeze has got a pretty slick preprocessing UI. Its preview isn't that great, but if you know what you want to do, it does the job quite simply. The problem is it's hard to tell if it's doing the right thing or not.

So here I'll just do the same darn video over and over again, because such is my life. Every now and then my wife will tell me, "I cannot hear that trailer coming from the basement ever again. You must get a new clip." That's when I get new samples.

I was working with this Smash Mouth music video of "Why Can't We Be Friends"; I used it for like two years. That's when she said: either the trailer goes or I do. I thought about it for a while, but... the trailer went. Okay, so Squeeze has just one preprocessing dialog; all this stuff goes on in there. It has inverse telecine. Like Cleaner, it requires you to specify what your field order is, which I always think is lame: for a tool to expect people to know the field order of their source. That kind of thing is in the file; QuickTime has a tag for it. Don't ask the user that question.

You can figure it out on your own. And also, don't call it field dominance, because field dominance is something else; it's field order. But anyway, in this case it actually is a lower-field-first source file. I just happen to know. It's got a little noise reduction thing, which is fine.

It's got this preview module that's kind of hard to work with. This is a high-resolution source file, and it's kind of hard to find the crop dialog, too, and you can't really lock the aspect ratio. So I get kind of frustrated, because I know I've got so many more pixels here than it's letting me look at, so it's hard to know if you actually got the crop exactly right, and the handle you grab is kind of hard to get hold of. Little things like that. But it does the job. There we go, something like that. But because I'm only seeing a one-third-resolution preview, I can't really tell if I got it all.

So I have to be a little more aggressive on the crop in Squeeze. It doesn't give you any numbers. You can do a fade in, you can do a fade out, you can normalize audio, just with no controls. So Squeeze kind of has the 20% of the features you use 80% of the time. For lots of projects, Squeeze is just fine for what I need. But if I need to go more esoteric, I'll go to Cleaner or After Effects or ProCoder to do the processing. It's got video noise reduction; light mode works pretty well. You've got all your filters. And it doesn't do live update while you drag, but it does when you let go.

Yeah, so that's pretty much it for preprocessing in Squeeze. It's kind of weird, because you specify how it's processed here, but it's actually in your output setting where you define the output frame size. It does one kind of cool thing: if you type in a frame size with a weird aspect ratio, say I want 640 by 480 but I cropped to a different ratio, it'll actually add letterboxing to make it right. I want to turn that feature off, because like I said, if I'm off by a couple percent I'm not worried about it. I don't want a three-pixel black line added to the top and bottom of my video, because that's going to make it harder to compress. So I'm almost always going to turn this off and just do the math to make sure things match up right. So, 640 by 352 like before. There you go.
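The math he's doing by hand is just this; a hypothetical helper (my own names) showing why a slightly-off aspect ratio produces those useless few-pixel bars:

```python
def letterbox_bar_height(out_width, out_height, content_aspect):
    """How many rows of black a Squeeze-style auto-letterbox would add to
    the top and bottom when the output frame is taller than the cropped
    content's aspect ratio calls for."""
    content_height = round(out_width / content_aspect)
    if content_height >= out_height:
        return 0                      # content fills (or overfills) the frame
    return (out_height - content_height) // 2
```

Put a 16:9 crop into a 640x480 frame and you get honest 60-pixel bars; crop to an aspect only a couple percent off your output size and you get a pointless 3-pixel line top and bottom, which is exactly what he's turning the feature off to avoid.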

Unlike Cleaner, where you customize the filter settings on a per-output basis, in Squeeze you do it on a per-input basis. That's better in most cases, when you're doing the exact same thing for every output; in Cleaner it's kind of a pain to keep replicating the same preprocessing across, say, five settings, and if you get one of them wrong, you have to go fix it five more times. But when you actually want to customize it for each output, Cleaner's kind of control is a little handier. So, okay, that's Squeeze for you.