
WWDC06 • Session 507

QuickTime Streaming for IT

Information Technologies • 1:07:48

Take advantage of QuickTime Streaming Server and QuickTime Broadcaster, both built in to Mac OS X Server. See a number of real-world streaming solutions using QuickTime services in conjunction with third-party tools. This session will help you get your internal stream for training or external stream for financial webcasts up and running in no time.

Speakers: George Cook, Steev Dinkins, Dave Schroeder

Unlisted on Apple Developer site

Transcript

This transcript was generated using Whisper; it has known transcription errors. We are working on an improved version.

Welcome to QuickTime Streaming for IT. My name is George Cook. I'm a consulting engineer for Apple, and I work in our education group. And today, I'm going to introduce a couple of other speakers. But before I do, I just wanted to mention a class that we recently developed, Podcasting and Streaming Internet Media. So I did bring the course guide up here. It's a three-day, hands-on class offered by our customer training group.

And it covers a workflow for streaming media, using the podcast server as well as a streaming server for delivery. If you go through the class and take the certification test, you get three credits toward Apple Certified System Administrator certification, and more information is available at that URL.

So that's a more detailed version of what we're going to cover today. Today we're going to focus on some enterprise-level deployments of streaming. One is by Apple, so this is Apple using our own technology. Steev Dinkins is going to present our webcast studio, which is a very sophisticated environment for delivering live webcasts and also recording those webcasts for on-demand access.

And then Dave Schroeder, this will be his third year here at the Worldwide Developers Conference, and he's going to give us an update on the University of Wisconsin's Digital Academic Television Network, which is an innovative project to bring television to their IP network at the University of Wisconsin. So with that, I'm going to turn this over to Steev, and he's going to go through in detail the webcast studio that Apple uses internally.

Thank you, George. Hi, I'm Steev Dinkins, media producer/engineer at Apple, and I'm here to talk to you about the webcast studio system that we custom-built at Apple for sales training. I've been at Apple since 2002, with 20 years of music production and audio engineering experience and 15 years of video, web, computer, and multimedia production experience, ranging from camera work to cinematography, motion graphics, video editing, DVD authoring, you name it. I'm in creative services within sales training, and our productions range from high-end video to broadcast, webcasting of course, DVD productions, and print and web publishing. All in-house at Apple for sales training.

On to the webcast. So the webcast program was kicked off in 2002, and initially we were using a third-party solution for it. That company got bought and dissolved, and it was no longer supporting or developing that software. So we started researching. I should also add that that software always had limitations that hindered us, such as presentations being limited to still JPEGs; we couldn't do any kind of motion or anything in that slide presenter window. So the research went into how to replace this system.

So we went through pretty much all of the third-party solutions for web conferencing and net. Actually, I'm sorry. We went through actually net conferencing and webcast meetings. And all of them were falling short in some kind of regard. Either they were too complex or they were overbuilt or underbuilt. And they didn't have the paradigm of broadcast, which is the closest to what we do with webcasts.

There was also the high costs involved with those solutions. And high costs with setup, high costs with yearly fees, monthly fees, or the worst case scenario, with per webcast, per user, per minute fees. And we didn't want to get locked into those things. And we figured if we could take that money and build something in-house and not be locked to those kind of fees, we'd be winning.

So we took six months to develop and research. And we found that just using Apple technology as well as third-party hardware and software, we could develop a system that would be suitable for our uses. And six months later, we went live earlier this year with that system. So hopefully with what I'm going to show you, you get some ideas on how you can use it. And hopefully you get some ideas on how you can use the technology for your own purposes or for your clients' purposes.

So agenda, what I'm covering today, going over the requirements and objectives that led to the choices we made, going over the functionality of the webcast viewer, what the audience sees, as well as the admin app, what we use to configure and schedule webcasts. And then we'll go over the front-end technology, which is behind those two applications, WebObjects and Apple Server technology, as well as the back-end, which is all video, audio, and Apple computer technology, which you'll see in the studio.

I'll also be talking about software integration; we're using a lot of different software, Apple and third-party, working together as a solution. And finally, the production workflow: how we're using this on a daily basis to produce webcasts. And I'm going to go forward. So, some of the objectives and requirements that led to this. Primarily, it's to train the entire Apple sales force, and this is also spilling into retail soon.

Of course, this webcast system is a real-time, QuickTime-based video streaming internet communication system with chat functionality going into the studio, for distance learning and training. As for the cost savings: we get audiences of 100 to 500 people per webcast. Then you think of the savings in travel time, travel costs, not having to rent out conference spaces, hotels, all of that. The savings are huge. Now multiply that by three to five times a week, monthly, yearly, and you can see the enormous cost savings there. And it's been an integral part of the program since 2002. Timeliness: we don't have to wait very long to get communications out to people. We don't have to schedule months in advance; we schedule weeks in advance, or maybe a day in advance.

Next, computer output streamed in real time. This is a huge improvement over the old system; that's what I was alluding to before. Before, we only had JPEG slides, and now we have real-time computer output streamed to the audience: Keynote transitions, animations, builds, QuickTime movies embedded inside of Keynote, as well as software demos.

Anything that's coming out of the computer, we can stream out to the audience. Minimal user setup requirements: we want to make sure that this is an easy experience for people logging into webcasts. It really fits in with the same kind of situation people are used to: logging into websites and seeing something immediately.

We also need a manageable production workflow. For the past 10 years, I'd say, syncing slides and video, you'd think it would be easy. It's not. It really isn't. So an objective for this system was: how do we make this really easy, so it's not painful for us, and we can do a whole lot of other things as well as these webcasts? So we have minimal setup time, and we have reliability and a lot of flexibility with this system.

And of course, we're talking about live webcasting, but the on-demand processing is also key. So you'll see how we have a workflow that's really simple to create on-demand content, but we can also use the captured standard HD and DV video to do post-editing. Maybe we can repurpose this stuff if that's requested, as you'll see later.

So I'm going to go over the functionality here and show you a screenshot of the webcast viewer. You can see there's two streams combined there into one QuickTime movie and a chat field at the bottom. And I'll go over this a little bit more in a couple slides.

And here's a demo of the webcast. Welcome. Thank you for joining us today for the post-NAB update. Just to underscore that fact, every major broadcaster throughout the world produced a production on Final Cut Studio last year. And not only that, but every major movie studio also released a production that was based on Final Cut Studio last year. Now, if I go over to Audio Effects, In the upper right-hand corner in the pane, I'm in my editing pane, and I'll zoom up. So within audio effects, we've added some really neat things within iMovie 6.

As you can see, we've got animations happening within Keynote. We've got a product manager demoing software. And we're thrilled to have this functionality, and it's easy to manage. So I'm going to break down the viewer here. It's simple, but it's powerful. We've got video of the presenter on the left side there. It's QuickTime 7 H.264, 240 by 180 pixels, approximately 150 kilobits a second.

Over on the right side, we've got the larger window with the computer output streaming in real-time at 640 by 384, which is a healthy size. It's really nice. You can see all the print on there. People don't have to redo their slides for this smaller window because it's pretty healthy. And that's at 200 kilobits a second.

Now, the aggregate total of that data rate, we've been shocked to see that it actually goes down to about 150 kilobits to 200 kilobits total. And that's astounding thinking about the amount of pixels we're pushing there at 15 frames a second per video stream and the quality that we're achieving.

We've got title graphics on the left there for the product line title or the program title, as well as the description of the webcast and a date. And at the bottom there, we have a chat field that the audience can type in questions straight in real-time to the presenters, so presenters can tailor their presentation towards the audience's needs.

And the requirements are really simple. You just need a web browser and QuickTime 7, and that's Mac or PC. And you need a broadband internet connection. 56K we're not supporting. Thank God. Okay, so usage. It's really easy. Users just log in to a front portal URL. All they need is a webcast ID and a passcode, and they get that through an email invite or a published webcast invite on internal web portals.

And there are three optional fields in there for their name, email, and location. That's a really huge value-add for presenters. They can follow up with people with questions that they didn't have time to answer during the webcast, or if they wanted to take a question offline, one-on-one with somebody, they can do that.

On to the webcast admin. This is the application we use for scheduling and configuring webcasts. I'll go over some of the features really quickly. As soon as a webcast is booked and scheduled, we can go and create that webcast with a date, broadcast ID, passcode, time, title, presenter information, etc. We also have a field for a QuickTime URL that can be a unique URL per webcast, and we also have dimensions we can define in there, because if it's widescreen or 4:3 aspect ratio, we can accommodate that.
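To make the admin record concrete, here is a minimal sketch of the per-webcast configuration just described. The real app is a custom WebObjects application; this Python dataclass, and all of its field names, are illustrative assumptions, not the actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class Webcast:
    """One scheduled webcast, as configured in the admin app (hypothetical fields)."""
    broadcast_id: str                    # what viewers type at the portal
    passcode: str
    starts_at: datetime
    title: str
    presenters: list[str] = field(default_factory=list)
    quicktime_url: Optional[str] = None  # unique stream URL per webcast
    width: int = 640                     # requested dimensions; widescreen or 4:3
    height: int = 384
```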

Next is the session info. We're able to see the number of users who have logged in. That's imperative for us to know that everything's working right. We see the numbers grow, we're happy. We can also see who has signed in, especially if they filled out those optional three fields. We also have a transcript of that incoming chat, and I'm going to show you screenshots of what that looks like.

So we've got a screenshot of the session information. As you can see, we've got the name of the session, the user location, and the email address. And here's a screenshot of the chat transcript. You can see the name of the user, their location, and the chat that they entered. Now, both of these, the session info and the chat transcript, we export out to a text file and give to presenters after the webcast, so they have a record of everything.

On to the front-end architecture, which is behind those two applications you just saw. It's all built on Apple server technology, and the viewer and admin are custom-coded WebObjects applications, coded in-house by a guru mastermind by the name of Russ White. Thank God for him. It's incredible what he's done with this. So two Xserves are driving those two applications, one for the webcast viewer and one for the admin app.

And we're also deploying Xserve, Xserve RAID, and Xsan for capturing the HD and DV video to centralized storage, and that's been a tremendous lifesaver, running Fibre Channel from the server room to the studio. And this is really brief, so I'm going on to the back-end architecture, which I'll go into more detail on. So there's a screenshot of the webcast studio, modest and powerful. There's a shot of a webcast in session.

There's another screenshot there showing the mixer. And there's a shot from the presenter side. We've got three machines there for Keynote presos, and then there's a computer on the right side there for seeing incoming chat. So there's a diagram of the system. I definitely needed to put this in a visual form in order to even configure or wire this thing up. It looks deceptively simple to me now. I don't know about you, but it's complex.

Okay, so on to the gear, breaking this down into pieces: video gear, audio gear, and computers. So first, video. We've got three standard-def cameras and various video sources. We've got a graphics PowerBook, actually now a MacBook, and that's running CG or titles into the system. We also have a DV tape deck for remote presenters who might be recording their presentation remotely and sending that DV tape to us, so we can feed it into the webcast system. All those sources go to a video switcher, so we can switch between any of them, and we have preview monitors so we can see any of those sources independently of the program output.

On to the audio gear: we have several microphones, computer audio from the presenter machines, as well as a phone interface for remote presenters or interviews. As well as an iPod: we have music playing before the webcast starts. And it's not just to be pretty; it's so people know that they've logged in successfully and something's happening there. And it gives us a point of reference for establishing integrity on the stream.

We have a headphone monitor and room speaker, so we can hear the remote presenter on the phone. And on to... streaming machines, actually. That's a bummer, a slide is missing. I'm going to refer to the photo here, because that's what I'm talking about. So on the right here, it's almost chopped off, there's a chat monitor, as I said before, where we can see the incoming chat. That's using iChat, and it's using a Jabber server to relay all those messages from the webcast viewer, from all the people, into the studio, where we see it right there.

We also have the presenter machines, MacBook Pros. Usually, we just have one Keynote preso pushed via Apple Remote Desktop to all those machines, and we have this exact clicker, and we advance slides on all three machines simultaneously. So we can have people spread out, and they can all still see the Keynote.

We have a VGA switcher at the bottom there. So if we wanted to configure it so we have a presenter machine and a software demo machine, we can switch between those, or have different presos on all three if they didn't combine them in advance. So, where is that VGA switcher going? It's feeding that computer signal into an HD scan converter, converting it into an HD video signal.

And you can guess where it's going from there. It's going into a streaming machine. So we have a Power Mac G5 with a Kona card for HD input, and that's streaming out to Akamai to reflect outside the firewall to the world. We also have another Power Mac G5 that's taking the presenter video off that switcher, and it's converting it to a stream out to Akamai as well. So there's the two streams: computer output and the video of the presenter.

So, capture machines. The question is, why do capture when you have encoded streams that are already there and you could just go and trickle those and record them to disk? Well, in our opinion, that's not good enough. We're Apple, and we want to set a standard here. So we're using Compressor on the HD and DV so we can get better quality there. It also enables us to do post-processing in Final Cut if we want to do editing on that. You wouldn't be able to do that with the same kind of ease and quality if you just had the H.264-encoded QuickTime movies.

So that's all captured to Xsan, which again is a beautiful workflow. On to software integration: we're using a number of software applications all working together. The viewer and the admin, as I said before, are custom-coded Apple WebObjects. For streaming, we're using Vara Software's Wirecast, and there are a number of reasons. We're cropping prior to streaming: DV has the infamous black bars on the left and right, and we get rid of those.

And the HD signal usually has a little border around it. We crop it out, clean it up, and stream it. It's Kona card aware: Wirecast can see the Kona cards. It's a beautiful thing. And we're able to do source switching with Wirecast. So oftentimes presenters are messing around with slides five minutes before we go live, and we don't want the audience to see that. So we switch out to a title graphic or switch to black.

Capturing, we're using Final Cut Pro, of course, capturing the HD and DV video straight to Xsan. And for post-production, we're using Final Cut Pro to export ref movies and using Compressor to do the final encode. So, packaging. Some people are probably wondering, how are we packaging the two streams together? We're using Totally Hip's LiveStage to do that.

So, two streams and a graphic element combined together for an embedded webcast viewer for live. And for on-demand, we're also using LiveStage to create the on-demand version. So we take the Compressor-encoded QuickTime movies, tie those together, and it's done. So, production workflow. You've seen the behind-the-scenes. You've seen the technology. How are we using this to produce webcasts on a daily basis? Well, from the start, it's been planned to be strategically simple.

So, live: we just prep the QuickTime movie with a unique graphic and a unique name. Get the Keynote presos pushed out to the MacBooks. Lights, camera, decks, computers, iPod: get all that ready. Begin the streams simultaneously to achieve sync, and test those streams for sync integrity and audio-video integrity. Start the capture machines and go live. And the post is even simpler. We use Final Cut on the synced HD and DV to set the in and out points.

Export those out as two ref movies, throw those into Compressor, and encode to H.264. And then take those movies, combine them together with LiveStage, and post it to internal or external websites for on-demand viewing. In conclusion: we had a lot of needs that we knew from the last three and a half years of using a product that kind of worked, but it really wasn't anything that was going to wow anybody, and we had a lot of customer and client complaints from the audience. We took all of that knowledge and research and applied it toward creating our own system in-house, with Apple technology and third-party hardware and software, that was engineered, tested, and deployed.

And right off the bat, the audience response was resounding applause. Before, we had complaints every single webcast of slides not coming in, not updating, not being in sync, whatever. That's all gone. Webcast adoption is at an all-time high; we're booking webcasts up to five days a week. The Keynote drop-in is an awesome feature. Presenters don't have to dumb down their slides, and they don't have to alter them. They can just bring in their slide deck the way they're used to presenting it, and there are no limitations on the system for that.

And software demos: we've had product managers in demoing everything from iLife to pro apps like Shake. So expectations have been blown away on this thing, and it's been a lifesaver. We're able to crank these things out, and it doesn't bog us down. We're able to do a lot more creative things within the department. And this technology is just super cool: seeing video, audio, server, software, and computers all working together to achieve this goal.

And I have to mention QuickTime 7 being at the heart of this, because without the efficiency and the quality of QuickTime 7 H.264, there's no way we would have attempted this. So that's it, and I'm going to hand it over to Dave. So if you're blown away by that, just wait to see what Dave has in store for you. Thanks a lot, Steev.

Well, I have to say that Apple's webcasting stuff is really cool. University of Wisconsin's actually done a webcast through their webcast studio before, too. And we had really good results with that. So I'm Dave Schroeder. I'm a systems engineer at the University of Wisconsin. And today we're going to be talking about an IP-based video delivery system that we developed.

And this has been something that I've talked about here at WWDC for the last two years. So this is our third time presenting this here. And we started this off as a pilot, and this turned into a production service that has been serving the campus for the last two and a half years and is going to be really expanding going forward. So some of you who have heard me talk before, some of this stuff is going to seem like you've heard it before, and that's because you have. But for the benefit of those who haven't, we'll just briefly touch on some of these things.

So we had been operating a cable television infrastructure that was using traditional coaxial cable for years. And we actually had two. One was called the Residential Television Network, which delivered a full complement of normal cable channels to our dorms, housing, and some of our remote housing locations, but this was all university-owned housing.

And the other is something called the Academic Television Network, which was a network that we ran on campus that had some specific university content. So we had channels that the colleges and departments and different university units could actually put on it, and then we had some local cable content.

So we had a selection of cable television channels available on it too. And as we were going forward, we wanted to explore what options we had for transitioning this to an IP-based system. And it turned out that after everything we went through, Apple server and software technologies were a good fit for delivering this solution. And one of the big things that was important to us was being standards-based.

So a little bit about the University of Wisconsin. The University of Wisconsin-Madison is one of 26 UW System schools. We're the largest, we're the flagship campus, we've got a lot of people, we have a lot of buildings, and we have a pretty big central IT component, but that only represents about a quarter of the IT staff on campus. So we're very decentralized when it comes to IT. But one thing we were centralized on was our network, and that's one of the things we leveraged going forward.

We already have a lot of Apple technologies that we use, like Xserves and Xserve RAIDs and whatnot, in mixed environments. In the center picture there you see some Suns mixed in with some Apple equipment. And because we were already using Apple technologies, we were fortunate to be able to have some of the decision makers on campus consider this as a solution too. So this is what we're going to talk about today.

I already talked a little bit about ATN. And it turns out that one of the hardest things about deploying this system wasn't the technical part. It was actually getting an agreement with a content provider to be able to provide channels over an IP network. That's something that a lot of people really balk at, and that's something our cable operator was a little bit skittish about too.

But we used the agreement that we had with our cable operator, where we were providing this selection of channels on our old academic network (that was the network we ran ourselves), as the basis for putting them on our IP network. We kind of made the argument that we were already distributing these things on a coaxial network.

We just want to change the media that we're using to distribute them. And it's kind of interesting to look at the history of what brought us to this point. So those of you who have been in IT and networking for a while will kind of appreciate the history here.

And when we started out back in 1980, and that was before my time at the university or really any kind of school at all, we started out with serial connections to the UNIVAC 1108 in the academic computing center on campus, and quickly realized that the requests for connections to this would fill all the available conduits in the computer science building. So we went to an RF-based system that was very similar to a cable television system. And it didn't take long for departments to start asking us if they could also share that network for video purposes as well.

And then the network grew and evolved over time until we got to where we are today, which is a 10-gigabit Ethernet backbone. And that's the basis for what we call the 21st Century Network, which was funded in part by a very generous donation from John Morgridge, who's the chairman of Cisco Systems and also a UW alum.

And the 21st Century Network was really our answer to one of the greatest networking challenges we face, which is how to meet exponentially increasing bandwidth demands without exponentially increasing budgets. And also how to manage provisioning a network to 30 major departments, the 30 departmental LANs that go along with them, and then a whole slew of smaller departments and units and remote research stations and all sorts of different things.

And so these were our solutions. The bottom line here is that we wanted to converge everything to IP, which meant getting rid of things like IPX and AppleTalk, which we finally turned off just a few weeks ago. It also meant reining in all the different management models people were using on campus for managing the network, centralizing those, but still delegating the ability for people to administer the network.

And using as many automated tools as possible. One of the big things we did was change the revenue model. The old model was based on bandwidth, so we had people running entire buildings off of half-duplex 10-megabit Ethernet. And that was fine, because all their people were doing was checking email and browsing the web, and they didn't see any reason to pay anything more for it.

And we decided that it was important to make high-speed, high-performance managed network access ubiquitous across the entire campus, so that it wasn't a budget decision that determined what kind of bandwidth people had, and thus what kind of capabilities people had for things like video conferencing.

This also means we can afford to upgrade the network as time goes on, because this is a cost that's assessed on an ongoing basis. And one of the other critical things for the video stuff I'm going to be talking about today was support for IP multicast.

That essentially allows one host to deliver something to many hosts with only the equivalent impact on the network of one stream. So as more and more hosts are added, they just listen in and it's kind of like a tree branching out. But there's not a separate broadcast or an additional load on the server or whatever the source is for each additional broadcast. And for more information, multicast.internet2.edu is a good place to start.

And this is a little diagram of essentially what multicast means. So A here would be a server. In a unicast model, if you have three clients listening, there are three streams that go out. And if there are two clients behind that R3 router, or other network device there, it has to have two streams going to it, or 20, or however many things are behind it. In a multicast model, there is only one stream going out on any given branch to support clients, and if there are no clients listening on a particular branch, then there's nothing there at all. So that's the difference between unicast and multicast, and that was really important for us.
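To make "listening in" concrete, here is a minimal Python sketch of a multicast receiver; the group address and port are made up, since the talk doesn't give the actual addresses. The key point is the IGMP join: it is what grows the branch of the tree toward this host, and no extra work lands on the server.

```python
import socket
import struct

GROUP = "239.255.12.42"  # hypothetical campus-scoped multicast group
PORT = 5004              # hypothetical port for one channel's stream

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Ask the OS (via IGMP) to join the group on the default interface.
# Joining makes the branch toward this host appear; leaving tears it down.
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

while True:
    packet, addr = sock.recvfrom(2048)
    print(f"{len(packet)}-byte packet from {addr}")  # a real client feeds a decoder
```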

[Transcript missing]

We're also able to reach clients other than Mac OS and Windows. With open-source products like VLC, we can serve clients running on Linux, Solaris, and anywhere that product runs. And cost was another important factor. When we were first rolling this out (and of course costs have come down a lot on various products over the last three years), some of these individual hardware encoders were upwards of $25,000 and even $50,000.

And now, today, we can have multiple streams on a single server as the computing horsepower has increased. And this is even more true, I mean, I said this when I first talked about this in '04, and it's more true today: the cost of the Xserves has come down, or their performance has gone up, and we get more bang for the buck. And the software: QuickTime Player, free. QuickTime Broadcaster, free. QuickTime Streaming Server, included in Mac OS X Server.

A lot of these QuickTime capabilities are just kind of a bonus that comes along with using QuickTime, like the QuickTime text track, and skins, which allow for things like custom interfaces, which we actually don't use right now because we have a separate application for playing channels that I'm going to show you later. QTKit on the Mac OS X platform allows us to really easily write these custom applications.

And because of the granularity of the system, because it's not some kind of a turnkey or dedicated solution, we can plug different pieces in and change around how we do things. And it allows us to do some of the really, I think, innovative things that you're going to see later.

And so, what did we start with when we initially deployed this pilot? It's kind of interesting, because our production system is actually running on the same equipment that it has been since the first time I talked about this. So we have a head node that's responsible for the web front end for the service, which is the primary way customers visit the service to start watching TV if they're not using the player application.

This node also serves the function of monitoring the streaming nodes with Server Monitor, and it has its own local monitor, keyboard, and mouse for local administration. And then we have a bunch of Xserve cluster nodes that are actually our streaming nodes. They run QuickTime Broadcaster and our services for doing closed captioning decoding, closed captioning archival, and of course the video and audio streams themselves.

And we have a bunch of support equipment with each stream, because we actually get the channels via analog cable from our cable operator, use tuners to tune them, and throw that into a FireWire video converter (we're actually using some of the Canopus ADVC products to do that). And then it goes into the streaming node via FireWire.

The software we're using is really pretty straightforward: Mac OS X Server, of course, QuickTime Broadcaster, and QuickTime Streaming Server. We use Streaming Server for some specialty applications, but QuickTime Broadcaster has the ability to send streams to multicast right out of itself, so we don't even need to use Streaming Server for anything.

We have one broadcaster instance running on each streaming node, and it sends it out to a multicast address that's valid for the entire campus. And whether there's one or a thousand people watching a channel, it's the same load on the server. There's just one stream coming out of the server. So it's really zero or one, no matter how many clients are watching.
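The sending side is just as simple in principle. QuickTime Broadcaster does the real RTP work, but this companion sketch to the receiver above (again with made-up addresses) shows the zero-or-one property: the node emits exactly one copy of each packet no matter how many clients have joined the group, and the TTL is what scopes the stream to campus.

```python
import socket
import time

GROUP = "239.255.12.42"  # hypothetical campus-wide multicast address
PORT = 5004
TTL = 32                 # hypothetical scope: far enough to cover campus

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# TTL limits how many router hops will forward the stream.
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, TTL)

while True:
    # One send per packet, whether 1 or 1,000 clients are listening;
    # the network, not the server, fans the stream out.
    sock.sendto(b"\x00" * 1316, (GROUP, PORT))  # placeholder payload
    time.sleep(0.01)
```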

And we use Apple Remote Desktop, of course, for administration, and Server Monitor, and we actually have an HP OpenView infrastructure that our Network Operations Center uses for monitoring hosts and services in our data centers, so OpenView's in the mix too. And then on the client end, as you already know, we have QuickTime Player.

We've got some open-source players on other platforms, or frankly, Mac OS and Windows as well. We have the ability to play to any MPEG-4-compliant device, and we haven't actually done that a lot, because we discovered that the Mac mini is a really nice set-top box. And we'll talk about that a little bit more later.

So what we do is, on each streaming node, they're really just set up as appliances. They've got all their stuff hooked up to them. The tuner's tuned to whatever channel it's supposed to be tuned to for that box. It boots up, logs in as the streaming user, which is an unprivileged user on the machine, just a normal non-admin user.

The machines monitor themselves, obviously, via the watchdog process, and they're also monitored by some external agents. QuickTime Broadcaster starts when the thing boots up; a very simple AppleScript does this. And then our closed captioning capture and database insertion script starts, and it is also monitored by watchdog. Now, we could do this other ways, but, you know, watchdog was the way to do it back then, so that's how we're continuing to do it on the production service.

Some people have asked, you know, how do you make sure that QuickTime Broadcaster always stays running? And we do it in a really simple, straightforward way. We have an entry in the crontab that runs a script every minute. It simply checks to see if Broadcaster's there, and if it's not, respawns it. And the way it gets respawned is by the same script that launches it and tells it to start broadcasting when the machine boots. We have found that this is actually extremely reliable. And I wanted to get some of this stuff into the presentation so that when these go up on the ADC site, they'll actually be in there, so you can see some of the stuff that we did.
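The actual script isn't reproduced in this transcript, so here is a minimal sketch of what a cron-driven check like that might look like. The process name matches QuickTime Broadcaster, but the use of pgrep and osascript and the path of the relaunch script are assumptions.

```python
#!/usr/bin/env python3
"""Respawn QuickTime Broadcaster if it isn't running.
Run from cron every minute, e.g.:
* * * * * /usr/local/bin/broadcaster_watchdog.py
"""
import subprocess

APP_NAME = "QuickTime Broadcaster"                   # process to keep alive
RELAUNCH = "/Users/streaming/start_broadcast.scpt"   # hypothetical AppleScript

def broadcaster_running() -> bool:
    # pgrep exits 0 if at least one matching process exists.
    return subprocess.run(["pgrep", "-f", APP_NAME],
                          capture_output=True).returncode == 0

if not broadcaster_running():
    # Re-run the same script that launches Broadcaster at boot and
    # tells it to start broadcasting again.
    subprocess.run(["osascript", RELAUNCH])
```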

One other thing that we do is after some of the hardware became powerful enough to start doing multiple streams at the resolutions and data rates we were interested in, we also wanted to run multiple instances of QuickTime Broadcaster. Now QuickTime Broadcaster, while it's very nice and of course free, isn't really designed to run multiple instances on the same machine.

So we have, I wouldn't even call this a hack because it's so simple, but what we do is we just duplicate the QuickTime Broadcaster application, and for each successive instance (so, in the case of having two), we open up the application package and then edit the Info.plist, as you see right there.

Basically, all this is doing is changing the bundle name of the application, which also determines what its preferences file name is. This gives each instance of Broadcaster its own app preferences file, so they don't trample each other and continually overwrite each other when you launch the other one or make changes by launching a separate instance of the app. So now two instances of Broadcaster can run, they're completely independent of one another preferences-wise, and they can take inputs from different sources.
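Here is a sketch of that duplication step. The bundle paths are straightforward, but the exact keys are an assumption: the talk says the bundle name determines the preferences file name, and since preferences are normally keyed on the bundle identifier, this edits both to be safe.

```python
#!/usr/bin/env python3
"""Clone QuickTime Broadcaster so a second, independent instance can run."""
import plistlib
import shutil

SRC = "/Applications/QuickTime Broadcaster.app"
DST = "/Applications/QuickTime Broadcaster 2.app"

shutil.copytree(SRC, DST)  # duplicate the whole application package

plist_path = f"{DST}/Contents/Info.plist"
with open(plist_path, "rb") as f:
    info = plistlib.load(f)

# Make the copy look like a different app so it writes its own prefs
# instead of trampling the original's.
info["CFBundleName"] = "QuickTime Broadcaster 2"
info["CFBundleIdentifier"] = info.get("CFBundleIdentifier", "") + ".instance2"

with open(plist_path, "wb") as f:
    plistlib.dump(info, f)
```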

Now, it might be worth mentioning here that, as Steev talked about in his presentation, they use Wirecast. Vara Software's Wirecast does multiple inputs and multiple streams by default, but for various reasons we couldn't use their product for this deployment, and I'm actually going to talk about that a little bit later too.

One of the things that we did, not at first, was add a player application to the mix. So some of these things will make a little bit more sense when I show them to you later. But this is a standalone player. It doesn't play things through QuickTime Player; it's its own application.

It was written by Brian Deeth, who's actually also here too, so you can talk to him about this later if you want. He's waving in the front row there. He's with our school of journalism. And I think this was really an example of the kind of flexibility that going with QuickTime offered.

Because someone who really wasn't affiliated with this project at all decided that they wanted to write an app that would interface with our service and play our content and even do some more special stuff that we weren't doing. And the granularity of our solution allowed him to do that. And of course we were more than happy to cooperate with him. But some of the things that it added, which was a complaint that a lot of people had, and this is a Mac OS X application right now.

There's possibilities of it presenting itself on other platforms, but right now it's just a Mac OS X application. It provides kind of a TV-like remote control interface to our service. So that when you open it up, it looks like any other kind of TV tuner application would that you might get for a computer.

It's got a little remote control. It's got a channel list. And it dynamically updates that list from a central database. So if a channel gets added, or a live broadcast starts, and someone already has the player running, the channel is going to be dynamically added to the list of channels that are available to the player; the player checks periodically with the database to see if there are any new channels. It supports the Apple Remote, and it also has the ability to respond to queries from external devices.

One example is AMX NetLinx. AMX NetLinx is an AV control system that we use in a lot of our classrooms at the university, and some of you are probably familiar with it. And the NetLinx has the ability to send out kind of arbitrary commands via telnet, for example, to another device at a particular IP address. So these things can listen.

And when someone walks into a classroom, they can walk up to one of these NetLinx devices, press the TV button, and then press CNN. And then, on the big screen in front of the classroom, CNN comes up. And other than the brief QuickTime buffering that happens, it doesn't appear to be anything different from ordinary TV, either to the person who's changing the channels or to the person watching it.
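The player's control protocol isn't documented in the talk, so the following is an entirely hypothetical sketch of the shape of such a listener: an AV controller telnets in, sends a channel name, and the player tunes.

```python
#!/usr/bin/env python3
"""Hypothetical external-control listener of the kind the player exposes."""
import socketserver

# Hypothetical mapping from button names to multicast channel addresses.
CHANNELS = {
    "CNN": "239.255.12.42:5004",
    "WEATHER": "239.255.12.43:5004",
}

def tune_to(address: str) -> None:
    print(f"tuning player to {address}")  # placeholder for the real playback core

class CommandHandler(socketserver.StreamRequestHandler):
    def handle(self):
        command = self.rfile.readline().decode("ascii", "ignore").strip().upper()
        if command in CHANNELS:
            self.wfile.write(b"OK\r\n")
            tune_to(CHANNELS[command])
        else:
            self.wfile.write(b"ERR unknown channel\r\n")

if __name__ == "__main__":
    # Listen where the AMX NetLinx controller is configured to connect.
    with socketserver.TCPServer(("0.0.0.0", 4100), CommandHandler) as server:
        server.serve_forever()
```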

One of the other things that we realized we could do is we wanted to add closed captioning for accessibility reasons. And we thought, well gee, as long as we're collecting all this closed captioning stuff and decoding it anyway, why not take it all and insert it into a database? So again, in cooperation with Brian, we started inserting all the closed captioning text into a database. And then we thought, gee, that was a good idea. Seems like it would be nice to actually have the video or some kind of context from the video along with the text.

So we have thumbnails, still thumbnails, from every minute for all the channels, and those are also inserted into the database as binary objects. And the way we do that is actually pretty interesting. These streaming nodes don't have video cards or anything, but they've still got their dummy local display, and QuickTime Broadcaster's up on it.

Well, when QuickTime Broadcaster's broadcasting, it shows the preview right there. So with some open-source tools, we take a screenshot of the screen, crop it down to the exact place where the image is, and then a script runs that inserts it into the database. And of course, all of this is in one script. So we don't think we would have been able to do this with any other kind of solution, especially a turnkey solution. If we had just decided, let's put thumbnails in a database, the product would either support it or not.

But here we have the choice to do this. And one brief thing there: QuickTime Broadcaster has the broadcasting balloon in the preview window while it's broadcasting. Well, those are just TIFF images inside the application package, so we just remove those in our deployment. That way, while it's broadcasting, the broadcasting balloon isn't there, and therefore it also doesn't show up in our thumbnails.
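Putting those pieces together, here is a sketch of the per-minute thumbnail capture. The crop coordinates, table schema, and tooling are assumptions; the talk only says "open-source tools" and a script, and sqlite3 stands in here for the central database.

```python
#!/usr/bin/env python3
"""Grab a thumbnail of the Broadcaster preview once a minute and store it."""
import sqlite3
import subprocess
import time

from PIL import Image  # Pillow; stands in for whatever open-source tool was used

CROP_BOX = (40, 60, 360, 300)  # hypothetical location of the preview window
CHANNEL = "CNN"

def grab_thumbnail() -> bytes:
    # screencapture -x: silent screenshot of the node's dummy local display.
    subprocess.run(["screencapture", "-x", "/tmp/screen.png"], check=True)
    Image.open("/tmp/screen.png").crop(CROP_BOX).save("/tmp/thumb.jpg", "JPEG")
    with open("/tmp/thumb.jpg", "rb") as f:
        return f.read()

db = sqlite3.connect("/tmp/captions.db")
db.execute("""CREATE TABLE IF NOT EXISTS thumbnails
              (channel TEXT, taken_at INTEGER, image BLOB)""")

while True:
    db.execute("INSERT INTO thumbnails VALUES (?, ?, ?)",
               (CHANNEL, int(time.time()), grab_thumbnail()))
    db.commit()
    time.sleep(60)  # one thumbnail per minute, per the talk
```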

And then we developed a web page, and we have an interface for actually searching and interacting with the closed captioning data. And after we discovered that archiving text and still images was a good idea, of course, we decided that archiving video would be an even better idea. So we have some video, and we're starting to do more.

And we use QuickTime Broadcaster's integrated recording functionality for this. Video is retrieved using some specially crafted URLs that use SMIL. And just before we came out to WWDC, we started kind of an initial partnership with the UW General Library System to make 7 terabytes of video, the most recent video, available kind of under the auspices of the library. It's yet to be determined which channels it's going to be and everything like that.

So... The first year we talked about this, we, actually I'm sorry, the second year we talked about this, we talked about what was new, and that stuff bears mentioning again. QuickTime 7, of course, was new. H.264: really a good codec. But what we found out pretty quickly was that we didn't quite have the horsepower to do H.264 encoding yet for standard-definition TV live in real time.

The MPEG-4 codec was actually still giving us a little bit better quality and was a little bit easier on our encoding machines. We anticipate that that's probably going to change in the future, but right now we're still on MPEG-4. But the bottom line of all this is that standard-definition-quality TV is possible in about a megabit.

And that was really great for us. And improvements in QuickTime Broadcaster 1.5 and QuickTime 7: this was the same at the last WWDC, but I just want to make sure everyone knows it. With QuickTime Broadcaster 1.5 and QuickTime 7, it's now possible to do full-frame DV, and full frame from various input devices like the Miglia AlchemyTV card. Previously, it would discard half of the frames and you'd get about half the resolution, but now you can do full-frame video, 640x480 or 720x480, with these cards. What's newer? Obviously, CPU power has increased.

So here are some examples of the results that we've been able to get at various bit rates and resolutions. Now, when you look at the Core Duo listings down there, I think those deserve a little bit of explanation. I haven't had an opportunity to test anything on the quad-core Xeon Mac Pros and Xserves yet, but I have a feeling that they're going to be quite nice.

These are low-end machines. I mean, we can take a Mac mini, something that's basically, you know, a $600 or $700 machine, and do almost what we could do with high-end server hardware not that long ago. So I think the future is very bright for the Intel platform, too, when it comes to encoding. Even today, on a Core Duo Mac mini, you can almost encode full-frame H.264 using Wirecast; we're almost there.

And I think that as the Mac mini matures, and as some of Intel's newer and higher-speed processor offerings make their way into Apple's products over time, we're going to be there with even products like the Mac mini. And one of the reasons we used Wirecast in our testing on the Mac mini is because Wirecast leverages the GPU.

Well, none of our Xserve encoding nodes have GPUs in them. That's why we didn't use Wirecast on them. And Wirecast can eke, you know, even more performance out of a machine by leveraging the GPU. So even with the relatively low-end Intel integrated graphics in the Mac mini, it can do a really great job with H.264 encoding. And if you don't have to do 640 by 480, if you want to do a little bit less, or if you don't need 30 frames per second, I mean, these things are great little encoding boxes.

So what's newest? Transcoding. And this is kind of the result of, I think, our good decision to go with open standards, to go with QuickTime in the first place. We take one encoding format, which in this case is MPEG-2 transport stream, and convert it to something else on the fly. So our cable operator delivers MPEG-2 transport stream video directly to us via IP.

And it's never been analog before. It comes from their digital source, gets put into their network, packetized and whatnot, and sent to us, and we convert it to MPEG-4. Now, if we didn't have these licensing encumbrances with MPEG-2, we might just take the video content as it is and have no physical infrastructure to support video at all.

That's not the case. And we decided not to try to fly in the face of MPEG LA, or be a test case to see what would happen if we used something like VLC as our player. And then we also had some thoughts about, do we really want to make a commitment to use something like MPEG-2 that has to be licensed going forward, even if we pay for it?

What was key for us is that when we do this transcoding, everything on the client end, how this appears to our users and customers, is all the same. The flexibility that QuickTime gave us in the first place is what allows us to plug in new pieces going forward. And so here's kind of a little chart of how exactly the video is coming to us from Charter.

And the reason why this is a little bit more complicated than it would seem to be is because Charter really wants to protect its multicast core. They don't want to expose that to any kind of outside entity. So what we're doing is kind of acting as an intermediary and putting these channels on our own multicast core. This isn't part of our production service yet, but this is just something that's going to be tested for a while to see exactly how well it works.

And actually, how we transcode is with VideoLAN Client. It's an open-source media player, and it can transcode between any formats it understands. It runs on OS X, and it runs on OS X because OS X has a Unix foundation. This application would never have made its way to Mac OS 9. So we have Mac OS X's Unix heritage to thank for applications like this being available.

And it can really transcode anything it can understand to any other format it can understand. And we found that we can do about one two-to-four-megabit stream per modern processor core. So that means things like the G5, things like the Intel Core architecture.

And like I said before, the decision to use QuickTime gave us this flexibility. Does this mean that we're abandoning QuickTime? Not at all. We are embracing QuickTime even more on the client end, because, A, that's how people will play this video, and B, it's treated us well so far, and we don't have to do anything to any of our clients. We don't have to make people make changes in order to watch new channels, because QuickTime can understand the content that we're delivering to them.

It's compatible with our existing support model and all of the existing things we've built for Dayton. And again, a lot of people ask me about this: well, how do you transcode? What exactly do you put in at the command line? Here are a couple of things. These are obviously not intended to be written down, but they'll be in the presentation when you can download it.

One thing I would mention is that if you look at VLC for doing these sorts of things, you can use the streaming wizard, which is in the newer builds, and it shows the actual commands it's using to generate the stream, do recording, and that sort of thing. Now, I would also mention that if QuickTime supports MPEG-2 transport stream itself in the future, we may look into switching back to QuickTime. In fact, we'd like to use QuickTime to do the playback and/or transcoding of this content.
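Since the actual commands lived on the slides, not in this transcript, here is only a generic illustration of the kind of VLC invocation described: read an MPEG-2 transport stream off one multicast group, transcode to MPEG-4, and re-emit it. Every address, port, and bitrate is made up, and the --sout option details vary by VLC version.

```python
#!/usr/bin/env python3
"""Illustrative VLC transcode invocation; not UW's actual configuration."""
import subprocess

SOURCE = "udp://@239.10.0.1:1234"  # hypothetical MPEG-2 TS feed from the cable operator
DEST = "239.255.12.42:5004"        # hypothetical group on the campus multicast core

sout = (
    "#transcode{vcodec=mp4v,vb=1024,acodec=mp4a,ab=128}"  # MPEG-2 in, MPEG-4 out
    ":std{access=udp,mux=ts,dst=" + DEST + "}"            # re-emit on our core
)

# -I dummy: run VLC headless, with no interface.
subprocess.run(["vlc", "-I", "dummy", SOURCE, "--sout", sout])
```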

So here's a typical encoding node that's using Broadcaster and actually getting its signal from an external analog source. You can see the tuner that we have and a couple of other pieces of equipment in between. The closed captioning decoder is hooked up to the serial port on the Xserve; that's how the closed captioning gets into the machine, and there's a little agent that sits there listening to it.

This is a typical encoding node if it's using an internal tuner card. This solution wasn't available when we originally deployed the pilot, but we actually have a couple of channels that are using this now. And that's an all internal solution, so you can have two tuner cards and do two channels in one U of space.

And a transcoding node is nothing more than a box with some computing horsepower. So that's also an all-internal solution. And so what does Dayton look like? Well, here's our pilot deployment that turned into a production service. It's a bunch of XServe cluster nodes, some tuners, and some support equipment, which is sandwiched in between each XServe because the tuners are only like three inches deep, so there's some dead space in there that we can use.

And we would have liked to have everything done in just kind of one box, which we could do today, but this is what we had to do to build this solution when we originally did, and it ended up working out very well. And it's a testament to the quality of this product that after we set it up, even though we've added things to it from time to time, we haven't really had to do anything in the way of maintenance or upkeep, short of just ordinary best practices like OS patching.

[Transcript missing]

And here's our closed captioning search page. This is something that we're going to be rolling out with the library; it hasn't become a public service for the broad campus yet. And here's an example of some search results that have come back from a search someone's done on CNN.

And the key for Dayton: anywhere the network is, Dayton is too. Whether it's an office setting, or a lounge, or a conference room, or a classroom. A big-screen TV up on a wall somewhere, a recreational facility. Here's an example where they've got five 42-inch plasmas hooked up to Mac minis as set-top boxes.

So the Mac mini as a set-top box: it's really perfect. Inexpensive, small, quiet, can be run like an appliance. It's got an array of connectivity that's perfect for this application. It supports a remote control, and Dayton Player also supports the remote. When this thing is configured in kind of an appliance-type way, it's really incredible; it just works. And as a bonus, it's also a full computer.

And it's not much more expensive than some of the set-tops we've looked at, too, by the way. The Mac mini as a video-over-IP encoder: again, a lot of the same pluses, and it's still got the same good connectivity that makes it good as a set-top, except with some of them reversed, as inputs.

And we kind of centrally manage and update these things, so the people who have these out at their sites don't actually have to do anything with them. Everything is centrally managed. And so now I just want to show you real quick what Dayton is and what it looks like, if the demo gods are willing. All this is designed to work on the university multicast network.

So we're doing some special stuff to make it work out here at all. And we're having some problems with the network earlier, but we'll just see if this works. So I'm going to click my secret little button here on one of these channels. It's going to spawn QuickTime and I'll... Normally on campus this would just play.

Next call comes from Palm Bay, Florida. Go ahead. Good morning. So, that's what it looks like. I guess we don't have QuickTime Pro. Here's an example of the same thing with closed captioning in effect. In my opinion, the cultures of Jews and the cultures of Christians and other religions have learned to live with each other.

There's no doubt if you go to anywhere in the Arabian world, Persian or Ottoman... Now this is actually an interface that doesn't have any of the controls along with it, but that's just another example of how we can present it. And if I were to go full screen, it would just have the captioning at the bottom as you see blown up with the video on top. So that's what it looks like. And I'll show you our search interface here real quick too.

So let's say I want to search a couple of channels for... this is always something that will get some hits. And boom, we get our responses from CNN, which apparently has only mentioned it once, and some more results there. Now if I click on this, you see our thumbnails. This is the nearest-minute thumbnail, and I also chose a one-minute aggregation, which just says, give me a one-minute chunk around the search result. And if I choose one of these... well, I'll just click on the video.

Now if I go down here, we start the video off about 10 seconds before the search result and end it 10 seconds after. So there you can see it's going along, and this is video that's been archived; really, we can archive as much as we have space for.

So you can see the potential of this for even research applications, and the library really sees this as, you know, why should we not archive this just like periodicals, newspapers, journals, books? This is another legitimate research tool that people at the university can use. So that's our search interface, and we can obviously write any kind of interface we want. If we collected show metadata, I could say I want to watch Law & Order from last week, or, you know, here's another example. Let's see here: 10 a.m. to 11 a.m. last Sunday.

With no search term and one minute aggregation, that looks good.

[Transcript missing]

You know, an event that was going on at a particular time. So this has been a really useful tool for us. And now the player... let's see here. I'm going to have to put it in unicast mode, too.

And if everything's working, we'll get it there. OK. Now, here's the standalone application. Here's how it does its closed captioning, and we actually have some preferences in here that let you set the background transparency of the captioning pane, and the color and the size of it, and how many lines it is, and things like that. So it gives you a little bit more flexibility than even normal closed captioning does. And QuickTime full screen without QuickTime Pro.

So that's Dayton Player. And since I'm going to be running short on time here, I just want to touch on where we said we were going when I was here last year. This was our bullet-point list from last year, and everything with a green checkmark by it is now either done or in progress. And the two things that don't have checkmarks are two things that I'm very happy about this year.

Move to the next-generation Xserve for the encoding platform: well, I have to say that I was really pleasantly surprised by the new Xserve offering. And we're playing around with the possibility of being able to do things like invite TV channels into an iChat conference. And the stuff that I saw with iChat in Leopard makes me think that we'll be able to do some of those things.

Will it have any purpose ultimately? I don't know. We've had two people working on this project over the last three years, with maybe 5 to 10% of their time or less, or sometimes 0% for weeks at a time. I'd say that this has been a pretty good investment.

We did everything that you see here for under $50,000. Oh, and since we didn't have a one more thing at the keynote, I will give you a one more thing. And that is that in cooperation with Brian, Dayton Player is going to be released for free. It's going to also be open source.

And I really have to say that Brian's done a lot of work on this. Last year, someone actually had a feature request in the audience, and he had it done by the time I was done giving the presentation. So we'll release the source, too. It plays any content QuickTime can play.

Full-screen playback without QuickTime Pro. Works with the Apple Remote. It obtains its channel information right now from a MySQL database, but it can use a local preference file, too. We have another tool that will be released called Dayton Administrator, which is just going to be a nice little graphical front end.

And we're going to post it at this URL. We were hoping to actually just have everything ready today, but we didn't have time. So right now, we're just going to say by next Monday, we're going to have it up there. It could be sooner, so feel free to check.

Our version of the application is available for download now, but we're going to have the general version of the application up there, too. And there's some of our contact information; I'm in the middle. Dave Devereaux-Weber is actually the project manager for Dayton, and he's been the one doing a lot of the political wrangling and meeting with the purchasing agents and Charter and all that kind of stuff to get things squared away. And Brian, like I said, is here. So that's the website, dayton.wisc.edu. You can also go to tv.wisc.edu if that's easier to remember.