Application Technologies • 42:53
Instantly communicating with others using text, audio, and video has changed our lives dramatically. In Leopard, iChat enables your application to show its contents to others via video chats using the new iChat Theater API. Learn how to integrate this new capability into your application and how to use Mac OS X's Instant Messaging framework which provides your application with the ability to determine who's online and initiate connections with them.
Speakers: Jean-Pierre Ciudad, Eric St. Onge, Mike Estee
Unlisted on Apple Developer site
Transcript
This transcript was generated using Whisper; it has known transcription errors. We are working on an improved version.
Hi. Welcome to session 124, Connecting Your Application with iChat. My name is Jean-Pierre Ciudad and I'm part of the iChat engineering team. And today is iChat's first session ever at WWDC, and hopefully we'll have a lot more. Thank you. So what are we going to be talking about today? Well, first, we're going to tell you how you can use some of the iChat features and integrate them in your application, and how you could do that in Tiger. And then we'll talk about how you can do that in Leopard, and about the additional features we have in Leopard.
At the end of this session, we'll have 15 minutes to answer all your questions. So please hold your questions until the end of the presentation. And this afternoon at 2:00 PM, we will have a lab session where you will all be able to meet the entire iChat engineering team and ask any questions you want. The lab session will be located in the Mac OS X lab.
So let's start right away with iChat in Tiger. In iChat in Tiger, you are able to use the Instant Message framework to get information about the presence and status of all the buddies in your buddy list. We also had some limited AppleScript support that would allow you to log in, log out, or send text messages, and also get account status and information for all your buddies.
Now, in Leopard, we've done a lot more. First, we've extended the AppleScript support to most of the iChat features, and we're going to talk about that extensively a little later. In addition to that, we've added a new API, the iChat Theater API, which is part of the Instant Message framework.
iChat Theater is used to share your application content using iChat, and we're very excited about this feature that you've probably seen during the keynote. And we're so excited about it that we're going to spend at least half of this session telling you how you can do it in your application. So video chats today are limited to a person-to-person video conference, or maybe one-to-two or one-to-three people. What we've done is we've extended the video chat model to a show-and-tell model where you can actually talk about some content and have a conversation about it.
So let me right away do a demo so you'll know what I'm talking about in case you haven't seen it during the keynote. Can we switch to the demo machine, please? Thank you. So what I'm going to do, I'm just going to initiate a video chat between those two PowerBooks here.
So here I'm talking to Eric and what I'm going to do here, I'm going to start a movie, an Apple ad, you know, of course. I'm going to stop playing it. And I'm just going to go to the menu here and say share with iChat. And when I do that, iChat automatically goes into iChat Theater mode and starts showing the content to the other applications. So this is from my side. So this is as simple as that. So that's iChat Theater.
[Transcript missing]
Could we also switch the screen to the presentation? Thank you. So iChat Theater is great when you have a Leopard machine talking to a Leopard machine, but that's somewhat limiting, especially now; I mean, only you guys have the Leopard build. So what we've done is we've added a replacement, what we call a replacement mode, which is sort of a compatibility mode.
So, when you use Leopard and use iChat Theater and start talking to a Tiger machine or a Panther machine or even a Windows machine that's using AIM 5.9, we actually replace the camera input with the output of your application. So, basically, the entire screen changes to what you're sending through your application. So, that allows you to actually take advantage of iChat Theater even with machines that don't know about iChat Theater.
OK, so some of the resources that are available for you. Again, I just want to reiterate Q&A at the end of this presentation. There is a website that I believe contains information about each session where you can find documentation and sample code. I would highly recommend that you go back there and download the latest information about this session because we updated it last night.
So if you downloaded it before last night, you should go back and get new sample code and new documentation regarding iChat Theater. You can also send us any feedback by contacting Apple Developer Relations. And as I said earlier, you can come and see us at the lab session today at 2:00 PM and meet the entire iChat engineering team.
So let's start right away talking about the Instant Message framework and AppleScript support. Eric? Thanks, JP. Hi, my name is Eric St. Onge, and today I'm going to talk to you about the Instant Message framework and AppleScript and how you can use them in your own application. So to give you a rough overview of the APIs that you can use to interact with iChat, there are roughly two APIs that you can use.
The first is the Instant Message framework, which as JP mentioned earlier, you can use to get presence information. And this API is available in Tiger. New in Leopard is the IMAVManager API, which you can use to interface with iChat Theater, and we'll have someone else talk about that later.
There's also an AppleScript API that you can use to actually take control of the iChat application itself. So first, I'm going to start by talking about the IMService class. And I'll just give you an overview. I'll talk about some of the key points of the API, and then I'll show you an example.
So the IMService class roughly provides you with presence information about yourself and other people in your buddy list. But when I talk about presence, what is it that I actually mean by presence? Presence is a term that we use in iChat sometimes. It might not make a lot of sense to you. But presence is basically the availability of someone in your buddy list.
So if we take a look at a buddy list here-- this is the new buddy list in Leopard-- the presence roughly corresponds to the status gem that you see next to a person's name in that buddy list. And you'll see the status gem has a different color depending on whichever status the person has.
For instance, there's green, meaning available. There's yellow, meaning idle. Or there's red, meaning away. New in Leopard, there is a white status gem, which indicates an unknown status. This is usually used for mobile devices. And of course, there's also no status at all, which is no status gem.
But along with the status, there's other information that you can get from someone who's logged in. For instance, the status message. So if someone leaves a message with what they're doing when they go away, you can get that. There's also a picture. Or a buddy icon, as it's usually called.
You can get that and put that in your application. And there are some associated capabilities that indicate what a person is capable of doing over iChat. For instance, this could be audio, video, iChat Theater, file transfer, or other capabilities. And if someone has gone through their buddy list and has added address book cards associated with screen names, you can actually get the full name of the person from address book that way.
So today in Tiger and in Leopard, there are already a few example clients that we can point to for the Instant Message Framework. The first of which is iChat's own Menu Extra. So if you've gone into iChat's preferences and enabled the Menu Extra, you'll see a balloon in the status menu of your system. And if you click on the balloon, you'll see the buddies who were logged in to whichever service you select. And you can pick a buddy and start a chat with the buddy from that menu.
Mail and address book also use the Instant Message Framework to indicate the status of someone who's online. So for instance, if you receive an email message from someone who might be online, you'll see a status indicator next to that person's name. And you can then click on that to start a chat with that person through iChat. The same is true of address book. And of course, we'd love to see this in your application as well. So if you're ever using an address book card, it might be appropriate to show a status gem next to that person's name.
So the IMService class roughly has a few class methods that you'll need to know. So if we go and we take a look back at the buddy list, probably one of the key methods that you would need to use is myStatus. And this basically corresponds to the current status that you have set while you are logged in in iChat. This is basically the drop-down you'll see at the top of the buddy list and the status gem you'll see at the top of your own buddy list.
And you can query iChat for that and find out what your current status is. Along with that, you can get an idle time. So if you've been away from your computer for a certain period of time, you can get that information from iChat if you want your application to do something after you've been idle.
You can also get a list of all of the available services that iChat is currently logged into. Now, of course, you could have one service. And in Tiger, you could have an AIM service, a Jabber service, and a Bonjour service. New in Leopard, we've expanded it so that you could have multiple Jabber services or multiple AIM services. And you can really add as many of these as you want. So I've got two in this example here.
But if you ask IMService for all of the services that are available, it's always going to return you three. This is kind of a key point. So it's going to lump AIM services together into a single AIM service through the framework. The same is true of Jabber. Of course, there will always only be one Bonjour service. Through IMService, you can also get a notification center with which you can register for notifications from iChat. I'll talk about that in a minute.
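[Editor's note: taken together, the class-level queries described above might look like this Objective-C sketch. It is based on the Leopard InstantMessage framework headers as described in the session; exact names and types may differ in your seed build.]

```objc
#import <InstantMessage/IMService.h>

// Ask iChat for your own presence and idle time (sketch).
IMPersonStatus myStatus = [IMService myStatus];
NSTimeInterval idleTime = [IMService myIdleTime];

// At most one AIM, one Jabber, and one Bonjour service come back,
// no matter how many accounts of each kind are logged in.
for (IMService *service in [IMService allServices]) {
    NSLog(@"%@ has status %d", [service localizedName], (int)[service status]);
}

// The notification center used for all presence notifications.
NSNotificationCenter *center = [IMService notificationCenter];
```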
Thank you. So once you have an IMService instance that you've gotten through allServices, there are key pieces of information that you can get from that as well. For instance, the localized name. So if you needed to display which service someone is on in your UI, you could get the name that way.
And of course, this is going to be AIM, Jabber, or Bonjour in English, or whatever language you're using; it'll be localized. You can also get the status of the service. So you see in the buddy list here, we've got five buddy lists, four of which are logged in and one of which is logged out.
So if you were to ask Bonjour for its status, it would tell you offline. If you were to ask AIM or Jabber, it would tell you available. If you had one AIM service that was available and one AIM service that was offline, it's going to take the most available status, and it's going to tell you available.
You can also, through the IMService instance, get a list of all of the screen names that are currently logged into that service via the infoForAllScreenNames method. This is basically going to return you a dictionary describing everyone in that buddy list, which you can then go through. And you can also query a service directly for the screen names that are associated with an Address Book card. So that's the first step.
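[Editor's note: a sketch of the per-service queries just described. The service name "AIM" and the use of the logged-in user's own Address Book card are illustrative assumptions.]

```objc
#import <InstantMessage/IMService.h>
#import <AddressBook/AddressBook.h>

// Look up one of the lumped-together services by name (assumed "AIM").
IMService *aim = [IMService serviceWithName:@"AIM"];

// A dictionary describing everyone on this service's buddy list.
NSDictionary *buddyInfo = [aim infoForAllScreenNames];

// Screen names associated with a particular Address Book card.
ABPerson *me = [[ABAddressBook sharedAddressBook] me];
NSArray *myScreenNames = [aim screenNamesForPerson:me];
```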
So for notifications, once you have a notification center from IMService, there are notifications that you can receive when the presence changes. There are several notifications that you can register for. The first of which is service status changed. This corresponds to when you log in or log out of your own account. So if you have an AIM service that logs out, you could respond by listening for service status changed.
When someone else logs in or logs out, you're going to get person status changed. And when someone else's capabilities change, for instance, if they start an AV conference or switch from available to away, you're going to get person info changed. And with each of these notifications, you're going to get a user info dictionary with keys that correspond to different pieces of information about that buddy. For instance, the person's screen name or the person's status.
And depending on the notification, the dictionary will be a little different. For instance, you're going to receive a person info changed notification very frequently, so there's not a lot of information that's going to come through in that dictionary. And you can look in the headers and see which keys are available to you.
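[Editor's note: registering for one of these notifications might look like the following sketch; the notification and userInfo key names follow the Leopard headers as described in the session. These are two fragments of a controller class, not one compilation unit.]

```objc
// Somewhere in your controller's setup code:
NSNotificationCenter *center = [IMService notificationCenter];
[center addObserver:self
           selector:@selector(personStatusChanged:)
               name:IMPersonStatusChangedNotification
             object:nil];

// The handler pulls what it needs out of the userInfo dictionary.
- (void)personStatusChanged:(NSNotification *)notification {
    NSDictionary *info   = [notification userInfo];
    NSString *screenName = [info objectForKey:IMPersonScreenNameKey];
    NSNumber *status     = [info objectForKey:IMPersonStatusKey];
    NSLog(@"%@ changed status to %@", screenName, status);
}
```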
So let me give you a demo. This is a demo that's a really quick demo using Address Book. So this is going to go through Address Book and it's going to get all your buddies who have screen names associated with them and display it in a buddy list.
And then it's going to interface with iChat and it's going to ask iChat for the status information for those people and it'll display a status gem. So let me show you the demo. So you can see over here I've already got iChat logged into one account, so I'll just launch ABPresence.
So now, I have a new list over on the left. This is ABPresence. And as you can see, I've got a few buddies who are logged in. And since iChat's already running, it's going to go and show that status. So if I go back to iChat and I log out, you'll see the status changes just as the person logs in or logs out. And if I log back in, you'll see the status gems reappear.
So the source for this demo is available for you to download. I would encourage you to take a look at it. The demo is using the new Objective-C 2.0 API, so you're not going to be able to run it on Tiger. But it should give you an example of the things that you can do in a really simple way to use it in your application.
So can I go back to the slides, please? So moving on, I'm going to talk about AppleScript a little bit. Now, I'm going to give you an overview of the framework versus scripting, and I'll talk about when you're going to want to use the framework and when you're going to want to use scripting. And then I'll just give you an overview of the API, I'll talk about a new feature called event handlers, and I'll show you an example to sort of tie it together.
So as I showed you earlier, there are three APIs. The AppleScript API is the one you can use to control the application itself. So as we said before, in Tiger, there was really basic AppleScript functionality in iChat. You could log in or log out. You could send a text invitation or a text message. And you could also query a buddy list for people who are logged in.
But in Leopard, this has actually really expanded significantly. So let me just show you a picture of the dictionary. You don't have to really pay attention to much of the dictionaries. If you have a Tiger system, you can just open this up in Script Editor. But just to show you just a comparison of what changed. This is the dictionary in Tiger. As you can see, we've got a few commands and a few classes. This is the new dictionary in Leopard. As you can see, there are a whole bunch of new commands. Thank you.
A whole bunch of new commands and a whole bunch of new classes that expose some new things. And of course, all the Tiger commands will still work if you have scripts that use that. So we're keeping that old API around. There are new commands to work with text chats.
You can actually get handles on text chats and interface with them that way, including chat rooms. And there are also new commands you can use to interface with AV chats. So if you wanted to start recording, for instance, one of our new features, or if you wanted to go full screen, you can toggle it that way.
So to talk a little bit about the object model, there are only three basic classes that you need to know. The first of which is a service, and a service corresponds roughly to a buddy list. As you can see here, we have five buddy lists, and this will give you five services. This is a little different from the way the framework works, in that the framework will give you one AIM service, one Jabber service, and one Bonjour service. The AppleScript API will actually give you five services, so the different buddy lists are exposed separately.
Of course, within each buddy list, there are several accounts, which is basically a buddy in your buddy list. So if we go into just a single one here, you'll see that this AIM service contains several accounts. And then, of course, what takes place between a service and an account is a chat.
This could be a text chat or an AV chat. So you see we've got a text chat on the left, a fairly stupid joke, and then we have an AV chat on the right just taking place. You can get a handle on that as well.
So sort of moving on. So here's a sample script that you could use to start a new text chat. And just to sort of walk you through, since some of you may not be familiar with AppleScript, it's basically a matter of going down the hierarchy and telling the right object the right command. So first you're going to start at the highest level scope, which is iChat itself. So you'll start there. Then you'll tell the specific service that you want to talk to.
At that point, once you have a service, you can start a new chat with a specific account on that service, with the start chat with account. And once you've started that chat, you'll get a handle to that chat back, which you can then tell to do things. For instance, in this case, you can post a new message into that chat. So if you were to go and run this script, this is going to go and it'll start a new AIM chat with someone named my buddy, and it'll post hello my buddy to that person.
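[Editor's note: as a concrete sketch, the script being walked through would look roughly like this. The buddy name is the placeholder used in the session, and the exact command phrasing may differ from the shipping Leopard dictionary.]

```applescript
tell application "iChat"
    tell service "AIM"
        -- Start a chat and keep a handle on it.
        set theChat to start chat with account "my buddy"
        -- Post a message into that chat.
        send "hello my buddy" to theChat
    end tell
end tell
```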
So to work with a video chat, it's fairly similar. Again, you're going to start at the top level, which is iChat itself, and then a specific service. At this point, the command is a little different. It's start video chat versus start chat. And you can do the same thing with an audio chat, if you wanted to do that.
So once you've started that chat, you'll get a handle to an AV chat object, which you can then tell to do things. So for instance, here, the script will tell it to go full screen and to start recording once the video chat connects. You can automate that through scripting.
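[Editor's note: the AV version of the same pattern might be sketched like this, using the command names as spoken in the session; treat the full-screen and recording commands as illustrative of the dictionary rather than exact.]

```applescript
tell application "iChat"
    tell service "AIM"
        -- Start a video chat instead of a text chat.
        set theAVChat to start video chat with account "my buddy"
        -- Once the chat connects, drive it through the handle.
        tell theAVChat
            set full screen to true
            start recording
        end tell
    end tell
end tell
```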
So the API is fairly straightforward. The syntax, as I said, is mostly a matter of sending the right command to the right object and going down through the hierarchy to find out what the right object is. But new in Leopard, there's a scripting bridge available as well, since I'm sure some of you don't know AppleScript or maybe aren't the hugest fans of AppleScript, and that's fine.
If you like Ruby or Perl or some other language, there are scripting bridges available that expose all of this to you in whichever language you like. But realistically, how many times do you actually need a script to just go and start a new chat? I mean, it's nice to do, but you're probably not going to use it all that much, which is why we've introduced in Leopard a new feature called event handlers. An event handler is basically an AppleScript callback for when something happens in iChat.
So as you can see here, this is the dialogue that you'll see in iChat's alert preferences. And you can hook up an Event Handler to any one of these events, and I'll show you how. So the event handlers are part of their own new dictionary in the iChat suite. And there's roughly a new command corresponding to each event.
And the event handler syntax is also fairly simple. It's a little different from a regular AppleScript in that you're not actually telling iChat to do something; iChat is telling your script to do something. So instead of tell application "iChat", you're going to say using terms from application "iChat".
At this point, it's just a matter of copying in the correct message handler or the correct event handler that you want to use. In this case, it's a message received handler, so with the message received, you're going to get the message that was sent in, the buddy that sent it, and a reference to the chat object in which this conversation is taking place. And once you've got that message, you can do work with that specific message.
And to install an event handler, it's fairly straightforward. What you need to do is copy an AppleScript file into ~/Library/Scripts/iChat. And then you need to set it to run in iChat's preferences. So if you go to iChat's alert preferences, or if you right-click on a buddy, get info, and go to the Actions tab, you'll see there's a new check box that says Run AppleScript. From that dropdown, you can either pick an AppleScript that's in ~/Library/Scripts/iChat, or you can copy one in there if there isn't one, or you can just go directly to that directory and put one in.
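[Editor's note: a minimal message received handler following the syntax just described. The handler and parameter names match the session's description; the echo behavior is an illustrative assumption.]

```applescript
-- Save in ~/Library/Scripts/iChat and enable it under
-- Run AppleScript in iChat's Alerts preferences.
using terms from application "iChat"
    on message received theMessage from theBuddy for theChat
        -- Echo the incoming message back into the same chat.
        send "You said: " & theMessage to theChat
    end message received
end using terms from
```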
And then moving on, I'll show you a little bit of a demo. This demo's going to be Eliza, which some of you might be familiar with. This is an old Lisp program that I think has been around for quite a few years. So what it's going to do is it's going to take some input from a chat.
So if someone sends me an instant message, I'm going to take that message, and I'm going to send it on to Eliza. So for this demo, we're actually using a Perl library called Chatbot::Eliza. So we're not actually writing Eliza itself in AppleScript; we're just using a Perl program and running a terminal command.
So we're going to pass this onto Eliza. We're going to take that text back, and then send it back through the instant message to sort of simulate a virtual therapist. And this is just using a message received handler that processes messages and sends it back. So let me show you the demo.
OK, so I've already got this system set up. And I've already copied the script into the correct place. So what I'm going to do is I'm going to go into the Alerts Preferences. Then I'm going to go into Text Invitation. So Text Invitation is going to be run the first time you receive a text message from someone.
So what this one is going to do is it's going to run-- when it runs, it's going to accept an invitation and sort of make it full screen. So it'll get out of the notifier mode. Then I'm going to go over to Message Received and run the same script. The script has two handlers in it. And we'll run it that way.
So at this point, Chatbot Eliza is already hooked up. So if someone were to text message me, like JP here, you'll see that Eliza is just going to go and respond. And conversation will take place really without me even touching the machine. So we can just let JP talk to Eliza for a few seconds.
So that's kind of fun, but sort of another thing you can do here now is if we go onto the second machine and we hook up the Apple script on this one and then hook it back up, we'll see that we'll have Eliza chatting back and forth with itself.
Another thing you'll notice here is that sort of as the conversation goes on, some rate limiting is going to kick in. So it's not going to just sort of go full force both ways. AOL is going to sort of step in and prevent you from sending too many messages. And this sample script is available for you.
Let me just shut it off here. So we can go back to the slides. So this sample code is available for you to download. You'll need to install the Perl module to get it to work, but the sample code is there. Additionally, we've included a few other samples, including one that does iTunes remote control.
So as the bot receives commands, it'll parse it, and it'll tell iTunes to do something else. So you could use iTunes remotely if you wanted to do that. There's also a message sent preprocessor. So what this is going to do is it's a message sent handler that's going to go and look at the messages that you're sending out and respond accordingly.
There's also some file transfer automation that you can do. We've added some new handlers and some new commands in iChat to send files and to deal with receiving new files. And the sample we've given you will take incoming JPEGs and PNGs and import them into iPhoto for you.
I also just want to briefly talk about security since I'm sure some of you were concerned about handlers running automatically and losing control of your system. So a handler to be run first needs to be installed by a user. You can't just touch a file in a specific place or put anything in a specific location to get a handler to run. You're either going to have to go through the UI and set it up yourself or you're going to maybe have to mess with the preferences to get something in.
But it's never going to install itself. There's also, as you saw with Eliza, there's some rate limiting that'll kick in. So you can't just go and blast out thousands and thousands of messages. AIM and some Jabber services will actually prevent you from doing that. It's just innate in the service. And of course, there's also threat analysis of file transfers.
I'm sure some of you are familiar with Safari or Mail and iChat. But when you get an incoming file, sometimes you'll get a dialog that says the file may contain an executable. iChat's going to do that as well, even if a file is received through scripting or if there's a handler running on an incoming file transfer.
So you're still going to get that security. And to get through that, to move a file out of quarantine, a user is going to have to intervene and press OK. So that was an overview of the IMService API and the AppleScript API in Leopard. Now we're going to have Mike here to talk to you about the iChat Theater API.
Thank you very much, Eric. All right, my name is Mike Estee, and I'm going to talk to you about iChat Theater today. So in this session, I'm going to cover a few things. I'm going to talk about what iChat Theater is, talk a little bit about what it means from a user perspective, and what we think the interaction model will be. I'm going to talk about how it works in a little bit more detail and get down into the video architecture and talk about how you can use it in your own applications.
So first off, what is iChat Theater, for those of you who are just showing up? It's a new technology that we think is pretty exciting. It's a way for you to use your own applications to present content in a running AV conference. You can show off slideshows or 3D models or whatever it is you really want to show off. So to that effect, I'm going to do a demo. We'll get to Demo Machine 1 here. So we've got a couple of sample apps on the website here and I'm going to show those to you. So first off, I'm going to start a conference here with JP.
Hey Mike. Hey JP, how's it going? All right. And one of the demo apps that we've got for you is this thing called PhotoPlayer. And it's just a really simple little application. I just hit Play here and I can show you some pictures of my vacation in Africa. Just had a grand time.
So I want you to notice one thing there is that the playback of the slideshow starts as soon as the slideshow theater presentation begins. So we can do some synchronization there. And another one of the little simple apps we've got is this thing called OpenGL Player and it draws a pretty cube. And so I can just switch over to OpenGL Player there.
Show off a little cube, that's kind of cool. Go back to my slide show, switch back to that. So whatever the last application that started a presentation is the one that will be showing content in the running conference. And so I've got another little one that I did.
This is not on the DVD, but I just thought it was kind of fun. I'll call it. So I've got this pretty seashell here. I don't really understand what that formula does, but I can go and present this in iChat and get some of these windows out of the way.
So you can rotate that around, show it off, you know, it's the inside of the seashell. So I want you to notice something else here as well too. So if I go and change the background color of the content in my application, the presentation stays white. So you can actually have different content showing versus what you send to the other side. So I could show the formula in the presentation if I wanted.
[Transcript missing]
So we think this is particularly well suited for things like still image content. If you saw the keynote, you can show presentations. It's particularly well suited for video content as well. So you can play video through the iChat presentation as well. And real time 3D. And we'd also like to see it in your applications.
Okay, so it's not as ideal for things like text, line art, or other highly detailed content. Currently the frame size is limited to 320 by 240, and due to bandwidth, the quality can vary. So if you've got something that's highly detailed, like text, you may want to add something that allows the user to zoom in and see things a little bit closer.
Okay, so now I want to talk a little bit about our HI model. So first of all, presentations are always user-initiated. This means your application should never try and start one on its own. There's only one application that can present at a time, and the last application to begin a presentation is the one that gets to show the content.
As I showed in the demo there, your conferences can be synchronized to the start of the presentation or not if need be. If you're doing like 3D content, it doesn't matter so much. You can just start. So here's a little flow diagram of a typical user presentation. So the first thing you saw, we start the video conference. Then I'll switch over to your application and begin a theater presentation by pressing play or picking a menu.
So then we switch into a waiting state where we're waiting for the other party to accept. This is where you would maybe play a little animation, show a little theater curtain or something. Just wait for the other party to show up online. And the presentation starts. And I can go back to the application and stop and pause content. And then when I'm finished with the presentation, I can stop it in the application and resume talking to the person that I was talking to before. Conference ends, the presentation will also stop, or you can go back to the application and start another presentation all over again.
So we've got synchronization. Playback can be cued so you don't have to actually wait until a conference has started before pressing play. You can start it in the application and then go to iChat and invite somebody. Works either way. iChat will send notifications when a conference starts or when a conference ends so you can sync up your animation or your drawing or whatever you need to do.
Okay, so how does it work? We're using a couple of technologies that many of you may not be familiar with. We're using Core Video, which is new in Tiger, and it's a little bit expanded in Leopard, for doing video playback. This is an efficient way to get frames around. And we're using Core Audio for doing our audio playback.
So since many of you are probably not familiar with Core Video, let me talk about that for just a moment. Core Video has a couple of different types of frames that it uses. There are CVOpenGLBuffer frames. These are used as backing buffers if you're drawing in OpenGL.
And we've got CVPixelBuffer frames, which are just raster content: you get a raw buffer with its width and height, and you just draw straight into them. Buffers are one-time use. You don't want to retain them; you get a lot of memory usage if you keep on retaining them. And I encourage you to see the documentation on the website and check out more about Core Video.
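[Editor's note: for the raster path, a pixel-buffer rendering callback might be sketched like this. The method shape follows the session's description of the data-source callbacks; treat the details, including the BOOL return, as illustrative.]

```objc
// Raster-image data source callback (sketch): draw one frame
// directly into the CVPixelBuffer that iChat Theater hands you.
- (BOOL)renderIntoPixelBuffer:(CVPixelBufferRef)buffer forTime:(CVTimeStamp *)timeStamp {
    CVPixelBufferLockBaseAddress(buffer, 0);
    void  *base     = CVPixelBufferGetBaseAddress(buffer);
    size_t width    = CVPixelBufferGetWidth(buffer);
    size_t height   = CVPixelBufferGetHeight(buffer);
    size_t rowBytes = CVPixelBufferGetBytesPerRow(buffer);

    // Draw straight into the buffer here, for example through a
    // CGBitmapContext wrapped around `base` (width x height,
    // rowBytes per row). Do not retain the buffer past this call.

    CVPixelBufferUnlockBaseAddress(buffer, 0);
    return YES;  // YES when the frame changed and should be sent
}
```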
So here's a brief little block diagram of the iChat Theater video architecture. So frames start out in the Instant Message framework, and we pass a frame through to your application. There are rendering callbacks that we have where you do whatever it is you need to do and do your drawing. That frame is passed back into the Instant Message framework, which takes it, hands it over to iChat, and it's compressed and sent out over the network.
So for Core Audio, we're planning to support multiple audio channels. For the current build that you guys have, we only support the first channel at the moment. For GM, we'll be mixing all these channels together and compressing them and sending them over the network for you. So you don't need to do mixing yourself. The audio device ID is provided when the presentation starts. And you can go and do what you need to do with Core Audio.
Okay, so now I'm going to get into a little bit more detail about how you can use iChat Theater in your own applications. So there are a couple of things you need to do before getting started. The first is you've got to link against the instant message framework.
Second thing is you have to register as a data source. We're using a data source pattern for this, which you're probably familiar with if you've been using Cocoa. Then I'm going to talk about how you control iChat Theater from inside of your own application, how the rendering callbacks work, and how you can provide video, and how to update UI when your state changes.
So you can find the Instant Message framework in the public frameworks directory. And as Eric mentioned, we've got a new class for Leopard, which is the IMAVManager. This is where all of the video conference starting and stopping and everything you need to do for video lives.
Okay, so first thing you need to do is register as a data source. So you're going to have some object that does your drawing, probably some class, maybe an NSView or an OpenGL view. And you're going to set this -- this is a method on the IMAVManager. And you'll set the data source and implement the appropriate frame callbacks. We've got two sets of frame callbacks. We've got OpenGL-based ones and raster image-based ones. You want to pick whichever one is most appropriate for your application and only implement those callbacks.
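As a sketch, that registration might look like the following in Objective-C. The class and selector names here (IMAVManager, sharedAVManager, setVideoDataSource:) follow the pre-release API discussed in this session and could change before Leopard ships; MyCanvasView is a hypothetical drawing class.

```objc
#import <Cocoa/Cocoa.h>
#import <InstantMessage/IMAVManager.h>

// Hypothetical view class that already knows how to draw your content.
@interface MyCanvasView : NSView
@end

// Somewhere in your controller, hand that drawing object to iChat Theater:
- (void)setUpTheater {
    [[IMAVManager sharedAVManager] setVideoDataSource:myCanvasView];
    // MyCanvasView then implements either the OpenGL-based or the
    // pixel-buffer-based frame callbacks -- whichever fits how it
    // already draws, never both.
}
```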
Callbacks will be called from a background thread, so you'll need to do any synchronization. You can use the fancy new Objective-C 2.0 features if you like. Or if you need to do all your rendering on the main thread, you can defer that, and I'll show you how to do that a little bit later.
Okay, so for controlling iChat Theater, it's really simple. There's not a lot there. First thing you need to do is register for the AV Manager state-changed notification. If you don't register for this, none of your callbacks will be called, so you need to make sure to register for this. And this registration happens on the IMService object. Then we have start. This starts a presentation. As soon as the conference becomes available, you can call start. You can query the state with the state method. And then stop ends your presentation.
All right, so for the state change callbacks, as I mentioned, you need to register on the IMService notification center. I've got a little example of that right there. Unlike a lot of other things, the Instant Message framework has its own notification center on which it sends out all notifications. So if you registered against the regular one and you're wondering why you're not getting any notifications, this is why. So when the state becomes available, as in you can begin a presentation, just call start, and then your callbacks will begin as soon as the conference is accepted.
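To sketch that, assuming the pre-release names from this session -- IMService's own notification center, a state-changed notification here written as IMAVManagerStateChangedNotification, and an "available" state constant written as IMAVAvailable (both names are assumptions based on the talk):

```objc
#import <InstantMessage/IMService.h>
#import <InstantMessage/IMAVManager.h>

- (void)awakeFromNib {
    // Note: the Instant Message framework has its own notification center;
    // registering on [NSNotificationCenter defaultCenter] won't work.
    [[IMService notificationCenter] addObserver:self
                                       selector:@selector(avStateChanged:)
                                           name:IMAVManagerStateChangedNotification
                                         object:nil];
}

- (void)avStateChanged:(NSNotification *)note {
    IMAVManager *avManager = [IMAVManager sharedAVManager];
    if ([avManager state] == IMAVAvailable)   // constant name is an assumption
        [avManager start];  // callbacks begin once the other party accepts
}
```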
So here's a little block diagram of the states. So we start out with "not available." This means that either iChat's not running or for whatever reason video conferencing is not available, so you can't begin your presentation. And as soon as the user logs in or video conferencing becomes available, the state will switch to "available." This means that you can make the start call.
And it doesn't necessarily mean that a conference is running yet, but you can queue your presentation if you need to. Pass through "starting up" and we get to "waiting." This means that the other party is not accepted yet. So this is where you would play a little animation that would show that your slide show is about to begin or something.
And then it switches to "active." The animation in iChat is going to happen, content will start being displayed, and your callbacks will start getting called. And then we switch through "shutting down," and it's either going to go back to "not available" or back to "available" again, which is the more likely case.
OK. So let's talk a little bit more in detail about the video callbacks. As I mentioned, there are two types of callbacks. We've got OpenGL ones if you're already doing your drawing in OpenGL and pixel-based buffer callbacks if you're already drawing, say, in NSViews or something like that or if you're using Core Image. Implement which is most suitable for your application. We can make gains in efficiency if you're already drawing in OpenGL. We can keep the content on the card as we move it over to iChat.
So callbacks, as I mentioned before, are called from a background thread, so you'll need to do synchronization if you need to. If you need to render on the main thread -- like if you don't have threaded OpenGL rendering -- you can use performSelectorOnMainThread with waitUntilDone. But you need to make sure that you finish your drawing quickly, since conferences run at about 15 FPS, and that'll vary.
So you need to be able to render at about 30 FPS to get good performance. There's also a timeout which will disconnect misbehaving clients. If your application hangs, or for whatever reason you don't return in time, the presentation will automatically stop and the client will be disconnected.
Okay, so let's talk about the pixel buffer-based callbacks. These are the ones that you would use if you're drawing into NS views or using Core Image to do some compositing. There's two callbacks you need to implement, one to specify what pixel format you'd like and one to do your rendering.
So for the pixel format callback, this is called when you first set yourself as a data source, and you specify what format you'd like your pixels in. This could be, like, 24-bit RGB, or 2vuy if you're doing video frames of some sort. Or you can use my favorite pixel format, which is 32 ARGB, for OpenGL-type stuff.
So for the pixel buffer rendering callback, there are a couple things that you need to keep in mind. First of all is that the buffers can change size on a per callback basis. So you need to always check to make sure that you've got the right size for your buffers before you draw into them.
It's called from background threads, so do your synchronization. And don't retain the pixel buffers. Core Video works with these things called buffer pools. And a frame comes out of the buffer pool, it's handed off to an application to perform something to it, and then it's returned. And you never touch it again.
So here's a little bit of an example. This is a minimum implementation, if you will. This would be implemented on your data source, so whatever object is doing your drawing. So I'm picking the 32 ARGB format. It's a lovely format. And then we get to the render callback here.
And I use some Core Video calls here to lock down the base address of this pixel buffer, do my drawing, unlock, and return YES to signify that there's a new frame available. Optionally, you can also return NO. So this is an optimization we've added: for example, if you're doing a slide show and your content is static for long periods of time, you can save some cycles by not re-rendering the frames.
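Putting those pieces together, a minimal pixel-buffer data source might look roughly like this. The selector names follow the session's pre-release API and the final headers may differ; the Core Video calls themselves (CVPixelBufferLockBaseAddress and friends) are the shipping Tiger API.

```objc
#import <QuartzCore/CoreVideo.h>

// Tell the framework which pixel format we'd like our buffers in.
- (void)getPixelBufferPixelFormat:(OSType *)pixelFormatOut {
    *pixelFormatOut = kCVPixelFormatType_32ARGB;
}

// Called from a background thread for every frame.
- (BOOL)renderIntoPixelBuffer:(CVPixelBufferRef)buffer
                      forTime:(CVTimeStamp *)timeStamp {
    CVPixelBufferLockBaseAddress(buffer, 0);
    void  *base     = CVPixelBufferGetBaseAddress(buffer);
    size_t width    = CVPixelBufferGetWidth(buffer);   // can change per callback
    size_t height   = CVPixelBufferGetHeight(buffer);
    size_t rowBytes = CVPixelBufferGetBytesPerRow(buffer);
    // ... draw into 'base' here, e.g. through a CGBitmapContext ...
    CVPixelBufferUnlockBaseAddress(buffer, 0);
    return YES;  // return NO instead when nothing changed since the last frame
}
```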
And we can take advantage of that on the compression side to make compression work better. So you just return NO if nothing's changed from the previous frame. All right, so now for the OpenGL callbacks. These are a little bit more complex -- there's a bit more to set up, but it pretty much works the same way.
So for callback setup, you're going to pass in the context and the pixel format object that you'll be doing your drawing into. We need this information so that we can make compatible backing buffers to pass to your application when you do your rendering. So you'll pick your format and pass these in. This is called when you first set the data source, and they're going to be retained by the framework. To release them, you'll have to set the data source to nil. You're probably going to need to do the perform-selector-on-main-thread dance and wait until the main thread returns, but you can do that. So the next thing you need to do is attach the buffer to the context that you're going to draw into. You'll get a new buffer for each frame.
There's a pool of them in the background. And we're going to pass you in a screen context that is ideal. You can use this as a hint to do your rendering, but you don't have to render on that screen context. And then we're going to ask that you pass us back the screen context which you used. You can get the screen number from the context object, and you can find more in the NSOpenGLView header.
All right, so here's a little example. So I've got my context and pixel format, which I've created earlier and passed off, and when the conference starts I do the rendering. And I use this CV call to attach the buffer to the context. I do my OpenGL rendering and return YES if there's a new frame. And just like with the pixel buffer callbacks, you can return NO if nothing's changed from the previous frame, for optimization.
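A sketch of those OpenGL-based callbacks, again using the session's pre-release selector names (the setup callback and the render callback), with CVOpenGLBufferAttach from the shipping Core Video API; myOpenGLView is a hypothetical NSOpenGLView:

```objc
#import <QuartzCore/CoreVideo.h>
#import <OpenGL/OpenGL.h>

// Called once when the data source is set; both objects are retained by
// the framework until the data source is set to nil.
- (void)getOpenGLBufferContext:(CGLContextObj *)contextOut
                   pixelFormat:(CGLPixelFormatObj *)pixelFormatOut {
    *contextOut     = [[myOpenGLView openGLContext] CGLContextObj];
    *pixelFormatOut = [[myOpenGLView pixelFormat] CGLPixelFormatObj];
}

// Called from a background thread with a fresh buffer for each frame.
- (BOOL)renderIntoOpenGLBuffer:(CVOpenGLBufferRef)buffer
                      onScreen:(int *)screenInOut
                       forTime:(CVTimeStamp *)timeStamp {
    // Attach the backing buffer to our context; the screen passed in is a
    // hint, and we report back which virtual screen we actually used.
    CVOpenGLBufferAttach(buffer,
                         [[myOpenGLView openGLContext] CGLContextObj],
                         0, 0, *screenInOut);
    // ... do the OpenGL drawing here ...
    return YES;  // or NO if the frame is unchanged
}
```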
So for the audio callbacks, there's a couple of things. We have a number-of-audio-channels method which you'll need to implement if you're interested in doing audio. It specifies to the framework how many audio channels you'd like us to send you. We'll mix them for you, but you need to ask for how many you want.
Again, we only support one channel at the moment, and we'll be expanding that. And also, there's another callback that's called when the conference starts, which is provideAudioOnDeviceWithUID. So this is a Core Audio interface; you'll need to check out the Core Audio documentation for more details. And again, provideAudioOnDeviceWithUID is called from a background thread.
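The shape of those two audio callbacks, as a sketch -- the selector names here are reconstructed from how they're spoken in the session, so treat them as assumptions:

```objc
// Ask the framework how many channels to send; only one is honored
// in the current seed, with mixing promised for GM.
- (int)numberOfAudioChannels {
    return 1;
}

// Called from a background thread when the conference starts. The UID
// identifies a Core Audio device; open it and start providing samples.
- (void)provideAudioOnDeviceWithUID:(NSString *)deviceUID {
    // ... look up the AudioDevice by UID via Core Audio and feed it ...
}
```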
Alright, so some of you may be looking at this Core Video stuff and thinking, that's a little bit hairy. We're also thinking about doing some high-level APIs on NSOpenGLView and NSView, and we'd like to hear more about that if you're interested. There's some documentation on the website you can check out.
I encourage you to also check out the code samples. We have a little photo player example which I showed in the demo, and then the OpenGL player example which I showed you, which is a multi-threaded OpenGL application. And come see us in the lab session. Back to you, JP.
OK, so what did we learn today? Well, we learned that Mike's favorite pixel format is 32 ARGB. I don't know what yours is, but you can come and tell us at the lab session later. We talked about the Instant Message framework. We talked about the extended AppleScript support that we have in Leopard. And we told you how your application can take advantage of iChat Theater.
So if you want to know more, please check our developer website. Check also everything-- check what is on the conference website regarding this session, session 124. And again, I know I've said that before, please come to the lab session, meet the team, ask all the questions you might have about using iChat and how to integrate it in your application.