Media • 1:08:26
The iPhone SDK provides an amazing lineup of technologies for developing cutting-edge handheld games. Learn the techniques to harness these technologies efficiently for your users' entertainment. See how to incorporate graphics, audio, accelerometer input, touch screen controls, video playback, and much more as you walk through the process of creating a game for iPhone.
Speakers: Allan Schaffer, Kevin Quennesson
Unlisted on Apple Developer site
Transcript
This transcript was generated using Whisper; it has known transcription errors. We are working on an improved version.
Well, hello, everyone, and welcome. I'd like to welcome all of you. Look at this room. I mean, there are so many of you here. I just want to start by congratulating all of you. You're here at the birth of a new industry for the iPhone. So give yourselves a round of applause.
So my name is Allan Schaffer; I'm Apple's Graphics Technology Evangelist. In a few minutes, I'll be joined by Kevin, who's a graphics engineer with us and one of the authors of the Touch Fighter demo that you may have seen around the show earlier. Yeah, sure. Go ahead.
[Transcript missing]
All right. Also, another element of the iPhone is the ability to do networking. And you've seen this. This has direct applicability to classic games like chess and checkers, backgammon, all those kinds of games where you're competing against an opponent.
But then likewise, just the ability to have big multiplayer games is available to you with the iPhone. And so, you know, I can start to imagine games coming out where you're competing in a sports arena or a multi-level dungeon or at a poker table or some other kind of game where you're competing with or against multiple opponents.
The iPhone has the ability to know its current location. This is something that games haven't really taken advantage of in the past because console games are plugged into your television set, and your computer games are probably on your desk or on your laptop. But just the ability to have a game that knows its location maybe expands the realm of games that you can do beyond just simply porting a game over to the iPhone. Think in terms of new capabilities that you might be able to put into a game that you have, or new ideas.
The first thing that comes to mind with a lot of location-based games is sort of like treasure hunts. But imagine any kind of game where you're able to change the gameplay depending on where the user is. That opens up a whole new realm of possibilities. And I think it's a territory that hasn't really been captured yet, and so there's a lot of opportunity for you as developers to do that.
I've mentioned, you know, kind of going over to the technology side, the high-resolution display. So, you know, the iPhone, it has a big display, it's bright, it's high contrast, 160 DPI, the resolution is 320 by 480. And so, you know, when we've been working with mobile developers, they're looking at that and going like, wow, this is great.
You know, they're scaling up their artwork, they're adding complexity into their gameplay. And then likewise, kind of related to this, the whole thing is being driven by a screaming fast GPU. And I think you saw yesterday during the keynote some of the examples there of just very, very powerful graphics. We're going to be talking about OpenGL ES in just a moment, and so we'll be showing you a bit about how that's done.
A number of media technologies are available in the iPhone SDK, and in this session, really, what we're going to be talking about are those topics that I've just mentioned. So input, networking, and then things that you see here. And so OpenGL ES for 3D graphics, Core Audio, and OpenAL for audio playback and recording, and the Media Player Framework for full-screen video playback, especially for cut scenes.
In this particular session, we're not going to have time to get into Quartz 2D or Core Animation. Those are things that you would typically be using perhaps for a 2D game. And so -- but there's a lot of sessions here at the conference about those as well. We're just going to be screaming through a lot of different topics.
So if you had a chance to see the iPhone SDK launch event that took place a few months ago, the demo that we showed there was one that, as I said, was developed by Kevin and some of his colleagues. We called it Touch Fighter, and this is a space fighting game.
It took about two weeks to put together. Kevin has had a little more time to work on it since then, and so here at the show, we're going to be showing you, of course, Touch Fighter 2 today in this session. And this is really the example that we're using to demonstrate a lot of the capabilities in the SDK. And so with that, I'd like to bring up Kevin to show you Touch Fighter 2.
Thank you, Allan. So the iPhone is a fantastic platform for games. And this is because you have this unique set of features that Allan just presented, and that makes it a great target for gameplay innovation. What we tried to do in Touch Fighter was to take most of these features into a demo and to see how far we could go. So I'm going to demonstrate in detail what we have behind Touch Fighter 2. Let me go to the demo unit.
Okay, can we go to the demo machine, please? Okay, thank you. So here we are. We have a new multiplayer mode that we're not going to demo now, but maybe later. So I'm going to start as a single player. And so here you see Touch Fighter. You see these beautiful graphics. And so how do I control the ship?
So I use the accelerometer to steer and move around. And it's very nice and it's very responsive — smooth, yet responsive. It's a very nice, very intuitive way to control the ship. So to fire now, I can simply tap the screen.
So I tap on the right side of the screen to fire from the right. I tap on the left side of the screen to fire from the left. And you know, that's kind of fun. And you can hear right now the 3D positional sound. So as I go around, the thrust of the ship gets panned and spatialized via OpenAL. In the same way, the lasers get panned and spatialized. The explosions are also located in space.
So it's a very nice, very immersive experience. What we also did in Touch Fighter is integrate gestures to trigger specific special moves. So for instance, if I do an up swipe here, I'm firing missiles directly. And that's stunning.
It's really nice. Again, when I send the missiles from the left, I hear them from the left. It's very intuitive. You see these amazing graphics, all these particle systems going on. I'm on fire, but because it's a demo game and I can't die and there's just one level, I'm not really scared about that.
So here you go. You can also bring UIKit views into the scene, so I can put some information on screen. For instance, here, let me show the frames-per-second counter and some gesture data. And what you can see is that even with all these graphics and all this blending and stuff going on, we're still pretty steady at 30 frames per second. So it's very nice; it's a very powerful GPU that we have here. And you see this little mark here that allows me to debug gestures, for instance.
So one of the things we did in Touch Fighter is to add the ability to calibrate. For instance, here I'm in this position, but I might want to change position as I play the game — for instance, if I sit. So I can go over here, and Touch Fighter automatically detects that you're in a new position and a new inclination, and recalibrates.
So it's very nice to allow your gamers to do that, and we'll go into more detail on how we did that later in the presentation. So now I can do the final move that takes me directly to the end of this one-level demo game, by doing a two-finger up swipe that sends me straight to the end sequence.
So you have a clean transition, and here you have an H.264 movie playing right in the game. So no OpenGL here, an H.264 movie as a cutscene, and here I have some advice from Apple. OK. And I tap the screen, and I have high scores. So here again, some views: I can tap my name, and a keyboard shows up, so I can change it eventually. So here, that's a good one. I'm at rank two — the only one better than me in this game is Steve. So yeah. So here you go. That's Touch Fighter 2.
So we learned a lot in doing Touch Fighter 2. It's a great integrator of all these great iPhone features and technologies. And so we said, well, it'd be great if we could share what we learned, and hopefully it's going to help you in your own games. So what we're going to do today, here at WWDC, is make the Touch Fighter source available.
Thank you. So we hope you like it, and that you'll find content you can include and integrate in your own games. What we're going to do today is go through all the basics of the iPhone technologies and APIs to get you started doing games on the iPhone that leverage all these key technologies. And we're going to point you to the relevant APIs, to relevant sample code, and to locations within the Touch Fighter source that serve the purposes we're discussing. So first, let me talk about drawing.
So drawing is generally the first thing you want to do in a game. And you have lots of different drawing APIs on the iPhone that target the specific needs you might have in your game. The one you're already familiar with is UIKit. UIKit is the API and framework responsible for all these nice controls and views on the phone, and it's built over Quartz 2D and Core Animation.
So Quartz 2D is the API that does 2D graphics generation — text and lines and gradients, all anti-aliased, so it looks very nice — plus PDF support. And Core Animation is the low-level API that's responsible for this fluid interaction you see all around iPhone OS. Core Animation talks directly to the display hardware, so it's in some sense the iPhone windowing system — the animation and compositing engine. Everything goes through Core Animation. That's the real power of the platform.
Now if you want to do 3D graphics, you have OpenGL ES on the iPhone, and EAGL. EAGL is a new API specific to the iPhone that makes the connection between OpenGL ES render content and Core Animation. OpenGL ES talks directly to the graphics hardware, so it's hardware accelerated, to do these very nice and very fast graphics. And then for video playback, you have the Media Player framework, which talks directly to the video hardware for hardware-accelerated video playback and decompression.
So now let me focus on OpenGL ES. OpenGL ES is a high-performance 3D graphics API for embedded devices. Feature-wise, it's very close to the OpenGL 1.5 fixed-function pipeline on the desktop, so minus programmability. For hardware acceleration, OpenGL ES uses the iPhone's PowerVR MBX Lite GPU, which is a very powerful GPU for embedded devices that maximizes performance while minimizing power consumption. And to do that, it uses two key technologies on that GPU: the tile-based deferred renderer and hidden surface removal, so TBDR and HSR.
And what these technologies mean: with TBDR, the render target — the OpenGL framebuffer — is split into tiles, and each of these tiles processes the visible triangles and discards the triangles that are hidden, before rasterization. That makes it different from the GPUs that you have on the desktop: an invisible triangle will be discarded, saving its rasterization, and for instance, if a texture is completely hidden, it will never get uploaded.
So that saves bandwidth, that saves power, and so that's a very powerful technique on this particular GPU. That's the hidden surface removal part. And the deferred part means that some things are going to be decided only at the render phase, for instance texture uploads. So that means that if you come from the desktop, you might organize your OpenGL command stream in a somewhat different manner. We'll go over in this section what it implies to be optimal on this embedded device. So EAGL is what connects the OpenGL render content to Core Animation.
And Core Animation, again, is the iPhone compositing and windowing system. EAGL is pretty much the equivalent of NSOpenGL and CGL on the Mac desktop and other platforms. It uses the framebuffer object extension to connect the OpenGL render content to Core Animation surfaces. So it's a new API on the iPhone, but we made it very easy for you to leverage, to get started in OpenGL ES.
To get you started, there's an Xcode template, the Cocoa Touch OpenGL application. So you just open Xcode, create a new Cocoa Touch OpenGL application, and it sets everything up on the EAGL side and on the Core Animation side. And so you can get started and start writing OpenGL code by implementing the drawView method. By default, the template draws a quad that is rotating. So it's simple, but it's great to get started and to start writing some OpenGL.
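For reference, here's a rough sketch of the kind of drawView method the template generates — the instance variables (context, viewFramebuffer, viewRenderbuffer, backingWidth, backingHeight) follow the template's conventions, but this is a sketch, not the literal template source:

    - (void)drawView {
        // A unit quad, two triangles as a strip.
        static const GLfloat squareVertices[] = {
            -0.5f, -0.5f,
             0.5f, -0.5f,
            -0.5f,  0.5f,
             0.5f,  0.5f,
        };
        static GLfloat angle = 0.0f;

        [EAGLContext setCurrentContext:context];
        glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
        glViewport(0, 0, backingWidth, backingHeight);

        glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);

        // Spin the quad a little more each frame.
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glRotatef(angle, 0.0f, 0.0f, 1.0f);
        angle += 3.0f;

        glVertexPointer(2, GL_FLOAT, 0, squareVertices);
        glEnableClientState(GL_VERTEX_ARRAY);
        glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

        // Hand the finished frame to Core Animation via EAGL.
        glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
        [context presentRenderbuffer:GL_RENDERBUFFER_OES];
    }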
So the next thing you want to do in OpenGL is bring in some assets, some textures. And similarly, we made it very easy to bring in textures using a Texture2D helper class that is shared among samples. It's used in Touch Fighter, it's used in the CrashLanding sample, and it's very easy to create a texture from an image path, from something in your project's resources, from raw data, or from CGImages or UIImages from the system.
So it's very easy to create a texture in OpenGL from the image files and formats that you already have. Similarly, it's also easy to create a texture containing a text string using this helper class. You can create a string with an arbitrary font, size, and dimensions, and then it's very easy to draw, using simple convenience methods that draw into the current OpenGL context.
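As a sketch, creating and drawing textures with the Texture2D class looks roughly like this; the initializer signatures follow the sample code, though they may differ slightly between SDK releases:

    // A texture from an image in the application bundle.
    Texture2D *shipTexture =
        [[Texture2D alloc] initWithImage:[UIImage imageNamed:@"ship.png"]];

    // A texture containing a text string, with an arbitrary font and size.
    Texture2D *scoreTexture =
        [[Texture2D alloc] initWithString:@"Score: 1200"
                               dimensions:CGSizeMake(256, 64)
                                alignment:UITextAlignmentLeft
                                 fontName:@"Helvetica"
                                 fontSize:24];

    // Convenience methods draw into the current OpenGL context.
    [shipTexture drawAtPoint:CGPointMake(160, 240)];
    [scoreTexture drawInRect:CGRectMake(10, 10, 256, 64)];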
So the iPhone uses a shared memory system — all applications and OpenGL use the same memory — and as a consequence, the memory available to OpenGL for surfaces and textures is limited to 24 megabytes. So it's very important to only use the memory you need. And there's this great texture compression format specific to the GPU on the iPhone, called PVRTC, that we really recommend you use.
It's really powerful, and you can only win by using PVRTC compression. You win in performance, you win in power consumption, you reduce the memory bandwidth that you use and the memory footprint. So you win on every side. It's really nice, really powerful, and we recommend you use it. There is texturetool, a command-line utility, that allows you to convert most known image file formats into the PVRTC compressed texture format. Texture2D then lets you create a texture object from that compressed texture file, and you can look at the Texture2D source to see the OpenGL code — it's very simple, just a glCompressedTexImage2D call with the right parameters. So really, use PVRTC-format textures.
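As an illustration, the conversion and upload might look like this — the exact texturetool flags and the file names here are assumptions, so check the tool's usage text:

    texturetool -e PVRTC --bits-per-pixel-4 -o ship.pvrtc ship.png

    // Upload the compressed data: 4 bits per pixel, RGB PVRTC variant.
    glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                           GL_COMPRESSED_RGB_PVRTC_4BPPV1_IMG,
                           width, height, 0, dataLength, data);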
So if you want to go further in OpenGL, there are some very nice sample applications that are a great place to start. GLPaint connects OpenGL with multi-touch, so you can trace your finger and have it draw in OpenGL. And CrashLanding is a very nice, very simple OpenGL ES-based game that leverages Texture2D for loading all the assets, OpenAL for sound, and the accelerometer for control. It's a very simple application that sets things up so you can really follow them, and it's a great place to get started with OpenGL games on the iPhone.
So that's it for the assets: you have Texture2D to help you load textures, and the maximum texture size is 1K by 1K. For 3D content, what we did in Touch Fighter was to use the existing exporters from the 3D modelers you have, to export the vertices into known formats, as C files.
So you can import that into the game. Or you, as a game developer, may already have a 3D pipeline in place, and it should be very easy to port that pipeline to the iPhone to get your 3D assets into the game. Remember, the iPhone uses a shared memory system with 24 megabytes of texture and surface memory. So after loading a texture, because the memory is shared, don't keep two copies of the same data: deallocate your copy once it's on the GPU.
And now some tips specific to the PowerVR MBX Lite GPU. As I said, it uses two technologies, the tile-based deferred renderer and hidden surface removal, and this means you might want to organize your command stream differently, so it's easier and more efficient for the GPU to render.
And one thing you can do is to draw the opaque geometry first, so hidden surface removal can work and discard the hidden triangles, and then draw the blended geometry. There's lots of sorting you can do to get optimal performance, and that's one first sort that gives you good performance.
And then, because everything is deferred — texture uploads, for instance — some operations, like glTexSubImage2D and glReadPixels, have performance that depends on the rendering destination. So you should be careful on this particular GPU about the expense of these operations: glTexSubImage2D, glReadPixels, all the operations that depend on the render content are expensive. Also, avoid 2D state changes, for instance viewport and scissor state changes. OpenGL ES is going to be the most efficient way to do most of your work, in particular compositing.
So really, we recommend using a full-screen OpenGL context, and avoid going through Core Animation to do compositing when you don't have to. Do as much compositing as you can in OpenGL for maximum performance. In particular, for landscape orientation in games: if you want that orientation, do it within OpenGL, by changing the matrices, not by using Core Animation. Then again, use PVRTC and mipmaps, which are pretty much zero cost performance-wise on this GPU — just a little memory footprint — but great for image quality.
Other tips, specific to OpenGL ES and also tied to the embedded world: minimize the number of state changes. Consider sorting your rendered objects per state — first the opaque objects, then the blended objects, as we said; the objects with lighting on, with depth testing on, and so on — to minimize the number of calls you send to the framework, since each has an overhead.
So sorting is one way to do it. A complementary option is to cache the current GL state in your code — to intercept it before it goes to the framework — so you store the current vertex array base address, the current bound texture, that sort of thing.
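A minimal sketch of that kind of state cache — a hypothetical helper, not something from the framework:

    // Only touch the GL state when it actually changes.
    static GLuint cachedTexture = 0;
    static inline void BindTexture2D(GLuint name) {
        if (name != cachedTexture) {
            glBindTexture(GL_TEXTURE_2D, name);
            cachedTexture = name;
        }
    }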
Also, you use a lot of vertex arrays in OpenGL ES-based games, so maximize draw efficiency by minimizing the number of draw calls that you make. And there are some tricks for that: you can use the degenerate triangle trick to put multiple strips together and send them with one draw call.
And that's what we do in Touch Fighter for all the particle systems going on — it's just one draw call when they're the same color and so on. So that's a big win. Also, use indices. And only leave the required state on, in particular lighting and blending, because fill rate is something that you need to watch.
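Here's a sketch of the degenerate-triangle trick: repeating the last index of one strip and the first index of the next creates zero-area triangles that the GPU skips, so two strips go out in a single draw call.

    // Strip A is 0 1 2 3, strip B is 4 5 6 7.
    static const GLushort indices[] = {
        0, 1, 2, 3,   // strip A
        3, 4,         // degenerate triangles joining the strips
        4, 5, 6, 7    // strip B
    };
    glDrawElements(GL_TRIANGLE_STRIP,
                   sizeof(indices) / sizeof(indices[0]),
                   GL_UNSIGNED_SHORT, indices);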
So use blending carefully: hidden surface removal on the GPU is not going to work when blending is on. And finally, when you play a movie: an H.264 movie, for instance, requires not just the current frame but other frames to be present, so a lot of memory needs to be allocated to play the movie optimally.
And so in that case, if performance is an issue, you might want to consider releasing your GL objects, and eventually the whole GL context. Before playing the cutscene in Touch Fighter, we release everything to play the movie. And now let me give a little overview of how we organized all this within the Touch Fighter source. Can we go to the demo machine, please?
Thank you. So here, let me go to applicationDidFinishLaunching. We have the application delegate class that does all the work of setting up and organizing the big structural parts of the game. In applicationDidFinishLaunching, we set up some properties on the animation, we create the window, we set the background color, and we create the game view — the OpenGL ES-based view that does the main part of the game here.
So we add this as a subview. Then we turn on multi-touch on that view — we'll talk about that later. The big part is that we initialize a timer that calls the gameLoop method at a particular interval that you set. Then we just start everything and make the window visible.
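A rough sketch of that setup — the 60 Hz interval and the updateScene/renderScene names follow the conventions of the tour below, but this isn't the literal Touch Fighter source:

    // Fire the game loop roughly 60 times a second.
    timer = [NSTimer scheduledTimerWithTimeInterval:(1.0 / 60.0)
                                             target:self
                                           selector:@selector(gameLoop)
                                           userInfo:nil
                                            repeats:YES];

    - (void)gameLoop {
        [gameView updateScene];   // move objects, read input
        [gameView renderScene];   // draw the frame with OpenGL ES
    }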
So if we look at the game loop, which is right up there, the game loop is simply calling the associated gameLoop method on the game view, and when the game is over, it's calling this playMovie method. So if we go to this playMovie method, that's really the next part of the game: you've won, the mothership has exploded, and you get into the part that's exclusively for movie playback. So we release the timer, and eventually the network timer in multiplayer. We set the background color to white for a good transition into the movie.
Then we release the view: remove it from its superview, release it. We generate the high scores, to store them while the movie plays. Then we set up the end movie. And very simply, we can start the movie by calling the play method on it — so boom, and it plays. And we associate with the movie-playback-did-finish notification a selector that will be called whenever the movie stops playing, so we can go from the full screen into the high scores.
So if I go to the movieEnded method here, this method creates a UIImageView from this little "Winners Don't Use Drugs" screen that shows up here, with just these four lines. And then we also show the high scores, so we add them as a subview, but the high scores are actually not visible at this time; they're outside the main rectangle of the display.
So what we do in these high scores: here we implement the touchesBegan method so that whenever we tap the screen, we'll create this background view with the ship rotating — an OpenGL ES view — and we'll begin an animation to bring everything on display, into the visible rectangle. Then we create the view that lets you enter your name whenever you're a good enough player in Touch Fighter to make the high scores.
So now let's have a quick peek at the main 3D view, the game view. Here in the initWithFrame method, we allocate all the necessary objects and set the associated OpenGL state. So here we set up the lighting, then we create all the textures using Texture2D. We set some parameters. We set the cameras. We create the star field.
So we create the cube map here — PVRTC on the iPhone and PNG on the simulator. Then we create the nice little planet. All these resources are located here: we allocate them, we create them, we set the position, velocity, scale. And the same for everything — so here the enemies, and here the enemy is a spaceship.
So we also create the accelerometer instance that will get us the accelerometer data here, set the smoothing and sensitivity values, and the gestures. And then we restore the NSUserDefaults that store the settings pane. And then we have the updateScene method that updates whatever needs to be updated, like the positions of the objects.
And then the renderScene method renders all these objects using OpenGL. These updateScene and renderScene methods are called through the gameLoop method in the view, so we can turn the rendering on and off. So these are the key and core elements of Touch Fighter. Let me go back to slides, please.
Okay, so that was the Touch Fighter 2 code tour. So you really see the big elements of the game. It's pretty simple to get all these things together thanks to Core Animation, which makes it very easy to bring any kind of content into the same application. So you don't have to choose one technology; you can bring them all together, smoothly and animated. And so it's a very powerful part of the iPhone.
In particular, it's very easy to bring in UIKit controls. You can create them programmatically, create other views and so on, just like you're used to doing. It's very nice to bring native UI inside a game. But you can also design your UI in Interface Builder. For instance, the settings pane — that's what we do; it's just a nib file. The high scores, same thing: it's a nib file. So in just one line, we can create a UIView from that nib file.
And bring it into the game. So if you want to animate it, we simply add this view as a subview of the current application window, and we begin an animation from a point outside the display to a point inside the display. And in four lines, we have some nib file coming into the game. As we said before, this sort of compositing might be an issue, so if you have expensive rendering, you might want to pause it, or reduce it, or at least be aware of it. So that's really nice.
And then, to play movies. So movies play full screen; they take the whole display. You can set some properties on how the scaling is done, and whether you want some HUD display and so on. And it's great for enhancing the visual quality of a game — to punctuate it with cutscenes, a big start screen, end screens.
And so on — to really add some very nice visual content and create an engaging experience in games. And it's very easy to bring a movie into a game. You simply create an MPMoviePlayerController instance from a URL, and then you can set optional properties on that instance.
And then you need to listen for the MPMoviePlayerPlaybackDidFinishNotification, so that your application is called back whenever the movie stops playing and can pick up wherever you were. And then you just call the play method on that instance, and boom, the movie starts playing full screen. It's as easy as it can be.
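A minimal sketch, assuming a cutscene bundled as cutscene.m4v (a hypothetical file name):

    NSURL *movieURL = [NSURL fileURLWithPath:
        [[NSBundle mainBundle] pathForResource:@"cutscene" ofType:@"m4v"]];
    MPMoviePlayerController *player =
        [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
    player.scalingMode = MPMovieScalingModeAspectFill;   // optional property

    // Get called back when playback finishes, so the game can resume.
    [[NSNotificationCenter defaultCenter]
        addObserver:self
           selector:@selector(movieEnded:)
               name:MPMoviePlayerPlaybackDidFinishNotification
             object:player];

    [player play];   // boom, full-screen playback starts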
When that's done, the callback will be called once the movie stops playing, so you can start up your OpenGL context, your 3D content, or whatever you want to start up again. There are lots of sessions on graphics on the iPhone today and tomorrow — 3D, 2D, graphics animation, controls and views, and video delivery on the iPhone — that we recommend you go check out. They're a great complement to this session.
So now let me talk about input. The iPhone is a fantastic platform because it has this unique way of getting data into a game and of controlling the game. In particular, multi-touch. Multi-touch is great: it's super intuitive, you can really touch the action where it happens on the screen, you can move objects on the display directly in the game.
So you can also create some new hot zones, or some new controls. For instance, you tap on the right, and it fires from the right, that sort of thing. So you can bypass the usual controllers. And that's very intuitive. That's very nice.
Another thing you can do with multi-touch is gestures. A gesture is the evolution — the trajectory — of the finger over time. And that's very nice for games, because you can actually create some special moves. For instance, imagine a golf game: you can just describe the swing, and how accurate the swing is determines how good it's going to be in the game. You can do punches in a boxing game, that sort of thing. And you can also use the gestures that you're already familiar with, for instance swipes and rotates, as we did in Touch Fighter, to throw some special weapons, like missiles and so on.
So, the main concepts in multi-touch: an event and a touch. An event is shared over the application and among all the views in the current application, and it contains all the information on the current touch event going on — all the touch instances of the current event. A touch, contained within an event, contains the information on a particular finger: the location, the phase, the number of taps, and the timestamp.
So how do you get multi-touch in your game? First, on the view, you set the multipleTouchEnabled property to YES. And then the touchesBegan, touchesMoved, and touchesEnded methods of UIResponder will be called whenever the phase of the current touch event changes.
That allows you to follow the event over time. The event will always contain all the touches, and the first argument of touchesBegan will be a set containing the subset of the event's touches that match that particular phase. So let's say I start touching the screen: touchesBegan will be called with a set containing the first touch as its first argument, and the event containing that touch. I touch the screen a second time, and touchesBegan will be called again, with only the second finger in the first argument, and the event containing both fingers. And when I move on the screen, touchesMoved will be called with both fingers in both arguments of the method. So you get the idea. It's really easy to get multi-touch into a game and to start creating some interesting interaction with it.
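As a sketch of the tap-to-fire idea from the demo (fireFromLeft and fireFromRight are hypothetical game methods):

    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        for (UITouch *touch in touches) {
            CGPoint point = [touch locationInView:self];
            // Left half of the screen fires left, right half fires right.
            if (point.x < self.bounds.size.width / 2.0)
                [self fireFromLeft];
            else
                [self fireFromRight];
        }
    }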
So now, if you want to do gestures: to do gestures, you need to connect the current touch with the previous touch, to create a trajectory and to understand how that trajectory evolves over time. And to do that, it's very easy: you get the touch instance from the set passed to your method, and you get its location.
And then, to connect that location to the previous location, there's the previousLocationInView method. Or you can simply keep a cache of the current touch objects, because they will remain the same throughout the event.
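A minimal sketch of building a trajectory in touchesMoved (appendGestureDelta: is a hypothetical accumulation method):

    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
        UITouch *touch = [touches anyObject];
        CGPoint current  = [touch locationInView:self];
        CGPoint previous = [touch previousLocationInView:self];
        // Per-frame delta; accumulate these to classify swipes, rotates, etc.
        CGPoint delta = CGPointMake(current.x - previous.x,
                                    current.y - previous.y);
        [self appendGestureDelta:delta];
    }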
The touch object remains the same per finger all along the touch event, so you can track it that way. It's pretty easy to get this, and from there you can connect these gestures into a game. And to go further, there are these sample applications, which are great: Touches, for multi-touch tracking. It's very nice.
It allows you to follow and understand which touch corresponds to what phase and what method gets called, and it shows that to you in a HUD in the application. MoveMe tracks a view and does some animations; it's nice. In Touch Fighter, we did a lot of work to analyze gestures and map them to known swipes and rotates and so on. In particular, in the gesture class in the Touch Fighter source, you can get the gesture type as soon as it's recognized.
So it can be a swipe or a rotate, and it's going to tell you the angle — the absolute, complete angle — and the center. So you can really start doing some nice things from there, and in particular, you can build on it to create and analyze the more complex gestures that you like. So that's it for multi-touch.
Now, let me talk about the accelerometer. The accelerometer is a fantastic controller, a fantastic way to control a game, because it's so natural. In Touch Fighter, you just move around to steer the ship, but in a racing game, you could also throttle and brake; in a boxing game, you could swing and punch, that sort of thing. It's very intuitive and very nice: you do something, you immediately see the effect in the game, and you understand what's going on right away.
So it's a very powerful way to control a game. You can use it as a controller, but you can also use it to design some unique input methods. For instance, you can flip to jump or shake to reload, that sort of thing. So it's really nice.
The accelerometer data is three-axis information: the vector of the current acceleration on the phone. It's stored in UIAcceleration instances with three read-only properties, x, y, and z. So it's pretty simple to get information on what the current acceleration is in your game and start using it as a controller.
So first, you set the delegate property on the shared accelerometer instance to your class. The accelerometer:didAccelerate: method will then be called on that class at an interval that you choose by setting the updateInterval property on the shared accelerometer instance. That method will be called at this frequency and will contain, as its second argument, the current acceleration, which you can query for the x, y, z values.
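A minimal sketch of that setup — the 60 Hz rate is an illustrative choice, and steerWithX:y:z: is a hypothetical game method:

    UIAccelerometer *accelerometer = [UIAccelerometer sharedAccelerometer];
    accelerometer.updateInterval = 1.0 / 60.0;
    accelerometer.delegate = self;

    - (void)accelerometer:(UIAccelerometer *)accelerometer
            didAccelerate:(UIAcceleration *)acceleration {
        // The three read-only properties, in units of g.
        [self steerWithX:acceleration.x y:acceleration.y z:acceleration.z];
    }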
To go further, there are some great samples, in particular AccelerometerGraph, which lets you understand what the accelerometer sees and what it doesn't see. It graphs the x, y, z values of the current acceleration. In particular, you'll see that the acceleration is invariant as you rotate around the current acceleration vector, so that sort of motion will not be seen by the accelerometer. Other things — I mean, game developers always know lots of physics.
So it's going to be pretty trivial, but if you move at constant velocity, the acceleration doesn't change, so constant-velocity displacements won't be seen either. That's something you can see by playing with the AccelerometerGraph sample application. Another simple sample application, WhichWayIsUp, shows you how to connect accelerometer values to orientation in the UI.
Okay, and so in Touch Fighter, we did a lot of work to get the accelerometer right as a controller. In particular, the accelerometer gives you raw values of the acceleration, and to get very smooth control in your game, you need to find a good compromise between smoothing and responsiveness.
That means you want to smooth the data to get smooth interaction, but too much smoothing will make it less responsive. The solution we found in Touch Fighter was to do a little bit of smoothing, but also some noise reduction — some signal processing targeted specifically at noise reduction — to get this very nice, smooth, yet responsive control and interaction with the game. So it's really nice.
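As a sketch, the classic low-pass filter behind this kind of smoothing — the 0.1 factor is illustrative, not what Touch Fighter ships with:

    // Blend a fraction of each new raw sample into the running value.
    // A higher factor is more responsive but noisier.
    static const double kFilteringFactor = 0.1;
    smoothedX = rawX * kFilteringFactor + smoothedX * (1.0 - kFilteringFactor);
    smoothedY = rawY * kFilteringFactor + smoothedY * (1.0 - kFilteringFactor);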
There's this accelerometer smoother class where we did a lot of work, so we encourage you to have a look; it feels really nice when you play the game. You have three properties on this class in Touch Fighter: one tells you the position, one allows you to calibrate, as we'll discuss in a second, and one allows you to control the smoothing and the sensitivity of the current acceleration.
Some other thing we learned while doing this work in Touch Fighter is that the accelerometer callback comes in on the main thread, and the main thread is the thread where you're going to do the GL rendering — and in an expensive game, that GL rendering might take a lot of time.
So even if you ask for a very high frequency on the accelerometer callbacks, they might not come as often as you expect. And that's an issue in particular when you want to do smoothing and signal processing, where a regular sampling rate is very important.
You can find your own solutions to that. The solution we found was to actually do the signal processing and the smoothing from whatever is regular in the game, and that's the GL rendering loop. Doing the smoothing in the GL rendering loop — in whatever is expensive in the current situation — allowed us to avoid jitter and irregular motion in the control. So again, check the Touch Fighter source for a smoothing and noise-reduction example using accelerometer data.
OK, so another thing I showed you in the demo: it's very nice to give gamers the ability to change their default position, as they wish, while playing the game. You might start the game playing like this, and then want to sit down and continue playing the game with that sort of inclination.
So how do you give gamers the option to do that calibration? That's really up to you; it depends on the use-case scenario that you have. You might have a game where you want to force the user to hold the iPhone parallel to the floor, to create an analogy between the iPhone and the ground.
But you might want to give the option to do this sort of recalibration. One option: you can use a three-finger tap to recalibrate whenever the user does it. In Touch Fighter, we use a technique that detects whenever the player is in some outer region — off center for some time, static over there with low variance in the movement.
We say, well, that's likely to be the new position the player is in, and so we use that as the new center of movement. And that works really nicely. So that's something you might want to have a look at if it makes sense for your scenario. But again, there's no universal answer; it's a case-by-case kind of problem, but it's nice to keep that in mind. More on multi-touch and the accelerometer in these sessions on Wednesday and Thursday.
So now let me talk about audio. The iPhone is a great iPod, and the sound quality is really nice, so it's great to be able to leverage that in a game. Core Audio and Audio Toolbox are the place to start for bringing sound into a game. They're the primary APIs for audio playback on the iPhone — a C front end that talks directly to the audio hardware for optimal sound quality and decompression.
Core Audio has several APIs, and you need to choose exactly what makes sense for what you want to do in your game. There's a very simple API called Audio Services that allows you to play short sounds. It's not for everything — it's really for punctual sounds in the UI, some beeps and boops.
So not really for mixing, but it's very easy to start with. Then there's Audio Queue Services, which allows you to play longer and compressed sounds. That's nice because it automatically loads chunks into memory, so you don't have the whole source file in memory at once; it's also for recording audio and playing streamed audio content. Now, what if you want to do some sound mixing, and even better, some 3D sound mixing? For that, there is OpenAL. OpenAL is just a line here, but it's really big.
OpenAL allows you to do 3D spatialized sound mixing. And what's very nice with OpenAL is that it shares the same coordinate system as OpenGL, so it's very easy for you, as an OpenGL developer, to go from OpenGL to OpenAL. You simply associate with the position of your 3D object a sound at the same, or close, coordinates in the OpenAL coordinate system. And then boom, from then on, you have a 3D spatialized game. And you know how this sort of audio richness and immersion can really add to the already very rich experience that you have with the graphics APIs on the iPhone.
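A minimal sketch with raw OpenAL, assuming a context already exists and laserBuffer already holds sample data:

    #include <OpenAL/al.h>

    ALuint source;
    alGenSources(1, &source);
    alSourcei(source, AL_BUFFER, laserBuffer);

    // Place the sound at the ship's position, in the same coordinate
    // system used to draw the ship with OpenGL.
    alSource3f(source, AL_POSITION, shipX, shipY, shipZ);

    // The listener sits at the camera position.
    alListener3f(AL_POSITION, cameraX, cameraY, cameraZ);

    alSourcePlay(source);   // the mixer pans and attenuates it in 3D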
So OpenAL is there, on the phone — you heard in Touch Fighter how amazing it sounds. It's really nice. It's OpenAL 1.1, without audio capture. So you can bring your existing OpenAL code to the iPhone, and it should work just as you expect.
But we also made it easy to leverage OpenAL in a game by providing a simple OpenAL interface, SoundEngine.h and .cpp, that is shared among samples — CrashLanding and Touch Fighter use that engine. It's a wrapper over OpenAL and Audio Toolbox to play 3D positional sound and mix it, so you can do mixing with simple functions.
Load an effect, set the effect position, start the effect, and boom, you have your effect playing in space and being spatialized. You also have background audio track support, to load compressed audio as a background music track in the game. And we have a very thin Objective-C wrapper in Touch Fighter over that API to set positions; it maps very well to the 3D objects that we have in the game.
Some notes about audio. Depending on the requirements that you have, you might want to choose a good compromise between performance and quality. In particular, in expensive 3D games, you might not want a sampling rate that requires a lot of CPU for the current configuration.
For instance, in Touch Fighter, with all the sounds going on, 22 kHz might be a good compromise to get great sound quality while keeping great performance. In other games, because the graphics are cheap, you might find it's fine to bump up to 44 kHz, and that's perfect.
So that really depends on the use scenario, on the expense of the other things going on in the game — so think about the sampling rate there. One thing that is important is to avoid mixing different sampling rates together, because that will incur an extra conversion step between the sampling rates, which is a cost that you want to avoid.
You can play multiple sound files and mix them, but the hardware can only decode one compressed audio file at a time. If you try to play multiple compressed audio files, the others will either play in software, if there is a software decoder, or not play at all. So keep that in mind, and consider using uncompressed sound files in those cases. Now, more about audio right after this session in Presidio, in Audio Development for the iPhone. And the lab is coming on Wednesday, where you can ask all your questions.
[Transcript missing]
So yeah, it's a great place. That's for this one-host, multiple-client configuration, but it's very easy to start from this sample source and build whatever network configuration makes sense for your game. So it's a great place to have a first look at how you can do networking on the iPhone.
Some notes on networking. The names of the devices in Bonjour are the device names you choose in iTunes for your iPhone. So if you want to do some multiplayer, you might want to change that from "My iPhone" to something more personal. And there is the System Configuration framework, which allows you to know what network is available.
So EDGE, or Wi-Fi, or 3G. There is a sample application called Reachability that gets you that information and shows you how to do it. And similarly, the sample source is a great place to start if you want to create peer-to-peer or other sorts of networks.
There's this great thing in Objective-C that, when you're in a game, allows you to send data over a connection. In a game, in Objective-C, you might have objects with a set of properties — your Objective-C objects.
And what you would like to do is send that object, all the information associated with it, over a connection, so that other clients get exactly the same object. If you make your object conform to NSCoding, simply by implementing the two methods encodeWithCoder: and initWithCoder:, you then have an easy way to send your custom objects' data over a connection.
Just with the archivedDataWithRootObject: method, you can get an NSData object from that Objective-C object, send it over the connection, and then on the other side reconstruct the object very transparently. So it's very nice to get the same object on both sides of the connection, and that's what the Objective-C NSCoding protocol allows.
For an example of that, you can check the Touch Fighter source; that's what we do to send information. Another thing: NSArray and lots of Objective-C objects are already NSCoding-compliant, so you can easily leverage that to send complex structures over the network.
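A minimal sketch with a hypothetical ShipState object (the CGPoint coding methods are the UIKit additions to NSCoder):

    @interface ShipState : NSObject <NSCoding> {
        CGPoint position;
        float   heading;
    }
    @end

    @implementation ShipState
    - (void)encodeWithCoder:(NSCoder *)coder {
        [coder encodeCGPoint:position forKey:@"position"];
        [coder encodeFloat:heading forKey:@"heading"];
    }
    - (id)initWithCoder:(NSCoder *)coder {
        if ((self = [super init])) {
            position = [coder decodeCGPointForKey:@"position"];
            heading  = [coder decodeFloatForKey:@"heading"];
        }
        return self;
    }
    @end

    // Sender: flatten the object to NSData and write it to the connection.
    NSData *data = [NSKeyedArchiver archivedDataWithRootObject:shipState];

    // Receiver: reconstruct an identical object on the other side.
    ShipState *received = [NSKeyedUnarchiver unarchiveObjectWithData:data];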
So now let's talk about location. As Allan mentioned, the iPhone knows where you are, and what that means is it opens the door to location-aware multiplayer games. And that's really big. It's an area that is very hot right now and still getting explored, so there are lots of exciting things in that domain. The iPhone can use Wi-Fi base stations, cell phone towers, and GPS to estimate your current location.
So it gets this approximation. And when your network is set up, so you know all the clients and hosts in your game, you can then put these locations in relation to one another — to know who's close, who's far — and eventually create some interaction with the people around you in the physical world.
Core Location is the API, the framework that you use to get an estimate of the current location. It tells you latitude, longitude, and altitude. And on the iPhone, it's very important to use Core Location to get only the accuracy you need, so that you minimize the battery you require — the number of cell towers that get queried to produce your estimate.
To get a location, you create a Core Location manager object, and then you set the distanceFilter and desiredAccuracy properties on that instance. The distance filter says: how much do I need to move before my current location is invalidated? And the desired accuracy says: how accurate do I want that estimate to be? Then you call the startUpdatingLocation method to start getting estimates, and you stop whenever the estimate is good enough.
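A minimal sketch — the kilometer-level accuracy is an illustrative choice for a city-level game, and useLocation: is a hypothetical game hook:

    #import <CoreLocation/CoreLocation.h>

    CLLocationManager *manager = [[CLLocationManager alloc] init];
    manager.delegate = self;
    manager.distanceFilter = 1000.0;                        // meters of movement before an update
    manager.desiredAccuracy = kCLLocationAccuracyKilometer; // city-level is enough here
    [manager startUpdatingLocation];

    - (void)locationManager:(CLLocationManager *)manager
        didUpdateToLocation:(CLLocation *)newLocation
               fromLocation:(CLLocation *)oldLocation {
        if (newLocation.horizontalAccuracy <= 1000.0) {
            [manager stopUpdatingLocation];   // good enough: stop to save battery
            [self useLocation:newLocation];
        }
    }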
So it's very important to set those parameters right, so that you don't use more power than the information you need requires. If all you want is something broad — I want to know what city I'm in — configure it accordingly. To know more about location and the accelerometer, there's the Presidio session on the accelerometer and location tomorrow, and we encourage you to go check it out.
So now we're going to do a demo of multiplayer in Touch Fighter. We built it over these game classes, so you can see how it works and try it out. And it works very nicely, so we thought we'd give you a demo. Can I go to the demo unit, please?
So what we have here: we have Allan, who has an iPhone, and he's on the same network here. So I can start a multiplayer game here — I'm going to host a game, and Allan is going to join it. You have this very nice UI where I can change my ship and it follows my finger, just like in Cover Flow, so it's really nice. So I can swipe and send these ships around.
So Allan joined the game, and so I was selecting a little early. So you can see that Allan is moving around, so that's Allan in front of me here, who is moving and playing with his iPhone here. And so I can see he's firing, and he can send missiles too, and... Here we go, they're sending missiles.
And so the whole game state is sent in real time, in the background, to Allan Schaffer, so he sees exactly the same thing as I do. It's really nice to do some multiplayer: you can start playing together and eventually compare scores and see how much he's winning and how many more ships he's killing than me, and so on. So the whole thing is going on — he sees the same thing, the explosions, all that stuff is transferred. And you see all that going on over Wi-Fi. Okay. So yeah. Thank you. So that was Touch Fighter multiplayer.
And so before asking Allan to come back on stage, I really want to give you my final words, which are that I can't wait to see what you guys are going to do, because the possibilities are really there, and the features are really there. There's a lot of innovation going on in gameplay, and I can't wait to see what's going to happen. It's going to be really exciting.
All right. Well, so over the past few months during the beta period, I've been getting a lot of questions from all of you about some device-specific issues. And so what I've compiled is just a quick FAQ of a few of the things that just seem to be questions from everybody. These are not specific to game development at all, and they're going to be covered in other sessions during the show.
But I will go ahead and go into a few of them. So first, what should you do if you're running on an iPhone and a call comes in? Well, the user is going to be presented with a screen that comes up over your game, where they're given the option to decline or answer the call. When that screen appears, a particular UIApplication delegate method is going to fire — you see it here, it's the applicationWillResignActive: delegate method — and that's when you should probably pause your game, since the user isn't going to be able to get to it.
They're deciding; they're seeing who's calling them. Now, if they take the call, your application is going to be terminated. Before that happens, you have just a few moments in the applicationWillTerminate: delegate method to save off your game state and do anything you want for the next time the user comes back into your game. Now, if the user declines the call, your game is active again, and so you get an applicationDidBecomeActive: notification.
And that's where you should probably resume your game, which was just paused in the meantime. You'd do some game-specific behavior there: maybe you restart the level, maybe you just let them hit a continue button — it's specific to how you would implement that.
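A minimal sketch of those three delegate methods, with hypothetical pauseGame/resumeGame/saveGameState helpers:

    // Incoming-call screen appeared: pause while the user decides.
    - (void)applicationWillResignActive:(UIApplication *)application {
        [self pauseGame];
    }

    // Call declined: the game is active again.
    - (void)applicationDidBecomeActive:(UIApplication *)application {
        [self resumeGame];
    }

    // Call answered: we're about to be terminated, so save state quickly.
    - (void)applicationWillTerminate:(UIApplication *)application {
        [self saveGameState];
    }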
A question I get a lot is: where can I save high scores? The issue is that application data on iPhone OS is sandboxed to your application bundle, so only your application can access its data, resources, and other things from its Documents folder and so on.
You don't have the ability, for example, to reach over into another application's resources to read their high scores, nor is there a centralized location on the iPhone for you to just store stuff. But there's a simple approach, and it has to do with NSUserDefaults, which is something you'd typically use for preferences, but it works great for this.
It has just a setObject:forKey: method and an objectForKey: retrieval method, to get and set your high scores. Now, if you're doing something a little more complex — you want global high scores from all of your users all over the Internet — that's a custom implementation. You'll have to connect over the network to a server that you host. Maybe you'd have it go through a web page; maybe you'd connect directly with CFNetwork code. That would be your choice.
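A minimal sketch, with a hypothetical "HighScores" key:

    // Save the high-score list (NSArray is already NSCoding-compliant).
    [[NSUserDefaults standardUserDefaults] setObject:highScores
                                              forKey:@"HighScores"];

    // Read it back on the next launch.
    NSArray *saved =
        [[NSUserDefaults standardUserDefaults] objectForKey:@"HighScores"];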
A couple of things about the status bar. Typically in a full-screen game, you want to hide the status bar, or change its style, or change its orientation. All of these things can be controlled by the Info.plist associated with your application. These are just different settings.
So there's a status-bar-hidden key: set that to YES if you want to hide it. You can change the style from the default, which is a sort of gray color, to black translucent if your game's colors are dark. Or you can orient it to landscape mode — in this case, landscape right for landscape orientation.
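As a sketch, the corresponding Info.plist entries look roughly like this (these match the SDK's documented status-bar keys):

    <key>UIStatusBarHidden</key>
    <true/>
    <key>UIStatusBarStyle</key>
    <string>UIStatusBarStyleBlackTranslucent</string>
    <key>UIInterfaceOrientation</key>
    <string>UIInterfaceOrientationLandscapeRight</string>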
Two more things. You might want to customize the launch image of your application. What that is: when an iPhone application first launches, it displays an image while your application state is being initialized and so on. And the name of that file is just Default.png.
If you have that at the top level of your application bundle, that's what will be shown while all your game state is loading. Now, you probably want the contents of that Default.png to be something like the zero-percent appearance of your loading screen, or just something that shows, okay, the game is now firing up.
And last, an issue that comes up if you're holding the iPhone and using just the accelerometer for input: maybe you're never actually touching the screen, and touching the screen is what keeps it awake. So there needs to be some way for you to tell the iPhone, the SDK: I still want to stay awake even if the user isn't touching anything. And this is how you can do that — it's just a property on the shared application object to disable the idle timer.
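That one is a single line on the shared application object:

    // Keep the screen awake during accelerometer-only gameplay.
    [UIApplication sharedApplication].idleTimerDisabled = YES;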
All right. So that's actually my email address. You're going to see that all over the show. But so we've given you kind of a lot of homework here. You know, you saw this. There's a lot of Touch Fighter source that we've shown you. We've told you go look at that code. You know, so come, go do that tonight, you know, after the parties, of course, and then come to the lab tomorrow. And we'd be happy to take questions there.