Graphics & Media • iOS • 37:30
iPhone OS lets you easily integrate still images and video into your apps and provides direct access to the Photo Library. You can also combine live camera input with positional information and other data to create a visually stunning app. Hear about best practices from existing use cases. Also learn how to take advantage of new features such as programmatic access to all library assets and live image buffer access.
Speakers: Emilie Kim, Hernán Eguiluz
Transcript
[Emilie Kim]
Good morning. My name is Emilie. I'm an engineer on the photos team and I'd like to welcome you here to Incorporating the Camera and Photo Library in Your App. So, I'd like to start off by first saying thank you to you guys for creating more than 3,500 photography-related applications in the App Store already.
[ Applause ]
So, with already 3,500 photography apps in the App Store, you might be asking yourself, "How can I make my application stand out amongst the sea of other applications?" So, it's a good thing you're here today because that's exactly what we're going to be talking about.
We're going to be talking about how you can really make your application shine among the rest of the applications. So first, we're going to be talking about how you can customize the camera experience in your app to really make it feel like it's a part of your application, so the user never feels like they've left it at all.
After that, we'll be talking about a new framework that we've added for iOS 4 and this allows you to do things like build your own image picker. After that, we'll be talking about how to go even beyond an image picker and to really interact with photos and videos in new ways in your application. So first, let's talk about the camera.
As most of you already know, if you want to add camera functionality to your application, what you want to do is use UIImagePickerController with the camera source type. That's pretty straightforward. What we've done in iOS 4 is we've added new APIs that let you further customize the camera experience, by allowing you to programmatically control even more features of the camera and even create your own camera controls.
So, a lot of you might be asking yourself, well, what about the iPhone 4 that was announced on Monday? How can I interact with that? So first, let's talk about flash mode. We've added a new property to UIImagePickerController called cameraFlashMode, and what this allows you to do is, obviously, interact with the flash on the back of the camera.
You can turn the flash on, which will fire the flash when you take a photo and act as a torch when you shoot video, and you can also turn the flash off, which is pretty straightforward. You can also have the flash fire automatically. But why stop here? You can be really creative with this flash in your app. You could be the first app in the store that creates a strobe light or maybe a paparazzi effect in your application. You can go crazy with the flash.
So, now that we've talked about flash, what's the other great new feature that was announced in the iPhone 4? Well, that was the front-facing camera. So, we've also exposed a cameraDevice property on UIImagePickerController. So now, you can not only take pictures and shoot video with the back camera, but you can also take pictures and video with the front camera, and you now have control over this. So, maybe you don't like the way that the camera-flip switch looks on the camera UI. You can create your own switch and tie it to the cameraDevice property.
Now, let's talk about capture mode. Video is not new in iOS 4, but what we now allow you to do is programmatically switch between taking a photo and taking a video. Before, you had to rely on the user to switch these themselves in the UI, but now you can create your own control, or programmatically tie it to something like shaking the device.
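As a rough sketch, setting these three new properties might look like the following, assuming picker is a UIImagePickerController already configured with the camera source type (the availability checks matter because not every device has a front camera or a flash):

```objc
if ([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceFront]) {
    picker.cameraDevice = UIImagePickerControllerCameraDeviceFront;
}
if ([UIImagePickerController isFlashAvailableForCameraDevice:picker.cameraDevice]) {
    picker.cameraFlashMode = UIImagePickerControllerCameraFlashModeAuto;
}
// Switching to video capture assumes kUTTypeMovie is in picker.mediaTypes.
picker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModeVideo;
```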
So now, let's talk about video quality a little bit. Before iOS 4, there were three types of video quality. You had low, medium, and high, pretty straightforward. We've added a new quality type which is 640x480 and let me explain the reason we added this. Relative qualities are actually device dependent. So, let's take for example quality type high. On an iPhone 3GS, this would record video at 640x480 VGA. However, on an iPhone 4, the same quality type would actually record a video in 720p HD because as we know, iPhone 4 can record video in HD.
On top of this, relative qualities are also camera dependent. Beyond the difference between the iPhone 3GS and the iPhone 4, the iPhone 4's front camera will actually record video at 640x480 VGA with that same quality type high. So, what this really boils down to is: you should use the quality type that's appropriate for your application.
Keep in mind that the low quality type is appropriate for cellular transmission, the medium quality type is good for Wi-Fi, and 640x480 and high are both pretty big videos, so those are really only appropriate for syncing from the user's device to their computer. That's really what you have to keep in mind when you're using quality types.
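For example, a minimal sketch of choosing an explicit quality instead of a relative one:

```objc
picker.videoQuality = UIImagePickerControllerQualityTypeLow;        // small, good for cellular
// picker.videoQuality = UIImagePickerControllerQualityTypeMedium;  // good for Wi-Fi
// picker.videoQuality = UIImagePickerControllerQualityType640x480; // VGA regardless of device
```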
So now, let's talk about programmatic capture. We had previously exposed takePicture, which allowed you to take a photo programmatically from the camera. We're now allowing you to start video recording programmatically as well. So maybe you don't want recording to actually start when the user pushes the button; maybe you want to do some kind of custom animation before the recording actually starts, as part of your movie-making app or something.
You can now do this because we're allowing you to start and stop video recording programmatically. So last, let's talk about the custom overlay view. The custom overlay view is new in 3.1, but we're talking about it now because it allows you to place your own view hierarchy on top of the existing camera preview view. So, this allows you to do something like build your own camera controls.
If you're unhappy with the UI that we have in the camera, you can create your own controls, hopefully not as ugly as these, and stick them on the preview view. You can also create a really immersive camera experience, like showing augmented reality in your camera preview view. So with that, talk is kind of cheap.
So, I'm going to invite Hernan, another photos engineer, up on stage to show you a demo of the new features we've added to UIImagePickerController for the camera. Hernan is going to show you how you can programmatically control video capture, how you can create a custom overlay view to really make the camera feel like part of your application, and how to expose custom video quality settings in your app.
[ Applause ]
Good morning everybody, my name is Hernan and I will be doing all the demos today. The first one is, as Emilie said, a simple application that shows you how to create your own custom UI and control video capture. So as you can see, this is an overlay view with buttons to switch cameras, to enable and disable the flash, and to switch between recording in standard definition, which for this app is 640x480, or high definition 720p.
All the UI requires is a double tap and it starts recording, and then when the user is done recording, you can double tap again. What we do is save the video to the camera roll, and I can show you this in a second. See, we have the video that has been recorded. So, let me show you now on the Mac, in code, how easy this is to do.
By the way, all the demos that we have include a lot of scaffolding code for the UI. Given the time that we have, what I'm going to do is concentrate only on the code that pertains to the talk. You can download all the sample code at the end of the week and look at the other parts if you're interested. So, the first thing that we need to do is set up the image picker.
We need to set the capture mode to video, and you have to remember to explicitly set the media types, in this case to movie, because by default the media types are set up for shooting still photos. We set default values for the camera device (rear), flash (off), and what I call standard definition, and all these properties are basically set every time the user taps one of the controls on the screen.
When the user chooses to start recording, we just tell the image picker to start video capture. When the user double taps again, we stop video capture. We are the delegate of the image picker controller, so we get the delegate callback didFinishPickingMediaWithInfo. We take the path of the video that was recorded by the camera and we just save it into the camera roll, the saved photos album. And that's all you need to create your own custom UI and your own custom camera experience. Emilie, back to you.
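A minimal sketch of the flow described above, with the overlay view and the double-tap handling assumed to live elsewhere in the app (overlayView, recording, and toggleRecordingWithPicker: are hypothetical names):

```objc
#import <MobileCoreServices/MobileCoreServices.h> // for kUTTypeMovie

- (void)presentCustomCamera {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    // Explicitly ask for movies; the default media types are for still photos.
    picker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
    picker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModeVideo;
    picker.cameraDevice = UIImagePickerControllerCameraDeviceRear;
    picker.cameraFlashMode = UIImagePickerControllerCameraFlashModeOff;
    picker.videoQuality = UIImagePickerControllerQualityType640x480;
    picker.showsCameraControls = NO;             // hide the built-in controls...
    picker.cameraOverlayView = self.overlayView; // ...and show our own (hypothetical view)
    picker.delegate = self;
    [self presentModalViewController:picker animated:YES];
    [picker release];
}

// Called from the double-tap gesture (hypothetical method).
- (void)toggleRecordingWithPicker:(UIImagePickerController *)picker {
    if (self.recording) [picker stopVideoCapture];
    else                [picker startVideoCapture];
    self.recording = !self.recording;
}

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // Save the recorded movie into the camera roll / saved photos album.
    NSString *path = [[info objectForKey:UIImagePickerControllerMediaURL] path];
    if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(path)) {
        UISaveVideoAtPathToSavedPhotosAlbum(path, nil, NULL, NULL);
    }
}
```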
[ Applause ]
[Emilie Kim]
Thanks Hernan. So what Hernan showed you in his demo is really how fully customizable you can make the camera in your application. You can make the controls look however you want, and because of the new APIs that we've added to UIImagePickerController, you can really control pretty much every aspect of the camera now. It's also really easy to use. It's pretty much just a matter of setting certain properties and now you have a fully functioning camera in your app, which leaves you more time to actually develop your application and less time trying to figure out how to use the camera.
Now, if you really want even more control, for you power junkies out there, I encourage you to check out the AVFoundation framework, which is also new in iOS 4. There's a session on using the camera with AVFoundation, which I encourage you to check out on video after the conference. So now, we've talked about how you can really customize the camera experience in your app.
But what happens after you've taken that picture? What if you want access to the user's photo library? Traditionally, this has been done with UIImagePickerController. You bring up the UIImagePicker and you allow the user to choose a photo. Now, this is really great for basic photo picking but as evidenced by all of your feature requests, this isn't enough for you guys.
Among your feature requests, you guys really want more control over the behavior of the image picker. You also want more control over the look and feel, as well as access to full-sized images, not the little dinky images we give back, as well as access to the image metadata. Now, for those of you eagle-eyed viewers in the audience, when you saw the iMovie demo on Monday in the keynote, you might have noticed that they were actually picking photos and videos out of the photo library on the device.
But you might be saying to yourself, I never actually saw the white table of the UIImagePicker come up or anything, and that's because they weren't using UIImagePickerController. iMovie has actually built their own image picker using a new framework that we've added in iOS 4, and so now we're going to talk about how you too can build your own image picker.
So, we're now going to talk about the new framework, which is the Assets Library framework. Like I said, this is a new framework that we added in iOS 4, and it gives you access to photo library images and videos just like an image picker. But what else does it give you? You also have access to the original photo and video data as well as the metadata. But before we get too much into that (I promise we'll talk about it), let's take a step back and talk about the basics.
When you're thinking about how to structure a photo library framework, you're probably thinking of something that's very similar to the existing Photos app. So, you have a library, and a library contains albums. Within each album, you have photos, and within each photo, you have certain data that you can get, such as the full-size image.
So, let's talk about the classes. We have five new classes in the Assets Library framework: ALAssetsLibrary, ALAssetsGroup, ALAssetsFilter, ALAsset, and ALAssetRepresentation. That's very hard to say. We're going to talk about each one of these classes in more detail. Let's first talk about ALAssetsLibrary. This is the root photo library object that contains the user's photos. But before you can actually enumerate through the user's photo albums, the user first has to grant your application permission, and the reason for this is that photos can contain potentially sensitive information, such as the date and time they were taken or the location where they were taken.
So, the first time your application tries to enumerate through the user's photo library, a dialogue is presented on behalf of the application to ask the user if it's OK to access this data. I'd like to point out that this does not affect the UIImagePickerController, because the UIImagePickerController has not, and continues to not, vend potentially sensitive information. After the user has granted your application access, they can also revoke it or re-enable it in the Settings app. So, the user grants your application permission. Now, you have access to the user's photo library.
Once you have access, you can enumerate through the photo albums, and once you have a photo album, you have what's called an ALAssetsGroup. This is analogous to a photo album object. There are five different kinds of groups that you can get. First is the saved photos album. This is like the camera roll on an iPhone or the saved photos album on an iPod touch.
You also have access to the photo library, which is a synthesized album that contains all of the synced photos and videos that came from iPhoto through iTunes. You also have access to the albums that come from iPhoto, as well as the Events and Faces that also come from iPhoto.
So now that you have group types, maybe you want to filter the kinds of assets that are in your group. Maybe you want just photos or just the videos. You can do that by setting a filter on the group using an ALAssetsFilter which Hernan will talk about more in the demo.
There are also group properties that you can get on a group which are appropriate for displaying in the UI. So, if you're going to create your own image picker, you want the name of the album. We've exposed the name, as well as the count, which is actually dependent on the filter that you've set, as well as the album thumbnail. Again, appropriate for displaying in your own image picker.
So now, how do you actually get the assets that are in a group? We've exposed blocks-based enumeration APIs which are actually very similar to the ones that are now on NSArray. So, if you're already familiar with the NSArray enumeration APIs, then you already know how to get the assets in a group.
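As a sketch, assuming you already have an ALAssetsGroup in hand, a plain enumeration and the reverse/range variants described next look roughly like this:

```objc
// Everything, in the default (chronological) order.
[group enumerateAssetsUsingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop) {
    if (result == nil) return; // nil marks the end of the enumeration
    // ... load a thumbnail into the UI ...
}];

// Or just the newest 20 assets, newest first, e.g. for the first screenful.
NSInteger count = [group numberOfAssets];
NSUInteger want = MIN(count, 20);
NSRange range = NSMakeRange(count - want, want);
[group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndexesInRange:range]
                        options:NSEnumerationReverse
                     usingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop) {
    if (result == nil) return;
    // ... load only these as the user scrolls ...
}];
```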
You can enumerate chronologically, or by setting the right enumeration options you can actually go in reverse and display the most recent photos first. Or maybe you really want to optimize scrolling performance: you can enumerate certain ranges of assets at a time by setting the index set that you want, so you only have to load certain assets at a time as the user is scrolling through your app. So now, let's talk about an asset. An asset isn't what you think. It's not just a photo or just a video. An asset actually contains multiple representations.
Now, let's talk a little bit more about this. Let's say you had a camera that shoots both RAW and JPG. It's really the same picture, just in two different formats. Another example of this is actually AVI + THM. Let's say you have an AVI movie, and then you have the THM file, which is the thumbnail that's associated with that movie. These are two different representations, but they are really of the same video, or the same photo in the case of RAW + JPG.
So, an asset is really just a container for these multiple representations. We give you access to the default representation, which is what we think you really want access to. So, in the case of RAW + JPG, this is the JPG, and in the case of AVI + THM, this would be the AVI movie file.
But you probably know better than we do. So if your application really actually wants to access the RAW file or the THM file, you can also have access to all the representations. Assets also have properties which are generic to all of the representations that are contained within the asset.
This includes the thumbnail as well as the date and the location, which we'll talk about more in the third part of the session. If you want even more information, we have ALAssetRepresentation, which is the representation object. You can ask for a persistent URL for this asset representation, and this is really key, because what this allows you to do is restore state in your application as well as retrieve previously accessed assets. So, if your user has decided that they like these three assets and you want to store those away, all you have to do is store the URLs, and the next time the user launches your app, you can just bring back the assets for them.
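A sketch of that round trip, assuming you keep the ALAssetsLibrary object around (the defaults key is just an illustration):

```objc
// When the user favorites an asset, store its persistent URL.
NSURL *assetURL = [[asset defaultRepresentation] url];
[[NSUserDefaults standardUserDefaults] setObject:[assetURL absoluteString]
                                          forKey:@"favoriteAssetURL"];

// On a later launch, bring the asset back.
NSURL *savedURL = [NSURL URLWithString:
    [[NSUserDefaults standardUserDefaults] stringForKey:@"favoriteAssetURL"]];
[library assetForURL:savedURL
         resultBlock:^(ALAsset *restored) {
             // ... put the asset back into the UI ...
         }
        failureBlock:^(NSError *error) {
             // The user may have denied access; tell them rather than failing silently.
        }];
```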
You can also get the full screen image for an asset representation, and this is a CGImage that's appropriate for displaying an image full screen. I'd like to point out that this does not mean the image is exactly the size of the device screen. It's actually a little bit bigger, so you can allow the user to zoom in a little. This also means that the dimensions of the image are actually hardware dependent. So, we've talked about the five classes in the Assets Library framework.
So now, I'd like to invite Hernan up on stage again to show you how you can use these five classes to create your own image picker. What you're going to see is how you can create your own image picker by enumerating through the different groups and assets as well as filtering the group's contents.
You can also display full screen images just like you can in the image picker, and Hernan will also show you how you can handle error conditions. Hernan?
[ Applause ]
Hernan Eguiluz: Thank you Emilie.
[ Applause ]
So as Emilie said, what we did, using the AssetsLibrary framework, is create a simple replacement for the image picker.
So at the top level, we have a list of all the albums, events, and faces, if we had any data for those. That is basically an enumeration of all the groups in the AssetsLibrary. As you would expect, when you select one of the groups or albums, we retrieve all the assets associated with it, get their thumbnails, and display them in a grid.
Tapping one, we go into the full screen image, not the full-size image. As you can see, you can zoom in, but not too much. It's optimized for displaying on the device. And that's basically a simple image picker. Let me show you in code how you can do this. Let's switch over to the Mac.
The first thing that we need to do is create an ALAssetsLibrary object. Once we have that, we can start the enumeration. We basically select what group types we want, in this case all the albums, events, and faces. You could also ask for the saved photos album, or whatever you want from the library; these were the right ones for this demo.
So we call enumerateGroupsWithTypes on the AssetsLibrary, passing in the group types and two blocks. One of the blocks is the one that's going to be receiving results from the AssetsLibrary, and the other one is used for handling failures or errors. So, let me show you what they look like.
The enumeration one is very simple. Each ALAssetsGroup that we get, we store in an array and update the UI. What I want to point out here is that we have to check whether this is really a valid object, because the termination condition is that the group passed in here is nil; that's how you detect that the enumeration is finished.
For the failure block, the important thing to know here is that in cases where you cannot contact the framework, or particularly when the user has denied access to our data, either for your application or for all applications, you will get called back through the failure block with an error.
We already provide you with a localized description and a recovery suggestion that you can use in the UI, so that you can tell the user what to do if they want to see content. In this case, we just put up a special UI that presents this information to the user. You should probably handle this case differently from other errors, and you shouldn't be dropping it on the floor. You really want to communicate this information to the user.
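Putting that together, a minimal sketch of the group enumeration being described (the groups array backing the table view is hypothetical):

```objc
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
ALAssetsGroupType types = ALAssetsGroupAlbum | ALAssetsGroupEvent | ALAssetsGroupFaces;

[library enumerateGroupsWithTypes:types
                       usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
                           if (group == nil) {
                               // nil is the termination condition; refresh the UI here.
                               return;
                           }
                           [self.groups addObject:group]; // hypothetical backing array
                       }
                     failureBlock:^(NSError *error) {
                         // Typically access was denied; surface the provided strings.
                         NSLog(@"%@ %@", [error localizedDescription],
                                         [error localizedRecoverySuggestion]);
                     }];
```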
The last thing that we need to do at the group level is get the information that we display in the cell. We ask the group for its name and then for its poster image, which is a CGImage, so we need to create a UIImage out of it and put it in the UI.
For assets, once the user has selected one of the groups, the first thing we want to do for this demo is create a filter that will show only photos. So we ask the ALAssetsFilter class for the allPhotos filter and just set it on the group that we got from the root view controller. Then we call enumerateAssetsUsingBlock on the group, and the enumeration here works pretty much like in the case of groups.
It's a little bit more involved: you get a result, which again will be nil when the enumeration has finished, but it also passes you back the index of the asset within the group, because we have special enumeration APIs where you can pass ranges. As Emilie said, maybe you want to do that to optimize the performance of your application. In order to display the thumbnail on screen, we ask the asset for its thumbnail; again, it's a CGImage, so we convert it into a UIImage and put it in the UIImageView.
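In code, that filter-and-enumerate step might look roughly like:

```objc
[group setAssetsFilter:[ALAssetsFilter allPhotos]]; // photos only for this demo

[group enumerateAssetsUsingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop) {
    if (result == nil) return; // enumeration finished
    UIImage *thumb = [UIImage imageWithCGImage:[result thumbnail]];
    // ... place 'thumb' in the grid cell at 'index' ...
}];
```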
Finally, when displaying the full screen image, keep in mind that you need to do this from an asset representation. For the demo I'm just using the default one, but you can access any of them using the UTI types. We get the full screen image from the representation, plus its scale and orientation, because we need to preserve the orientation that came from the camera, and then we use a convenience method on UIImage which is new in iOS 4.0: imageWithCGImage:scale:orientation:. Then you set that in the UI and you're basically done, and this is all the code that you need to create your own image picker. Back to you Emilie.
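A sketch of that last step, assuming asset is the selected ALAsset and imageView is your view:

```objc
ALAssetRepresentation *rep = [asset defaultRepresentation];
// Preserve the scale and the orientation recorded by the camera.
UIImage *image = [UIImage imageWithCGImage:[rep fullScreenImage]
                                     scale:[rep scale]
                               orientation:(UIImageOrientation)[rep orientation]];
self.imageView.image = image;
```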
[ Applause ]
[Emilie Kim]
Thanks Hernan. So, what Hernan just showed you is how you can build your own image picker using the really simple and concise API that we've provided in the AssetsLibrary framework. Now, some points to remember: you should always provide the failure block so you can fail gracefully in case the user denies access to your application, and the image data really comes from the asset representation, not the asset. So, now you've built your own image picker. That's cool, but there's already an image picker on the system.
Why would you build something that's exactly the same? So now, let's talk about how you can go even beyond the image picker and really customize the photo and video experience in your application. The first thing you can do is create an image picker but really customize the behavior to do exactly what you want it to do. You can also access full-sized images as well as integrate with other system technologies. Finally, you can create innovative UI based on the metadata that we provide in the AssetsLibrary framework.
So first, let's talk about customizing image picker behavior. Many of you have requested the ability to select multiple images when using the image picker. Well, with the AssetsLibrary framework, if you build your own image picker, you can just do this yourself. So, if you want to implement multiple selection, you can do that. Many of you have also requested the ability to have custom image sources in your application.
So, if you want to bundle some resources in your app and then have the user pick from them as well as photos in their photo library, there's no way to do that using UIImagePickerController but using the AssetsLibrary framework, you have full control over the UI. If you want your image source to be a first class citizen along with the rest of the user's photo library, you can do that.
What this really means is you can do whatever you want. If you want the photo to spin before going into full screen (I hope you don't, that's kind of weird looking), you can. It's your app, your image picker behavior. To bring up the iMovie demo again: you never even noticed that you had left iMovie, even though they were really picking images and videos out of the user's photo library, and this is because they customized the look and feel of their image picker so that it really felt like one continual iMovie experience. This is what you can do now in your app also. So now, let's talk about full-sized images. Many of you want access to the full-sized image, so now we're giving it to you.
You can get the full resolution image, which is a CGImage that contains the original image content. However, you really want to make sure you only use the full-sized image when it's really necessary for your application. These images can be really big, so you don't want to load one just to display it full screen when a full-screen image would do.
This also depends on our ability to decode the kind of media that you've chosen. If the hardware cannot decode the particular RAW type that you're trying to use, then you're not going to get a CGImage back. But don't worry, we also give you access to the raw bytes of the data.
You can now read in large images and videos using raw byte access, as well as image formats that we can't decode into a CGImage. You don't have to worry about handling file descriptors or anything like that. You just pass in a buffer and a range, and we give you bytes back. It's pretty straightforward. So now, let's talk about framework integration. Because the AssetsLibrary framework is new in iOS 4, you can now integrate with many other system frameworks. So, let's take ImageIO for example.
This is another framework that's new in iOS 4, and you can use ImageIO to create custom thumbnails, for example. Maybe you don't like the square thumbnails that come back from the AssetsLibrary framework. You can create aspect-ratio thumbnails if you want, or maybe you want the thumbnails to be even bigger. You can do that too using ImageIO. I encourage you to read the developer documentation for more information about ImageIO.
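For instance, a sketch of building a larger thumbnail with ImageIO, assuming data holds bytes you've read from an asset representation:

```objc
#import <ImageIO/ImageIO.h>

CGImageSourceRef source = CGImageSourceCreateWithData((CFDataRef)data, NULL);
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
    (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailFromImageAlways,
    (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,   // honor orientation
    [NSNumber numberWithInt:320], (id)kCGImageSourceThumbnailMaxPixelSize, // bigger than default
    nil];
CGImageRef thumb = CGImageSourceCreateThumbnailAtIndex(source, 0, (CFDictionaryRef)options);
UIImage *customThumbnail = [UIImage imageWithCGImage:thumb];
CGImageRelease(thumb);
CFRelease(source);
```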
Another framework that's new in iOS 4 is AVFoundation. You can use the persistent URL from an asset representation, like I mentioned earlier, to create an AVURLAsset, and using this AVURLAsset, you can work with the AVFoundation framework to do pretty much anything you want with videos. You can play, compose, edit, whatever your application wants to do, and so I encourage you to attend the Discovering AVFoundation session, which is later today.
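The hookup itself is a short sketch:

```objc
#import <AVFoundation/AVFoundation.h>

NSURL *url = [[asset defaultRepresentation] url]; // the persistent URL
AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:url options:nil];
// ... feed this into an AVPlayerItem, AVMutableComposition, and so on ...
```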
So now, let's talk about metadata. All of you have wanted metadata returned with the images, and now you can get it. Maybe you want to badge some thumbnails that are in your image picker. We expose duration for videos, so you can badge with duration. We've also exposed the kinds of formats that the representations are in, so if you want to badge the thumbnails that way, you can do that as well.
Maybe you want to filter on photo orientation and just display the landscape photos. You can do that using the orientation metadata. You can also create a timeline with the date and time metadata that we return to you. This is something that you couldn't do with the image picker before, and now you can.
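A sketch of pulling those pieces of metadata off an asset:

```objc
// Generic asset properties, handy for badging, filtering, and timelines.
NSNumber *duration    = [asset valueForProperty:ALAssetPropertyDuration];    // videos only
NSDate   *date        = [asset valueForProperty:ALAssetPropertyDate];
NSNumber *orientation = [asset valueForProperty:ALAssetPropertyOrientation];
CLLocation *location  = [asset valueForProperty:ALAssetPropertyLocation];

// The full EXIF/TIFF/GPS dictionaries live on the representation.
NSDictionary *metadata = [[asset defaultRepresentation] metadata];
```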
You can also display images geographically, much like the new Places feature that we've added to the Photos application, using the GPS location information that we return in photos. Finally, you actually have full access to the EXIF metadata in the image. So, maybe you want to filter your images based on exposure, I don't know. You can do whatever you want to do. So with that in mind, I'd like to invite Hernan back up on stage for one more demo to show you just exactly how powerful the AssetsLibrary framework is in enabling you to build whatever applications you want to build.
First, he's going to show you how you can store and retrieve assets using an asset URL as well as how to create a custom image source. After that, he'll show you how you can read original image data and explore the metadata that's associated with an asset. So, Hernan.
Hernan Eguiluz: Thank you Emilie.
[ Applause ]
So for this third demo, we were thinking about how we can give the user a different experience, something different from what the image picker gives you all the time. How can we be different? We centered our demo on using metadata in different ways, and also on a couple of things for the user experience. The first one is, when you list the albums, you don't have to have small thumbnails. You can have big ones.
So it's easier for the user to select them. In the normal image picker controller, when an album is selected, you go into a grid of photos. So, why do that? Why not just go into a map that shows you where each one of the assets was taken?
But, you know, this is similar to what the Photos app already gives you, so I think we can do better than that. How about we show the user the path of how and when the photos were taken, given that we know when each photo was taken. So, we added this Play button, which, basically using the MapKit overlay API that's new in iOS 4.0, shows you how I took the photos; in this case, around the main campus and Infinite Loop. In another case, I went for a hike.
You know, where did I hike, how did I move around, I got lost, came back, got lost again and then found my car.
[ Laughter ]
When you select one of the images, basically, we have a standard MapKit callout. We leverage the thumbnail that we already have for the asset, and we can show you, in this case, the full-sized image. See, we can really, really zoom in and out on this image. Another thing that we can do that the Photos application doesn't give you is show you all the metadata associated with the file.
For example, in this case: the GPS data, pixel height, pixel width, Y density, et cetera. I'm going to show you this in the code in a minute; this is just a dictionary that we get from ImageIO and pass back to you, and you have to interpret it. We don't do any interpretation, and what information is here depends on the camera that took the photo.
But I mentioned before that it would be cool to also support other kinds of user interaction in the application. So we thought, well, leveraging the idea of persistent URLs, we could create our own groups, our own albums. So, we came up with the idea of a favorites album.
As you can see, at the bottom left corner there's a star, which lets us tap it and mark this photo as a favorite. So, when we go back, we now have in the list of albums a list of favorite photos. When you select it, we basically retrieve all the assets from the AssetsLibrary using their persistent URLs.
So, with that, let me show you the highlights of this demo in code. Let's switch back to the Mac, please. Again, I'm not going to go over the enumeration of groups and things like that I already covered. Basically, when we enumerate assets, we get the location. We need to check that the location is valid, because there are photos in your photo library that may not have location information, and then only those that are valid will be added to the map as a standard map annotation.
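A sketch of that check inside the asset enumeration block (MKPointAnnotation is new in iOS 4.0; the map view outlet is hypothetical):

```objc
id value = [result valueForProperty:ALAssetPropertyLocation];
// Guard: assets without GPS data may return nil or ALErrorInvalidProperty.
if (value != nil && value != ALErrorInvalidProperty) {
    CLLocation *location = (CLLocation *)value;
    MKPointAnnotation *annotation = [[MKPointAnnotation alloc] init];
    annotation.coordinate = location.coordinate;
    [self.mapView addAnnotation:annotation];
    [annotation release];
}
```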
When the user taps in a callout view and we display the full-sized image, what we want to do in this case is use the getBytes:fromOffset:length: API. We basically need to pass in a buffer of the right length for the asset that you're reading. In this case, because we know that it's a small image, we can just pass a buffer that is big enough, so we read all of the image from the beginning.
If you are not going to display the image right away, or if you are reading a video and you don't know how big it is, if it's really big we recommend that you read it in chunks: process, then keep reading, and so on, because you're going to run out of memory otherwise.
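A sketch of that chunked read, assuming rep is the ALAssetRepresentation:

```objc
NSMutableData *data = [NSMutableData data];
uint8_t buffer[64 * 1024]; // read 64 KB at a time instead of the whole file
long long offset = 0;
long long size = [rep size];

while (offset < size) {
    NSError *error = nil;
    NSUInteger bytesRead = [rep getBytes:buffer fromOffset:offset
                                  length:sizeof(buffer) error:&error];
    if (bytesRead == 0) break; // inspect 'error' here
    [data appendBytes:buffer length:bytesRead];
    offset += bytesRead;
    // For huge videos, process each chunk here instead of accumulating it all.
}
```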
Once we have the data, we create an NSData out of it and then go through the normal CGImage APIs to build a UIImage and display it in the UI. For metadata, that screen that I showed you is extremely simple. Again, the metadata comes from a particular representation, not from the asset itself, so we get the default representation.
You could get a different one if you wanted, and then just ask it for the metadata, which is just a normal dictionary. In this demo it's formatted somewhat quickly to display in the UI, but basically at that point you have EXIF, TIFF, et cetera, all that information available to you. And the last thing that I want to show you is the assets list, which is an object that manages what the map displays.
When we are enumerating based on the favorites group, we don't really have a group; it's, if you will, a synthesized group, something that we created that is not stored in the AssetsLibrary. So, we won't have an ALAssetsGroup. We just have, in this case, an array of persistent URLs.
So, what we do is call assetForURL on the AssetsLibrary, and in the result block we basically add the ALAsset objects into an array. And please do use the failure block, because this call can also trigger the request for access to the user's data. With this code, plus the code that you've seen in the previous demos, you can basically create whatever you want for displaying information to users in whichever way makes the most sense for your application and your users. Back to you Emilie.
[ Applause ]
[Emilie Kim]
Thanks Hernan. So, what Hernan showed you was really cool. You can basically take images from the user's photo library and put them on a map, really customizing your image picker behavior. Now keep in mind, you want to use the full resolution images only when it's really appropriate.
So, maybe you're uploading to a printing service or something, that's really when you need the full resolution image. Any other time, the full screen image will probably suffice. You can also create new and interesting UIs in your application using the metadata. I mean, really, you can go crazy with the metadata.
If the picture has it, why not use it? So with that, we're really looking forward to seeing how you guys are going to interact with photos and videos in new ways in your applications. So now, coming back to how to make your applications stand out. There are so many photography-related apps in the App Store. How do you make yours shine? Well, we've armed you with all of the knowledge you need from this session to really make your applications stand out. You can build and design your own camera controls, really creating an immersive experience in your application.
You can also take advantage of the AssetsLibrary framework and do amazing things like, well, building your own image picker (that's not really that amazing), but really customizing the behavior to make it yours, customizing the look and feel so the user feels like they've never left your application. You can enrich your application with access to full-sized images, if appropriate. You can also take it to the next level with system integration. ImageIO and AVFoundation are only two of the system frameworks that AssetsLibrary can play well with.
Obviously, you just saw in the third demo that you can also use MapKit and integrate really well with the AssetsLibrary framework. So, the system is at your fingertips now that you have access to these photos and videos. Now, we're really looking forward to seeing what you guys are going to come up with using the metadata.
This is something you guys have been asking for for a long time, and we're really excited to see what you come up with. So for more information, please contact Mark Malone, who is our Integration Technologies Evangelist. I also encourage you to read the documentation at developer.apple.com, as well as ask questions on the Apple Developer Forums.
For related sessions, if you're interested in learning more about AVFoundation, there are three AVFoundation sessions that you should definitely check out on video, one of which will be repeated this afternoon, as well as Customizing Maps with Overlays. If you're interested in how we implemented the third demo, I encourage you to attend that session later this morning.