App Frameworks • iOS, macOS, tvOS, watchOS • 49:21
Discover what's new in accessibility in iOS, watchOS, macOS, and tvOS. Learn how to audit an app for accessibility and take advantage of new assistive features. Topics include how to best customize the way VoiceOver describes an app's UI to the user, how to enhance the accessibility user experience with new features like drag and drop, and how to take advantage of sophisticated accessibility APIs to solve common issues.
Speaker: Skylar Peterson
Transcript
Good afternoon everyone.
[ Applause ]
I hope everybody's having a great WWDC so far, and that some of you managed to go see Todd's talk just a little bit ago. He's a great speaker and an all-around awesome dude, so I hope you enjoyed that. My name is Skylar Peterson, I'm a member of Apple's iOS Accessibility team, and I'm here today to talk to you about what's new in accessibility this year.
So, for those of you who don't know exactly what I mean when I say accessibility: well, first of all, I'm happy that you're here, and second, when we think about accessibility, it's really about making technology usable by everyone, regardless of whatever their unique needs might be. At Apple, we approach this problem by examining four major classes of user ability: cognitive, motor, vision, and hearing. Cognitive encompasses conditions like dyslexia or autism.
Motor examines the way in which a user physically interacts with the system, whether they need special accommodations for a condition like Parkinson's or cerebral palsy. Vision encompasses a range of visual ability, from those with low-vision conditions to those who are completely blind. And finally hearing, which like vision encompasses a spectrum of ability, from those who are hard of hearing to those who are completely deaf.
Worldwide there are over a billion people who have some form of disability that fits into one of those categories. Put another way, one in seven people, one in seven, have a form of disability, which means that you're going to have, and already do have, users with a disability. But at the end of the day, accessibility isn't about numbers, it's about people.
Accessibility is about the way in which people interact with the world. It's about technology's transformative power to enable. It's about expanding the possibilities of what people can do. Accessibility is full of incredible stories, stories like Todd's. In fact, we unveiled a video series highlighting some of these stories, and I'd like to share one of them with you now.
Now, are you going to be able to play today or not, Morgan? All right, guys. Ready to try it? Let's try it. Clarinets, you up?
[ Music ]
Clarinets ready?
[ Music ]
I love seeing the "I got it" face when they understand and get something. You can actually see it in their faces, and that encourages me to do more.
[ Music ]
Clean music joke. Howard, go. We tend to joke around the classroom, I think that's part of teaching let them see that you're human. I said a joke not two words, sit down.
[ Music ]
[ Music ]
I can stream the music from the iPad into my hearing aids and actually it sounds like they're right next to me, so I can grade it right then and there. Band is a family.
[ Music ]
[ Music ]
[ Music ]
[ Applause ]
So, we've gone about tackling the problems that people like Shane face every day by creating a wide array of baked-in, system-level features across all of our platforms, which includes making sure that our software works well with an assortment of hardware accessories made specifically for people with disabilities.
But at the end of the day what really makes Apple's platforms great is the work that people like you do every day to make sure that all of those amazing bits of content that you work on work for everyone. So, with that in mind, let's take a look at what we're going to talk about today.
The first thing I'd like to do is highlight some of our new assistive features across all of our platforms. And from there I'd like to shift to talking about what you as a developer can do to make sure that your stuff works with our stuff. And that really starts by auditing your own app for accessibility, looking at what the current accessibility experience is like. And from there we're going to take a look at some of the basics of our accessibility API and the way that you can take advantage of it to solve some of the issues you find during your audit.
From there we're going to go beyond the basics and look at some of our more advanced APIs to help you solve some common problems that you might find while making your own apps accessible. And finally, I'd like to spend a little bit of time talking about some accessibility considerations that you should make when implementing drag and drop in iOS 11. So, let's get started.
The first feature I'd like to talk about today is text detection. So, let's imagine for a second that you're a VoiceOver user who's blind, you're scrolling through your favorite social media app, and you come across an image like this, where somebody has embedded text within an image.
When you tap on it, you're probably not going to hear a description of what's actually in the image unless the developer has gone through and done the work to put a label on it that describes it to you. But this year in iOS 11, as well as on macOS, if you focus on an image we're going to do some basic text detection to try and figure out if there's text within that image, and if it makes sense to speak it, then we're going to speak it to the user, and it sounds a little something like this.
- Possible text, I can't believe I took this with my iPhone.
- Now, in that same vein, we're introducing improved photo descriptions for VoiceOver users systemwide. So, when a VoiceOver user focuses on an image, we're going to run some basic analysis to try and figure out what's contained within that image and speak details about it so that VoiceOver users get a richer description. We look for things like the number of faces within an image, what expressions they're making, what the overall scene of the image is, and its blurriness or brightness level. So, a description for an image like this one might sound like this.
- One face, one smiling, nightclub, blurry, bright.
- Now if you're a developer who's already gone through and done your due diligence of labeling all your images, don't worry, we're not just going to hijack that description from you. Instead, if a user wants to hear the description that we ourselves have generated, all they need to do is a single three-finger tap on the image and they'll hear our description.
Next, we have large text. So, this year our team did an audit of the large text experience systemwide, with a real focus on the experience of those users who use our largest accessibility text sizes. And we worked with teams across Apple to really, really improve this experience, which also includes some new APIs.
Now there's a lot that I could say about this feature. In fact, there's so much that I could say about this feature that there's a whole other session dedicated just to Dynamic Type. It's called Building Apps with Dynamic Type, it's going to be in the Executive Ballroom Friday at 1:50, and I highly, highly recommend that you attend to figure out how to best adapt your own apps for Dynamic Type in iOS 11.
Now there are two other sessions up here that I'd like to point out that also contain relevant information about Dynamic Type. The first, Design for Everyone, is a session that focuses on designing for accessibility, but it will also contain a portion covering Dynamic Type, including some of the rationale behind the decisions we made when adapting layouts for large text, and design considerations that you yourself should make when designing your own apps. And second, Auto Layout Techniques in Interface Builder will have a portion dedicated to adapting layouts built with Interface Builder for Dynamic Type. So, I hope to see you all at these sessions.
Moving on, we have the Accessibility Keyboard. This is a feature built for macOS for users who have enough dexterity to use a trackpad but can't necessarily use a physical keyboard. It's an onscreen keyboard that has things like system controls and a predictive text bar, and it's fully customizable, which means that if there are certain actions I perform on a daily basis, I can create a custom panel for them so that a single click initiates the action. For instance, if I email my manager every day, I might create a custom panel that opens Mail, composes an email, and adds her address.
Next, we have Type to Siri. Like the title suggests, this feature allows you to interact with Siri via text input just as you would via speech. And we're really excited about this, because it's going to open up Siri to a class of users who were unable to use it before: those who are nonverbal.
Finally, we have Smart Invert Colors. Previously on iOS, if you enabled the invert colors feature, it would invert all the content on your device regardless of what kind of content it was. But this year we're introducing Smart Invert Colors, which looks at specific kinds of content, like graphics or images, and doesn't invert them, so that you can see the content as it was actually meant to be seen. Now, I'd actually like to do a demo of that feature for you.
So, I have my device here, and I'm going to enable Smart Invert Colors. As you can see, the wallpaper wasn't inverted, nor were any of the app icons. And if I go into my Photos app, you'll see that I still have the dark UI, which is important for invert colors users with certain low-vision conditions, where a white UI makes the display too difficult to look at for prolonged periods of time. And I can still see my photos as they originally were.
Now, if I go to an app like Clock, which already had a dark UI, you'll notice that we haven't inverted that UI either, keeping the dark look that is so important to our invert colors users. That's just a very basic overview of the feature, and I highly recommend that you check it out yourself and see how it plays with your own apps.
But now I'd like to pivot into talking about your experience as a developer, and how you make sure that all of your work works with all the things that I just covered, as well as all of our existing features. And that really begins by auditing your app for accessibility.
So, when we're starting an audit, the first thing you're going to want to do is enable an assistive technology like VoiceOver or Switch Control and see what the experience is like for its users. To do that, we're going to go to Settings, tap General, then Accessibility, go down to the Accessibility Shortcut page, and add VoiceOver to our shortcut.
What this does is that now, whenever I triple-click the Home button, VoiceOver will turn on, and if VoiceOver is already on, it will turn off, so I can quickly cycle between the two as I'm going through my app. And now we have VoiceOver on.
We're going to home out of Settings, go to the app that we want to audit, and just start playing around to see what the experience is like for a VoiceOver user. Are all the things that I would expect to be visible, visible? Are the things that I would not want to be visible to a VoiceOver user hidden from them? Does every interaction with the interface have some sort of corollary that a VoiceOver user can take advantage of? Does the UI itself make sense, does the flow make sense? And are elements properly labeled? For instance, I would expect this upvote button to be labeled "Upvote", to tell me it currently has three votes, and to tell me that it's a button.
Now, we have a tool called the Accessibility Inspector that can also be used for auditing your apps. You can get to it by choosing Xcode > Open Developer Tool > Accessibility Inspector. What this tool allows you to do is look at specific elements on your screen and get their accessibility information, as well as run a full audit of the accessibility in your app.
Now, I'm not going to cover using this tool today, but there was a session at last year's WWDC that covered extensively how to take advantage of it and work it into your own workflow. So, if you're interested in that, I recommend that you check it out at this link. Instead, what I'd like to do now is demo auditing a sample app that I've been working on.
So, I'm going to come to my device here, and you can see I have this app down in the corner called Pre-Poster. This is an app I've been working on that allows me to share posts with a close group of friends before I post them to the wider social network, to make sure I have all my friends' approval of the things I'm posting.
I have things like upvote and downvote buttons so that friends can tell me whether a post is good or not, comments, and even star ratings so that I can rate how good a post is. And I have settings that allow me to set which social networks I want to automatically post to when the timer runs down. So, let's turn on VoiceOver and just see what we find. I'm going to triple-click the Home button, and as you can see, I've set my accessibility shortcut to include VoiceOver and Smart Invert, and I'm going to select VoiceOver.
- VoiceOver on Pre-Poster Tom McNeil, [inaudible] button, three button, button, two open posts.
- So, I'm swiping my finger with VoiceOver to navigate between elements onscreen, and I noticed a couple of things there. First of all, elements were out of order: I swiped from one label to the buttons to the second label. Neither label gave me any indication that I can actually tap on them to go into my own profile and see all my own posts. And all of those buttons were missing labels to tell me what they actually do and what they're meant to be. So, let's keep going.
- Lexi Torres five minutes remaining, just finished my set three comments, two button, three button.
- Now, those two buttons have their values because I'm storing the number of votes in each button's text label, so VoiceOver is able to read that. But again, they're unlabeled, so I have no idea what those buttons actually do, or what the three or the two is associated with.
I'm going to come down here and try to focus on the star view. What you're hearing is VoiceOver indicating to the user that there's no accessible element underneath my finger. That means that the star view is not accessible at all, so we're going to need to fix that.
- Button.
- Let's check out the settings page.
- Settings close button.
- That seems to be labeled properly.
- Facebook, Facebook.
- So, two things I notice here. First of all, the cell isn't indicating to me whether or not the setting is actually enabled. And secondly, when I did a double tap to activate it I would expect it to enable or disable the setting and that didn't happen either.
- Close.
- Now one other thing I noticed.
- Tom McNeil.
- Is that when I was swiping through items on my post it gave me no indication that I can actually delete the post. When I'm not using VoiceOver if I do a long press on one of these cells it brings up an action sheet to initiate a delete. But that interaction isn't clear to a VoiceOver user at all and they have no way of initiating it.
- Alert, Accessibility Shortcut, VoiceOver off.
- So, let's take a look at some of the basics of our API to solve these problems. Just to recap, this is what we found. First, we had elements that were out of order, and my profile elements didn't indicate that I could tap on them to get to my own profile, so it really makes sense to group them together as a single element that acts as a button.
Next, I have unlabeled buttons on the top, as well as the ones on the cell. The star view was entirely inaccessible, as was initiating the delete of a post. And finally, the settings didn't tell me whether or not they were enabled, and activating them didn't actually change the value.
So, how do we solve these? Well, when a technology like VoiceOver is trying to get information about an element in an app, that is, when a VoiceOver user focuses on an element on the screen, it's going to ask a couple of questions. Questions like: What are you? I'm a button. Who are you? I'm a Notifications button. What's your value? Three unread notifications. Now, all of these questions and answers correspond to properties in code and return specific kinds of values. So, let's take a look. Here we have the five basic properties in the accessibility protocol, and we're going to go through them one by one, looking at an example.
The first, isAccessibilityElement, indicates to a technology like VoiceOver or Switch Control whether the given element should be visible to it at all. So, if we're looking at the memory view up in the corner there, which has a slideshow of images within the memory, a label with the title, as well as a play button, that's a great candidate for making the overall element a button that plays the movie when I tap on it. So, I'm going to set isAccessibilityElement to true on that memory view.
But now, because of the way that I've coded this, it's possible that the play button is still going to come up as an accessible element to VoiceOver, and because I'm making the overall element the target instead, I want to set the play button's isAccessibilityElement to false so that it doesn't come up and we don't have redundancy there.
And one thing that I forgot to mention is that it's a good idea to make the big area the play button instead, because if I'm a VoiceOver user navigating around this UI by just feeling around the screen with my finger, it's much harder to find that tiny play button to play the movie than it is to find that big area.
Next, we have accessibilityLabel, which describes what an element is. So, for example, with our memory view: it's a memory and its title is February 18th, 2017, so that seems like a reasonable label. Now, because we're going to be treating it like a button, you might wonder why we don't add the word "button" into our label, and that's because of this next property, accessibilityTraits. This indicates to VoiceOver what category an element fits into. So, for example, we're going to set our memory view's accessibilityTraits to include the button trait so that VoiceOver knows to treat it like a button.
This also means that when it reads the description for the item, it's going to say "button" at the end so that the user knows what it is. Now, traits are also important for two more reasons. One, they can afford certain UI behaviors, which you'll see later in the presentation. And two, there are mechanisms for users to navigate the interface based on specific kinds of items, which are surfaced based on the traits associated with them.
Next, we have accessibilityValue, which is for elements that have some form of state. So, for instance, if we're in our memory video and we're editing, there's a scrubber along the bottom that allows me to scrub through the whole video. And the current value of that item is the second that I'm focused on, the number of seconds that have already elapsed. So, I can set its accessibility value to be that elapsed number of seconds.
Finally, we have accessibilityHint. This is meant to be a more longform description of how to interact with an element, what it does, or what it's meant for. You can think of it as a property that helps users who are new to your interface learn how to use it. But it's important that you never put critical information in a hint, because hints can actually be disabled to keep VoiceOver from being too verbose for experienced users.
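As a rough sketch of what those five properties might look like in code (MemoryView, its subviews, and the strings here are hypothetical stand-ins, not code from the session, and the trait spellings follow current Swift):

```swift
import UIKit

class MemoryView: UIView {
    let titleLabel = UILabel()
    let playButton = UIButton(type: .system)

    func configureAccessibility() {
        // Surface the whole memory view as one big, easy-to-find element.
        isAccessibilityElement = true
        // What it is.
        accessibilityLabel = "Memory, February 18th, 2017"
        // What category it fits into; VoiceOver appends "button" when speaking.
        accessibilityTraits = .button
        // Optional longer-form guidance. Hints can be disabled by experienced
        // users, so never put critical information here.
        accessibilityHint = "Double tap to play the movie."
        // For stateful elements, accessibilityValue would report the state,
        // e.g. a video scrubber might report "12 seconds".
        // Hide the tiny play button so VoiceOver doesn't surface it redundantly.
        playButton.isAccessibilityElement = false
    }
}
```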
So, going a little further. New this year, we have attributed versions of label, value, and hint. What these allow you to do is pass specific keys through to tweak the way in which VoiceOver speaks a string. So, for example, if I have a string that I know I want to speak that's in a different language, and it should be spoken with a voice synthesizer specific to that language, then I can use the UIAccessibilitySpeechAttributeLanguage key and pass in the correct language code so that when VoiceOver speaks it, it speaks it with, in this example, the French synthesizer.
As well, let's say I had an announcement that I wanted to post and speak, but I felt that the announcement isn't important enough to interrupt whatever speech is currently happening. In this case, I can use an attributed string with the UIAccessibilitySpeechAttributeQueueAnnouncement key, which will queue the speech behind whatever is already there so that it will be spoken once whatever VoiceOver was already speaking is done.
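A minimal sketch of both keys, using their current Swift spellings (the label, strings, and announcement here are hypothetical):

```swift
import UIKit

// Speak a French string with the French synthesizer.
let flagLabel = UILabel()
let phrase = NSMutableAttributedString(string: "Bonjour tout le monde")
phrase.addAttribute(.accessibilitySpeechLanguage,
                    value: "fr-FR",
                    range: NSRange(location: 0, length: phrase.length))
flagLabel.accessibilityAttributedLabel = phrase

// Queue an announcement behind whatever VoiceOver is already speaking,
// rather than interrupting it.
let announcement = NSAttributedString(
    string: "Upload finished",
    attributes: [.accessibilitySpeechQueueAnnouncement: true])
UIAccessibility.post(notification: .announcement, argument: announcement)
```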
Also new this year, we have the concept of containers. Previously, when a VoiceOver user would swipe into a container of some kind, they wouldn't really get an indication of what kind of thing they were swiping into, or what that content actually means. So, we're introducing accessibilityContainerType, where you can define this, and there are several different kinds of containers. The biggest one there is dataTable, which actually requires conformance to the UIAccessibilityContainerDataTable protocol and improves the way in which a VoiceOver user can navigate through your data table and understand where they are.
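A minimal sketch, assuming a hypothetical custom list view; plain containers just set a type, while the dataTable case additionally requires conforming to the UIAccessibilityContainerDataTable protocol (not shown here):

```swift
import UIKit

let resultsList = UIView()
// Tell VoiceOver what kind of container this is (.list, .dataTable, .landmark).
resultsList.accessibilityContainerType = .list
```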
So, everything that I just covered applies to views that you would use through UIKit. But what if your controls exist outside of UIKit, and you have some sort of non-view element that perhaps you've drawn using Core Graphics? Well, in that case we still need to surface those controls to VoiceOver, and so we're going to use UIAccessibilityElement to represent logical accessible regions of the screen, to basically stand in for those controls. On a UIAccessibilityElement you set all the properties that I covered before, like you would on any other view, and it's initialized with an accessibility container, which is the view that it's contained in.
And then, on the container, you override the accessibilityElements property, which tells VoiceOver which sub-elements within a view are the accessible elements. The order in which elements are returned in this array is the order in which VoiceOver will swipe through them. That seems like a great candidate for combining those profile elements into a single element. So, let's come back over to our sample app and take a look at implementing some of these basics.
Okay, so I'm in my home view controller, which is the main screen I'm on, and the first thing I want to do is fix the labels for those buttons up at the top. For the add button, I'm going to say "New post", because it creates a new post.
For the notifications button, I'm going to set its label to "Notifications" and its value to my current number of unread notifications. And the settings button I'm going to set to "Settings". Now I'm going to go take care of those other unlabeled buttons, which were my upvote and downvote buttons.
So, for my approve button I'm going to set its label to "Upvote", and for my disapprove button, "Downvote". Now, I also need to remember to change the accessibility value of these buttons whenever my number of approves or disapproves changes. So, in here I'm going to make a switch statement based on the number of approves: when it's one, I'm going to say "1 vote", and when it's more than one, I'm going to say the number of approves followed by "votes". And I'm going to do the same thing for the disapprove button.
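Here's a minimal sketch of that labeling pass; the outlet names and vote-count properties are guesses at the sample app's code, which isn't shown line for line, and the downvote button would get the same treatment:

```swift
import UIKit

class HomeViewController: UIViewController {
    @IBOutlet var addButton: UIButton!
    @IBOutlet var notificationsButton: UIButton!
    @IBOutlet var settingsButton: UIButton!
    @IBOutlet var approveButton: UIButton!

    var unreadNotifications = 3
    var numberOfApproves = 0 {
        didSet { updateVoteAccessibility() }   // keep the spoken value in sync
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        addButton.accessibilityLabel = "New post"
        notificationsButton.accessibilityLabel = "Notifications"
        notificationsButton.accessibilityValue = "\(unreadNotifications) unread notifications"
        settingsButton.accessibilityLabel = "Settings"
        approveButton.accessibilityLabel = "Upvote"
        updateVoteAccessibility()
    }

    func updateVoteAccessibility() {
        switch numberOfApproves {
        case 1:
            approveButton.accessibilityValue = "1 vote"
        default:
            approveButton.accessibilityValue = "\(numberOfApproves) votes"
        }
    }
}
```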
Now, let's go to the star view, which was previously inaccessible, and make it an accessible element. The first thing we're going to do is override isAccessibilityElement to return true so that VoiceOver knows this is an element it should focus on. I'm going to set its label to "Star rating", because that's what it is. And then for its value, based on the number of stars, we're going to return either "No rating", "1 star", or the current number of stars.
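A sketch of those overrides (StarsView is a hypothetical stand-in for the sample app's class):

```swift
import UIKit

class StarsView: UIView {
    var numberOfStars = 0

    // Tell VoiceOver this custom-drawn view is an element it can focus.
    override var isAccessibilityElement: Bool {
        get { return true }
        set { }
    }

    override var accessibilityLabel: String? {
        get { return "Star rating" }
        set { }
    }

    // Report the current rating as the element's value.
    override var accessibilityValue: String? {
        get {
            switch numberOfStars {
            case 0:  return "No rating"
            case 1:  return "1 star"
            default: return "\(numberOfStars) stars"
            }
        }
        set { }
    }
}
```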
So now, let's come back to the home view controller, where we had those labels that we wanted to join together into a single element that acts like a button. What we're going to do for that is a couple of things. We're going to create a UIAccessibilityElement whose container is the header view, and then we're going to set its label to my profile name, its value to my number of open posts, and its traits to include the button trait so the user knows that they can interact with it like a button.
And then I'm going to find its frame and set the property accessibilityFrameInContainerSpace. Now, this is important for a couple of reasons. It tells VoiceOver where the element is onscreen, in the coordinate space of its container, which is used to draw the bounding box; and when a VoiceOver user is feeling around the screen for elements, VoiceOver uses this property to detect whether or not an element can be found underneath their finger. Finally, on our header view, we're going to set accessibilityElements to contain our new element plus those three buttons that we had already.
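A rough sketch of that grouping, assuming hypothetical names for the header's subviews:

```swift
import UIKit

func configureHeaderAccessibility(headerView: UIView,
                                  nameLabel: UILabel,
                                  postsLabel: UILabel,
                                  buttons: [UIButton]) {
    let profileElement = UIAccessibilityElement(accessibilityContainer: headerView)
    profileElement.accessibilityLabel = nameLabel.text
    profileElement.accessibilityValue = postsLabel.text
    profileElement.accessibilityTraits = .button
    // VoiceOver uses this frame (in the container's coordinate space) both to
    // draw the focus rectangle and to hit-test a finger moving over the screen.
    profileElement.accessibilityFrameInContainerSpace =
        nameLabel.frame.union(postsLabel.frame)
    // The order of this array is the order VoiceOver swipes through.
    var elements: [Any] = [profileElement]
    elements.append(contentsOf: buttons)
    headerView.accessibilityElements = elements
}
```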
Last but not least, I want to come to the switch cell, make the whole cell an accessibility element, and for its value, based on whether or not the switch is currently on, say "On" or "Off". So, let me build that to my device.
- VoiceOver on. Mail, no unread emails. Double tap to open.
- Sorry, give me one sec.
- Alert, selected.
- Okay, well, thankfully I have a premade version of this that already has the accessibility work in it, so we're going to launch that and I'm going to turn on VoiceOver.
- VoiceOver on Pre-Poster EX Tom McNeil two open posts button.
- And the first thing that you see is that I have joined those elements together and that it's spoken that it's a button.
- Tap the add button, notifications three, button settings button.
- All those buttons have proper labels.
- Down vote two button, up vote three button.
- Those have proper labels as well.
- Page two of four. Star rating, no rating, adjustable.
- I can focus on the star, sorry, I can focus on the star rating view now. If I go into my settings.
- Settings, setting, Facebook off.
- I'm finally actually hearing whether or not my setting is currently on or off. Now, there are a couple of problems that we found during our audit that we still haven't fixed. I still have no way of bringing up the delete sheet. I have no way of actually changing the value of the star rating view; I can see what the current number of stars is, but I have no way of changing it. And when I double tap on my setting, it's still not changing the setting's value.
So, the first thing I want to look at is custom actions. We're going to use these to bring up that delete sheet in our sample app. Basically, a custom action is an action that you add to an element, one that the element has available to it.
So, views have the property accessibilityCustomActions, which you can override with an array of UIAccessibilityCustomAction objects representing all of the actions available. So, what does an action look like? Well, it has a name; for instance, our delete action is called "Delete" because that tells the user what it does. And it has a target and a selector that get called whenever the action is initiated.
So, when a view has custom actions, VoiceOver is going to tell that to the user, and then the user can swipe up or down with their finger to cycle through the current actions of that element. And when they land on the one they want to activate, a double tap will initiate that action instead of doing whatever the default behavior would be.
So, for instance, with our delete, we can create a custom action called "Delete", setting ourselves as the target, with our delete-cell action method that returns a Boolean and calls the code for actually initiating the delete of the cell. On our cell, we're going to override accessibilityCustomActions to contain this custom action.
Now, because we're overriding it on the cell, and the cell itself isn't an accessibility element but all of its sub-elements are, all of those sub-elements are going to inherit the custom actions of their parent. So, I can actually initiate this delete action from any of the views on the cell.
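A sketch of that custom action (PostCell and deleteCell(_:) are hypothetical names standing in for the sample app's code):

```swift
import UIKit

class PostCell: UITableViewCell {
    func configureCustomActions() {
        let delete = UIAccessibilityCustomAction(
            name: "Delete",
            target: self,
            selector: #selector(deleteCell(_:)))
        // Sub-elements inherit the cell's custom actions, so "Delete" is
        // available no matter which element on the cell has VoiceOver focus.
        accessibilityCustomActions = [delete]
    }

    @objc func deleteCell(_ action: UIAccessibilityCustomAction) -> Bool {
        // ...call whatever code actually initiates the delete here...
        return true
    }
}
```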
Next, we have the default activation. What if we want to override the default behavior when a VoiceOver user double taps on an element? That's what we're going to use to fix the problem of actually changing our settings. What you can do is override accessibilityActivate, which returns a Boolean indicating whether or not the activation was successful. So, for example, with our switch cells, we can override accessibilityActivate and set whether the switch is on based on its current on value. And we're always going to return true, because we know that action is always going to be successful.
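A sketch of that override (SwitchCell is a hypothetical name; this also reports the on/off state from the earlier audit fix):

```swift
import UIKit

class SwitchCell: UITableViewCell {
    let toggle = UISwitch()

    // A VoiceOver double tap on the cell flips the switch directly.
    override func accessibilityActivate() -> Bool {
        toggle.setOn(!toggle.isOn, animated: true)
        // Flipping the switch always succeeds, so always report success.
        return true
    }

    override var accessibilityValue: String? {
        get { return toggle.isOn ? "On" : "Off" }
        set { }
    }
}
```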
Next, we have adjustable elements. Our star rating is adjustable, and VoiceOver has a mechanism for adjustable elements to respond to increment and decrement calls. The first thing we want to do is return in that element's traits that it is an adjustable element. That tells VoiceOver the element is going to respond to the accessibilityIncrement and accessibilityDecrement calls, which we can override to do whatever behavior we need when we increment and decrement. A user adjusts an adjustable element by swiping up and down with their finger: up is an increment, down is a decrement.
So, with our stars view, we have a number-of-stars property that, whenever it's set, runs code that actually adjusts the view itself. In its traits, we're going to include the adjustable trait. In increment, we're simply going to increase the number of stars by one, and in decrement, decrease it by one. And now that element is fully accessible to our VoiceOver users.
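Continuing the hypothetical StarsView sketch, with an assumed five-star maximum:

```swift
import UIKit

class AdjustableStarsView: UIView {
    var numberOfStars = 0 {
        didSet { setNeedsDisplay() }   // redraw whenever the rating changes
    }

    // Advertise that swiping up and down will adjust this element.
    override var accessibilityTraits: UIAccessibilityTraits {
        get { return .adjustable }
        set { }
    }

    override func accessibilityIncrement() {
        numberOfStars = min(numberOfStars + 1, 5)
    }

    override func accessibilityDecrement() {
        numberOfStars = max(numberOfStars - 1, 0)
    }
}
```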
Now, I'd like to look at a different problem that we didn't see in our sample app. VoiceOver has this concept of pass-through, which happens when a VoiceOver user double taps and holds their finger: it passes through the panning gesture, so I can actually run my finger across the screen and get a finer adjustment of a slider, like you see in this Photos page here.
But we want a way to tell VoiceOver where it should actually focus that panning when I initiate it, because if I'm a VoiceOver user, then I'm probably not going to be able to accurately land my finger on that little sliding nub in the center, right? I don't actually know where it is.
So, we're going to use accessibilityActivationPoint, which is the point that tells VoiceOver, when a pass-through occurs, what part of the UI to actually hook into. So, for instance, if we have our slider view with some sort of slider nub, we can override accessibilityActivationPoint to be the center of that nub so that whenever the user performs the pass-through gesture, it will hook into the center of the nub and start sliding there.
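A sketch of that override; SliderView and thumbView are hypothetical names, and note that the activation point is expressed in screen coordinates, hence the conversion:

```swift
import UIKit

class SliderView: UIView {
    let thumbView = UIView()

    override var accessibilityActivationPoint: CGPoint {
        get {
            // Convert the thumb's frame to screen coordinates and aim the
            // pass-through gesture at its center.
            let screenFrame = UIAccessibility.convertToScreenCoordinates(
                thumbView.frame, in: self)
            return CGPoint(x: screenFrame.midX, y: screenFrame.midY)
        }
        set { }
    }
}
```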
Next, we have custom scrolling. You may have views with swipe gestures on them that take you to different parts of the UI. For instance, in Photos, if I swipe up, I get to my details. And there's a way that we can get this behavior for VoiceOver users as well.
It's called accessibilityScroll, and it's initiated by scrolling with three fingers when VoiceOver is on. It returns a Boolean to indicate whether or not the scroll was successful, and it passes in a direction so that you can change the behavior based on which direction the user is scrolling.
So, for example, with our image view, if we wanted to show details like we do in the Photos app, we'd have some property that tells us whether or not we can currently show the details, and code that actually goes through and shows them. What we're going to do is override accessibilityScroll, check whether we can show our details and whether we're scrolling in the right direction, and if so, show our details and return true; otherwise, return false. And when I return false, VoiceOver is going to make a bonking noise to indicate to the user that the swipe they just did was not a valid swipe.
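A sketch of that handler (canShowDetails and showDetails() are hypothetical stand-ins for the photo view's code):

```swift
import UIKit

class PhotoView: UIImageView {
    var canShowDetails = true

    func showDetails() {
        // ...present the details UI...
    }

    override func accessibilityScroll(
        _ direction: UIAccessibilityScrollDirection) -> Bool {
        if canShowDetails && direction == .up {
            showDetails()
            return true
        }
        // Returning false plays the "bonk" sound so the user knows the
        // gesture didn't do anything here.
        return false
    }
}
```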
Now, I'm not going to go through actually adding all of those things to our sample app, but they are in the finished version of the sample, which you can find and download at this URL. I recommend that you go check it out, see how that code is implemented and how it works in action, and play around with it yourself. Instead, I'd like to spend a little bit of time talking about drag and drop.
So, for those of you that missed it at the keynote, we unveiled drag and drop finally coming to iOS, where I can drag pieces of content and drop them in other places in the interface. For instance, if I have Photos open next to Notes, I can drag a photo and drop it into one of my notes. Now, because this was just unveiled two days ago, I don't really expect all of you to already understand how that API works.
But there are just two key concepts that you need to understand before I talk about the accessibility, and those are drag sources and drop points. Drag sources are all the points on my view that I can initiate a drag from, and drop points, conversely, are all the places on my view where I can drop content.
So, for instance, on the Home screen, which now uses drag and drop for rearranging icons, each app has three drop points when I'm moving another app over it. The first is the left drop point, which drops the app I'm currently moving to the left and shifts everything else to the right. The center drop point creates a folder containing the two apps. And the right drop point drops my current app to the right.
And we need a way of describing those different drag sources and drop points to our VoiceOver users. To do that, we're going to use the new APIs accessibilityDragSourceDescriptors and accessibilityDropPointDescriptors. They're arrays of UIAccessibilityLocationDescriptor objects that describe the drags and the drops for your view. So, we create a UIAccessibilityLocationDescriptor for each drag and drop.
The view is the view that has the drag or drop interaction associated with it. The point is the point where the drag is initiated from, or that the drop is associated with. And the name is meant to describe what that drag or drop does. There's even an attributed name, where you can use all those attributed keys that I was referring to earlier to tweak the way VoiceOver describes the name of the descriptor. So, let's take a look at how this works in code. Now, this is just a hypothetical example, this isn't actual SpringBoard code. But let's say that I was implementing rearranging apps, and I have a method that returns an app view for a given index path.
Well, I'm going to start by initializing that app view, and then I'm going to create a drag descriptor and give it the name "Drag" plus whatever the app name is, because that's what it does. The point where the app is dragged from is its center, and the view that actually has that drag interaction is the app view itself.
Then I'm going to override the app view's accessibilityDragSourceDescriptors to include that new drag descriptor. Now let's look at the drops. First of all, we need to identify the points on my app view where I actually want the drops: the left and the right points; I know where the center is already. And I'm going to create a descriptor for each of those. For the leftDescriptor I'm going to say "Drop left of" the app, for the folderDescriptor "Create a folder with" the app, and for the rightDescriptor "Drop right of" the app.
And then I'm going to override accessibilityDropPointDescriptors to include all those new descriptors, and I'll return my app view. And that's it, pretty simple, but important for conveying to the VoiceOver user what each of your drag and drop interactions does. So, if you're going to implement this feature, please take the time to do the accessibility for it as well.
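Putting that walkthrough together as a sketch (again, this is not actual SpringBoard code; the sizes and names are illustrative):

```swift
import UIKit

func accessibleAppView(named appName: String) -> UIView {
    let appView = UIView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))

    // One drag source: the app is dragged from its center.
    appView.accessibilityDragSourceDescriptors = [
        UIAccessibilityLocationDescriptor(
            name: "Drag \(appName)",
            point: CGPoint(x: appView.bounds.midX, y: appView.bounds.midY),
            in: appView)
    ]

    // Three drop points: left, center (create a folder), and right.
    let midY = appView.bounds.midY
    appView.accessibilityDropPointDescriptors = [
        UIAccessibilityLocationDescriptor(
            name: "Drop left of \(appName)",
            point: CGPoint(x: appView.bounds.minX, y: midY), in: appView),
        UIAccessibilityLocationDescriptor(
            name: "Create folder with \(appName)",
            point: CGPoint(x: appView.bounds.midX, y: midY), in: appView),
        UIAccessibilityLocationDescriptor(
            name: "Drop right of \(appName)",
            point: CGPoint(x: appView.bounds.maxX, y: midY), in: appView)
    ]
    return appView
}
```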
Now, I talked about a lot of stuff today: we looked at some new features, we looked at auditing your own app for accessibility, and we looked at various portions of our API to solve the problems that you find while auditing. And that's all great technical stuff, but I want to bring it back to the beginning to remind you why we're all actually in this room.
Accessibility is about people. The average person's life is enhanced by technology, but for people with disabilities, it's transformative. Accessible technology enables people to live their lives the way that they want to, to accomplish goals and tasks that were once unattainable, to have jobs and roles in their communities that they couldn't have had before, and to interact with the world in ways that were once impossible. The work that you do to ensure that your apps work for everyone is critical. So, we thank you for the amazing work that you've already done, that you will continue to do, and that you will start today as we work together to make Apple's platforms more inclusive.
Now, for more information, you can go to this URL, where we've collected some links that are helpful for you, like developer documentation, as well as our accessibility website. And there are a couple of related sessions: the three sessions related to Dynamic Type that I covered before, and a media and gaming accessibility talk, specifically for apps related to media and gaming, that's in this room right after I'm done. Thank you very much, have a great rest of your conference.