Mac • 1:09:40
Mac users rely on the accessibility of your application in a growing number of ways, whether using VoiceOver, using AppleScript and Automator to build custom workflows, or using any one of the many third-party assistive applications. Discover how to make your custom Cocoa views accessible to all of these users by implementing the NSAccessibility protocol, including many tips and best practices. Also learn about Snow Leopard accessibility additions and how to use them for your custom views.
Speaker: James Dempsey
Unlisted on Apple Developer site
Downloads from Apple
Transcript
This transcript has potential transcription errors. We are working on an improved version.
Good afternoon, my name is James Dempsey. I am an engineer on the Cocoa Frameworks team and I focus on application accessibility. So wow, what a week, I have had a tremendous week. Just by show of applause, what kind of week have you had here at WWDC 2009?
[ Applause ]
And just 5 days of tons and tons of information and it is 2 o'clock on Friday and you are all here, tell me more things, fill my brain more.
That is a long plane ride home, I need more to think about. So excellent, thank you for coming out to Advanced Accessibility. So what are we going to do today? First, this is marked as an expert or an advanced session, so as I am talking, I am assuming that you have some familiarity with application accessibility on Mac OS X: things like the accessibility hierarchy and UI elements, tools like Accessibility Inspector, and the VoiceOver feature on Mac OS X. Let's talk about what we are going to talk about. So first, we are going to walk through some of the larger things that are new in Snow Leopard, things we have added.
Then we are going to do a very brief, technical review of application accessibility, because it is hard to talk about advanced things if the conceptual model that we have in our head doesn't quite match reality, and then we are going to spend the bulk of this session on making or talking about making custom views accessible. And for our example, we are going to take the venerable Cocoa sample application Sketch and we are going to make the graphics view in Sketch accessible using some new accessibility roles and attributes that we have added in Snow Leopard.
It is a very nice example in that it lets us focus on some design issues, some of which I don't think we have covered in previous WWDC sessions, and walk through a recommended process, kind of step by step, of how we tend to make views accessible at Apple, and finally we will talk a little bit about some troubleshooting.
So what is new in Snow Leopard? One of the things we have added in Snow Leopard is a couple of suites of new accessibility roles and subroles, very much focused on the things that you might find on the web or in office-style applications, like spreadsheets or presentations, or even the page layout portions of a word processor.
And so we have added support for cell-based tables; there is a new AXCell role, and it is able to convey information about things like column and row spans. We have added some additional parameterized attributes to make it very simple for an assistive application to move through tables that are cell based, without necessarily having to read in everything to figure out the column and row spans and how to navigate. And there are some new cell-related notifications and attributes, so you can find out when the selected cells change and the like. These are in use, actually; you can see them in action in iWork '09.
They are also present in the latest version of Safari and WebKit. And then layout areas, these attributes and roles are designed for use in any case where you have a canvas like area, a page layout area, anytime you have those rectangles with the resizing handles, and you can change the layout of things. That is where this set of roles and attributes is designed to be used.
Now that is exactly what we are going to be using to make Sketch accessible, so I won't dwell on that, you will see more details when we get to Sketch. And then we have also added some roles for level indicators and the various types of level indicators and content lists.
And those are roles for content, as opposed to user interface that you might interact with as a list, like maybe a single-column table; a content list is more like a bulleted list on a webpage. It is not something that is a control that the user can necessarily interact with, but we still want to call out that additional structure in the content. We have also added some new functionality in Snow Leopard.
The first is NSImage accessibility descriptions, where you can very easily take an NSImage and just set the accessibility description string on the image itself. When that image gets used in a button cell or an image cell or a segmented control segment, Cocoa will automatically pick up that accessibility description and report it for you. This was talked about in greater detail in the session earlier this week.
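In code, a minimal sketch of that looks something like this; the image name, the description string, and the button are placeholders, not anything from Sketch:

```objc
#import <Cocoa/Cocoa.h>

// Snow Leopard (10.6) and later: attach the description to the image itself.
NSImage *gearImage = [NSImage imageNamed:@"Gear"];
[gearImage setAccessibilityDescription:
    NSLocalizedString(@"Settings", @"accessibility description for the gear image")];

// Any button cell, image cell, or segmented control segment that displays this
// image now reports the description automatically.
[settingsButton setImage:gearImage];   // settingsButton is a hypothetical NSButton
```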
We have added support so NSMenu items can now have their accessibility description overridden, so if your menus happen to be just a graphic with no text, it is very simple now to add a description to those. We have added-- or I should say we have exposed-- three methods in Snow Leopard that are optional methods in the accessibility protocol, and these all deal with attributes of an object that are array attributes, attributes that return many items.
Of course in our accessibility hierarchy, the most important of those is AXChildren. And by default when an accessibility client asks us for the count of one of these attributes or for a sub-set of these attributes or sub-range of these attributes, by default, AppKit will get the list of all of them and then perform an array operation on them.
Now sometimes, especially when you have many children, you may have a better way of doing that, where you don't have to necessarily create every single accessibility object to return the right answer. If that is the case in your code, then this is when you would use these methods. The other nice thing is that these methods actually will work all the way back to, no not even Panther, Jaguar. So implementing them today for Snow Leopard, you will still get that performance benefit even back on previous versions of Mac OS X.
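As a rough sketch of what those overrides can look like, assuming a custom element whose children are backed by a cheap model array (the _items ivar and the child-building helper are hypothetical):

```objc
// Answer the count of an array attribute without building every child element.
- (NSUInteger)accessibilityArrayAttributeCount:(NSString *)attribute {
    if ([attribute isEqualToString:NSAccessibilityChildrenAttribute]) {
        return [_items count];
    }
    return [super accessibilityArrayAttributeCount:attribute];
}

// Answer a subrange of an array attribute, creating only the elements asked for.
- (NSArray *)accessibilityArrayAttributeValues:(NSString *)attribute
                                         index:(NSUInteger)index
                                      maxCount:(NSUInteger)maxCount {
    if ([attribute isEqualToString:NSAccessibilityChildrenAttribute]) {
        if (index >= [_items count]) return [NSArray array];
        NSUInteger count = MIN(maxCount, [_items count] - index);
        NSMutableArray *values = [NSMutableArray arrayWithCapacity:count];
        for (NSUInteger i = index; i < index + count; i++) {
            [values addObject:[self childElementAtIndex:i]];   // hypothetical helper
        }
        return values;
    }
    return [super accessibilityArrayAttributeValues:attribute index:index maxCount:maxCount];
}
```

The third method, accessibilityIndexOfChild:, follows the same pattern: answer the index of a given child without building the whole children array.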
And there are many more kinds of smaller changes or improvements that we have made in Snow Leopard, and they are noted in the AppKit release notes for 10.6. OK, that said, let's do a brief review of the architecture of application accessibility on the Mac. It is a client-server architecture, essentially: an assistive application or process like VoiceOver launches, it uses accessibility client APIs, public APIs on Mac OS X called the AX APIs, that is the prefix that they use, and it begins sending accessibility requests to your process. And your process responds to them; it is basically a nice little client-server setup.
But what is every application on the system vending? Well apps vend a hierarchy of UI elements and so each node in the tree represents some piece of the user interface in your application. Note that the root of the tree is the application itself, and that is the exception to the rule.
It is not a visual thing; you can't see the application. But except for that root, every other node in the tree represents some visual element in your application, something that is taking up some degree of screen real estate. Now each of these nodes is very powerful in that an accessibility client can ask for information about each of these UI elements through attributes, and in this case, one very important and useful attribute is the role attribute.
And underneath each of these element refs, which are the lightweight representations on the client side, you will note something like AXApplication or AXWindow; the role is how an accessibility client can tell one UI element from another. And that is an example of an attribute that can be requested by a client about an element.
There are also actions, so for instance an accessibility client could tell that AXButton element to press, and that request would come in and the button would actually press in the application itself. And then finally, an accessibility client can register to receive notifications about any particular node in that tree, to find out things like when the value of a text field changes, or when a window moves or resizes.
Now how does that information get handed back to the accessibility client? Every node in the tree that was reported to an accessibility client has a one-to-one mapping to some object in a Cocoa application. These are all possibly self-evident: the AXApplication is backed by the NSApplication singleton instance in your app, and every AXWindow is backed by an NSWindow.
And when a request comes in about that node in the tree that request gets routed to an object in your Cocoa application, and a method is called to ask that object for information. And we have batched all those methods up into a protocol called the NSAccessibility protocol. So again, very client-servery, the client makes a request, but the request gets routed to the right Cocoa object, which implements a set of methods and returns from that method the right answer for that request.
Now let's just take a brief look at that protocol; we will be seeing it in code much later. There is a set of methods for getting and setting attribute values. We also have some fancier, parameterized attributes, which take a parameter, for things like text, cell-based tables, and layout areas; sometimes you want to have the client provide some information to get a more specific answer. And there are methods for performing actions.
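Roughly, the shape of those methods looks like this; a sketch for orientation, not the actual NSAccessibility.h declarations:

```objc
#import <Cocoa/Cocoa.h>

@interface MyAccessibleElement : NSObject

// Attributes: what can be asked about this element, and their values.
- (NSArray *)accessibilityAttributeNames;
- (id)accessibilityAttributeValue:(NSString *)attribute;
- (BOOL)accessibilityIsAttributeSettable:(NSString *)attribute;
- (void)accessibilitySetValue:(id)value forAttribute:(NSString *)attribute;

// Parameterized attributes: the client supplies a parameter with the question.
- (NSArray *)accessibilityParameterizedAttributeNames;
- (id)accessibilityAttributeValue:(NSString *)attribute forParameter:(id)parameter;

// Actions: things a client can ask this element to do, such as press.
- (NSArray *)accessibilityActionNames;
- (NSString *)accessibilityActionDescription:(NSString *)action;
- (void)accessibilityPerformAction:(NSString *)action;

@end
```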
And then in addition, although it is not part of the protocol, there is a convenience function for posting a notification, so your custom Cocoa class can post an accessibility notification. Now how does the client get hold of this information in your app? Well, it can do so top down: the client can just say, Hey, for this process, give me the root element. And a representation of that NSApplication object will get sent back as a UI element ref, and then from there a very useful attribute is the AXChildren attribute.
And so when that accessibility client asks for the children, a request will come into NSApplication, call the right method, and NSApplication will send back an array of Cocoa objects, in this case probably an NSMenu that is the top-level menu and a bunch of NSWindows that are the children of the application. And calling this recursively, an accessibility client is able to create a whole tree that maps to your application's user interface.
Now in addition to top down, a client can also go bottom up, and they can do so by hit testing, handing in a point on the screen and asking the app, Tell me what UI element is there. And Cocoa will do the hit test, identify the object and return a UI element ref for that object and then using the AXParent attribute, the accessibility client is able to navigate back up the tree.
Now the reason I am spending so much time on this hierarchy is that those two attributes, AXChildren and AXParent, and getting that hierarchy correct, is really one of the central tasks in making your custom view accessible. Now to assist in that hit testing, there is another method in the protocol, accessibilityHitTest:. We also can tell the assistive app what the focused UI element is, who has the keyboard focus, so there is a method to help with that. And then there is a method to say whether an object is ignored by accessibility or not.
And that bears mentioning for a moment. You may notice that in Cocoa, we actually have a much more complicated view hierarchy, even in the simple window-with-one-button example, than we are reporting to accessibility. That window has a private frame view and a public content view and an NSButton view or control, which has a button cell, and yet we are only reporting three things to accessibility.
And that is mainly because, just as a user looking at the screen would say, there is a window with a button in it; that is pretty much the structure we would want to report to accessibility. And so to allow for this reduction of complexity, any piece of that view hierarchy, actually any Cocoa object, can be ignored by accessibility. Ignored is not the same as hidden: when you hide something, it and all its subviews disappear from view. When you ignore something, that tells accessibility to pass through to the first un-ignored child or set of un-ignored children.
And ignored is very important; it is the first thing we are going to look at in Sketch, because NSView, the class itself, if you put one into a window, is not terribly interesting to a user. It draws its little grey background and that is about it.
And so by default the NSView class is ignored by accessibility; when you create a custom view, you need to make it un-ignored by accessibility. And so if we take a look at Sketch as it has existed for a long time now, if we look at it from an accessibility standpoint, there is an application, the Sketch app, which has a window. There is actually a scroll view that reports itself to accessibility.
It is un-ignored by default, but then inside that scroll view, there is nothing. It is blank, because that custom SKTGraphicView is ignored. So let's jump in right off the bat and address that; the first step is making our custom view show up in accessibility. So let's head over to the demo. All right, and first let's just build and run.
And I will open up a sample document here, and looking at this with Accessibility Inspector, you will notice that, OK, we see this window is reporting itself, but if we move the mouse in here, it reports there is a scroll area, but nothing inside of it. None of the graphics area shows up, none of these shapes show up; it's pretty much, from an accessibility standpoint, just a dead area. We don't know what is going on in there. The first step is to make it un-ignored, so let's quit and do that.
Again it is SKTGraphicView, which is the view that is drawing that graphics area, and you will notice it is just a straight subclass of NSView, so it inherits the ignoredness. What I have done is added a category on SKTGraphicView for accessibility; very commonly when you are doing accessibility, you will put all of the accessibility methods and overrides in a category, just so you have them all in one spot. And all I am going to do, very simply, is override accessibilityIsIgnored to return NO.
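The whole change is a one-method category, something like this sketch (the category name is just a convention):

```objc
#import <Cocoa/Cocoa.h>
#import "SKTGraphicView.h"

@implementation SKTGraphicView (Accessibility)

// NSView is ignored by default; our custom subclass should show up.
- (BOOL)accessibilityIsIgnored {
    return NO;
}

@end
```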
We will build and run. I hit the wrong thing there. And when I do that, now when I mouse over, there is something inside of that scroll area. It is unknown, but there is something there. That is progress. Why is it unknown? Well, NSView has no idea what it is going to be subclassed to be turned into, so all it knows at this point is: I am a view, here I am, but I don't know what I am supposed to be doing. We are going to set the proper role next, because it is pretty darn easy.
So let's quit, I will hide that. So again, we are already getting some information from NSView, we are already getting some role; we just want to override the existing information. And so we are going to override the method accessibilityAttributeValue:, which is actually where a great deal of your accessibility customization happens, in overrides or implementations of this method.
And if we are asked for our role attribute, we are going to return the layout area role, the new role that covers these areas where people do layouts. And if they are asking about some other attribute value, we will just let the superclass, NSView, handle that.
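That override, which lives in the same accessibility category as the sketch above, looks roughly like this; NSAccessibilityLayoutAreaRole is one of the new Snow Leopard roles:

```objc
- (id)accessibilityAttributeValue:(NSString *)attribute {
    if ([attribute isEqualToString:NSAccessibilityRoleAttribute]) {
        return NSAccessibilityLayoutAreaRole;
    }
    // Everything else (parent, window, position, size, ...) comes from NSView.
    return [super accessibilityAttributeValue:attribute];
}
```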
So let's give that a run. I will get rid of that, and now you notice our role is reporting itself correctly, and in addition, the role description has changed from unknown to layout area. That is because NSView's implementation is dynamic: it is going to ask itself for its own role and then provide the correct system-provided localized string for the role description. I would like to point out one other thing: we wrote what, one line of code, two lines of code, yet we are getting a lot of accessibility information that is already implemented in NSView.
You will notice that it already knows who our parent is. It already knows our window top-level element. If we focus on it, it is giving us back the correct highlighted rectangle for its size and position. So we get a lot free from NSView, we just have to tweak a little in this case. So that is a start, let's jump back to slides and talk about making this more accessible. Let's go. So making custom views accessible-- that really is the meat and potatoes of doing accessibility in your applications.
In the talk earlier this week, there were a lot of relatively easy things to do to improve accessibility that have a giant effect in your application, like adding descriptions to images and making sure that text fields are linked up to, say, their title UI element to give a lot of context to what is going on in the user interface. Those are relatively simple and have a great effect. But almost every application has some meat-and-potatoes view, some reason why you wrote the app in the first place, and very often, those are custom views.
Sketch is a good example, the reason we wrote the app was to have this place where we could create and manipulate graphics. Although these views across many applications are very different, the process that we go through to make a view accessible is actually very similar every single time. It really breaks down to three different phases. There is a design phase, where we figure out the basic structure.
Before we start doing things, we have to figure out, well, what do we actually want this to look like to accessibility? What hierarchy do we want to be reporting? What nodes should be in the tree, and what should the roles be of those nodes? And then once we figure out what we are actually trying to provide to accessibility, the next step is taking a look at how we want to provide it: take a look at the objects and the object hierarchy that we are using in our application, and map between the two.
What views need to be ignored? What views do we need to do a little work in? Once we get that basic structure and design, we should build it out, and phase 2 for me is: get this all able to be inspected and working in Accessibility Inspector. We want to have the parent and child relationships of that tree working correctly. We want to be able to hit test with the mouse, and we also want the size and position working.
Now to me this second phase is really the-- I don't even know what the word is, it is the crux of making your app accessible. It is the crest of the hill, if you will. And then the last phase, once you have that basic structure in place, it really is a matter of filling in the remaining pieces.
We have gotten the fundamental things like the role and the hierarchy and the geometry of each element done. Now we need to report whether it is enabled or not, or we need to report a description on it. We need to implement actions, focus handling, and any notifications that might need to be sent. Now this afternoon, we are going to be focusing on phase 1 and phase 2, but the Sketch+Accessibility sample code associated with this session will give you sample code that does all three phases.
So part of determining what you want in that hierarchy is determining roles and sub-roles and as we saw just a moment ago, it is programmatically simple to change a role, you just return a different string constant. The difficulty sometimes or the thing that takes more time is figuring out what is the appropriate role for this situation.
Note that what role you pick very much drives all the other attributes, actions, and notifications that are needed for that element, because all of those are very role-specific. Now sometimes picking roles is pretty straightforward. There is a very obvious mapping between roles and subroles and your UI elements.
In this case, we have an image, we have some static text, we have a button, and we have very straightforward mappings to existing roles. But sometimes it is not quite so straightforward, so here is a control, a custom track ball control. This is a little movie; we will see it in action.
This is a piece of sample code you can download as well, and this track ball has three degrees of freedom, and we don't have an AX track ball role; in fact, very often as people innovate with their user interfaces and come up with cool new little widgets, there isn't an exact role that matches.
So what do we do in that case? Well if there is not a clear mapping, we would like you very much to take a look at the existing roles and find some way to expose the functionality of that widget through accessibility. So in this case, we have three degrees of freedom, we might make that whole thing a group with three sliders, with each slider maybe going through the range of values for the X, Y and Z axes.
Again it doesn't map exactly to what it looks like, but we wouldn't be able to provide that with the existing set of roles anyway. But at least a VoiceOver user is able to get to that control and manipulate it, and get the same functionality that a sighted user would have. And in general, as we talk about roles, what do you do when there isn't an exact fit? Well again, don't let a perfect fit be the enemy of implementing accessibility.
The first thing is, yes, you should try to map the visual parts as best as possible. The reason for this is that very often a VoiceOver user is working in collaboration with a sighted user, and they are going to talk about this button or that scroll area, or what have you, and having it all mapped visually helps a lot in that collaboration. But when that is not possible, expose the functionality.
The second thing is to ask. If you are making your application accessible, I highly recommend that you sign up for the accessibility-dev mailing list, and if you hit one of these issues where you are not sure how something should be made accessible, ask, because often it is the first time you are making something accessible, but the folks on the list have done this numerous times and might be able to help you avoid some pitfalls.
And then finally, especially if you feel this is a common UI element and there really should be a role for it, file an enhancement request in Bug Reporter, and please include a screenshot, because occasionally we will get a request and we just have no idea what kind of UI element you are trying to describe. There is one other thing to note about roles, and that is using an alternate role description.
So the role is how an assistive app like VoiceOver finds out programmatically what kind of element this is, but the role description is a localized string that is read by VoiceOver to the user, and so occasionally, very sparingly, you might want to use an alternate role description on an element. Now the kind of canonical example here is that tabs in a tab view behave a lot like radio buttons: you can only have one selected at a time, and when you select one, another one deselects.
And so for accessibility, these are all reported as AXRadioButton elements. That is their role; however, in a collaborative environment, you have a VoiceOver user talking about radio buttons and a sighted user collaborating and talking about tabs, and it gets a little confusing. In that case, in Snow Leopard, tabs report their role description as tab, not radio button.
But why should we do it sparingly? Well, every single time a VoiceOver user encounters a new role description, they need to work out how they interact with this new user interface element, and certainly if there are thousands upon thousands of role descriptions, and every time you move that VoiceOver cursor it is something you have never heard of before and you have no idea how to use, that is going to be a very frustrating experience. So the smaller the number of role descriptions, the easier it is to learn the system.
But again there are these cases like this tab case, where it makes a lot of sense to provide an alternate role description. OK that said, what are we going to do in Sketch? Well Sketch works out really nicely with the new roles we have defined in Snow Leopard. The main view area will be an AXLayoutArea, each shape is an AXLayoutItem, an item in that area and then even the handles will be represented, because they are visual elements, as AXHandle, the third new role that we have added in this layout area suite.
So we know how we want that hierarchy to appear, now, How do we get that to happen? Well again, we need to do this mapping between our object hierarchy or the objects that are actually in our Cocoa app and which ones of those are going to be the backing in essence for the nodes in the accessibility tree that we are reporting to an accessibility client.
So take a look at your object hierarchy: some objects may be ignored. Sometimes, as in this view case, an object already implements accessibility, so we may just need to override some methods. And in some cases, we might have objects that just inherit from NSObject and need to implement the entire protocol.
It is maybe 10 or 12 methods, not that much. And sometimes we even have elements that we want to report where we don't currently have an analogous object in our Cocoa application, and we are going to look at two different cases of this: using a faux UI element, where we just make up an element out of thin air to respond to these requests, and a proxy element, where we have an object that kind of does the job.
But for one reason or another, we can't use it, so we will create a little proxy that has a reference to that object that already exists in our app, to pull some information from it. So if we look at Sketch's hierarchy, there is an SKTGraphicView, and that graphic view has a bunch of graphics, graphic objects. They each know how to draw themselves, they each know their own bounds, so OK, great.
We will have that view report all of those graphics as its children; those graphics will be a great place to implement NSAccessibility, or so I thought when I first sat down to make Sketch accessible. But we do have a bit of an interesting design challenge, which would be a good word for it. As is often the case in drawing applications, and something we don't necessarily see in a lot of apps, the graphic object in this case is the model object that gets written out to the document, but it also happens to know how to draw itself in a view.
In a drawing application, very often the model is also used in the view, because it honestly makes a lot of sense to do so. But that gives us an accessibility wrinkle, because as Sketch exists today, it is a document-based app: the document holds on to the model objects, the graphics; it also has a window controller that has its window; in the window lives a graphic view; and the graphic view, through bindings, has a reference back to the graphics.
That is all well and good, and so we could potentially-- that graphic needs to report its accessibility parent, and it can only have one accessibility parent, so that graphic, OK, it will report that graphic view as its parent. Everything seems just perfectly fine, until I looked at the little to-do list in Sketch, which says we may want to add additional views in another window on the model.
And you have certainly seen this in other drawing applications, where you have a document window open, then you open a second window on the same document, maybe zoomed in to 800 percent so you can do some fine-tuning work in one window, while the other window is showing the results at 100 percent.
And certainly that would be fairly straightforward to do in Sketch, in which case we would end up with a second window controller with a second window with a second graphic view, but also using the same underlying graphic objects, the same model. And when we would report it to accessibility, that one graphic object now has two accessibility parents. Well, that doesn't quite work in a tree where you only have one parent, so were we to go down this path, where we would extend Sketch, we would hit a bit of a bump.
So what to do? We are going to introduce a little accessibility proxy object. The idea here is that whenever we need to return the children of a graphic view, we will make these little temporary objects, we will set the graphic view as the parent, we will set the graphic so now this little proxy has a reference to its accessibility parent.
And it has a reference to the actual graphic that it represents, so we can ask it for all sorts of interesting information like its bounds for instance. Each of these proxies has a separate parent, even though it refers back to the same underlying model object. We are good to go, so from our original design decision we now have a slight change. We are going to use a graphic proxy object, it also will inherit from NSObject and so we will need to implement the 12 methods of the accessibility protocol. Let's take a look at doing that now.
So in this demo, we are going to get those shapes reporting themselves and we are going to do essentially the phase 2 portion, getting the hierarchy working, hit testing working and geometry working, size and position for all of the shapes of the layout area. Let's take a look at this in progress, so first let's look at this proxy object, it is very straightforward.
We have an object that holds onto the accessibility parent and the backing graphic. We have some properties to read those values, an init method, and actually a factory method that will take the graphic and the parent and give us back one of these little temporary objects, already autoreleased.
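As a sketch, the proxy's interface looks something like this; the class and method names here follow the talk and may not match the shipping Sketch+Accessibility sample exactly:

```objc
#import <Cocoa/Cocoa.h>
@class SKTGraphic;

@interface SKTGraphicAccessibilityProxy : NSObject {
    SKTGraphic *_graphic;   // the model object this element stands in for
    id _parent;             // the accessibility parent, normally the graphic view
}

@property (readonly) SKTGraphic *graphic;
@property (readonly) id parent;

- (id)initWithGraphic:(SKTGraphic *)graphic parent:(id)parent;
+ (id)proxyWithGraphic:(SKTGraphic *)graphic parent:(id)parent;  // returns an autoreleased proxy

@end
```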
Let's take a look at the code behind that. Again, we are just going to take the parent and the graphic and hold onto them; in dealloc, we will release them, nothing too groundbreaking there. And we want to make sure that accessibilityIsIgnored returns NO. Here are the attributes that, at least for this portion, we are going to return: the role, the parent, the window, and the geometry information; the rest we will get to in a future demo.
And if we look at the attribute values as they are asked for by accessibility: we are a layout item, so we will return that role. For the role description we use a convenience function, NSAccessibilityRoleDescription(), and you should use it; it will take a role and an optional subrole and return a localized string that will be presented to the user.
If they ask for our parent, we return the parent object that we set. If they ask, What is our window? or What is our top-level UI element? well, we just ask our parent for that, because it will have the information.
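Put together, the attribute handling described so far looks roughly like this sketch for the proxy; position and size are shown separately below:

```objc
- (id)accessibilityAttributeValue:(NSString *)attribute {
    if ([attribute isEqualToString:NSAccessibilityRoleAttribute]) {
        return NSAccessibilityLayoutItemRole;
    } else if ([attribute isEqualToString:NSAccessibilityRoleDescriptionAttribute]) {
        return NSAccessibilityRoleDescription(NSAccessibilityLayoutItemRole, nil);
    } else if ([attribute isEqualToString:NSAccessibilityParentAttribute]) {
        return NSAccessibilityUnignoredAncestor(_parent);
    } else if ([attribute isEqualToString:NSAccessibilityWindowAttribute] ||
               [attribute isEqualToString:NSAccessibilityTopLevelUIElementAttribute]) {
        // Our parent, ultimately the graphic view, knows the window.
        return [_parent accessibilityAttributeValue:attribute];
    }
    return nil;   // there is no superclass implementation to fall back on
}
```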
For our position, well, that graphic that we are holding onto knows its own bounds. But when we report information to accessibility, we don't report it in our own view coordinates, we have to do it in screen coordinates, so we need whatever view is containing us in order to do the appropriate translation of coordinate spaces. So we get hold of our containing graphic view, and that is simply a method that I have written above that goes up the hierarchy until it finds the first un-ignored view.
We get our bounds, get our origin, we check if we are flipped, and if so we need to adjust which corner we are sending back. But then essentially we are moving from local to window coordinates and then from window to screen coordinates, just using standard Cocoa APIs for doing so, and then we return an NSValue containing the point that is our position. For size, it is even a little easier: we get our containing graphic view, get our local size, and then we convert from our local size to the window size.
The window coordinate space, in terms of size, is always the same as the screen's, so we can take that size, wrap it in an NSValue, and return it. So again, the bounds information in the graphic is already there in Sketch; we are just pulling it out, translating it to the right coordinate space, and sending it back for accessibility.
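Here is a sketch of that geometry code in the proxy; -containingGraphicView and the helper names are hypothetical, and -convertBaseToScreen: is the window-to-screen conversion from this era of AppKit:

```objc
// Position, reported as the bottom-left corner in Cocoa screen coordinates.
- (id)positionAttributeValue {
    NSView *view = [self containingGraphicView];    // first un-ignored containing view
    NSRect bounds = [_graphic bounds];
    NSPoint origin = bounds.origin;
    if ([view isFlipped]) {
        // In a flipped view the rect's origin is its top-left corner;
        // accessibility wants the bottom-left corner.
        origin.y += bounds.size.height;
    }
    NSPoint windowPoint = [view convertPoint:origin toView:nil];           // view -> window
    NSPoint screenPoint = [[view window] convertBaseToScreen:windowPoint]; // window -> screen
    return [NSValue valueWithPoint:screenPoint];
}

// Size: window coordinate sizes are the same as screen sizes.
- (id)sizeAttributeValue {
    NSView *view = [self containingGraphicView];
    NSSize size = [view convertSize:[_graphic bounds].size toView:nil];
    return [NSValue valueWithSize:size];
}
```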
And if somebody asks us for something else, we just return nil; we don't have a superclass that has implemented this method. If somebody wants to hit test, they will hand us a point. Now, hit testing with accessibility: if this method is called on your object, it has been determined by one of your parents that, yes, this point is either within you or within one of your descendants. At present, we don't have any children, we don't have any descendants, so it must be us.
We would return self, but you will notice that I have wrapped it in this function, NSAccessibilityUnignoredAncestor(). If for some reason somebody subclassed this class and made it accessibility-ignored, we wouldn't want to return that to accessibility. What this function will do is return yourself if you are un-ignored, but if you are ignored, it will walk up the hierarchy to find the first un-ignored ancestor and return that instead. And then finally, we have put in stub methods for the rest of the NSAccessibility protocol, in case those questions are asked by an accessibility client.
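That hit test in the proxy is essentially a one-liner, as in this sketch:

```objc
// Any hit routed to us must be us: we have no descendants yet. Wrap the result
// so an ignored subclass can never leak out to an accessibility client.
- (id)accessibilityHitTest:(NSPoint)point {
    return NSAccessibilityUnignoredAncestor(self);
}
```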
And the last thing I would like to point out is, whenever we are making these temporary objects in accessibility, we will make them, we will use them and then let them be auto-released, and then when a question comes in for that element again, we will make another object.
However, to get back to the right spot, we need to be sure that two objects that represent the same element respond YES to isEqual: and hash the same. And so all I am doing here is ensuring that we return YES from isEqual: in the case that we represent the same graphic, and that our hash is correct.
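A sketch of that equality code in the proxy:

```objc
// Two temporary proxies for the same graphic must compare equal and hash alike,
// or the accessibility client will lose track of the element between requests.
- (BOOL)isEqual:(id)object {
    if ([object isKindOfClass:[self class]]) {
        return [_graphic isEqual:[(SKTGraphicAccessibilityProxy *)object graphic]];
    }
    return NO;
}

- (NSUInteger)hash {
    return [_graphic hash];
}
```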
OK? And finally, we have got that child object all set up; let's take a look at what we are doing in the parent to return those children. So if we are asked for our children as the graphic view, we will get ahold of all of our graphics, we will make a mutable array that we are going to stick these proxies in, and then we will run through all of them. I am going in reverse because I would like to send them out in Z order rather than reverse Z order, which is how they are stored.
And for each child, we will make one of these little proxy objects, stuff in the graphic, stuff in ourselves as the parent, and return the array, and we have just reported all of these little objects as our children.
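Here is a sketch of that children branch, folded into the graphic view's attribute override from earlier; -graphics is assumed to be Sketch's existing accessor for the model objects:

```objc
- (id)accessibilityAttributeValue:(NSString *)attribute {
    if ([attribute isEqualToString:NSAccessibilityRoleAttribute]) {
        return NSAccessibilityLayoutAreaRole;
    } else if ([attribute isEqualToString:NSAccessibilityChildrenAttribute]) {
        NSArray *graphics = [self graphics];
        NSMutableArray *children = [NSMutableArray arrayWithCapacity:[graphics count]];
        // Walk in reverse so the children come back in Z order.
        NSEnumerator *enumerator = [graphics reverseObjectEnumerator];
        SKTGraphic *graphic;
        while ((graphic = [enumerator nextObject])) {
            [children addObject:[SKTGraphicAccessibilityProxy proxyWithGraphic:graphic
                                                                        parent:self]];
        }
        return NSAccessibilityUnignoredChildren(children);
    }
    return [super accessibilityAttributeValue:attribute];
}
```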
For hit testing, we want to hit test a shape. The point that comes in, again, is a screen point, so we will convert it to the window, convert it down to the view itself, and then we are going to use a method that Sketch is already using for hit testing shapes to find the graphic under that point.
If we find that hit graphic, we are going to make one of our little proxy objects, these temporary objects, stick the hit graphic in there, put the reference to ourself in, and return it. And then finally, if we didn't hit a graphic, we just return ourself, or more accurately our un-ignored ancestor.
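And a sketch of the graphic view's hit test; -graphicUnderPoint: stands in here for whatever hit-testing method Sketch already has:

```objc
- (id)accessibilityHitTest:(NSPoint)screenPoint {
    NSPoint windowPoint = [[self window] convertScreenToBase:screenPoint]; // screen -> window
    NSPoint localPoint  = [self convertPoint:windowPoint fromView:nil];    // window -> view
    SKTGraphic *hitGraphic = [self graphicUnderPoint:localPoint];          // existing hit testing
    if (hitGraphic) {
        SKTGraphicAccessibilityProxy *proxy =
            [SKTGraphicAccessibilityProxy proxyWithGraphic:hitGraphic parent:self];
        return [proxy accessibilityHitTest:screenPoint];   // let the proxy refine the hit
    }
    return NSAccessibilityUnignoredAncestor(self);
}
```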
And with that, we should have phase 2 done for the first chunk of our hierarchy, the layout area and its items. Let me hide Xcode, go to our sample document here, and bring up the Inspector. And as I go around, indeed the layout items report themselves. Let's focus on one of them and turn on the highlight to see. Yes, it is reporting its bounds correctly, its size and its position.
Let's test to make sure we can go up the hierarchy and then come back down using Accessibility Inspector. We will navigate to the parent element, and indeed, it is the layout area with the correct bounds. Let's go back down to the children, maybe the first layout item, which should be that square.
It is hard to see the highlight there. Let me go back up and we will take a peek at the line. And notice that it is the bounds of that line; all bounds in accessibility are rectangles. And so indeed, with what we just did, we have now enabled half of the hierarchy we are trying to do, excellent. Let's go back to slides.
So a few things to note about what we just saw, every element requires a reference to its parent, because it needs to report its AXParent. And every element needs some way to have access to the view that contains it, because every element needs to report its size and position, so that containing view is the go-to object for doing all of the coordinate transformations. We saw that in this case. We will see that in this other case as well, as we talk about these handles.
Oh, one other point about this geometry: NSAccessibility uses Cocoa screen coordinates. In Cocoa the bottom left of the main screen is the origin, and so all of the coordinates that you will ever receive, and all of the points that you ever send back, will be in Cocoa screen coordinates.
The accessibility APIs, the client APIs, take a more Carbon look at screen coordinates, where the upper left (is that left or right? I can't even tell anymore) of the main screen is the origin. And so you will pass a value back in Cocoa screen coordinates, and when you look at it in Accessibility Inspector, which is using the client's screen coordinates, the value for that point is going to look different.
You didn't do anything wrong; the translation is happening automatically for you. The best way to check, however, is to just turn on that highlight in Accessibility Inspector to ensure that the right rectangle is being reported on the screen. There is also that issue of isFlipped that I went over a little quickly in the code. Let's talk about that in a little more detail. So we do have to translate from the view to the window to the screen for the position, the point that is the position.
If we have a flipped view, then it is the upper left corner that is reported as the origin of that rectangle in the flipped coordinate space. But if we just take that point and translate it to a screen point, that is the wrong spot, because we really want to report the lower left point to accessibility, because that is the point in Cocoa screen coordinates that maps to the origin we are looking for.
So if we are flipped, we need to report the correct corner, and the code that we saw does that. OK, that said, let's get back to mapping our hierarchy for these handles, where we don't actually have a handle object in Sketch. In fact, the way Sketch deals with handles is that the graphic itself just has this enum, 1 through 8, one value for each handle.
It is just upper left, middle left, upper right, and so on, and that enum value is used in a lot of places in Sketch to represent each handle. And then the graphic itself does all of the drawing of the handles and all of the hit testing of whether a handle was hit, so it can resize and the like.
But we need an object, so that when we report an AXHandle, there is some object that those requests can go to. So we are going to use a fake UI element; we call it a faux UI element. This is a very prevalent and useful pattern in making your applications accessible, so much so that we ship FauxUIElement in sample code, so you have the full source to it, and we are reusing that sample code as-is in Sketch. I highly recommend, any time you hit this case, that you grab the FauxUIElement sample code and use it as-is, and we are going to subclass it for the SKTHandleElement class.
Now that FauxUIElement, you can search for the ImageMap example; that is where it first appeared. It is truly the easiest way to provide one of these elements that exists only for the sake of accessibility. Let's take a look at doing that. So we are going to do phase 2: getting the hit testing, the geometry, and the hierarchy correct for handles.
OK, so first let's take a look at SKTGraphic; I needed to add some additional functionality on the graphic itself. I needed each graphic to be able to tell me its handle codes. Since we are using an enum, I just chose to use an index set so I have an easy way to pass around a bunch of integers, typically 1 through 8. Also, since the graphic is the one that knows where to draw its own handles, I have implemented a method where I can hand it a handle code and the graphic will hand me back the correct rect.
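A sketch of those additions on SKTGraphic; the method names here follow the talk rather than the shipping sample verbatim:

```objc
@interface SKTGraphic (Accessibility)

// The handle codes this graphic uses, typically the enum values 1 through 8.
- (NSIndexSet *)handleCodes;

// The rectangle, in view coordinates, where the handle with this code is drawn.
- (NSRect)rectangleForHandleCode:(NSInteger)handleCode;

@end
```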
OK, let's take a look at FauxUIElement. This is the reusable class, and again, it is very simple. It is designed to be reusable, so it doesn't have a hard-coded role. When you create one, you set the role and you set the parent.
Typically the reason why you need these little reusable objects is because all of the smarts about that particular thing on the screen live somewhere else. That is why there is no object to represent it, and that somewhere else is typically the parent, and so the FauxUIElement class doesn't know a heck of a lot about anything.
It asks its parent for all of the relevant information, so it defines this child support protocol to be able to call the parent and ask for things like, What is my size? What is my position? Am I focused or not?
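The general shape of that child support protocol is something like this sketch; the exact names come from the FauxUIElement sample code itself, so treat these declarations as illustrative:

```objc
@class FauxUIElement;

@protocol FauxUIElementChildSupport
// The parent answers geometry and focus questions on behalf of its faux children.
- (NSPoint)fauxUIElementPosition:(FauxUIElement *)fauxElement;   // in screen coordinates
- (NSSize)fauxUIElementSize:(FauxUIElement *)fauxElement;
- (BOOL)isFauxUIElementFocusable:(FauxUIElement *)fauxElement;
@end
```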
We won't walk through all of FauxUIElement, but I did want to show you two things: first, the importance of isEqual: and the hash function. Again, these temporary objects need to compare as equal if they represent the same element. And second, that if you use FauxUIElement as your superclass, it is already providing a lot; the code is already written to handle the role, the role description, whether it is focused, the parent, window, top-level UI element, size, and position.
Now, in our own little subclass for the handle, we are going to hold on to the handle code, because that enum value, that single integer value, is really what sets one handle apart from another and lets us identify them. We have an init method, a convenient factory method, and a method to get the handle code out, so we are really just a little wrapper around this single identifier.
The init method, not much to see there. The key thing to see here is that in our subclass, we have overridden isEqual: and hash to throw the handle code into the mix, because once we have a parent shape and a handle code, we are uniquely identified as the correct element.
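A sketch of that subclass; SKTHandleElement and its handleCode accessor are names used here for illustration:

```objc
// The handle code distinguishes one handle of a shape from another, so it has
// to take part in equality and hashing alongside the comparison that
// FauxUIElement already performs on the role and parent.
- (BOOL)isEqual:(id)object {
    if ([object isKindOfClass:[self class]]) {
        SKTHandleElement *other = object;
        return _handleCode == [other handleCode] && [super isEqual:other];
    }
    return NO;
}

- (NSUInteger)hash {
    return [super hash] + _handleCode;
}
```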
That is all that is going on in the handle itself, which is what we just looked at. The real work is going on in that graphic accessibility proxy: the handle, through FauxUIElement, is already correctly reporting its parent; now we need the shape to report its children. And really, the children are the handles, and the layout item also reports its handles as an attribute.
So for the children, when we are asked for those, we will just get our handles and return those. When we are asked for our handles, if we are selected, so our handles are showing, we are going to return an array of handle UI element objects; otherwise, we will return an empty array, because we have got nothing. Let's go take a look at handleUIElementsWithHandleCodes:. Oh, let me go back just a moment.
Note that what we are doing to get that set of handle objects is we ask the graphic that we are a proxy for, Hey, give me your index set of handle codes, and then I have factored out a method that will take that index set and build the little faux UI elements that we want.
And all it does is take that index set of handle codes, make a mutable array that I can stick stuff in, and then I am using a new block enumeration API on NSIndexSet to run through every index. For every index, I just create one of my handy little fake handle objects, stick it in the array, release it, and at the end I have got the array of children, the array of handles.
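Here is a sketch of that factored-out method; the SKTHandleElement initializer shown is hypothetical, and enumerateIndexesUsingBlock: is the new 10.6 block enumeration on NSIndexSet:

```objc
- (NSArray *)handleUIElementsWithHandleCodes:(NSIndexSet *)handleCodes {
    NSMutableArray *handles = [NSMutableArray arrayWithCapacity:[handleCodes count]];
    [handleCodes enumerateIndexesUsingBlock:^(NSUInteger handleCode, BOOL *stop) {
        SKTHandleElement *element =
            [[SKTHandleElement alloc] initWithRole:NSAccessibilityHandleRole
                                            parent:self
                                        handleCode:handleCode];
        [handles addObject:element];
        [element release];   // the array retains it; these are temporary objects
    }];
    return handles;
}
```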
That is all it takes to return the children. For hit testing, again, now I have children that I have to worry about hit testing. What do I do? Well, it is very similar to what I did in the view itself: I convert the point down to be a local point, and I ask the graphic for the handle under the point. And again, most applications already have hit testing implemented.
And then if I have a hit handle, I make up one of my little faux UI elements and return it; otherwise, it is my un-ignored ancestor. And then finally, support for child elements: these handles, when they are asked for their size or position, are going to ask me, their parent. So I will take that faux element, grab the handle code out of it, ask the graphic I am the proxy for, for the handle's rectangle, then do the appropriate coordinate transformation and return the screen point.
The same exact thing for size: once I get that bounds for the handle code, it is easy for me to take the size, convert it into the screen coordinate space, and return that. And with that, we have handles. So let's take a look at that. OK, let me not do that, oh dear.
That is the app and if I bring up the Inspector, let's select, and if I hit a handle it is reporting that I have a handle. Actually, these handles are extremely small, so we can also check to make sure we actually are doing correct coordinate transformations. Let's zoom into 800 percent here, so we can get a good look at a handle and as we mouse over, indeed we have a handle. As we lock focus on it, we are getting the geometry right, let's make sure we can navigate up the tree to a parent.
We can navigate back down the tree to a handle. And indeed now we are reporting the layout area, the layout item, and the children of the layout item, which would be its handles when selected. Let's deselect and lock focus on that item to make sure that, notice, it is the giant shape now and the children are size 0, because there is no selection. All right, great, we have just achieved a giant milestone in the accessibility of Sketch. Are you as excited as I am? OK, all right, back to slides. That seriously is a major milestone in the accessibility of Sketch, so yes.
[ Applause ]
When you have the accessibility hierarchy for your custom view showing up in Accessibility Inspector, you can navigate up and down the tree, and you are sure that all of the geometry is correct, you are far and away, well on your way to having that custom view be very nicely accessible. Next comes phase 3, which is working through the remaining items. Now again, we have implemented a lot of the very core attributes, but, for instance, the layout area has an attribute for determining its selected layout items.
We haven't implemented that yet, and in fact the roles that we chose in step one very much determine what we need to do now. What are the remaining attributes? What are the remaining actions that we need to implement? What attributes can the client not only get information about, but also set? In the layout area, we can set the size and position of a layout item through accessibility. We can also set the position of each handle, so an accessibility client should be able to get hold of a layout item and move it around or resize it through accessibility, or drill down to an individual handle and move it around.
And whatever that handle does, in most cases resizing, should take effect in the application. Or say in Keynote, or applications where you have that little rounded rect with the extra handle that lets you adjust the corner radius: that should also be moveable, so that an assistive app like VoiceOver could move that handle around.
And test. A very, very handy user default, or argument to pass in when you run your app, is NSAccessibilityDebugLogLevel 1. NSAccessibilityDebugLogLevel is going to log any time the accessibility infrastructure in AppKit hits something that it just doesn't like, something awry and amiss.
And sometimes you think you are done with accessibility, but then you run with this log level and you realize there are a couple of rough edges that you have to iron out. Actually, that helps bring us to troubleshooting. The first thing to do when you are troubleshooting is to ensure that the parent and the child reporting is correct.
For instance, you could imagine very easily that on the one hand you have a proxy object reporting its parent, but in the parent, you accidentally have the actual graphics being reported as the children. That is not going to work; you are not going to see the right results in Accessibility Inspector.
The most common reason that phase 2 might go awry is when you have these temporary objects: isEqual: and hash for the same element must say that those objects are indeed equal. Sometimes you may be returning ignored elements, because you have not been using those un-ignored element helper functions.
And again, if you have an issue, setting NSAccessibilityDebugLogLevel to 1 is your best way of getting a lot of diagnostic information about what is awry with your app. Now the last thing I would like to do is take a look at Sketch once I have completed phase 3. Associated with this session should be the Sketch+Accessibility sample code, which is the completed version of Sketch, and let's take a look at it with VoiceOver. So we will launch this completed version.
Turn on VoiceOver.
[ Computer talking ]
[Presenter] OK, for those of you who may have been-- how many folks were in the introductory accessibility session, anybody? OK, so a few of you. In that case you knew, or you know, and the rest of you can just imagine: when we turned on VoiceOver and pointed at the Sketch document, it was like, ahhh, there was nothing there, it was just a big empty space.
Now we are getting some information. We are in a layout area. Let's drill down and see if we can take a look at those things. So a VoiceOver user then says, What is in this?
[ Computer talking ] Line layout item.
[Presenter] Line layout item.
[ Computer talking ] Text layout item.
[Presenter] Text layout item.
[ Computer talking ] Circle layout item.
[Presenter] Circle layout item.
[ Computer talking ] Rectangle layout item.
[Presenter] Excellent. And I don't know if everybody saw the announcement: in Snow Leopard, there is this amazing new feature in VoiceOver, the screen reader on Mac OS X, where a user for the first time is able to use the Multi-Touch trackpad as a way of navigating in a window. And let me show that to you very briefly. It is especially handy in those views that have a very graphical layout, so let me turn it on very quickly.
[Presenter] And then I am just going to use one finger and move it around the trackpad. You are going to see a little dot on the screen, a little circle, that we have here for demo purposes that will show my finger moving on the trackpad. You will also see VoiceOver detailing exactly what is happening on the screen as I move around. So there is my finger.
[ Computer beeping ]
[ Computer talking ]
[ Computer talking ] Layout item.
[ Computer talking ] Line layout item.
[Presenter] You get a pretty good idea of where things are in relation to each other, and how large things are in relation to each other.
[ Computer talking in background ]
[ Applause ]
That is extraordinarily cool.
[ Computer talking ] Track pad off.
[Presenter] OK, the next thing I would like to do is, we can also set values through accessibility, so let's move this circle around a bit.
[ Computer talking ] 140.000
[Presenter] This is not using scripting, this is not using Sketch's own event handling. This is through accessibility's API.
[ Laughter ]
[Presenter] Oh sure.
[ Applause ]
Feel free. And then finally, we have exposed handles with their size and position, but every assistive application may choose to do whatever it wishes with the information your application provides. And rather than make you hit a really little rectangle on the screen when you hit a layout item, VoiceOver chooses to let you access the handles through a menu and resize using that handle. So let's get to that little menu.
[Presenter] Now, I have added, in the final product, some descriptions for each of these handles, so we can tell the different places apart. Let's move the lower left handle around.
[Presenter] And then again, by changing the position of that handle through the accessibility API, some other process, VoiceOver, is able to do this in Sketch.
[Presenter] We are moving that handle. There is a little bug in VoiceOver where it is not redrawing the cursor.
[Presenter] But you are the server of information. That is a client bug, so Sketch is doing the correct thing in this.
[ Computer talking ] VoiceOver off.
[Presenter] And again, over the course of this session and over the course of this week, Sketch, which had this kind of big area that was not accessible at all, is now somewhere a VoiceOver user can actually do a great deal of stuff, through the accessibility APIs. And we would of course encourage you to do the same thing with your custom views.
[ Applause ]
So for more information: Eric Hope, our Technologies Evangelist; again, that accessibility mailing list, which I highly recommend subscribing to when you are doing accessibility for your app; we have a bunch of documentation; and we also have an accessibility webpage on apple.com. This has been recently redone.
It has a lot of great information, both about VoiceOver and about all of the things Apple is doing with accessibility. It also has a little section where accessible applications are highlighted. We love to see accessible applications, and we love to tell everybody about which applications are accessible, because there are a lot of people who are very curious and want to use your apps through VoiceOver.