When the guys over at Q Branch first showed me Vesper, they already knew what my first question would be:
How's your VoiceOver support?
At the time it wasn’t great, but we’ve made some real strides. (Some of what I’ll be covering didn’t make the cut for 1.0, but will show up in the first bug-fix release shortly.) I’m going to describe my process for auditing the app and providing feedback, and how we implemented its basic accessibility features.
Establishing a baseline
First things first: when auditing a new app, you need to give it a run-through with VoiceOver on. I have my phone set up so that I can turn on VoiceOver by triple-clicking the home button, and I recommend you do the same.
So with VO on, it’s good to start with the basics:
- Are things selectable? Sometimes apps are just completely opaque and need some massaging before they do anything other than bonk when you try to select anything.
- Does the navigation work? Most importantly, can I get stuck on a screen where I have to disable VO to get out of it?
- Are buttons and other controls labeled? Do they have sensible hints?
- How is the text selection?
- Can I select things that should be invisible or otherwise obscured? This is a very common problem and easy to miss if you’re not testing with VO on regularly.
- Does left/right flicking with one finger navigate through the on-screen items in an order that makes sense? Same question for a two-finger swipe up or down to navigate all items on screen. These are both primary discovery mechanisms for VO users and critical to making your app as usable as possible.
Vesper uses customized, but still largely stock, user interface elements, so the basics work pretty well for evaluating this app. Right off the bat, nearly everything was selectable, but almost nothing was labeled, and many things that should have been hidden or obscured were selectable. The drag re-ordering in Vesper isn’t a standard implementation with an edit button or drag handles, so the cells needed some customization to support dragging. Same story with the swipe gesture to archive/restore notes. Here’s what we did:
- Labels and hints galore. We gave all the controls sensible labels, and hints as to what they do. I suspect that once we’ve got users actually using the app, we’ll be able to refine these labels and hints to be more useful.
- Elements that were obscured visually but still visible to VO were marked as hidden.
- Elements that display modally (like the various popovers in the app and the in-app browser) were marked as modal, and we implemented support for the two-finger scrub-to-escape gesture so that users can exit a popover without making a selection.
- Drag re-ordering is accomplished by giving the cells the UIAccessibilityTraitAllowsDirectInteraction trait, which lets the user double-tap and hold and then interact with the cell normally in order to re-order it. As re-ordering happens, the position of the cell is announced using notifications. This same direct-interaction-and-notification style also gives users access to the restore and archive swipe actions. The archive/restore functionality isn’t called out in the cell’s hint because the hint is already substantial, but since it’s not very discoverable, it will likely be supplemented with an explicit (probably VO-only) control in a future release.
- Until very recently, all the tags and tag-suggestion views were subviews of the app’s text views. This causes issues because text views present themselves to VO as containers for their lines of text, and if you add subviews, the system won’t be able to find them. This meant that the tags views needed to become (and are now) sibling views sitting on top of the text view rather than subviews sitting inside it.
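To give a feel for the labels-and-hints work, here’s a minimal sketch of how a note cell’s spoken label might be composed. The model type, field names, and exact wording are hypothetical, not Vesper’s actual code; the idea is simply that the label names the content first and appends state, while the hint is a short verb phrase describing what activating the element does.

```swift
import Foundation

// Hypothetical note model -- field names are illustrative,
// not Vesper's actual data model.
struct Note {
    var text: String
    var hasImage: Bool
    var isArchived: Bool
}

// Compose a VoiceOver label: content first, then state.
// In the app, this string would be assigned to the cell's accessibilityLabel.
func voiceOverLabel(for note: Note) -> String {
    var parts = [note.text]
    if note.hasImage { parts.append("Has image") }
    if note.isArchived { parts.append("Archived") }
    return parts.joined(separator: ", ")
}

// A hint describes the result of activating the element.
// It would be assigned to the cell's accessibilityHint.
let noteHint = "Opens the note for editing."

print(voiceOverLabel(for: Note(text: "Call the bank", hasImage: true, isArchived: false)))
// → Call the bank, Has image
```

Keeping the label content-first matters because VO users flick through cells quickly; the state suffixes are there for when they stop to listen.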
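And here’s a sketch of how the re-ordering announcement might be built. The function name and wording are assumptions, not Vesper’s actual code; in UIKit the resulting string would be posted with UIAccessibilityPostNotification and UIAccessibilityAnnouncementNotification from the drag handler.

```swift
import Foundation

// Build the string VoiceOver should speak as a dragged cell moves.
// Table rows are 0-based, but spoken positions are 1-based.
func reorderAnnouncement(row: Int, totalRows: Int) -> String {
    return "Moved to position \(row + 1) of \(totalRows)"
}

// In the app this would be posted from the drag handler, roughly:
//   UIAccessibilityPostNotification(UIAccessibilityAnnouncementNotification,
//       reorderAnnouncement(row: newRow, totalRows: rowCount))
print(reorderAnnouncement(row: 2, totalRows: 12))
// → Moved to position 3 of 12
```

Announcing on every position change, rather than only when the drag ends, is what makes direct-interaction re-ordering usable: the user hears where the cell is before committing.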
There’s still more to do:
- As I mentioned above, I think we’re going to need an explicit control for archive and restore to make it reasonably discoverable.
- I’d like to integrate some way to annotate the images attached to notes with something akin to alt text.
- Gather feedback and fix the things we didn’t figure out during the beta.