React Native Accessibility Is Pretty Bad

Allow me to preface this by saying that React Native sets out to solve problems I don’t think are very interesting to solve. In particular, I don’t think that sharing a single UI between disparate platforms is desirable from any perspective other than cost savings, and even from that perspective it’s dicey.

So I’ve recently been saddled with some React Native problems.

Hey, here’s some very good React. It’s idiomatic and uses the popular and recommended libraries. Isn’t it great?

~Them

So I do what I always do: I fire up VoiceOver. I had read the docs for React Native when it was new. I knew they were aware of accessibility, so maybe it wouldn’t be too bad.

(You’ll be very surprised to learn it was quite a bit too bad.)

Some of the issues were specific to the project. They didn’t understand VoiceOver or how accessibility works in React Native, so I spent a bit of time fixing those problems before I got into the weeds.

Where I realized there were some tricky problems, and maybe some intractable ones, was when I dug into React Navigation. This is the library recommended (but not maintained) by Facebook for navigation in new projects. (It’s very, very cross-platform, and we all know that’s an inherent good.)

A detour to set expectations

If you navigate around an iOS app that uses built-in UIKit controls like UINavigationController, you get lots of great stuff for free, and maybe the most important free thing is compatibility with assistive tech like VoiceOver. When you navigate to a new screen you hear a tone and VoiceOver jumps to a logical location on the new screen, usually the title in the navigation bar. If you want out of that screen you can do a two-finger scrub (in roughly the shape of a Z) and boom, you pop back one screen, again hearing the tone that tells you that you’re somewhere new and again VoiceOver moving its focus somewhere sensible.

Under the hood, this is all done with a straightforward accessibility API, primarily via notifications and actions, largely automatically. When the navigation controller pushes a new controller onto the stack, it can inspect the navigation bar for a sensible element for VoiceOver to focus on, then tell the system that there’s a new screen and here’s an element that should be considered for initial focus. The container view that holds the views of the controllers in a navigation controller can implement an action called escape (triggered by the scrub mentioned earlier), and when VoiceOver receives that gesture it searches up the view hierarchy until it finds someone implementing escape; it more than likely finds that navigation container view, which navigates to the previous screen.

iOS has had a screen reader (VoiceOver) since iOS 3 and an accessibility API since iOS 5. Apple has been hard at work for a long time now and has given us a toolchain where you have to make a lot of sketchy decisions to produce software that isn’t useful to people who use assistive technology.

React Navigation, from an accessibility perspective, is one big sketchy decision, and React Native isn’t helping them do better

The accessibility API in React Native doesn’t even offer hooks for accessibility notifications or for the accessibility escape action. For some reason they support an action called magic tap, an accessibility feature no app implements (or should), but skipped escape, which you need if you want to create an accessible navigation library on iOS.
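
To make the asymmetry concrete, here’s a minimal sketch. The onMagicTap prop is real React Native API; the PlayerScreen component and its handler are hypothetical, and “no escape equivalent” reflects the API as of this writing.

```tsx
import React from 'react';
import { Text, View } from 'react-native';

// Hypothetical screen. onMagicTap is a real View prop: VoiceOver's
// two-finger double-tap lands here. There is no corresponding prop
// for the escape action, so the two-finger scrub goes unanswered.
export function PlayerScreen() {
  return (
    <View
      accessible={true}
      onMagicTap={() => {
        // Toggle playback, the canonical magic tap behavior.
      }}
    >
      <Text>Now Playing</Text>
    </View>
  );
}
```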

Given the lack of React Native support for important accessibility APIs, anyone building a new app and using the recommended library for navigation has to:

  • Know these technical and usability problems even exist
  • Implement a native bridge for posting accessibility notifications (the JavaScript side of this is sketched after this list)
  • Implement middleware that responds to React Navigation actions and posts screen-changed notifications using that bridge
  • Implement an action for the escape gesture: place a natively bridged component somewhere in your application’s component hierarchy such that it can receive the event from the system and dispatch it, eventually finding its way to dispatching a React Navigation action that pops its stack by one¹
  • Figure out how, in the middleware that watches for React Navigation actions, to find an appropriate initial element for VoiceOver to focus on, and how to get that element over the bridge to the accessibility notification API (I’m not sure how to do this in a way that’s at all generalized and not very project-specific)
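
For flavor, here’s a sketch of just the JavaScript half of that work. Every name here (AccessibilityBridge, postScreenChanged, AccessibilityEscapeView) is a hypothetical native module or component you would have to write yourself in Objective-C or Swift, and the middleware assumes the redux integration and v2-era React Navigation action types.

```ts
import { NativeModules, requireNativeComponent } from 'react-native';

// Hypothetical native module wrapping UIAccessibilityPostNotification.
const { AccessibilityBridge } = NativeModules;

// Hypothetical natively bridged view whose backing UIView implements
// the escape action and emits an event when the scrub reaches it.
export const AccessibilityEscapeView = requireNativeComponent(
  'AccessibilityEscapeView'
);

// Redux-style middleware that posts a screen-changed notification
// whenever React Navigation moves the navigation state.
export const accessibilityMiddleware =
  (store: any) => (next: any) => (action: any) => {
    const result = next(action);
    if (
      action.type === 'Navigation/NAVIGATE' ||
      action.type === 'Navigation/BACK'
    ) {
      // The native side would post
      // UIAccessibilityScreenChangedNotification. Choosing a sensible
      // element to hand it for initial focus is the part I can't see
      // how to generalize.
      AccessibilityBridge.postScreenChanged();
    }
    return result;
  };
```

You’d then wrap your navigator in an AccessibilityEscapeView and dispatch a back action from its event handler, and that still leaves the initial-focus problem unsolved.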

Most projects won’t even clear the first hurdle.

Buttons aren’t buttons

The various Touchable components in React Native don’t advertise themselves as buttons to the accessibility API. You can readily add the annotation. They even made enhancements to the API for doing just that in a recent version of React Native. But, as with navigation, you’d have to know this is something that’s needed.
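
Once you know to apply it, the fix is one prop. A minimal sketch: accessibilityRole is the newer API mentioned above (there’s also the older iOS-only accessibilityTraits), and the SaveButton component is hypothetical.

```tsx
import React from 'react';
import { Text, TouchableOpacity } from 'react-native';

// Without accessibilityRole, VoiceOver reads this as plain text.
// With it, VoiceOver announces "Save, button."
export function SaveButton({ onPress }: { onPress: () => void }) {
  return (
    <TouchableOpacity
      accessible={true}
      accessibilityRole="button"
      onPress={onPress}
    >
      <Text>Save</Text>
    </TouchableOpacity>
  );
}
```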

Phantom Elements

Flexbox as a layout idiom loves a hidden element to make stuff center properly. On top of being obnoxious to me personally, this makes it super easy to leave phantom elements floating around your UI that only VoiceOver users will encounter.
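
If you know the spacer is there, you can hide it, but it takes two platform-specific props and you have to remember both. A sketch with a hypothetical layout; accessibilityElementsHidden and importantForAccessibility are real View props:

```tsx
import React from 'react';
import { Text, View } from 'react-native';

// Empty spacer views that exist only to push the text into the
// middle. Without the hiding props, VoiceOver can stop on them.
export function CenteredRow() {
  const spacerProps = {
    accessibilityElementsHidden: true, // iOS
    importantForAccessibility: 'no-hide-descendants' as const, // Android
  };
  return (
    <View style={{ flexDirection: 'row' }}>
      <View style={{ flex: 1 }} {...spacerProps} />
      <Text>Centered-ish content</Text>
      <View style={{ flex: 1 }} {...spacerProps} />
    </View>
  );
}
```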

Missing API

The React Native accessibility API is missing a LOT of the native iOS accessibility API, much of which has been around almost as long as (and in some cases longer than) React Native itself.

Custom rotors are cool. Knowing what’s up with Zoom is cool. Annotating your views so Switch Control navigates in a more useful way is cool. React Native can do exactly none of those things without cumbersome bridging.

Accessibility is Table Stakes if you’re going to claim to be building “native” apps

If your fancy new library is 🎇 Whatever 🎇 Native 🎇, and the front page of your website is all about how everything is just as if you had written it with the platform vendor’s recommended toolchain, then you have a moral responsibility to make that statement true in the ways that matter most. One of those ways is accessibility.

  1. Or you could submit a PR to the React Native project and hope one day someone actually looks at it. 
