Recently at work I’ve been adding voice-over support to a React Native app. This has been … testing, shall we say. The main reason is the number of different props involved:
- `accessibilityLabel` : what will be read out. Nice and easy.
- `accessible` : whether the element is accessible. Mixed results here .. sometimes needed, sometimes not, and needed more on iOS than on Android.
- .. and a few others which work well.

But then you get to the fun part, props that differ between Android and iOS:

- `importantForAccessibility` : Android only, with various options passed through to the Android framework.
- `accessibilityElementsHidden` : iOS only. Somewhat similar to the above, but not the same.
You can check the docs here: https://reactnative.dev/docs/accessibility
So: since Flutter doesn’t use native components, and instead draws everything itself using Skia, is voice-over support easier and more consistent between platforms?
To test this, I’ve written a very simple app with some familiar UI patterns to the one I’m working on:
- Home screen with a list of items, and carousels
- Offer details screen
- Bottom Navigation tabs
What we want
- Login
  - Read out correctly in order
  - Announce form errors
  - Announce API errors
- Home
  - Read out correctly in order (including nested components / custom layouts)
  - Correctly navigate through a carousel
- Offer Details
  - Read out correctly in order
  - Back navigation announced, and actionable
  - Read out the items in the list view
- Bottom Navigation
  - Allow the user to easily swipe through the options
  - Read out the tabs, and make sense to the user
I’m not asking for much, but I want this to be the same on both Android and iOS.
What worked out of the box?
- The header
- Edit boxes, with hints
A few things needed help though. For example, the show/hide password button: as that is just an `IconButton`, we need to give it semantics information. Simply wrapping the `IconButton` widget in a `Semantics` widget and adding a conditional label fixes that problem.
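As a sketch (the icons and the `_obscurePassword` state variable are my own assumptions, not from the original app), the wrapped button might look something like this:

```dart
// Hypothetical show/hide password toggle inside a StatefulWidget.
Semantics(
  // Conditional label so the screen reader announces the action
  // the button will perform, not just "button".
  label: _obscurePassword ? 'Show password' : 'Hide password',
  child: IconButton(
    icon: Icon(_obscurePassword ? Icons.visibility : Icons.visibility_off),
    onPressed: () {
      setState(() => _obscurePassword = !_obscurePassword);
    },
  ),
)
```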
Now onto something that often gets forgotten: announcements…
On the first screen there is a validation error on the password field; on the second screen the call to the API fails with an error. These must be read out to the user without changing their current focus / position on the page.
To do this, we call the `SemanticsService` in Flutter.
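A minimal sketch of an announcement (the message text is just an example, not the app’s actual copy):

```dart
import 'package:flutter/material.dart';
import 'package:flutter/semantics.dart';

// Reads the message out via TalkBack / VoiceOver without moving
// the user's current accessibility focus.
SemanticsService.announce(
  'Password is required', // example message
  TextDirection.ltr,
);
```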
This screen now reads as expected on both Android and iOS, and is fully usable for someone relying on VoiceOver or TalkBack.
This screen (with the exception of the offer images) could be used without any changes. However, there are things we can do to make the experience nicer.
Let’s take the first component:
Without any work, this reads each element separately. Ideally we want it to read out “Your rewards, 400 points” .. swipe .. “View transactions, button”. To do this I used `MergeSemantics`. This merges the semantic information of everything inside the containing widget, so it is read out as one.
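A sketch of the idea (the layout and text are assumptions based on the example above, not the app’s real widget tree):

```dart
// Read as a single element: "Your rewards, 400 points".
MergeSemantics(
  child: Row(
    children: const [
      Text('Your rewards'),
      SizedBox(width: 8),
      Text('400 points'),
    ],
  ),
)
```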
The next component to tackle is the carousel of products:
Again, this worked, but the user would have to swipe through the title, description, and points separately. I wanted to read out the product title and description, and say how many points it would earn, all in one go.
`Semantics` has a property called `excludeSemantics`; when this is set to true, the semantics information of its children is ignored. This meant I could easily add a single label containing all the information needed. I also wanted to convey to the user that they were in a list, so I added a prefix of “Item x of y” inside this carousel.
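Something along these lines, where `product`, `index`, `itemCount` and the `ProductCard` widget are my own stand-ins for the app’s real code:

```dart
// Hypothetical carousel item.
Semantics(
  // One combined label, with the "Item x of y" prefix.
  label: 'Item ${index + 1} of $itemCount. '
      '${product.title}. ${product.description}. '
      'Earns ${product.points} points',
  // Ignore the semantics of everything inside the card so only
  // the combined label above is read out.
  excludeSemantics: true,
  child: ProductCard(product: product),
)
```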
Next the offer images, which the user can tap to go to the offer details page:
As these are images from a (hypothetical) API, I added a `contentDescription` field to the API response. First I wrapped my `Image` in a `GestureDetector`, which makes the screen reader treat the item as a button and tells the user to double tap to activate it. Then I wrapped that inside another `Semantics` widget, with the content description as the label to be read out. As double tapping takes you to the offer details page, we might as well look at that next…
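Roughly like this (`offer`, `OfferDetailsPage` and the `contentDescription` field are assumptions standing in for the real API model and route):

```dart
// Hypothetical tappable offer image.
Semantics(
  // The description that came back with the API response.
  label: offer.contentDescription,
  child: GestureDetector(
    // The onTap handler is what makes the screen reader present
    // this as tappable ("double tap to activate").
    onTap: () => Navigator.of(context).push(
      MaterialPageRoute(builder: (_) => OfferDetailsPage(offer: offer)),
    ),
    child: Image.network(offer.imageUrl),
  ),
)
```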
With this page being very simple, absolutely nothing was needed. The user is informed there is a back button, and it is read out correctly and actionable.
This page uses a standard `ListView` in Flutter, and reuses the same product card I created for the Home page. Each item is read out as designed there: title, description and points earned, all in one.
I used the standard Material bottom navigation widget, and it worked with the screen reader without any changes. It informs the user whether a tab is selected, the index of the tab, the total number of tabs, and the title of the tab.
Better than React Native?
In short, yes. I wrote the app using only an Android phone, tested all of the TalkBack functionality, and made any changes needed. I then ran it on an iOS device and didn’t have to make a single change to support VoiceOver. That is not the experience I have had with React Native, because it uses native components, with differing props for each platform.
Here’s a video showing how voice over is working…