Our team has taken a look at the newest accessibility features in the latest releases of Android 12 and iOS 14. Both Google and Apple have added several new accessibility features; we have highlighted a few of the most notable ones below and added links to Google’s and Apple’s sites if you would like to explore further.
Android 12 Accessibility Updates
Google has implemented several accessibility features over the years, but the accessibility menu has gradually become cluttered and difficult to navigate. With the Android 12 release, one of the biggest differences is in the Text and Display submenu. Options that affect screen brightness, such as Dark theme, color inversion, and reduced brightness, have all been grouped under a single “Turn screen darker” option, simplifying the process overall. Now that these settings have their own page, both menus are easier to read, and there is room for the font and display size options to move to the top.
With the most recent update, the Select to Speak option is now listed under Display, where previously it appeared under Screen Readers. Select to Speak lets you choose specific on-screen text to be read aloud and also offers a floating bubble as a shortcut. In addition, the Audio and On-Screen Text category has been divided into two new subcategories: Captions and Audio.
One of the biggest differences from previous versions is how you activate the accessibility menu. Previously, you would swipe up with two fingers from the bottom of the screen. Developer Preview 3 replaces that gesture with an accessibility button that resembles a notification bubble and can be moved anywhere along the left or right edge of the screen. There is also an option to make the button fully transparent when you are not interacting with it.
These are just a few of the new accessibility options in the Android 12 release. For more information on Android accessibility features, check out Google’s website.
iOS 14 Accessibility Updates
As for Apple, iOS 14 also offers a few noticeable new accessibility features. The first is Sound Recognition, a feature that listens for sounds such as a baby crying, a fire alarm, a doorbell, a siren, or running water, and alerts you, potentially before you would notice the sound on your own. It can be turned on under Settings > Accessibility > Sound Recognition.
iOS 14 has also introduced a Back Tap feature, which makes it easier to launch Siri, take a screenshot, open Control Center, and more. This eliminates the need to swipe with your thumb, which can be difficult for individuals with cognitive or motor disabilities. To activate it, go to Settings > Accessibility > Touch, and tap Back Tap.
FaceTime can now detect when someone is using sign language. When the phone detects that someone is signing, that person is automatically brought into focus, making them easier to see on the screen.
Apple has made several other accessibility updates as well. These include Headphone Accommodations, which make audio through headphones clearer, and VoiceOver’s ability to recognize and describe what is on the screen, including text, images, and photos. For a complete list of iPhone accessibility features, check out Apple’s website.
Accessibility is a major focus for us at PLUS QA; we created a dedicated team back in 2015 and have since grown it to 15 professionals. If you are interested in working with our team for accessibility testing, don’t hesitate to contact us!