As part of Global Accessibility Awareness Day 2024, Google is showing off some updates to Android that should be useful for people with mobility or visual impairments.
Project Gameface lets players use their faces to move the cursor and perform common click-type actions on desktop, and now it's coming to Android.
The project allows people with limited mobility to use facial movements, such as raising an eyebrow, moving their mouth, or turning their head, to activate a variety of functions. There are basic things like a virtual cursor, but also gestures: for example, you can define the start and end of a swipe by opening your mouth, moving your head, and then closing your mouth.
It adapts to a person's abilities, and Google researchers worked with Incluzza in India to test and improve the tool. Certainly for many people, the ability to simply and easily play many of the thousands (well, millions probably, but thousands of good ones) of games on Android will be more than welcome.
There is a great video here showing the product in action and its customization options; Jeeja, in the preview image, talks about adjusting how far she needs to move her head to activate a gesture.
This kind of granular adjustment is as important as being able to adjust the sensitivity of a mouse or trackpad.
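For a rough sense of how that kind of sensitivity setting could work: the open-source desktop version of Project Gameface is built on MediaPipe's Face Landmarker, which reports facial "blendshape" scores between 0 and 1. Below is a minimal sketch along those lines; the gesture names, threshold values, and the `face_landmarker.task` model path are illustrative assumptions, not Google's actual Android implementation.

```python
# Illustrative sketch only: map MediaPipe face-blendshape scores to gestures
# using per-user thresholds (the "sensitivity" dial described above).
import mediapipe as mp
from mediapipe.tasks import python as mp_tasks
from mediapipe.tasks.python import vision

options = vision.FaceLandmarkerOptions(
    base_options=mp_tasks.BaseOptions(model_asset_path="face_landmarker.task"),
    output_face_blendshapes=True,
    num_faces=1,
)
landmarker = vision.FaceLandmarker.create_from_options(options)

# Hypothetical per-user settings: a lower threshold means less facial
# movement is needed to trigger the gesture.
GESTURE_THRESHOLDS = {
    "jawOpen": 0.35,      # open mouth, e.g. to start/end a swipe
    "browInnerUp": 0.50,  # raise eyebrows, e.g. to "click"
}

def active_gestures(image_path: str) -> list[str]:
    """Return the gestures whose blendshape score clears the user's threshold."""
    result = landmarker.detect(mp.Image.create_from_file(image_path))
    if not result.face_blendshapes:
        return []
    scores = {c.category_name: c.score for c in result.face_blendshapes[0]}
    return [g for g, threshold in GESTURE_THRESHOLDS.items()
            if scores.get(g, 0.0) >= threshold]

print(active_gestures("camera_frame.png"))
```

Lowering a threshold in a table like this is the programmatic equivalent of Jeeja's adjustment: the same gesture fires with a smaller facial movement.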
Another feature for people who can't easily operate a keyboard, on-screen or physical: a new text-free "Look to Speak" mode that lets users select and send emojis, either on their own or as stand-ins for a phrase or mood.
You can also add your own photos, so that common phrases and emojis are effectively on speed dial and frequently used contacts are attached to their pictures, all accessible with a few glances.
For people with visual impairments, there are a variety of tools out there (of varying effectiveness, no doubt) that let the user identify things the phone's camera sees. The use cases are endless, so sometimes it's best to start with something simple, like finding an empty chair or spotting the person's keys and pointing them out.
Users will be able to add custom object or location identification, so the description feature gives them what they need rather than just a generic list of objects ("a mug and a plate on a table"). Which mug?!
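One plausible way to personalize object recognition like this (purely an illustration, not how Google's feature is implemented) is to compare camera frames against a reference photo of the user's own object using image embeddings. The sketch below uses MediaPipe's Image Embedder; the model file, threshold, and function names are assumptions for the example.

```python
# Illustrative sketch only: recognize a user's *specific* object (e.g. "my mug")
# by comparing image embeddings of a reference photo and a camera frame.
import mediapipe as mp
from mediapipe.tasks import python as mp_tasks
from mediapipe.tasks.python import vision

options = vision.ImageEmbedderOptions(
    base_options=mp_tasks.BaseOptions(model_asset_path="image_embedder.tflite"),
    l2_normalize=True,
)
embedder = vision.ImageEmbedder.create_from_options(options)

def looks_like_my_object(reference_path: str, frame_path: str,
                         threshold: float = 0.8) -> bool:
    """Return True if the camera frame resembles the user's reference photo."""
    ref = embedder.embed(mp.Image.create_from_file(reference_path))
    frame = embedder.embed(mp.Image.create_from_file(frame_path))
    similarity = vision.ImageEmbedder.cosine_similarity(
        ref.embeddings[0], frame.embeddings[0])
    return similarity >= threshold  # threshold is an assumed tuning parameter

print(looks_like_my_object("my_mug.jpg", "camera_frame.jpg"))
```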
Apple also showed off some accessibility features yesterday, and Microsoft has some as well. Take a minute to read about these projects, which rarely get mainstream attention (although Gameface did) but are enormously important to the people they are designed for.