Google is using its AI prowess to bolster its accessibility features. At its I/O developers conference on Tuesday, the company shared that it's leveraging Gemini Nano, its AI model that runs on smartphones, to advance its TalkBack screen reader, offering blind and low-vision users richer and clearer image descriptions.
First launched in 2009, TalkBack reads aloud what's on a screen and lets users navigate their device using custom gestures. It also supports voice commands and a virtual braille keyboard.
Google says TalkBack users come across an average of 90 unlabeled images a day. Gemini can help fill in any gaps, such as details about what's being shown in a photo someone sent or the style and cut of clothes while online shopping. And because Gemini Nano works on-device, the descriptions are generated quickly and can continue to work without a network connection.
Google shared an example depicting a dress, for which TalkBack generated the description, "A close-up of a black and white gingham dress. The dress is short, with a collar and long sleeves. It is tied at the waist with a big bow."
TalkBack can provide richer image descriptions, with the help of AI.
Bringing Project Gameface to Android
At last year's I/O, Google launched Project Gameface, an open-source, hands-free gaming "mouse" that lets people control a computer's cursor using head movements and facial gestures. Now, Google is open-sourcing more code for Project Gameface on GitHub to allow developers to expand this capability to Android.
With this expansion, Android cameras can track facial expressions and head movements and translate them into controls. According to Google's blog post, "Developers can now build applications where their users can configure their experience by customizing facial expressions, gesture sizes, cursor speed and more."
With Project Gameface, users can link facial expressions and head movements to various controls.
For instance, a user's head movements determine how the cursor moves, and gestures like raising an eyebrow or looking up can be custom-linked to various commands like "Select."
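Conceptually, that kind of mapping amounts to turning per-frame tracking data into cursor motion plus any commands whose gesture crosses a configurable threshold. The sketch below illustrates the idea only; the gesture names, thresholds, and function are hypothetical stand-ins, not Project Gameface's actual API.

```python
# Illustrative sketch of a gesture-to-command mapping, NOT Project Gameface's real code.
# Assumes a face tracker emits per-frame expression scores in the range [0, 1].

GESTURE_BINDINGS = {
    "browUp": "select",   # e.g. raising an eyebrow triggers "Select"
    "mouthOpen": "back",  # bindings are user-configurable, per Google's blog post
}
GESTURE_THRESHOLD = 0.6   # "gesture size": how pronounced the expression must be
CURSOR_SPEED = 12.0       # cursor pixels moved per unit of head rotation

def interpret_frame(expression_scores, head_yaw, head_pitch):
    """Turn one frame of tracking data into (cursor_dx, cursor_dy, commands)."""
    dx = head_yaw * CURSOR_SPEED
    dy = head_pitch * CURSOR_SPEED
    commands = [
        command
        for gesture, command in GESTURE_BINDINGS.items()
        if expression_scores.get(gesture, 0.0) >= GESTURE_THRESHOLD
    ]
    return dx, dy, commands

# Example: eyebrow raised while the head turns slightly right.
print(interpret_frame({"browUp": 0.8}, head_yaw=0.5, head_pitch=0.0))
# (6.0, 0.0, ['select'])
```

In a real implementation the expression scores would come from an on-device face-tracking model, and the threshold and speed values would be exposed as user settings, matching the customization Google describes.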
The updates come ahead of Global Accessibility Awareness Day, which this year falls on May 16. Google is one of many tech companies that have doubled down on efforts to expand digital accessibility across their devices and platforms. In recent years, it's launched features like Guided Frame, which helps blind and low-vision Pixel users take selfies; Magnifier, which makes it easier to see small text and objects; and Sound Notifications, which alerts people with hearing loss about "critical household sounds" like appliances beeping or water running.