Google's digital assistant is getting a pretty significant update for users with accessibility needs, the company announced today. Thanks to a partnership with Tobii Dynavox, a subsidiary of Tobii, the Mountain View giant is bringing Google Assistant actions to the Snap Core First software.
Snap Core First is a piece of software that lets users create blocks assigned to various words and letters, allowing people who can't move or speak, such as those with cerebral palsy, to communicate more easily. The software is at the heart of Tobii Dynavox's dedicated tablets, but it's also available as standalone software for iPad and Windows 10 devices.
Thanks to the integration, users can now link the Google Assistant to this app, which in turn lets them create quick shortcuts for a variety of actions, including smart home controls. Setting up the integration is fairly simple: users just need to log into their Google account, assuming it's already connected to their smart home devices through the Google Home app. From there, blocks can be created and customized to issue any instruction the user wants to the Google Assistant.
While the main goal is for this to be used with eye-tracking systems by people who have limited movement, it also works with a touch screen or even a mouse. Since the app is available for Windows 10, the integration gives any PC user a way to issue Google Assistant commands from their computer. The Snap Core First software is free if you don't mind missing out on the ability to synthesize speech, and for users without disabilities, that's not a necessity. That makes this integration useful for anyone who wants to use the Google Assistant on their PC.
This isn't Google's first attempt to make the Assistant more accessible, either. Earlier this year, the company introduced Action Blocks, quick shortcuts for Assistant instructions that can be placed on the user's home screen on Android devices.