Project Understood Improves Google Accessibility
By 2020, 50% of all Internet searches are expected to be conducted by voice alone. To date, an estimated 52 million Google Home devices have been sold, and by 2023, just four years from now, an estimated 8 billion voice assistants are expected to be in use. Thermostats, ceiling fans, smart speakers and other voice-controlled devices quite clearly seem to be the way of the future.
For anyone with speech and language difficulties, however, this technology is currently not very accessible. As the world moves toward voice commands, these users need to be included so that they are not isolated or left behind. This is a concern for anyone with an unusual accent or atypical speech patterns, but it is far more pronounced for individuals with Down Syndrome. That’s where Project Understood comes in.
Speech Production in Down Syndrome
People with Down Syndrome face many challenges, and speech production is among the most noticeable with an obvious effect on daily life. Being able to communicate effectively is something most individuals take for granted, but those with Down Syndrome usually experience several issues that make such communication more challenging.
These issues include delayed growth, a high-arched palate, a smaller-than-average upper jaw, weak oral muscles and low tongue-muscle tone. Together, these make breathing more difficult and many speech sounds hard to pronounce and articulate clearly. Current studies suggest that Google Home misses around 30% of the words spoken by someone with Down Syndrome. Clearly, access needs to be improved.
Google’s Pioneering Project Understood
Happily, while Google’s technology still struggles to accommodate those with different speech patterns, the company is working to address the problem. In 2017, Google teamed up with the Canadian Down Syndrome Society to launch Project Understood.
The objective is to teach the company’s systems to better understand atypical speech patterns, including those associated with Down Syndrome. Many individuals with the condition have already donated their voices, recording various phrases and uploading them. The larger the database grows, the better Google’s algorithms will become at recognizing these different patterns of speech.
Although the project is still in its early stages, it is making excellent progress and will hopefully one day yield a database that improves voice command technology access for all users. In a world where everyone deserves to be understood and included, this work will only grow more important in the years ahead.