A Canadian project aims to make voice assistants more accessible by having people with atypical speech offer recordings of their voices.


The Canadian Down Syndrome Society recently announced its latest program, Project Understood, which helps train Google speech technology to better understand people with Down syndrome.

“People with Down syndrome have atypical speech; their facial structures are different. Their tongues are bigger and thicker, which means they speak in a way that is not typical,” said Shelley Brown, chief strategy officer at FCB Toronto and leader of the Canadian Down Syndrome Society project. “That’s why they find speech technology really hard to use.”

SEE: Amazon Alexa: An insider’s guide (free PDF) (TechRepublic)

Digital voice assistants such as Amazon Alexa, Google Assistant and Apple Siri have become staples of everyday life, built into smartphones and smart home devices. Juniper Research projects that voice assistants will continue to grow in popularity worldwide, with an expected 8 billion in use by 2023.

However, these voice assistants may not work well for people with disabilities, the very population that could benefit most from the technology, Brown said.

“Speech technology can be so useful and important for people with Down syndrome. It can be the technology that makes the difference between living independently or not,” said Brown. “It is a very intuitive, easy way to access everything a computer can do, from an appointment reminder to directions on how to get somewhere or how to do something.”

Hey Google, how do we make voice assistants more accessible?

Once the Canadian Down Syndrome Society decided to pursue this mission of making voice assistants more accessible, officials realized they needed the support of a big-name company, Brown said.

“We did a lot of research and started calling a lot of different places,” said Brown. “When we called Google, they were so open, enthusiastic and receptive to the whole idea.”

The first step Google and the society took together was a test. “One of the questions we didn’t have the exact answer to was: are the speech patterns of people with Down syndrome comparable enough across individuals that the voice technology can learn from them?” asked Brown.

To find the answer, Project Understood recruited nine people with Down syndrome and asked them to record 1,700 words and sentences. The recordings were fed into a Google speech platform so researchers could see whether the technology could learn from repeated voice input, Brown said.
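The article doesn’t specify which platform or tooling the test used. As a rough, hypothetical sketch of what “feeding a recording into a Google speech platform” can look like, here is a transcription request against Google’s publicly available Cloud Speech-to-Text API; the filename and audio settings are illustrative assumptions:

```python
# Hypothetical sketch: Project Understood's internal pipeline is not public.
# This uses Google's public Cloud Speech-to-Text client library instead.
from google.cloud import speech

client = speech.SpeechClient()

# Placeholder filename for a participant's prompted recording.
with open("participant_recording.wav", "rb") as f:
    audio = speech.RecognitionAudio(content=f.read())

config = speech.RecognitionConfig(
    encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
    sample_rate_hertz=16000,  # assumed 16 kHz mono WAV
    language_code="en-US",
)

# Synchronous recognition: print what the model heard so it can be
# compared against the sentence the participant was asked to read.
response = client.recognize(config=config, audio=audio)
for result in response.results:
    print(result.alternatives[0].transcript)
```

Comparing the returned transcript with the prompted sentence, recording by recording, is what makes it possible to measure whether the model improves as more voices are added.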

The test returned an encouraging answer: yes. “Based on that first test with nine people, it became very clear that the technology could learn from them and that it just needed more data,” said Brown.

Project Understood is still at a very early stage of development. Currently, Google gets about one in three words wrong for someone living with Down syndrome. However, the system is highly adaptive, meaning its understanding improves as it gets more data, Brown said.
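For context, “one in three words wrong” corresponds to a word error rate (WER) of roughly 33%, the standard metric for speech recognition accuracy. Below is a minimal, self-contained sketch of how WER is computed, using one of the prompt sentences from later in this article; it illustrates the metric, not Project Understood’s actual evaluation code:

```python
# Word error rate: word-level edit distance (substitutions, insertions,
# deletions) divided by the number of words in the reference transcript.
def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.lower().split(), hypothesis.lower().split()
    # d[i][j] = edits needed to turn the first i reference words
    # into the first j hypothesis words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            substitution = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            deletion = d[i - 1][j] + 1
            insertion = d[i][j - 1] + 1
            d[i][j] = min(substitution, deletion, insertion)
    return d[len(ref)][len(hyp)] / len(ref)

# One substituted word out of six -> WER of about 0.17 (17%).
print(wer("the boy ran down the path", "the boy ran down a path"))
```

A system that misrecognizes one word in three would score around 0.33 on this metric; more training data from more voices is what pushes that number down.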

“What we are really looking to do is gather about a thousand voices from people with Down syndrome. The more data we can give the system, the smarter the system will be,” said Brown. “At the moment, nearly 300 participants are already in the program, and we will continue to attract more people with Down syndrome so that they can teach Google how to understand them.”

“Teaching Google to understand people living with Down syndrome has implications for people with other disabilities or other forms of atypical speech,” Brown said. “Although the first group of people who will benefit from (the project) are people with Down syndrome, their participation will help ensure that speech technology works better for many other people.”

People with Down syndrome can help the cause via this link. By contributing recordings of themselves saying certain words and phrases, people with Down syndrome can directly help improve the accuracy of Google’s speech recognition models, Brown said.

The press release offered some example sentences that participants may be asked to record, including:

  • The boy ran down the path
  • Flowers grow in a garden
  • Strawberry jam is sweet
  • I owe you a yo-yo today

“We have always tried to reframe how people think about people living with Down syndrome. They are not victims, they are not sick. They are engaged individuals with their own view of the world,” said Brown. “Google is one of the largest and smartest companies in the world, and someone who lives with Down syndrome is an expert in living with Down syndrome. This is a case where they can share their expertise and teach Google to be more useful, not only for people with Down syndrome, but for many other people as well.”

For more information, see “How to make your sites accessible to all users: 3 tips for business owners” on TechRepublic.

