Apple is training Siri to better understand stuttering users

In an article sharing new details about how the company trains voice assistants to handle the speech of users with speech impairments, Apple is looking at how to improve Siri so that it can better understand the needs of users who stutter.
According to an Apple spokesperson, the company has built a database of 28,000 audio clips provided by users who stutter, which will be used to train Siri and improve the speech recognition system's ability to understand users with speech impairments.
In addition to improving Siri's ability to understand users with speech impairments, Apple has also added a Hold to Talk feature for Siri, letting users control how long Siri listens to their voice, which prevents Siri from cutting off users who stutter before they finish speaking.
With the Type to Siri feature, first introduced in iOS 11, users can also interact with Siri without speaking at all.
Apple plans to publish a research paper this week on its efforts to improve Siri, which will provide more details about its work in this area.
It is reported that Google and Amazon are also working to train Google Assistant and Alexa to better understand users with speech impairments. Google is collecting speech data from users with speech disabilities, and in December Amazon launched an Alexa Fund initiative to build algorithms that recognize the unique speech patterns of such users.