Artificial Intelligence Detects 100% Of Asymptomatic COVID-19 Coughs

Scientists at the Massachusetts Institute of Technology developed an artificial intelligence model that can distinguish between a healthy cough and one that comes from an asymptomatic coronavirus patient. The differences are imperceptible to the human ear, but the AI accurately identified nearly 99% of coughs from people with COVID-19, including all of the coughs from individuals without symptoms. The model was trained on more than 200,000 recordings of coughs and spoken words, which the researchers call the “largest cough dataset that we know of.” The team said it’s working on incorporating the model into apps, and eventually smart speakers and other listening devices, so that people can consistently and conveniently be screened for coronavirus infection. This, the researchers say, could help prevent asymptomatic individuals from unknowingly spreading the virus to others. What’s more, the method would be free and would save you from having a cotton swab poked up your nose.

“The effective implementation of this group diagnostic tool could diminish the spread of the pandemic if everyone uses it before going to a classroom, a factory, or a restaurant,” study co-author Brian Subirana, a research scientist in MIT’s Auto-ID Laboratory, said in the release. “We think this shows that the way you produce sound changes when you have COVID, even if you’re asymptomatic,” Subirana added.

When the pandemic began, the MIT scientists thought it would be an interesting experiment to see whether an AI model they had invented to detect signs of Alzheimer’s disease could also detect COVID-19. The team felt confident the model could work because there is growing evidence that coronavirus patients experience symptoms similar to those associated with Alzheimer’s, such as neuromuscular impairment that affects the vocal cords.

“The sounds of talking and coughing are both influenced by the vocal cords and surrounding organs. This means that when you talk, part of your talking is like coughing, and vice versa,” Subirana said in the release. “It also means that things we easily derive from fluent speech, AI can pick up simply from coughs, including things like the person’s gender, mother tongue, or even emotional state. There’s in fact sentiment embedded in how you cough.”
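The release doesn’t say which acoustic features MIT’s model actually extracts, but a common way to capture these speech-like properties of a cough is with mel-frequency cepstral coefficients (MFCCs), which summarize the spectral shape imposed by the vocal cords and vocal tract. A minimal sketch using the librosa library, assuming a recording saved as a hypothetical file named cough.wav:

```python
import librosa
import numpy as np

# Load the recording; librosa resamples to 22,050 Hz by default.
y, sr = librosa.load("cough.wav")  # "cough.wav" is a placeholder path

# MFCCs summarize the spectral envelope shaped by the vocal cords
# and vocal tract, the same physiology that shapes speech.
mfccs = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)

# Average over time to get one fixed-length feature vector per cough.
features = np.mean(mfccs, axis=1)
print(features.shape)  # (13,)
```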

In April, the team collected 70,000 recordings of people forcibly coughing into their cell phones or laptops, which amounted to about 200,000 cough audio samples. About 2,500 of those were submitted by people with COVID-19. Participants also answered surveys about the symptoms they were experiencing, their COVID-19 diagnosis, gender, geographical location, and native language. After using a couple thousand recordings to “train” the AI, the team used 1,000 audio samples to test whether the model could tell a healthy cough from a sick one, even when the person was asymptomatic.
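The workflow the team describes, training on one set of recordings and holding out 1,000 samples for testing, is standard supervised-learning practice: the held-out set estimates how the model performs on coughs it has never heard. A minimal sketch of that split with scikit-learn, using random stand-in feature vectors rather than real cough audio:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in data: one 13-dimensional feature vector per cough
# (e.g., the MFCC averages from the earlier sketch) and a label,
# 1 for COVID-positive, 0 for healthy. These numbers are synthetic.
X = rng.normal(size=(5000, 13))
y = rng.integers(0, 2, size=5000)

# Hold out 1,000 samples the model never sees during training.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=1000, random_state=0
)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Accuracy on the held-out set estimates real-world performance.
print(f"held-out accuracy: {clf.score(X_test, y_test):.3f}")
```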

By listening for “vocal cord strength, sentiment, lung and respiratory performance, and muscular degradation” specific to COVID-19, the AI model identified 98.5% of coughs from coronavirus patients, and 100% of coughs from those who were asymptomatic.
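Figures like these read as sensitivity, the fraction of truly COVID-positive coughs the model catches. Given labeled test predictions, that is a one-line computation; a sketch with scikit-learn, where the labels and predictions are made-up toy values:

```python
from sklearn.metrics import recall_score

# 1 = COVID-positive cough, 0 = healthy cough (toy example).
y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 1, 0, 0]

# Sensitivity (recall on the positive class): of all truly positive
# coughs, how many did the model flag? Here 3 of 4 = 0.75.
print(recall_score(y_true, y_pred))  # 0.75
```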