Sputnik International

‘Alarming’: Research Identifies Over 1,000 Phrases That Trick, Activate Voice Assistants

© RUB Press Release/Maximilian Golla: Using light sensors, the researchers registered when the indicator LEDs of the speakers lit up.
A new study of voice assistants has produced a list of more than 1,000 word sequences that can inadvertently trigger the devices to begin listening in on nearby conversations, invading users’ privacy.

While owners of voice assistants have been repeatedly reassured that their devices will remain inactive until called upon, new research conducted by Germany’s Ruhr-Universität Bochum (RUB) and the Bochum Max Planck Institute for Cyber Security and Privacy has identified over 1,000 words and phrases that inadvertently activate the machines.

The list of terms includes words in English, German and Chinese.

For example, in the video below, the character Phil Dunphy of the ABC sitcom “Modern Family” is overheard saying “Hey Jerry” during an episode of the show, which triggers a device running Apple’s Siri assistant to activate.

The researchers explicitly detail that the television character’s greeting was “confused with ‘Hey Siri.’”

Three additional examples were posted by researchers, including one in which a Google Nest device confuses a character on television saying “OK, who is reading” with “OK Google.”

It’s worth noting that, despite the publicly available data and examples, the study’s full paper has yet to be formally published, according to Ars Technica.

However, a brief write-up published by authors Lea Schönherr, Maximilian Golla, Jan Wiele, Thorsten Eisenhofer, Dorothea Kolossa and Thorsten Holz shows that the devices can intrude on consumers’ private conversations.

“The devices are intentionally programmed in a somewhat forgiving manner, because they are supposed to be able to understand their humans. Therefore, they are more likely to start up once too often rather than not at all,” Kolossa said in the RUB news release on the research.
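Kolossa’s point about “forgiving” programming can be pictured as a trade-off between a detection threshold and false activations. The sketch below is a deliberately simplified illustration under stated assumptions: real assistants run neural acoustic models on audio, not text-string comparisons, and the names and thresholds here are invented. Still, it shows why a lenient threshold lets a near-miss like “Hey Jerry” wake a device listening for “Hey Siri,” while a stricter one would not.

```python
from difflib import SequenceMatcher

# Hypothetical values for illustration only; not from the study.
WAKE_WORD = "hey siri"
LENIENT_THRESHOLD = 0.5   # "forgiving": fires on near-misses
STRICT_THRESHOLD = 0.9    # strict: risks ignoring real commands

def similarity(heard: str, wake_word: str = WAKE_WORD) -> float:
    """Return a 0..1 similarity between a heard phrase and the wake word."""
    return SequenceMatcher(None, heard.lower(), wake_word).ratio()

def activates(heard: str, threshold: float) -> bool:
    """Simulate the wake decision: does the phrase clear the threshold?"""
    return similarity(heard) >= threshold

if __name__ == "__main__":
    for phrase in ("hey siri", "hey jerry", "good morning"):
        wakes = activates(phrase, LENIENT_THRESHOLD)
        print(f"{phrase!r}: score {similarity(phrase):.2f} ->",
              "wakes" if wakes else "stays asleep")
```

With the lenient threshold, “hey jerry” scores high enough to wake the toy detector while an unrelated phrase does not; raising the threshold to the strict value would suppress the false activation, mirroring the balance the researchers describe.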

While voice assistants are marketed as devices that remain inactive until called upon, the best practice for maintaining one’s privacy may be to unplug, turn off or disable the machines until they are needed, or to refrain from enabling voice assistant features altogether.

“From a privacy point of view, this is of course alarming, because sometimes very private conversations can end up with strangers,” Holz said. “From an engineering point of view, however, this approach is quite understandable, because the systems can only be improved using such data. The manufacturers have to strike a balance between data protection and technical optimization.”
