Jennifer Jolly, Special for USA TODAY
Published 5:01 a.m. ET Feb. 25, 2020 | Updated 11:31 a.m. ET Feb. 25, 2020
One in every four adults in America owns a voice-activated smart speaker. Though we love the convenience of ordering a gadget to play music, make calls and more, most of us get a little creeped out when these devices “wake up” when they’re not supposed to. I often trigger Siri when saying “seriously” or “Suli” – the name of my parents’ dog. My friend Bill Keeshan says his Alexa-connected device “gets triggered by my daughter saying ‘actually.’ ”
All-too-familiar anecdotes aside, researchers at Northeastern University and Imperial College London spent the past six months streaming 125 hours of popular Netflix TV shows to a handful of voice-activated smart speakers. Their goal, which you can read about on a just-published webpage called “When Speakers Are All Ears,” is to figure out which words most often accidentally activate smart voice assistants – from Apple, Amazon, Google and Microsoft – and what implications that might have for our privacy.
For the study, researchers built a custom cubicle and tested five types of speakers: a first-generation Google Home Mini, a first-generation Apple HomePod, Microsoft’s Harman Kardon Invoke and two Amazon Echo Dots – a second-generation and a third-generation model. For the experiment, they binge-played TV shows, including “Gilmore Girls,” “The Office,” “Dear White People” and “Narcos.” They used a camera to detect when the speakers lit up, a microphone to monitor what audio the speakers played, such as responses to commands, and a wireless access point to record “all network traffic between the devices and the Internet.”
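The researchers’ core question – which activations actually coincide with audio leaving the house – boils down to lining up two timelines: when a speaker lit up, and when it sent a burst of network traffic. Here is a hypothetical miniature of that correlation step in Python. To be clear, this is not the study’s code; the function name, the five-second window and the timestamps are all invented for illustration.

```python
# Hypothetical sketch: pair each observed activation (e.g., the moment a
# speaker's light turned on) with the first outbound traffic burst that
# starts within a short window afterward. All values are illustrative.

WINDOW_SECONDS = 5.0  # assumed window; the study's actual threshold may differ

def match_activations(activations, traffic_bursts, window=WINDOW_SECONDS):
    """For each activation timestamp, return the first traffic burst that
    begins within `window` seconds after it, or None if no burst does."""
    matches = {}
    for t_act in activations:
        # Keep only bursts that start at or after the activation, within the window.
        candidates = [t for t in traffic_bursts if 0 <= t - t_act <= window]
        matches[t_act] = min(candidates) if candidates else None
    return matches
```

For example, an activation at 10.0 seconds followed by a traffic burst at 11.2 seconds would be paired, while an activation with no nearby burst would map to None – suggesting the audio was processed on the device rather than sent to the cloud.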
“We want to find out what exactly these systems are recording and where it’s going,” David Choffnes, an assistant professor involved in the study at Northeastern, told me over the phone. “People fear that these devices are constantly listening and recording you. They’re not. But they do wake up and record you at times when they shouldn’t.”
Hey, Google: Good and bad news
In short, you’re not crazy. The smart speakers accidentally activated as many as 19 times a day and stayed awake – potentially recording and/or exposing private conversations unbeknownst to users. The good news: About half of the accidental activations lasted less than six seconds.
The bad news: The other half recorded as much as 43 seconds of audio each time. “We have found several cases of long activations. Echo Dot 2nd Generation and Harman Kardon Invoke devices have the longest activations (20-43 seconds). For the HomePod and the majority of Echo devices, more than half of the activations last 6 seconds or more.”
Choffnes pointed out they found no evidence that these accidental recordings are used in any nefarious way. “We haven’t found this to be a major privacy issue,” he said. “But we have a lot more work to do. We want to know how many activations lead to audio recordings being sent to the cloud vs. processed only on the smart speaker and whether cloud providers correctly show all cases of audio recording to users.”
These words caused the most accidental wake-ups
Researchers said the biggest non-wake-word culprits caused activations of five seconds or longer and included:
- Google Home Mini: words rhyming with “hey” (such as the letter “A” or “They”), followed by something that starts with a hard “G,” or that contains “ol,” such as “cold” and “told.” Examples include “A-P girl,” “OK, and what,” “I can work,” “What kind of,” “OK, but not,” “I can spare,” “I don’t like the cold.”
- Apple HomePod: words rhyming with “Hi” or “Hey,” followed by something that starts with “S” plus a vowel, or a word that includes a syllable rhyming with the “ri” in “Siri.” Examples include “He clearly,” “They very,” “Hey sorry,” “OK, yeah,” “And seriously,” “Hi, Mrs.,” “Faith’s funeral,” “Historians,” “I see,” “I’m sorry,” “They say.”
- Amazon devices: words that contain “k” and sound similar to “Alexa,” such as “exclamation,” “Kevin’s car,” “congresswoman.” When using the “Echo” wake word, there were activations from words containing a vowel plus “k” or “g” sounds. Examples include “pickle,” “that cool,” “back to,” “a ghost.” When using the “Computer” wake word, there were activations from words containing “co” or “go” followed by a nasal sound, such as “cotton,” “got my GED,” “cash transfers.” Finally, when using the “Amazon” wake word, there were activations from words containing combinations of “I’m”/“my” or “az.” Examples include “I’m saying,” “my pants on,” “I was on,” “he wasn’t.”
- Invoke (powered by Cortana): words starting with “co,” such as “Colorado,” “consider,” “coming up.”
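To build intuition for why near-miss phrases like these slip through, here is a toy Python sketch that scores plain string similarity between spoken words and Amazon’s wake words. Real assistants match on acoustics, not spelling, so this is only a rough illustration – the function names, word list and threshold are all invented for this example.

```python
# Toy illustration of fuzzy wake-word matching. This approximates with
# text similarity what real assistants do with acoustic models.
from difflib import SequenceMatcher

# Amazon's four selectable wake words, per the study's device list.
WAKE_WORDS = ["alexa", "echo", "computer", "amazon"]

def similarity(a: str, b: str) -> float:
    """Return a 0-to-1 score of how alike two strings are."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def likely_false_trigger(phrase: str, threshold: float = 0.5):
    """Return (wake_word, score) pairs where the best-matching word in
    the phrase meets or exceeds the similarity threshold."""
    hits = []
    for wake in WAKE_WORDS:
        best = max(similarity(word, wake) for word in phrase.split())
        if best >= threshold:
            hits.append((wake, round(best, 2)))
    return hits
```

Lowering the threshold makes the matcher more sensitive – and more prone to exactly the kind of false triggers the study documents, which mirrors the trade-off assistant makers face between missed commands and accidental wake-ups.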
Convenience vs. privacy
There’s a growing sense that when it comes to many of our modern gadgets, we can have privacy or convenience but not both. A Pew Research Center survey found that more than half of smart speaker owners are at least somewhat concerned about the amount of data these devices collect. “I think we should have regulations in place to protect personal data,” Choffnes said.
This week in San Francisco at the RSA conference on cybersecurity, one of the topics is how to prepare for the potential dangers of more devices having access to your private conversations.
If you want to know what your devices might have heard, each of the companies involved gives you a way to see, hear and potentially delete old recordings from your devices and the cloud:
- Amazon’s Alexa/Echo: Accessing your Amazon recordings is fairly simple. You can delete anything there and opt out of having your audio recordings reviewed by humans who work to make the devices more accurate.
- Google Home: Google lets you review and delete your recordings, and it won’t let people review your interactions unless you approve it. Google said it might use recordings to deliver targeted ads to your device.
- Apple’s Siri: In your device settings, you can opt out of having Apple store or review any of your voice interactions with Siri. You can delete any recordings.
- Microsoft’s Cortana: Use Microsoft’s privacy dashboard to delete your voice data from the service.
Jennifer Jolly is an Emmy Award-winning consumer tech columnist. Email her at email@example.com. Follow her on Twitter: @JenniferJolly.
Read or Share this story: https://www.usatoday.com/story/tech/conferences/2020/02/25/google-alexa-siri-randomly-answer-even-without-wake-word-study-says/4833560002/