Since the introduction of smart speakers, watchdog groups have warned of the dangers of these devices. Having an always-on microphone in your home is, they argue, equivalent to welcoming in a spy device. They have repeatedly cautioned that recordings of conversations held in the vicinity of these devices could fall into the wrong hands, whether through the company itself or through other malicious means.
This week, those warnings were brought back to light by a fascinating situation created by an Alexa-powered device. A couple from Oregon received a call from one of the husband's employees, telling them to disable their Echo because he had received a recording of a conversation from their home, delivered by Amazon. The husband did not believe the story until the employee mentioned that the couple had been discussing hardwood floors.
Following that, the couple, who had multiple Echo devices throughout their home, disconnected all of them. They then contacted Amazon about the issue, and an engineer reviewed their account logs and confirmed that the incident had indeed happened. The engineer, however, could not explain what had caused the bizarre and deeply inappropriate behavior. The couple then began sharing the story, including speaking to a local news station, KIRO 7.
After the story became public, Amazon responded to the incident. The company, in its FAQs, explains generally how Alexa's wake word works:
Amazon Echo, Echo Plus, and Echo Dot use on-device keyword spotting to detect the wake word. When these devices detect the wake word, they stream audio to the Cloud, including a fraction of a second of audio before the wake word.
With that information in mind, Amazon explained,
Echo woke up due to a word in background conversation sounding like "Alexa." Then, the subsequent conversation was heard as a "send message" request. At which point, Alexa said out loud "To whom?" At which point, the background conversation was interpreted as a name in the customer's contact list. Alexa then asked out loud, "[contact name], right?" Alexa then interpreted background conversation as "right." As unlikely as this string of events is, we are evaluating options to make this case even less likely.
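Amazon's explanation amounts to a four-turn dialog in which each turn accepted ambient speech as deliberate input, so one misheard word at every step was enough to send the message. The following is a purely hypothetical sketch of how such a chain can cascade; none of the function names, matching rules, or transcripts below come from Amazon, and the real pipeline is far more sophisticated. Each string stands in for the speech recognizer's best guess at that turn.

```python
def run_dialog(transcripts, contacts):
    """Hypothetical four-turn 'send message' flow, mirroring the
    chain of events Amazon described. Returns the contact a message
    would be sent to, or None if any turn fails to match."""
    # Turn 1: wake -- background speech (mis)heard as "Alexa"
    if "alexa" not in transcripts[0].lower():
        return None
    # Turn 2: intent -- subsequent speech heard as a "send message" request
    if "send message" not in transcripts[1].lower():
        return None
    # Turn 3: recipient -- any contact name occurring in ambient speech matches
    recipient = next(
        (c for c in contacts if c.lower() in transcripts[2].lower()), None
    )
    if recipient is None:
        return None
    # Turn 4: confirmation -- a stray "right" in conversation confirms the send
    if "right" not in transcripts[3].lower():
        return None
    return recipient  # at this point the message would actually be sent

# Four consecutive misinterpretations, as in the reported incident
# (transcripts are the recognizer's erroneous best guesses):
ambient = [
    "alexa",                         # misheard wake word
    "send message about the floors", # misheard intent
    "ask john about the hardwood",   # matches contact "John"
    "right over there",              # misheard confirmation
]
print(run_dialog(ambient, ["John", "Mary"]))  # prints "John"
```

The point of the sketch is that no single step is implausible; the failure is that four low-stakes guesses compound into one high-stakes action with no unambiguous confirmation required.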
Obviously, Alexa's trigger process was, in this case, far too easy to set off accidentally (assuming the stated sequence is accurate). Amazon has agreed to look into the issue and work to make accidental activations less likely in the future. The best way to limit this behavior would be to let users easily turn the messaging feature (along with others) off.
This will not be the last time we hear about an issue like this. These services are a growing market, with smart speakers featuring Alexa, Google Assistant, Cortana, Siri, and more on sale everywhere, even your local drug store. Smart appliances, including refrigerators, ovens, and thermostats, now carry the same capabilities. As these products proliferate, it is important for the companies behind them to be more open about their policies and more careful about what can be done with their platforms.