On the left is an Echo, a clever appliance from Amazon whose name is Alexa. She is a wonderful listener - perhaps too good a one. Yesterday I read two stories which suggest you may want to cover Alexa's long ears if she is listening in your house.
First the primer:
When the Echo is on, by default she is listening. If you say “Alexa”, her rim turns blue (not shown) and she starts listening carefully. Whatever you say next goes into the cloud, where it gets interpreted. You might be asking for a song to be played, adding something to your shopping list, posing a question to resolve a bet, ordering something from Amazon or others, or invoking one of the many other services available through Alexa. The point here is that your actual request to “Alexa” stays permanently in the Amazon cloud unless you explicitly delete it – which you can do.
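The wake-word gating described above can be pictured with a toy sketch. This is not Amazon's actual firmware - just an illustration of the idea, with words standing in for audio: everything is examined locally, but only what follows the wake word is kept for upload.

```python
# Toy illustration of wake-word gating (a simplification, not Echo's
# real processing): speech before the wake word is heard locally and
# discarded; speech after it is what would be sent to the cloud.

WAKE_WORD = "alexa"

def gate_stream(words, wake_word=WAKE_WORD):
    """Return only the utterance that follows the wake word.

    Words before the wake word never leave the "device"; words after
    it are collected as if for upload and interpretation.
    """
    uploaded = []
    listening = False  # flips True when the rim would turn blue
    for word in words:
        if listening:
            uploaded.append(word)
        elif word.lower() == wake_word:
            listening = True
    return uploaded

# Casual conversation is dropped; only the request is captured.
speech = "we should order pizza alexa add pizza to my list".split()
print(gate_stream(speech))  # -> ['add', 'pizza', 'to', 'my', 'list']
```

If the wake word is never spoken, the function returns an empty list - which is the behavior Amazon claims for the device itself.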
Story #1, more amusing than scary, comes from Twitter friend @DonutShorts. A San Diego news station was airing a piece on a little girl in Texas who ordered a bunch of stuff from Alexa; her parents only found out when the packages started to arrive. One of the on-air people said: "I love the little girl, saying 'Alexa ordered me a dollhouse'." Many of the Echoes owned by viewers heard their name called and did what they were told: they ordered dollhouses, too, which then began arriving all over San Diego. It turns out that by default ordering without verification is turned on. However, you can either turn it off or require a verification PIN in the Alexa app.
If you use Alexa to control things like calling 911 or disabling your home security system, you want to be very sure that the command you give Alexa for this is distinctive enough that it won't be spoken coincidentally on the air or by a visitor after the word “Alexa”. You can also change the wakeup word from “Alexa” to something much less common (since “Alexa” has now become common) and not likely to be used accidentally.
Story #2, courtesy of old friend Edie, is from the LA Times:
“When Bentonville police found the body of Victor Collins inside James Andrew Bates’ home in November, they also discovered a house outfitted with a number of Internet of Things devices, including an Amazon Echo. The gadget is constantly listening for spoken commands, but according to the company only records and stores snippets of conversation following a ‘wake word’ — in the Echo’s case, ‘Alexa.’ Bentonville police say there’s ‘reason to believe that Amazon.com is in possession of records related to [their] investigation.’”
So far, according to the story, Amazon has refused to release whatever the police asked for. But we should assume that, if Amazon has what's being asked for and there's a valid warrant, they'll have to release it. To me that's no big deal if Echo is indeed recording only what comes after the wake word. The video clip accompanying the story reports Amazon's claim that this is the case and that Echo does NOT record except after the wake word – presumably during the period when her rim is blue, indicating that she is in full listen-and-record mode.
I believe that Amazon believes what they told the newspaper, although this is not spelled out in their user agreement for Echo. But it's also clear that Echo is listening whenever she is on, or she wouldn't be able to hear her name and go into full listen mode. So what if Echo is hacked so that she transmits whatever she hears? Scary thought.
Which brings me to why my Echo in the picture with this post has a red ring (it's really red; it's just my Droid that makes it look almost white). If you push the microphone button on top, the Echo is put into “no listen” mode. Now a cynic might ask why that mode can't be hacked as well; and perhaps it can. But the hardware engineering could be such (and I hope it is) that pushing this button physically disables the microphone in a way that software can't switch back on. Mary and I are now in the habit of using the button to tell Alexa to cover her ears when we're not talking to her.