Be Afraid. Be Very Afraid.

Voice-activated digital assistants – like Amazon’s Alexa and Echo, Google’s Assistant, and Apple’s Siri – have become so popular that in a few years, more than half of U.S. households will have them.

That’s more than 70 million households.

That’s a lot of people who should be very, very afraid.


In early May, a husband and wife were at home in Portland, OR.  A home in which every room was wired with Amazon Echo voice-controlled speakers, used to control their home’s heat, lights and security.

They got a call from a business associate of the husband’s.  The caller was in Seattle, 176 miles away.

The business associate had received a recording of a conversation between the husband and wife.  The couple didn’t know they’d been recorded.  They didn’t know the recording had been sent somewhere.

Their trusty Echo had done this all on its own – recorded their private conversation, selected a recipient from their address book, and sent the recording.



Oh, yeah.

The wife contacted Amazon, whose spokesperson offered this explanation:

Echo woke up due to a word in background conversation sounding like “Alexa.”  Then, the subsequent conversation was heard as a “send message” request.  At which point, Alexa said out loud, “To whom?”  At which point, the background conversation was interpreted as a name in the customer’s contact list.  Alexa then asked out loud, “[Contact Name], right?”  Alexa then interpreted background conversation as “right.”

Poor little Echo just got all confused and misunderstood what it was hearing.  And it sent that private conversation not only to the contact in Seattle, but to Amazon as well.  If you use Google Assistant or Siri, those companies get copies of your conversations, too.

Because that’s how the systems work.

Did you know that?

The Amazon spokesperson went on to say, “As unlikely as this string of events is, we are evaluating options to make this case even less likely.”

Less likely means possible.  Possible means this can happen to anyone who uses this technology.

The article about the Portland couple was followed by a spate of articles on how to protect ourselves from the very technology that increasing numbers of us are welcoming into our homes:

“Protect Your Privacy With An Echo In the Room”
“After Amazon Echo Misfire, Ways To Protect Your Own Privacy”
“Smart Gadgets:  Ways To Minimize Privacy and Security Risks”

Are you listening, folks?

On second thought, you don’t have to listen.

Your devices are doing that for you.

Whether you want them to, or not.

