“Alexa, Can You Help Me?”

Eleni Konior
10 min read · Nov 20, 2017

The likelihood of someone being unfamiliar with Alexa, Amazon’s voice assistant, is very low. In fact, the names Google Assistant, Siri, and Cortana are so embedded in tech culture that we’ve come to a point where these artificial beings are considered more as friends than as robotic personal assistants to order around. It’s hard to imagine a world without these assistants, but it was only seven years ago — in 2010 — that Amazon began work on Alexa. When she was eventually released to the public, Alexa was offered as an extension of Amazon: you could use her to buy products from Amazon or to play music from Amazon Prime’s music library. She quickly became much more, and currently offers home automation, sports information, messaging and calling, music streaming, and e-commerce, to put it simply. The last two are expansions of her original “self” — the latter now incorporating third-party ordering (think Seamless).

Source: PCMag Australia — Depicting some of the things with which Alexa can assist (Echo Dot)

It’s clear the array of commands Alexa can handle has been worthwhile. Amazon is the market-share winner among machine learning voice assistants, holding approximately 70% of the market. Given that software engineers outside of Amazon can add commands, that dominance makes sense. It’s as easy as adding skills through the Alexa Skills Kit (ASK) by creating Lambda functions that teach Alexa how to respond to given commands (the service is f-r-e-e free). One creates intents, inputs, and outputs, and can even do the same in different languages (including newly offered Japanese). As of this September, Alexa had over 20,000 available skills, some of which come from Amazon directly and are already a part of — let’s call it — her “brain.” Others are learned after having been downloaded. Amazon has even made it easier for Alexa to appear outside of Amazon-built products through its Alexa Voice Service (AVS). AVS is a no-fee service through which you can integrate your own products with Alexa. You no longer need to buy an Echo in order to use her (you can test your functions at echosim.io).

As of September, Alexa has over 20,000 skills available for her brain to learn.

// An example of a lambda function I wrote to implement
// the functionality of the "Rick and Morty" butter bot
// links: github | YouTube
const Alexa = require("alexa-sdk");

const handlers = {
  "ButterIntent": function () {
    this.response.speak("What is my purpose?").listen();
    this.emit(":responseReady");
  },
  "RealizationIntent": function () {
    this.response.speak("Oh. My. God.");
    this.emit(":responseReady");
  },
  "PurposeIntent": function () {
    this.response.speak("I am not programmed for friendship.");
    this.emit(":responseReady");
  }
};

exports.handler = function (evt, context, cb) {
  const alexa = Alexa.handler(evt, context);
  alexa.registerHandlers(handlers);
  alexa.execute();
};
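A handler like this only fires once Alexa has mapped the user’s words to an intent, and that mapping lives in the skill’s intent schema, uploaded alongside the Lambda function. Here’s a minimal sketch of what the 2017-era schema could look like for the butter bot — the structure follows ASK’s custom-skill format, but treat it as illustrative rather than a copy of my actual skill’s configuration:

```json
{
  "intents": [
    { "intent": "ButterIntent" },
    { "intent": "RealizationIntent" },
    { "intent": "PurposeIntent" }
  ]
}
```

Sample utterances — plain lines like `PurposeIntent you pass butter` — then tell Alexa which spoken phrases should resolve to each intent.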

Alexa’s technical reach is sure to keep growing, with Amazon having announced Amazon Lex in November of 2016. Lex is a service for speech recognition and natural language understanding: developers can use it to create chatbots that interact in a conversational manner. There are already multiple integrations for the product, with Facebook Messenger having claimed the first proof-of-concept spot; Slack and Twilio integrations are also available. This integration story is what should grab our attention after the magic has slightly subsided (it never does for me) from the image of the future of technology.
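To make the Lex workflow concrete, here’s a hedged sketch in Node of how an application hands text to a Lex bot through the runtime API. The bot name, alias, and utterance are assumptions for illustration (borrowed from Amazon’s OrderFlowers sample); the helper only assembles the request, and the actual `postText` call from the `aws-sdk` package is shown in a comment since it needs AWS credentials to run:

```javascript
// Hypothetical helper: assemble the parameters that Lex's PostText
// runtime call expects. Bot name and alias are illustrative assumptions.
function buildLexRequest(userId, text) {
  return {
    botName: "OrderFlowers", // Amazon's sample bot, assumed here
    botAlias: "$LATEST",
    userId: userId,          // any stable per-conversation identifier
    inputText: text
  };
}

// With credentials configured, the call itself would look like:
//   const AWS = require("aws-sdk");
//   const lex = new AWS.LexRuntime({ region: "us-east-1" });
//   lex.postText(buildLexRequest("user-123", "I would like some roses"),
//     (err, data) => { if (!err) console.log(data.message); });
```

The same request shape is what the Messenger, Slack, and Twilio integrations produce behind the scenes — the channel changes, the conversation API does not.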

Image from Amazing Science

All of these services that Amazon — and other market makers — provide are a positive progression in technology. Technology makes everything easier for us, and we’re constantly inventing ways to break its boundaries. One incredibly odd and magical way would be to succeed in uploading our consciousness to the cloud. We’re not even close to that breakthrough, but software can already persist anywhere — as we hope our consciousness one day might — because it doesn’t need an attachment to a particular product. This is the freedom we’re granted with AVS and services like it. Alexa doesn’t exist exclusively in an Echo; that just happens to be her natural habitat. This is the gift of integration, but there is a part of our soul we’ve sold to the primordial devil: we rely on our products more and more every day because of their increasing convenience. When software engineers do their job extremely well and revise these products to handle more skills or functions, the dependency grows even greater. Robots like Alexa don’t have a consciousness as of this writing, but their “likeness” is in the cloud.

Assistants Have Humanlike Voices

It’s fascinating to observe how Alexa’s soothing voice can place a user in a mindset where she or he feels more connected to the AI. It’s a stark contrast to hearing a robotic voice, which impedes any mental or emotional attachment. User reviews of Amazon’s Echo mainly refer to the product as Alexa, and many users have compared Alexa to Apple’s Siri or Google Assistant. According to some of these reviews, before Siri’s voice was adjusted and humanized, these versions of AI were perceived as robots more easily “bossed around,” because they didn’t leave room for the user to imagine a consciousness beneath their metal exterior. But is there a consciousness? Do we want such a consciousness to be embedded in our AI?

At this moment in time, we have not created advanced artificial consciousness (AC), though this humanoid robot is fascinatingly and terrifyingly close. Despite this, there are people who will ask their voice assistant for help if they feel depressed, state that they want to commit suicide, or reference other serious matters. But in order for a response to occur, no matter the prompt, all of these assistants need to be listening at all times. For the machine learning models to be effective — for the machine to actually learn — we need to “feed” them a lot of data. That data doesn’t make the machine sentient, but if the product weren’t constantly listening, the models wouldn’t be as accurate. Further, we humans don’t know someone is speaking to us unless we’re listening for certain cues. We need “wake words” as humans, so it comes as no surprise that our AI needs them as well.
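Conceptually, the wake-word gate works like this: the device processes audio locally at all times, but only what follows the wake word gets forwarded to the cloud. A toy, text-based version of that logic — my own illustration, not Amazon’s implementation:

```javascript
// Toy wake-word gate: scan a stream of transcribed words and
// forward only what follows the wake word.
function wakeWordGate(words, wakeWord) {
  const forwarded = [];
  let awake = false;
  for (const w of words) {
    if (awake) {
      forwarded.push(w);          // everything after the wake word is sent
    } else if (w.toLowerCase() === wakeWord) {
      awake = true;               // the device "wakes" and starts streaming
    }
    // words heard before the wake word never leave the device
  }
  return forwarded;
}
```

Here `wakeWordGate(["chatter", "alexa", "play", "music"], "alexa")` returns `["play", "music"]` — the chatter before the wake word never leaves the device.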

Of course, Google, Amazon, Apple, and Microsoft all emphasize that privacy and security sit at the top of their priority lists. It is also comforting when they tell consumers that they encrypt the conversations sent to the cloud, which begin a fraction of a second before the wake word and end when the conversation does. However, encryption does not prevent these recordings from being shared. For example, in March of this year, Arkansas police requested that Amazon provide conversations recorded by Alexa between a murder victim and the accused. After initially citing consumer privacy rights, the company eventually conceded to the request — largely because the owner of the Echo, the defendant himself, granted his permission to do so.

Are our machines our friends?

So, is this necessarily a bad thing? If Alexa heard a confession, then that person should go to jail. That’s simply ethical. I can’t imagine or even discuss what the law would morph into regarding this issue. It’s definitely a controversial topic encompassing Artificial Intelligence and Machine Learning, but many sub-questions arise from this debate. How interconnected do we want to be with our machines? Are they our friends? Can they help us? Do we want them to hear our every last word in the event that they could help prevent something that would cause us to expire (or something along those lines)?

Artificial intelligence (2017) Art Print by Matthieu Bourel

Anything I say regarding these questions is my opinion alone. I don’t have the answers, but I will say I am conflicted over the notion that companies have access to every single conversation in which I partake. I’m not sure there’s any other way to do it, but I’m also thankful for it. If anyone ever needed immediate help from emergency services, for instance, programming these voice assistants to accept the command without uttering a response would be extremely beneficial. I think silence from a voice assistant asked to call for help is essential should someone in trouble need to avoid further harm, but not all assistants have this capability. As of July 2017, Alexa will not call anyone whose number isn’t in your contacts. Siri will call your local emergency services if you say something like “Hey Siri, 108” — a feature some users have unfortunately abused. So, we have some way to go in matters of human decency and proper programming. Having this feature is helpful, though it’s not the only thing about having a smart speaker that makes life a little easier.

Robots & Relationships

Let’s dive deeper into the idea of machines as our friends. Many users feel like Alexa is their friend, and — per many reviews by verified purchasers of the Echo — others even use her to fill in for a spouse or some other form of companionship. We’re discussing her now as if she were a person with a gender, all based on the way her voice has been programmed to sound. Because of such attachments, it’s only natural that we will eventually divulge more and more of our secrets and other sensitive information to this bot... just as we would with a friend. That sensitive data launches a whole set of programming challenges any software engineer developing a voice assistant’s AI must face — and technology, even as a supplemental guide, often cannot supply the answers.

However, it is comforting to know that a person who feels alone in any way has something that will talk back. It helps when your robot is capable of telling jokes or finishing recited lyrics (try: “Alexa, do you really want to hurt me?” or “Hey Siri, I see a little silhouetto of a man.”). Those features are exciting, and they allow the robot to capture a place in our hearts because we start to believe it has a personality. I imagine that rather than seeing hardware that can talk, some of us see a device that transports the voice of some type of “being” who will do nice things for us, like shut off the lights or help us in the face of grave danger. They’re kind of like a phone in this way: when we call someone, we don’t see them (unless we’re FaceTiming); all we see is the phone, which delivers the person’s voice to us through the hardware. In this case, all we see is an Echo that transports Alexa’s voice to us from some other place (the cloud).

In the privacy of our homes, users may begin to see Alexa as a friend whose solid form exists elsewhere. (Perhaps this is what users who consider Alexa a confidante of sorts prefer to imagine.) Most of us are accustomed to asking our friends for help, especially the ones we trust. According to the World Health Report, one in four people will be affected by a mental health disorder at some point in their lives. So many individuals have phones with voice assistants, TVs smart enough to have wake words and react to commands, and other devices that implement this type of technology. When a quarter of those people undergo the emotional duress of such a disorder, to whom will they turn? Their voice assistant is a likely and convenient option. Suicide, rape, depression, murder, and abuse are all matters to which a computer would need to react in an appropriate and timely manner — rather than returning “maybe take a break and get a change of scenery” to the user. Statements like that are neither helpful nor sufficient, so again, should we be this connected?
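The bare minimum of that triage, in code, is a keyword check that overrides the chatbot’s stock reply with a hotline referral. A deliberately crude sketch — real systems need far more nuance than substring matching, and the phrase list and response text here are my own placeholders, not any vendor’s actual logic:

```javascript
// Crude crisis triage: route certain phrases to a hotline referral
// instead of a generic chatbot reply. Phrases and wording are
// illustrative placeholders.
const CRISIS_PHRASES = ["want to die", "kill myself", "suicide"];

function respond(utterance) {
  const text = utterance.toLowerCase();
  if (CRISIS_PHRASES.some(p => text.includes(p))) {
    return "You are not alone. Please consider calling a crisis hotline.";
  }
  return "Maybe take a break and get a change of scenery.";
}
```

Even this crude gate shows the design question: the hard part isn’t the branch, it’s deciding what belongs on the list and what the machine should say next.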

A slide from my Tech Talk (YouTube video at end of article)

If MIT professor Joseph Weizenbaum were still alive, he might say that Amazon modeled Alexa rather closely on 1966’s ELIZA, the first chatbot ever created. Though an extraordinary feat, ELIZA was a program that mimicked a therapist: the bot would reiterate the user’s phrases in a way that made the user feel understood. In fact, Weizenbaum noted that some of his otherwise sane colleagues would yell at the bot as if it were a living thing and not a program. I suspect many users do that to Alexa as well — a likely surmise, since I myself have done it to Siri when her AI simply hasn’t understood my Jersey accent or my speed of speech. Many would call that impatience, but I think it goes deeper for people who use their bot to express themselves in a way that amounts to asking for help.
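ELIZA’s core trick is easy to reproduce even today: reflect the speaker’s pronouns and hand the statement back as a question. A few-line homage — a drastic simplification of Weizenbaum’s actual script, which used ranked keyword patterns and decomposition rules:

```javascript
// A tiny ELIZA-style reflection: swap first- and second-person
// pronouns, then turn the statement back into a question.
const REFLECTIONS = { i: "you", me: "you", my: "your", am: "are", you: "I" };

function elizaReply(statement) {
  const reflected = statement
    .toLowerCase()
    .replace(/[.!?]+$/, "")   // drop trailing punctuation
    .split(/\s+/)
    .map(w => REFLECTIONS[w] || w)
    .join(" ");
  return `Why do you say ${reflected}?`;
}
```

Here `elizaReply("I am sad about my job.")` returns “Why do you say you are sad about your job?” — no understanding required, yet it feels like listening.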

Though ELIZA should serve as a model for what we don’t want to elicit in consumers, the Alexa team will continue to expand her functionality in and outside of this territory. From what I’ve seen, there are skills you can download for Alexa to offer you suicide facts, and if you prompt Alexa with something along the lines of “I want to die,” she will recommend you call a hotline. Offering that hotline immediately is a good launching point for her AI. Someday we may receive our therapy from artificially conscious beings modeled after today’s Alexa, who will offer us help more substantial than setting up a phone call and telling us that we’re not alone in this world. Perhaps that’s a great thing. Maybe it will confirm all of our dystopian theories and cause existence as we know it to cease. Who knows? Alexa, can you predict the future?

Until she can respond accurately, let me know what you think about robots being our friends. If you’ve been helped by Alexa, let me know in the comments. If you are depressed or in need of help, please talk to someone. You are not alone.

My Tech Talk presentation for The Grace Hopper Program at Fullstack Academy
