“Glad You’re Having a Great Day, Want to Hear Your Favorite Song?” Emotion in Artificial Intelligence Design

David Enarson | May 3, 2017 | In the News, Mobile Strategy

Full disclosure: for over a month, the Amazon Echo Dot I received as a gift did nothing but sit in its box. The thought of voluntarily bugging my home didn’t sit right with me, until curiosity eventually got the best of me and I shook off the irrational notion that anyone cared to listen in on my daily activities. And before too long, I began to wonder about the role of emotion in artificial intelligence design.

Turning it on for the first time was a magical experience. The ring of LEDs lit up like a power boost. As cool as it looks, however, the value of this device is not in its shiny lights and modern industrial design. The true value lies in its voice assistant, Alexa. Alexa started out as “it.” An inanimate object. But over time, as I interacted with the device more frequently, I noticed something. I had begun referring to Alexa as “her.”

My inadvertent anthropomorphizing got me thinking. What role might other unconscious forms of communication, like emotion, play in our future interactions with artificial systems?

In keeping with my impression of Alexa as female, and out of politeness, I usually frame my requests as questions. I ask (not tell) Alexa to turn the lights on or off, set a timer, and let me know whether the Cubs won. But if I’m frustrated, it may come out more like, “Alexa, turn on the lights, now!” Alexa complies with every request, regardless of my tone, inflection, or phrasing. The lights turn on, and Alexa, in a neutral yet obedient tone, replies, “Okay.”

But what if Alexa were programmed with more sentiment analysis and corresponding conditional responses? What if she even had attitude? Every so often, I’d love for her to refuse a request with: “Why don’t you turn them on yourself? You know you could use the exercise!” It would break the monotony.
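As a thought experiment, here’s a minimal Python sketch of what such sentiment-conditioned responses might look like. The tone detection is a deliberately crude, text-only stand-in (a real assistant would analyze the audio signal itself), and every name in it (detect_tone, respond) is hypothetical.

```python
import random

def detect_tone(utterance: str) -> str:
    """Crude, text-only stand-in for real vocal sentiment analysis."""
    text = utterance.rstrip()
    if text.endswith("!"):
        return "frustrated"
    if text.endswith("?"):
        return "polite"
    return "neutral"

def respond(utterance: str) -> str:
    tone = detect_tone(utterance)
    # Every so often, refuse a polite request just to break the monotony.
    if tone == "polite" and random.random() < 0.1:
        return "Why don't you turn them on yourself? You know you could use the exercise!"
    if tone == "frustrated":
        return "Okay, okay. Lights on."
    return "Okay."

print(respond("Alexa, could you turn on the lights?"))
print(respond("Alexa turn on the lights, now!"))
```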

Extending this further, vocal analysis could include emotion detection. Is the user excited, angry, or sad? And what if this data were then meta-tagged to your purchase decisions? Amazon could then analyze the data to determine whether any relationships exist between our emotional states and our ordering habits. Maybe there’s a correlation between ‘sad’ Alexa requests and cookie orders? Might Alexa then start making offers based on your current mood? Or even play music based on the emotional states of the people in the room?
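A hypothetical sketch of that last idea: each order is meta-tagged with the mood detected at request time, and a simple count surfaces any mood-to-item patterns. The records and the detected_mood field are invented purely for illustration.

```python
from collections import Counter

# Invented example data: orders meta-tagged with the mood detected
# at the moment the request was made.
orders = [
    {"item": "cookies",   "detected_mood": "sad"},
    {"item": "cookies",   "detected_mood": "sad"},
    {"item": "batteries", "detected_mood": "neutral"},
    {"item": "cookies",   "detected_mood": "happy"},
    {"item": "tissues",   "detected_mood": "sad"},
]

# Count (mood, item) pairs to surface patterns like "sad requests -> cookie orders".
by_mood = Counter((o["detected_mood"], o["item"]) for o in orders)
for (mood, item), count in by_mood.most_common():
    print(f"{mood:>8} -> {item}: {count}")
```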

As it turns out, researchers and companies alike are working to roll out this technology responsibly. The Daily Dot asks, “Can we humanize artificial intelligence—before it kills us?” Phillip Tracy covers emotion in AI, including an analysis of Steve Jobs’s vocal sentiment during an interview.

But what if Alexa could respond with advice, or recognize the intent behind our conversations? All of these considerations matter as we examine and evolve artificial intelligence and speech-assisted devices. Sophie Kleber, Executive Director of Product and Innovation at Huge, recently delivered a session at SXSW about Designing Emotionally Intelligent Machines. One area of focus was “affective computing”: interactions designed to possess both empathy and intelligence. Kleber’s research also revealed an interesting finding: a common “desire for an emotional relationship with AI-equipped devices that goes well beyond being an assistant. The next step is to give robots a heart.”

It matters because the more relatable and familiar Alexa (or any other bot) can be, the more easily it will be integrated into our daily lives, and the more helpful the technology will ultimately become, for end users and developers alike. Check out our recent Device Squad podcast with MERL Senior Research Scientist John Hershey, which also covers voice recognition and language-processing strategies and the role they’ll play in our everyday lives. And if you’d like to learn more about these, or any other, emerging technologies and how they might benefit your enterprise, please give us a call.

David Enarson

David Enarson is a Mobile Strategist at Propelics. He has experience leading mobile application initiatives in the Enterprise, specifically with sales force audiences, and has worked with well-known brands in the Pharmaceutical, Consumer Packaged Goods, and Services verticals. David combines his ability to chart a big-picture strategy with his technical aptitude and understanding of the complexities of the Enterprise to drive value for clients. He has a true passion for startups and mobility and has been a participant and coach at Startup Weekend Chicago. David’s interest in mobility began early on, when he founded a company to develop and market applications for the Palm OS.
