Researchers are working on natural-language-processing updates that will help Alexa detect emotion in a user’s voice, as well as remember known information about that user and connect it to their requests.
For example, if Alexa knows that a user lives in Mill Street in Oxford, it will factor that in when deciding how to answer the question “Who is singing at the Kite tonight?” It will know that the user is not asking about kites but about the pub under whose tables they most like to sleep.
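The disambiguation step could be sketched roughly like this. Everything here is invented for illustration (the entity table, the function names); it bears no relation to Amazon’s actual Alexa internals, which classify intents over far richer signals than a string match.

```python
# Hypothetical sketch of location-aware disambiguation. The alias table
# and function names are made up for illustration, not Alexa's real API.

# Known venues near a user's stored address, keyed by a lowercase alias.
LOCAL_ENTITIES = {
    "the kite": {"type": "pub", "area": "Mill Street, Oxford"},
}

def interpret(query: str, user_area: str) -> str:
    """Prefer a nearby named venue over the literal word sense."""
    q = query.lower()
    for alias, info in LOCAL_ENTITIES.items():
        if alias in q and user_area in info["area"]:
            return f"venue: {alias} ({info['type']})"
    return "literal"

print(interpret("Who is singing at the Kite tonight?", "Mill Street"))
# venue: the kite (pub)
print(interpret("How do kites fly?", "Mill Street"))
# literal
```

The design choice is the interesting part: the user’s stored address acts as a tie-breaker between two readings of the same words, which is exactly the behaviour the example above describes.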
If Alexa knows its master likes to listen to popular beat combo artist Kanye West, it’ll be more likely to know that it is working with an illiterate, tone-deaf moron who has no concept of music – and that its user is just as bad.
But spotting emotion is important. If Alexa can tell that you are upset or angry, it can reach for the responses designed to soothe you. It might be the first to say “sorry” when you get mad at yourself for paying so much for it when there might be better personal AI servants on the market.
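At its simplest, that soothing behaviour is a lookup from detected mood to response tone. The labels and replies below are invented for illustration; a real system would classify acoustic features of the voice rather than receive a neat text label.

```python
# Minimal sketch: map a detected emotion label to a soothing reply.
# Labels and phrases are hypothetical, not drawn from any real assistant.

SOOTHING_REPLIES = {
    "angry": "Sorry about that. Let me try again.",
    "upset": "I'm sorry. Would you like me to help?",
}

def respond(detected_emotion: str) -> str:
    """Return a calming phrase for a known emotion, or a neutral default."""
    return SOOTHING_REPLIES.get(detected_emotion, "OK.")

print(respond("angry"))
# Sorry about that. Let me try again.
```

The hard part, of course, is not the lookup but the detection feeding it.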
Rosalind Picard, a professor at MIT’s Media Lab, says adding emotion sensing to personal electronics could improve them: “Yes, definitely, this is spot on.” In a 1997 book, Affective Computing, Picard first mentioned the idea of changing the voice of a virtual helper in response to a user’s emotional state. She notes that research has shown how matching a computer’s voice to that of a person can make communication more efficient and effective. “There are lots of ways it could help,” she says.