According to Bloomberg, Amazon.com is developing a wearable voice-activated device capable of recognizing human emotions.
The device, worn on the wrist, is described as a health and wellness product in internal documents reviewed by Bloomberg. It is a collaboration between Lab126, the hardware development group behind the Fire tablet and the Echo smart speaker, and the Alexa voice software team.
Designed to work with a smartphone app, the device features microphones paired with software that can discern the wearer's emotional state from the sound of his or her voice. According to the documents Bloomberg reviewed, the technology may eventually be able to advise the wearer on how to interact more effectively with others.
It is not clear how far along the project is, or whether it will ever become a commercial device. Amazon gives its teams a great deal of freedom to experiment and build new products, some of which never make it to market. Work on the project, code-named Dylan, is ongoing, and a beta testing program is under way, although it is unclear whether the test includes the hardware, the emotion-detection software, or both.
Amazon declined to comment on the report.
Companies such as Microsoft, Google and IBM, among many others, are developing technologies designed to infer emotional states from images, audio and other inputs. Amazon has publicly discussed its desire to build a more lifelike voice assistant.
The technology could help the company gather insights for potential health products, or be used to better target advertising and product recommendations. The concept is likely to feed into the debate over the amount and type of personal data collected by technology giants, which already gather a wealth of information about their customers. Earlier this year, Bloomberg reported that Amazon has a team that listens to and annotates audio clips captured by the company's Echo line of voice-activated speakers.
A U.S. patent filed in 2017 describes a system in which software uses voice pattern analysis to determine how a person feels, discerning among "joy, anger, sorrow, sadness, fear, disgust, boredom, stress or other emotional states." The patent, made public last year, suggests that Amazon could use knowledge of a user's emotions to recommend products or otherwise tailor its responses.
A diagram attached to the patent illustrates how the technology could detect an abnormal emotional condition: it shows a woman sniffling and telling Alexa that she is hungry. The digital assistant, sensing that she has a cold, asks the woman if she would like a recipe for chicken soup.
A second patent granted to Amazon mentions a system that uses techniques that distinguish the wearer’s speech from background noise. Amazon documents reviewed by Bloomberg state that the wearable device will take advantage of this technology.
Amazon's work on a wearable device underscores its ambition to become a leading maker of cutting-edge voice recognition software and consumer electronics. The Echo line of smart speakers and the built-in Alexa voice software have popularized the use of voice commands in the home.
Emotion recognition could be used in wellness and mental health applications, enabling them to interact with people and offer timely information and suggestions.