It uses "machine learning to analyze energy and positivity in a customer's voice so they can better understand how they may sound to others, helping improve their communication and relationships," Amazon's press release for Halo reads.

To give an example, Amazon's chief medical officer Maulik Majmudar said Tone might give you feedback such as: "In the morning you sounded calm, delighted, and warm." According to Majmudar, Tone analyzes vocal qualities like your "pitch, intensity, tempo, and rhythm" to tell you how it thinks you sound to other people.

Experts that Business Insider spoke to are dubious that an algorithm could accurately analyze something as complex as human emotion, and they are also worried that Tone data could end up with third parties.

"I have my doubts that current technology is able to decipher the very complex human code of communication and the inner workings of emotion," said Dr Sandra Wachter, associate professor in AI ethics at the University of Oxford. "How we use our voice and language is greatly impacted by social expectation, culture and customs."
"Even if it's in aggregate and anonymous, it might not be something you want your watch to do," she said.

Chris Gilliard, an expert on surveillance and privacy at the Digital Pedagogy Lab, told Business Insider he found Amazon's privacy claims unconvincing.

"Amazon felt the heat when it was revealed that actual humans were listening to Alexa recordings, so this is their effort to short circuit that particular critique, but to say that these systems will be 'private' stretches the meaning of that word beyond recognition," he said.

Wachter said that if, as Amazon claims, an algorithm was capable of accurately analyzing the emotion in people's voices, it could pose a potential human rights problem.

"Our thoughts and emotions are protected under human rights law, for example the freedom of expression and the right to privacy," said Wachter. "Our emotions and thoughts are one of the most intimate and personal aspects of our personality."
By Isobel Asher Hamilton