AI that purports to read our feelings may enhance user experience but concerns over misuse and bias mean the field is fraught with potential dangers
It’s Wednesday evening and I’m at my kitchen table, scowling into my laptop as I pour all the bile I can muster into three little words: “I love you.”
My neighbours might assume I’m engaged in a melodramatic call to an ex-partner, or perhaps some kind of acting exercise, but I’m actually testing the limits of a new demo from Hume, a Manhattan-based startup that claims to have developed “the world’s first voice AI with emotional intelligence”.