You are a mind reader, whether you know it or not. You can tell just by looking at a human face whether the person is concentrating, confused, interested or in agreement with you.
But people with autism often have difficulty reading these emotional cues — it’s one of the condition’s signature characteristics. Help could be on the way for autistic individuals, though: A novel computer-vision system developed at the Massachusetts Institute of Technology could do the mind reading for those who can’t.
The system’s software goes beyond tracking simple emotions like sadness and anger to estimate complex mental states like agreement, disagreement, thinking, confusion, concentration and interest. The goal is to put this mental-state inference engine on a wearable platform and use it to augment social interactions, said Rana el Kaliouby, a postdoctoral researcher at the Media Lab.
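To give a flavor of what such an inference engine does, here is a minimal sketch of one plausible approach: scoring observed facial cues against per-state weights and ranking the states. All feature names, weights, and the softmax scoring scheme are invented for illustration; the MIT system’s actual model is not described in this article.

```python
# Hypothetical mental-state inference step: facial cues (e.g. a head nod,
# a furrowed brow) are scored against hand-picked weights for each complex
# mental state, then ranked via a softmax. Everything here is an
# illustrative assumption, not the MIT system's real model.
from math import exp

# Invented weights: cue -> contribution to each mental state.
WEIGHTS = {
    "agreeing":      {"head_nod": 2.0, "smile": 1.0, "brow_furrow": -1.0},
    "disagreeing":   {"head_shake": 2.0, "brow_furrow": 1.0, "smile": -1.0},
    "concentrating": {"brow_furrow": 1.5, "gaze_fixed": 1.5},
}

def infer_state(features):
    """Rank mental states by softmax over weighted sums of observed cues."""
    scores = {
        state: sum(w.get(cue, 0.0) * strength
                   for cue, strength in features.items())
        for state, w in WEIGHTS.items()
    }
    total = sum(exp(s) for s in scores.values())
    return sorted(((exp(s) / total, state) for state, s in scores.items()),
                  reverse=True)

# Example: strong nodding with a slight smile reads as "agreeing".
ranked = infer_state({"head_nod": 1.0, "smile": 0.5})
print(ranked[0][1])  # -> agreeing
```

A real system would of course learn such weights from labeled video rather than hard-coding them, and would track cues over time instead of scoring a single frame.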
“This is only possible now because of the progress made in affective computing, real-time machine perception and wearable technologies,” she said.
And wouldn’t it be wonderful if our computers could read our emotions and respond to our frustrations?