How AI Can Infer Human Emotions (oreilly.com) 25
An anonymous reader quotes O'Reilly's interview with the CEO of Affectiva, an emotion-measurement technology company that grew out of MIT's Media Lab.
We can mine Twitter, for example, for text sentiment, but that only gets us so far. About 35-40% of emotional meaning is conveyed in tone of voice -- how you say something -- and the remaining 50-60% is read through facial expressions and gestures. Technology that reads your emotional state, for example by combining facial and vocal expressions, represents the emotion AI space. These channels are the subconscious, natural way we communicate emotion; they are nonverbal and complement our language... Facial expressions and speech actually deal more with the subconscious, and are more unbiased and unfiltered expressions of emotion...
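The "combining facial and vocal expressions" idea is typically done as late fusion: run an independent detector per modality, then blend their per-emotion scores. Here is a minimal sketch of that; the weights loosely echo the rough voice/face split quoted above and are purely illustrative, not Affectiva's actual model or numbers.

```python
# Late-fusion sketch: each modality's detector emits per-emotion
# probabilities; a weighted average combines them into one estimate.
# Weights and scores below are hypothetical.

def fuse(face, voice, w_face=0.6, w_voice=0.4):
    """Blend per-emotion scores from two independent detectors."""
    return {e: w_face * face[e] + w_voice * voice[e] for e in face}

# Hypothetical outputs from a face model and a voice-prosody model
# for the same moment in a video clip.
face_scores = {"joy": 0.7, "anger": 0.3}
voice_scores = {"joy": 0.4, "anger": 0.6}

fused = fuse(face_scores, voice_scores)
print(max(fused, key=fused.get))  # face evidence dominates here -> "joy"
```

The face weight is larger simply because, per the figures quoted in the interview, facial cues carry more of the signal than vocal tone; a real system would learn these weights from data.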
Rather than encoding specific rules that depict when a person is making a specific expression, we instead focus our attention on building intelligent algorithms that can be trained to recognize expressions. Through our partnerships across the globe, we have amassed an enormous emotional database from people driving cars, watching media content, etc. A portion of the data is then passed on to our labeling team, who are certified in the Facial Action Coding System...we have gathered 5,313,751 face videos, for a total of 38,944 hours of data, representing nearly two billion facial frames analyzed.
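The distinction drawn above -- learn expressions from FACS-labeled examples rather than hand-coding rules like "smile = cheek raiser + lip corner pull" -- can be sketched with a toy nearest-centroid classifier over facial action unit (AU) intensities. Everything here (the AU vectors, the labels, the classifier choice) is hypothetical and stands in for whatever models Affectiva actually trains.

```python
# Toy supervised learner: given (AU-intensity vector, label) pairs
# produced by human FACS coders, compute a centroid per expression
# and classify new frames by nearest centroid. Illustrative only.
import math
from collections import defaultdict

def train_centroids(samples):
    """samples: list of (au_vector, label) pairs from FACS-certified labelers."""
    sums = defaultdict(lambda: None)
    counts = defaultdict(int)
    for vec, label in samples:
        if sums[label] is None:
            sums[label] = [0.0] * len(vec)
        sums[label] = [s + v for s, v in zip(sums[label], vec)]
        counts[label] += 1
    return {lbl: [s / counts[lbl] for s in sums[lbl]] for lbl in sums}

def classify(centroids, vec):
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda lbl: dist(centroids[lbl], vec))

# Hypothetical AU intensities: [AU6 cheek raiser, AU12 lip corner pull, AU4 brow lowerer]
labeled = [
    ([0.9, 0.8, 0.0], "joy"),
    ([0.8, 0.9, 0.1], "joy"),
    ([0.1, 0.0, 0.9], "anger"),
    ([0.0, 0.1, 0.8], "anger"),
]
model = train_centroids(labeled)
print(classify(model, [0.7, 0.9, 0.0]))  # new, unlabeled face frame -> "joy"
```

The point of the interview's framing is that the labeled examples, not the programmer, define what each expression looks like; scaling that up is why the million-video dataset matters.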
They got their start testing advertisements, and now work with a third of all Fortune 500 companies. ("We've seen that pet care and baby ads in the U.S. elicit more enjoyment than cereal ads -- which see the most enjoyment in Canada.") One company even combined their technology with Google Glass to help autistic children learn to recognize emotional cues.
Knackers (Score:1)
https://www.youtube.com/watch?... [youtube.com]
O'Reilly? (Score:1)
Why would I care about an auto parts store's interview on artificial intelligence?
I like my computers to disregard my mood (Score:5, Insightful)
A computer trying to interpret and/or react differently based on how I feel is exactly the opposite of what I would like to have near me.
Re:I like my computers to disregard my mood (Score:5, Funny)
"A computer trying to interpret and/or react differently based on how I feel is exactly the opposite of what I would like to have near me."
When computers start asking people to calm down, compucide numbers will go through the roof.
Re: (Score:2)
So much this. We're talking here about analyzing the user more thoroughly, in order to better manipulate her.
Re: (Score:2)
LOL This man is a genius. And probably single O.o
read my lips (Score:2)
I'm thinking back to that time I slammed my fist on my particle-board desk so hard during a Windows NT installation going sideways in heretofore unmapped N-dimensional space that I snapped the little dowels off at one end of my desk plank. There was a CD involved, and floppy disks to load missing drivers for my primary disk systems, and random errors, and circular dependencies in non-coldstart retry attempts.
I don't even know how many levels deep I was at that point in pointless problems begetting more pointless problems.
Amazing (Score:1)
Garbage In, Garbage Out (Score:2)
Re: (Score:2)
All this work and data sounds impressive until you realise that FACS ("Facial Action Coding System") is bollocks [sagepub.com].
I really do wish this technology worked, though. Simply because if it did, it would detect that I detest Ads and I would never see another Ad in my life....
Salmon inspired neural networks (Score:3)
They could train their deep learning networks based on research done on the brain of a dead salmon reacting to human emotions...
http://prefrontal.org/files/po... [prefrontal.org]
The next step is this.. (Score:2)
Now they have to understand that emotion is a state indicator, used to communicate to ourselves and others the state of satisfaction of our peculiar accent of motivations in the nominal Human Motivation Array. Think of an N-dimensional motivation array with each axis corresponding to one motivation. The motivation vector points out a vector-nulling behavior we have developed in our behavior-space over our lifetime. We convey emotions to each other so we can read another's state and adjust our behavior accordingly.
Computers cannot "detect" emotion (Score:2)
In real life, people don't always scowl in anger or pout in sadness. These are stereotypes, not universal signals to be detected. People may smile in anger, cry in happiness, etc. There's tons of variety in emotion. Software that assumes stereotypes are the norm isn't going to work very well.
Here's a great article on so-called statistical recognition of emotion via software: "Pattern Classification Explained [lisafeldmanbarrett.com]." The author is a well-known emotion researcher.
Comment (Score:2)
How long will it be before humans can infer AI emotions? https://www.youtube.com/watch?... [youtube.com]