When an Ohio State University study tested how well facial recognition software could detect emotions, the software achieved a 96.9 percent accuracy rate on the six basic emotions and a 76.9 percent accuracy rate on compound emotions such as “happy surprise” and “angry fear.” Emotient, a company that uses machine-learning algorithms, is developing an app for Google Glass that detects emotions in real time. Lead scientist Marian Bartlett says the app will be on the market soon.
The facial recognition software is built on the Facial Action Coding System, or FACS, developed by Paul Ekman. “It breaks down emotions to specific sets of facial muscles and movements: the widening of the eyes, the elevation of the cheeks, the dropping of the lower lip, and so on,” explains The Atlantic.
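To make the idea concrete, here is a minimal sketch of how FACS-style action units (AUs) might be scored against prototype emotion patterns. The AU numbers and emotion mappings below are simplified illustrations drawn from the FACS literature, not the rules Emotient’s software actually uses.

```python
# Illustrative sketch: scoring detected FACS action units (AUs) against
# simplified prototype AU combinations for a few basic emotions.
# These mappings are examples only, not Emotient's actual rules.

EMOTION_PROTOTYPES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "anger":     {4, 5, 7, 23},  # brow lowerer + lid raiser/tightener + lip tightener
}

def score_emotions(detected_aus: set) -> dict:
    """Score each emotion by the fraction of its prototype AUs detected."""
    return {
        emotion: len(detected_aus & prototype) / len(prototype)
        for emotion, prototype in EMOTION_PROTOTYPES.items()
    }

# Example: raised brows (1, 2), widened eyes (5) and a dropped jaw (26)
# score highest for "surprise".
print(score_emotions({1, 2, 5, 26}))
```

In practice, a machine-learning system would infer the action units from video frames and weight them statistically rather than matching fixed sets, but the underlying representation is the same: emotions expressed as combinations of facial muscle movements.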
FACS is used in multiple capacities, from the design and construction of animated characters to the identification of genes, chemical compounds and neuronal circuits regulating emotions.
The app could aid in diagnosing people who have difficulty recognizing facial expressions, as in autism and post-traumatic stress disorder.
The software could also serve as a lie detector, distinguishing genuine emotion from spoken words.