“The system can tell that a man comes by a preschool every day at recess,” an executive at a video monitoring software firm once explained to me. “It can even tell that he’s smiling. But it can’t tell whether the smile is creepy or not,” meaning that current software, while powerful, can’t tell a grandparent from a predator or, in another example, a hug from an assault. Breakthroughs in emotion-reading technology, however, are now making video analysis smarter.
Researchers at MIT’s Media Lab have been developing a suite of technologies to analyze social interactions, and these are now hitting the market. In one project, actors mimed emotions and volunteers interpreted their expressions. The majority answers were added to a database that serves as a visual library of expressions, which is compared to real-time footage using 24 feature points on the face. The researchers incorporated this technology into glasses with built-in cameras that capture expressions, send them to a computer, and then tell the wearer, via an earpiece and a light on the lens, how the interaction is going. This may seem like an overly complicated way to perform the simple task of reading reactions, but the researchers found that people interpreting expressions unaided were correct only 54% of the time on average, while the glasses were 64% accurate. The software has already been spun out into the company Affectiva, which aims to collect emotion data over the web. Another team designed badges that use tone of voice and body language to measure persuasion and has launched Sociometric Solutions to market the technology and services to businesses. The researchers claim that their badges can boost telephone sales by 20%, and they already have clients such as Bank of America.
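The matching step described above — comparing a face's feature points against a labeled library of expressions — can be sketched as a simple nearest-neighbor lookup. This is a minimal illustration, not the Media Lab's actual algorithm; the landmark coordinates, labels, and distance metric are all illustrative assumptions.

```python
import math

def distance(a, b):
    """Euclidean distance between two sets of 24 (x, y) facial feature points."""
    return math.sqrt(sum((ax - bx) ** 2 + (ay - by) ** 2
                         for (ax, ay), (bx, by) in zip(a, b)))

def classify(landmarks, library):
    """Return the label of the library expression closest to the observed landmarks."""
    label, _ = min(library, key=lambda entry: distance(landmarks, entry[1]))
    return label

# Tiny illustrative library: each entry is (label, 24 feature points).
library = [
    ("smile", [(float(i), 0.0) for i in range(24)]),
    ("frown", [(float(i), 1.0) for i in range(24)]),
]

# A frame whose points sit near the "smile" template.
frame = [(float(i), 0.1) for i in range(24)]
print(classify(frame, library))  # prints "smile"
```

A production system would of course learn its templates from the crowd-labeled database and track the feature points frame by frame, but the core comparison is the same shape as this lookup.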
Such technology has numerous uses for creating and analyzing bigger and better data. Some of the benefits are social: the glasses’ developers hope to market them to people who have difficulty reading social interactions, and the prototype has already proven popular with people with autism. Numerous labs also use these sensors and software to measure responses to products, treatments, and advertising. Affectiva, for instance, offers services that use webcam footage to track viewers’ responses to websites.
Emotion-reading technology can also have tremendous value outside of business. As mentioned in my analysis of police use-of-force data, such technology could automatically examine police video and audio recordings to better understand the underlying factors in confrontations and to detect unreported instances of aggressive behavior by officers and suspects. The national security implications could be even more monumental: used in conjunction with security cameras, for example, the technology could flag exceedingly nervous travelers at airports for further inspection. As more advances are made, emotional information can join click streams, textual analysis, and other new, exciting methods of exploiting Big Data.
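To make the screening idea concrete, suppose an emotion classifier emits a per-frame score (say, nervousness or aggression) for a video stream. Flagging footage for human review then reduces to finding sustained spikes in that score. The scores, threshold, and window length below are purely illustrative assumptions, not values from any deployed system.

```python
def flag_segments(scores, threshold=0.8, min_frames=3):
    """Return (start, end) frame-index ranges where scores stay at or above
    threshold for at least min_frames consecutive frames."""
    segments, start = [], None
    for i, s in enumerate(scores):
        if s >= threshold:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_frames:
                segments.append((start, i - 1))
            start = None
    # Close out a run that extends to the end of the stream.
    if start is not None and len(scores) - start >= min_frames:
        segments.append((start, len(scores) - 1))
    return segments

# One sustained spike (frames 2-4) and one brief blip (frame 6).
scores = [0.1, 0.2, 0.9, 0.95, 0.85, 0.3, 0.9, 0.1]
print(flag_segments(scores))  # prints [(2, 4)]
```

The minimum-duration requirement is what separates a sustained episode worth a reviewer's time from a single noisy frame — exactly the kind of filtering that would matter when scanning hours of police or airport footage.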
- MIT lab develops glasses that can read another person’s emotional state (newscientist.com)
- Facial Expression Analysis the Future of Content Programming? (thecontentlab.icrossing.com)
- Tackling Big Data on Police Use of Force (CTOvision.com)