Computers that know how you feel will soon be everywhere

El Kaliouby believes that, with its massive data set, Affectiva has developed an accurate read on human emotions. The software can, in effect, decode feelings. Consider Affectiva's take on tracking empathy: "An example would be the inner eyebrow rise," says el Kaliouby. "Like when you see a cute puppy and you're, like, awww!" The software can even tell when you are paying attention.

The software relies on the Facial Action Coding System, a taxonomy of 46 human facial movements that can be combined in different arrays to identify and label emotions. When the system was developed in the late 1970s, humans scored emotional states manually by watching the movement of facial muscles, a time-intensive process. "It takes about five minutes to code one minute of video," says el Kaliouby. "So we built algorithms that automate it." The software also had to be trained to recognize variety in expressions; my smirk, for example, might not look like your smirk. "It's like training a kid to recognize what an apple is," el Kaliouby says.

Five years in, the technology has become robust enough to be reliably useful. Steve McLean, for example, an experience designer who runs the Wisconsin design firm Wild Blue Technologies, has used Affectiva's software to build a video display for Hershey to use in retail stores: if you smile at the screen, the display dispenses a free chocolate sample. Tech startup OoVoo, which competes with Skype, has integrated the software into its video chat to create a product called intelligent video, which can read chatters' emotions. "We're looking at focus groups, online education, and political affinity," says JP Nauseef, managing director of Myrian Capital, which invested in both Affectiva and OoVoo and sits on Affectiva's board.