So did you really like our ad?

We at bloomfield knoble are big believers in testing our creative and creative messaging. We have conducted focus groups, in-depth interviews and online panels, and used software to measure eye-tracking and other physiological responses. The challenge with this, or any type of testing, is to avoid testing bias and, as much as possible, to accurately record responses. As anyone who has ever been involved in testing knows, this is much harder than it seems. Now it turns out that there may be a way to remove bias altogether by using technology that can analyze a person’s face as they watch advertisements.

A system made by Affectiva, a start-up in Waltham, Massachusetts, can pick up on hidden emotions just by monitoring facial movements. According to an article by Aviva Rutkin in New Scientist, Affectiva’s software first pinpoints important facial markers, such as the mouth, eyebrows and the top of the nose. Then, machine-learning algorithms watch how those regions move and how the skin texture and color change over the course of the video. These changes are broken down into discrete expressions indicating shifting emotions.
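To make that pipeline a little more concrete, here is a minimal sketch of the general idea, not Affectiva’s actual system. It uses OpenCV’s stock Haar cascade to find a face in each frame of a webcam video, carves out rough eyebrow and mouth bands by fixed proportions, and logs how the mean pixel intensity in those regions changes over time as a crude stand-in for the texture and color signals described above. The video file name, the region proportions and the intensity-based signal are all assumptions made purely for illustration.

```python
# Minimal sketch (not Affectiva's pipeline): detect a face per frame with
# OpenCV's stock Haar cascade, take rough eyebrow/mouth bands by fixed
# proportions, and record how mean pixel intensity in each band changes
# over the course of the video.
import cv2
import numpy as np

def region_signals(video_path):
    """Return per-frame mean intensities for rough brow and mouth regions."""
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(video_path)
    signals = []  # one (brow_mean, mouth_mean) pair per frame with a detected face
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        if len(faces) == 0:
            continue
        x, y, w, h = faces[0]                          # take the first detected face
        face = gray[y:y + h, x:x + w]
        brows = face[int(0.20 * h):int(0.35 * h), :]   # rough eyebrow band
        mouth = face[int(0.65 * h):int(0.90 * h), :]   # rough mouth band
        signals.append((float(brows.mean()), float(mouth.mean())))
    cap.release()
    return np.array(signals)

# Frame-to-frame differences approximate "how those regions change";
# "viewer_webcam.mp4" is a hypothetical file name.
# sigs = region_signals("viewer_webcam.mp4")
# deltas = np.abs(np.diff(sigs, axis=0))
```

A production system would replace these fixed bands with learned landmark detectors and feed far richer motion and texture features into trained expression classifiers, but the moment-by-moment structure of the signal is the same.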

According to Affectiva’s principal scientist, Daniel McDuff, the approach lets you find out what people actually think from moment to moment while the ad runs, not just what they say once it is over. “It provides a way of getting at those more genuine, spontaneous interactions,” he says. “This is their visceral response. It’s not sent through a cognitive filter where they have to evaluate how they feel.” In a study published this month, McDuff and his colleagues asked 1,223 people to grant access to their home webcams while they watched a series of ads for sweets, pet supplies and groceries.

Before and after the ads ran, the subjects filled out online surveys about how likely they were to purchase the products shown. While they watched, the software stayed on the lookout for emotions such as happiness, surprise or confusion. Afterwards, the researchers found that they could use the facial data to accurately predict someone’s survey results, suggesting that they could rely on the computer’s analysis alone to know whether an ad was successful. In the future, McDuff thinks the system could plug into TV services such as Netflix. “You could imagine suggesting TV programs or movies that people could watch, or ads that they find more enjoyable,” he says.
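The article does not describe the study’s actual modeling approach, so the following is only a hedged sketch of the general idea: summarize each viewer’s frame-level emotion estimates into a few features, then fit a standard classifier against their self-reported purchase intent. The feature names, the synthetic data and the choice of logistic regression are illustrative assumptions, not the researchers’ method.

```python
# Hedged sketch: predict a survey outcome (higher purchase intent: yes/no)
# from per-viewer summaries of frame-level emotion scores. All data here is
# synthetic and the feature definitions are assumptions for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_viewers = 200  # stand-in for the study's 1,223 participants

# Hypothetical per-viewer summaries of frame-level emotion estimates.
X = np.column_stack([
    rng.uniform(0, 1, n_viewers),   # mean "happiness" over the ad
    rng.uniform(0, 1, n_viewers),   # peak "surprise"
    rng.uniform(0, 1, n_viewers),   # fraction of frames flagged "confusion"
])
# Toy label: 1 = reported higher purchase intent after the ad (synthetic rule).
y = (X[:, 0] - 0.5 * X[:, 2] + rng.normal(0, 0.1, n_viewers) > 0.4).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

If a model like this predicts the survey answers well from the facial signal alone, the survey step itself becomes optional, which is exactly the appeal the researchers describe.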

The Affectiva team has amassed a database of over three million videos of people across different ages, genders and ethnicities. McDuff says that there seem to be subtle variations in emotional responses: women tend to have more positive facial expressions than men, for example. By understanding how different groups respond, companies could put together ads that are fine-tuned for particular audiences. The data could also help advertisers tweak their ads to tie in more closely with viewers’ emotions, for example by showing the brand name at the moment that elicits the strongest positive reaction.

Automated emotional analysis systems are promising, says Michel Wedel, who studies consumer science at the University of Maryland in College Park. They let advertisers break an ad down moment by moment to figure out exactly what works and what doesn’t. “What’s particularly powerful is that they’re unobtrusive,” he says. “They don’t rely on introspection or recollection.”
