Seeing things through the eyes of consumers

Mobile technology and fly-on-the-wall digital video research are proving to be a breakthrough in understanding the behaviour of previously difficult-to-reach consumers

For 50 years researchers have known that the research we do isn’t always the research we need. Results may take weeks or months. When we respond to surveys, we hold back or make ourselves look good or misremember. We know a lot about middle-class households in rich economies, but much less about how people work and live in the developing world.

Mobile and video technology is changing all this. The idea of the vox pop, for example, isn’t new – companies send out a film crew to accost random consumers in the street. But the cost and time of editing the footage mean results are expensive and may take weeks to reach the people who make decisions. Worse, because those people know next to nothing about the interviewees, the voices they hear are a poor basis for decision-making.

In contrast, when Channel 5 wants to do a vox pop, it asks members of the public who take its conventional surveys to use their webcams to do their own vox pops. Within 48 hours, those responses have been transcribed, analysed for sentiment, edited and delivered, ready for the weekly editorial meeting.

Tom Beasley, Channel 5’s viewer insight manager, uses the clips to support his traditional audience research, helping schedulers and commissioners to understand what’s behind the viewing figures. “We might be trapped in our own little bubble, so we want to get the opinion of the public,” he says. “Everyone in the meeting wants to find out what the viewers say, even if it’s not always what they want to hear.”

Previously, these opinions were gathered using surveys that contained what researchers call “open text” – a box, usually at the end of a list of questions, in which you can say anything you want. Open text can be useful in the same way as vox pops are: surveys often ask only about things the company thinks are important, completely missing other important information about what we love and hate.

But, confronted with the survey equivalent of a blank sheet of paper, Channel 5’s viewers tend to freeze: “The viewers often give one-word answers, or say ‘it was good’ or ‘I didn’t like it’,” Mr Beasley says.

Dave Carruthers, founder and chief executive of Voxpopme, which provides Channel 5’s video surveys, as well as working with companies such as Barclays and Tesco, says: “We find that video interviews give six or eight times as much content as open text and we’re capturing emotion too. Seeing consumers saying things in their own words is far more compelling internally than statistics alone. It has the emotional value to go with the rational value of big data.”

Selfie vox pops also reach decision-makers more quickly. For example, Voxpopme uses crowdsourcing to transcribe the responses so they are searchable within ten minutes of being recorded. Digital video can be delivered and searched in hours, rather than waiting for a final edit. When interviewees are part of an existing panel, they can be selected by age or background.


The impact of AI

Nevertheless, in vox pops we’re still explaining how we feel. Realeyes, a company created by artificial intelligence (AI) researchers at the University of Oxford, takes this a step further by using AI to observe our reactions and automatically measure our emotions as we watch advertising. Peter Haslett, director of customer development at Realeyes, says: “People are not good at pointing out ‘that works’ or ‘I am engaged’.”

Quantifying a response in this way has the advantage that it generates reportable numbers. When volunteers watch advertisements for clients such as Coca-Cola or adidas, the application automatically recognises fleeting facial expressions. It can spot which moments grab and retain audience attention, and generates scores that are compared against a reference database of 8,000 advertisements to predict the engagement and emotional impact that drive sales.


Using a 300-person test, the company claims 75 per cent accuracy in predicting which advertisements will deliver a high sales lift, and 67 per cent accuracy in predicting which videos will encourage high charity donations.

Researching our unconscious emotions, or those we feel uncomfortable communicating out loud, is a potential future application for the technology that underpins Realeyes. “Detecting emotions could be used for medical research – how do we feel about what the doctor is saying? Or education – are students confused or inspired?” Mr Haslett explains.


Putting people first

Letting people do their own research using video and mobile is also cheap enough that we are learning much more about how different types of communities live. By combining passive measurement, video and mobile technology, weseethrough has conducted research in the last 12 months in countries such as Ghana, Egypt, Brazil, Nigeria and Vietnam. “We are interested in collecting actual behaviour. We try to get people to record video for us for hours, even days,” says Duncan Roberts, weseethrough’s chief technology officer.

The company specialises in domestic subjects in difficult-to-reach locations and communities that researchers have been forced to ignore in the past. It has done this by asking volunteers to use Google Glass, the futuristic spectacles with cameras, which were a commercial failure, but great for ethnographic researchers.

The glasses film constantly and when a subject enters a room, or at particular times, Google Glass automatically asks them to talk about what they are doing and why. The resulting stream-of-consciousness videos have been a revelation in the head offices of clients including Unilever, Ben & Jerry’s and IKEA that have witnessed the everyday lives of customers through their own eyes.

Among the highlights were a Brazilian housewife who used her husband’s toothbrush to scrub the shower before replacing it in the mug, and a Nigerian mother explaining that it’s difficult to open packaging while simultaneously cooking by torchlight and holding a baby.

According to Liam Corcoran, vice president of advertising and audience measurement at Research Now, many more of us will passively take part in research in the future thanks to mobile devices.

For example, for decades research firms such as Research Now have tried to evaluate how effective advertising is based on a guesstimate called “opportunities to see”, which relies on information we remember or report in diaries – which newspaper we read, which television programmes we watch, our route to work in the morning.

But when volunteers share their phone’s location data, researchers have a much better idea who has seen an advertisement, even a billboard, and can match it to their subsequent shopping behaviour with much greater accuracy.

At weseethrough, Mr Roberts speculates that in the future we might also be paid to share our mobile and video data with researchers, for example from home security video cameras. In his experience, when the goal of the project is explained clearly, subjects have been happy to share surprisingly intimate video logs of their personal lives. “We do an awful lot of censorship,” he adds.
