Incomplete survey responses happen, but do they have to? What survey experience causes incompletes, and is there a way to decrease incompletes? The topic matters today more than ever because of survey fatigue and the fierce competition for consumers’ time.
“Back in the day, I remember doing these surveys that were like 25-30 minutes long and had 8,000 respondents; the data looked great,” said Karine Pepin, vice president at 2CV, on an episode of the market research podcast “Reel Talk: The Customer Insights Show.” “Why would I want to keep partial data? That would not even have occurred to me. This is just more recently because it’s so much more difficult to field studies.”
This article covers the following:

- what types of incomplete surveys happen
- what to do with incomplete responses
- how the survey experience drives drop-offs
- recruitment and incentives
- making surveys more engaging
What types of incomplete surveys happen?
Incomplete responses happen in various stages of the survey experience.
The landing page
“There are people dropping off at various points in the surveys,” Karine said. “The main place where people drop off is actually on the landing page. And after taking all these surveys, I can’t say I’m surprised. Ninety percent of the landing pages out there just aren’t that engaging.”
Maybe the landing page isn’t clear enough or doesn’t entice people to click onward. That’s especially concerning, as the landing page’s only job should be to get people to take the survey.
Karine said that some landing pages simply have too much extraneous content that people aren’t reading anyway.
“It’s just not a very nice welcome to the survey,” she said.
The beginning of a survey
Other times, even when people start the survey experience, they might drop off quickly if the experience is mundane or goes on and on. For example, excessive screener questions could get people to drop off.
In the middle of answering questions
Some incompletes happen when questions just go on and on or are displayed in an unappealing grid format. People might drop off because the experience is not fun.
Other times, people might drop out because they are experiencing survey fatigue, said Karine.
By giving incomplete answers
In a qualitative survey, people might also give incomplete answers. Indeed, what is a complete or incomplete answer to a question might be up for debate, but some answers are clearly not complete thoughts.
“The reviewers are primarily looking for technical criteria like video quality, sound quality, anything inappropriate, length of recording,” Matthew said. “They’ll also make a judgment call on whether or not the question was answered.”
What to do with incomplete responses?
Karine said even considering using partial responses is a newer strategy. Still, when responses are trickling in and some cover only part of the survey, it might be worth examining them to see what insights can be gathered.
The survey experience
In addition to evaluating incomplete responses, we can also look at the survey experience as a whole to see what can be changed to avoid incomplete data in the first place, or to reduce it going forward in the current survey.
Karine said that incomplete responses could happen because of the actual survey experience. A host of issues can affect the survey experience, including:
- Ease of taking the survey
- Question wording
- Survey length
- Relevance of the study to the consumer
Understanding the experience
It can certainly be hard to look at the survey experience, especially when we are so close to creating the content. But there are ways to keep a pulse on what can work and what good and bad survey experiences are. For example, look at other surveys.
“Sometimes, you see things from a different perspective if you fill out surveys that are not yours,” Karine said. “Sure, you test your surveys or your colleague’s surveys, but you know the questions coming up, and they make sense to you. So when you fill out somebody else’s survey, it’s very different. I’ve learned a lot about this process, to see what we are putting panelists through and how we can improve our own process.”
Taking other surveys can reveal:

- respondent expectations
- which steps are working
- the clarity of the experience
- potential friction points
“As an industry, are we doing that with our surveys?” Jenn said. “Can we actually be measuring that better?”
Look at the totality of the experience
To improve the survey experience, it’s wise to look at the entire experience, not just one question at a time.
“We seem to have been focused in the past on question by question,” Karine said. “Like, oh, let’s gamify this question. But it’s not about one question. It’s about an end-to-end experience.”
Compare it to reading a book, Karine said.
“You don’t focus on one chapter, you are focusing on the whole book to make it good and engaging,” Karine said.
“And if the first chapter is no good, you are not going to get to that good sixth chapter,” Jenn added.
“As it turns out, we judge a book by its cover,” Karine said.
“And surveys by their landing pages,” Jenn said.
Is it a survey recruitment issue?
It’s not always possible to understand why somebody dropped off, but trends can undoubtedly emerge. For example, Karine said that in a study involving a video game, people who never started playing the game dropped off quickly.
“They were less engaged with the topic,” she said, adding that this can come back to targeting the right audiences.
Some drop-offs can happen because the wrong consumers are starting to take the survey.
Are you giving the right incentives?
Some respondents might be motivated by different kinds of incentives, including specific payouts per survey, gift cards, or perhaps a unique experience with the brand.
“The people that are less engaged might be the people that are more price-driven,” Karine said.
“Why would I be doing a survey for $1 if I could do something else that makes more?” Jenn said.
Understanding the product
Karine said that the survey experience can also improve when the researcher understands the product and audience. For example, let’s say we are launching a survey to ask about a specific product. Try that product first, understand what it does and how it helps the customers, and familiarize yourself with it.
Keep an eye on where people are dropping off in the survey. If you see a trend, consider updating the study rather than treating it as set in stone.
“It’s not like you just launched a shuttle here,” Karine said. “If there’s a question that doesn’t work, it’s just better to address it as soon as possible. It’s not going to get any better.”
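Watching where people drop off can be as simple as tallying the last question each respondent answered. As an illustration only, here is a minimal Python sketch; the data layout (a list holding each respondent's last-answered question index, with -1 for people who quit on the landing page) is a hypothetical assumption, not a format from any survey platform.

```python
from collections import Counter

def drop_off_by_question(last_answered, num_questions):
    """Return the share of respondents whose session ended at each point.

    `last_answered` holds, per respondent, the index of the last question
    they answered: -1 means they left on the landing page, and
    num_questions - 1 means they reached the final question.
    """
    drops = Counter(last_answered)
    total = len(last_answered)
    report = {}
    for q in range(-1, num_questions):
        label = "landing page" if q == -1 else f"Q{q + 1}"
        report[label] = drops.get(q, 0) / total
    return report

# Hypothetical sessions for a 10-question survey:
# two people never started, and only five reached the final question.
sessions = [-1, -1, 2, 5, 5, 9, 9, 9, 9, 9]
rates = drop_off_by_question(sessions, 10)
print(rates["landing page"])  # 0.2
print(rates["Q6"])            # 0.2
```

A spike at one label (here, the landing page and Q6) is the kind of trend worth acting on before the field period ends.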
Making surveys more engaging
Karine said there should be a way to design surveys so that respondents look forward to participating.
“Whenever I design surveys now, I think of it through that lens of how do we make this more engaging?” Karine said.
That could include:
- a better landing page
- the use of illustrations
- a more conversational tone
- questions kept as concise as possible
- a clear progression
“Instead of a progress bar, can we show a path?” Karine said. “Can we make this more of a table of contents? If this is supposed to have more of a conversational tone, let’s make it more of a conversation. You show respect to the panelists and show them what we will discuss. And nothing is really a surprise.”
Wanting to make surveys more engaging and actually implementing that strategy are certainly two different things, Karine said. Once the awareness and strategy are in place, we need to consider what technology can allow us to do.
“Let’s define what the experience should be like, but do we have the tools to actually make it happen?” she said. “And it’s hard to describe what a beautiful survey is, but you know it when you see it.”
At the end of the day, we need consumers to help us evolve our products and services. That’s why it’s so important to figure out how to make our surveys engaging and worthwhile so we can still gather customer insights two years from now.
Listen to our market research podcast