Knocking on Doors to Discover Why People Don’t Respond to Surveys
A survey screener is mailed to residential addresses—not individuals—in all 50 states and the District of Columbia. Households are asked to complete a screener questionnaire online, by mail, or by phone. Those with youth aged 20 or younger who have not finished high school are asked to complete a longer, education-focused survey. Households with no children are asked to complete only one question on the survey screener to indicate this.
Surveys are a common and effective quantitative research method for gathering information about people’s behaviors, preferences, or attitudes. For decades, however, survey response rates have been declining, which hampers organizations’ ability to use survey findings to make decisions about their work and makes surveys more expensive to conduct.
The National Household Education Surveys Program (NHES), which AIR supports for the National Center for Education Statistics (NCES), is a case in point. Between 2012 and 2019, the response rate for the screener phase of this survey declined from 74 percent to 63 percent. To find out why, AIR conducted a groundbreaking study on behalf of NCES that provided actionable information to combat this growing problem for the next survey administration in 2023.
“A big goal of our study was to learn more about people who do not respond to surveys—and in a way that other research wasn’t attempting or capturing,” said Rebecca Medway, senior survey methodologist at AIR who led the study. “I’m not aware of any other study of this depth and scale on this research question. We literally went to people’s houses across the country and tried to get them to talk to us.”
A “Massive” Undertaking
About 50 AIR researchers, interviewers, notetakers, logistics coordinators, and support staff participated in this project.
“It was quite a massive project,” said Melissa Scardaville, AIR senior researcher and qualitative sociologist. She created and led the training for interviewers, conducted interviews, and analyzed the data. “The days were really intense.”
The project entailed:
- 85 in-depth, 90-minute interviews with NHES:2019 survey nonrespondents in four 30-mile-radius sites centered in California, Connecticut, Ohio, and Texas; and
- 760 address observations to verify the accuracy of mailing addresses in these four sites, plus sites in Illinois, Texas, and the Washington, DC, area.
The fieldwork was done in 2019, before the COVID-19 pandemic hit.
While AIR staff attempted to schedule interviews in advance, most of the interviews resulted from door-to-door solicitation. That process occurred over three weeks, with two-person teams of interviewers and notetakers spending hours trying to find addresses without the use of Wi-Fi-enabled navigation and traffic apps, a measure to keep addresses secure. Teams sometimes visited twice daily to find people at home. Notetakers doubled as address observers to verify the accuracy of the survey mailing list and to check residences for mailboxes, evidence that people were currently living there, and other indicators relevant to the study.
The teams encountered “very mixed” receptions. “Some people were very firm—they did not want to be interviewed,” Scardaville said. “Most were polite about it. Others would ask questions and engage in conversation about the study, but ultimately didn’t want to participate. Some were very interested, even enthusiastic.”
Uncovering Reasons for Nonresponses
AIR experts incorporated questions into the interview protocol to test research theories for why people don’t respond to surveys. According to AIR’s report on the study, individual-level explanations include privacy concerns, anti-government sentiment, busyness and fatigue, concerns about survey length or difficulty, lack of interest in the topic, and low levels of civic engagement or community integration. The report also cites societal changes that could be contributing to lower response rates, including an “increasing number of survey and solicitation requests, declining confidence in public institutions, and growing concerns about security and identity theft.”
In the end, the study provided evidence for every one of these theories. Busyness was the most-cited reason for not completing the survey. Some people didn’t remember getting the survey invitation or multiple reminders. Some said they knew it was important because the mailing came from the government, but they set it aside and forgot about it. “As important as the survey is to us, people have so much going on in their lives and this is just one of several competing priorities for them,” Medway said.
The interviews yielded tangible evidence of this. “There was a lot of hustle and bustle in many households,” Scardaville said. “There were a lot of children climbing over everything and dogs running around and people getting phone calls.”
Surprising Attitudes on Privacy and Security
The study finding that most surprised the AIR team was the range of privacy and security concerns. “A decent amount of people in our respondent pool had been scammed in some way before, so they were hesitant to provide information about themselves,” Medway said. “People have the option of filling out the survey online, but some people are very hesitant to put their information online.”
At the other end of the spectrum, some people believe that the government already knows everything about them. “If they file their taxes, if their children go to public school, if they completed the U.S. Census, they think anyone in the federal government can access that information from a central repository that government agencies share—or that if the federal government doesn’t have that information, a corporation does,” Scardaville said. “So why bother asking them for information on a survey if someone already has it?”
Strengthening Future Surveys
A key takeaway from the study is that survey nonrespondents are not a monolithic group. People have different active or passive reasons for not completing a survey.
“The study team created seven typologies classifying the drivers of survey nonresponse,” said AIR Principal Researcher Danielle Battle, who leads AIR’s work on the NHES. “We’d like to tailor our future materials based on these typologies. We know there are people who probably will never do a survey, so maybe we won’t put so many resources into that group. We may be able to predict the types of people we’ll need to follow up with—and that could be relevant for other survey researchers as well.”
Already, Battle said, AIR is supporting NCES in implementing study recommendations to improve the response rate, including:
- Confirming legitimacy. Survey nonrespondents want to check whether a solicitation is legitimate. AIR is working with NCES to make the NHES participant page on the NCES website clearer and more engaging, with a direct link to the survey.
- Revising messaging. Survey nonrespondents had important questions, concerns, and misconceptions about the survey. AIR worked with NCES to revise the FAQs and conducted cognitive interviews with target audiences to test their effectiveness.
- Redesigning the questionnaire. Survey nonrespondents without children may think the survey is not relevant to them. In a redesigned questionnaire that will be tested in 2023, the cover shows the one screener question they need to answer. If there are no youth in the household, they can easily see that they do not need to complete the rest of the survey.
- Improving visual appeal—and adding a sense of urgency. Survey nonrespondents in the past have received the survey invitation and three mailed reminders to complete the survey. For the 2023 administration, they’ll receive a fourth reminder as a test. Envelopes will include language encouraging people to respond quickly. Most mailings will also vary slightly in design, both to appeal to different preferences and to catch people’s attention.