
How pollsters adapt to voters who don’t answer their phones

With fewer landlines, busy parents juggling work and family, and young adults who rarely answer calls, it has become much more difficult to engage respondents.

As the US presidential election approaches, news reports and social media are increasingly filled with data from public opinion polls. How do pollsters know which candidate is leading in which swing state, or with which key demographic? Or which issues matter most to the greatest number of the roughly 264 million eligible voters across a vast country?

In other words: how do pollsters do what they do?

At Emerson College Polling, we lead a dynamic survey operation that, like many others, has continually evolved to keep up with changing trends and technologies in survey research. In the early days of survey research – about 100 years ago – data were collected mainly by mail and through in-person interviews. That is no longer the case.

In the early days of the survey industry, being asked to participate in a survey was a novelty and response rates were high. Today we are bombarded with survey requests via email, text messages, online pop-ups and phone calls from unknown numbers.

With fewer landlines, busy parents juggling work and family, and young adults who rarely answer calls and prefer to communicate by text, it has become much more difficult to engage respondents. These behavioral shifts reflect the evolving challenge of reaching diverse populations in modern survey research.

Evolution of data collection

In the broadest possible terms, polls and surveys have two elements: choosing who to contact and reaching them in a way that is likely to get a response. These elements are often linked.

In the 1970s, after home telephones had become widespread in the United States, pollsters adopted a random sampling method called random digit dialing, in which survey designers chose the area codes they wanted to reach and live operators randomly dialed seven-digit phone numbers within those area codes.
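The idea is simple enough to sketch in a few lines of Python. This is only an illustration of the concept, not the dialing systems pollsters actually used, and it skips practical details such as screening out invalid exchanges:

```python
import random

def random_digit_dial(area_code: str, n_numbers: int) -> list[str]:
    """Sketch of random digit dialing: random seven-digit local
    numbers within a chosen area code."""
    numbers = []
    for _ in range(n_numbers):
        # Seven random digits: a three-digit exchange plus a
        # four-digit line number. Real systems also screened out
        # invalid or unassigned exchanges, omitted here.
        digits = "".join(random.choice("0123456789") for _ in range(7))
        numbers.append(f"({area_code}) {digits[:3]}-{digits[3:]}")
    return numbers

# Example: five random numbers in the 617 area code.
print(random_digit_dial("617", 5))
```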

In the 1990s, pollsters began to abandon random digit dialing, which was time-consuming and expensive: random selection often turned up phone numbers that were out of service or not useful for opinion surveys, such as businesses or government offices. Instead, pollsters began adopting registration-based sampling, in which public voter registration records were used to compile lists from which respondents were randomly selected.

Information in these records and other associated public documents – such as gender, age and education level – helped refine random sampling into a method called stratified sampling. Here, the big list was divided into subgroups based on characteristics such as party affiliation, voting frequency, gender, race or ethnicity, income or education level.

Pollsters then randomly selected from these subgroups in proportion to the population as a whole. So if 40% of the overall population has a college degree and 60% does not, a survey of 100 people would randomly select 40 people from the list of those with a college degree and 60 from the list of those without one.
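Here is a minimal sketch of that proportional allocation in Python; the group sizes and labels are illustrative, not real voter data:

```python
import random

def stratified_sample(groups: dict[str, list], sample_size: int) -> list:
    """Sketch of proportionate stratified sampling: draw randomly
    from each subgroup in proportion to its share of the population."""
    total = sum(len(members) for members in groups.values())
    sample = []
    for name, members in groups.items():
        # Each stratum contributes in proportion to its size:
        # a 40% stratum supplies 40 of 100 respondents, and so on.
        k = round(sample_size * len(members) / total)
        sample.extend(random.sample(members, k))
    return sample

# Illustrative list: 40% with a college degree, 60% without.
population = {
    "college_degree": [f"degree_{i}" for i in range(400)],
    "no_degree": [f"no_degree_{i}" for i in range(600)],
}
respondents = stratified_sample(population, 100)  # 40 + 60 people
```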

Other advances in ways of reaching respondents emerged in the late 20th century, such as interactive voice response, which did not require live operators. Instead, automated systems played recordings of the questions and recorded the spoken responses. Around 2000, internet surveys also began to emerge, in which participants filled out online forms.

From probability sampling to non-probability sampling

Over the past two decades, the rise of cellphones, text messaging and online platforms has radically changed survey research. The traditional gold standard of relying solely on live-caller telephone surveys has become nearly obsolete. Now that phones display who's calling, fewer people answer calls from unknown numbers, and fewer still are willing to talk to a stranger about their personal opinions.

Even the random sampling that was once the norm has given way to non-probability sampling approaches based on increasingly specific population proportions. So if 6% of a population is Black men with a certain level of education and a certain household income, a survey will strive to have 6% of its respondents match those characteristics.

In quota sampling, participants are not randomly selected but are instead chosen based on their specific demographic characteristics. This method is less statistically rigorous and more prone to bias, though it can produce a representative sample with relative efficiency. In contrast, stratified sampling randomly selects participants within defined groups, thereby reducing sampling error and providing more accurate estimates of population characteristics.
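The contrast with the stratified sketch above can be made concrete. In this hypothetical quota sampler, respondents are accepted in whatever order they arrive until each quota is full – nothing is randomized:

```python
def quota_sample(stream, quotas: dict[str, int]) -> list:
    """Sketch of quota sampling: accept respondents in arrival
    order until each demographic quota is filled."""
    counts = {group: 0 for group in quotas}
    sample = []
    for person in stream:  # e.g. people reached online, in any order
        group = person["group"]
        if group in quotas and counts[group] < quotas[group]:
            sample.append(person)
            counts[group] += 1
        if all(counts[g] >= quotas[g] for g in quotas):
            break  # every quota is full
    return sample

# Hypothetical quotas for a 1,000-person survey: a group that is
# 6% of the population gets 60 of the 1,000 slots.
quotas = {"target_profile": 60, "everyone_else": 940}
```

Because whoever shows up first fills each quota, the sample can systematically favor the kinds of people who are easiest to reach – the source of the bias mentioned above.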

To help polling operations find potential respondents, political and marketing consulting firms have compiled voter information, including demographics and contact information. At Emerson College Polling, we have access to a database of 273 million U.S. adults, with 123 million cell phone numbers, 116 million email addresses, and nearly 59 million landline numbers.

A newer technique that pollsters use to reach respondents is called river sampling, an online method in which individuals encounter a survey during their regular internet browsing and social media activity, often via an advertisement or pop-up window. They complete a short screening questionnaire and are then invited to join a survey panel whose members will be asked to respond to future surveys.

Databases compile large amounts of information on many American voters. (da-kuk/E+ via Getty Images/via The Conversation)

Emerson College Polling's methodology

Our polling operation has used a variety of approaches to reach the more than 162,000 people who have responded to our surveys so far this year in the United States.

Unlike traditional pollsters, Emerson College Polling does not rely on live-caller data collection, beyond small-scale testing of new survey methods to evaluate and improve the effectiveness of different polling approaches.

Instead, like most modern pollsters, we use a mix of approaches, including web-based text surveys, interactive voice response on landlines, email outreach and opt-in panels. This combination allows us to reach a broader, more representative audience – essential for accurate polling in today's fragmented social and media landscape – including younger people who communicate on platforms distinct from those used by older generations.

When we contact people in our stratified samples, we account for the differences between communication methods. For example, older people tend to respond on landlines, while men and middle-aged people are more responsive to mobile text-message surveys. To reach underrepresented groups – such as adults ages 18-29 and Hispanic respondents – we use online panels that people voluntarily signed up for, knowing they might be surveyed.

We also use information about whom we sample, and how, to calculate the margin of error, which measures the precision of survey results. Larger sample sizes tend to be more representative of the overall population and therefore lead to a smaller margin of error.

For example, a survey of 400 respondents typically has a margin of error of 4.9%, while increasing the sample size to 1,000 reduces it to about 3%, providing more precise information.
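Those figures come from the standard formula for the margin of error of a proportion at 95% confidence – 1.96 × √(p(1 − p)/n), with the worst case p = 0.5 – which is easy to verify in a few lines of Python (a sketch assuming simple random sampling, which real weighted polls only approximate):

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of error for a proportion under simple random
    sampling, using the worst case p = 0.5 by default."""
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(400) * 100, 1))   # 4.9, as cited above
print(round(margin_of_error(1000) * 100, 1))  # 3.1, roughly 3%
```

Note the diminishing returns: because the margin of error shrinks with the square root of the sample size, quadrupling a sample from 400 to 1,600 only cuts the margin of error in half.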

The goal, as always, is to present to the public an accurate reflection of what the population as a whole thinks about the candidates and the issues.

This article is republished from The Conversation under a Creative Commons license. Read the original article.
