The Fight For How We Know

By Thomas Miller

Most Americans believe that the earth is round. Almost half of all Americans believe in ghosts. Support for Israel is strongest among white evangelical Christians. The Senate race in Colorado between Udall and Gardner is neck and neck. The average commute time in the U.S. is just over 25 minutes. And Katy Perry is America’s most popular celebrity.

These are facts that you have likely heard in the news more than once. Like me, you probably believe them. The credibility of each assertion could be challenged, but few of us take the time to be suspicious (unless the assertion conflicts with a strongly held belief – in which case we simply reject the conclusion without examining its accuracy). What we think we know about Americans’ behaviors and beliefs once came from countrywide public opinion research – done by phone, mail or (long ago) in person – in which questions were asked of a representative sample of (or all) adults, with a known chance of any one person being selected to respond. Affixed to each factoid is a range of numbers, called the margin of error, that serves as the imprimatur of precision because all of our uncertainty is thought to lie within that range.

Well, times are changing, and so are the ways we find things out. Traditional surveys are under assault. Why? Because no one answers the phone anymore, and those who do, arguably, don’t resemble the general population. Many Americans (and most Europeans and Africans) don’t talk on landlines; they use cell phones, and survey interviews on cell phones are more expensive than on landlines. Even when calling both cell phones and landlines, the average response rate in telephone surveys has fallen to about 9 percent. National Research Center’s mailed surveys still garner, on average, about a 35 percent response rate, but that rate has been notching down as the price of mail increases – making opt-in Web surveys ever more tantalizing.

If you sit quietly in a dark, empty room, you can almost hear the sound of survey researchers pulling out their hair. The avant-garde – thought by some to be mere survey cowboys and by others to be survey Einsteins – appreciates the opportunity for fast, inexpensive data collection offered by the Internet. Clearly there is money to be made by those who can harness the Web to deliver accurate survey results, and many researchers have sold the responses they have collected from paid Webizens recruited to respond to surveys.

But what is the touchstone of accuracy? It’s not so much how responses are collected; it’s whether those responses are valid – that is, whether the answers of the few hundred people in the survey accurately represent all the people whose attitudes or behaviors you want to understand. Just because a survey group puts a margin of error around its results doesn’t prove that the ‘real’ population response is within that range – or even that it is likely to be.
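To see why a margin of error alone proves nothing, it helps to know where the number comes from. For a simple random sample, the conventional margin of error for a proportion is z·√(p(1−p)/n) – a formula that depends only on the sample size and the reported percentage, not on whether the people who answered actually resemble the people who didn’t. A minimal sketch (the 1.96 multiplier is the standard choice for 95 percent confidence):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Half-width of a confidence interval for a proportion p
    observed in a simple random sample of size n.
    z=1.96 corresponds to 95 percent confidence."""
    return z * math.sqrt(p * (1 - p) / n)

# A poll of 1,000 respondents finding 50 percent support carries
# roughly a +/- 3 percentage-point margin of error -- but the formula
# silently assumes respondents were randomly selected and are just
# like the non-respondents, which is exactly what is in dispute.
moe = margin_of_error(0.50, 1000)
print(round(100 * moe, 1))  # ~3.1
```

The point of the sketch: a 9 percent response rate can produce the same tidy ± 3 range as a 90 percent one, because n in the formula counts only those who answered.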

Historically, the proof of survey accuracy comes from the ability of opinion polls to predict voting behavior. Polls continue to score well on this test, and while most of the accurate polls are done using traditional sampling and analysis methods, some of the accurate polls are conducted on the Internet. With response rates for traditional survey methods dropping and the accuracy of some Internet-based surveys on par with phone or mailed sampling, survey samurais are studying the pros and cons of opt-in data collection, where respondents select themselves rather than being selected by researchers.

Stay tuned. A venerable survey research group announced just last week that it will turn to some Web surveying for the upcoming midterm elections, causing an opinion tornado among survey professionals. A lot is at stake – not just correct predictions of election outcomes but what we think is true about American society. Knowing (or believing we know) what we, as a confederation of neighbors, think ties our culture together. Surveys that tell us what Americans think and do are the next best thing to having the family over for dinner – only you probably wouldn’t talk politics.



4 thoughts on “The Fight For How We Know”

  1. Great question, Matt. Too few managers consider the quality of the data they rely on – especially survey data. Years ago, colleagues and I studied the quality of citizen surveys conducted across America and published our results in PAR (“Assessing Excellence Poorly”, 1991). A new meta-analysis is overdue.


  2. […] I have written recently that the evidence Americans rely on to understand society’s predilections has been based on one kind of public opinion survey method that is under assault. Though mostly unspoken, the primary model for collecting data over the last 40 years – random selection (“probability samples”) of Americans interviewed by telephone – is known by virtually every survey research professional to be unsustainable because phone surveys get ever lower response rates at accelerating costs while attracting people that don’t necessarily represent the general public. […]

