- Colleen K. Porter, MA, Project Manager
- Joshua R. Tippery, Field Director
Policymakers and businesspeople across a wide spectrum rely on data from surveys and polls as they make important decisions, such as whether there is a market for a new product or whether proposed legislation will be acceptable to local voters.
But when different surveys give different answers, how is one to know which is “right”?
Many factors can influence the outcome of a survey.
- Was this an opt-in online poll to which anyone could respond, even multiple times?
- Was it a phone survey using professional interviewers?
- What was the exact wording of the question being asked?
- Was this a mail survey where the recipient could see the return address of a legitimate organization and look at all the questions before responding?
- Was the poll done in person at a local shopping mall, with interviewers stopping passersby who were willing to answer questions on that topic?
- Were the participants paid for their answers?
- How many people answered the survey?
All of these factors can affect the outcome and may yield varying results. A savvy consumer of survey data should expect all of them to be reported so that an informed judgment can be made.
The issue of clearly laying out survey methods received national attention in early 2009 when Dr. Gilbert Burnham, a respected faculty researcher at the Johns Hopkins Bloomberg School of Public Health, repeatedly refused requests to make public several basic facts about the methods used in his research on civilian deaths in Iraq, which was published in the British medical journal The Lancet.
The American Association for Public Opinion Research (AAPOR) took action. A professional organization that includes survey researchers in political polling, government agencies, and universities across the U.S., AAPOR set standards for minimal disclosure and launched a Transparency Initiative[1] to recognize organizations that routinely disclose the research methods associated with their publicly released studies.
This certification does not make any judgment about the quality of the research itself. The review process looks only at whether the research organization is willing to consistently disclose details of its research methodology.
In 2015, the UF Survey Research Center became a charter member of the AAPOR Transparency Initiative.
We have tried to keep the public informed about the methods behind our publicly released surveys, and to include those same essential items when producing a private report for a client.
Our commitment to transparency includes trying to be clear when those methods change. In January 2015, we launched a major change in one of our showcase surveys.
The Florida Consumer Sentiment Index (CSI) telephone survey has interviewed 500 respondents across the state of Florida every month since 1983. Economists and planners rely on it as a metric that reflects opinions and conditions in the state. The index is benchmarked to 1966, which means a value of 100 represents the same level of sentiment as in that year. Values can range from 2 to 150.
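As a rough sketch of what “benchmarked to 1966” means, assuming the index follows the standard base-year convention used by the national Michigan survey, a month’s value can be read as the ratio of that month’s aggregate sentiment score to the 1966 score:

\[
\mathrm{CSI}_t = 100 \times \frac{S_t}{S_{1966}}
\]

where \(S_t\) is the aggregate sentiment score computed from the month-\(t\) responses and \(S_{1966}\) is the corresponding score for the 1966 base year. Under this reading, values above 100 indicate sentiment stronger than in the base year and values below 100 indicate weaker sentiment; the exact scoring of the individual questions is not shown here.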
The sample was drawn entirely by landline Random Digit Dialing (RDD) through December 2014; in January 2015, it switched to 100% cell-phone RDD.
Over the previous decade, the number and percentage of Floridians relying exclusively on cell phones had been growing. The most reliable estimates of ongoing changes in the telephony landscape come from the National Health Interview Survey, which estimated that in 2014, 47.6% of Florida adults lived in cell-phone-only households, while only 7% lived in landline-only households.[2] We did some experimentation with a dual-frame design (including both landline and cell numbers), but since our methods follow the national Consumer Sentiment Index conducted by the University of Michigan, and that study transitioned to 100% cell-phone RDD in January 2015, we made the same shift.
Communication about this procedural change was important because it created a methodological break in the time series.
There was some concern about how the new sample design might affect participation and the index scores, and indeed, there were differences. The demographics became a more accurate reflection of Florida adults; the average age dropped from 61 to 46, and the percentage of Hispanic participants more than doubled from 9.4% to 22.9%. The table below provides more details.
We used several approaches to communicate this change in methodology and the resulting break in the survey time series.
1. PRESS RELEASE: The press release for the first month of the new sample offered some details about the change in procedure. In some cases, the change in sample design got the headline.
2. GRAPH: The press release included a series of graphs. During 2015, we used a different color to indicate the months conducted under the new protocol. An asterisk directed readers to a note explaining the color difference.
3. WEBSITE: Full information about the Consumer Sentiment Index project is only one click away from our Bureau homepage, including historical data, the most recent releases, and methodology details. Because the changes in 2015 were so significant, we added a tab to explain those differences in simple language.
It is hard to predict what the best practices in survey research will be in the future. In recent years we’ve had to redesign web surveys to display well on a smartphone and to accommodate the shift to cell-phone-only households. Researchers are experimenting with surveys conducted by text message and with health surveys that ask people to take a picture of their meal plate rather than fill out a paper diary listing the foods.
But whatever changes come, the UFSRC will honor the commitment to make our methods transparent so that research can be replicated and methods evaluated.
ACKNOWLEDGMENTS: We would like to thank the thousands of Floridians who have answered our survey questions through the years.