The inquiry into polling must be bold to protect trust in statistics and democracy

Posted on 19 June 2015 by Kirby Swales, Director of the Survey Research Centre.
Tags: general election, polling

The first session of the BPC inquiry into the accuracy of the polls during the 2015 General Election campaign starts today. This is a welcome and vital initiative as inaccurate polls are damaging to democracy and reflect badly on the survey research industry as a whole. The industry must get better at communicating and improving methodology, and funders and the public must be encouraged to understand the strengths and weaknesses of different methods.

Polling accuracy

So, we now know the polling industry over-estimated the extent of Labour support, and it wasn’t until the exit poll – based on an arguably more sophisticated methodology – was published that we knew the shape of the result.

NatCen Associate John Curtice has written and spoken about the recent history of political polling and the possible factors behind the results. This moment is likely to be significant in the history of polling – there are echoes of 1992, when the Labour vote was also over-estimated, and of the infamous 1936 US Presidential Election, when a Literary Digest poll of more than two million people failed to predict that FDR would be elected.

Inaccurate polls are bad for democracy

Polls are an important part of our democracy because they throw light on what the public is thinking. They make it possible to check claims by politicians about the public mood and they give information that can enhance public debate. But if they are incorrect then they can be damaging. There is evidence that they can influence the way people vote through bandwagon and underdog effects. And they certainly seem to have influenced the tone and approach of the 2015 General Election campaign; Ed Miliband’s decision to rule out a Labour-SNP coalition followed repeated polls showing that this was the most likely election outcome.

Moreover, polling performance affects everyone working in survey research and with statistics.

Opinion polls are among the statistics that the public really engage with, so a poor reputation can be bad for everyone. British Social Attitudes findings from this year suggest that most of the public are broadly trusting of “official statistics”, far more so than they are of statistics presented by politicians or the media. And it is important that the pollsters’ shortfalls don’t undermine trust in statistics more generally.

What might be done?

We should wait for the inquiry’s findings before deciding the way forward, but if the polling industry is not seen to put its ‘house in order’, others will be keen to do so. The reaction has been vociferous, with some political commentators arguing that opinion polls cannot be taken seriously. There is even a Private Member’s Bill that would regulate all research looking at voting intentions.

There are two possible avenues of action that spring to my mind.

1. Polling companies and the media becoming better at explaining methodology, despite its complications. NatCen focuses on random probability surveys, but we find the media often lazily categorise them as “polls”. People engage with the polls largely through the media, so we need to make sure that journalists understand what they are reporting, helping them to distinguish a probability sample from a volunteer web panel and to recognise the possible impact of different sampling, weighting and modelling methods. Membership of the BPC already brings with it a requirement to publish some of the underlying data, but this could go further, to include sampling procedures and response rates. More than that, though, this information needs to be presented in a way that the layperson can understand.

2. Finding ways to ensure that more of the published data is based on a probability sample. Many of the 2015 General Election campaign polling results came from web access panels. A range of research – including this paper using data from the National Survey of Sexual Attitudes and Lifestyles, which NatCen collected – suggests that such panels are good at identifying differences within a population but less good at producing reliable population estimates; the short simulation sketched below illustrates the point. We haven’t yet seen the findings of the British Election Study’s random probability face-to-face survey, but perhaps we should be asking whether publishing research that could have put the polls in context only after the election is leaving it too late.
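To make that distinction concrete, here is a minimal, purely illustrative sketch in Python – not NatCen’s or any pollster’s actual method. It simulates a population with a known level of support, then compares the estimate from a simple random (probability) sample with the estimate from a self-selected “panel” in which supporters are assumed to be slightly more likely to opt in. Every number and the opt-in mechanism are invented for the example.

import random

random.seed(42)

POPULATION_SIZE = 100_000
TRUE_SHARE = 0.31  # assumed "true" level of support in the population

# 1 = would vote for the party, 0 = would not
population = [1 if random.random() < TRUE_SHARE else 0
              for _ in range(POPULATION_SIZE)]

# Probability sample: every member of the population has an equal chance
# of selection, so the sample proportion is an unbiased estimate.
probability_sample = random.sample(population, 1_000)

# Volunteer panel: assume, purely for illustration, that supporters are
# slightly more likely to opt in than non-supporters.
opt_in_rate = {1: 0.015, 0: 0.010}
panel = [person for person in population
         if random.random() < opt_in_rate[person]]
panel_sample = random.sample(panel, min(1_000, len(panel)))

def share(sample):
    return sum(sample) / len(sample)

print(f"True share:               {TRUE_SHARE:.3f}")
print(f"Probability sample share: {share(probability_sample):.3f}")
print(f"Volunteer panel share:    {share(panel_sample):.3f}")

In the sketch, the panel over-states support simply because the decision to take part is correlated with the very thing being measured; weighting can only repair that if the drivers of participation are known and included in the weights.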

This is an important time for those working in survey research. On the one hand, the 2015 General Election showed just how central our work is to modern society and politics. On the other, it showed the dangers of expecting or claiming that surveys have a level of accuracy that is not justified. Now the dust has settled, I hope the inquiry can help us make a considered but bold step forward.

This blog first appeared on Research Live.

 
