Looking into the crystal ball – polling methods in the 2017 General Election

Posted on 07 June 2017 by Kirby Swales, Director of the Survey Research Centre.
Tags: NatCen Panel, NatCen on the Election, general election, methodology, polling

Today is the last day of election polling, and the final estimates will soon be published (you can find details on trackers such as the one run by the BBC and UK Polling Report). Who would want to be a pollster right now? The industry is once again in the spotlight, and the Labour ‘surge’ has meant the polls have become very much part of the news in this election (despite many saying they would resist that after the miss in the 2015 General Election). This time, the marked divergence in many polls means some won’t even have the comfort of failing in a herd. Some are, apparently, feeling befuddled and even terrified.

To their credit, pollsters set up an independent inquiry into the 2015 polling mistakes. The inquiry's report was clear: “the primary cause… was unrepresentative samples”. Everything else made a “small contribution to the total error”. Unfortunately for the polling industry, Theresa May called an election before they had the chance to fully implement the recommendations. Whilst one pollster has spent significant sums on recruiting under-represented groups, the majority of the effort has gone into improving weighting rather than improving samples.

What’s in a ‘sample’? It’s an interesting dimension to our political debate that, despite polls regularly leading the front pages, few of us know how they are put together. For a typical poll, people are recruited via internet or telephone marketing, often with the pay-off of ‘have your say on the issues’ or earning a shopping voucher. The pollster then ‘weights’ the answers to make sure, for example, that the percentage of young people responding matches the percentage in the population, and to adjust the estimate for whether people are likely to actually vote.
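
To make that weighting step concrete, here is a minimal sketch in Python. Every figure in it is invented purely for illustration – these are not real poll, census or NatCen numbers:

    # A toy post-stratification weighting example. All figures are
    # invented for illustration; none are real poll or census numbers.

    # Share of each age group among poll respondents vs. the population.
    sample_share = {"18-24": 0.08, "25-49": 0.42, "50+": 0.50}
    population_share = {"18-24": 0.12, "25-49": 0.40, "50+": 0.48}

    # A respondent's weight is their group's population share divided by
    # its sample share, so under-represented groups count for more.
    weights = {g: population_share[g] / sample_share[g] for g in sample_share}
    print(weights)  # e.g. the under-represented 18-24s get a weight of 1.5

    # Hypothetical support for a party within each group, and the estimate
    # you would publish with and without weighting.
    support = {"18-24": 0.60, "25-49": 0.45, "50+": 0.35}
    unweighted = sum(sample_share[g] * support[g] for g in support)
    weighted = sum(population_share[g] * support[g] for g in support)
    print(f"unweighted: {unweighted:.1%}  weighted: {weighted:.1%}")

Note that weighting can only rescale the people who did respond; as the next paragraphs explain, it cannot conjure up the views of people who were never in the sample at all.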

It’s not a bad system, especially for getting a quick snapshot of national opinion, but there are a few problems at the heart of it. First, you don’t hear from the people who are missed by, or don’t respond to, internet or phone marketing. Second, you can miss the people who take their time to respond to emails. Finally, you get more responses from the people who are really keen to give their views – the more politically engaged. It’s inevitably slightly more complex than that, but that’s the crux of it, and 2015 showed that it can make a big difference. This hasn’t mattered so much in the past, when those under-represented people voted in the same way as those who answered the questions – but it does matter when there is some unknown or new difference between them. For example, one thing that has thrown pollsters recently is that young people are now more pro-Labour than they were in the past.

There is another way, but it is slower. It basically involves throwing a few thousand electronic darts at random at the Royal Mail’s postcode file. You then knock on all those doors. Not once. Not twice. But up to 9 times, until you get an answer. Save for those totally off-grid, who never answer their door or who’ll never answer a survey, that’s as good a sample as you’re ever going to get – known as a ‘random probability face-to-face sample’ in the jargon. It’s what we do at the charity the National Centre for Social Research (NatCen) and is the standard method for government surveys that produce official statistics you hear about on the news (such as the level of unemployment, poverty or obesity).
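
The ‘electronic darts’ amount to an equal-probability random draw from an address list. A minimal Python sketch is below; the address list is a made-up stand-in, since the real frame is the Royal Mail’s licensed postcode address file:

    # A toy version of the "electronic darts": an equal-probability random
    # sample of addresses. The frame here is a made-up stand-in for the
    # Royal Mail's postcode address file.
    import random

    frame = [f"{n} Example Street" for n in range(100_000)]

    rng = random.Random(2017)           # seeded only to make the sketch repeatable
    sample = rng.sample(frame, k=3000)  # every address has the same chance of selection

    MAX_VISITS = 9  # interviewers call back at each sampled address up to 9 times
    print(sample[:3], f"... each visited up to {MAX_VISITS} times")

Because every address has a known, equal chance of selection, nobody has to opt in to an ad or a marketing list to be heard – which is precisely what the quick-turnaround methods cannot guarantee.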

This is not feasible in the time available before a snap election. At NatCen, we have developed the NatCen Panel to occupy a middle ground – it uses faster interviewing methods (online and telephone), but the people invited to take part come from a face-to-face random probability sample. Importantly, we use telephone interviewing to make sure people without access to the internet (still a significant proportion of the population) can take part, and we give people more time to respond (not everyone is available at short notice). It’s still not perfect – certain people still don’t take part – but, importantly, we know the source of the sample and can reach more of the ‘hard to reach’. Take non-voters, for example. When one polling company asked its respondents whether they had voted in 2015, just 9% said they had not. The actual proportion of non-voters in that election was 34%. This gives you a good indication of what was going awry in the 2015 polls. For the NatCen Panel’s March 2017 survey the figure was 26% (by contrast, the ‘gold-standard’ British Social Attitudes survey got 31%).
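
Putting those quoted figures side by side – a quick back-of-the-envelope calculation, written out in Python only to make the arithmetic explicit – shows how far short each source falls of the true non-voter share:

    # How well does each source represent non-voters? Percentages are
    # the ones quoted in the paragraph above.
    actual = 34  # % of the 2015 electorate who did not vote
    sources = {
        "typical 2015 poll": 9,
        "NatCen Panel (Mar 2017)": 26,
        "British Social Attitudes": 31,
    }
    for name, pct in sources.items():
        print(f"{name}: {pct}% non-voters = {pct / actual:.0%} of the true share")

On that crude measure, the typical poll captured barely a quarter of the true non-voter share, while the probability-based approaches got most of the way there.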

In summary, it does not look like the 2017 General Election is going to settle the debate around the polls, though it is very unlikely to be a repeat of 2015. What we do already know is that either some pollsters need to take another good look at their samples and research methods, or we need to significantly reappraise the importance of election campaigns for deciding how people vote. Either way, it'll be a big moment for the sector, and it will be important to carry on the work to improve methods and increase awareness.

Follow me on Twitter: @KirbySwales

Further information on the details of this survey is available here.
