Polling & Politics

Measuring the pulse of public opinion

By Robert Heeg

Easy targets for the media, the laughing stock of comedians; after several widely publicised errors, trust in election polls seems to be at an all-time low. Not entirely fair, argues Jon Puleston, who demonstrates that the pollsters often get it right. That doesn’t let them off the hook, though.

The UK general election of 2015, the Brexit referendum and the US presidential election, both in 2016, all had a different outcome than the pollsters predicted, leading to negative headlines for the industry. Ironically, these widely publicised ‘poll disasters’ were not actually that inaccurate from a comparative statistical perspective. In fact, two of the three were more accurate than normal; they just failed to predict the outcome, which is a slightly different thing.

This is what Lightspeed’s Vice President of Innovation Jon Puleston demonstrated at this year’s ESOMAR congress. In his presentation Are We Getting Worse at Political Polling? Puleston offered an analysis of 60 years of international polling, pulled from a database of 30,000 global polls from 25 countries available to ESOMAR. Puleston not only provided a reality check, he also showed why polls don’t always get it right. “To properly understand what went wrong with these prominent miscalled elections, I needed to understand how big the polling errors actually were in the context of other elections around the world. It’s not until you start assembling data on a macro scale, as we have done, that you can actually get some generalised learnings.”

Puleston was first asked by Kantar CEO Eric Salama to review polling methodology around the world, following the prominent polling miscalls in the UK and the US. With his QuestionArts team at Lightspeed, he had taken a keen interest in polling techniques and done a lot of research exploring alternative methods of predicting election results. So he set about gathering all the polling data he could, with the aim of establishing some benchmarks. As part of this process he interviewed polling experts all around the world. What became apparent was the lack of any centralised source of international data on polling accuracy that could compare the performance of one election with another in an objective way. “Everyone I spoke to had a different way of assessing the accuracy of their polling activity.” Luckily, a huge amount of polling information is freely available on the internet. “The challenge was to compile it all into a consistent, analysable structure,” says Puleston. This took a lot of work, with the help of a PhD intern who spent two months compiling and analysing the data.
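To make the measurement problem concrete: one common way to score a poll (a sketch for illustration only, not the methodology of Puleston’s paper, and using hypothetical figures) is the mean absolute error between a final poll’s party shares and the official result, in percentage points. A consistent metric like this is what lets one election’s polls be compared with another’s.

```python
def poll_error(poll, result):
    """Mean absolute error, in percentage points, over the parties
    that appear in both the poll and the official result."""
    parties = poll.keys() & result.keys()
    return sum(abs(poll[p] - result[p]) for p in parties) / len(parties)

# Hypothetical numbers, for illustration only.
final_poll = {"Party A": 34.0, "Party B": 33.0, "Party C": 9.0}
official   = {"Party A": 36.9, "Party B": 30.4, "Party C": 7.9}

print(round(poll_error(final_poll, official), 2))  # prints 2.2
```

Note that a poll can score well on such a metric yet still miscall a knife-edge winner, which is exactly the distinction the article draws between accuracy and predicting the outcome.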

Shock results

It has become almost a popular sport for pundits to criticise polling, especially after the US election and UK referendum results. Puleston feels that some of this criticism was fair. The UK 2015 election polling error was bigger than average, and the UK Polling Council review of this result did reveal some shortcomings in methodology and a lack of transparency. “Nearly every UK and many international polling companies have taken notice and responded to this, and I think this has already had a positive impact.”

There were also some larger-than-average polling errors at state level in the 2016 US election that were misleading. These prompted an investigation from the US polling community that highlighted the difficulty of phone polling at state level. “Because of the rise of mobile phones, it is harder to geographically balance the sample.”

Puleston sees some fascinating contradictions between the perception and the actual performance of recent polls. Whilst the Brexit polls did not predict the outcome, the average polling error was smaller than usual. “In any other election that was not so knife-edge, you might have said polling companies did a pretty good job – especially when considering the well-known complexities of predicting referendum results, where there is no historical benchmarking data to assess who will vote.”

Editor’s note: the full paper reviewing 60 years of polling from a Kantar database of 25 countries is available to ESOMAR members in MyESOMAR.
