Polling & Politics

The modern minefield of political opinion polling

If you are in Australia on the day of an election, you will notice something unusual. Or rather, you’ll smell it. Barbecuing sausages. Lots of them. In fact, at most polling stations, there is a “democracy sausage” available to voters, barbecued to raise money for the schools and community centres hosting the ballot boxes.

But what links Aussie elections with those in many other democracies is that the opinion polls ahead of the federal election in May 2019 called the wrong result.

The UK research industry went through the same experience after the Brexit referendum: widely expected to return a decision to remain in the European Union, the vote finished with 52% in favour of leaving. And in the US, there was an inquest into why so many pollsters had predicted a convincing win for Hillary Clinton in the 2016 presidential race, which was ultimately won by Donald Trump.

How wrong is wrong?

Industry experts are quick to point out that getting it “wrong” isn’t always quite what it seems.

Kathy Frankovic, former director of surveys with CBS News and a member of ESOMAR’s Professional Standards Committee, notes that “wrong” polls are almost always within the margin of error.

“This is not anything new. It might be more about our desire for predictability, and our expectation that numbers are really precise measurements when really, they often are not. A number like 51% could be below 50% or higher, but people don’t know that and sometimes don’t want to know that.

“On the other hand, that doesn’t mean there haven’t been problems in recent years. Every time it happens, the good news is it really forces people to scrutinise methodologies and presentations and, hopefully, make things better.”
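To put a number on Frankovic’s point, consider the textbook margin of error for a poll. The short calculation below is a minimal illustration, assuming a simple random sample of 1,000 respondents; real polls apply weighting and design effects that widen the interval further.

```python
from math import sqrt

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a proportion p from a simple random sample of size n."""
    return z * sqrt(p * (1 - p) / n)

# A candidate polling at 51% among 1,000 respondents:
moe = margin_of_error(0.51, 1000)
print(f"51% +/- {moe:.1%}")  # ~ +/- 3.1 points: the true figure could sit anywhere from ~48% to ~54%
```

In other words, a candidate reported at 51% with a roughly three-point margin of error is statistically compatible with actually trailing.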

Jon Puleston, Vice President of Innovation for Kantar’s Profiles Division, has analysed around 30,000 political opinion polls, and says the vast majority over the past 30 years have matched – and still match – the outcome of the election. Only in around 12% of the closest races do polls call the wrong result.

Puleston says the elections that are called “wrong” tend to be the most closely fought and the most closely watched. “The irony is that in very tightly fought elections, there are more undecided voters, and those are subject to bigger errors because there is a significant body of people who don’t know or change their mind.”

This was the case both for the Brexit and Trump votes.

How will I vote? Ask me later

In many ways, it’s getting harder for pollsters to make sense of public opinion in the run-up to elections. The days of achieving representative national samples by calling landlines are over in most countries; people are difficult to reach via mobile, and refusal rates are high.

Mirta Galesic is a social psychologist and Crown Chair in Human Social Dynamics at the Santa Fe Institute in the US. She says people are sometimes embarrassed to admit they’re going to vote for what they feel is an unpopular candidate.

The effect of what’s often called “shy voting” can be huge. In France, it’s led to under-reporting of Front National votes by as much as 7%.

There’s also the problem of voters genuinely not knowing which way they’ll vote. According to Puleston, exit polling in Germany that asked not just how people voted but when they made their decision found that 30% had decided that week, 20% made up their minds on the day, and 10% only settled on a name once in the booth. How does a pollster deal with that?

If it’s so hard to get the “right” answer – and even the right answer can end up being wrong – is there really any point continuing with pre-election polls? Consensus on this point is unequivocal: political opinion polling is about more than just picking the winner in advance.

Sarah Campbell is Executive Director of the Association of Market and Social Research Organisations (AMSRO) in Australia. She says political polling has a vital role to play in providing the public with a clear view of sentiment regarding parties, policies and leaders.

New ways to ask the question

Henri Wallard, deputy CEO of Ipsos and President of Ipsos Public Affairs, says this issue is too frequently distilled down to headlines that fail to paint the full picture. Political opinion polls are the public face of the entire research industry, so it matters when the perception – as with weather forecasters – is that they’re always wrong, when the reality is far from it.

That being said, few would deny that the industry needs to innovate in order to give pollsters more accurate results, even when voters themselves don’t know what they’ll do.

Wallard says: “We do not pretend here of course that polls are always right … but we do maintain that the method is based on solid theoretical ground. It would be foolish to throw the baby out with the bathwater.”

One simple way to help weight responses, he says, is to ask people how firm their intention to vote is, and how firm their choice is. Kantar has had success asking people how likely they are to vote for different parties on a sliding scale, so their degree of certainty can be seen. In some markets, people are asked how they voted last time, and this is used to weight their response against actual voting data. Another approach is to ask people how comfortable they are talking about politics, as there is a correlation between reticence about the subject and an intention to vote for a candidate the respondent is reluctant to name.
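None of these firms’ actual models are spelled out here, but a minimal sketch of the first idea – weighting each response by the respondent’s stated likelihood of voting – might look like the following. The 0–10 scale, the field names and the linear weighting rule are all illustrative assumptions, not any pollster’s published method.

```python
# Illustrative turnout weighting (hypothetical scheme): each respondent rates
# their likelihood of voting on a 0-10 scale, and we treat scale/10 as a crude
# turnout probability that weights their stated party choice.
responses = [
    {"choice": "Party A", "likelihood_to_vote": 9},
    {"choice": "Party B", "likelihood_to_vote": 4},
    {"choice": "Party A", "likelihood_to_vote": 10},
    {"choice": "Party B", "likelihood_to_vote": 8},
]

totals = {}
for r in responses:
    weight = r["likelihood_to_vote"] / 10  # assumed linear mapping, for illustration only
    totals[r["choice"]] = totals.get(r["choice"], 0.0) + weight

total_weight = sum(totals.values())
shares = {party: round(w / total_weight, 3) for party, w in totals.items()}
print(shares)  # {'Party A': 0.613, 'Party B': 0.387}
```

An unweighted count here would be a 50–50 tie; weighting by stated likelihood shifts the estimate towards the preferences of those most likely to actually turn out.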

Galesic is part of a team that has looked at the effectiveness of asking people not about their own voting intentions, or how they expect their country to vote, but rather what the people in their social circle are likely to do.
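The mechanics of that “social circle” question can be sketched with a toy aggregation. The numbers and the simple averaging rule below are hypothetical illustrations, not the team’s published model: each respondent estimates what share of their own contacts will vote each way, and those estimates are averaged.

```python
# Toy "social circle" aggregation (hypothetical): each respondent estimates
# what share of their own contacts will vote for each option, and the
# forecast is a simple average of those reported shares.
circle_reports = [
    {"Remain": 0.55, "Leave": 0.45},
    {"Remain": 0.40, "Leave": 0.60},
    {"Remain": 0.50, "Leave": 0.50},
]

options = circle_reports[0]
forecast = {opt: sum(r[opt] for r in circle_reports) / len(circle_reports) for opt in options}
print(forecast)  # {'Remain': 0.483..., 'Leave': 0.516...}
```

Because each report covers many people rather than one, such questions can pick up signals – including support respondents are shy about admitting for themselves – that a direct question misses.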

When is a poll not a poll?

What causes confusion for many people outside the research industry is the sheer proliferation of polls before an election, and the huge variance in their quality. These polls, some of them simply bogus, can skew forecasts along with public sentiment. The result: pollsters get a bad rap, and people become even less willing to talk to professional pollsters.

Organizations like ESOMAR, WAPOR and AAPOR set rules that establish industry standards which professional researchers are required to follow when conducting polls. The problem remains, however, that most members of the public don’t know how to discern between real and spurious polls.

Media organisations often don’t understand that difference either. This matters because they are responsible for producing many of the statistics that masquerade as polls, and for reporting the findings of others without taking into account basic indicators of rigour, such as sample size and methodology. To help address this problem, AAPOR, ESOMAR and WAPOR have developed a free online training tool for journalists around the world, aimed at improving media reporting on polls and survey results.

The way forward

The way forward, then, appears to be a combination of, on the one hand, educating the public, and especially the media, to increase levels of discernment and, on the other, continually fine-tuning polling methodologies.

Despite the challenges, polling and polls are far from obsolete.

“If the problem you want to resolve is to make some sense of the state of opinion at a moment in time, you absolutely need polls,” says Wallard.

“There’s a big responsibility to do this right.”

This article is an excerpt from the original “The modern minefield of political opinion polling”, published in the 2019 Global Market Research report (ed.)
