Pre-election polls in Britain’s general election suggested the two main parties would be almost neck and neck after the May 7 vote, likely meaning a hung parliament and days of political haggling to form a government.
They have instead been caught out by an outright majority for the Conservative Party.
Incumbent Prime Minister David Cameron’s party took 331 of Parliament’s 650 seats to Labour’s 232, winning 36.9% of the vote to Labour’s 30.5%.
Is the nature of political polling such that American voters should take their own polls with a grain of salt in 2016?
Ahead of the 2012 U.S. elections, Nate Silver, from the website FiveThirtyEight, correctly predicted who would win all 50 states, even as pundits were saying the race was “too close to call.” In 2008, he had also correctly projected all but one state.
As this year’s British election results started trickling in, Silver tweeted that the world “may have a polling problem.”
“Polls were bad in U.S. midterms, Scottish referendum, Israeli election and now tonight in UK,” Silver said.
“UK polls herded toward all showing the same result as one another — and that result turned out to be pretty wrong,” he continued.
In a commentary on FiveThirtyEight, Silver suggested that forecasters had been overconfident. “Polls, in the UK and in other places around the world, appear to be getting worse as it becomes more challenging to contact a representative sample of voters. That means forecasters need to be accounting for a greater margin of error,” he said.
So what went wrong?
The final election poll released by the research company Ipsos MORI put the Conservatives ahead of Labour by 36% to 35%, indicating that “Britain may be on course for an indecisive general election result.”
Ipsos MORI research director Gideon Skinner said researchers were still digesting the final election results.
“Obviously we’ll need to carry out a review as we do after every election,” he told CNN. “But we have to take a bit of a step back — without wanting to gloss over it.”
Skinner said many of Ipsos MORI’s predictions had come to pass, such as its Scottish poll picking up the swing to the Scottish National Party.
“We have to appreciate what polls can do. National polls are not predictions of seats, they’re snapshots of the parties’ vote shares,” he said.
“The Labour vote share is what we’ll want to be concentrating on in our internal review. That clearly was an overestimation.”
Polling company ComRes predicted that the Conservatives would win 35% of the vote, with Labour on 34%.
“We at ComRes have been saying all year that the Tories are ahead,” Tom Mludzinski, ComRes’ head of political polling, told CNN. “We’ve tried to be fairly bullish in saying that.”
“Clearly the Tories have outdone expectations and Labour underperformed.”
Mludzinski said a number of factors could have played a role in the discrepancy, including undecided voters.
“We actually did a poll on the day of about 4,000 people; around 12-13% had made up their minds in the last 24 hours,” he said. “The night before the election we had 20% saying they still might change their minds … We’d like everyone to make up their minds nice and early.”
Mludzinski said so-called “shy Tories” — people not wanting to say they voted Conservative — could also have been a factor as could the fragmentation of the vote between so many parties, but that there was no single reason for the difference in predictions and results.
“There clearly was an industry-wide issue in that no one really picked up the size of this win to the Conservatives,” Mludzinski said.
In a blog post entitled “Poll Dancing,” ComRes chairman Andrew Hawkins explained what he thought had happened, pointing out that pollsters had predicted percentages rather than seat numbers.
These had a plus or minus 3% margin of error, and the election-eve poll had been “statistically on the button,” he said.
“We do indeed, together with academics and the media, need to look at how that vote share translates into House of Commons seats — that is certainly true. But there is no need to throw the baby out with the bathwater. Most of the polls from most of the pollsters were within the margin of error. How they are interpreted and reported needs to be a matter of collective consideration,” Hawkins said.
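The ±3% figure Hawkins cites follows from standard sampling theory: for a simple random sample of around 1,000 respondents, a 95% confidence interval on a vote share near the mid-30s spans roughly three points either way. A minimal sketch of that calculation (illustrative figures only, not any pollster’s actual methodology, which typically also involves weighting and turnout adjustments):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for an estimated proportion p from a
    simple random sample of size n (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical national phone poll samples roughly 1,000 people.
moe = margin_of_error(0.35, 1000)
print(f"+/- {moe * 100:.1f} points")  # -> +/- 3.0 points
```

With a one-point published gap between the parties and a roughly three-point margin of error on each share, the polls could not statistically distinguish a narrow Conservative lead from the near-seven-point one that materialized.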
The polling company Populus tweeted that the election results “raise serious issues” for pollsters.
“We will look at our methods and have urged the British Polling Council (an association of UK polling organizations) to set up a review.”
The Polling Council later announced it was establishing an independent inquiry to examine “the possible causes of this apparent bias” and make recommendations for future polling.
Council president John Curtice — Professor of Politics at Strathclyde University — told CNN that while polls should be judged on their percentages rather than seats, and while it could well be true that many had fallen within their margins of error, an inquiry was still needed.
The polls had been accurate on the SNP, Liberal Democrats and Greens — but they all had an error in the same direction, he said.
“The reason an inquiry has been set up is that actually the industry collectively clearly underestimated the Conservative lead over Labour,” Curtice said.
“The thing above all you need to get right is the Conservative lead over Labour or vice versa, because that is the most politically sensitive,” he said. The miscalculation was one that had diminished over the years but never been entirely eliminated, Curtice said. “(The inquiry) doesn’t presume anything as to the explanation, but it is clearly something the polling industry would profit from trying to understand.”
Writing for The Conversation, Professors Paul Whiteley and Harold D. Clarke said election forecasters were “clearly losers” in the UK election.
“The usual health warnings were issued in the form of statistical uncertainty estimates, but these invitations to prudence were given less attention than they deserved by most consumers of the numbers,” they said.
“Even with high quality survey data with huge sample sizes, predicting hundreds of constituency-level results in a first-past-the-post electoral system with varying patterns of inter-party competition remains a risky business. The 2015 election result forcefully illustrates the point.”
UK vs. US
Jacob Parakilas, the assistant project director for the U.S. Project at the London-based think tank Chatham House, told CNN that a lot of the difficulties and variance in the United Kingdom were down to the “increased complexities of UK elections relative to the U.S. elections.”
“In presidential elections you have 50 constituencies and many of those are safe,” he said, and whether a representative won by 15 or 20 percentage points was immaterial.
“At national level, during elections in the U.S. over the past 10-12 years the polls have been fairly accurate,” he said. “The difference is that it’s a lot easier to poll a two-party race.”
There was a “fundamental difference” between the two systems, Parakilas said.
“Despite the fact that it’s much larger, the U.S. is an easier polling environment,” he said. “The U.S. has gotten relatively polarized. I’m fairly confident that the numbers of people who cast split ballots have declined.”
Prediction models for the U.S. elections had also become more reliable, Parakilas said, something he didn’t believe had happened yet in the UK.
“While this will certainly be observed with interest by American pollsters and political observers, there are a lot of fundamental differences between the political systems in the two countries and the polling that has evolved,” he said. “The direct impact might be limited in the U.S.”
CNN’s Stephen Collinson also said polling had been more accurate in the U.S., pointing out that most public non-partisan polls ahead of the 2012 U.S. elections had gotten it right and that 2008 had been a similar story.
“The volume of polling in the U.S. is much deeper given that there are so many states, the election is so long and there are many more companies that poll. That gives a greater dataset to base predictions on — it might make them less volatile.”
But Collinson said pollsters in the U.S. sometimes had difficulty pinpointing exactly the profile for the electorate: “Before you can do a poll you have to work out what the electorate is and weight the poll accordingly.”
Presidential challenger Mitt Romney hadn’t read his electorate correctly in the 2012 race, Collinson said, and his pollsters’ predictions were thrown when more African Americans and Hispanics turned out to vote than expected.
“I think there are always cautionary tales to be learned from polls and there are always abominations.”
The abundance of polls in the United States also meant many analysts were averaging them out. “By doing that you can get a pretty good view of what the election is going to turn out like,” Collinson said.
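The averaging Collinson describes can be as simple as a mean of recent vote shares, which damps house effects and independent sampling noise. A toy sketch with invented figures (not actual 2015 polls):

```python
# Toy "poll of polls": averaging reduces independent sampling noise,
# but cannot remove a bias shared by every poll, as in the UK in 2015.
polls = [  # hypothetical (conservative_share, labour_share) pairs
    (0.36, 0.35),
    (0.35, 0.34),
    (0.34, 0.35),
    (0.35, 0.33),
]

avg_con = sum(c for c, _ in polls) / len(polls)
avg_lab = sum(l for _, l in polls) / len(polls)
print(f"Poll of polls: Con {avg_con:.1%}, Lab {avg_lab:.1%}")
```

Note the limitation, which is exactly what Silver’s “herding” point highlights: if every poll underestimates the same party, their average inherits that shared error.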
“In the UK, it’s a disaster if they get a poll wrong because it’s the whole country. In the U.S. it’s state by state,” he said.
“A pollster could get North Carolina totally wrong and still be right in Florida and it wouldn’t matter. A national poll has much less use in the U.S. because it’s meaningless.”
The state-by-state division provided more of a “safety net” for pollsters, he said. “It’s not just like one poll that’s going to tell you everything.”
Meanwhile, back in the UK, one polling company, Survation, said it was kicking itself for playing it too safe.
In a blog post entitled “Snatching defeat from the jaws of victory,” CEO Damian Lyons Lowe said its election-eve poll had been close to the final result, with the Conservatives on 37% and Labour on 31% (the final results were 36.9% to 30.4%).
“The results seemed so ‘out of line’ with all the polling conducted by ourselves and our peers — what poll commentators would term an ‘outlier’ — that I ‘chickened out’ of publishing the figures — something I’m sure I’ll always regret,” he wrote.
But Lyons Lowe said there would be no internal review.