Over the last few weeks, we've had multiple discussions about the numbers games in the UK general election. Whether it is looking at swing voters, redistricting or proposed changes to the electoral system, the goal has been to examine the relevance and strength of various factors on the optics and outcome of the upcoming election.
With David Cameron and Gordon Brown alternately on the offensive -- bullying; class war; he's ruined the economy and the budget; the NHS and granny will both die -- and Nick Clegg and the "fourth" parties trying desperately to make the case for wholesale change, personality and polling have made for a dynamic series of news cycles.
This is not lost on the major pollsters in the UK, upon whose work a significant amount of media back-and-forth is written.
Each time a new poll comes out, as is often the case in the US as well, the key reported information is the top-line party and candidate numbers and how they have changed since previous polls by the given newspaper/TV station and pollster pair. The poll-release formula then dictates that any minute change be cast in quasi-apocalyptic terms for the party(ies) or candidate(s) on the decline. Finally, talking points from the candidates and parties involved are mentioned, usually with the implication that some recent event caused the shift in the numbers.
As the election comes to a close, however, it is a slightly different barometer (rather than newsworthiness) that occupies a pollster's mind. In fact, throughout the election campaign, horserace polling is in some ways simply a series of test runs that calibrate for the real test: the final numbers. Get them right and you reign supreme until the next election; get them wrong and you toil in obscurity for the foreseeable future.
Hyperbole aside, the truth is that the economics of being a pollster today dictate an approach not too dissimilar from the one above. Nick Moon of the pollster GfK-NOP explained to me that with "very little money," the political polling industry in Britain today is "very cut-throat." When I spoke with Andrew Cooper of Populus (another pollster) he concurred, adding that "with low entry cost in new internet polling and quite small number of viable clients, there is consolidation bound to happen."
Peter Kellner of the prolific online pollster YouGov went a step further in explaining how the business model works for them. "YouGov does polling on lots of subjects, most of them not politics," he said. With numerous offices across Europe, North America and the Middle East, YouGov brands itself as a "market research agency," with a 250,000-strong internet panel whose members can answer just about any question a client might like to ask.
This is where the pollsters really make their money: not on the dense sets of national political polling that come around each general election, but on day-in, day-out market research for companies, governments, political and economic organizations and think tanks.
This has an important impact on the way national polling is done in British politics. For the major British election pollsters, "getting it right" is measured by how close you come to the national popular vote shares, looking at the three large parties and lumping all the "fourth parties" together in a single additional category.
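As a minimal sketch of that yardstick -- using made-up, purely illustrative numbers, not any pollster's actual figures -- the scoring amounts to comparing a final poll's vote-share estimates against the result, category by category:

```python
# Hypothetical illustration of scoring a final poll against the actual result,
# with the three large parties plus an "Other" bucket for the fourth parties.

def polling_error(predicted, actual):
    """Mean absolute error, in percentage points, across the categories."""
    assert predicted.keys() == actual.keys()
    return sum(abs(predicted[p] - actual[p]) for p in predicted) / len(predicted)

# Illustrative figures only -- not any real pollster's final numbers.
final_poll = {"Con": 35, "Lab": 28, "LD": 27, "Other": 10}
result     = {"Con": 36, "Lab": 29, "LD": 23, "Other": 12}

print(polling_error(final_poll, result))  # 2.0 points off, on average
```

Note that a poll can score well on this measure while still saying nothing about seats, which is the limitation the rest of the piece turns on.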
After the 2005 election, the British Polling Council released the following figures regarding the accuracy of the top five UK pollsters:
Unfortunately, as we have seen repeatedly, the popular vote shares listed tell us only a fraction of the story. Only the GfK-NOP exit polling on election day itself included an actual seat projection (which, by the way, they got right -- though only, Nick Moon admitted, because "two minor biases canceled each other out"). Other election day seat projections were left to John Curtice of the University of Strathclyde and his team at the BBC, rather than to the pollsters themselves.
In an election that is likely to come down to the wire, of course, the fraction of the story told by the national popular vote is simply not enough. Even the inclusion of occasional broad polls of marginal seats, which according to Nick Moon "don't get included in the poll-of-polls" calculations by most media outlets, will not tell the whole story.
But at the end of the day, that is not the point of media-based national polling, says Peter Kellner of YouGov. "We don't include undecideds or refusals because we ask how people would vote if the election were held tomorrow," he explained. These polls are intended to show a snapshot of the race, he continued, "not the final winner."
Of course, in political polling or otherwise, it is pretty easy to win when you set your own rules.
Renard Sexton is FiveThirtyEight's international affairs columnist and is based in Geneva, Switzerland. He can be contacted at email@example.com