There are two versions of what has happened in the past three weeks in the battle to be US President. One is the version told by most nationwide polls and accepted by the media; the second, told by a minority of nationwide polls, including YouGov, and most polls in the key battleground states, is significantly different.
Version one says that the first television debate between Barack Obama and Mitt Romney was a game-changer. If we average the polls conducted by Gallup, Pew, Ipsos, ARG and the Daily Kos, we find that before the debate, Obama was ahead by four points; afterwards Romney led by four – a shift in the lead of eight points. Before the debate, Obama was heading for a clear victory; afterwards, Romney looked the more likely winner. Since then, the contest has narrowed a little, but Romney has held most of his initial gains.
Version two says that the first debate made only a small difference. If we average the polls conducted by YouGov, Rasmussen and ABC/Washington Post, then the debate shifted the nationwide vote shares by just a single point: from an Obama lead beforehand of 2% to an Obama lead of 1% afterwards. The figures have stuck close to that ever since. (YouGov’s latest survey, completed this Monday, shows Obama 2% ahead.)
Movements in polls in the key states sit nearer version two than version one. If we average their findings, then Florida tipped from Obama to Romney after the first debate, but Obama remained ahead in other key states – notably Ohio, Pennsylvania, Wisconsin, Nevada and New Mexico. Obama narrowly led in Virginia and Colorado before the first debate; afterwards, they were too close to call. On these figures, Obama would still win the electoral college, even if Romney won Virginia and Colorado.
Why these two kinds of difference – between different national polls, and between most national and most state-level polls? The answer, I’m afraid, requires some technical delving into the data. For those who DON’T want to come on this journey, my advice is to believe the majority of state polls and the minority of national polls (including YouGov). I believe that the bulk of the media (especially, but not only, Fox News), with their natural tendency to fasten onto the more dramatic polling shifts, have simply got the story of the past three weeks wrong. Obama has remained ahead since last month’s Democratic convention. His (modest) lead has probably narrowed fractionally since the beginning of October, but any movements in the national figures, and in most state-level races, have been within the margin of error.
Now for the explanation. In all polls these days, the raw data must be handled with care. It’s normal for the sample to contain too many people in some groups, and too few in others. So all reputable pollsters adjust their raw data to remove these errors. It is standard practice to ensure that the published figures, after correcting these errors, contain the right number of people by age, gender, region and either social class (Britain) or highest educational qualification (US). Most US polls also weight by race.
Beyond that, there are two schools of thought. Should polls correct ONLY for these demographic factors, or should they also seek to ensure that their published figures are politically balanced? In Britain these days, most companies employ political weighting. YouGov anchors its polls in what our panel members told us at the last general election; other companies ask people in each poll how they voted in 2010, and use this information to adjust their raw data. Ipsos-MORI are unusual in NOT applying any political weighting.
A similar variation applies to the US. The difference is that there, most pollsters apply only demographic weights. YouGov and Rasmussen are unusual in taking account of political partisanship – whether people think of themselves generally as Democratic, Republican or Independent. (This is generally known as party identification, or party ID.)
In the case of YouGov’s weekly nationwide polls for The Economist, we apply the same overall principle as in Britain. That is, we draw on baseline information in order to ensure consistency in the political profiles of our samples. Specifically we know the party ID of most panel members last December, and also how they voted in 2008 and/or 2010, from what they told us at the time. (What’s important is not that the month we collected a mass of ID data happened to be last December; rather, the point is that we have a common baseline for anchoring this year’s polls.) This doesn’t remove all risk of sampling variation – sadly, we have been unable to repeal the laws of probability – but it does reduce the risk of a rogue poll in which our sample is demographically fine but politically skewed.
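For readers who like to see the mechanics, here is a much-simplified sketch in Python of the two kinds of weighting described above. It is not YouGov’s actual weighting scheme, and all the figures in it are invented; real polls balance several dimensions at once, typically by iterative “raking”. But the basic arithmetic is the same: each respondent is weighted by their group’s target share divided by that group’s share of the sample.

```python
# Simplified weighting sketch (invented figures, not any pollster's real scheme).
def weights(targets, sample_counts):
    """Weight for each group = target share / sample share."""
    n = sum(sample_counts.values())
    return {group: targets[group] / (sample_counts[group] / n) for group in targets}

# Demographic weighting: this sample has too few under-30s, so they get a
# weight above 1, while over-represented groups get weights below 1.
age_targets = {"18-29": 0.22, "30-44": 0.26, "45-64": 0.34, "65+": 0.18}
age_sample  = {"18-29": 150,  "30-44": 250,  "45-64": 380,  "65+": 220}
print(weights(age_targets, age_sample))

# Political anchoring: the same arithmetic applied to party ID, with the
# targets fixed at a baseline profile (what panel members told us earlier)
# rather than whatever political mix happens to respond in a given week.
pid_baseline = {"Dem": 0.37, "Rep": 0.32, "Ind": 0.31}
pid_sample   = {"Dem": 330,  "Rep": 370,  "Ind": 300}
print(weights(pid_baseline, pid_sample))
```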
That is not all. YouGov has also conducted two large-scale surveys in 25 states for CBS News, one before the first TV debate and one afterwards. These covered all the battleground states, plus the largest states such as California, Texas and New York. The key point is that this was a true panel study. We questioned the same people twice. This allowed us to investigate what change, if any, took place at the level of individual voters, NOT by comparing results from different samples. Any change in the numbers in such panel studies reflects real changes by real voters. And our overall sample was much larger than normal. We polled almost 33,000 electors in September, and reinterviewed more than 25,000 of them after the first debate.
The message from this study was clear. The Romney bounce was tiny. Overall, YouGov found just a one-point narrowing of Obama’s lead.
Or rather, those were the published figures. Had we adjusted the raw data only for demographics, and treated the two polls as separate samples, then we would have reported a five-point shift, from an Obama lead of 4% in these 25 states to a Romney lead of 1%.
However, when we compared the intentions of the people who completed both surveys, we found hardly any net movement. Those who backed Obama in September divided as follows after the first debate: Obama 93%, Romney 3%, undecided 4%. Romney’s September supporters divided: Obama 3%, Romney 94%, undecided 3%. (The modest number of those who did not take sides in September, but who did this month, divided evenly between the two men.)
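To see why those switching rates imply only a small net change, push some illustrative September shares through them. The starting shares below are assumptions for illustration, not our published figures; the switching rates are those just quoted.

```python
# Applying the panel's actual switching rates to illustrative September shares (%).
sept = {"Obama": 49.0, "Romney": 45.0, "Undecided": 6.0}   # assumed, for illustration

# Post-debate destinations of each September group:
obama_now  = sept["Obama"] * 0.93 + sept["Romney"] * 0.03
romney_now = sept["Obama"] * 0.03 + sept["Romney"] * 0.94
# Assume half of September's undecideds picked a side, splitting evenly:
obama_now  += sept["Undecided"] * 0.25
romney_now += sept["Undecided"] * 0.25

print(f"September lead:   Obama {sept['Obama'] - sept['Romney']:+.1f}")   # +4.0
print(f"Post-debate lead: Obama {obama_now - romney_now:+.1f}")           # about +3
```

On these assumptions, Obama’s lead narrows by roughly a point – which is what the published panel figures showed.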
So we have a discrepancy between what we reported – only a tiny net movement after the first debate – and what we would have reported had we acted like most other pollsters.
Here’s the reason. Those who supported Romney in September were more likely than Obama supporters to respond to our follow-up survey after the first debate. The recontact rate was 80% for Romney supporters and 74% for Obama supporters. That’s why the raw numbers for the post-debate poll – and the numbers adjusted only for demographics – appeared to tilt the whole contest towards Romney. Had we reported these, we would have been in line with most other pollsters.
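To put numbers on the effect: start from some illustrative September shares, assume that not a single voter switches, and apply the recontact rates above (the rate for undecided voters is a further assumption). The raw follow-up sample alone makes the race look several points closer than it actually is.

```python
# The effect of differential recontact alone, with nobody changing their vote.
sept      = {"Obama": 48.0, "Romney": 44.0, "Other/undecided": 8.0}   # assumed shares (%)
recontact = {"Obama": 0.74, "Romney": 0.80, "Other/undecided": 0.77}  # third rate assumed

raw    = {k: sept[k] * recontact[k] for k in sept}
total  = sum(raw.values())
shares = {k: 100 * v / total for k, v in raw.items()}

print(f"September lead:            Obama {sept['Obama'] - sept['Romney']:+.1f}")
print(f"Raw recontact-sample lead: Obama {shares['Obama'] - shares['Romney']:+.1f}")
# The apparent lead shrinks by more than three points purely because Romney
# supporters were keener to respond, even though no individual changed sides.
```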
We didn’t, because we take one side in a specific argument about political polling: whether party ID is stable in the short term. At YouGov, we believe – and experiments from time to time support this – that party ID changes only slowly. Other companies argue that it can change sharply in response to specific events, such as a successful party convention, or a TV debate in which one candidate emerges as the clear winner (as Romney did in the first debate).
At this point, it would plainly be helpful to report how the party ID figures for all the major pollsters moved after the first debate. Sadly, US pollsters are not generally as open as British pollsters. However, we do have numbers from two companies that collect party ID data but don’t weight to it. Both have deservedly high reputations.
The first is Pew, which reported an 8% Obama lead in mid-September and a 4% Romney lead after the first debate. Its party ID figures also moved sharply, from a 9% Democratic lead in September to a 1% Republican lead this month.
The second is the ABC/Washington Post poll. This actually reported a slight INCREASE in Obama’s national lead after the first debate, from 2% to 3%. And its party ID figures moved in the same direction, from a 5% to a 9% Democratic lead.
In short, the reason for the gulf between the Pew and ABC/WP results comes down to the different political compositions of their samples. Had both weighted their data to hold their party ID figures constant, both would have reported a tiny shift to Romney – which is exactly what YouGov did report.
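The scale of the composition effect is easy to illustrate. Take Pew’s reported party-ID leads, hold partisan loyalty fixed, and see how much of the headline swing follows from the changed mix of the sample alone. The loyalty rates and the exact three-way splits below are my assumptions for illustration, not Pew’s published tables.

```python
# How much of Pew's reported swing could the change in party-ID composition
# explain on its own? Loyalty rates and exact splits are assumptions.
def implied_lead(dem, rep, ind, loyalty=0.92, defect=0.04, ind_split=0.45):
    """Obama lead (points) implied by a party-ID composition, holding
    partisan loyalty and an even independent split fixed."""
    obama  = dem * loyalty + rep * defect + ind * ind_split
    romney = rep * loyalty + dem * defect + ind * ind_split
    return obama - romney

# Compositions consistent with Pew's reported party-ID figures:
# Democrats +9 in September, Republicans +1 after the first debate.
print(f"September (D+9): Obama {implied_lead(37, 28, 35):+.1f}")
print(f"October  (R+1):  Obama {implied_lead(31, 32, 37):+.1f}")
# With loyalty held constant, the composition shift alone moves the implied
# lead by roughly nine points - most of the twelve-point swing Pew reported.
```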
What we can therefore be fairly sure of is that the first TV debate made little or no difference to the (high) degree of loyalty Democrats and Republicans display towards the two candidates. It is NOT the case that many voters switched from Obama to Romney. The question, rather, is whether the first debate caused the number of Democratic-ID Americans to fall, and Republican-ID Americans to rise. Pew’s figures suggest they did; YouGov believes (and the ABC/Washington Post figures suggest) they didn’t.
Or rather, YouGov’s data indicate that what changed was not party ID itself but response rates: Republicans were slightly keener than Democrats to respond to the second survey. This is consistent with other occasions when polls have tended to show movement after specific events, notably post-convention bounces. The figures for Republican ID tend to rise after Republican conventions, and for Democratic ID to rise after Democratic conventions. Past panel-based surveys indicate that this reflects shifts in response rates among Democratic and Republican voters, not significant changes in voters’ attitudes to each party.
Of course, differential response rates can matter. If one party’s supporters are more enthusiastic than the other’s, they are probably more likely to vote. However, this requires two things to happen. First, differential response rates must equate to enthusiasm; second, changes in enthusiasm must persist until election day. Neither is certain. Well-informed poll-watchers in the US, such as Nate Silver (New York Times) and Mark Blumenthal (Huffington Post) say response rates to US telephone polls are now below 10%. So pollsters fail to reach more than nine out of ten Americans they try to contact. It takes only a tiny change in relative response rates to have a large impact on the results.
For example, suppose the non-response rates among supporters of Obama and Romney one week are both 90%, and a poll shows them level. Suppose the next week, every American has the same voting intention, and all that changes is that the non-response rate among Obama’s supporters rises by just one percentage point, to 91%. With political weighting, the published result would be the same; without it, Romney jumps to a five-point lead.
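The arithmetic is easy to check:

```python
# Checking the example: 1,000 supporters of each candidate, and the only
# change is the non-response rate among Obama's supporters.
obama_supporters = romney_supporters = 1000

weeks = {
    "Week 1": (obama_supporters * 0.10, romney_supporters * 0.10),  # 90% non-response on both sides
    "Week 2": (obama_supporters * 0.09, romney_supporters * 0.10),  # Obama non-response rises to 91%
}

for label, (obama, romney) in weeks.items():
    total = obama + romney
    print(f"{label}: Obama {100 * obama / total:.1f}%, Romney {100 * romney / total:.1f}%")
# Week 1 is a 50-50 tie; Week 2 comes out roughly 47.4% to 52.6%, a lead of
# about five points for Romney, from a one-point change in response rates.
```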
Secondly, even if response rates a few weeks before polling day do reflect a relative shift in enthusiasm, short-term movements such as these are likely to fade as election day draws closer and the campaigning grows more intense.
That could be why swing-state polls have shown Obama remaining ahead, even as national polls have shown Romney taking the lead. In battleground states, the campaigning is already intense; it is possible that more voters on both sides are firmer in their partisanship – and therefore, the impact on polling response rates of specific events, such as the first TV debate, is likely to be less.
Two further points. Apart from the issue of political weighting, telephone polls in the US divide between traditional polls conducted by human beings, and robopolls done automatically by computers dialing voters. Robopolls are cheaper; or, for the same cost, can reach more people faster. The trouble is that, under US law, robopolls can dial only landlines. They can’t reach the one-in-three Americans who have only mobile phones. Another 18% of Americans have landline phones but seldom use them. These figures rise dramatically among the under-30s. They are the people who are keenest on Obama. So we should not be surprised that robopolls, both nationally and in swing states, tend to produce slightly higher figures for Romney than conventional telephone polls in which real people dial both landline and mobile phone numbers.
If the overall result were clear-cut, the small differences between robopolls and live-interviewer polls would be of interest only to obsessive poll-watchers. In the current very close race, they tell very different stories. It looks increasingly as if the state that will decide the outcome next month will be Ohio. According to robopolls, the state, and therefore the nation, are too close to call; according to the live-interviewer polls (and YouGov’s online polls), Obama enjoys a modest but consistent lead in Ohio and is on course for a second term in the White House.
The second point is specific to YouGov. According to US census data, just 71% of eligible Americans are registered to vote. In 2008, almost 90% of those who were registered did vote. So in any poll, it is vital to know which respondents are on the register. Telephone polls simply ask people and accept their answer. Inevitably, some people give the wrong answer – not necessarily dishonestly, but because they think they are on the register when in fact they are not.
YouGov is different. Because we have recruited an online panel, we are able to check each respondent against their state’s actual register. In our election surveys we draw on a pool of around 150,000 Americans who we know are on the register. This doesn’t mean we are bound to be closest to the actual election result. We do all we can to minimize sampling error but can never guarantee absolute accuracy. What it does mean is that we are on firm ground on one important aspect of polling American elections: we do know which members of our panel are actually eligible to vote.
Does all this mean that Obama is certain to win on November 6? No. The race is too close. There have been past elections where a late shift in the national mood has changed the outcome. And the ground war could be decisive – the battle by local Romney and Obama activists in the key states to find all their supporters and make sure they turn out on the day (or vote early, as millions of Americans now do, by post or in person). What we can now be fairly sure of is that neither of the first two debates (a Romney triumph in the first one and an effective Obama fight-back in the second) made much of a difference.
YouGov will return to our panel of voters in 25 states next week for CBS, as well as conducting further national polls for The Economist. Four years ago our final national poll came within one point of the result, and we called the right winner in all bar one very closely-fought state. Meanwhile, when polls report big shifts in support – especially when they are from companies that do not weight their data to ensure politically representative samples – we should remember the old truth: dramatic polling movements make for bold headlines, but are not always right.