Allahpundit wrote a lengthy piece about last night’s CNN poll, complete with some insightful commentary about the cross-tabs. If you missed it — and judging by the number of comments on the post, it doesn’t appear that many of you have — be sure to read it now. However, as AP pointed out, the sample data lacked a few details, the most prominent of which was the partisan split of the sample. Given the inclination of media polls to use wildly unrepresentative D/R/I splits in their samples, the lack of transparency on that point is telling.
That might not be the biggest problem with the poll, though. Its biggest problem is … math. Reader Raymond O did some math of his own and, in an e-mail last night, asked a rather interesting set of questions about how CNN did theirs.
First, let’s start with the topline results, as reported by CNN: Obama 52%, Romney 43% among registered voters, 53/41 among all respondents. If that’s the case, then the number of respondents in the latter case voting for Obama should be 538, and the number supporting Romney 416.
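A quick sanity check shows where those counts come from. This is a sketch using only the numbers in the post; CNN’s release did not disclose the exact sample size, so the total is inferred, not quoted:

```python
# Back out the implied all-respondent sample size from CNN's topline.
# Counts (538 Obama, 416 Romney) and percentages (53/41) are from the post;
# the exact sample size was not disclosed, so this is an inference.
obama_count, romney_count = 538, 416
obama_pct, romney_pct = 0.53, 0.41

implied_n_from_obama = obama_count / obama_pct    # ~1015 respondents
implied_n_from_romney = romney_count / romney_pct  # ~1015 respondents
print(round(implied_n_from_obama), round(implied_n_from_romney))
```

Both percentages point back to a total sample of roughly 1,015 adults, which is why a 53/41 split implies about 538 Obama supporters and 416 Romney supporters.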
However, when reading the questions on page 3 of the poll report, that’s not at all what we see:
BASED ON 484 RESPONDENTS WHO PLAN TO VOTE FOR OBAMA — SAMPLING ERROR: +/- 4.5 PERCENTAGE PTS.
3. (Asked of Obama voters) Is that more a vote FOR Barack Obama or more a vote AGAINST Mitt
BASED ON 476 RESPONDENTS WHO PLAN TO VOTE FOR ROMNEY — SAMPLING ERROR: +/- 4.5 PERCENTAGE PTS.
4. (Asked of Romney voters) Is that more a vote FOR Mitt Romney or more a vote AGAINST Barack
Since the combined total of the two (960) exceeds their count of registered voters in the survey (910), we have to assume this refers to the general-population response. That’s wildly different from the 53/41 split that CNN reports from the poll. In fact, measured against the full adult sample, it’s only a 48/47 split for Obama. And given that the poll shows a slightly better result for Romney among registered voters, it’s not difficult to conclude that Romney probably led in that category before CNN’s pollster shifted the results around to this extent.
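The 48/47 figure can be reproduced from the raw counts. This sketch assumes the full adult sample is the roughly 1,015 respondents implied by the 538/0.53 topline math; that total is an inference, since CNN did not publish it:

```python
# Raw (unweighted) split for Obama and Romney over the full adult sample.
# 484 and 476 are the respondent counts CNN printed on page 3 of its report;
# the ~1,015 total is inferred from the topline, not disclosed by CNN.
obama_raw, romney_raw = 484, 476
implied_total = 1015

obama_share = obama_raw / implied_total   # ~47.7% -> rounds to 48
romney_share = romney_raw / implied_total  # ~46.9% -> rounds to 47
print(f"{obama_share:.1%} / {romney_share:.1%}")
```

Rounded, that is the 48/47 split cited above, a far cry from the reported 53/41.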
Obviously, the pollster weighted the results — dramatically. But on what basis? Pollsters weight results in likely-voter models; in fact, that’s a vital part of the likely-voter modeling process. I’m not aware of any need to weight gen-pop polls unless the demos in the survey are wildly deficient, and since CNN didn’t disclose those, we don’t know — but it hardly imbues these results with confidence. Speaking of which, the actual 484 responses for Obama fall slightly more than 10% short of the roughly 538 it would take to get to a 53/41 split, far outside the margin of error announced in the data (+/- 4.5%). In other words, CNN’s pollster produced results that we’re supposed to believe represent the general population’s views within 4.5 points, and then proceeded to weight them in a way that increased Obama’s advantage by almost 11%.
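The size of that adjustment can be checked directly. This sketch uses the 484 unweighted Obama respondents from page 3 of the report and the roughly 538 implied by the weighted 53% topline (the latter is an inference from the post’s arithmetic, not a figure CNN published):

```python
# How far the raw Obama count falls short of the weighted topline,
# and how large an upweight it takes to close the gap.
raw_obama = 484      # unweighted Obama respondents, per page 3 of the report
needed_obama = 538   # count implied by the weighted 53% topline (inferred)
margin_of_error = 0.045  # +/- 4.5 points, as announced by CNN

shortfall = (needed_obama - raw_obama) / needed_obama  # ~10.0% short
upweight = needed_obama / raw_obama - 1                # ~11.2% boost
print(f"shortfall {shortfall:.1%}, upweight {upweight:.1%}")
```

The raw count is about 10% below the weighted figure, so matching the topline requires inflating Obama’s number by roughly 11% — well beyond the 4.5-point margin of error the poll claims for itself.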
Again, I’m not aware of any need to weight results in general-population polls, certainly not to the extent seen here, and CNN did not disclose that it weighted the results at all, least of all in the general-population numbers. Without the rest of the demographic data from this survey and the weighting methodology, this poll should be treated as utter fantasy.
Cross-posted at Hot Air.