Selection bias at PolitiFact?
posted at 9:30 am on February 10, 2011 by Ed Morrissey
For quite a while, a debate has simmered about whether PolitiFact operates from a political bias. The Pulitzer Prize-winning feature from the St. Petersburg Times in Florida rates the truthfulness of public statements by politicians and activists on a scale ranging from True to Pants On Fire. Republicans have complained for quite a while that PolitiFact aims more at the GOP, especially when PolitiFact named the allegation that ObamaCare was a government takeover of health care the “Lie of the Year.”
Now, Eric Ostermeier at Smart Politics has published the results of a study he conducted showing that Republicans have received much harsher treatment than Democrats over the last 13 months:
PolitiFact assigns “Pants on Fire” or “False” ratings to 39 percent of Republican statements compared to just 12 percent of Democrats since January 2010 …
But although PolitiFact provides a blueprint as to how statements are rated, it does not detail how statements are selected.
For while there is no doubt members of both political parties make numerous factual as well as inaccurate statements – and everything in between – there remains a fundamental question of which statements (by which politicians) are targeted for analysis in the first place.
A Smart Politics content analysis of more than 500 PolitiFact stories from January 2010 through January 2011 finds that current and former Republican officeholders have been assigned substantially harsher grades by the news organization than their Democratic counterparts.
In total, 74 of the 98 statements by political figures judged “false” or “pants on fire” over the last 13 months were given to Republicans, or 76 percent, compared to just 22 statements for Democrats (22 percent).
Ostermeier notes that the breakdown of statements reviewed is more or less evenly split, with 50.4% of the statements from Republican public officials, 47.2% from Democrats, and the small remainder from independents. But that’s curious in itself, as Ostermeier later points out:
What is particularly interesting about these findings is that the political party in control of the Presidency, the US Senate, and the US House during almost the entirety of the period under analysis was the Democrats, not the Republicans.
To remind everyone, that control wasn’t exactly an even split, either. Democrats had sixty percent of the Senate seats, and close to the same percentage in the House. A Democrat was in the White House. Republicans controlled nothing in Washington. What made Republicans so attractive to PolitiFact, especially if the oft-expressed purpose of the Fourth Estate is to hold government accountable?
Ostermeier notes that PolitiFact itself has described a rather ad hoc approach to selection:
When PolitiFact Editor Bill Adair was on C-SPAN’s Washington Journal in August of 2009, he explained how statements are picked:
“We choose to check things we are curious about. If we look at something and we think that an elected official or talk show host is wrong, then we will fact-check it.”
If that is the methodology, then why is it that PolitiFact takes Republicans to the woodshed much more frequently than Democrats?
The answer to the overall question could still be that Republicans simply make more Pants on Fire and False statements, and that PolitiFact is merely a disinterested referee. However, the numbers suggest that PolitiFact is more “curious” about Republican statements and less curious about Democratic statements, even when Democrats vastly outnumbered Republicans in the halls of power. And that certainly is … curious.