First off, it should be noted that selection bias occurs whenever a sample taken for statistical analysis has not been chosen randomly. Given the title, as well as a few other indicators shown later, it should be clear that this article is focusing on party-related selection bias. However, we should not jump to conclusions yet; it is important to actually read the article to see whether the title is even justified.
MISLEADING TITLES
" [TITLE:] Selection Bias? PolitiFact Rates Republican Statements as False at 3 Times the Rate of Democrats.
[BODY:] PolitiFact assigns "Pants on Fire" or "False" ratings to 39 percent of Republican statements compared to just 12 percent of Democrats since January 2010. "
The difference between the title and the body is subtle, but incredibly important. The title suggests the trend of "false" ratings is an overall trend throughout the 4 1/2 year history of Politifact. However, the actual article only looks at a 13-month window, mostly 2010. The reader should take note that this is not a random sample. In fact, as I will show later in this article, this distinction seriously challenges any possible conclusion of party-related selection bias.
A BIT OF BACKGROUND
" PolitiFact, the high profile political fact-checking operation at the St. Petersburg Times, has been criticized by those on the right from time to time for alleged bias in its grading of statements made by political figures and organizations. "At least in my experience, these allegations are often incredibly poor (also see linked critiques for analysis of other articles alleging bias within Politifact).
" The organization (and now its more than a half dozen state offshoots) grades statements made by politicians, pundits, reporters, interest groups, and even the occasional comedian (anyone 'driving the political discourse') on a six point "Truth-O-Meter" scale: True, Mostly True, Half True, Barely True, False, and Pants On Fire for "ridiculously" false claims.
But although PolitiFact provides a blueprint as to how statements are rated, it does not detail how statements are selected.
For while there is no doubt members of both political parties make numerous factual as well as inaccurate statements - and everything in between - there remains a fundamental question of which statements (by which politicians) are targeted for analysis in the first place. "
Politifact actually does now provide a description of how it chooses its statements:
"Every day, PolitiFact staffers look for statements that can be checked. We comb through speeches, news stories, press releases, campaign brochures, TV ads, Facebook postings and transcripts of TV and radio interviews. Because we can't possibly check all claims, we select the most newsworthy and significant ones.
In deciding which statements to check, we ask ourselves these questions:
- Is the statement rooted in a fact that is verifiable? We don’t check opinions, and we recognize that in the world of speechmaking and political rhetoric, there is license for hyperbole.
- Is the statement leaving a particular impression that may be misleading?
- Is the statement significant? We avoid minor "gotchas" on claims that obviously represent a slip of the tongue.
- Is the statement likely to be passed on and repeated by others?
- Would a typical person hear or read the statement and wonder: Is that true?"
This page was not available when Ostermeier wrote his article, so I cannot criticize him for failing to notice it. However, it should be noted that Ostermeier's statement is no longer true (through no fault of his own, of course).
At this point, the article outlines its findings, drawing no conclusions at the moment.
IS IT APPROPRIATE TO DRAW CONCLUSIONS FROM 2010 ALONE?
Before we go any further, let us return to a point I made earlier. Remember that the sample of posts for this article included everything from January 2010 through January 2011, so we should ask ourselves whether that choice of dates matters. To help answer this question, I took a look at data from 2007 to see if it differed from 2010 (note that when I say 2010 in this section, I am including January 2011 to reflect Ostermeier's article).
Let's compare the 2007 results to the findings of Ostermeier for 2010 (a short sketch after this list shows how such rates are computed from raw counts):
- Out of 39 statements rated "False" or "Pants on Fire" in 2007, each main party received approximately 50% of the rulings. This is very different from 2010, where Republicans received 75.5% of these ratings.
- Politifact devoted approximately equal attention to Republicans (124 statements, 52%) and Democrats (110 statements, 46%). This is nearly the same as 2010.
- Republicans were graded in the "False" or "Pants on Fire" categories 16% of the time (39% in 2010), while Democrats were rated in these categories 16% of the time as well (12% in 2010). This is nowhere close to the "super-majority" found in 2010.
- 73% of Republican statements in 2007 received one of the top 3 ratings (47% in 2010), as opposed to 75% of Democratic statements (75% in 2010). Looking at "True" statements alone, Republicans beat Democrats, with 5% more of their statements receiving that coveted rating. It is hard to see how one party was considered more or less truthful than the other; it depends on how much credit you give the "Half True" rating as opposed to the "True" rating.
- Republicans received a slightly larger share of "Mostly False" ratings than Democrats (a difference of 1.79 percentage points). This is the same as in 2010. However, it results in only a 2% difference between Republicans and Democrats for the bottom 3 ratings, which is MUCH different from the 28% difference in 2010.
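To make the arithmetic behind these comparisons concrete, here is a minimal Python sketch. The counts below are hypothetical approximations reconstructed from the percentages quoted above and from Ostermeier's headline figures; they are not exact PolitiFact tallies.

```python
# A minimal sketch of the rate arithmetic above. The counts are hypothetical
# approximations reconstructed from the percentages quoted in this post and in
# Ostermeier's article; they are NOT exact PolitiFact data.

counts = {
    "2007": {"Republican": {"total": 124, "false_or_pof": 20},
             "Democrat":   {"total": 110, "false_or_pof": 18}},
    "2010": {"Republican": {"total": 191, "false_or_pof": 74},   # totals assumed,
             "Democrat":   {"total": 179, "false_or_pof": 22}},  # chosen to give ~39% / ~12%
}

def bottom_rating_rate(party_counts):
    """Share of a party's own rated statements that landed in False/Pants on Fire."""
    return party_counts["false_or_pof"] / party_counts["total"]

for year, parties in counts.items():
    rates = {party: bottom_rating_rate(c) for party, c in parties.items()}
    print(year, {party: f"{rate:.0%}" for party, rate in rates.items()})
```

The only point of the sketch is that these are per-party rates, each computed against that party's own total number of rated statements, not shares of a single combined pool.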
OSTERMEIER STARTS POKING AT CONCLUSIONS
However, Ostermeier then starts unjustifiably suggesting certain conclusions while ignoring others.
" What is particularly interesting about these findings is that the political party in control of the Presidency, the US Senate, and the US House during almost the entirety of the period under analysis was the Democrats, not the Republicans.
And yet, PolitiFact chose to highlight untrue statements made by those in the party out of power. " (emphasis mine)
It sounds like Ostermeier has already come to the conclusion that Politifact is essentially guilty of party-related selection bias, since it "chose to highlight untrue statements made by those in the party out of power." This does not necessarily mean Politifact is choosing these statements because they came from Republicans, just that they came from the party out of power. The data from 2007 does not provide any hint as to the truth of this possibility, since neither party was fully "in power" during that time: Democrats controlled both the House and Senate while Republicans controlled the White House. Ostermeier, however, fails to ask a few questions:
- Do parties out of power tend to be more likely to spread falsehoods, possibly in order to discredit the party in power (The party in power may not need to spread falsehoods as often since they are already in power)?
- Was there a sensational movement (such as the TEA Party) that may have caused more air time to go towards the party out of power, increasing the chance of falsehoods (see first bullet)?
- Did the party out of power decide its number one priority should be to oust the current party from power? Do these kinds of campaigns tend to produce falsehoods?
- Did the party out of power feel the need to attack legislation (Obamacare) coming from the party in power, regardless of factual accuracy (See first bullet)?
- Was the party out of power often given uncritical air time from incredibly popular news outlets?
OSTERMEIER DOES ACTUALLY MAKE A GOOD POINT
" An examination of the more than 80 statements PolitiFact graded over the past 13 months by ideological groups and individuals who have not held elective office, conservatives only received slightly harsher ratings than liberals. "
One would think this alone would challenge the suspicion of a liberal bias, maybe even the suspicion of a Democrat-centered bias. Is Politifact biased against Republicans, but not conservatives? That sounds like it would take quite a conspiracy-theory-style rationale to justify. In fact, could there be other possibilities that could explain this kind of trend in 2010?
- Were there sensational primary elections within the Republican Party that may have contributed to extra coverage? Did these elections involve a bloc of furious voters who may have been more prone to falsehoods?
- Could it be that at least a few former politicians became political commentators (also remember party out of power points)?
" When PolitiFact Editor Bill Adair was on C-SPAN's Washington Journal in August of 2009, he explained how statements are picked:
"We choose to check things we are curious about. If we look at something and we think that an elected official or talk show host is wrong, then we will fact-check it."First off, it should be noted that Bill Adair has essentially admitted Politifact's selection of statements is not actually random. However, since his selection criteria does no appear to be dependent on political affiliation, the criteria can still be thought of as random in regards to it.
If that is the methodology, then why is it that PolitiFact takes Republicans to the woodshed much more frequently than Democrats?
One could theoretically argue that one political party has made a disproportionately higher number of false claims than the other, and that this is subsequently reflected in the distribution of ratings on the PolitiFact site. "
Ostermeier does suggest a possible reason why Republican politicians are receiving the brunt of the bad ratings. In fact, I can imagine a Democrat saying Ostermeier's data "shows that Republicans are more prone to falsehoods than Democrats." Now, as our 2007 data shows, whatever trend Ostermeier found in 2010 did not exist in 2007. We could ask ourselves if something changed in the Republican Party (like the TEA Party) that made the party more likely to embrace falsehoods. Notice that the stats for Democratic statements were practically the same in 2007 as they were in 2010; the big difference came in Republican statements, possibly suggesting a change in the Republican Party. Now, I am not saying this is definitely the answer. However, I see no evidence that eliminates this possibility in favor of any possibility involving party-related selection bias.
" However, there is no evidence offered by PolitiFact that this is their calculus in decision-making. "I'm not 100% sure why he says "However." If one political party is making "a disproportionately higher number of false claims than the other," wouldn't one expect a random large sample of those statements to also contain a disproportionately higher number of false claims (leading to "false" ratings from fact checkers like Politifact)? I'm not sure what the discrepancy is. If it truly is the case that Republicans are making more false claims than Democrats, and Politifact has an approximately equal number of false claims for each party, would this not be an indication of party-related selection bias? Politifact would be trying, possibly with some kind of "fairness doctrine" to make Republicans look more honest than they truly are. As Bill Adair noted, "we choose to check things we are curious about. If we look at something and we think that an elected official or talk show host is wrong, then we will fact-check it." No mechanisms are in place to ensure some kind of contrived fairness when one political party is more likely than the other to play fast and loose with the truth.
"Nor does PolitiFact claim on its site to present a 'fair and balanced' selection of statements, or that the statements rated are representative of the general truthfulness of the nation's political parties or the elected officials involved. "This is correct. However, as the Washington Post Glen Kessler pointed out in a recent interview with NPR "I don't really keep track of, you know, how many Democrats or how many Republicans I'm looking at until, you know, at the end of the year, I count it up." There is no indication Politifact does any different. So it may not be the case that Politifact highlights any explicit methods for ensuring fairness (contrived or not). However, as professional journalists, is it possible they employed standard journalistic practices of objectivity so common-sense they didn't see the need to mention them?
Is it possible that, given the tendency of journalists to be personally liberal, we should expect party-related selection bias by default? It is a possibility (like many others). However, unless the entire staff is liberal (a claim that would require evidence), one has to wonder if even a few conservative journalists might be enough to keep other, liberal journalists from falling prey to party-related selection bias. Remember, Politifact articles are reviewed by a panel of at least three editors, further decreasing the probability of party-related selection bias.
In addition, a person with that hypothesis would have to explain why the ratings did not favor politicians from either party in 2007. Did Democrats lie more in 2007, causing party-related selection bias to make it seem like Democrats were just as honest as Republicans? Or was Politifact simply not biased back then, as opposed to now? At least one of these questions would need to be investigated before drawing any conclusions.
Overall, although Politifact does not explicitly call itself "fair and balanced" (one may actually be skeptical of news organizations that do), one cannot come to the conclusion that it isn't actually fair and balanced. This would be a classic "argument from ignorance."
"And yet...
In defending PolitiFact's "statements by ruling" summaries - tables that combine all ratings given by PolitiFact to an individual or group - Adair explained:
"We are really creating a tremendous database of independent journalism that's assessing these things, and it's valuable for people to see how often is President Obama right and how often was Senator McCain right. I think of it as like the back of a baseball card. You know - that it's sort of someone's career statistics. You know - it's sort of what's their batting average." (C-SPAN Washington Journal, August 4, 2009) "I will admit Bill Adair does oversell the importance of Politifact's summaries and tables. Since the selected statements are not truly random, only so much can be extrapolated from them. It may be a good idea for Politifact to provide a disclaimer stating that statements from individual politicians are not randomly chosen. They are instead chosen via the criteria outlined in their "Principles." However, with no evidence that party-related selection bias is a factor in their decision making, there would be no reason to post any disclaimer to suggest it is a factor.
HOW FAR SHOULD THE MEDIA GO TO APPEAR UNBIASED?
" Adair is also on record for lamenting the media's kneejerk inclination to treat both sides of an issue equally, particularly when one side has the facts wrong.
In an interview with the New York Times in April 2010, Adair said:
"The media in general has shied away from fact checking to a large extent because of fears that we'd be called biased, and also because I think it's hard journalism. It's a lot easier to give the on-the-one-hand, on-the-other-hand kind of journalism and leave it to readers to sort it out. But that isn't good enough these days. The information age has made things so chaotic, I think it's our obligation in the mainstream media to help people sort out what's true and what's not." "Bill Adair makes a good point. Why should the media present both sides equally when one side is clearly wrong? In addition, why should the media present both parties as being equally truthful when one is actually playing fast and loose with the truth? It would seem odd that a political party that once lambasted the Fairness Doctrine would have any problem with what Bill Adair is saying.
SHIFTING THE BURDEN OF PROOF
" The question is not whether PolitiFact will ultimately convert skeptics on the right that they do not have ulterior motives in the selection of what statements are rated, but whether the organization can give a convincing argument that either a) Republicans in fact do lie much more than Democrats, or b) if they do not, that it is immaterial that PolitiFact covers political discourse with a frame that suggests this is the case. "
This is an interesting quote. My original critique of it may have been a bit over-simplified [1]. As a result, I thought it would be better to do a more thorough job on this quote, given the implications it contains:
Notice how his last statement claims that Politifact frames Republicans in a way that suggests they lie more than Democrats. Before judging the validity of the rest of the statement, it would probably be a good idea to see whether Politifact even does this at all. First off, notice that Politifact has never published an overall Dem vs. GOP report card; Ostermeier was the one who did this. So certain conclusions one may be tempted to draw from Ostermeier's study are not ones people would necessarily draw from Politifact itself. All Ostermeier did was total up all statements made by Democrats and Republicans and compare them. He could have easily done the same with any arbitrary breakdown of statements: men versus women, majorities versus minorities, southerners versus northerners, and so on. If we found disparities in those breakdowns similar to the Dem vs. GOP data, would we accuse Politifact of covering "political discourse with a frame that suggests this is the case?"
The ultimate problem is that Politifact rates individuals (and specific groups), not overall groupings of those individuals. In fact, Adair's earlier statement about his report cards shows that he is really only focusing on individuals. If one were to infer any trend about an entire grouping of those individuals, not only would one need a sufficient sample size of individuals within those groups, but one would also need to filter out individuals with only a few statements to their name (a minimal sketch of this filtering follows below). It should be clear to anyone with common sense that, when you only have a small number of statements from an individual, you cannot come to any conclusion about that individual's honesty with any decent level of certainty. As a result, people with only a few grades to their names would have to be ignored. For example, if you eliminated candidates with fewer than 5 statements to their name in 2010, you would have to eliminate Jim DeMint, Kevin McCarthy, Mike Prendergast, Dan Coats, and a whole host of others. This would probably leave you with a very small sample of individuals. Seeing as there are literally thousands of Republican politicians at all levels of government, you cannot come up with a halfway reasonable level of certainty about the honesty of Republican politicians as a whole from Politifact's ratings alone.
These kinds of things should be pretty intuitive to people just looking at Politifact's articles and report cards. As a result, Politifact can hardly be blamed for creating a framework suggesting Republicans lie more when it has never actually done such a thing, unlike Ostermeier. So does Politifact really have the burden to give Republicans a convincing argument that "Republicans in fact do lie much more than Democrats?" At the very least, this is a straw man, since Politifact never even implicitly made that claim. It was an observation based on a context-ignoring analysis performed by Ostermeier himself.
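Here is a minimal sketch of the filtering step just described. The names, parties, and counts are made up for illustration; the point is only that report cards with too few rated statements get dropped before anyone attempts a per-party generalization.

```python
MIN_STATEMENTS = 5

# Hypothetical report cards (names and numbers are made up, not PolitiFact data):
# name -> (party, statements rated, statements rated False/Pants on Fire)
report_cards = {
    "Politician A": ("R", 21, 6),
    "Politician B": ("R", 3, 2),   # too few ratings to judge; dropped below
    "Politician C": ("D", 17, 2),
    "Politician D": ("D", 2, 1),   # too few ratings to judge; dropped below
}

# Keep only individuals with enough rated statements to say anything about them.
usable = {name: card for name, card in report_cards.items() if card[1] >= MIN_STATEMENTS}

print(f"{len(usable)} of {len(report_cards)} report cards have enough ratings to use")
for name, (party, total, bottom) in usable.items():
    print(f"{name} ({party}): {bottom}/{total} = {bottom / total:.0%} in the bottom ratings")
```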
But let's say Politifact did look at a large number of claims from a large number of politicians, enough to gauge the honesty of enough individuals from each party to conclude that one party does in fact lie more than the other. If Politifact is merely presenting the data it found, without coming to any conclusion, does Politifact really have the burden of proof? Whatever conclusions are drawn from the data must be supported by the person drawing them. So if we are stuck in some dichotomy where either Politifact is biased or Republicans lie more, should Politifact be the one with the burden of proof? As we can see, the data is agnostic to either claim. If Politifact is making no claims, yet someone wants to claim it is biased, is it Politifact's burden to prove that Republicans lie more in order to keep these people from being justified in their belief in Politifact's bias? As Ostermeier points out, it is not a question of whether Politifact can prove outright that it does not have "ulterior motives." However, if Politifact fails to answer either of Ostermeier's questions, what conclusion are we left with? Ostermeier gives Republicans no questions of their own to answer before coming to a conclusion. And as his last paragraph shows, he is comfortable supporting one particular conclusion over any others:
OSTERMEIER CONCLUDES BIAS
" In his August 2009 C-SPAN interview, Adair explained how the Pants on Fire rating was the site's most popular feature, and the rationale for its inclusion on the Truth-O-Meter scale:
"We don't take this stuff too seriously. It's politics, but it's a sport too."
By levying 23 Pants on Fire ratings to Republicans over the past year compared to just 4 to Democrats, it appears the sport of choice is game hunting - and the game is elephants. " (emphasis mine)
Up until this point, Ostermeier had been pretty good about not owning the conclusion that his data suggests Democrat-favored selection bias at Politifact; he merely appeared sympathetic to concerns about this sort of bias by specifically highlighting them and not investigating other possible explanations. This last paragraph, however, indicates he does actually believe there is some kind of bias of this sort at Politifact. Yet his logic regarding "Pants on Fire" ratings suffers from the same problems readers face when concluding in favor of any kind of party-related selection bias at Politifact (see "Conclusion" below): he fails to investigate other possible causes, instead flipping the burden of proof onto Politifact to demonstrate it is not in fact guilty. If he were to look back at 2007, he would see a near polar-opposite trend in the assignment of "Pants on Fire" ratings. Democrats were the recipients of over half the "Pants on Fire" ratings in 2007, receiving twice as many as Republicans that year despite having slightly fewer total statements rated. Now, there were not a lot of "Pants on Fire" ratings that year, but it is clear the trend from 2010 was absolutely non-existent in 2007. Shouldn't that make Ostermeier wonder what changed in 2010? Was it Politifact or the Republican Party? To my knowledge no evidence has been given (or even suggested) for the former, yet the rise of the TEA Party in 2010 shows the latter is certainly true.
CONCLUSION
It is hard to draw too many conclusions from Ostermeier's data sets, and a few questions still need to be investigated before coming to any (some of these come from a previous post):
- Could it be that many of these statements came from a single issue (or small collection of issues) where Republicans may have been more likely to spread falsehoods? A month-by-month breakdown could be useful in this regard. This kind of breakdown could also help investigate the question as to whether the Republican primaries had anything to do with the trends in 2010, as I pointed out earlier.
- Could it be that the perceived rise of anti-intellectualism in the Republican Party has made it less likely to care about factual accuracy?
- Could the Republican party be more ideological and less prone to checking facts that could challenge their ideology?
- Could it be that the Republican tendency to call neutral fact-checking sites "biased" creates a tendency to play fast and loose with the truth, knowing their constituents already distrust any site that could shed light on their claims?
Throughout most of his article, Ostermeier presents interesting data on Politifact in 2010. Although there are many possible explanations for the data, Ostermeier only really focuses on one, and he ignores at least one other despite having no better reason to dismiss it than he has for the one he focuses on. Ostermeier also fails to ask many questions, such as whether the trends he found in 2010 even existed in previous years; a quick analysis of Politifact in 2007 shows they did not hold for at least one year. In the end, Ostermeier makes it clear he thinks Politifact is guilty of party-related selection bias, succumbing to the same fallacies any reader would if he or she came to the same conclusion from the information presented in his article alone (not that other articles I've seen do any better).
Overall, this article demonstrates a particularly erroneous way of detecting bias. A mere look at rating totals cannot give a reader much information about bias. Even if one were to tally up all of Politifact's articles, few conclusions could be reached. Out of Politifact's 4 1/2 years of fact checking, 3 have been under a Democratic president, and 3 1/2 under a Democrat-controlled Congress. Given the possibility that the party out of power is more prone to spreading falsehoods, one cannot instantly jump to conclusions if Republicans end up with more bad ratings than Democrats. I admit that demonstrating bias can be very tough. However, if someone is having THAT much trouble demonstrating bias, maybe they should wonder whether they are even justified in suspecting bias in the first place. If Ostermeier had done this, maybe his article would have been more useful. Instead, it is just another failed attempt to poison the well for Politifact.
1 - Update 1-31-12: Original critique: "Why should the organization have to give a convincing argument that Republicans lie more than Democrats in order to convince Republicans they do not have ulterior motives? Essentially this is the argument that, unless Politifact gives evidence it does not have ulterior motives, Republicans are justified in believing it does. This, again, is a textbook argument from ignorance (see link above). Shouldn't Republicans have to ask for somewhat definitive evidence that Politifact in fact does have ulterior motives before accepting that claim as true? In fact, couldn't the data found in this entire article point to Republican politicians spreading more falsehoods than Democratic politicians in 2010 just as much as it could point to Politifact having ulterior motives? In fact, the data from 2007 significantly challenges the latter while having practically no effect on the former."
2 - Update 1-31-12: I took this bulleted list from the "SHIFTING THE BURDEN OF PROOF" section and placed it in the conclusion, to help with the revised flow of the critique.