Monday, August 6, 2012

"Politifact Bias" Still Making The Same Mistakes



The anti-fact-checker website "Politifact Bias" has attempted to prove that the popular fact-checking website PolitiFact.com has a liberal slant. Previously, I examined a 2011 review article to check their complaints and found the criticisms lacking. However, that article was not a good representation of what the site considers evidence of bias. I previously started a post examining one of their articles that claims PolitiFact has been inconsistent and unfair to Republicans when using employment statistics to evaluate claims; I plan on finishing that post soon. In the meantime, I was notified by Karen Street of the blog Politi-Psychotics about a study from one of Politifact Bias' authors, Bryan White, that claimed to find clear evidence of bias in how PolitiFact assigns "False" versus "Pants on Fire" ratings. On his website, Bryan describes this study as the "highest priority on the research list" and argues the study tells a "story of a strong anti-GOP bias at PolitiFact's national operation." I was intrigued. He describes the study as follows:

"Early last year I realized that PolitiFact's own rating system created a natural opinion poll for PolitiFact journalists. PolitiFact distinguishes between its "False" and "Pants on Fire" claims according to a single criterion:
FALSE – The statement is not accurate.
PANTS ON FIRE – The statement is not accurate and makes a ridiculous claim.
"False" and "Pants on Fire" claims are not accurate. "Pants on Fire" claims are, in addition, ridiculous. Ridiculous means subject to ridicule. Ridicule is, of course, the specialty of objective journalists."
"Republicans are about 74 percent more likely than Democrats to receive a "Pants on Fire" rating--74 percent more likely to speak not just the false but the ridiculous in the eyes of PolitiFact's national operation. It's not an easy statistic to explain without liberal bias."
Eric Ostermeier attempted a similar study back in 2011, although it was much more subtle in its attempt to poison the PolitiFact well. In a previous post, I pointed out the flaws in concluding bias from his data. Bryan's attempt was a bit more focused, and he was much more forthright about his conclusion. Yet the study fails to provide the evidence needed to support his claim of liberal bias.
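To make the arithmetic behind that headline number concrete, here is a minimal Python sketch of how a "more likely to receive 'Pants on Fire'" figure can be derived from rating tallies. Every count below is hypothetical and invented purely for illustration; these are not PolitiFact's actual totals.

def pof_share(false_count, pof_count):
    # Fraction of a party's not-accurate rulings that were rated "Pants on Fire"
    return pof_count / (false_count + pof_count)

# Hypothetical tallies, for illustration only
rep_share = pof_share(false_count=100, pof_count=53)   # about 0.35
dem_share = pof_share(false_count=100, pof_count=25)   # 0.20

relative_increase = rep_share / dem_share - 1
print(f"Republicans {relative_increase:.0%} more likely to be rated 'Pants on Fire'")
# With these made-up counts the output is roughly 73%; Bryan's 74% figure comes
# from PolitiFact's real tallies, which are not reproduced here.

The point of the sketch is only that the statistic is a ratio of "Pants on Fire" shares. Nothing in the arithmetic itself tells you whether a gap comes from the speakers, the selection of claims, or the raters.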

The study sets up a false dichotomy of two possibilities to explain the data: either Republicans lie more often or PolitiFact is biased. Yet other options are ignored. PolitiFact does not grade a large sample of Republican politicians, which means that any conclusion about these politicians cannot be applied to Republicans as a whole without large uncertainty. PolitiFact has also stated, "We are journalists who choose our fact-checks based on what is newsworthy. We are not social scientists and are not using any kind of random sample to select statements to check." As a result, you can only apply conclusions to the few "newsworthy" statements PolitiFact has encountered. Republicans telling more ridiculous lies than Democrats could explain the data, but it is a fallacy to say it therefore must be THE explanation.
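To illustrate why that sampling caveat matters, here is a toy simulation (every number in it is invented, and it is not a model of PolitiFact's actual editorial process) showing how a newsworthiness filter alone can move the "Pants on Fire" share observed among the checked claims, even when the underlying pool of false claims stays the same.

import random

random.seed(0)

def observed_pof_share(n_claims, p_ridiculous, newsworthy_boost):
    # Simulate a pool of false claims, select some for fact-checking, and return
    # the "Pants on Fire" share among the claims that actually get checked.
    # p_ridiculous:     underlying chance a false claim is ridiculous
    # newsworthy_boost: how much more likely a ridiculous claim is to be selected
    selected = []
    for _ in range(n_claims):
        ridiculous = random.random() < p_ridiculous
        pick_prob = 0.1 * (newsworthy_boost if ridiculous else 1.0)
        if random.random() < pick_prob:
            selected.append(ridiculous)
    return sum(selected) / len(selected)

# Same underlying pool of false claims, different media selection pressure
print(observed_pof_share(20_000, p_ridiculous=0.2, newsworthy_boost=1.0))  # about 0.20
print(observed_pof_share(20_000, p_ridiculous=0.2, newsworthy_boost=3.0))  # about 0.43

The point is only that the observed share depends on which claims get selected for checking, not just on who is speaking or who is rating; the simulation says nothing about whether that is what actually happened.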

Other factors could also explain the data. For instance, it could be the case that the loudest and most media-covered Republicans are more likely not only to lie, but to tell ridiculous lies, than the loudest and most media-covered Democrats (sensationalism, my friend). And there are many reasons why this would be expected. During the short time PolitiFact has existed, popular Republican thought has been practically hijacked by Tea Party ideas (something that has caused a stir even among Republicans), and you wouldn't expect as much accuracy from a group with a tendency toward anti-intellectualism. Conversely, the Tea Party's liberal counterpart, Occupy Wall Street, failed to take nearly as much of a hold over Democratic politicians. The Republican Party has also been in the minority in government and underwent a particularly nasty primary fight in 2011. Just like the liberal anti-war protestors during the years of the Republican majority, minority parties may be more prone to extremism in protest of the other party's agenda. All these factors could suggest that a small group of loud, media-covered Republicans is likely to blame for the discrepancy. However, as with the former explanation, the data alone does not necessarily suggest this is the correct explanation. Further research outside of PolitiFact's ratings would need to be performed.
Bryan fails to explore either of these explanations. Instead, he merely dismisses the former explanation by suggesting that Republicans lying more often would result only in more combined "False" and "Pants on Fire" ratings; he claims it would not account for the discrepancy between the two. But, as can be seen from PolitiFact's definitions of the "False" and "Pants on Fire" ratings, lying enters nowhere into the equation; only the accuracy and absurdity of the claim matter. Not all false claims are lies. Most false claims can probably be explained by ignorance. However, repetition of an already debunked claim could shift some claims from "False" to "Pants on Fire": the more often you repeat a claim after it has been debunked, the more your refusal to accept the debunking starts to look like ridiculous lying. As can be seen, Bryan's justification fails to dismiss the possibility that Republicans lie more often.2 There are many possible explanations for the discrepancy in the data, and it could be the case that more than one explanation is needed. If Bryan wants us to accept his bias explanation by process of elimination, he needs to actually eliminate all other possible explanations first (within reason, of course).

His study further attempts to prove that PolitiFact's justifications for assigning "False" versus "Pants on Fire" ratings are completely subjective. While I will admit there can be some subjectivity in what should be considered ridiculous, there are other claims where the possibility of subjectivity is trivial. If practically everyone who knows a claim is false finds it ridiculous, does subjectivity even matter? However, none of this matters if PolitiFact is actually using some kind of objective criteria to justify which statements get a "False" or "Pants on Fire" rating. Bryan claimed he attempted to find such criteria:

No consistent pattern of justification emerged from the data, though justifications for “Pants on Fire” ratings, when they occurred, did vary somewhat from those typically used to justify a “False” rating. For example, the perceived use of fear to increase the impact of a political claim, as PolitiFact said Sarah Palin did with her infamous “death panel” comment, occurred about 15% of the time in the “Pants on Fire” group but not at all in the “False” group. On the other hand, more than half of the “Pants on Fire” sample carried no particular justifying rationale for the rating, instead using language along the lines of “This claim was so false we rate it Pants on Fire!”(emphasis mine)
However, nowhere in PolitiFact's justification of Sarah Palin's "Death Panel" claim did the word "fear" come up. In fact, nowhere can I find the word "fear" in any definition of the word "ridiculous." I'm not sure why he used that as a criterion, and nowhere does he explain why he chose it. Likely, this means that Bryan merely guessed at possible criteria and, when he could not find one, assumed none existed. Is he guilty of an argument from ignorance? Possibly. It depends on whether or not he seriously tried to find an objective criterion (given the motivation for the study, I doubt he did), or whether or not he is even capable of finding objective criteria (a concept as complex as the idea of "ridiculousness" cannot be easily diagnosed with just one unconditional justification). Can we trust he did all he reasonably could to seriously attempt to find objective criteria? Given the name of his site and his apparent lack of impartial peer review, I seriously doubt it. It may be that his approach to finding objective criteria is misguided from the start. For instance, look at some of the Farlex examples of what people mean when they say "ridiculous":
""the absurd excuse that the dog ate his homework"; "that's a cockeyed idea"; "ask a nonsensical question and get a nonsensical answer"; "a contribution so small as to be laughable"; "it is ludicrous to call a cottage a mansion"; "a preposterous attempt to turn back the pages of history"; "her conceited assumption of universal interest in her rather dull children was ridiculous""
Can you find any objective criterion for what makes the subject matter of these quotes ridiculous, as opposed to just false? Probably not. But I highly doubt almost anyone would fail to recognize the subject matter of these quotes as ridiculous, even if they do not have a clear idea of what objective criteria make them so. For example, if I were to say that Martin Luther King Jr. hated black people, or that Hitler loved the Jews, you would have no trouble finding my claim ridiculous. Yet the attempt to find a single overarching dictionary criterion for the word "ridiculous" is likely pointless. What matters is how people use the word. And there is little doubt there are numerous examples of false statements that nearly everyone would find ridiculous (so long as they realized they were false, obviously). No doubt there are also grey areas. But there is no indication Bryan ever attempted to differentiate between the obvious cases and the grey areas in his examination. Is this an easy task? Probably not. Determining whether or not PolitiFact is justified in finding every "Pants on Fire"-rated claim ridiculous may be tough, given the nature of what constitutes ridicule, but there is little doubt you would need much more than Bryan's sole conservative opinion.

So even if calling a claim ridiculous is at least somewhat subjective, you may be wondering how he comes to the conclusion that PolitiFact does in fact have a liberal bias:

"If no such criteria exist then the results reflect bias in the PolitiFact organization."
If you don't think the conclusion follows from the premise, you are right.1 As I noted before, there are obvious examples of things that are ridiculous. It could be the case that "the loudest and most media-covered Republicans" are saying more of these obviously ridiculous falsehoods than "the loudest and most media-covered Democrats". This would explain the discrepancy without the need for bias. But Bryan thinks more can be explained with his bias theory:
"A liberal ideological bias by PolitiFact, for example, helps explain a number of features in the data, for example the close correlation between the percentage of “Pants on Fire” ratings for party politicians and bureaucrats compared to the ratings for other partisan figures (the group featuring media figures such as Rachel Maddow and Rush Limbaugh)."
What?! If you are also left scratching your head, I empathize with you. Bryan does not really elaborate on why this phenomenon fits his theory, nor does he reference any papers that would explain it.

The rest of his article attempts to explain the phenomenon with all kinds of speculative, conspiracy-sounding explanations. He believes that PolitiFact's debunking of Democrats' statements is actually some nefarious plot to make itself look non-partisan (if this were the case, they could have chosen much less damning claims to debunk). Overall, it reeks of conspiracy-theory-style rationalization.

If this constitutes "The best evidences showing PolitiFact's liberal slant," one has to wonder if there is any good evidence of "PolitiFact's liberal slant" at all. I've looked and have yet to find any. But when one is dedicated to poisoning the well of any non-partisan source that debunks their favorite political party's claims, any turd is gold.

Note: Karen Street also has a great review of Bryan's study. It is worth reading.

Update 8/15/2012: Links Added
1 Update 8/16/12: This was originally worded incorrectly. The wording has been rephrased.  
2 Karen Street at Politi-Psychotics has found a few examples of these. She has also taken a look at individual writers within PolitiFact and compared their "False" versus "Pants on Fire" ratings for Democrats and Republicans.

Update 8/19/2012: Links Added

The Roundup

Fact Checker Edition

Fact Check: Tax Facts: Lowest Rates in 30 Years

Glenn Kessler lectures the National Capital Area Skeptics about fact checking the presidential candidates. As I have said, fact checking is just another form of skepticism.

PolitiFact: 'Obamanomics Explained' is not a PolitiFact chart 
Conservatives, after long decrying fact checking as having a liberal bias, are suddenly obsessed with cherry-picking fact checkers' stats.

PolitiFact: Rush Limbaugh claims link between Batman’s Bane and Romney’s Bain

‘Crony capitalism:’ the RNC’s look at the case of Steve Westly 
"In the end, the case the RNC video makes does not add up, yelling “fire” when it may, at this point, only be fog. Still, the case it makes —and the questions it raises — hold together a bit better than many other claims of “crony capitalism.""

FactCheck: Romney and the Tax Return Precedent
"Again, Romney isn’t wrong about McCain releasing just two years of returns. And Romney may decide to stick to that “precedent.” But his comments imply that the two years released by McCain is standard, and that he has been unfairly asked t o release more than others have in the past. In fact, with the lone exception of McCain, all candidates over the last 30 years each have released more than two years of tax returns."
WP Fact Checker: Obama uses out-of-date data to criticize Romney’s Medicare plan 
Ryan's NEW 2012 plan preserves the option of traditional Medicare for everyone. You can certainly argue this option may be abandoned if the GOP gets full control of the government, but at this point that is nothing but speculation. There is plenty to complain about with the Ryan plan, but the Medicare piece doesn't seem to be one of those things.

Politifact: Will Barack Obama be outraised by Mitt Romney?
"In looking at the evidence, we see things that both support and contradict the Democrats’ predictions of being outspent. How much money outside spending groups on Romney’s side will actually raise is the biggest unknown.
But the Obama campaign isn’t without its arsenal of cash, either. A particular point of strength is that the Obama campaign itself controls its money, a strategic advantage.
For fundraising totals through May, "Romney is still badly outgunned by Obama when it comes to the total amount of money he's raised, fundraising appeals by Democrats to the contrary," Allison said in an email.
It’s too soon to determine if the Democrats’ predictions will come true. For now, the rhetoric is a tactic to portray the Democrats as the underdog."
FactCheck: Romney’s ‘Racist’ Reference to Palestinian Culture
"It’s true that he did not directly speak of the Palestinian culture. But he did indirectly address it by citing Israel’s culture as a reason for the “dramatic, stark difference in economic vitality” between Israel and the areas under Pales tinian Authority. He also did not mention Israeli economic sanctions — which, as the New York Times points out, the Palestinians have long blamed for their economic problems."
FactCheck: Senator Mangles Facts on Drilling Moratorium
"The independent Energy Information Administration estimated the moratorium would reduce oil production by 31,000 barrels a day in the fourth quarter of 2010 and 82,000 barrels a day in 2011 — a year that saw the Gulf of Mexico produce 1.3 million barrels a day."

3 comments:

  1. It's amazing you can spout such bunkum using such serious language. Street didn't understand the study. You don't either.

    I think this is my favorite line:

    "If you don't think the conclusion supports the premise, you are right."

    If therefore the logic is good then if the premise supports it?

  2. Bryan:
    So if I do not understand, enlighten me.

    So Bryan, do you really think I do not understand the difference between a premise and a conclusion? Or are you just jumping on a small, obvious, and harmless mix-up to try and discredit the article? Pretty sad stuff.

    I'll correct it for you if it makes you feel better.

  3. How about instead of taking cheap shots you actually comment on the substance?
