Monday, January 16, 2012

Politifact: Selection Bias or Bad Statistics?

In a recent post critiquing a list of issues that should "keep discerning readers from trusting Politifact", I was alerted to an article examining potential selection bias in how Politifact chooses which statements to rate. The article, entitled "Selection Bias? PolitiFact Rates Republican Statements as False at 3 Times the Rate of Democrats," was written by Eric Ostermeier, "Research Associate at the Humphrey School's Center for the Study of Politics and Governance." It has also been cited in a recent attack on fact checkers by The Weekly Standard (3-post critique: 1,2,3). Given the attention the article has received, I thought it would be useful to see whether it actually gives any good reason to suspect selection bias at Politifact.
First off, it should be noted that selection bias occurs whenever a sample taken for statistical analysis has not been chosen randomly. Given the title, as well as a few other indicators shown later, it should be clear that this article focuses on party-related selection bias. However, we should not jump to conclusions yet; it is important to actually read the article to see if the title is even justified.
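
Before digging in, it may help to see why non-random selection matters. Here is a minimal sketch in Python (all numbers and group labels are invented purely for illustration): two groups make false statements at exactly the same rate, yet a selection process that favors one group's false statements makes that group look far less honest, even if every individual rating is accurate.

    import random

    random.seed(0)

    # Hypothetical population: groups "A" and "B" each make false
    # statements at the SAME underlying rate (25%). All figures invented.
    population = [(group, random.random() < 0.25)
                  for group in ("A", "B") for _ in range(10_000)]

    # Biased selection: a false statement from group A is twice as likely
    # to be picked for rating as any other statement.
    weights = [2.0 if (g == "A" and is_false) else 1.0
               for g, is_false in population]
    sample = random.choices(population, weights=weights, k=2_000)

    for g in ("A", "B"):
        rated = [is_false for grp, is_false in sample if grp == g]
        print(g, f"{sum(rated) / len(rated):.0%} of rated statements false")
    # Group A comes out around 40% false versus 25% for group B, despite
    # identical underlying rates. No individual rating had to be wrong.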

MISLEADING TITLES

" [TITLE:] Selection Bias? PolitiFact Rates Republican Statements as False at 3 Times the Rate of Democrats.
[BODY:] PolitiFact assigns "Pants on Fire" or "False" ratings to 39 percent of Republican statements compared to just 12 percent of Democrats since January 2010 "
The difference between the title and body is subtle, but incredibly important. The title suggests the trend of "false" ratings is an overall trend throughout the 4 1/2 year history of Politifact. However, the article actually examines only a 13-month window, from January 2010 through January 2011. The reader should take note that this is not a random sample. In fact, as I will show later in this article, this distinction seriously challenges any possible conclusion of party-related selection bias.

A BIT OF BACKGROUND
" PolitiFact, the high profile political fact-checking operation at the St. Petersburg Times, has been criticized by those on the right from time to time for alleged bias in its grading of statements made by political figures and organizations. "
In my experience, these allegations are often poorly supported (also see the linked critiques for analysis of other articles alleging bias within Politifact).

" The organization (and now its more than a half dozen state offshoots) grades statements made by politicians, pundits, reporters, interest groups, and even the occasional comedian (anyone 'driving the political discourse') on a six point "Truth-O-Meter" scale: True, Mostly True, Half True, Barely True, False, and Pants On Fire for "ridiculously" false claims.
But although PolitiFact provides a blueprint as to how statements are rated, it does not detail how statements are selected.
For while there is no doubt members of both political parties make numerous factual as well as inaccurate statements - and everything in between - there remains a fundamental question of which statements (by which politicians) are targeted for analysis in the first place. "
Politifact actually does now provide a description of how it chooses its statements:  

"Every day, PolitiFact staffers look for statements that can be checked. We comb through speeches, news stories, press releases, campaign brochures, TV ads, Facebook postings and transcripts of TV and radio interviews. Because we can't possibly check all claims, we select the most newsworthy and significant ones. 
In deciding which statements to check, we ask ourselves these questions:
  • Is the statement rooted in a fact that is verifiable? We don’t check opinions, and we recognize that in the world of speechmaking and political rhetoric, there is license for hyperbole.
  • Is the statement leaving a particular impression that may be misleading?
  • Is the statement significant? We avoid minor "gotchas" on claims that obviously represent a slip of the tongue.
  • Is the statement likely to be passed on and repeated by others?
  • Would a typical person hear or read the statement and wonder: Is that true?"

This page was not available when Ostermeier wrote his article, so I cannot criticize him for failing to notice it. However, it should be noted that Ostermeier's claim is no longer true (through no fault of his own, of course).

At this point, the article outlines its findings without drawing any conclusions.

IS IT APPROPRIATE TO DRAW CONCLUSIONS FROM 2010 ALONE?

Before we go any further, let us return to a point I raised earlier. Remember that the sample of posts in Ostermeier's article included all posts from January 2010 through January 2011. So we should ask ourselves whether the chosen dates are truly arbitrary. To help answer this question, I took a look at data from 2007 to see if it differed from 2010 (note that when I say 2010 in this section, I am including January 2011 to reflect Ostermeier's article):

 "Let's compare these results to the findings of Ostermeier for 2010:
  • Out of 39 statements rated "False" or "Pants on Fire," each main party received approximately 50% of the rulings. This is much different from 2010, where Republicans received 75.5% of these ratings.
  • Politifact devoted approximately equal attention to Republicans (124 statements, 52%) and Democrats (110 statements, 46%). This is nearly the same as 2010.
  • Republicans were graded in the "False" or "Pants on Fire" categories 16% of the time (39% in 2010), and Democrats were rated in these categories 16% of the time as well (12% in 2010). This is nowhere close to the "super-majority" found in 2010.
  • 73% of Republican statements in 2007 received one of the top 3 ratings (47% in 2010), as opposed to 75% of Democratic statements (75% in 2010). When looking at "True" statements alone, Republicans beat Democrats, with 5% more of their statements receiving the coveted top rating. It is hard to see how one party was considered more or less truthful than the other; it depends on how much credit you give the "Half True" rating as opposed to the "True" rating.
  • Republicans received a slightly larger percentage of "Mostly False" ratings than Democrats (1.79%). This is the same as in 2010. However, this only results in a 2% difference between Republicans and Democrats for the bottom 3 ratings. This is MUCH different from the 28% difference in 2010.
As you can see, the results from 2007 can seriously undermine many possible conclusions that a person could draw from the 2010 data."
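
To make this comparison concrete, here is a small Python sketch separating the two statistics quoted above, which use different denominators: a party's share of all the bad ratings handed out, versus the share of that party's own statements that were rated badly. The per-party totals (124 and 110) come from the quoted text; the False/"Pants on Fire" counts are reconstructed from the quoted ~16% figures, so treat them as approximations.

    # 2007 counts: per-party totals are from the text above; the
    # False/"Pants on Fire" counts are approximate reconstructions.
    totals_2007 = {"R": 124, "D": 110}
    false_2007 = {"R": 20, "D": 18}   # roughly 16% of each party's total

    all_false = sum(false_2007.values())
    for party in ("R", "D"):
        share_of_all = false_2007[party] / all_false
        rate_in_party = false_2007[party] / totals_2007[party]
        print(f"{party}: {share_of_all:.0%} of all False/Pants on Fire "
              f"ratings; {rate_in_party:.0%} of its own statements")
    # R: 53% of all bad ratings; 16% of its own statements
    # D: 47% of all bad ratings; 16% of its own statements

Keeping the two denominators separate matters: Ostermeier's 75.5% figure for 2010 is a statistic of the first kind, while his 39% vs. 12% figures are of the second.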

OSTERMEIER STARTS POKING AT CONCLUSIONS

At this point, however, Ostermeier begins suggesting certain conclusions while neglecting others.
" What is particularly interesting about these findings is that the political party in control of the Presidency, the US Senate, and the US House during almost the entirety of the period under analysis was the Democrats, not the Republicans.
And yet, PolitiFact chose to highlight untrue statements made by those in the party out of power. "(emphasis mine)
It sounds like Ostermeier has already come to the conclusion that Politifact is essentially guilty of party-related selection bias, since it "chose to highlight untrue statements made by those in the party out of power." This does not necessarily mean Politifact chose these statements because they came from Republicans, only because Republicans were the party out of power. The data from 2007 does not provide any hint as to the truth of this possibility: neither party was really "in power" during that time, since Democrats controlled both the House and Senate while Republicans controlled the White House. Ostermeier, however, fails to ask a few questions:
  • Do parties out of power tend to be more likely to spread falsehoods, possibly in order to discredit the party in power (the party in power may not need to spread falsehoods as often, since it is already in power)?
  • Was there a sensational movement (such as the TEA Party) that may have caused more air time to go to the party out of power, increasing the chance of falsehoods (see first bullet)?
  • Did the party out of power decide its number one priority should be to oust the current party from power? Do these kinds of campaigns tend to produce falsehoods?
  • Did the party out of power feel the need to attack legislation (Obamacare) coming from the party in power, regardless of factual accuracy (See first bullet)?
  • Was the party out of power often given uncritical air time from incredibly popular news outlets?
These are just some of the questions Ostermeier should have posed to readers before concluding that Politifact "chose to highlight untrue statements made by those in the party out of power." Essentially, he neglected to ask whether the party out of power (or the media) highlighted its own untrue statements.

OSTERMEIER DOES ACTUALLY MAKE A GOOD POINT

" An examination of the more than 80 statements PolitiFact graded over the past 13 months by ideological groups and individuals who have not held elective office, conservatives only received slightly harsher ratings than liberals. "

One would think this alone would challenge the suspicion of a liberal bias, maybe even the suspicion of Democrat-favoring bias. Is Politifact biased against Republicans, but not conservatives? It would take quite a conspiracy-theory-styled rationale to justify that. In fact, could there be other explanations for this kind of trend in 2010?
  • Were there sensational primary elections within the Republican Party that may have contributed to extra coverage? Did those elections involve a bloc of furious voters who may have been more prone to falsehoods?
  • Could it be that at least a few former politicians became political commentators (also remember the party-out-of-power points)?

FINDING ANSWERS

" When PolitiFact Editor Bill Adair was on C-SPAN's Washington Journal in August of 2009, he explained how statements are picked: 
"We choose to check things we are curious about. If we look at something and we think that an elected official or talk show host is wrong, then we will fact-check it."

If that is the methodology, then why is it that PolitiFact takes Republicans to the woodshed much more frequently than Democrats?

One could theoretically argue that one political party has made a disproportionately higher number of false claims than the other, and that this is subsequently reflected in the distribution of ratings on the PolitiFact site. "
First off, it should be noted that Bill Adair has essentially admitted Politifact's selection of statements is not actually random. However, since his selection criteria do not appear to depend on political affiliation, the selection can still be treated as random with respect to party.

Ostermeier does suggest a possible reason why Republican politicians are getting the brunt of bad ratings. In fact, I can imagine a Democrat saying Ostermeier's data "shows that Republicans are more prone to falsehoods than Democrats." Now, as our 2007 data shows, whatever trend Ostermeier found in 2010 did not exist in 2007. We could ask ourselves whether something changed in the Republican Party (like the rise of the TEA Party) that may have made the party more likely to embrace falsehoods. Notice that the stats for Democrat statements were roughly the same in 2007 as they were in 2010; the big difference came in Republican statements, possibly suggesting a change in the Republican Party. I am not saying this is definitely the answer. However, I see no evidence that eliminates this possibility in favor of any explanation involving party-related selection bias.
" However, there is no evidence offered by PolitiFact that this is their calculus in decision-making. "
I'm not 100% sure why he says "However." If one political party is making "a disproportionately higher number of false claims than the other," wouldn't one expect a random large sample of those statements to also contain a disproportionately higher number of false claims (leading to "false" ratings from fact checkers like Politifact)? I'm not sure where the discrepancy is. If it truly were the case that Republicans were making more false claims than Democrats, yet Politifact reported an approximately equal number of false claims for each party, would that not itself be an indication of party-related selection bias? Politifact would be trying, possibly with some kind of "fairness doctrine," to make Republicans look more honest than they truly are. As Bill Adair noted, "we choose to check things we are curious about. If we look at something and we think that an elected official or talk show host is wrong, then we will fact-check it." No mechanisms are in place to ensure some kind of contrived fairness when one political party is more likely than the other to play fast and loose with the truth.
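
A quick simulation of this point (a sketch only; the 39% and 12% rates are borrowed from Ostermeier's 2010 percentages purely as hypothetical underlying rates, not as claims about the real population): a selection process that is blind to party should reproduce an underlying disparity rather than flatten it into a 50/50 split.

    import random

    random.seed(1)

    # Hypothetical pool of checkable statements (sizes and rates invented).
    pool = ([("R", random.random() < 0.39) for _ in range(5_000)] +
            [("D", random.random() < 0.12) for _ in range(5_000)])

    # Party-blind selection: choose 500 statements uniformly at random.
    sample = random.sample(pool, 500)

    for party in ("R", "D"):
        rated = [is_false for p, is_false in sample if p == party]
        print(party, f"{sum(rated) / len(rated):.0%} false "
                     f"({len(rated)} statements)")
    # The party-blind sample tracks the underlying rates (~39% vs ~12%).
    # If the underlying rates truly differed, a near 50/50 outcome would
    # itself suggest contrived balance rather than neutrality.
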
"Nor does PolitiFact claim on its site to present a 'fair and balanced' selection of statements, or that the statements rated are representative of the general truthfulness of the nation's political parties or the elected officials involved. "
This is correct. However, as the Washington Post's Glenn Kessler pointed out in a recent interview with NPR, "I don't really keep track of, you know, how many Democrats or how many Republicans I'm looking at until, you know, at the end of the year, I count it up." There is no indication Politifact does any differently. So it may be true that Politifact does not highlight any explicit methods for ensuring fairness (contrived or not). However, as professional journalists, is it possible they employ standard journalistic practices of objectivity so common-sense that they didn't see the need to mention them?

Is it possible that, given the tendency of journalists to be personally liberal, we should expect party-related selection bias as a default? It is a possibility (like many others). However, unless the entire staff is liberal (a claim that would require evidence), one has to wonder whether even a few conservative journalists might be enough to keep liberal journalists from falling prey to party-related selection bias. Remember, Politifact articles are reviewed by a panel of at least 3 editors, further decreasing the probability of party-related selection bias.

In addition, a person with that hypothesis would have to explain why the ratings did not favor politicians of either party in 2007. Did Democrats lie more in 2007, causing party-related selection bias to make it seem like Democrats were just as honest as Republicans? Or was Politifact simply not biased back then, as opposed to now? At least one of these would need to be investigated before drawing any conclusions.

Overall, although Politifact does not explicitly call itself "fair and balanced" (one may actually be skeptical of news organizations that do), one cannot conclude that it isn't actually fair and balanced. That would be a classic "argument from ignorance."
"And yet... 
In defending PolitiFact's "statements by ruling" summaries - tables that combine all ratings given by PolitiFact to an individual or group - Adair explained: 
"We are really creating a tremendous database of independent journalism that's assessing these things, and it's valuable for people to see how often is President Obama right and how often was Senator McCain right. I think of it as like the back of a baseball card. You know - that it's sort of someone's career statistics. You know - it's sort of what's their batting average." (C-SPAN Washington Journal, August 4, 2009) "
I will admit Bill Adair does oversell the importance of Politifact's summaries and tables. Since the selected statements are not truly random, only so much can be extrapolated from them. It may be a good idea for Politifact to provide a disclaimer stating that statements from individual politicians are not randomly chosen. They are instead chosen via the criteria outlined in their "Principles." However, with no evidence that party-related selection bias is a factor in their decision making, there would be no reason to post any disclaimer to suggest it is a factor.

HOW FAR SHOULD THE MEDIA GO TO APPEAR UNBIASED?
" Adair is also on record for lamenting the media's kneejerk inclination to treat both sides of an issue equally, particularly when one side has the facts wrong. 
In an interview with the New York Times in April 2010, Adair said: 
"The media in general has shied away from fact checking to a large extent because of fears that we'd be called biased, and also because I think it's hard journalism. It's a lot easier to give the on-the-one-hand, on-the-other-hand kind of journalism and leave it to readers to sort it out. But that isn't good enough these days. The information age has made things so chaotic, I think it's our obligation in the mainstream media to help people sort out what's true and what's not." "
Bill Adair makes a good point. Why should the media present both sides equally when one side is clearly wrong? In addition, why should the media present both parties as being equally truthful when one is actually playing fast and loose with the truth? It would seem odd that a political party that once lambasted the Fairness Doctrine would have any problem with what Bill Adair is saying.

SHIFTING THE BURDEN OF PROOF
" The question is not whether PolitiFact will ultimately convert skeptics on the right that they do not have ulterior motives in the selection of what statements are rated, but whether the organization can give a convincing argument that either a) Republicans in fact do lie much more than Democrats, or b) if they do not, that it is immaterial that PolitiFact covers political discourse with a frame that suggests this is the case. "

This is an interesting quote. My original critique of it may have been a bit over-simplified.1 As a result, I thought it would be better to do a more thorough job on this quote, given the implications it contains:

Notice how his last statement claims that Politifact frames Republicans in a way that suggests they lie more than Democrats. Before judging the validity of the rest of the statement, it would be a good idea to see whether Politifact even does this at all. First, notice that Politifact has never published an overall Dem vs. GOP report card; Ostermeier was the one who did this. Certain conclusions one may be tempted to draw from Ostermeier's study are therefore not conclusions people would necessarily draw from Politifact itself. All Ostermeier did was total up all statements made by Democrats and Republicans and compare them. He could just as easily have done the same with any arbitrary breakdown of statements: men versus women, majorities versus minorities, southerners versus northerners, etc. If we found disparities in those breakdowns similar to the Dem vs. GOP data, would we accuse Politifact of covering "political discourse with a frame that suggests this is the case?"

The ultimate problem is that Politifact rates individuals (and specific groups), not overall groupings of those individuals. In fact, Adair's earlier statement about his report cards shows that he is really only focusing on individuals. To infer any trend about an entire grouping of those individuals, one would not only need a sufficient sample of individuals within each group, but would also need to filter out individuals with only a few statements to their name. It should be clear to anyone with common sense that when you have only a small number of statements from an individual, you cannot reach any conclusion about that individual's honesty with any decent level of certainty. As a result, people with only a few grades to their names would have to be ignored. For example, if you eliminated candidates with fewer than 5 statements to their name in 2010, you would have to eliminate Jim DeMint, Kevin McCarthy, Mike Prendergast, Dan Coats, and a whole host of others. This would probably leave you with a very small sample of individuals. Seeing as there are literally thousands of Republican politicians at all levels of government, you cannot reach a halfway reasonable level of certainty about the honesty of Republican politicians as a whole from Politifact's ratings alone.

These things should be fairly intuitive to people simply looking at Politifact's articles and report cards. As a result, Politifact can hardly be blamed for creating a frame suggesting Republicans lie more when it has never actually done such a thing, unlike Ostermeier. So does Politifact have the burden of giving Republicans a convincing argument that "Republicans in fact do lie much more than Democrats?" At the very least, this is a straw man, since Politifact never even implicitly made that claim. It was an observation based on a context-ignoring analysis performed by Ostermeier himself.
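
To put a rough number on the small-sample point, here is a sketch using a standard normal-approximation binomial confidence interval (the statement counts are hypothetical, and the approximation is crude for tiny samples, which only strengthens the point):

    import math

    def false_rate_interval(n_false: int, n_total: int, z: float = 1.96):
        """Approximate 95% confidence interval for an individual's
        underlying rate of false statements, given their ratings."""
        p = n_false / n_total
        half = z * math.sqrt(p * (1 - p) / n_total)
        return max(0.0, p - half), min(1.0, p + half)

    # A politician with 2 of 4 statements rated False, vs. 20 of 40:
    print(false_rate_interval(2, 4))    # ~(0.01, 0.99): almost no information
    print(false_rate_interval(20, 40))  # ~(0.35, 0.65): still quite wide

With only four ratings, virtually any underlying honesty rate is consistent with the data, which is why report cards built on a handful of statements cannot support conclusions about an individual, let alone an entire party.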

But suppose Politifact did look at a large number of claims from a large number of politicians, enough to gauge the honesty of enough individuals from each party to conclude that one party does in fact lie more than the other. If Politifact is merely presenting the data it found, without drawing any conclusion, does Politifact really have the burden of proof? Whatever conclusions are made about the data must be supported by the person making them. So if we are stuck in some dichotomy where either Politifact is biased or Republicans lie more, should Politifact be the one with the burden of proof? As we can see, the data is agnostic between the two claims. If Politifact is making no claims, yet someone wants to claim it is biased, is it Politifact's burden to prove that Republicans lie more in order to keep these people from being justified in their belief in Politifact's bias? As Ostermeier points out, it is not a question of whether Politifact can prove outright that it has no "ulterior motives." But if Politifact fails to answer either of Ostermeier's questions, what is the reader left to conclude? Ostermeier gives Republicans no questions of their own to answer before coming to a conclusion. And as his last paragraph shows, he is comfortable supporting one particular conclusion over all others:


OSTERMEIER CONCLUDES BIAS
" In his August 2009 C-SPAN interview, Adair explained how the Pants on Fire rating was the site's most popular feature, and the rationale for its inclusion on the Truth-O-Meter scale: 
"We don't take this stuff too seriously. It's politics, but it's a sport too." 
By levying 23 Pants on Fire ratings to Republicans over the past year compared to just 4 to Democrats, it appears the sport of choice is game hunting - and the game is elephants. " (emphasis mine)
Up until this point, Ostermeier had been fairly careful not to own the conclusion that his data suggests Democrat-favoring selection bias at Politifact, though he appeared sympathetic to such concerns by specifically highlighting them while not investigating other possible explanations. This last paragraph, however, indicates he does actually believe there is some kind of bias of this sort at Politifact. His logic regarding "Pants on Fire" ratings suffers from the same problems any reader would face in concluding in favor of party-related selection bias at Politifact (see "Conclusion" below): he fails to investigate other possible causes, instead flipping the burden of proof onto Politifact to demonstrate it is not in fact guilty. Had he looked back at 2007, he would have seen a near polar opposite trend in "Pants on Fire" ratings. Democrats received over half the "Pants on Fire" ratings in 2007, twice as many as Republicans, despite having slightly fewer total statements rated. Granted, there weren't many "Pants on Fire" ratings that year, but it is clear the 2010 trend was absolutely non-existent in 2007. Shouldn't that make Ostermeier wonder what changed in 2010? Was it Politifact or the Republican Party? To my knowledge, no evidence has been given (or even suggested) for the former, while the rise of the TEA Party in 2010 shows the latter certainly changed.

CONCLUSION

It is hard to draw many conclusions from Ostermeier's data sets, and a few questions still need to be investigated before coming to any (some of these come from a previous post):
  • Could it be that many of these statements came from a single issue (or a small collection of issues) where Republicans may have been more likely to spread falsehoods? A month-by-month breakdown could be useful in this regard. This kind of breakdown could also help investigate whether the Republican primaries had anything to do with the trends in 2010, as I pointed out earlier.
  • Could it be that the perceived rise of anti-intellectualism in the Republican Party has made it less likely to care about factual accuracy? 
  • Could the Republican party be more ideological and less prone to checking facts that could challenge their ideology? 
  • Could it be that the Republican tendency to call neutral fact-checking sites "biased" creates a tendency to play fast and loose with the truth, knowing their constituents already distrust any site that could shed light on their claims? 
These are all possibilities (although I am not necessarily claiming any of them to be true). So did the article control for or check any of these, or alert readers to these possibilities? NOPE.2

Throughout most of his article, Ostermeier presents interesting data on Politifact in 2010. Although there are many possible explanations of the data, Ostermeier only really focuses on one. Worse, he dismisses at least one alternative for reasons that would equally justify dismissing the explanation he favors. He fails to ask many questions, such as whether the trends he found in 2010 even existed in previous years; a quick analysis of Politifact in 2007 showed that, for at least one earlier year, they did not. In the end, Ostermeier makes it clear he thinks Politifact is guilty of party-related selection bias, succumbing to the same fallacies any reader would if he or she came to the same conclusion from the information presented in his article alone (not that other articles I've seen do any better).

Overall, this article demonstrates a particularly erroneous way of detecting bias. A mere look at rating totals cannot give a reader much information about bias. Even if one were to tally up all of Politifact's articles, few conclusions could be reached. Of Politifact's 4 1/2 years of fact checking, 3 have been under a Democrat president and 3 1/2 under a Democrat-controlled Congress. Given the possibility that the party out of power is more prone to spreading falsehoods, one cannot instantly jump to conclusions if Republicans end up with more bad ratings than Democrats. I admit that demonstrating bias can be very tough. However, if someone is having THAT much trouble demonstrating bias, maybe they should wonder whether they are even justified in suspecting bias in the first place. Had Ostermeier done this, his article might have been more useful. Instead, it is just another failed attempt to poison the well for Politifact.

1 - Update 1-31-12: Original critique: "Why should the organization have to give a convincing argument that Republicans lie more than Democrats in order to convince Republicans they do not have ulterior motives? Essentially this is the argument that, unless Politifact gives evidence it does not have ulterior motives, Republicans are justified in believing it does. This, again, is a textbook argument from ignorance (see link above). Shouldn't Republicans have to ask for somewhat definitive evidence that Politifact in fact does have ulterior motives before accepting that claim as true? In fact, couldn't the data found in this entire article point to Republican politicians spreading more falsehoods than Democrat politicians in 2010 just as much as it could point to Politifact having ulterior motives? In fact, the data from 2007 significantly challenges the latter while having practically no effect on the former."

2 - Update 1-31-12: I took this bulleted list from the "SHIFTING THE BURDEN OF PROOF" section and placed it in the conclusion, to help with the revised flow of the critique.

11 comments:

  1. Why should the organization have to give a convincing argument that Republicans lie more than Democrats in order to convince Republicans they do not have ulterior motives?

    Because, as Ostermeier pointed out (and you conceded), PolitiFact publishes sets of statistics that make it look like Republicans lie more yet without explaining to readers that the selection is non-random (and what that means for said statistics). And Ostermeier specifically said that it wasn't about whether PolitiFact had ulterior motives. We can do without that straw man.

    Essentially this is the argument that, unless Politifact gives evidence it does not have ulterior motives, Republicans are justified in believing it does.

    Bah. Straw man. Ostermeier was clear that this is about an issue you have already partly conceded. PolitiFact issues "report cards" that supposedly inform potential voters. The report cards tend to make Republicans look bad. We know the selection process isn't random, and PF has offered no reassurance of the existence of any check on ideological selection bias.

    This again, is a textbook argument from ignorance

    No it isn't. It would be an argument from ignorance to claim that if we don't have evidence of bias that therefore PolitiFact isn't biased. Ostermeier argues his point based on the evidence, not a lack of evidence. And you've conceded that the majority of journalists are liberal. And for some reason you draw reassurance that PF votes on the ratings using a panel of three editors. I have information that the votes are not always unanimous. That means that a majority of liberal journalists can always control the ratings produced. Very reassuring, yes?

  2. "Because, as Ostermeier pointed out (and you conceded), PolitiFact publishes sets of statistics that make it look like Republicans lie more yet without explaining to readers that the selection is non-random (and what that means for said statistics)."
    But shouldn't they ask if Republicans really did give inaccuracies in 2010 more than Democrats?

    "And Ostermeier specifically said that it wasn't about whether PolitiFact had ulterior motives. We can do without that straw man."
    I misread the statement. However, the point still stands. He puts the burden of proof on Politifact to provide evidence against the claim that it suffers from party-related selection bias. Why should Politifact have to give evidence to Republicans that they in fact lie more? Shouldn't Republicans have an equal burden to provide evidence of party-related selection bias at Politifact? To say one does and the other doesn't is essentially an argument from ignorance.

    "Bah. Straw man. Ostermeier was clear that this is about an issue you have already partly conceded. PolitiFact issues "report cards" that supposedly inform potential voters. The report cards tend to make Republicans look bad. We know the selection process isn't random, and PF has offered no reassurance of the existence of any check on ideological selection bias."

    Yes, but he is shifting the burden of proof onto Politifact to prove it has some kind of check on ideological selection bias. He then states that Politifact has provided no such evidence. His last paragraph makes it painfully obvious he supports the idea of party-related selection bias. He isn't consistent about his conclusions: sometimes he backs off of it, but he still shows support for the idea and never suggests any of the questions Republicans should be asking before they conclude that Politifact is guilty of party-related selection bias.

    "No it isn't. It would be an argument from ignorance to claim that if we don't have evidence of bias that therefore PolitiFact isn't biased. Ostermeier argues his point based on the evidence, not a lack of evidence."
    You are starting to get the point. The problem is that the statistics he provides as evidence also point to a tendency for Republicans to make more inaccurate statements (at least very public ones). The evidence does nothing to settle whether Republicans "lied" more in 2010 than Democrats or whether Politifact is guilty of party-related selection bias.

    "And you've conceded that the majority of journalists are liberal. And for some reason you draw reassurance that PF votes on the ratings using a panel of three editors. I have information that the votes are not always unanimous. That means that a majority of liberal journalists can always control the ratings produced. Very reassuring, yes?"
    That is assuming that the liberal journalists are actively suppressing the opinions of conservative journalists. Care to provide evidence to back this up?
    My point is that, with conservative journalists and editors, a majority of liberal journalists seeking to be objective in their reporting can avoid party-related selection bias by checking with conservatives on their team.
    Do you have evidence they are not seeking to be objective?
    Do you have evidence conservative journalists are actively being ignored more than liberal journalists?
    If not then you have to remain neutral and certainly cannot come to the conclusion that party-related selection bias is the null hypothesis.

  3. (S)houldn't they ask if Republicans really did give inaccuracies in 2010 more than Democrats?

    In a word, no. PF's methodology is at issue. If PF publishes information that gives the appearance of authoritativeness and the methodology fails to measure up, it helps little or not at all that Republicans lie more, because that conclusion is not supported by PF's findings. Their findings are (apparently) not treated rigorously enough to support that type of conclusion. Try applying your skepticism of Ostermeier to PF instead, just to see what it looks like.

    "Why should Politifact have to give evidence to Republicans that they in fact lie more? Shouldn't Republicans have equal burden to provide evidence of party-related selection bias at Politifact? To say one does and the other doesn't is essentially an argument from ignorance."

    Good grief. You're hopeless.

    Do you know what "framing" is? Just in case ...
    http://marc-latham.suite101.com/what--is-media-framing--opinions-on-what-influences-the-news-a264197
    http://www.scottlondon.com/reports/frames.html

  4. "In a word, no. PF's methodology is at issue.
    That is the problem. The evidence supports the claim that Republicans lie more just as much as it supports the claim that there is party-related selection bias at Politifact. It's like saying that a graph showing more people drive foreign cars is evidence that whoever made the graph is biased.

    "If PF publishes information that gives the appearance of an authoritativeness and the methodology fails to measure up, it helps little or not at all that Republicans lie more because that conclusion is not supported by PF's findings."
    You can question the methodology. But no evidence is no evidence: you cannot suggest one side or the other when you have no evidence either way. And yes, the conclusion that Republicans lie more is just as supported by Politifact's findings as any indication of bias (setting aside other lines of evidence that were not part of Ostermeier's study).

    "Their findings are (apparently) not treated rigorously enough to support that type of conclusion."
    How do you know? Once again, no evidence either way.

    "Try applying your skepticism of Ostermeier to PF instead, just to see what it looks like."
    But I'm not coming to any conclusion or suggestion, nor am I saying the data makes it look like anything. I do that because I know enough about Politifact's methods to know neither conclusion is justified. I do think Politifact is over-playing the significance of its report cards. However, I am saying that Ostermeier is supporting one conclusion without seriously questioning any other.

    "Good grief. You're hopeless."
    You just have a double standard of evidence.

    "Do you know what "framing" is? Just in case ..."
    But any evidence of framing in this case is also evidence that Republicans are lying more. I'm saying that this article gives data that should be indeterminate. He could have asked questions to help rule out other explanations of the data, but he did not. Instead he investigated one possible explanation without ruling out others (except in one case, where he did so with the same kind of justification that should have ruled out his preferred explanation):

    1. He looked at Politifact's 2010 data.
    2. He essentially said it could be selection bias or Republicans lie more.
    3. He found no evidence to rule out selection bias and no evidence that Republicans lie more. Barring an argument from ignorance, this should have been left indeterminate.
    4. He then put the burden of proof on Politifact to show that Republicans lie more than Democrats. That shift of the burden of proof is a fallacy.

    So once again:

    "Why should Politifact have to give evidence to Republicans that they in fact lie more? Shouldn't Republicans have equal burden to provide evidence of party-related selection bias at Politifact? To say one does and the other doesn't is essentially an argument from ignorance."

  5. "The evidence supports the claim that Republicans lie more just as much as it supports the claim that there is party-related selection bias at Politifact. Its like saying that a graph that shows more people drive foreign cars is evidence whoever made the graph is biased."

    Huh. And here I thought I remembered you admitting that a majority of newspaper journalists lean left. Do we also have evidence that a majority of individual graph makers show a preference for foreign cars? What is a "foreign car" to a French graph maker, anyway? A Cadillac?

    Do you seriously mean to say that there is no evidence at all of political media bias? That claim seems absurd on its face. And we know how you hate absurdity.

  6. "
    Huh. And here I thought I remembered you admitting that a majority of newspaper journalists lean left."

    Yes, but they are also trained to be objective.

    "Do we also have evidence that a majority of individual graph makers show a preference for foreign cars? What is a "foreign car" to a French graph maker, anyway. A Cadillac?"
    So the reason you think Politifact needs to bear the burden of explanation is that a majority of journalists are personally liberal. I've already pointed out why this doesn't provide any reason to shift the burden of proof onto Politifact.

    "Do you seriously mean to say that there is no evidence at all of political media bias? That claim seems absurd on its face. And we know how you hate absurdity."
    Correct, that claim would be absurd. But just because some outlets are biased doesn't mean they all are. Inductive fallacy, my friend. Does your whole argument rely on some "liberal bias until proven innocent" kind of premise?

  7. "(T)hey are also trained to be objective."

    Are they? Do you have a degree in journalism or mass media studies? Is it possible that the training in objectivity isn't all that great? If I have a degree in mass media studies, does that make me objective?

    "So the reason you think Politifact needs to bear the burden of explanation is because a majority of journalists are personally liberal."

    That's part of the reason. I mainly mentioned it so that you'd have a chance to reconcile that with your implicit claim that there is no evidence at all of media bias ("The evidence supports the claim that Republicans lie more just as much as it supports the claim that there is party-related selection bias at Politifact").

    http://qje.oxfordjournals.org/content/120/4/1191.short

    "I've already pointed out why this doesn't provide any reason to shift the burden of proof on politifact."

    Hopefully you're not referring to your claim that journalists are trained in objectivity. What are you referring to, please?

  8. "Are they? Do you have a degree in journalism or mass media studies? Is it possible that the training in objectivity isn't all that great?"
    No, I haven't examined the curriculum of each and every one of their schools. However, can you tell for sure they are bad? If not, then you are stuck with no conclusion. From what I can tell from journalists I know, training in objectivity is fairly good.

    " If I have a degree in mass media studies does that make me objective?"

    You may be trained in how to be objective. However, I don't get the impression you make any effort to try and be objective.

    "That's part of the reason."
    It is a pretty poor reason. It is essentially evidence of motive, which is pretty crappy without other definitive evidence (unlike Ostermeier's study). Does the fact that doctors would profit from a sicker population provide evidence that they are engaging in some kind of sickness conspiracy? Does the fact that anti-virus companies would gain job security and increased demand for their products count as evidence that they are creating viruses? Does the fact that socialism poisons liberal ideas count as evidence that Stalin was a pawn of a conservative underground? This, like your liberal-media evidence, is all conspiratorial thinking. And it's not a recipe for obtaining true beliefs.

    "I mainly mentioned it so that you'd have a chance to reconcile that with your implicit claim that there is no evidence at all of media bias"

    No more evidence of media bias than there is that Republicans lie more than Democrats. The "evidence" you have provided is pretty poor.

    http://onlinelibrary.wiley.com/doi/10.1111/j.1460-2466.2000.tb02866.x/abstract

  9. "opefully you're not referring to your claim that journalists are trained in objectivity. What are you referring to, please?"


    YOU: "And you've conceded that the majority of journalists are liberal. And for some reason you draw reassurance that PF votes on the ratings using a panel of three editors. I have information that the votes are not always unanimous. That means that a majority of liberal journalists can always control the ratings produced. Very reassuring, yes?"

    ME: "That is assuming that the liberal journalists are actively suppressing the opinions of conservative journalists. Care to provide evidence to back this up?
    My point is that, with conservative journalists and editors, a majority of liberal journalists seeking to be objective in their reporting can avoid party-related selection bias by checking with conservatives on their team.
    Do you have evidence they are not seeking to be objective?
    Do you have evidence conservative journalists are actively being ignored more than liberal journalists?
    If not then you have to remain neutral and certainly cannot come to the conclusion that party-related selection bias is the null hypothesis."

  10. "No I haven't examined the curriculum of each and every one of their schools."

    Just most of them, eh? ;-)

    "From what I can tell from journalists I know, training in objectivity is fairly good."

    Ah. Scientific sampling. Very good.

    "The 'evidence' you have provided is pretty poor."

    How much evidence do I need in order to contradict the claim that there is no evidence?

  11. "Just most of them, eh? ;-)"

    Are you saying journalists have no training in objectivity? Or that the training is particularly poor?

    "Ah. Scientific sampling. Very good."
    Nice straw man, seeing as how I never meant it as more than an observation.

    "How much evidence do I need in order to contradict the claim that there is no evidence?"
    Just some decent quality evidence. Horrible evidence is only trivially different from no evidence.
