I may have more to say about the New Republic article tomorrow, but I find some of the points interesting enough for discussion on their own. It’s basically a study of the right-wing think tank studies that have concluded conservative faculty and ideas are systematically excluded on college campuses. I’ll throw together some thoughts in the morning. Here are some highlights.

Professors are all Democrats, except those who are communists. Professors all hate Bush. Professors favor like-minded students and love converting those who love God, country and the president. You’ve read all the claims and more, in right-leaning blogs and columns. Frequently, these claims are based on studies — many have been released in the last two years — of professors. Party registration is documented, or professors respond to surveys, or syllabus content is rated.

A new study being released today aims to debunk all of those studies. “The ‘Faculty Bias’ Studies: Science or Propaganda?” takes eight of the recent studies on faculty politics and judges them by five general tests of social science research. Today’s study finds that all eight come up short in adhering to research standards. The new study was sponsored by the American Federation of Teachers, and the work was conducted by John B. Lee, an education researcher and consultant who said that once the AFT commissioned the work, it did not restrict his approach or findings in any way.


Lee’s analysis finds some support for the first theme. “Taken together, these studies at best suggest that college faculty members are more likely to be Democrats than Republicans,” he writes. However, even on this theme, he notes that the studies tend to exclude community college faculty members and to focus on faculty at elite institutions — probably skewing the results.

The second theme takes a more thorough beating in the study. “Among the most serious claims the authors make is that this liberal dominance results in systematic exclusion of conservative ideas, limited promotion opportunities for conservative faculty, and expression in the classroom of liberal perspectives that damage student learning,” Lee writes. “These claims, however, are not supported by the research. Basic methodological flaws keep a critical reader from accepting the conclusions suggested by the authors.”

The flaw Lee identifies most frequently with this theme is one in which researchers note a correlation and — in Lee’s opinion — then see a causal relationship without sufficient evidence that one exists.


The new AFT study looks at eight studies, including some that have attracted substantial attention (both praise and criticism), such as work published in 2005 in The Forum that analyzed faculty attitudes at four-year institutions and concluded that conservatives, practicing Christians and women are less likely than others to get faculty jobs at top colleges. That study was based on a survey of 1,643 faculty members. Other studies looked at faculty attitudes in certain disciplines or at certain institutions.

Some of the studies were prompted by specific events, such as the American Council of Trustees and Alumni’s “How Many Ward Churchills?,” which analyzed class materials online at top institutions and found that the controversial Colorado professor’s ideas — which have been in the news while his university has considered whether to fire him — are shared by many professors. Some of the reports are by social scientists, published in peer-reviewed journals. Others were issued by associations that are players in the culture wars of academe.


Lee said that to test the validity of the studies, he wanted standards that could not be considered partisan, so he used a 2006 statement by the White House Office of Management and Budget about objectivity in research. Based on that statement, he asked five questions about each of the faculty bias studies:

  • Can another researcher with a different perspective replicate the results using the information provided by the author?
  • Are the definitions used in the studies clear enough?
  • Does the research eliminate alternative explanations for the results?
  • Do the conclusions follow logically from the evidence?
  • Has the author guarded against assumptions that could introduce systematic bias into the study?

Using this framework, Lee gives the studies failing grades. Four studies had data that could be replicated, and he gave three studies acceptable reviews on clarity of terms, but it was downhill from there, and he argues that none of the reports can truly back up their contentions.


Another theme he returns to repeatedly is the failure to demonstrate causal relationships. He notes that there are many explanations for political trends and demographics among the professoriate, so it is unfair to assume that a liberal tilt (assuming one exists) reflects bias. He notes, for example, that the studies do not explore whether there could be non-political explanations.

This last point is salient, I think. The conservative pundits quoting the studies simply point out that history, sociology, and literature departments tend to be dominated by liberals, without reference to the ideological makeup of business administration or engineering departments. They point out how many liberals are hired compared to conservatives, but they provide no ratios for the applicant pools. Maybe there are correlations between ideology and actual interest in various disciplines.

The conservative response might not contain much substance, but it is certainly abundant in vehemence.

Anne Neal, president of the American Council of Trustees and Alumni (which issued two of the reports reviewed), criticized the AFT for commissioning the study. Via e-mail, she said: “Faced with mountains of evidence from ACTA and others documenting a troubling lack of professionalism in the academy, AFT chooses, instead, to shoot the messenger. In doing so, far from undermining ACTA, it discredits itself. AFT’s study is severely flawed. It is filled with inaccurate and tendentious interpretations — for instance, framing the debate in terms of politics rather than professional standards outlined by ACTA; applying irrelevant ‘scientific’ standards to textual analysis; and offering such shoddy research that the sections on ACTA totally confuse and conflate two different reports, rendering the critique invalid, even laughable.”