So now Levitt and his co-author Stephen Dubner have a sequel, Superfreakonomics, which includes a chapter on climate change. Do they deploy Levitt's trademark economic techniques to shed new light on old questions? Because that might be useful! Alas, no, there's nothing of the sort. Levitt and Dubner just parachute into the field of climate science and offer some lazy punditry on the subject dressed up as "contrarianism." There's no original research. There's nothing bold or explosive. It's just garden-variety ignorance. . .
In just a few dozen pages, Dubner and Levitt manage to repeat the myth that the scientific consensus in the 1970s predicted global cooling (quite untrue), imply that climatologists are unaware of the existence of water vapor (no, they're quite aware), and traffic in the elementary misconception that CO2 hasn't historically driven temperature increases (RealClimate has a good article to help with their confusion). The sad thing is that Dubner and Levitt aren't even engaging in sophisticated climate-skepticism here—there's just a basic unwillingness to gain even a passing acquaintance with the topic. You hardly need to be an award-winning economist to do that.
If bad science from non-scientists isn't bad enough, Paul Krugman suggests that they unintentionally (one hopes) misrepresented the work of another economist who tackled the question of the discount rate applied to addressing climate change now (that is, figuring out how much possible future benefits are worth to us right now):
Yikes. I read Weitzman’s paper, and have corresponded with him on the subject — and it’s making exactly the opposite of the point they’re implying it makes. Weitzman’s argument is that uncertainty about the extent of global warming makes the case for drastic action stronger, not weaker.
That's a pretty serious breach, something you'd expect more from the makers of tripe like that "What the Bleep Do We Know?" movie about quantum mechanics than from a respected economist and journalist. It seems the irresponsibility in the book may not be limited to the chapter on climate. Ezra Klein took issue with a chapter that apparently attempts to show that driving drunk may be safer than walking home drunk:
It's terrifically shoddy statistical work. You'd get dinged for this in a college class. But it's in a book written by a celebrated economist and a leading journalist. Moreover, the topic isn't whether people prefer chocolate or vanilla, but whether people should drive drunk. It is shoddy statistical work, in other words, that allows people to conclude that respected authorities believe it is safer for them to drive home drunk than walk home drunk. It's shoddy statistical work that could literally kill somebody. That makes it more than bad statistics. It makes it irresponsible.
The sentiment is echoed by Krugman in another blog post considering the climate change chapter:
And that’s not acceptable. This is a serious issue. We’re not talking about the ethics of sumo wrestling here; we’re talking, quite possibly, about the fate of civilization. It’s not a place to play snarky, contrarian games.
I, of course, haven't read any of this yet-to-be-released book (Levitt himself insists it's not that bad) but that won't stop me from speculating as to why someone would engage in these kinds of games. There really is a marketplace of ideas out there: in this case it takes the form of the bestseller list. Competitive pressures in such a marketplace aren't necessarily for correct ideas but rather for interesting ideas. Book sales determine whether ideas sink or swim. That's why the Dan Brown school of lousy fiction is so successful and why Brian Greene can make money off "non-fiction" books about fantasy physics with little in the way of experimental support. That's interesting, imaginative stuff. Who cares if it's well-written or supported empirically?
Of course, that leads one to wonder whether it's okay for a public intellectual to lend his credentials to a questionable idea in pursuit of royalty money. For undergrads, being contrarian can give a competitive edge in the market for attention because it can lend one the appearance of intellectual depth. For full-blown professors with book deals, contrarianism can generate a competitive boost because it sells books. But just as regular markets can be dysfunctional in the absence of regulations, the market for ideas can also produce perverse outcomes in the absence of self-restraint (self-regulation). That's where the responsibility lies.
We can zoom out a bit and consider a slightly larger issue: namely, the insulation of academics from responsibility more generally. Academics have a hugely important role to play in the public discourse, in that they produce much of the research that decision-makers look to for guidance on the most pressing issues of the day. But responsibility for the consequences of a public policy decision doesn't fall on the academic who suggests it; it falls on the decision-maker who implements it. To be sure, some academics do find themselves at the actual policymaking table. Federal Reserve Chairman Ben Bernanke, regarded by many as an expert on the causes of the Great Depression when he has his professor's hat on, found himself in the unexpected position of having the public responsibility of preventing a second one. And policy advisers at all levels of government have often spent time in the academic world, or will later.
But many prisoners in the Ivory Tower haven't spent much time in the world of consequences. I enjoy reading the Becker-Posner blog for intellectual reasons, but there's something strange about reading two tenured professors (one of whom also holds a federal judgeship for life) pooh-poohing the job security and benefits that workers with less cushy positions seek to attain through the bargaining power of unions. They draw paychecks from one of the most perverse systems around (the American higher education system) and dictate that others should be happy to abide by the vaunted rules of the free market.
So what can we take away from such examples of academics not buying what they're selling (neither in the marketplace of ideas nor of books)? Simply that academics are not the sole arbiters of truth in the modern world. They have great value but can still be blinded by their own accolades or book sales just like everyone else. It isn't anti-intellectual to suggest that extended real-world experience--residence in the world of real consequences and responsibilities--is an extremely important supplement to one's time behind the ivy-covered walls. It at least serves as a reminder that policy is more than a mathematical exercise or a rhetorical game useful for boosting book sales: it impacts real lives and has enormously significant repercussions for the segment of the population that doesn't enjoy tenure.
Update: I'm angry at myself for not including a Rick James joke somewhere in the above.
The field of history has a similar problem (other than Dan Brown) with what-if historians. I don't have as much of a problem with what-if history as some of my professors and history friends from college, but it does seem a little degrading to the profession when you give a few paragraphs of background information and then proceed to use your imagination, and little to no research, to come up with a shaky prediction of how you think things could have changed. Considering the generic background information in each chapter of a book like "What If..." (granted, the only what-if book I've read), the use of esteemed professors to come up with the predictions seems little more than a ploy to make some money; any serious student of history could come up with the actual history in these books, and almost anyone at all could create the predictions.
This segues into another idea you mentioned, Stanek, that I feel strongly about: "...academics are not the sole arbiters of truth in the modern world." At the close of my undergraduate studies I got a lot of comments, and the occasional arrogant remark, about how if I wanted to continue down the road to being a good, serious historian I really had to go to graduate school. I'm sure everyone here has seen an outgrowth of this attitude in professors' remarks about Wikipedia and the internet in general: anyone can put information on it! I understand there are a lot of idiots out there who claim to know what they're talking about (some would say I'm one), and that one obviously has to be careful with internet sources, but there seemed to be an underlying, borderline elitist issue here: namely, that professors think only those in academia can ever really know what they are talking about. By their logic, if I wanted to be a real historian, then of course I had to get into grad school and academia. For a while I fell for it, and even though I really did not want to go, I started looking into grad schools because I certainly didn't want to look like a joke of a historian.
I eventually decided not to go, at the time because I had started looking into schools too late. But in hindsight, I'm glad I did not go, because I was only interested for the wrong reasons (embarrassment, essentially, although now I'm starting to see that graduate school is becoming more and more mandatory in order to get any kind of job at all). Today, I'm confident in my knowledge of the particular areas of history I've studied, and I personally didn't need graduate school to validate that knowledge. For the record, I completely respect those who do actually want to go; there is nothing wrong with graduate school if you go for the right reasons (as the members of this blog do). However, academia does not have a monopoly on knowledge. There have been professors I flat-out knew more than (I'm not trying to be arrogant; there are always people who are bad at their job but manage to get by, academics included), and there were some who knew more than me in some areas of history and less in others. Academia does have the proof of training, but it's not the sole path to knowledge, and it certainly isn't immune to bias or simple ignorance.