August 31, 2010

Marc Hauser and the fragility of specialized research

In case you haven't heard, a superstar Harvard scientist has been found responsible for research misconduct, possibly including fabrication of data, which has led to the retraction of a key journal article. Rather than this being some minor fudge that can easily be set straight, the comments from those in the field -- primate cognition -- make it sound like an entire edifice could crumble. How could just one bad apple spoil the bunch?

The answer comes from Nassim Taleb's view on the fragility vs. robustness of complex systems, and on the accelerating per-unit cost of correcting harms as those harms get bigger. Go to his webpage and read around, or listen to his interviews on the EconTalk podcast, or read the afterword to the paperback edition of The Black Swan. The ideas are spread throughout. Here's how it works in this case:

With the advent of the modern research university, work on a given research problem gets concentrated into fewer and fewer hands as academics over-over-over-specialize. Before, they were generalist enough to understand most problems, to have something to say about them or carry out work on them, and to interpret and judge other people's thoughts and findings on them. Academics have always been prone to unintentionally seeing their data with biased eyes, to maliciously faking the data, and to having their equipment crap out and give them a faulty reading.

But before, there were, say, 100 academics working on some problem, so that if any one of them made a serious error, the others could correct it at a small cost -- by trying to replicate the results themselves, asking for the data to re-analyze it themselves, proposing rival explanations, pointing out logical flaws (spotting which still requires enough knowledge of the problem), and so on. So, if the system is moving in the direction of discovering greater truth, and we perturb it by having one of these 100 academics make an error, the system goes back to where it was and the error doesn't spiral out of control. Also, the errors from various academics don't compound each other -- that would require the errors to belong to some larger framework, vision, or coordination that wove them together and made them interact.

However, in the modern setting of hyper-specialization, there are very few people qualified to correct the errors of the one academic (or maybe the small handful) working on the problem. They mostly have to take the raw data and the interpretations on faith -- deferring to expert consensus is out, since, again, so few are qualified that the law of large numbers cannot work to yield an unbiased consensus. Thus, when a crucial error is made, there are few external checks to dampen it, and its harm grows and grows as it comes to be taken for granted among the broader academic fields that cite the erroneous article.
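To put a rough number on that intuition -- my own toy model with assumed figures, not anything from Taleb or the Hauser coverage -- suppose each independent researcher who scrutinizes a result has some fixed chance of catching a given error. The chance that at least one of them catches it collapses fast as the number of qualified scrutinizers shrinks:

```python
# Toy model (assumed numbers, purely illustrative): each of N independent
# researchers has probability p_catch of spotting a given error.
def catch_probability(n_researchers: int, p_catch: float) -> float:
    """Probability that at least one of n independent checkers catches the error."""
    return 1 - (1 - p_catch) ** n_researchers

for n in (100, 10, 3, 1):
    print(f"{n:3d} researchers -> error caught with probability {catch_probability(n, 0.05):.2f}")
# 100 researchers -> ~0.99, 3 researchers -> ~0.14, 1 researcher -> 0.05
```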

Moreover, these errors compound each other because they all come from the same academic, not a group of unrelated researchers. In his vision, his big ideas all mesh together and reinforce each other; most academics don't toy with a bunch of totally unrelated ideas, but instead seek to build an interlocking tower of ideas.

Let's say that a total of 4 errors are made on some problem and go uncorrected for a while -- they're only spotted after they've been incorporated into the broader field. In the generalist world, those errors probably came from 4 unrelated researchers, so they will hit at 4 distant spots in the problem -- like 4 blows landing on a chair, one to each of its 4 legs. It will get shaken around, but it won't get upended. Now go to the hyper-specialized world: those 4 errors will all be part of an integrated vision, so they'll have a stronger total force, given that they work with each other. And they'll be more focused in where they strike, since the scope of research for one person is smaller than it is for four people combined. This is like an even stronger force striking at just one of the chair's legs -- now it will topple over.

In other words, the harm done by the 4 errors of a single, specialized academic is greater than 4 times the harm done by a single error of a generalist academic (because of compounding). We would boost the health of the academic system by breaking up research territory so that it looks more like the generalist than the specialized world. It doesn't matter if there are economies of scale, benefits of specialization (like becoming a master at one thing rather than being a jack-of-all-trades), or whatever. Those benefits all get wiped out when some series of errors gets out of control, as it inevitably will -- then the whole system crashes, and the so-called gains of specialization were illusory.
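One way to make the "greater than 4 times" claim concrete -- a sketch with an assumed convex harm function, not a measured quantity -- is to let harm grow superlinearly when errors interlock, and merely add up when they don't:

```python
# Sketch of the compounding claim (the quadratic harm function is an assumption).
def harm_specialist(k_errors: int) -> int:
    # errors share one framework and reinforce each other, so harm grows superlinearly
    return k_errors ** 2

def harm_generalists(k_errors: int) -> int:
    # errors come from unrelated researchers and merely add up
    return k_errors

print(harm_specialist(4), harm_generalists(4))  # 16 vs 4
# Under this assumption, four interlocking errors do 4x the damage of four scattered ones.
```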

In the generalist world, errors are less likely to go uncorrected, and they do not get compounded, so it is much more robust -- at the "cost" of greater inefficiency. Greater efficiency in the short run only serves to boost the status of researchers, as they get praised and financially rewarded for their impressive short-term performance. The fake efficiency of specialization counts for nothing once the body of work on a problem is found to be contaminated beyond repair. *

A central source of robustness is redundancy, like having spare parts in case one stops working. An academic world where 100 people are working on a problem is more robust, even though it looks inefficient -- why not just hand it over to a handful of specialists? Well, why have two lungs and why ever back up your files? The cost of redundancy is more than outweighed by the benefit of superior survival value, whereas the benefits of specialization are more than outweighed by the cost of getting wiped out before too long.
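Here is a rough way to see the survival argument, again with made-up numbers of my own: suppose a serious error arises in some fraction of years and only becomes fatal to the field if nobody catches it before it becomes load-bearing. The redundant system looks wasteful every single year but is far more likely to still be standing after a few decades:

```python
# Toy survival comparison (all parameters assumed for illustration).
def survival_probability(years: int, p_error: float, p_catch: float, n_checkers: int) -> float:
    p_uncaught = (1 - p_catch) ** n_checkers       # nobody catches the error
    p_fatal_per_year = p_error * p_uncaught        # an error arises AND slips through
    return (1 - p_fatal_per_year) ** years         # no fatal year in the whole span

# 30-year outlook: an error arises in 20% of years, each checker has a 10% chance of catching it.
print(f"redundant (50 checkers): {survival_probability(30, 0.2, 0.10, 50):.2f}")  # ~0.97
print(f"lean      (2 checkers):  {survival_probability(30, 0.2, 0.10, 2):.2f}")   # ~0.00
# The "inefficient" redundant field almost surely survives; the lean one almost never does.
```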

This is surely a big part of why so little great research was done during the 20th century, especially during the second half when peer review became institutionalized (but that's another story). For thousands of years before the modern research university, people found big new problems all the time, made progress on them, and this led to solutions for other problems (because generalists can see connections between disparate problems). They were subject to fads and groupthink, but so are specialists -- the latter more so, since they're so insulated -- and in any case, they weren't so fragile to the errors of a couple of researchers.

"Yeah, well this is just one case -- so what?" This is the only one that we know about here and now. We only found out because Hauser was so out there, according to his graduate students, that they couldn't stay silent any longer. Most grad students are of course a bunch of big fat pussies who value career stability over higher goals. So there are all sorts of cases, past and present, that we aren't even aware of. Sometimes people actually trace the chain of citations back to the original data and find that the original results were wrong, or got transformed in a game of broken telephone. Every field has their share of academic urban legends, and that's much more likely in a specialist world where people have no choice but to take data and interpretations on faith, and where non-specialists are likely to mutate the original finding or interpretation.

The only solution is to break up the academic territory and go back to the generalist world that prevailed before the modern research university, before publish-or-perish incentives that only made the temptation to fudge even greater, and before supposedly hard-headed academics began to take others' findings on faith.

* It's just like with banks: there's a mismatch between the time scale of performance and the time scale of bonuses. Managers get bonuses every year, yet it could take a decade or more to see whether they'd been taking dangerous risks that will ultimately wipe the company out for good. So their incentive is to take a lot of hidden risks and massage the numbers so that they get their yearly bonus. Once the hidden risks blow up beyond what massaged numbers can conceal, the company goes bust -- but the banker still keeps the bonuses from those first 9 years of illusory efficiency. Same with academics, who get praised, promoted, and paid on an annual basis, but whose performance may take a decade or more to truly evaluate.
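The bonus mismatch is easy to sketch in back-of-the-envelope form (every figure here is assumed, just to show the shape of the incentive): collect a bonus each year for running a hidden risk that blows up roughly once a decade, and the banker's expected take stays positive even while the firm's expected loss dwarfs it:

```python
# Back-of-the-envelope incentive mismatch (every figure here is an assumption).
annual_bonus = 1.0           # banker's yearly bonus, in arbitrary units
blowup_loss_to_firm = 50.0   # damage to the firm when the hidden risk finally hits
p_blowup_per_year = 0.10     # hidden risk blows up roughly once a decade
years = 10

# Expected number of bonus-collecting years before the blowup (bonuses are never clawed back).
expected_bonus_years = sum((1 - p_blowup_per_year) ** t for t in range(years))
banker_expected_take = annual_bonus * expected_bonus_years
firm_expected_loss = blowup_loss_to_firm * (1 - (1 - p_blowup_per_year) ** years)

print(f"banker keeps ~{banker_expected_take:.1f} in bonuses")   # ~6.5
print(f"firm expects to lose ~{firm_expected_loss:.1f}")        # ~32.6
# The banker's expected payoff is positive; the firm's expected loss is several times larger.
```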

3 comments:

  1. The only solution is to break up the academic territory and go back to the generalist world that prevailed before the modern research university

    How is that logistically possible? Specialization arose out of necessity, did it not?

  2. Curiously, in my academic field, Epidemiology, we often have an extremely dense body of replication and some very spirited and data-based fights. So I am not 100% sure that this applies to all academic fields.

  3. Joseph, you should go over to rawfoodsos and read the writer's slaughter of the China Study. Very interesting stuff.

