Science is Hard

Cross-posted from RM’s PI’s blog.

Nature Communications has an inspiring piece about a DARPA program that bakes replication into the original studies. The DARPA Biological Control program applies independent validation and verification (IV&V) to synthetic biology, and the article is a lessons-learned report.

Although DARPA oversight presumably mitigated the element of competition, the IV&V teams had at least as much trouble as the historical cases discussed below. It’s worth reading their “Hard lessons” table. Here’s one:

We lost more than a year after discovering that commonly used biochemicals that were thought to be interchangeable are not.

And this one seems to apply to any discipline: “pick a person.”

The projects that lacked a dedicated and stable point of contact were the same ones that took the longest to reproduce. That is not a coincidence.

They also report how much they needed personal interaction — a result familiar to those of us in Science Studies (more later).

A key component of the IV&V teams’ effort has been to spend a day or more working with the performer teams in their laboratories. Often, members of a performer laboratory travel to the IV&V laboratory as well. These interactions lead to a better grasp of methodology than reading a paper, frequently revealing person-to-person differences that can affect results.

But I was still surprised by how much.

A typical academic lab trying to reproduce another lab’s results would probably limit itself to a month or so and perhaps three or four permutations before giving up. Our effort needed capable research groups that could dedicate much more time (in one case, 20 months) and that could flexibly follow evolving research.

To be fair, this is biology, a proverbially cantankerous field. But the canonical Science Studies reference on the difficulty of replication is laser physics.

Before I explore that, pause to appreciate the importance of this DARPA work: (1) The value of baked-in replication for really understanding the original result, and (2) the real difficulty in achieving it. I encourage you to read the NC piece and imagine how this could be practiced at funding levels below that of DARPA programs.

Echoes of the Past

In Science Studies, replication troubles evoke the “experimenters’ regress”. The canonical reference is Collins’s 1974 paper on laser physics (or really, his 1985 book):

to date, no-one to whom I have spoken has succeeded in building a TEA laser using written sources (including preprints and internal reports) as the sole source of information, though several unsuccessful attempts have been made, and there is now a considerable literature on the subject. …. The laboratories studied here … actually learned to build working models of TEA lasers by contact with a source laboratory either by personal visits and telephone calls or by transfer of personnel.

Shortly thereafter, he notes that the many failures were:

simply because the parameters of the device were not understood by the source laboratories themselves. … For instance, a spokesman at Origin reports that it was only previous experience that enabled him to see that the success of a laser built by another laboratory depended on the inductance of their transformer, at that time thought to be a quite insignificant element.

This is of course echoed in the new NC piece about the DARPA program.

Collins expands on this and other episodes in his (1985), attempting to make sense of the (then nascent) attempts to detect gravity waves. As Turner (2014) summarizes:

The point was to show that replication was not and could not be understood as a mechanical process…

So the crisis isn’t merely that it’s hard to replicate from publications – any more than it’s a crisis that it’s so hard to learn to ride a bicycle by studying a manual. And no doubt many failed replications are failures of technique, caused by researchers foolishly attempting replication without contacting the original authors. The crisis is that we have many reasons to expect half the original results were indeed spurious. One of the goals of Replication Markets and the larger DARPA SCORE program is to help sort out which.

Back to the Future

I’ve fallen behind in the literature. I see that Collins has a 2016 chapter on the modern situation: “Reproducibility of Experiments: Experimenters’ Regress, Statistical Uncertainty Principle, and the Replication Imperative.” I look forward to reading it.

And of course this brings us to Nosek and Errington’s preprint, “What is replication?”, where they argue that replication itself is an “exciting, generative, vital contributor to research progress”.

But now, as Gru says, back to work.

Notes

Collins, Harry M. 1974. “The TEA Set: Tacit Knowledge and Scientific Networks.” Science Studies 4: 165–86.

Collins, Harry M. 1985. Changing Order: Replication and Induction in Scientific Practice.

Nosek, Brian, and Tim Errington. 2019–2020. “What Is Replication?” MetaArXiv Preprints.

Turner, Stephen P. 2014. Understanding the Tacit. (p. 96)
