I’ve got my copy of Stephen Wolfram’s A New Kind of Science proudly displayed in my living room. Okay, so that’s just where my bookcases are. I am proud though: I look at the bright red and yellow on the spine and remember the excitement of 2002. I have even, on occasion, read some of the words inside.
Wolfram positioned himself as the next Galileo, bringing about a fundamental change in the practice of science. Some computationally minded folks in the science community appear to have taken this seriously. At least, that’s what I gather from John Markoff’s recent write-up of The Fourth Paradigm in the NY Times.
The editors of and contributors to The Fourth Paradigm take as given three already existing paradigms: 1) experiment, 2) theory, and 3) computation. Now they present a next step forward, which at a quick glance appears to be a kind of super-charged empiricism reliant on computer-instrument hybrids.
From Markoff’s summary:
Now, as a testimony to his passion and vision, colleagues at Microsoft Research, the company’s laboratory that is focused on science and computer science, have published a tribute to Dr. Gray’s perspective in “The Fourth Paradigm: Data-Intensive Scientific Discovery.” It is a collection of essays written by Microsoft’s scientists and outside scientists, some of whose research is being financed by the software publisher.
The essays focus on research on the earth and environment, health and well-being, scientific infrastructure and the way in which computers and networks are transforming scholarly communication. The essays also chronicle a new generation of scientific instruments that are increasingly part sensor, part computer, and which are capable of producing and capturing vast floods of data. For example, the Australian Square Kilometre Array of radio telescopes, CERN’s Large Hadron Collider and the Pan-STARRS array of telescopes are each capable of generating several petabytes of digital information each day, although their research plans call for the generation of much smaller amounts of data, for financial and technical reasons. (A petabyte of data is roughly equivalent to 799 million copies of the novel “Moby Dick.”)
I think there’s plenty of interesting stuff here, aside from the way these scientists use a created tradition of scientific history to frame their work. (Is this a meta-narrative?)
For one, there’s the sponsor: Microsoft Research. If we’re interested in the history of science in America, we have to pay as much attention to these sorts of non-academic sites of science as we do to universities.
Second, there’s the renewed focus on data analysis. These scientists are pitching this work as fundamentally new (indeed: paradigm-breaking), a claim historians of science might consider worth refuting. I’m partial here: I just organized a panel in Phoenix calling for more attention to this sort of empirical work in science over the last century, though we focused more generally on the persistence of natural history practices (like collecting). More on this to come…