This post builds on the one Lukas put up last week. Most commentaries on the Snowden Affair, PRISM, and the other NSA programs that have come to light have focused on whether these programs are constitutional, whether Snowden is a hero or villain or something else, and, now, what these programs will mean for US foreign relations. I have also heard people ask how any of us could be surprised by these programs, and for a few days, people spent a lot of time talking about Snowden’s girlfriend’s pole-dancing skills. In other words, the Snowden Affair has all the markings of a major American media event.
In this post, I’d like to exercise the historian’s prerogative by exploring how these NSA programs fit into a longer historical trajectory, namely how government spending and procurement influence technological change.
The history and sociology of science and technology are full of well-known stories of how government funding affected the direction and growth of technological innovation. The best-known stories in the United States have to do with technical advances made at MIT, Harvard, and Los Alamos during World War II and the wide variety of scientific breakthroughs and technologies that emerged from Cold War defense spending. (Mark Buchanan recently put up an entertaining post about the many technologies that ultimately have roots in government spending.) There are many earlier examples in the United States from WWI and even the 19th century. Of course, any comprehensive history of the military-science-technology relationship would have to go back much further; in the West, through the 18th-century French scientific societies and da Vinci at least as far back as Archimedes.
We can assume that spending on intelligence and the technology that undergirds it exploded after 9/11. 9/11 was to the surveillance-industrial complex what Sputnik was to Cold War sci-tech funding. It would be interesting to know whether the programs that developed after that date were merely extensions—if massively scaled-up extensions—of things that were already in the works. It would also be interesting to know how many new programs developed after that date (versus building on old programs).
But it would also be fascinating to learn how these programs have influenced technological change, if at all. Do fundamentally new and largely unknown computing technologies lie behind the NSA’s capabilities? Are these capabilities mostly the result of hugely scaling up technologies that are already well known (server farms, data-mining algorithms, etc.)? Or will we look back on the NSA’s programs as having greatly changed computing technologies? If so, which companies would have produced these technologies for the agency? Mostly defense contractors? Mostly computing firms? Or might the government have its own internal R&D shops?
Economists and historians often examine “spillover” to see how government spending, typically military spending, ends up influencing the broader economy. To the degree that new technologies, processes, and techniques are being developed through these programs, it will likely be very difficult, for several reasons, to determine down the road how much these things have moved into the domestic sector. First, the NSA can likely prevent the spread of new technological systems (if truly new “x-technologies,” like quantum computing, are part of the programs). But the agency cannot easily stem the dissemination of the experience and tacit knowledge that people gain by working in these programs. People will move to other jobs and take their experience in, say, developing data-mining algorithms with them. Again, the movement of this knowledge will be very difficult to track.
Second, contractors, like Snowden, do a significant portion of US intelligence work. Today, on Meet the Press, Rep. Nancy Pelosi said that the Obama Administration has done a great deal to decrease the role of contractors in classified projects. I don’t know where Pelosi is getting her information, but my instinct says that she is overestimating the decline of contractors under Obama. As long as intelligence remains tied to the use of enormous computer networks, contractors will likely continue to play an essential role. Even in the midst of news about Snowden, we have learned more about Amazon’s contract to build cloud computing infrastructure for the CIA. Booz Allen Hamilton employees and other such consultants and contractors will take the lessons they learn in working for the NSA and apply them elsewhere. I’m sure the opposite is also true: the contractors are bringing lessons learned from private industry and applying them at the intelligence agencies. Indeed, in terms of the movement and synthesis of knowledge between the private and public sectors, these contracting firms are likely important nodes that historians and sociologists would do well to examine . . . if they ever can . . . all of this is veiled in such horrible secrecy. In this case, secrecy might also have dire implications for our ability to study the realities of US innovation policy, since the surveillance-industrial complex will have an unknown relationship to technological change and economic growth.
What I keep wondering is how we will see these things in ten or twenty years. Will we see the NSA’s influence on technology as we now see Cold War sci-tech funding, that is, as a hugely important source of technological change and knowledge production? Or will the NSA’s programs just seem like another (ultimately boring) application of “big data” and the “app economy”?
If any readers—especially those readers with deep knowledge of computing and/or computing history—have thoughts about the relationship between the NSA’s programs and technological change, I’d love to hear them.