At some point about halfway through the hurly-burly of pulling together our special issue on what I’d taken to calling The Data Age, senior associate editor Ryan Bradley noticed that Stephen Wolfram had created a timeline of significant milestones in the historical march of data. We thought it would be an excellent piece of contextual glue to apply to our analysis of the burgeoning power of data, well wielded, to both illuminate and influence our world. Fortunately, Wolfram agreed, and the timeline ran as connective tissue along the bottom of our magazine pages. Wolfram was at that time about to host his second annual Wolfram Data Summit in Washington, D.C., a gathering of database curators and purveyors and open-source savants to discuss how best to cultivate and process and liberate our geometrically expanding data bounty. Wolfram’s data-processing answer engine, Wolfram Alpha, is an outrageously ambitious and optimistic enterprise. It combines the algorithmic might of his own Mathematica software with brute-strength data-curation efforts to answer questions users may not even be aware they’re asking. I figured he would be an interesting person to chat with about data.
The phone call lasted about an hour and 45 minutes, from me thanking him for allowing us to exploit his curatorial skills and asking whether he agreed with our proposition about the burgeoning power of Big Data all the way through to Wolfram’s description of his algorithmic pursuit of the origin and laws of our universe, basically a mash note to the brilliant irreducible complexity of nature (as opposed to the iterative plodding simplicity of human engineering). It was fun. I’ve edited the full transcript for clarity and tightened it just a bit for your benefit, but otherwise I think it’s worth reading in full. Enjoy the trip.
MARK JANNOT: First off, thank you for letting us use your timeline in our Data Issue. We’ve employed it as a helpful tool to drive readers through the entire feature well and package this issue together in a nice way, so it’s really great.
STEPHEN WOLFRAM: I’d always thought systematic data was important to the progress of civilization, but I have to say that as we put together the poster, I really came to appreciate how many important advances in civilization were made possible because there was systematic data about this or that, or because people could count on data being available. Of course, some bad things have happened in the course of history as a result of large chunks of data being collected, too, but I like to look at the positive progress.
Did you choose to leave out the negatives in the timeline?
I don’t know the history tremendously well, but once you can do censuses of people, you can decide, “We don’t like who they are, and we can figure out where they are based on the census.”
It’s true of anything, I guess: anything that’s going to lead to progress will also open us up to abuses and negative results. One of the pieces we’re publishing is about Albert-László Barabási and the implications of his current thinking about hubs and nodes, and about learning which hubs and nodes drive the action. Once you know that, you can use that knowledge to control the system, and while there are plenty of positive outcomes to anticipate from such a thing, you can easily imagine all the potential negative consequences too.
I’ve been involved in two sides of this. One is data as content, and the other is the science of large collections of things that get represented by data. On both sides of that, I’ve seen all these things: people worrying about understanding networks for how terrorist organizations are set up, un