Not an obvious conclusion, I'll agree. However, Chris Anderson, editor of Wired, presents the argument like this: as all sorts of data accumulate into a vast ocean of petabytes, our ability to synthesize it all into elegant theories and laws will disappear. The story is the cover of this month's issue of Wired, but I came across it in a newsletter from The Edge, a group of thinkers trying to promote a "third culture" of online intellectual thought.
Anderson's argument isn't really that the scientific method will disappear, but rather that correlation will become as good as it gets in terms of analyzing real-world data. Everything will be too messy, too noisy, and changing too quickly for proper hypotheses and theorems. As Anderson puts it, it will be "the end of theory."
The nice thing about reading this on Edge is that the newsletter comes with several critical responses included from "The Reality Club," which counts among its members thinkers like George Dyson, Kevin Kelly, and Stewart Brand. But since you, reader, are among the consumers and producers of most of these masses of data, I say the vote should lie with you: does Google's brute-force approach to data hoarding spell the end of scientific elegance?