Big Data and the End of History

2015

Publication History:

“Big Data and the End of History.” Perspecta 48 (2015),  Amnesia: 46-60.  Revised and republished in DAH, International Journal for Digital Art History 3 (2018): 21-35.   Electronic publication: https://dahj.org/article/big-data-and-the-end-of-history

The text posted here is an earlier draft and may differ from the published versions. Please cite only from the copy in print.

 

 

At the time of writing, in the summer of 2014, Big Data is a cultural trope more than a technical term. Yet the expression “Big Data” originally referred, simply, to our technical capacity to collect, store, and process increasing amounts of data at decreasing costs, and this original meaning still stands, regardless of hype, media improprieties, or semasiological confusion.[1] Historians of information technology would be hard pressed to see this as a novelty: Moore’s law, which describes a similar trend, has been known since 1965, and it holds true to this day. Yet, even more than the adoption of the term by specialists and in some professions, the “Big Data” phenomenon indicates that some crucial qualitative threshold may indeed have been crossed of late—or at least suggests a general belief that it soon will be. And there may be more to that than media hype. Today, for the first time ever in the history of humankind, data are abundant and cheap, and getting more abundant and cheaper every day. If this trend continues, one may logically infer that at some point in the future an almost infinite amount of data could be recorded, transmitted, and retrieved at almost no cost. Evidently, a state of zero-cost recording and retrieval will always be impossible; yet this is where today’s technology seems to be heading, asymptotically. But if that is so, then we must also come to the inevitable conclusion that many technologies of data compression currently in use will at some point become unnecessary, as the cost of compressing and decompressing the data (sometimes losing some in the process) will be greater than the cost of keeping the raw data in their pristine state for a very long time, or even forever. If we say data-compression technologies, we immediately think of JPEG or MP3. But let’s think outside of the box for a second.

 

Data Compression Technologies We Don’t Need Any More

Big Data, which many today see as a solution, were more often a problem throughout the history of humankind. For example, hand-processing big numbers with traditional arithmetical tools takes time and effort—and the bigger the numbers, the higher the risk of errors. Hence that glorious invention of baroque mathematics, logarithms, which use printed conversion tables to turn big numbers into small numbers, and the other way around; and, crucially, convert the multiplication of two big numbers into the addition of two smaller ones. Laplace, Napoleon’s favorite mathematician, famously said that logarithms, by “reducing to a few days the labor of many months, doubled the life of the astronomer.”[2] As well as, we may add, of many twentieth-century engineers: logarithms are at the basis of that other magical tool, the slide rule, with which engineers of the twentieth century could calculate almost everything in almost no time. But, even though I myself still studied logarithms in school for many months, I never learned to use my father’s slide rule, because by the time I was 15 I could buy a Texas Instruments pocket calculator for next to nothing, which worked much faster and more precisely than all logarithmic tables and slide rules combined. Today, logarithms are a relic of an age gone by: a fascinating chapter in the history of early modern mathematics, but one on which no astronomer or engineer would waste time. Logarithms are a technology of data compression we don’t need any more.
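
The compression at work is easy to make explicit (the figures below are my own illustration, rounded as a printed table would round them): one multiplication of two big numbers becomes one addition of two smaller ones, plus two table look-ups.

```latex
% A product of two big numbers becomes a sum of two smaller ones:
%   log10(ab) = log10(a) + log10(b)
\[
\log_{10}(3456 \times 7890) \;=\; \log_{10} 3456 + \log_{10} 7890
\;\approx\; 3.5386 + 3.8971 \;=\; 7.4357
\]
\[
10^{7.4357} \;\approx\; 27{,}268{,}000
\qquad (\text{exact product: } 27{,}267{,}840)
\]
```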

And to take another example closer to the daily life of today’s design professionals: scaled drawings in plans, elevations, and sections have been the basic tool of the designer’s trade since the Renaissance, and the geometrical rules of parallel projections were famously published in 1799 by Gaspard Monge under the name of descriptive geometry. But seen from a historical perspective, and from today’s vantage point, descriptive geometry is another cultural technology typical of a Small Data environment, as it uses parallel projections to compress big 3D objects into small, flat drawings, which can be easily recorded, stored, and transmitted on simple sheets of paper. In this instance, the compression of data is also a visible and physical one. No one could store the Seagram Building in reality—it is quite a big building—but many offices could store (and some did, in fact, store) the batch of drawings necessary to make it, and, if needed, to remake it. Today, however, using digital technologies, we can store not only a huge number of planar drawings, but also full 3D avatars of buildings, on a single memory chip—including all the data we need to simulate that building in virtual reality, or to build it in full. And technologies already exist that allow designers to operate directly in 3D, hence avoiding the mediation of planar drawings and of the geometrical projections underpinning them. In short, if buildings can be entirely notated as informational models in three dimensions from the start, the ways to represent them may change at any time, based on need, and in many cases without falling back on plans, elevations, and sections. Descriptive geometry is another cultural technology for data compression already on the way out (and in fact, few schools of architecture still teach it).
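
In purely informational terms, the operation is easy to state in code. The following minimal sketch is mine, with hypothetical coordinates standing in for a building’s geometry: a parallel projection compresses a 3D record into a 2D one simply by discarding a coordinate, and no single drawing can give the discarded data back.

```python
# Parallel (orthographic) projections as lossy data compression:
# a plan keeps (x, y) and discards z; an elevation keeps (x, z) and discards y.
# Hypothetical 3D points standing in for a building's geometry.
points_3d = [(0.0, 0.0, 0.0), (10.0, 0.0, 3.5), (10.0, 20.0, 3.5), (0.0, 20.0, 7.0)]

plan      = [(x, y) for (x, y, z) in points_3d]  # view from above: z is lost
elevation = [(x, z) for (x, y, z) in points_3d]  # view from the front: y is lost

# No single projection can restore the original: to rebuild the 3D record,
# several drawings must be cross-referenced, which is the step descriptive
# geometry codified. A full 3D model simply keeps points_3d and skips it.
print(plan)       # [(0.0, 0.0), (10.0, 0.0), (10.0, 20.0), (0.0, 20.0)]
print(elevation)  # [(0.0, 0.0), (10.0, 3.5), (10.0, 3.5), (0.0, 7.0)]
```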

The list of cultural technologies that have been with us from time immemorial, but which are being made obsolete by today’s Big Data, is already a long one. To take an unusual suspect, the alphabet is a very old and effective technology for data compression: a voice recorder, in fact, which converts an infinite number of sounds into a limited number of signs, which can be easily notated, recorded, and transmitted across space and time. This strategy worked well for centuries, and it still allows us to read transcripts of the living voice of famous people we never heard and who never wrote a line—Homer, Socrates, or Jesus Christ, for example. Yet today’s technologies allow us to record, transmit, retrieve, and process speech as sound, without converting it into alphabetical signs (for the time being, the machine still does it, unbeknownst to us; but this may change, too). Thus today we can already speak to some machines without using keyboards, and receive answers from machines that vocalize words we no longer need to read. Keyboards used to be our interface of choice to convert the infinite variations of our voice (Big Data) into a short list of standardized signs (Small Data), but this informational bottleneck, typical of Small Data environments, is already being bypassed by today’s speech recognition technologies.

If the prospect of building without drawing frightens many architects, a civilization without writing may appear more than apocalyptic—almost a contradiction in terms. Yet, in purely informational terms, most technologies of notation (from numerals to Euclidean geometry; from musical scores to Laban’s movement notation for dancers) are tools we created in order to convert complex, data-rich phenomena into stripped-down and simplified transcriptions, easier to keep, edit, and forward to others. These transcriptions are less and less necessary today as, thanks to Big Data, we can record, transmit, and manipulate digital avatars that are almost as rich in data as the originals they replace (and far richer in metadata). Of course, no digital replica will ever replace a natural phenomenon in full, as every copy implies some degree of abstraction, hence some data will always be left out or lost in the process. But in practice, once again, it is the tendency that matters, and this tendency has already started to affect today’s science, technology, and culture.

Let’s take another example—one well known among architects, as it has been at the core of digitally intelligent design for the last twenty years: the calculus-based, digital spline.[3] Calculus, another great invention of baroque mathematics, is the ultimate Small Data technology, as it compresses an infinite number of points into a single short notation, in the format y = f(x). In practice, the script of a function contains all the points we may ever need to draw or produce that line (curve, or, adding one letter, surface) at all possible scales. But let’s put ourselves, again, into a Big Data state of mind, and let’s assume we can have access to unlimited, zero-cost data storage and processing power. In that case, we could easily do away with any mathematical notation, and simply record instead a very long, dumb log: the list of the positions in space (X, Y, Z coordinates) of many points of that line—as many as necessary, perhaps a huge number of them. That file will record the individual positions in space of a cluster or cloud of points that may not appear to follow any rule or pattern—a rule or pattern that would in fact be of no use, so long as the position of each point is known, measured, and recorded, in two or three dimensions. This is exactly the kind of stuff humans don’t like, but computers do well. A long time ago we invented dimensionless Euclidean points and continuous mathematical lines to simplify nature and translate its unruliness into short, simple, elegantly compressed notations—Small Data notations, made to measure for the human mind; notations we could write down, and work with. But today’s computers do not need any of that any more. Today’s digital avant-garde has taken due notice: some of today’s digital designers have already discarded the Small Data, calculus-based spline (Bézier’s spline), which was so important for the digital style of the 1990s, and have started to use Big Data and computation to engage the messy discreteness of nature as it is, in its pristine, raw state—without the mediation of elegant, streamlined mathematical notations.
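
The two states of mind fit in a few lines of code. This is a toy contrast of my own, not any designer’s actual tool: the same curve kept either as a one-line notation or as a long, dumb log of coordinates.

```python
import math

# Small Data: the curve compressed into a short notation, y = f(x).
# One line of script contains every point of the curve, at every scale.
def f(x):
    return math.sin(x)

# Big Data: the same curve kept as a dumb log of measured positions:
# no rule, no pattern, just a long list of coordinates, recorded in full.
N = 100_000
point_log = [(i * 2 * math.pi / N, f(i * 2 * math.pi / N)) for i in range(N)]

# The notation weighs a few bytes; the log, a few megabytes. But if storage
# and retrieval cost next to nothing, the log serves just as well; and unlike
# the notation, it could also record an irregular, "unruly" cloud of points
# that no short formula would ever fit.
print(len(point_log), point_log[0], point_log[1])
```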

 

Search, Don’t Sort: The End of Classical and Modern Science

This new trend in today’s post-spline digital style may be discounted as a fad, a quirk, or an accident. It shouldn’t be. All major, pervasive changes in our visual environment are the sign of a concomitant change in our technical, economic, and scientific paradigms. Indeed, if we look back at our data-starved past from the perspective of our data-opulent present, all the instances just mentioned (and more could be added) suggest that Western science as a whole, from its Greek inception, could be seen today as a data compression technology strenuously developed over time to cope with a chronic shortage of data storage and of processing power. As the data we could record and retrieve in the past were not many, we learned to extrapolate and generalize patterns from what data we had, and to record and transmit condensed and simplified formal notations instead of the data themselves. Theories tend to be shorter in writing than the description of most events they apply to, and indeed syllogisms, then equations, then mathematical functions were, and still are, very effective technologies for data compression. They compress a long list of events that happened in the past into a very short script, generally in the format of a causal relationship, which we can use to describe all other events of the same kind, including future ones. In his last book, the Discorsi e Dimostrazioni Matematiche, which had to be smuggled out of Tuscany and printed in a Protestant country to escape the Inquisition’s censorship, Galileo reported and illustrated a number of experiments he had made to study how beams break under load.[4] But we need not repeat any of his experiments, nor any other experiment, today, to determine how standard beams will break under most standard loads, because, generalizing from Galileo’s experiments, and from many more that followed, we have obtained a handful of very general laws, which all engineers study in school: a few clean lines of mathematical script, easy to commit to memory, which derive from all the beams that broke in the past, and describe how beams will break in the future under similar conditions. This is how modern science worked—till now.
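
One such “clean line of script,” in its modern textbook form rather than in anything Galileo wrote: for a rectangular cantilever of breadth b, depth h, and length L, loaded by a force P at its free end, elastic beam theory predicts failure, to a first approximation, when the bending stress at the outermost fiber reaches the material’s strength, written σ_max below. Every breaking test ever recorded is compressed into one expression.

```latex
\[
\sigma \;=\; \frac{Mc}{I} \;=\; \frac{(PL)\,(h/2)}{bh^{3}/12} \;=\; \frac{6PL}{bh^{2}}
\qquad\Longrightarrow\qquad
P_{\mathrm{fail}} \;\approx\; \frac{\sigma_{\max}\, b\, h^{2}}{6L}
\]
```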

For let’s imagine, again, that we can collect an almost infinite amount of data, keep it forever, and search it at will, and at no cost. We could then assume that every one of those experiments or, more generally, the actual breaking of every beam that ever broke, could be notated, measured, and recorded. In that case, for every future event we are trying to predict, we could expect to find and retrieve a precedent, and the account of that past event would allow us to describe the forthcoming one—without any mathematical formula, function, or calculation. The spirit of Big Data, if there is one, is probably quite a simple one, and it reads like this: whatever happened before, if it has been recorded, and if it can be retrieved, will simply happen again, whenever the same conditions reoccur. This is not very different from what Galileo and Newton thought. But Galileo and Newton did not have Big Data; in fact, they often had very few data indeed. Today, instead of calculating predictions based on mathematical laws and formulas, using Big Data we can simply search for a precedent for the case we are trying to predict, and retrieve it from the almost infinite, universal archive of all relevant precedents that ever took place.[5] When that happens, search can replace the method of modern science in its entirety.
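
In computational terms, this “spirit of Big Data” is, at its crudest, nearest-neighbor retrieval. Here is a minimal sketch of my own, with invented numbers: instead of applying a formula, the breaking load of a new beam is predicted by looking up the closest recorded precedent.

```python
# Prediction by precedent: no law, no formula; just search the archive for
# the recorded case most similar to the one at hand. Each record is
# (breadth, depth, length, observed breaking load); all numbers invented.
archive = [
    (0.10, 0.20, 2.0, 24.0),
    (0.10, 0.30, 2.0, 54.0),
    (0.15, 0.20, 3.0, 24.0),
    (0.20, 0.40, 4.0, 96.0),
]

def predict_by_precedent(b, h, L):
    """Return the breaking load observed in the nearest recorded precedent."""
    nearest = min(
        archive,
        key=lambda rec: (rec[0] - b) ** 2 + (rec[1] - h) ** 2 + (rec[2] - L) ** 2,
    )
    return nearest[3]

# A "new" beam close to the second precedent simply inherits its outcome;
# the bigger the archive, the closer the nearest precedent will be.
print(predict_by_precedent(0.11, 0.29, 2.1))  # -> 54.0
```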

This apparently weird idea is not science fiction. This is already happening, in some muted, embryonic way, in several branches of the natural sciences, and more openly, for example, in weather forecasting. Once again, this may not be either a rational or a palatable fact, but it is a tendency—it is in the air, whether we like to admit it or not. And sure enough, some historians of science have already started to investigate the matter—with much perplexity and many reservations, as we could expect.[6] Indeed, from an even more general, philosophical point of view, mathematical abstractions such as the laws of mechanics or of gravitation, for example, or any other grand theory of causation, are not only practical tools of prediction, but also, and perhaps first and foremost, ways for the human mind to make sense of the world—and some could argue, as many did in the past, that the laws thus discovered are Laws of Nature, disclosing and unveiling its inner workings. Conversely, if abstraction and formalization (i.e., most of classical and modern science, in the Aristotelian and Galilean tradition) are seen as merely contingent data compression technologies, one could argue that in the absence of the technical need to compress data in that particular way, the human mind can find many other ways to relate to, or interpret, nature. Epics, myth, religion, and magic, for example, offer vivid historical examples of alternative, non-scientific methods; and nobody can prove that the human mind is or ever was hard-wired for modern experimental science. Many post-modern philosophers, for example, would strongly gainsay that notion.

This is probably not a coincidence, as the post-modern science of Big Data marks a major shift in the history of the scientific method. Using the Big Data approach to science (i.e., prediction by search and retrieval of precedent, instead of prediction by the transmission of general laws), modern determinism is not abandoned, but employed at a new, granular scale. Western science used to apply causality to bigger and bigger groups, or sets, or classes of events—and the bigger the group, the more powerful, the more elegant, the more universal the law that applied to it. Science, as we knew it, tended to universal laws—laws that bear on as many different cases as possible. Today’s new science of Big Data is just the opposite: using information retrieval and the search for precedent, Big Data causality can be applied to smaller and smaller sets, and it works best when the sets it refers to are the smallest. Indeed, the new science of Big Data only works in full when it does not apply to a class or group of events, but only to one, specific, individual case—the one we are looking for.[7] In that, too, Big Data represents a complete reversal of the classical (Aristotelian, Scholastic, and modern) scientific tradition, which always held that individual events cannot be the object of science: most Western science only dealt with what Aristotle called forms (and today we more often call classes, universals, sets, or groups).[8] In social science and in economics, this novel Big Data granularity means that instead of referring to generic groups, social and economic metrics can and will increasingly relate to specific, individual cases. This points to a brave new world where fixed prices, for example, which were introduced during the industrial revolution, will cease to exist (and famously, this is already happening);[9] likewise, the cost of medical insurance, calculated as it is today on the basis of actuarial and statistical averages, could become irrelevant, because it will be possible to predict, at the granular level, that some individuals will never have medical expenses, hence they will never need any medical insurance, and that some will have too many medical expenses, hence no one will ever sell them any medical insurance. This is a frightening world, because the individual who becomes the object of this new science of granular prediction is no longer a statistical abstraction—it is each of us, individually. This may be problematic from a philosophical and religious point of view, as it challenges traditional ideas of determinism and free will; but in more practical terms, it is also simply incompatible with most principles of a liberal society and of a market economy in the traditional, modern sense of both terms.

In the field of the natural sciences, however, the picture is quite different, and apparently less frightening. Following the old (i.e., the “modern”) scientific method, the natural sciences too used to proceed by generalization and abstraction, applying more and more general formulas to larger and larger swaths of the natural world. To the contrary, using the new method of granular retrieval of precedent we are no longer limited to predicting vast and general patterns—we can try and predict smaller and smaller events, up to the most singular ones. As recent works by Neri Oxman and Achim Menges, among others, have shown, we can now design structural materials at minuscule, almost molecular scales, or quantify and take into account the infinite, minute, and accidental variations embedded in all natural materials.[10] This runs counter to the method of modern science, which traditionally assimilated all materials, natural and artificial alike, to homogeneous chunks of continuous matter: for the last two centuries the main mathematical tool at our disposal to describe structural and material deformations was differential calculus, which is a mathematics of continuity, and abhors singularities; to allow for mathematical modeling, i.e., a quantitative prediction of their behavior, new industrial materials were designed to be isotropic and continuous, and natural materials were doctored, processed, and tinkered with in order to achieve some degree of homogeneity. To the contrary, using the granularity of Big Data we may now model the structural behavior of each individual part in a hyper-complex, irregular, and discontinuous 3D mesh, including the behavior of the one part we are interested in—that which will fail, one day. Used this way, the new science of granular prediction does not constrain but liberates, and almost animates, inorganic matter.[11]
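
The contrast between the two habits of mind can also be sketched in code. The toy model below is my own, with invented numbers (not Oxman’s or Menges’s actual tools): a chain of structural elements recorded element by element, each with its own stiffness and strength, so that the model can point to the one link that will fail first, which a homogenized average cannot do.

```python
# Granular vs. continuum: one record per element, instead of one averaged
# material constant for the whole member. All numbers invented.
elements = [  # (stiffness k, strength s) for each link of a chain in series
    (100.0, 5.0), (95.0, 4.2), (102.0, 5.1), (88.0, 3.9), (99.0, 4.8),
]
F = 1.0  # axial load carried by every link of the chain

# Continuum habit: average the properties; one number for the whole member.
avg_strength = sum(s for _, s in elements) / len(elements)

# Granular habit: interrogate every element individually.
elongation = sum(F / k for k, _ in elements)  # each link stretches by F / k
weakest, margin = min(
    ((i, s / F) for i, (_, s) in enumerate(elements)), key=lambda t: t[1]
)

print(f"averaged strength: {avg_strength:.2f}")  # one number, nothing local
print(f"total elongation: {elongation:.4f}")
print(f"link {weakest} fails first (safety factor {margin:.1f})")
```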

Search, Don’t Tell: The End of Classical and Modern History

Since data scarcity has been a truly universal human condition across all ages, cultures, and civilizations, we can expect to find similar strategies of data compression embedded in most, if not all, cultural technologies we have been familiar with to this day. Historiography, or the writing of history, codified as an academic discipline and cultural practice in the course of the nineteenth century, is no exception. Like the modern scientist, the modern historiographer must infer a theory (in the case of history, more often an argument, or a story) from a vast archive of findings. These data are not reported as such (even though today facts and sources are often listed or referred to in footnotes), but they are subsumed and transfigured, as it were, in a larger narrative—namely, a history—that derives from the original findings, but encompasses and describes them in more general terms. In this sense the historiographer, like a bard or ancestral storyteller, must condense and distill an accumulation of accidental experiences into one streamlined sequence or story, following a linear plot easier to remember and to recount, whereas most of the factual events that inspired it are destined to remain anecdotal—i.e., literally, unsaid. Halfway between the storyteller’s plot and the scientist’s theory, the historiographer’s narration, or history, weaves endless anecdotes into one meaningful narrative. This narrative, once again, functions as a lossy data-compression technology: only the story thus construed will be recorded and transmitted, and will bear and convey memories, wisdom, or meaning, whereas most of the individual events, experiences, or (in the Aristotelian sense of the term) accidents that inspired it will be discarded and forgotten.

But, as Walter Benjamin had already intuited,[12] today’s increasingly abundant dissemination of raw information goes against this ancestral strategy of story-building and story-telling. Let’s imagine, once again pushing the argument to its limits, that a universal archive of historical data may be collected, recorded, transmitted, and searched at will, by all and forever. The term “historical” would become ipso facto obsolete, as all facts must have occurred at some point in time in order to have been recorded, hence all data in storage would be “historical”, and none more so than any other. And since Google has already proven that no two searches are the same, every search in this universal archive would likely always yield new results—based on user preference, context, endless more or less secret parameters, and the sheer complexity and whim of search algorithms. Consequently, at that point no “narrative”, theory, story, or sequence would be stronger than any other; and in fact no narrative, theory, sequence, or story would even be needed or warranted any more. Only the data would speak—forever, and whenever asked, never mind by whom, and every time anew.

Again, it is easy to dismiss this Big Data scenario (which, it will be noted, is structurally similar to the post-scientific, prediction-by-retrieval paradigm I outlined earlier) as a sci-fi nightmare. Yet, once again, signs of this impending change can already be detected in today’s technology and culture, and they have seeped through contemporary social practices. Take this example, which will sound familiar to many scholars of my generation. For all undergraduates studying architectural history in Italy (and elsewhere, for that matter) in the late 1970s or early 1980s, the architectural history survey was basically a book, sometimes two, outlining a much simplified, teleological, and ideological narrative of the rise of the Modern Movement in architecture (in Pevsner’s and Giedion’s historiographical tradition, that was the Hegelian and Marxian story of a linear rise to culmination, with no fall ever to ensue). Our textbooks (no names mentioned) contained a limited selection of small, and often very poor, black-and-white pictures of the buildings under discussion, which in most cases were the only visual evidence available of buildings that few students, and indeed not many of our teachers, had ever seen. This is why we went to class, when we could: not so much to hear but to see. Giovanni Klaus Koenig’s lectures were hugely popular across the whole University of Florence (and beyond), thanks to Koenig’s own unparalleled talent as a jocular storyteller, but also to the color slides he showed, which he had made himself, traveling by car to Germany, Austria, and Switzerland (therefore no images of French or American buildings, for example, were ever shown). Most of the learning we could glean in such a techno-cultural environment was indeed, and could not have been other than, a narrative: a story, or a theory, which we got to know much better than any of those famous buildings upon which those stories were supposedly predicated. As for the buildings themselves, some we could visit when traveling, and some we would get a glimpse of, somehow, through a handful of color photographs in the relatively few books we could peruse at university libraries or buy in bookshops. A quarter of a century after Malraux’s Musée Imaginaire, even the most famous buildings of the twentieth century were then still known exclusively through a very limited repertoire of authorial pictures, often taken by a few well-known photographers working in collaboration with the architects. Before I first traveled to Berlin as a student, I knew Ezra Stoller’s photographs better than Mies van der Rohe’s buildings.

Compare that situation—which was the norm throughout the age of printing and of modern mechanical technologies—to today’s wealth of visual and verbal documentation, available at a click of the mouse, by tapping on a touch-screen or—as I just did—by saying “Mies van der Rohe” to my smartphone. One generation ago, the same scant data were imparted to all. Today, information is so abundant and easily searchable that each user can find her or his own. But due to the largely unauthorial, raw, or crowdsourced nature of most of this wealth of information, each person doing the same search will likely come up with slightly different results, and sometimes with conflicting or incompatible information. That is indeed the way Big Data works, for scientists no less than for students or for the general public: Big Data is useful and usable only on average, and in the aggregate.[13] Failing that, each end-user will construct her or his own argument based on a random or arbitrary selection from an extraordinary array of data, and each selection will likely be different from all others. Each narrative or argument thus put together will therefore tend to idiosyncrasy and ephemerality. Anyone who has tried to teach a research seminar in the traditional way (i.e., expounding a sequential argument) in front of a group of doctoral students busily ferreting out odd happenstances, photoshopped images, and wrong information from their tablets in real time will be familiar with this predicament. This does not mean that a thousand anonymous pictures on Instagram and a promenade through Google Earth are worth less than a single shot by Iwan Baan—in fact, in statistical terms, the opposite is true. But it does mean that many cultural habits we used to take for granted were in fact the accidental fallout of data skimping, and are already incompatible with the data-rich environment we live in. Whether we like it or not, when an infinite number of facts are equally available for anyone’s perusal, search, and retrieval, we may no longer need theories, stories, histories, or narratives to condense or distill data, and to present them in a linear, clean, and memorable array. Again, one may argue that we will always need theories and stories for a number of other reasons, but—as mentioned earlier—that is difficult to prove.

And so it would appear that many anti-modern and post-modern ideological invocations or vaticinations, from Nietzsche’s “eternal recurrence” and Lyotard’s “fragmentation of master narratives” to Baudrillard’s or Fukuyama’s “end of history,” to name a few, all came, in retrospect, a bit too early—but all may soon be singularly vindicated by technological change.[14]  What ideology could not accomplish in the twentieth century, technology is making inevitable in the twenty-first.  If Search is the new science, Big Data is the new history.  But not the history we once knew.

 

   

[1] On the origin and different meanings of the expression, see Viktor Mayer-Schönberger and Kenneth Cukier, Big Data: A Revolution That Will Transform How We Live, Work, and Think (Boston and New York: Houghton Mifflin Harcourt, 2013), 6 and footnotes.

[2] Pierre-Simon de Laplace, Exposition du système du monde (Paris: Imprimerie du Cercle-Social, IV-VI [1796-98]), vol. II, 5, IV, p. 266 (“[Kepler…] eut dans ses dernières années, l’avantage de voir naître et de profiter de la découverte des logarythmes, artifice admirable, dû à Neper, baron écossais ; et qui, réduisant à quelques heures, le travail de plusieurs mois, double, si l’on peut ainsi dire, la vie des astronomes, et leur épargne les erreurs et les dégoûts inséparables des longs calculs ; invention d’autant plus satisfaisante pour l’esprit humain, qu’il l’a tirée en entier, de son propre fonds. Dans les arts, l’homme emploie les matériaux et les forces de la nature, pour accroître sa puissance ; mais ici, tout est son ouvrage.”)

[3] I discussed this in “Breaking the Curve. Big Data and Digital Design,”  Artforum 52,6 (2014): 168-173. 

[4] Galileo Galilei, Discorsi e Dimostrazioni Matematiche intorno à due nuove scienze, attinenti alla mecanica e i movimenti locali (Leiden: Elzevir, 1638), 116-133.

[5] In July 2008 a groundbreaking article by Chris Anderson in Wired first argued for a scientific revolution brought about by Big Data (“The End of Theory,” Wired 16, 7 (July 2008): 108-09; that issue of Wired was titled “The End of Science,” although the other essays in the “Feature” section of the magazine did little to corroborate Anderson’s vivid arguments). The main point in Anderson’s article was that ubiquitous data collection and randomized data mining would enable researchers to discover unsuspected correlations between series of events, and to predict future events without any understanding of their causes (hence without any need for scientific theories). A debate followed and Anderson retracted some of his conclusions (Mayer-Schönberger and Cukier, 2013, 70-72 and footnotes). From an epistemological point of view, however, what was meant by “correlation” in that debate did not differ from the modern notion of causality, other than in the practicalities of the collection of much bigger sets of data, and in today’s much faster technologies for data processing. Both classical causation and today’s computational “correlation” posit quantitative, cause-to-effect relationships between phenomena; and at the beginning of the scientific enquiry, both the old (manual) way and today’s computational way need some hypotheses to select sets of data among which even unexpected correlations may emerge. Evidently, today’s computational processes make the testing of any such hypotheses much faster and more effective, but the methodological and qualitative changes that would follow from such faster feedback loops between hypotheses and verification were not part of that discussion. A somewhat similar but more promising debate is now taking place in some branches of applied technologies, such as structural engineering: see Carpo, “The New Science of Form Searching,” forthcoming in AD, Architectural Design 85 (2015), 5 (Material Synthesis: Fusing the Physical and the Computational, guest-edited by Achim Menges).

[6] See D. Napoletani, M. Panza, and D.C. Struppa, “Agnostic Science: Towards a Philosophy of Data Analysis,” Foundations of Science, 16 (2011), 1, 1-20.

[7] As search always starts with, and aims at, one individual event, the science of search is essentially a science of singularities; but the result of each search is always a cloud of many events, which must be compounded, averaged, and aggregated using statistical tools. Thus there are no limits to the level of “precision” of a search (a lower level of precision, i.e., less intension, will generate more hits, i.e., a larger extension in the definition of the set).

[8] On this aspect of Aristotelian science, see Carlo Diano, Forma ed evento. Principi per una interpretazione del mondo greco (Venezia: Neri Pozza, 1952). The rejection of modern science as a science of universals is central to the post-modern philosophy of Gilles Deleuze and Félix Guattari. In Mille Plateaux, in particular, Deleuze and Guattari opposed the “royal science” of modernity, based on discretization (“striated space”), to the “smooth space” of “nomad sciences,” based on “nonmetric, acentered, rhizomatic multiplicities that occupy space without counting it and can be explored only by legwork,” which “seize and determine singularities in the matter, instead of constituting a general form… they effect individuations through events or haecceities, not through the object as a compound of matter and form.” Deleuze and Guattari saw the model of nomad sciences in the artisan lore of medieval master builders, i.e., in the past, before the rise of modern science; and they had no foreboding of the then nascent new technologies that would inspire digital makers one generation later. See Gilles Deleuze and Félix Guattari, A Thousand Plateaus, tr. B. Massumi (London and New York: Continuum, 2004), ch. 12, “Treatise on Nomadology,” 406-09 and 450-51 (first publ. as Mille Plateaux, Paris: Les Éditions de Minuit, 1980; English transl. first publ. Minneapolis: University of Minnesota Press, 1987).

[9] See Carpo, “Micro-managing Messiness: Pricing, and the Costs of a Digital Non-Standard Society,”  Perspecta 47 (2014), Money: 219-226.

[10] See Neri Oxman, “Programming Matter,” AD, Architectural Design 82 (2012), 2 (Material Computation: Higher Integration in Morphogenetic Design, guest-edited by Achim Menges), 88-95, on variable property materials; Achim Menges, “Material Resourcefulness: Activating Material Information in Computational Design,” AD, Architectural Design 82 (2012), 2, 34-43, on non-standard structural components in natural wood.

[11] This vindicates the premonitions of Ilya Prigogine, another post-modern thinker whose ideas were a powerful source of inspiration for the first generation of digital innovators in the 1990s. See in particular Ilya Prigogine and Isabelle Stengers, Order out of Chaos: Man’s New Dialogue with Nature (New York: Bantam Books, 1984; first published in French as La Nouvelle Alliance: Métamorphose de la science, Paris: Gallimard, 1979).

[12] Walter Benjamin, “The Storyteller: Reflections on the Works of Nikolai Leskov,” in Illuminations: Essays and Reflections, transl. Harry Zohn, ed. Hannah Arendt (New York: Schocken Books, 1968), 83-109. First published as “Der Erzähler. Betrachtungen zum Werk Nikolai Lesskows” in Orient und Okzident, October 1936. Benjamin, however, considers the ancestral storyteller, as well as the oral chronicler, as conveyors of raw data, and sees only the modern novel, and modern historiography, as abstract, simplified, linear narratives that are construed independently from the events upon which they are based and from which they derive. After the works of Marshall McLuhan and particularly of Walter Ong on the cultures of orality, it is easier today to see the bard’s, or storyteller’s, recitals as tools of abstraction and memory devices.

[13] See note 7, above.

[14] Nietzsche’s first mention of “eternal recurrence” is in aphorism 341 of The Gay Science (Die fröhliche Wissenschaft, 1882-87). Lyotard spoke of the “décomposition des grands récits,” or “métarécits”: Jean-François Lyotard, La condition postmoderne (Paris: Les Éditions de Minuit, 1979), 31. The “end of history” may have been first proclaimed by Jean Baudrillard, Simulacres et simulation (Paris: Galilée, 1981), 62-76 (see in particular 70: “l’histoire est notre référentiel perdu, c’est-à-dire notre mythe”). Francis Fukuyama, The End of History and The Last Man (New York: Free Press; Toronto: Maxwell Macmillan Canada, 1992). See also Fukuyama, “The End of History?” The National Interest 16 (Summer 1989): 3-18.
