Breaking the Curve. Big Data and Digital Design

2014

Publication History:

“Breaking the Curve. Big Data and Digital Design.” Artforum 52, no. 6 (2014): 168–173.

The text published here is an earlier draft and differs from the published version. Please cite only from the copy in print.

The digital is all about variation. We know from daily experience that the many digitally based media we use today, from text to images to music, are permanently in flux; their variations can be designed by one or more end users (think of a Wikipedia entry), or by machinic algorithms (think of a Google search), and may at times appear to be entirely out of control, changing in some random and inscrutable way. The same logic applies to the design of physical objects, from teaspoons to entire buildings. Since the early 1990s, the pioneers of digitally intelligent architecture have claimed that computational tools for design and fabrication will be able to mass-produce variations at no extra cost: Economies of scale do not apply in a virtual environment, and (within given limits) digitally produced, mass-customized objects, all individually different, should cost no more than standardized and mass-produced ones, all identical. In this sense, digital culture and technology should be seen as the ultimate embodiment of postmodernity, of an infinitely flexible world—a postmodern dream come true. But in architecture and design, things have not yet worked out that way. For right at the start, the digital turn in architecture was hijacked by one class of tools that soon outweighed all others to become the protagonist—almost the monopolist—of the new digital design scene: spline modelers.

Spline modelers are those magical instruments, now embedded in most software for computer graphics and computer-aided design, that translate every random cluster of points, every doodle or uncertain stroke of the hand, into perfectly smooth, curving, and continuous lines. The mathematics of spline modelers, however, predates the rise of today’s digital technologies: Its present form was invented in the late ’50s and early ’60s by two French engineers, Pierre Bézier and Paul de Casteljau, working for the Renault and Citroën carmakers, respectively. A few years apart, Bézier and de Casteljau found two slightly different mathematical ways to join a set of nonaligned geometrical points with a continuous curve, called a spline in English. This operation was previously performed by hand, typically by joining and bending flexible slats of wood or rubber; using Bézier’s and de Casteljau’s formulas, it became possible to calculate that curve mathematically, without performing any manual operation. Splines, notated as mathematical functions, could hence be recorded and transmitted as mere sequences of letters and numbers and, when necessary, recalculated, edited, or tweaked via the assignment of different values to some parameters. In turn, the great novelty of the early ’90s was that innovations in early software for computer-aided design made some of this mathematics easily accessible through graphic user interfaces: control points and vectors that anyone could easily edit, move, and drag on the screen with the click of a mouse. As this game turned out to be faster and more intuitive than the high school calculus on which it was based, such software tools, dubbed spline modelers, soon became hugely popular among digital designers, and a swooping, spline-based style famously became the hallmark of early computer-based design.
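The core of de Casteljau’s method is simple enough to sketch in a few lines of Python. The illustration below is a minimal sketch, not production geometry code, and the control points and resolution are arbitrary, chosen only for the example: a point on the curve is found by repeatedly interpolating between control points, and dragging any one control point (the “parameters” of the notation) regenerates the entire curve.

```python
# A minimal illustration of de Casteljau's algorithm: a point on a
# Bezier curve is found by repeated linear interpolation between
# control points. The sample points below are illustrative only.

def lerp(p, q, t):
    """Linear interpolation between two 2-D points."""
    return (p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1]))

def de_casteljau(control_points, t):
    """Evaluate the Bezier curve defined by control_points at t in [0, 1]."""
    points = list(control_points)
    while len(points) > 1:
        points = [lerp(points[i], points[i + 1], t)
                  for i in range(len(points) - 1)]
    return points[0]

# Four nonaligned points define a cubic Bezier; editing any of them
# recalculates the whole smooth, continuous curve.
controls = [(0.0, 0.0), (1.0, 2.0), (3.0, 3.0), (4.0, 0.0)]
curve = [de_casteljau(controls, i / 100) for i in range(101)]
```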

Bézier’s and de Casteljau’s splines are pure mathematical objects. As such, they do not fit with the phenomenological world we inhabit; designers using spline modelers “model” reality by converting it into a stripped-down mathematical script, and the continuous lines and uniform surfaces they draw or make are ultimately only a material approximation of the mathematical functions that computers have calculated for them. Of course, not every digitally intelligent designer in the ’90s was a pure spline-maker: Greg Lynn and Bernard Cache explicitly claimed to use calculus as a primary tool of design (the former introduced the now-legendary term blob to describe his sinuous shapes, exemplified by his Embryological Houses [1997–2001] and the almost coeval design for his Alessi tea and coffee set), whereas Frank Gehry, for example, used computers to scan, measure, notate, and build irregular, nongeometric three-dimensional shapes. From its inauguration in 1997, Gehry’s Guggenheim Bilbao was hailed as a global icon of the new digitally driven architectural style. But as most digital designers of the era (including those working for Gehry) used spline modelers extensively to even out the final lines and surfaces of their designs, the divide between free-form shapes and mathematical splines was often a tenuous one, and not easily perceived. In either case, if one just looks, what one sees is, simply, the sweeping arcs that were so widely noted at that time. Fast-forward to the end of the first decade of the 2000s. The Internet boom famously went bust in 2001, and the new emphasis on participation and interactivity that marked the digital domain in the early years of the millennium (the so-called Web 2.0) never found much currency among designers. At the same time, the spline-dominated, curvilinear style that came to be associated with digital design in the course of the ’90s lived on, still successfully practiced by some of the early pioneers and handed down to a new generation of younger digital designers. Spline making has been repeatedly celebrated as the quintessential style of the digital age (more recently under the slightly misleading title parametricism), and sure enough, with technical progress, many visionary predictions from the digital ’90s are becoming a reality. Big, streamlined surfaces can now be built at more and more affordable prices, and recent projects by Zaha Hadid Architects and others propose to deploy this sinuous language at increasingly urban scales.

Today, however, another trend gaining momentum among technologists and scientists has started to affect digital design in many different ways, both overt and covert. The notion that computers can manage and process huge troves of data is nothing new, but the “big data” approach to computation (which scientists tend to call, more discreetly, data analysis) may be more disruptive than it appears. For throughout its history, Western science and technology have had to make do with very little data indeed. Ars longa, vita brevis: We can only perform a given number of experiments in a lifetime, and we can only record and transmit a chosen few of them. Before the invention of print, the amount of information that could be relayed through space and time was so limited that it is a miracle that any scientific tradition was established at all. Even in modern times, data transmission through print remained relatively cumbersome and expensive, and had to be limited to a careful selection of high-added-value information. To alleviate this chronic shortage of retrievable data, Western science has developed increasingly effective strategies over time to compress as much data as possible into the shortest and simplest notations. This was mostly done by dint of generalizations—handing down a few universal principles rather than endless descriptive records of as many disparate facts. The ancients used alphabetical words and phrases for this purpose, while the moderns found it more convenient to use numbers; thus principles were first recorded as propositions and syllogisms, then as equations and functions.

The logic at play, however, was the same: We observe a certain regularity among some phenomena, and out of that regularity we extrapolate a rule, as general as possible. That rule summarizes some aspects common to many events we have witnessed, and we expect it will help us predict similar future ones. This is the way modern science has worked to this day. But imagine that all events that ever happened could be recorded and transmitted in their entirety and at will, thanks to the unlimited power of digital computation. In that case, evidently, there would be no need to replace a record of actual events with their partial and abridged, laboriously compressed transcriptions in the form of mathematical laws or rules. The best way to predict a future event in a given set of circumstances would then be, simply, to sift through this database of past evidence and look for an exact precedent. Whatever happened before (if known) would simply happen again, whenever the same conditions recurred; retrievable data would then replace rules, and search could replace predictive science. If this appears far-fetched, consider that similar strategies have already proven successful in various applied sciences, such as weather forecasting, and epistemologists have started to speak—with some perplexity—of data analysis as a new kind of “agnostic science,” which can make effective predictions without any understanding of the reasons or patterns (or laws or principles) we used to need to make some sense of the world.
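The contrast between the two paradigms can be sketched in a few lines of Python. In the toy example below, all records, names, and the distance measure are hypothetical placeholders: prediction here is not a fitted law applied to new inputs, but a search through stored precedents for the nearest past event.

```python
# A toy contrast between the two paradigms: instead of a compressed
# rule (a fitted law), prediction is a search through stored records.
# All data and the distance measure are hypothetical placeholders.

import math

# Every observed event, kept whole: (conditions, outcome).
precedents = [
    ((10.0, 0.3), 4.2),
    ((12.5, 0.7), 6.1),
    ((11.0, 0.5), 5.0),
    # ... in the "big data" scenario, millions more records.
]

def predict_by_search(conditions):
    """Predict by retrieving the closest past precedent: no law, no model."""
    def distance(record):
        past_conditions, _ = record
        return math.dist(past_conditions, conditions)
    _, outcome = min(precedents, key=distance)
    return outcome

print(predict_by_search((11.2, 0.4)))  # the outcome of the nearest recorded event
```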

This postscientific paradigm has already started to renew and retool contemporary digital design. Consider the mathematical notation of a continuous line, or spline, mentioned above, in the well-known format y = f(x). How many points does any such notation include and describe? Plenty: an infinite number of them. In practice, that script contains all the points we would ever need to draw or produce that line at all possible scales. But let’s assume we have access to unlimited, zero-cost data storage and processing power. In that case we could easily do away with any synthetic mathematical notation and record instead a very long, dumb log: the list of the positions in space (x-, y-, z-coordinates) of as many points of that line as necessary. The resulting cluster of points would not appear to follow any rule or pattern, but it would not need to, so long as each point is duly identified, tagged, and registered. This is exactly the kind of stuff humans don’t like, but computers do well.
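A minimal sketch of this difference, again in Python (the function and the resolution are arbitrary, chosen only for the example): the same line can be held either as one short mathematical script or as a long, dumb log of tagged coordinates.

```python
# The same line, notated two ways. The short script y = f(x) holds
# infinitely many points; the "dumb log" simply enumerates as many of
# them as the task at hand requires. The function is arbitrary.

def f(x):
    # One compact mathematical script: every point, at every scale.
    return 0.5 * x ** 2 - x + 1.0

# The data-driven alternative: a long, patternless-looking list of
# coordinates, trivial for a computer to store, search, and retrieve.
resolution = 10_000            # as many points "as necessary"
log = []
for i in range(resolution):
    x = i / 1000.0             # sample spacing chosen for the scale at hand
    log.append((i, x, f(x)))   # each point duly identified, tagged, registered
```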

Some digital designers have been testing a similar way of working for a while now, using an alternative to spline modeling that is based on “subdivisions,” or discrete fragments that can be made as small as needed, and which today are increasingly displayed in their apparently disjointed and fragmentary state. Structural engineers have long been using a related approach, called finite element analysis, and as the volumetric units of design and calculation are called voxels, the style resulting from this mode of composition is often called voxelization. Among the earliest examples of objects designed with this approach are the T1 Chair Studies produced by Philippe Morel of EZCT Architecture & Design Research in 2004.
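The principle behind voxelization can be sketched in a few lines of Python, assuming a toy shape and grid that are not drawn from any of the projects above: instead of a smooth, closed-form surface, the object is recorded as a grid of discrete on/off cells, refined as far as needed.

```python
# A toy voxelization: the shape is recorded as a grid of discrete
# cells (voxels) rather than a continuous mathematical surface.
# The sphere and the grid resolution are illustrative only.

import itertools

def inside_sphere(x, y, z, radius=1.0):
    """Implicit test: does this point fall inside the sphere?"""
    return x * x + y * y + z * z <= radius * radius

n = 32                     # voxels per axis; double it to refine
step = 2.0 / n             # the grid spans [-1, 1] on each axis
voxels = set()
for i, j, k in itertools.product(range(n), repeat=3):
    # Sample the cell center; mark the voxel if it lies inside the shape.
    x = -1 + (i + 0.5) * step
    y = -1 + (j + 0.5) * step
    z = -1 + (k + 0.5) * step
    if inside_sphere(x, y, z):
        voxels.add((i, j, k))
# Halving `step` makes the discrete fragments "as small as needed".
```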

Other examples could follow, but the spirit of the game is the same: In all such instances, designers use “big data” to notate reality as it appears at any chosen scale, without having to convert it into simplified and scalable mathematical notations or laws. The inherent discreteness of nature (which, after all, is not made of dimensionless Euclidean points nor of continuous mathematical lines but of distinct chunks of matter, all the way down to molecules, atoms, electrons, etc.) is then engaged as such, ideally, or in practice as close to its material structure as needed, with all of the apparent randomness and irregularity that will inevitably appear at each scale of resolution. Designers can use a similar data-driven, “dumb” approach to test complex quantitative phenomena (structural design, energy performance, patterns of use and occupation) without needing to interpret them through cause-and-effect, deterministic rules or models. This means that design choices that used to be based on analytic calculations can now be made by trial and error, or even by intuition. Digital designers are discovering that they may often learn and design by making, just as artisans always have, but now at much bigger scales. Using the power of digital simulations, a designer can make and break more chairs, beams, or roofs in a few minutes on a screen than a traditional craftsman made and broke in a lifetime, learning—often tacitly—from this experience.
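What “making and breaking on a screen” might look like in its simplest possible form can be sketched as a purely illustrative Python loop. The load, span, strength, and the elementary bending check below are invented for the example; real structural design involves far more than this. The point is only the logic: survivors are found by search, not by solving a design equation analytically.

```python
# A toy "make and break" loop: generate many candidate beams, test each
# against a crude simulated failure criterion, keep what survives.
# All numbers are invented; the check is the elementary bending test
# for a rectangular section, used here only as an illustration.

import random

SPAN = 4.0          # m, simply supported beam
LOAD = 5_000.0      # N, point load at midspan
STRENGTH = 2.0e7    # Pa, allowable bending stress (illustrative)

def breaks(width, depth):
    """Crude test: does peak bending stress exceed the allowable stress?"""
    moment = LOAD * SPAN / 4.0                  # max moment, midspan point load
    section_modulus = width * depth ** 2 / 6.0  # rectangular cross-section
    return moment / section_modulus > STRENGTH

survivors = []
for _ in range(10_000):                  # break ten thousand beams in a blink
    width = random.uniform(0.05, 0.30)   # m
    depth = random.uniform(0.10, 0.60)   # m
    if not breaks(width, depth):
        survivors.append((width, depth))

# Pick, say, the lightest surviving section: a choice made by trial
# and error over simulated failures, not by analytic calculation.
best = min(survivors, key=lambda wd: wd[0] * wd[1])
```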

The tools we use inevitably feed back into the things we make. And so today’s digital style increasingly represents the heuristic and holistic logic of this process, as a metaphor and image as well as a material index or trace of this novel—and at the same time ancestral—way of making. In hindsight, the curving forms of the first digital age appear today more and more as a transitional style, where the newly discovered power of digital computation was tasked to perform traditional (i.e., modern) mathematical operations: a marriage of that signature invention of modernity, calculus, with computers that used calculus-based splines for the design and mass production of actual physical objects. But today’s data-driven computation is alien to this modern logic. In fact, the former is in many ways the latter’s reversal and nemesis. Splines, a crowning point of modern mathematics, were a tool of simplification; “big data,” a disorderly offspring of postmodern digitality, is a tool for coping with, managing, and some would even say extolling complexity. Yesterday’s spline-dominated environment was elegant and modern; today’s data-driven design environment is messily postmodern: disconnected, broken, fragmentary, rickety, patchy, and aggregated. For example, even a cursory look at the recent ArchiLab 2013 exhibition in Orléans, France, which was conceived as a showcase of advanced digital design and research, reveals this new formal landscape (in particular, works by Alisa Andrasek and Jose Sanchez of Biothing, or Michael Hansmeyer and Benjamin Dillenburger).

Only a generation ago, the breakdown of modern science was the ideological ambition of a clique of weird postmodern thinkers. But indeterminacy and nonlinearity are now a fact of our daily life, embedded in most digital technologies we use. We may pretend it is not so, but many designers are not pretending anymore. This is why today’s digital style looks messy and rough. This is the way “big data” looks, because we have no clue how to interpret it, and this is the way digital computation works, even if we cannot say why, or how. It is a jungle out there, and it increasingly looks like it.

 
