Particlized: Computational Discretism or the Rise of the Digital Discrete

2019

Publication History:

“Particlised. Computational Discretism, or the Rise of the Digital Discrete.” AD 258 (2019): 86-93 

The text posted here is an earlier, unedited draft and it is different from the published version.  Please cite only from the copy in print.

 

Digitally intelligent architecture no longer looks the way it did.  While some in the profession, and many in the general public, still seem to assume that the use of computational tools inevitably results in the design and production of smooth and curvy lines and surfaces, most experiments in schools and in avant-garde practices for the last ten years or so have been probing a very different visual environment: disjointed, disconnected, and fragmentary—often voxelized, filamentous, or chunky.

I first became aware of the rise of a new visuality—indeed, of a new style—in digital design and fabrication at the start of the 2010s, and my first account of the matter is in an Artforum article I published in early 2014, titled “Breaking the Curve”.  I eventually wrote a book (The Second Digital Turn, 2017) mostly devoted to the same interpretive conundrum: Why are computer-generated voxels (or other, similarly discrete parts) replacing computer-generated splines as the distinctive image of today’s computational technology?  After all, today’s computers are not very different from those of 20 or 30 years ago. 

For sure, fabrication hardware has changed, even if silicon chips haven’t.  The technical protagonist of digital fabrication in the 1990s was the CNC milling machine—a legacy, subtractive fabrication technology that, by the way it works, favors the milling of smooth, streamlined curves.  By contrast, today’s industrial-grade 3D printers print out small cubes, or voxels, which designers often choose to leave in plain sight.  Ditto for the robotic weaving of extruded filaments, or the robotic assembly of prefabricated components: based on the addition of discrete parts, these rapidly developing fabrication technologies favor an “aggregational” way of building, at all scales and sizes.  This issue of AD offers memorable instances of all of the above.

But this is only part of the story.  The first epiphany of what I call the second digital style, or the style of computational discretization—Philippe Morel’s Bolivar Chair of 2004—was not 3D printed.  It was fabricated using traditional laser cutting and manual assembly.  But its voxelated look was deliberately meant to reveal the computational method used to calculate it—Finite Element Analysis, a mathematics of discrete parts that is at the core of today’s design tools, and which is conceptually and technically remote from all traditional methods of structural design.  And this is, I suggest, where today’s computation is breaking new ground: today’s computers are so fast and powerful that methods of calculation that would have been practically unusable only a few years ago are now perfectly functional—merely due to the brute force and speed of today’s machines.  Some like to think of these new computational tools as a new form of Artificial Intelligence—an expression that harks back to earlier (and failed) computational experiments.  In fact, it is easier to admit that we are dealing here with the rise of a new scientific method, or indeed of an entirely new science, which is the reverse and nemesis of the old science we knew—the science of Galileo and Newton, aka modern science.  This was until recently the only functioning science we had.  No more.  

Modern science and technology—the science we studied at school, and the technology that propelled the industrial revolution—always aimed at making things simple.  The world as we see it is a meaningless mess: in order to understand it, predict it, and act on it, we must convert it into simpler formulas or laws we can more easily comprehend within our mind.  But our mind is hard-wired for small data: there are limits to the amount of stuff we can remember and we are slow in processing quantitative information.  This is why science favors short, user-friendly notations, mostly expressed through mathematical equations and functions, to determine relations of cause and effect among a very limited number of measurable factors.  Built on similar premises, statics and mechanics of materials were among the most successful of modern sciences.  In the 19th century the theory of elasticity adopted the most powerful mathematical tool then available—differential calculus—to describe the deformations under stress of an ideal building material, perfectly isotropic, homogeneous, and continuous at every scale.  Leibniz’s and Newton’s differential calculus was the culmination of classical mathematics, and it modeled nature using ideal notions—the infinite and the infinitesimal—that do not exist as such in reality; due to the way it notates infinitesimal increments, calculus best describes natural phenomena subject to smooth and continuous variations—the kind of variations that can be scripted as a mathematical function and graphed as a continuous curve.  Not surprisingly, Leibniz also thought that continuity is a universal law of the physical world: in his classical worldview, which became a tenet of modern science, nature does not jump (natura non facit saltum).
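
As a textbook reminder (my gloss, not a formula from the essay), the way calculus “notates infinitesimal increments” can be spelled out in one line: the derivative compresses an infinity of ever-smaller increments into a single compact symbol,

$$\frac{dy}{dx} \;=\; \lim_{\Delta x \to 0}\,\frac{f(x+\Delta x)-f(x)}{\Delta x}.$$

The notation only makes sense when f varies smoothly: if the function describing the material has a gap or jump at x, the limit does not exist and the formula breaks down—precisely the “jump” that Leibniz’s nature was not supposed to make.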

Natural building materials, however, did and do jump—as, far from being continuous, they tend to have all sorts of gaps and cracks and knots and slits and fissures all over.  This is why in the course of the industrial revolution scientists had to invent new, artificial materials that could be as continuous and homogeneous as the mathematical functions then used to describe them; industrial-grade steel was the best match they could concoct, as steel is designed and made to have the same elastic properties everywhere and in every direction; a bit later, reinforced concrete came along as the second best.  The theory of elasticity was the cornerstone of late 19th- and 20th-century civil engineering, but that theory has well-known limits, too: predicated as it is on a postulate of absolute continuity, both material and mathematical, it can calculate the deformations of the Eiffel Tower, which is made of iron, but it cannot calculate the resistance of a 10-foot-tall brick-and-mortar garden wall, because each mortar joint in that wall represents one of those “jumps” or gaps that Leibniz’s nature is not supposed to make—and Leibniz’s mathematics cannot describe.

Which is why Finite Element Analysis, an alternative mode of calculation based on three-dimensional grids of very small, discrete particles, began to be developed in the mid-twentieth century; but finite element models generate such huge amounts of data that no one could make any use of them before the recent adoption of powerful electronic computing.  Likewise, from the 1970s on, several post-modern theories of science, known collectively today under the name of complexity theory, started to favor discontinuous scientific models for all kinds of purposes and tasks—from the theory of evolution to the social sciences, from thermodynamics to the science of materials.  Among the mathematical spinoffs of such theories one in particular, cellular automata, posited that some natural phenomena can be best modeled by dividing continuous matter into rows of indivisible cells, then writing rules that describe simple interactions between each cell and those next to it.  Once again, this particlized method was long seen as a mathematical curiosity of no practical use—until it became clear, not long ago, that today’s electronic computers, unlike human computers, can work with cellular automata just fine.  Indeed, using cellular automata computers can already outsmart us, in some cases solving problems long considered unsolvable.  Cellular automata simulations are based on the endless repetition of a limited number of very simple operations.  Computers can work that way, and we can’t, because no human being could perform so many calculations in any practical amount of time.  Unlike humans, computers don’t need science to compress data and limit the number of operations.
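
To make the idea concrete, here is a minimal sketch (my illustration, not anything drawn from the essay or from Morel’s work) of a one-dimensional cellular automaton in Python, using Wolfram’s well-known “Rule 30” purely as an example: each cell holds a 0 or a 1, and one trivially simple local rule, repeated over and over, computes each cell’s next state from its own state and that of its two neighbors.

```python
# Minimal sketch of an elementary cellular automaton (Wolfram's Rule 30, chosen
# only as an illustration): complex global patterns emerge from the endless
# repetition of one very simple local operation on discrete cells.

RULE = 30  # the 8-bit rule table, encoded as an integer

def step(cells):
    """Compute the next row of cells from the current one."""
    n = len(cells)
    nxt = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        index = (left << 2) | (center << 1) | right   # neighborhood as a number 0..7
        nxt.append((RULE >> index) & 1)               # look up the corresponding rule bit
    return nxt

# Start from a single "on" cell and print a few generations.
cells = [0] * 31
cells[15] = 1
for _ in range(16):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

Each generation is just a row of discrete cells updated by brute repetition—no equation, no compression, no global formula; exactly the kind of work that is trivial for a machine and hopeless for a human calculator.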

Think of the mathematical notation of a continuous function, in the usual format y = f(x): a few letters and numbers suffice to notate an infinite number of points, which all satisfy the same stated conditions (i.e., they all belong to the same curve when we graph it).  Electronic computers, however, would typically translate that formula into a huge log of coordinates for a never-ending (literally) list of points—all individually designated, one by one.  That list would not make any sense to us, and it would be very impractical for us to use—because it would take forever to compile it, for a start.  But computers are not in the business of making sense of the world.  Besides, they can work out any never-ending log of discrete data faster and better than we would by dint of our shortest and most elegant mathematical formulas.
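
By way of illustration only (a sketch under my own assumptions, not a procedure described in the essay): the contrast between the compact human notation y = f(x) and the brute-force log of sampled coordinates a computer might happily work with instead.

```python
import math

# The human-friendly notation: one short formula stands in for infinitely many points.
def f(x):
    return math.sin(x)          # y = f(x), here a smooth sine curve as an example

# The machine-friendly version: an explicit (and, in principle, endless) log of
# individually designated coordinate pairs, sampled one by one at a finite step.
step = 0.001
points = [(i * step, f(i * step)) for i in range(10_000)]

print(len(points), "discrete points stand in for the continuous curve")
print(points[:3])               # each point listed individually, one by one
```

The list is meaningless to a human reader and unbounded in principle, but a machine can store, sort, and search it without ever needing the elegant formula that generated it.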

More examples could follow, and the impact of such epistemic changes is already ubiquitous—including in many technologies of daily life.  And sure enough, the design professions have already taken notice, as many among today’s digital innovators are trying to come to terms with this unprecedented data-opulence—trying to make some design sense of the overwhelming levels of figural and structural granularity that Big Data, or AI—or whatever we choose to call this new scientific paradigm—already allows or abets.  Sometimes the super-human resolution of artificial intelligence is patently, even ostentatiously displayed in the finished work: a surface where one could count 4 billion discrete voxels is the outward and visible sign of an inward but non-human logic at play, as no human could notate 4 billion voxels one by one, the way computers do.  Ditto for a building made by the robotic assembly of 4 billion standard and non-standard parts—a problem of data management that would be all but unimaginable in the absence of advanced computational tools: no human could take in, and take on, that much information.  In all these instances, the rising style of computational discretism, or particlization, is giving visible form to a new, post-scientific method, which already reveals the inner workings of an artificial mind that no longer works the way ours always did—and still does. 

Not surprisingly, signs of this new scientific and technical paradigm, and of the way of building derived from it, are increasingly visible even outside the rarefied and often self-referential precincts of the digital avant-garde.  Kengo Kuma is one of the protagonists of today’s global architectural scene.  He is not known for having ever nurtured any interest in computational experiments.  In fact, quite the opposite: after first gaining international acclaim as a controversial PoMo classicist, in the ’90s Kengo Kuma shifted the target of his stark anti-modern stance from figure and form to the technical logic of industrial modernism itself.  In his tirades against what he then called “the method called concrete,” Kuma fulminated against the (deceptive) structural continuity of cast-in-mold concrete, and against the new tools for computer-based design that architects were then learning to use in the pursuit of smooth and curving volumes and surfaces.[1]  Then, starting with his seminal theory of the “anti-object,” first published in book form in 2000,[2] Kuma went on to develop the non-figural, aggregational, atomized, or “particlized” style for which he is now famous, and which has become his trademark and rallying cry.  For, as he claimed back then, particlization is far more than an architectural style: “it is a view of the world, a philosophy itself. […] In the past, such a flat [particlized] world was only thought to be a mess, and unaccountable: something that could not be handled. […] However, contemporary technology makes it possible to process this mess of specific particles without the introduction of structure, hierarchy, or assembly.  For this to function, each element needs to be relieved from contact or structure beforehand, and placed under free conditions.  This is the image of what I call a particlized world; this is the image of what I call freedom.”[3] Simply put, Kuma’s project of particlization runs counter to, and upends, all the scientific principles and technical foundations of modernity.  

Thus it will be seen that, following an independent, almost idiosyncratic itinerary, entirely motivated by his aversion to industrial modernity, one of the protagonists of contemporary architecture ended up advocating a theoretical stance very similar to that shared by today’s computational avant-garde.  At the same time, it is impossible not to notice that some of Kuma’s world-famous buildings look at times remarkably similar to some of the experiments currently pursued by the young digital avant-garde featured in this issue of AD.  Indeed, there is a logic in that.  Kengo Kuma and the young designers of the second digital turn have a common enemy: industrial modernity.  One can avoid the technical logic of industrial modernity by looking back at the artisanal logic that preceded the industrial mode of production; or by fast-forwarding to the computational logic that is now superseding it.  But, as we all learnt long ago (and as I discussed at length elsewhere), post-industrial digital making and pre-industrial artisanal craft have much in common.  And if in some cases, as in Kengo Kuma’s recent architecture, the finished work seems to mysteriously bridge the gap between the non-quantitative intuition of a traditional craftsman and the post-human logic of electronic machines, it is because AI today is just that: the pre-scientific logic of the non-mathematical mind powered by the post-scientific speed of advanced computation.

[1] Kengo Kuma, Material, Structures, Details (Basel: Birkhäuser, 2004), 9. (First published in Japanese in 2003). 

[2] Tokyo: Chikuma Shobo, 2000.  English translation: Kengo Kuma, Anti-Object (London: The Architectural Association, 2008; transl. Hiroshi Watanabe).  The original title in Japanese roughly translates as Anti-Object: Melting and Crushing Buildings.

[3] Kuma, Material, Structures, Details, 14.
