Essay by Kiel Moe.

Kludge: Our Model of Models

In contemporary architecture’s perpetual rush towards short-term market differentiation through pseudo-technical platitudes such as “innovation”, “sustainability”, and “resilience”, there is often a paucity of critical reflection granted to the techniques employed in this treadmill of technological escalation. In place of such reflection, architecture’s capitulating habituation to technological determinism has instead long defined its unnecessarily reductive relationship with technology, technique, and technics. As architects and educators, we often seem to simply lack a sense of irony and curiosity about the concepts and techniques that have willfully, if not gleefully, come to dominate our practices, pedagogies, and collective lives through the persistence of this recidivist determinism. There is perhaps no less considered working procedure in contemporary architectural pedagogies and practices than simulation.


Simulation of a Simulation

In this unreflexive state, Eric Winsberg’s astute book, Science in the Age of Computer Simulation (The University of Chicago Press, 2010), provides a welcome critical perspective that is as refreshing as it is necessary today. With an inordinately sharp philosophical approach, Winsberg immediately identifies the many epistemological questions and ontological problematics that are central not only to simulation as a prevalent and largely unquestioned technique in architecture, but—as a model of intellectual engagement with technique—to the larger intellectual malaise regarding the culture of technology in architecture today.

Winsberg was trained in the history, philosophy, and social studies of science. So in place of the rhetorical escalations and platitudes that often accompany work related to technology in architecture, Winsberg instead poses questions of central consequence: What, exactly, is simulation? When is it reliable as a mode of knowledge production? How is it different from experimentation? How is a simulation validated and verified? Perhaps most importantly, what kind of knowledge does simulation produce in the proper ontological sense?

Such rudimentary yet inordinately substantive questions are absolutely essential to a basic technological literacy that remains so absent in architectural discourse and practice today. Winsberg helps provide us with some of the basic philosophical vocabulary and methods that challenge so many of our unquestioned assumptions. When this basic literacy is coupled with the proper first principles of, for instance, thermodynamics and fluid dynamics, then an ontologically reasonable basis for simulation could be in place. As Winsberg’s general observations indicate, however, in architecture we are very rarely near this cogent coupling. Instead, regarding our lack of rudimentary knowledge about central topics like simulation, models, and experimentation, a quote from systems ecologist Timothy F.H. Allen helps characterize our current model of models: “We are tyrannized by our technology telling us how things work, because that blunts our curiosity and confidence as to how things might otherwise work.”

When you hold a hammer, everything starts to look like a nail. When trained to conduct simulations, everything begins to look like a simulation. Without first principles, simulationists quickly blunt curiosity and confidence as to how things might otherwise work outside of the simulation parameters and their analogs in a building. In short, if it is not in the pull-down menu of the software, it soon no longer exists ontologically for the simulationist. This severely limiting orientation is absolutely, and disturbingly, deterministic. It predetermines, for instance, that the question of energy in buildings is best addressed in terms of optimizing operational energy, yet operational energy reflects only about twenty percent of a building’s total energy flux. Faith in such a narrow, positivist posture, like so much of technological inquiry in architecture, should indeed be cause for concern. In this modality, questions of design are routinely misconstrued as isolated, quantifiable problems that, by definition, have isolated, quantifiable solutions. Worse, such problems are construed as implausibly linear, as are the associated “solutions”. Yet, as Winsberg observes, simulation is itself not a tidy linear process but rather an ontologically ambiguous one that proceeds from errant first principles and boundary conditions for which there is necessarily insufficient data, through frequent incursions and adjustments to make the model work, and then through a fraught process of so-called verification and validation. For a philosopher like Winsberg, it is not at all clear that this process can yield empirically useful knowledge.

But this epistemic question concerns just the simulation process itself. In the context of architecture, the implausible and ontologically problematic linear procession of simulation is routinely burdened with totalizing whole-building simulation, for instance, rather than tasked with the experimentation of discrete processes and phenomena. The result is a mangle of linear and closed models of decidedly nonlinear and open realities. The manifestation of this process is piles of isolated data, with little intellectual apparatus with which to properly interpret them and no means to connect them methodologically. In the end, the unwarranted enthusiasm and confidence in simulation today thus constitutes little more than faith-based modeling. False positives reign unchallenged.

As such, as the brilliant educational philosopher Robert M. Hutchins cautioned long ago in his 1933 Commencement Address at the University of Chicago, “The gadgeteers and data collectors, masquerading as scientists, have threatened to become the supreme chieftains of the scholarly world…. As the Renaissance could accuse the Middle Ages of being rich in principles and poor in facts, we are now entitled to inquire whether we are not rich in facts and poor in principles.” In the case of simulation, with insufficient principles architects blindly input and output large amounts of data and generally lack the means to interpret the (“big”) data. In other words, with perhaps a few exceptions, architects are not sufficiently adept with these weighty philosophical questions, much less sufficiently versed in thermodynamics, fluid dynamics, and physiology, or, importantly, in questions of form and beauty, to address the mangle of knowledge necessary to implement such an ontologically nuanced technique in a coherent and valid way. The result, in computer science parlance, is GIGO: garbage in, garbage out.

Winsberg’s chapter titles, drawn from his multiple peer-reviewed articles, indicate well how he positions the central epistemological questions of the book and larger topic. Consider, for instance, the titles of the strongest chapters: “Sanctioning Models: Theories and Their Scope,” “A Tale of Two Methods,” and “Reliability without Truth.” The tricky terrain of verification and validation, as well as the consequential differences between simulation and experimentation, are recurrent topics in the book. By the end of the book, Winsberg employs the very poignant literary trope of the fable to help characterize how knowledge and forms of “truth” are produced through simulation. This is a productive analogue as it places appropriate emphasis on judgment and the reliability of the narrator and narration for the efficacy of the knowledge.

In short, the epistemological and ontological questions that Winsberg articulates ought to temper our engagement with simulation, but rarely seem to do so in practice. This resonates deeply with what Luis Fernández-Galiano observed in his book Fire and Memory: On Architecture and Energy:

“it is therefore time that we relieved [simulation] of its exaggerated responsibilities and established the chores it can perform without abusing the concept or exhausting the instrument. Far from scornfully demoting it, to relieve the discipline of the Herculean tasks previously assigned to it is to express the absolute confidence in the future of the idea and the fertility of its approach, both of which would be seriously threatened if we insisted on overwhelming it with the burden of multiple mirages: the mythical discipline must be transformed into a modest analytical tool.”

In this context, it is compelling to look at a design practice that does, perhaps more than any other in architecture, approach these philosophical and technical questions with sufficient care. In doing so, this practice expresses confidence in a non-totalizing approach to simulation, but only as part of a larger experimental and analytical apparatus. One example is the research group at Kieran Timberlake. This group, initiated years ago by Billie Faircloth, has grown to include individuals trained in methods associated with urban ecology, land use management, materials science, coding, and architectural design. They begin each project with first principles and thus first discern which questions and techniques are necessary for the research topic at hand. The first sign that this practice approaches simulation reflexively is that they produce many of their own simulation tools, built not from software downloads and online training tutorials but from first principles. The group debates the proper system boundaries of the topic at hand; they debate methods and parameters; they debate the relevant orders of magnitude of the resulting data. This is their model of models. In this way, the team constructs the appropriate experimental apparatus, whether it is a multi-scalar model, a relevant sensor network, or a survey method for vegetation on a green roof. Because they are hyper-aware of system boundaries, they come closer than any other practice to being our model of models of energy research and practice.

As such, Faircloth and her team avoid the many fraught ontological concerns of the typical simulation practice that Winsberg warns against. The fact that the team does not identify itself as a simulation practice, but rather as a practice that can selectively employ and critique simulation as one of many techniques, is the tell that pertinent philosophical and technical capacities underpin the practice. In the context of Winsberg’s characterization of simulation, then, the Kieran Timberlake Research Group is more of a dissimulation practice than a simulation practice.

Even the Wikipedia entry for “dissimulation” helps clarify the difference between simulation and dissimulation practices. “Pool hustling is a form of dissimulation, because the hustler conceals his real talent. It is sometimes considered a form of simulation because every hustle conveys false information about the hustler’s abilities but this is incorrect. A hustler gives the false appearance that he isn’t something. Simulation would be giving the false appearance that you are something. A hustler is a ‘dissimulator’. An equivalent ‘simulator’ would be a man pretending (by his confident movements and his bragging) that he was excellent at pool, when in fact he was terrible (he can’t make a single shot).”

If the conditions that support life in the twenty-first century are manifestly nonlinear and fraught with emergent conditions, the linear and positivist posture that so often accompanies simulation—and numerous other techniques—in architecture is problematic. The nonlinear dynamics of imperceptibly large scales reflect differences in kind, not simply differences in degree. So, too, must our methods today. Interestingly, the multivariate, often nonlinear disposition of design as a methodology is a far more adept starting point for work on these dynamics.

Books such as Science in the Age of Computer Simulation, along with related titles from Lewis Mumford, Jacques Ellul, Langdon Winner, and Merritt Roe Smith, to name but a few, help position architects, architectural educators, and architecture students to be more reflexive in their engagement with the larger technics and techniques of architecture. But Winsberg’s book provides perhaps the most immediately and directly applicable case — simulation — for our necessary collective reflection on these matters of disciplinary concern. With Winsberg’s book in mind, and with Faircloth’s practice as one laudatory model of models, there is yet hope for the critical inflection of simulation as but one indicator of a much-needed maturation of technology, techniques, and technics in architecture. Without such philosophical and practical equipment, however, architects will no doubt continue to perpetuate the same technical, methodological, and ontological errors of the past century that, in part, systemically engendered the dynamics that have come to condition life so dramatically in the twenty-first century.

As Winsberg suggests, we need different, and more, models — not more ontologically ambiguous data — to address the most consequential questions of design today.


Moe, Kiel. “Kludge: Our Model of Models.” urbanNext, January 12, 2016.
