A few questions about computational design and form following function

liberty bell

Here’s a story. Someone I know is helping someone who is a professor of computational design at a national architecture school fabricate the thing that they, the professor of computational design, designed using computational design.

But the professor of computational design promised all cut files would be accurate. They are not; they aren’t even really cut files, although they certainly look like what one would think computationally-designed cut files would look like.

The outcome of the faux cut files is that none of the bolt holes align. So the professor of computational design suggested to the fabricator, "Well, just field-drill new holes where they need to be. That's good enough."

This frustrates me to no end. Isn’t the entire POINT of computational design that things will perfectly work? Is this a terrible case of an architecture professor not knowing anything about the actual practice of what they’re teaching? Is it a complete failure of Modernism? Is it yet another example that architecture is a material process that relies entirely on the skill of the hands that are building it? Yes to all, I think.

Mar 10, 19 11:26 am

*Edit, sorry I'll address you by the account you're posting on* - Liberty Bell, great questions. Two main points to this (though I could go on forever on this subject; it was a key part of my thesis work). If I remember correctly, you don't really have any experience in scripting, correct?

1) I've said this many times to friends and coworkers - everyone thinks they know how to script in Grasshopper (or 'x' software of choice). My general response is to ask them how they would do a round-trip complex data-tree reorganization. I don't actually need them to tell me what to do specifically, I just need them to start describing the process to me; this usually outs the posers in about 30 seconds. The reason being that, fundamentally, it's all just patterns of 1's and 0's, isolating the sequences you want, and aligning them to the ends you're trying to achieve. There are infinite ways to do this, which may give the same results, different results, or most crucially, results that appear to be the same but really are not. What I mean is - the process/sequence that the scripter defines actually matters quite a bit, if you're trying to achieve a certain accuracy or tolerance. A > B > C > D is not the same as A > C > B > D, but the subtleties can be so small that a dilettante won't see it.
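To make the sequence point concrete, here's a minimal sketch in plain Python (not Grasshopper; the function names and numbers are made up for illustration): two pipelines that look equivalent, differing only in where quantization happens in the sequence, quietly disagree.

```python
# A hypothetical, non-Grasshopper illustration: two pipelines that "do the
# same thing" but quantize at different points in the sequence. Function
# names and numbers are made up.
def scale_then_round(values, factor, decimals=3):
    """Transform first, quantize once at the end."""
    return [round(v * factor, decimals) for v in values]

def round_then_scale(values, factor, decimals=3):
    """Quantize early (a careless intermediate step), then transform."""
    return [round(v, decimals) * factor for v in values]

pts = [0.12345, 2.71828, 3.14159]
a = scale_then_round(pts, 25.4)   # e.g. inches to millimetres
b = round_then_scale(pts, 25.4)
print(a == b)  # prints False: same inputs, same apparent intent, different numbers
```

The disagreement here is hundredths of a millimetre, easy to miss on screen and very easy to feel in a bolt hole.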

2) Computational design and digital fabrication are related, but not interchangeable (same goes for 'parametric', 'scripting', 'geometric', etc. All these words mean related, but specific, variations of similar things). People love to work with 3D surfaces because they are fun to manipulate and easy to make cool-looking shit with. But reality is a bitch, and zero-thickness surfaces don't exist in the real world. Everything has thickness, which means tolerance testing is crucial to the success of any project - which as architects we obviously understand very intimately in all aspects of our built work, but are all too quick to gloss over in model space, because it's hard. Not addressing this is essentially just cheating.
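As a toy illustration of why thickness matters (a hypothetical sketch, not any project's actual detail): two panels meeting at a fold with square-cut edges open up a gap at the outer face that a zero-thickness model never shows.

```python
import math

# A toy tolerance check: if two panels of thickness t meet at a fold of
# angle theta with square-cut edges, the outer faces open up a gap of
# roughly t * tan(theta / 2). A zero-thickness surface model shows none
# of this. Numbers below are illustrative only.
def miter_gap(thickness, fold_angle_deg):
    """Approximate outer-face gap at a square-cut fold."""
    theta = math.radians(fold_angle_deg)
    return thickness * math.tan(theta / 2)

print(miter_gap(0.0, 30))             # the model's paper-thin surface: 0.0
print(round(miter_gap(0.25, 30), 3))  # 1/4" plate at 30 degrees: ~0.067", over 1/16"
```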

So my diagnosis is a fairly common one. There were likely two failures in the process that led to the cut files not being as expected (for this specific example; I'll get to a broader statement). First, in producing the scripted work, there was likely some piecing-together of other scripts (usually found on the internet) into the working file, without proper vetting. This means that an amount of tolerance was sacrificed because someone didn't understand 100% of what was going on, because they didn't do the full work themselves. Second, there was an expectation that because it was 'computational' it was totally correct, and look-I-can-see-it-in-my-model-so-it-must-be-true-right? But test files and mock-ups are so crucial in translating between the two. Designing with script doesn't free someone of the iterative process, it only reinforces the need for it.

TL;DR - people need to understand that the subject of computational design is actually very robust and complex. It is not pretty pictures and sexy renderings. It is methodical, it is precise, it requires proper study of the subject and a conceptual understanding of the way in which the system works. It is data, it requires correct application.

Mar 10, 19 2:48 pm
Non Sequitur

I am thoroughly aroused by this explanation.


Bench, why would you assume "everyone thinks they know how to script"? Does "everyone" actually think they know how to script? I don't.


pretty sure Bench meant “everyone (who thinks they know how to script) thinks they know how to script.” also pretty sure you knew that and just opted to play dumb because you want to put your two cents in against technology you don’t understand.


Actually I do understand the technology, but I'd rather focus on stuff that would make me a more marketable architect than specialize in a niche market. So many people become experts in Grasshopper and get stuck delegating the heavy lifting to people who can actually build. Designing is great; knowing how to build is even better. Ask any of Bjarke's digital people what they know about putting together a building. Oh, right; they partner with other firms to make that happen.

placebeyondthesplines, you want to put your two cents in against technology you explicitly said you don't understand in your previous comment.


As PBTS said, yeah, I think it's fairly obvious what the statement referred to, so I won't bother explaining further. Also, I don't know why you feel the need to separate being a "more marketable architect" and learning how to use this type of approach to work in tandem with that. I've found it to be an excellent way to work, particularly when coordinating complicated fabrication packages and shop drawing reviews on non-linear elements or multi-factor schedules. For applications that don't require optimization analysis, it's a very straightforward and useful way to work, provided you know how to do it.

Bench, thank you for this! And you're correct, I don't know scripting. I'm curious about it because I'm excited for how it *can* impact architecture. I'm excited for the self-assembling builder bots! 

Like you say it's not about pretty pictures and sexy renderings, but those things are what sell the idea. I just want honesty in how it's used. 

Another project I know of *did* use the computation correctly and had tolerances across a 40' surface down to 1/8" in all three directions which is spectacular and cool but! in a typical weather cycle from spring to winter the building on which this was installed moved more than that. So now all the pieces are wonky.
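For a rough sense of the scale of that seasonal movement, a back-of-envelope check (all numbers assumed for illustration, not taken from the actual project):

```python
# Back-of-envelope check of the kind of seasonal movement described above.
# All numbers are assumed for illustration (aluminum panels, a 40 ft run,
# a 100 degF swing between winter and summer), not taken from the project.
ALPHA_ALUMINUM = 12.8e-6   # per degF, a typical linear expansion coefficient
length_in = 40 * 12        # the 40 ft run, in inches
delta_T = 100              # assumed seasonal temperature swing, degF

movement = ALPHA_ALUMINUM * length_in * delta_T
print(round(movement, 3))  # prints 0.614
```

Even a spectacular 1/8" fabrication tolerance loses to over half an inch of thermal movement unless the connections are designed to absorb it.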

There's some friction between material and modelling accuracies, obviously. I'm fascinated by it.

Mar 10, 19 3:48 pm

of course, and that friction exists regardless of how “computational” a project is. I could go on and on about how problematic your original post’s last paragraph is, but to be brief: the professor absolutely did a shitty job, but with all due respect, your criticisms are as uninformed as the professor’s “computational” skills are suspect.



No offense taken, placebeyondthesplines, I'm asking for sake of starting the discussion *because* I'm essentially ignorant of the process. I literally welcome your analytical dissection of my last paragraph in the original post, if you have time (I know I'm asking you to educate me for free, but also hopefully your generosity with it will educate lots more people than just the OP?)!


Donna, I looked back at your post a bit more.

" Isn’t the entire POINT of computational design that things will perfectly work? "

- Not necessarily, but I can understand the sentiment. Simply put, it's a robust method for developing systems of rationalization for things that are inherently irrational. Previously, there hasn't been a better way to address this. The onset of computational and rules-based systems, as well as the capacity for the broader workforce to now understand and achieve this through visual scripting like Grasshopper (rather than typed code), just makes the difference seem massive; we see the delta in the shift rather than the substance of the new ends. Computational workflows can and still do fail, or at least create new problems, all the time.

" Is this a terrible case of an architecture professor not knowing anything about the actual practice of what they’re teaching? "

YES. Or at least not knowing it as well as they think they do. It comes back to that longtime adage of types of knowing: 1) Knowing what you know, 2) Knowing what you don't know, 3) Not knowing what you don't know. Seems like a classic case of the third in this sense; it's not to do with computational design specifically, but a fundamental miss on the part of the professor to exercise a self-critical analysis of their own understanding of the subject. And it refers back to my comment about people 'actually' understanding what the 1's and 0's mean, or focusing on the image instead of the code.

" Is it a complete failure of Modernism? "

This doesn't seem very relevant, to be honest. I actually think there's an interesting debate to be had that the current digital design/practice world is an entirely different/evolved way of thinking about design, beyond Modernism and Post-Modernism; in a similar manner that contemporary art has completely left much of the main modernist art theories and texts behind. See the OMA article discussion going on about the towers in the Brooklyn yard and the evolving typology of towers.

" Is it yet another example that architecture is a material process that relies entirely on the skill of the hands that are building it? "

And this just comes back to my original point: these digital workflows still need to address the real world, ugly/tough as it may be. Similar to a theory lecture that doesn't refer to anything substantive in practice.

I've been quite lucky to have worked on a number of projects where we could not have achieved the goals without these scripting tools. It goes through the same process with manufacturers as any other component (albeit a much more complex deliverable). So perhaps this is more a failure of academic discourse not being able to give good instruction, due to a lack of professional exposure. Profs still need professional experience to prepare students for the working world, period.

Mar 11, 19 12:58 pm

Everyone's a Grasshopper pro, until they get tested. I was the digifab person for my grad school and the amount of puffery that I overheard was crazy, but that's beside the point, and honestly, I know that there's a lot that I have to learn myself.

With "computational" design, there are bugs that come up, but for the most part it's a one-shot thing. Software like Word or Chrome is constantly being rewritten and improved, but most of the projects that I've "computationalized" end with a static object, and the script gets moved to an archive folder. Sometimes I'll go and pillage a part of it, but any of the "issues" that were found creating the thing never get resolved (a tolerance that I didn't think about, or a material thickness that fluctuated more than anticipated), because no one budgets for iterations/improvements and I've moved on to the next project (problem). I have a bunch of projects where it took a long while and intensive head-scratching to create the workflow; now I could tweak some parameters and spit out new versions of the installation in a few days, when the initial took months.

I once interviewed with a prominent "digifab"-branded firm that had a project similar to something in my portfolio. It is well published and won digifab awards, but I found out that a large portion was actually manually fixed because of issues that arose. I was proud that my small group of peers was able to do something better in a much, much shorter timeframe!

TL;DR - just because holes don't line up, I wouldn't discount the computationalism of the project. These are very complex projects with a lot of thinking that goes into them. A large portion of the project is learning from one's mistakes for the next time. Most people hype computational fabrication as an easy CSI "enhance" image: just some keyboard mashing, send it to the machine, and out pops a cool thing. But in reality, those Kuka robots have to be checked to make sure that the path doesn't cut back on itself, a CNC doesn't cut too much of the material keeping the part on the spoilboard before cutting the rest of the thing out, a 3D printed object has enough supports, etc.
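One of those machine checks can be sketched in a few lines of Python; the operation names here are hypothetical, but the idea is just validating the planned cut sequence before anything goes to the machine.

```python
# A sketched machine check: verify that the perimeter cut that frees a part
# from the spoilboard is sequenced last, after all interior cuts. Operation
# names are hypothetical.
def perimeter_cut_last(operations):
    """operations: list of (name, is_perimeter) tuples in planned order.
    Returns True if no interior cut is scheduled after a perimeter cut."""
    seen_perimeter = False
    for name, is_perimeter in operations:
        if seen_perimeter and not is_perimeter:
            return False  # the part would already be loose on the spoilboard
        if is_perimeter:
            seen_perimeter = True
    return True

good = [("bolt_holes", False), ("pocket", False), ("outline", True)]
bad = [("outline", True), ("bolt_holes", False)]
print(perimeter_cut_last(good))  # prints True
print(perimeter_cut_last(bad))   # prints False
```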

Mar 11, 19 1:11 pm

Very well put. Your experiences sound very similar to my own, both in academia and professional practice.


Unfortunately, I haven't really been able to utilize my skills much in the profession, just a few times. There are so many cool things that I've proposed that firms have been unwilling to risk, since they're only viable through digifab, which they don't wholly understand.


I do consider myself to be very lucky to occupy a position where this is applicable to our built projects. I always recognize it is not the norm, and few offices are willing to go down that road.



Isn’t the entire POINT of computational design that things will perfectly work?

no. this is sort of like saying “isn’t the entire point of a pencil that it can be used for math homework?” computational/associative/algorithmic/parametric (related but not interchangeable descriptors, as Bench eloquently stated) design processes are used for many more purposes than just complex fabrication and intricate assembly. optimization, form-finding, rapid iteration and prototyping, quantitative analysis (of potentially infinite input streams), machine learning and ai integration into design workflows, material science testing, and advanced cost estimations are just a few of the applications of computational design techniques. 

Is this a terrible case of an architecture professor not knowing anything about the actual practice of what they’re teaching? 

no. professors who teach “computational design” have some knowledge, and probably have advanced understanding, of the theory behind what they teach and/or practice. they may not have advanced skills with every piece of design software, and they may rely too heavily on students to execute their intentions, but “not knowing anything” is not how professors in relatively niche areas of study get hired to teach in those disciplines. 

Is it a complete failure of Modernism? 

no. this question suggests a larger and more troubling misunderstanding of what Modernism is, which is a conversation for another thread. 

Is it yet another example that architecture is a material process that relies entirely on the skill of the hands that are building it?

no. materials are obviously a critical component in any architectural endeavor, but any intelligent application of computational tools includes detailed material property analysis in its solutions. if those material properties are ignored by the project’s author, the result will likely fail, like this professor’s work. but this failure is in no way representative of “computational design” as a whole, which like any other tool, must be used properly and skillfully to yield successful results 

Yes to all, I think.

no. again, none of this is meant as a defense of this professor, whose bullshit attitude is an embarrassment to those of us who take this area of research and practice seriously. the larger concern (to me at least) is the perpetuation of these superficial and wholly ignorant ideas of computation in architectural discourse, especially by those who are completely (and to donna’s credit, admittedly so) uninformed on the subject. 

Mar 11, 19 1:32 pm

"Professors who teach “computational design” have some knowledge, ...may rely too heavily on students to execute their intentions..."


(I messed up the comment) I had a professor who was quite prominent in ACADIA tell me that they had no idea how one of their projects was done, because they just directed students to make it. Over the course of the semester, I realized that they could wax poetic about theory but were quite limited in computer knowledge. I have tried grappling with that since, because I had held them in such high regard, but I guess there are a lot of firms that could be utilizing their youngins to push the discourse but don't, so ¯\_(ツ)_/¯


the professor’s task isn’t to know (or teach) the ins and outs of every bit of software, but to impart their knowledge of the underlying theory — through critique — to help students achieve more successful designs with those tools.


True to a point, but when one's brand is based on being on the precipice of digital fabrication, learning that they struggle to just use a computer is jarring. I think it shows the gilding of architectural pedagogy if professors' responsibility is reduced merely to providing feedback. I spent some time recently in academia with an undergrad studio, and I feel like a lot of critiquing is more veiled guidance towards a certain professor's personal values and less about helping students achieve successful/provocative designs. I witnessed award-winning (for what it's worth) practitioners quash students' promising designs to move them more towards their own design sensibilities.


my interest in any designer’s “brand” is essentially nil, so I suppose if that’s a priority I can see how it would be surprising. and I do think instructors should have enough knowledge of the software to engage knowledgeably with the student’s idea. but your description of critic/student studio dynamics is not remotely unique to computationally-oriented studios (especially for undergrads, who often need substantially more guidance away from their willful and thoughtless instincts).


Lol, your interest in a designer's brand is moot. To break it down: if one goes to a cooking class from a chef who exclusively publishes recipe books on BBQ, one expects them to know the difference between South Carolina and Kansas City sauce. I don't expect them to know the nuances.

I didn't teach a computational studio; actually, the school's values were far from it, and my point wasn't the tired critique of how thoughtless drones undergrads are. Take any group of architects and you'll find a good share of thoughtlessness; experience just provides the ability to disguise it better.


not sure where the hostility is coming from. your expectations of architecture faculty are different from mine, and neither of us is necessarily wrong. personally I don’t care if a professor can manipulate software if they understand it well enough to get their intent executed well. if you think they should be able to model and script as well or better than their students, we can agree to disagree without being uncivil.

"a robust method for developing systems of rationalization for things that are inherently irrational"

Now all we have to do is decide if actual building construction is rational or irrational.

Mar 11, 19 2:21 pm

This is such a great conversation, I really am learning from it!  Thank you all for being generous!

I admit I know nothing of scripting, but I'm in the process of using and learning Revit. And for me, because building physical objects is the part of architecture that I'm most enamored with, the idea of building the whole house (or whatever) in the computer, like actually model-constructing every stud and nail and shingle and flashing, is nirvana. But I don't know how to actually do that, yet, and it's likely not a very efficient use of the computing energy.

My comment about the failure of Modernism relates to the idea of form following function. Which of course is a pithy summation of Modernism, and is a much bigger topic. But what it leads to for me is loving objects that use the absolutely minimal efficiently required amount of material. But when the holes then don't line up - argh!!

Anyway, thank you. All of this helps and is fun to discuss.

Mar 11, 19 5:19 pm

"the idea of building the whole house (or whatever) in the computer, like actually model-constructing every stud and nail and shingle and flashing, is nirvana. "

Revit doesn't do that.

Revit relies heavily on 2D information for details. This 2D information, however, is linked and drawn over 3D model information so the details are much more useful over time.

liberty bell

Sneaky, by “Revit relies heavily on 2D information for details” do you mean all the detail lines I have to draw on the sheet to make the model information look the way I want the *drawing* information to look?


Yes and no. A lot of that can be fixed with a good set of visibility graphic settings. But the rest is detail components (2d families), masking and filled regions, and linework, yes.

liberty bell

Well that’s disappointing. Filled regions are so annoying.


I would prefer to use Revit (even if ALL it did was coordinate view callouts and such) than CAD. Every day all day.


Actually it was supposed to be 'function follows form' but Louis Sullivan got tanked the night before...

Mar 11, 19 6:22 pm

You can call yourself a computational whatever but it doesn't make you a good designer or builder.

I worked on a project with someone who is now absolutely the foremost computational designer building actual projects (well, one of two, since Andrew Kudless is the other). This was their first permanent outdoor thing. I worked for the fabricator. We had to do a lot of education about material thickness, fastening, the thickness of paint, and other such minutiae. The script is only as good as the input, and it sounds like the person in question is an idiot, so go figure...

We also suggested a computational solution (yes, fabricators should be able to code) to this person to solve the doubly curved surfaces they had everywhere on their piece - they hadn't accounted for the difference between doing one of their pieces in 24 ga aluminum and 1/4" plate. To my knowledge this little routine is still used by this firm...

I guess bottom line someone always knows more, so be willing to learn. Oh, and go build some stuff. It's the best.

Mar 12, 19 12:53 am
liberty bell

Archanonymous, I am *dying* to know if this person/project is who/what I think it is, but I'm guessing an NDA prevents both of us from ever talking about it.


It probably is. Interestingly I still respect the guy and am friendly with him. It's all about the give and take... with any fabricator and designer, not just computational projects.


I'm struck by the number of people, OP included, who jump to the conclusion that the designer is an "idiot," a "failure," "terrible," "incompetent." It is really interesting how we hold ourselves to this zero-sum game.

The same mentality is pervasive throughout the industry. We demand perfection that is in reality not attainable. It leads to a lot of self-doubt, undervaluing our services, and giving away free work. There is always something that you didn't think about, some issue that will arise. Either you accept that and constructively deal with problems, or you carry a lot of stress, work extra hours, and give work away for free.

The promise is never perfection. It is Design, it is Intent, it is a Contract that is to be negotiated in the real world.  

Computational design is not something new. Everyone has been using it to design; the level of complexity of computation has increased. That doesn't necessarily mean the number of parameters has to increase. The human brain, and the realities of site and material, are probably best served by keeping things as simple as possible. In this case, said fancy design would have been well served by understanding this and perhaps leaving that final connection detail to be drilled, countersunk, and field-fixed, or, as others have pointed out, by providing some greater tolerance: slots or connections that accommodate variability in site and material. Something that a mock-up might have indicated.
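A hypothetical sketch of the slot-sizing idea (illustrative numbers only): the slot just needs to swallow the bolt plus the worst-case stack-up of the tolerances on either side.

```python
# A quick sizing rule for slotted connections; the numbers are illustrative,
# not from any project. The slot has to swallow the bolt plus the worst-case
# stack-up of movement to either side.
def slot_length(bolt_dia, tolerances):
    """Minimum slot length = bolt diameter + total +/- movement to absorb."""
    return bolt_dia + 2 * sum(tolerances)

# 1/2" bolt; 1/8" layout tolerance plus 1/16" expected material movement
print(slot_length(0.5, [0.125, 0.0625]))  # prints 0.875, i.e. a 7/8" slot
```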

Let's stop holding architecture to perfection. None of us is perfect, and no script will be either.

Mar 12, 19 1:17 pm

I am not even going to pretend to understand what half the people here are saying about scripting, but the OP describes a situation where the designer (architect) is also the fabricator. That is very different from the standard delivery methods used in architecture. Design-build comes close, but there the design and construction contracts are merged. This is something else entirely. Is computational design's intent to push architects more into the fabrication realm? If not, then fabrication will still be a separate entity that will communicate their interpretation of design via shop drawings that may need delegated engineering added on top. Sounds like computational gurus know as little about construction contract structures as architects who do know contracts know about computational design technologies.


Agreed, it is blurring the lines between architect and fabricator and creating more demand for a perfection that is not attainable, or at least has not been valued correctly. In the case the OP refers to, though, he was half fabricator: providing cut files, but not actually doing the construction or having the ability to manipulate those items during construction.


I stand corrected, Jonathan. It becomes an interesting question. Should a designer be handing out computational design output files to a fabricator who doesn't quite understand what they are looking at or have the ability to confirm the values? Forget old design geezers not understanding new technologies. Fabricators need to understand how scripting works as well. And for anyone who has ever reviewed shop drawings, you are always glad when they barely know how AutoCAD works.


Whether the professor in the OP was good at computational design and scripting or not, they are a bad architect. Contractual separation (or breaking down those barriers in an intentional way), proper design and validation, reviews, QA/QC, and mockups all go into making architecture happen correctly and in line with design intent. Calling something "computational" does not release the architect from these responsibilities.


Dated to 2009-2011, but my 2 cents:

Most of the computational design in my graduate studies involved computational form-making and manual cut-file creation, which usually started by using a splice/split command of some sort and creating cut files, manipulated in the original program off to the side, or in CAD, or sometimes even Illustrator. Some people were very accurate digitally. Some fudged. Many had to figure out adjustments, including making new cut files at times, when their fabrications didn't fit.

If it was a team project or a 3rd party was being used, the cut files almost always didn't work, and in-field adjustments had to be made.  

As someone raised in the trades who was a first-generation college graduate, I considered the student and/or professor/education process a success if, by thesis year, most people had figured out their own set of tweaks (e.g. angles with slip joints or something) BEFORE fabrication, such that on-site install day went rather smoothly.

Regarding Grasshopper, I wouldn't even know where to begin now, but at the time, I was the go-to for figuring out how to "script" something. My personal goal was to never grab a bunch of buttons and copy-paste them, but honestly I failed at that quite a bit. Having quite a few friends in CS at the time, I understood that I was barely scratching the surface of actual coding, even with my copious VBScript buttons. Everyone had their own sets of scripts they would mix and mash together on projects, but no one at the time seemed to have figured out a way to create a Grasshopper/VBScript library of their own and actually program, calling different "functions" from the library as necessary. The way the professors always talked about a pedagogy, it seemed like this would be the goal. Imagine a "Zaha" library that you loaded in and could call from, allowing you to do tons of tricks that eventually implemented Zaha work; except replace "Zaha" with your name here and call that your thesis.

Mar 12, 19 5:08 pm

LB, speaking more broadly to whether this is a failure of a professor, I would say no: this is a successful test of a hypothesis with a negative conclusion. Part of the role - and usually a main condition of their tenure - for professors in a research university is to conduct research. I am making the assumption that a 'national architecture school' with this position is part of such a university.

in this case, the hypothesis was the professor's method of developing a fabrication model. the failure shows where that method needs further adjustment.

without specific details it's hard to evaluate whether that outcome was truly predictable. the fact that the fabricator raised no objections suggests it was at least a subtle failure, or a fabricator inexperienced with this workflow. as others point out in real world practice the designers and fabricators would test their digital model and fabrication techniques to find these faults before proceeding to full fabrication.

of course the professor ought to publish this research even if only informally to make it useful. hopefully the experience will make them a better teacher at least.

Mar 13, 19 5:05 am

Schools need to teach analog fabrication before digital. For that matter, professors should learn the former before instructing students in the latter.

Mar 13, 19 10:03 am
Non Sequitur

I just used Dynamo for the first time today.  Pat on the shoulder for me.

Mar 13, 19 12:10 pm

Too bad Patrik is busy suing the Hadid estate. I'm sure he would have some valuable insight on this.

Mar 13, 19 12:32 pm

professor must be tenured

Mar 13, 19 3:30 pm
