What can architecture learn from the Tech Revolution
I have been thinking about this post for a while, and its original title was going to be “What can architects learn from Apple?” With the recent release of Windows 8, however, I think my thoughts apply to a much larger spectrum. The reason I originally focused on Apple is that I think (and I am sure I am not alone) they have the best design minds in the game.
Full disclosure: I have owned multiple iPods in my day and currently have both an iPhone and an iPad, so my opinion is clearly skewed in Apple’s favor. Anyway, what I love most about Apple as a company and brand is their complete dedication to the idea that their design aesthetic and approach is the best, and that everyone else must either accept it or be left behind. You can hate on the aesthetic, but you can’t hate on their unwavering belief in their own design. I believe that architects would benefit from this kind of undying belief in their own ideas and work.
However, let’s bring it back to the larger issue of lessons to be learned from the Digital Technology Revolution, which for the purposes of this post dates back to the early ’80s. As advancements in technology go, the computer and “digital technology” have moved at the most rapid pace of any technology in human history. It took only 30 or so years for computers to go from the size of a room to the size of a jacket pocket. That is equivalent to going from the original Smith & Wesson revolver to laser beams in the time it takes to complete the AREs.
Due to the rapid pace of evolution in digital technology, I believe it has been forced into a very condensed trajectory in terms of design and aesthetics. Art, architecture, and many other disciplines have enjoyed a steady theoretical growth through the various avant-gardes of the day. Digital technology, on the other hand, has not had the luxury of this steady growth, as theoretical ideas on aesthetics and style have had to keep pace with rapid technological advancements.
At present, digital technology seems to be in its “Modernism phase,” the origins of which can be found in the clean and simple look of Apple’s first iPod. With the first iPhone came the introduction of the “grid” to digital technology. Not only are all the apps on the iPhone placed on a grid, but you could not change it if you tried; the modernist takeover of computers and mobile devices had arrived. Here Apple assumes the role of radical architecture groups such as Superstudio and Archigram, with their devotion to the grid. We also see the grid in the recent release of Windows 8. The random, unorganized look of the classic Windows desktop, with icons thrown around everywhere (at least that is what my desktop looks like), is whitewashed and replaced with a tiled grid meant to solve all our problems, even the ones we didn’t know we had.
The point of all this is that, because digital technology evolves so quickly and design theory is forced to keep pace, we are able to watch design theory redefine itself over and over again within a short time frame. What took art and architecture centuries now takes only until the next smartphone release date. As architects, the rapidity of this evolution in design thinking, driven by advancements in digital technology, not only tells us where we have been but can also give us a glimpse of where we will be.
Where will design theory in digital technology be in 5 years? 10 years? These are the questions that are most interesting to me as a would-be architect.
I am a graduate student and entrepreneur at the University of Michigan Taubman College, where my studies focus on leveraging design ideas across multiple scales and platforms. Working at the intersection of design, tectonics, and fabrication, I continually explore how a design idea can navigate complex material and production systems and evolve into fully realized architectural artifacts.