Last week, the Van Alen Institute hosted an interdisciplinary event relating brain activity, new technology, and our response to the built environment. The event included a tech demo of brain-computer interfaces and a conversation in which architects, neuroscientists, psychologists, and designers weighed in on technological advancements in EEG brain-computer interfaces (BCIs) and the application of brain imaging to redefine how we understand people's perception of their surroundings.
Mark Collins from GSAPP's Cloud Lab kicked off the presentations with a summary of the Dumbo Mental Map Project. At a previous event jointly hosted by Van Alen and Cloud Lab, brain-wave data was gathered to indicate when a subject was in a state of meditation or attention while walking a predetermined route through the DUMBO neighborhood. The data was then used to generate graphics overlaying maps and 3D renderings of DUMBO. Not only were the visuals mesmerizing, they were an embryonic representation of a much larger application of visualizing brain data. Seeing where the environment could induce a meditative or attentive state raises the question: can neural cartography be used to shape the way we design cities? And could it add an informative and complex layer to analyzing the performance of a building?
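The core idea behind a project like this, pairing each GPS point on a route with the headset's meditation and attention readings and labeling which state dominates, can be sketched in a few lines. This is a hypothetical illustration, not Cloud Lab's actual pipeline; the `Sample` structure, the 0.0 to 1.0 score scale, and the coordinates are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    lat: float          # GPS latitude along the walking route
    lon: float          # GPS longitude
    meditation: float   # 0.0-1.0 meditation score from the headset
    attention: float    # 0.0-1.0 attention score from the headset

def dominant_state(sample: Sample) -> str:
    """Label a point on the route by whichever mental state dominates."""
    return "meditative" if sample.meditation >= sample.attention else "attentive"

# Hypothetical walk data: two points in DUMBO
route = [
    Sample(40.7033, -73.9881, meditation=0.8, attention=0.3),
    Sample(40.7025, -73.9875, meditation=0.2, attention=0.9),
]
labels = [dominant_state(s) for s in route]
```

A real version would feed these labeled points into a mapping library to render the overlay graphics described above.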
Next up, the hard scientists took center stage, starting with Dave Jangraw, a neuroscientist at Columbia. Jangraw quite eloquently put his research into layman's terms, explaining how BCIs are used to record electrical activity on the scalp. When a neuron fires, it creates an electric dipole, a positive and a negative charge at each end of the neuron, that can be registered and collected as data by an EEG BCI. He walked through the methodology used to study people's "Aha!" moments, showing the potential for a "mobile interest detector." By identifying these "Aha!" moments, BCIs can register moments of interest and, paired with virtual visualization technologies, potentially provide information on the spot, like looking up the model of a car that piques your interest. Think of wearing a future version of Google Glass fused with a BCI: when you see someone you recognize (Aha!), the tech scans their face, cross-references it with your friends database, and provides you with pertinent information about them. At moments like these, our sci-fi fantasies seem to be getting closer to reality.
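To make the "mobile interest detector" idea concrete, here is a deliberately simplified sketch: flag an "Aha!" moment when the average post-stimulus EEG amplitude jumps well above the pre-stimulus baseline. The actual research uses trained classifiers over evoked responses, not a fixed threshold; the function name, the microvolt values, and the threshold are all illustrative assumptions.

```python
def detect_interest(epoch_uV, baseline_uV, threshold=5.0):
    """Flag an 'Aha!' moment when the mean amplitude in a post-stimulus
    window exceeds the pre-stimulus baseline by a set margin (microvolts).
    A real system would train a classifier on evoked responses; this
    threshold rule only illustrates the shape of the problem."""
    post = sum(epoch_uV) / len(epoch_uV)
    pre = sum(baseline_uV) / len(baseline_uV)
    return (post - pre) > threshold

# Toy example: a clear amplitude jump right after the stimulus
baseline = [1.0, 0.5, -0.5, 1.0]   # pre-stimulus samples (microvolts)
epoch = [7.0, 9.0, 8.0, 7.5]       # post-stimulus samples (microvolts)
print(detect_interest(epoch, baseline))  # prints True for this toy data
```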
Nancy Wells, an environmental psychologist at Cornell University, carried the science torch into the realm of psychology, briefly describing how environmental psychologists study people's relationships to their environments. Through objective measurements and quantifiable research, they can home in on the causal effects of the built environment on the individual. Wells brought up the example of integrating more nature into the city to relieve directed attention fatigue. Bringing environmental psychology into the design process opens up possibilities for creating spaces that resonate at a more neurological level; for example, urban spaces can be designed to include more natural landscapes that provide room for cognitive restoration.
The founders of OpenBCI and MindRider stepped up to the plate to show how EEG BCI platforms will not only be readily available, but able to provide valuable data in the very near future. Joel Murphy and Conor Russomanno, founders of OpenBCI, introduced their affordable, open-source brain-computer interface, giving almost anyone access to BCI tech and the ability to extract meaningful data with it. They started developing the idea of an open-source BCI in Parsons' Design and Technology program, and after a successful Kickstarter they intend to start rolling out their Arduino-based product by the end of the summer, opening the doors of a powerful research tool to gamers, scientists, hobbyists, and pretty much everyone else. Arlene Ducao and Josue Diaz rounded out the presentations with MindRider: helmets with an embedded BCI that tracks meditative and attentive states during bike rides, paired with a sleek iPhone app. The app shows the routes taken and, through gradients of green to brown to red, the rider's state of mind along the way. Diaz showed the maps of a few of his rides, and the data clearly indicated areas where he felt safe riding a bike and others where he was in a very attentive state (any NYC cyclist knows that biking in Chinatown takes all of your attention). MindRider's hardware and software have the potential to create safer biking conditions by highlighting stressful areas, and to be honest, their 3D-printed helmets look as cool as the technology behind them.
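The green-to-brown-to-red gradient MindRider's app draws along a route amounts to mapping a mental-state score onto a color ramp. A minimal sketch of that mapping, assuming a 0.0 (fully meditative) to 1.0 (fully attentive) score and picking arbitrary RGB anchor colors, since the app's actual colors and scaling are not published here:

```python
def state_to_color(attention: float) -> tuple:
    """Map a 0.0 (meditative) to 1.0 (attentive) score onto a
    green -> brown -> red ramp like the one MindRider's app displays.
    The anchor colors and linear blending are illustrative assumptions."""
    green = (46, 204, 64)
    brown = (139, 94, 60)
    red = (255, 65, 54)

    def lerp(a, b, t):
        # Blend two RGB triples by fraction t in [0, 1]
        return tuple(round(x + (y - x) * t) for x, y in zip(a, b))

    if attention <= 0.5:
        return lerp(green, brown, attention / 0.5)
    return lerp(brown, red, (attention - 0.5) / 0.5)

assert state_to_color(0.0) == (46, 204, 64)   # calm stretches render green
assert state_to_color(1.0) == (255, 65, 54)   # high-attention stretches render red
```

Coloring each GPS segment of a ride this way yields exactly the kind of at-a-glance stress map Diaz presented.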
The presentations were followed by a demo in which audience members were free to check out the OpenBCI and MindRider gadgets, with data analysis on the spot. Arlene Ducao fitted me with a MindRider helmet, which was not only comfortable but also able to stream data immediately.
The event tied together a diverse group of interests and exemplified the unforeseen potential of interdisciplinary approaches to urban design, thanks to cutting-edge new technologies in neuroimaging and tracking.