Conor

Conor Russomanno

Neurotechnologist & Entrepreneur

Bio

I come from a mixed background of art, engineering, and design. As an undergraduate at Columbia University, I studied civil engineering & engineering mechanics while teaching computer graphics and developing Unity-based virtual environments under NSF funding. I later discovered brain-computer interfacing (BCI) as a Design & Technology MFA student at Parsons School of Design. I have been tirelessly pushing the BCI industry forward ever since, making technologies for recording brain activity more cost-effective and accessible to everybody. Having led two successful crowdfunding campaigns, raising close to $500,000, I now spend most of my time building OpenBCI. I also love teaching. I recently taught Creative Coding, Physical Computing, Designing Consciousness, and a number of other courses at Parsons School of Design. I now teach a course titled Neuromachina: Man & Machine at NYU Tisch School of the Arts.

CV

Work Experience

OpenBCI | Co-Founder & CEO
Brooklyn, NY (June 2013 — Present)

New York University ITP | Adjunct Faculty & “Something In Residence”
New York, NY (Jan 2016 — Present)

  • Courses taught: The Body Electric, Neuromachina: Man & Machine

Parsons School of Design (MFADT) | Adjunct Faculty
New York, NY (Sep 2013 — Dec 2016)

  • Courses taught: OpenBCI: Brain Hacking, Creativity & Computation (JS/Java/Arduino), The Digital Self: Interfacing the Body, Materials Spectrum Lab, Physical Computing, Designing Consciousness, Creative Coding (openFrameworks/C++)

NeuroTechNYC | Founder & Organizer
New York, NY (Jul 2015 — Present)

  • Coordinate monthly hack nights centered around the use of human-computer interface technologies

Felix Intelligent Local Advertising | Front-End Engineer
New York, NY (Jul 2013 — Dec 2013)

  • Designed and implemented internal browser-based dashboards and client-facing sites using JavaScript, HTML, and CSS

Brain Interface Lab | Founder & Director
New York, NY (Oct 2012 — June 2013)

  • This is where my BCI journey began and also where the OpenBCI logo originates from

Education

Parsons School of Design | M.F.A. Design & Technology
New York, NY (Aug 2011 — May 2013)

  • Concentrations: brain-computer interfaces, creative coding, physical computing, game design, & illustration

Columbia University | B.S. Civil Engineering & Engineering Mechanics
New York, NY (Aug 2007 — May 2011)

  • Concentrations: project management, 3D-modeling, computer graphics
  • Led the 3D content creation of a Unity-based virtual world (aka CyberGRID) under a $750M NSF grant

Thomas Jefferson High School for Science & Technology
Alexandria, VA (Aug 2003 — May 2007)

  • Ranked #1 Public High School in the U.S. by U.S. News & World Report (2007)

Blog

  • 3D printed EEG electrodes! (2/16/2015)

    I spent the day messing around with 1.75mm conductive ABS BuMat filament, trying to create a 3D-printable EEG electrode. The long-term goal is to design an easily 3D-printable EEG electrode that nests into the OpenBCI “Spiderclaw” 3D printed EEG headset.

    I decided to try to make the electrode snap into the standard “snappy electrode cable” that you see with some industry-standard EMG/EKG/EEG electrodes, like the one seen in the picture below.

    IMG_3583

    After some trial and error with AutoDesk Maya and a MakerBot Replicator 1, I managed to print a few different designs that snap pretty nicely into the cable seen above. At first, Joel (my fellow OpenBCI co-founder) and I were worried that the snappy nub would break off but, to our pleasant surprise, it was strong enough to survive repeated use. Though the jury is still out, since we’ve only been snapping for one day.

    Here you can see a screenshot of the latest prototype design in Maya. I added a very subtle concave curvature to the “teeth” on the underside of the electrode so that the electrode will hopefully make better contact with the scalp.

    Screen Shot 2015-02-16 at 6.17.48 PM

    Here is a photo of a few different variations of the electrodes that were actually printed over the course of the day.

    IMG_3581

    FullSizeRender (3)

    I’d like to note that I printed each electrode upside-down, with the pointy teeth facing upward on the vertical (Z) axis, with a raft and supports, as seen in the picture below.

    Screen Shot 2015-02-16 at 6.35.01 PM

    I tested each of the electrodes with the OpenBCI board, trying to detect basic EMG/EEG signals from the O1/O2 positions on the back of the scalp—over the occipital lobe. I tried each electrode with no paste applied (simply conductive filament on skin), and then with a small amount of Ten20 paste applied to the teeth. To my pleasant surprise, without any conductive Ten20 paste, I was able to detect small EMG artifacts by gritting my teeth, and very small alpha EEG artifacts by closing my eyes. Upon applying the Ten20 paste, the signal was as good (if not better) than the signal recorded using the standard gold cup electrodes that come with the OpenBCI Electrode Starter Kit! Pretty awesome!
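
    Detecting alpha this way amounts to looking for extra spectral power near 10Hz in the recorded samples. Here’s a minimal sketch of that check (not OpenBCI code; it assumes raw samples in a NumPy array at the board’s default 250Hz sample rate):

```python
import numpy as np

def alpha_band_power(samples, fs=250.0, band=(8.0, 12.0)):
    """Return mean spectral power in the alpha band (8-12 Hz)."""
    samples = np.asarray(samples, dtype=float)
    samples = samples - samples.mean()              # remove DC offset
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].mean()

# Synthetic check: an eyes-closed signal with alpha should show far more
# 8-12 Hz power than broadband noise alone.
fs = 250.0
t = np.arange(0, 4, 1 / fs)                         # 4 seconds of data
eyes_open = np.random.randn(len(t)) * 5             # noise only (synthetic)
eyes_closed = eyes_open + 20 * np.sin(2 * np.pi * 10 * t)  # add 10 Hz alpha
print(alpha_band_power(eyes_closed, fs) > alpha_band_power(eyes_open, fs))  # True
```

    With real data the comparison is between an eyes-closed window and an eyes-open baseline, which is essentially what the screenshots capture.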

    Here’s a screenshot of some very faint alpha (~10Hz) that I was able to pick up without any Ten20 paste applied to the electrode, with an electrode placed over the O2 node of the 10-20 system!

    OpenBCI-2015-02-16_14-39-10

    And here’s a screenshot of some very vibrant alpha (~10Hz) that I was able to detect with Ten20 paste applied to the 3D-printed electrode!

    OpenBCI-2015-02-16_17-27-57

    The signal looks pretty good. Joel may begin experimenting with an active amplification hardware design that works with any 3D-printed snappy electrode.

    In case you’re interested in printing your own, here’s a link to the github repo with the latest design of the electrode!

    More on this coming soon!

  • OpenBCI Graphical User Interface (GUI) (12/3/2014)
    PowerUpBoard

    [Image 1] — The OpenBCI Board (with which the OpenBCI GUI interfaces)

    Over the course of the late summer and early fall I worked extensively on the OpenBCI Graphical User Interface (GUI). The first version of the application, as seen in [Image 2] below, was developed by Chip Audette, who is one of the biggest OpenBCI contributors and runs the amazing blog EEG Hacker. The GUI is developed in Processing, a Java-based creative coding framework.

    OpenBCI-2014-09-20_13-04-02

    [Image 2] OpenBCI GUI – Version 1

    I worked on:

    • [Image 3] updating the design & user experience (w/ the help of Agustina Jacobi)
    • [Image 4] adding a UI controller to manage the system state (initial hardware settings, startup, live data streaming mode, playback mode, synthetic data mode, etc.)
    • [Image 5] adding a UI controller to manage OpenBCI board channels settings
    • the startup protocol for establishing a connection between the OpenBCI GUI and the OpenBCI Board
    • a collapsible window for adding and testing new features, called the “Developer Playground”
    • a widget at the bottom of the application that gives feedback to the user about what the system is doing
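
    The system-state controller above is essentially a small state machine with guarded transitions between modes. A hypothetical sketch of the idea (the real GUI is written in Processing/Java; these names and the class are illustrative, not the GUI’s actual code):

```python
# Illustrative state machine for a data-acquisition GUI. The states mirror
# the modes described above; the names are made up for this sketch.
LIVE, PLAYBACK, SYNTHETIC, STOPPED = "live", "playback", "synthetic", "stopped"

class SystemState:
    # Guarded transitions: a stream must be stopped before switching modes.
    TRANSITIONS = {
        STOPPED: {LIVE, PLAYBACK, SYNTHETIC},
        LIVE: {STOPPED},
        PLAYBACK: {STOPPED},
        SYNTHETIC: {STOPPED},
    }

    def __init__(self):
        self.state = STOPPED

    def transition(self, target):
        if target not in self.TRANSITIONS[self.state]:
            raise ValueError(f"cannot go from {self.state} to {target}")
        self.state = target

ui = SystemState()
ui.transition(LIVE)       # start live data streaming
ui.transition(STOPPED)    # stop the stream
ui.transition(SYNTHETIC)  # switch to synthetic data mode
```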

    [Image 3] — OpenBCI GUI – Version 2

    [Image 4] — UI controller to manage the system state

    Screen Shot 2015-02-17 at 3.27.52 PM

    [Image 5] — UI controller to manage OpenBCI board channels settings

    To download the latest version of the OpenBCI GUI, check out the following Github repo! Don’t hesitate to fork it, make improvements, and try out new features in the developer playground. For more information on how to get up-and-running with the OpenBCI board, check out the following getting started guide on the OpenBCI website.

  • [Make Magazine] OpenBCI: Rise of the Brain-Computer Interface (11/1/2014)

    I wrote the following article which was published in Volume 41 of Make Magazine!

    Conor wears an early prototype of the OpenBCI 3D-printable EEG Headset.

    This article first appeared in Make: Volume 41.

    During this summer’s Digital Revolution exhibition at London’s Barbican Centre, a small brainwave-influenced game sat sandwiched between Lady Gaga’s Haus of Gaga and Google’s DevArt booth. It was Not Impossible Labs’ Brainwriter installation, which combined Tobii eye tracking and an OpenBCI electroencephalography (EEG) device to allow players to shoot laser beams at virtual robots with just eye movement and brain waves. “Whoa, this is the future,” exclaimed one participant.

    But the Brainwriter is designed for far more than just games. It’s an early attempt at using Brain-Computer Interface technology to create a comprehensive communication system for patients with ALS and other neurodegenerative disorders, which inhibit motor function and the ability to speak.

    render2

    The brain is one of the final frontiers of human discovery. Each day it gets easier to leverage technology to expand the capabilities of that squishy thing inside our heads. Real-world BCI will be vital in reverse-engineering and further understanding the human brain.

    Though BCI is in an embryonic state — with a definition that evolves by the day — it’s typically a system that enables direct communication between a brain and a computer, and one that will inevitably have a major impact on the future of humanity. BCIs encompass a wide range of technologies that vary in invasiveness, ease of use, functionality, cost, and real-world practicality. They include fMRI, cochlear implants, and EEG. Historically, these technologies have been used solely in medicine and research, but recently there’s been a major shift: As the technology becomes smaller, cheaper, and woven into the fabric of everyday life, many innovators are searching for real-world applications outside of medicine. It’s already happening, and it’s often driven by makers.

    OpenBCI 3D-printed EEG headset prototypes.

    The field is expanding at an astounding rate. I learned about it two and a half years ago, and it quickly turned into an obsession. I found myself daydreaming about the amazing implications of using nothing more than my mind to communicate with a machine. I thought about my grandma who was suffering from a neurodegenerative disorder and how BCIs might allow her to speak again. I thought about my best friend who had just suffered a severe neck injury and how BCIs might allow him to walk again. I thought about the vagueness of attention disorders, and how BCIs might lead to complementary or even supplementary treatments, replacing overprescribed and addictive medications.

    I went on to found OpenBCI with Joel Murphy as a way to offer access to every aspect of the BCI design and to present that information in an organized, collaborative, and educational way. I’m not the only one who sees the potential of this amazing new technology. But creating a practical, real-world BCI is an immense challenge — as the incredibly talented Murphy, who designed the hardware, says, “This stuff is really, really hard.” Many have attempted it but none have fully succeeded. It will take a community effort to achieve the technology’s potential while maintaining ethical design constraints. (It’s not hard to fathom a few not-too-far-off dystopian scenarios in which BCIs are used for the wrong reasons.)

    Russomanno (left) and Murphy demonstrate how to get started with OpenBCI.

    Of the many types of BCIs, EEG has recently emerged as the frontrunner in the commercial and DIY spaces, partly because it is minimally invasive and easily translated into signals that a computer can interpret. After all, computers are complex electrical systems, and EEG is the sampling of electrical signals from the scalp. Simply put, EEG is the best way to get our brains and our computers speaking the same language.

    EEG has existed for almost a hundred years and is most commonly used to diagnose epilepsy. In recent years, two companies, NeuroSky and Emotiv, have attempted to transplant EEG into the consumer industry. NeuroSky built the MindWave, a simplified single-sensor system and the cheapest commercial EEG device on the market — and in doing so made EEG accessible to everyone and piqued the interest of many early BCI enthusiasts, myself included. Emotiv created the EPOC, a higher channel-count system that split the difference between NeuroSky and research-grade EEG with regard to both cost and signal quality. While these devices have opened up BCI to innovators, there’s still a huge void waiting to be filled by those of us who like to explore the inner workings of our gadgets.

    Grant_using_OpenBCI

    UCSD researcher Grant Vousden-Dishington, working with OpenBCI at NeuroGaming 2014.

    With OpenBCI, we wanted to create a powerful, customizable tool that would enable innovators with varied backgrounds and skill levels to collaborate on the countless subchallenges of interfacing the brain and body. We came up with a board based on the Arduino electronics prototyping platform, with an integrated, programmable microcontroller and 16 sensor inputs that can pick up any electrical signals emitted from the body — including brain activity, muscle activity, and heart rate. And it can all be mounted onto the first-ever 3D-printable EEG headset.

    In the next 5 to 10 years we will see more widespread use of BCIs, from thought-controlled keyboards and mice to wheelchairs to new-age, immersive video games that respond to biosignals. Some of these systems already exist, though there’s a lot of work left before they become mainstream applications.

    The latest version of the OpenBCI board.

    This summer something really amazing is happening: Commercially available devices for interfacing the brain are popping up everywhere. In 2013, more than 10,000 commercial and do-it-yourself EEG systems were claimed through various crowdfunded projects. Most of those devices only recently started shipping. In addition to OpenBCI, Emotiv’s new headset Insight, the Melon Headband, and the InteraXon Muse are available on preorder. As a result, countless amazing — and maybe even practical — implementations of the BCI are going to start materializing in the latter half of 2014 and into 2015. But BCIs are still nascent. Despite big claims and big potential, they’re not ready; we still need makers, who’ll hack and build and experiment, to use them to change the world.

  • 3D printed EEG Headset (aka “Spiderclaw” V1) (12/17/2013)

    The following images are a series of sketches, screenshots, and photographs documenting my design process in the creation of the OpenBCI Spiderclaw (version 1). For additional information on the further development of the Spiderclaw, refer to the OpenBCI Docs Headware section and my post on Spiderclaw (version 2). If you want to download the .STL files to print them yourself or work with the Maya file, you can get them from the OpenBCI Spiderclaw Github repo. Also, if 3D printed EEG equipment excites you, check out my post on 3D printable EEG electrodes!

    10-20 System (Scientific Design Constraint)

    Concept Sketches

    3D Modeling (in AutoDesk Maya)

    3D Printing & Assembly

    Future Plans

    Headset_Interface

  • ROB3115 – A Neuro-Immersive Narrative (8/12/2013)

    In-experience screenshot

    ROB3115 is an interactive graphic novel that is influenced by the reader’s brainwaves. The experience is driven by the reader’s ability to cognitively engage with the story. ROB3115’s narrative and its fundamental interactive mechanic – the reader’s ability to focus – are tightly intertwined by virtue of a philosophical supposition linking consciousness with attention.

    ROB3115 explores the intersection of interactive narrative, visual storytelling, and brain-computer interfacing. The experience, designed for an individual, puts the reader in the shoes of a highly intelligent artificial being that begins to perceive a sense of consciousness. By using a NeuroSky brainwave sensor, the reader’s brain activity directly affects the internal dialogue of the main character, in turn, dictating the outcome of his series of psychosomatic realizations. The system is an adaptation of the traditional choose-your-own-adventure. However, instead of actively making decisions at critical points in the narrative, the reader subconsciously affects the story via their level of cognitive engagement. This piece makes use of new media devices while, at the same time, commenting on the seemingly inevitable implications of their introduction into society.

    This project was my thesis in graduating from Parsons with an M.F.A. in Design & Technology.

  • Charcoal Mike (4/28/2013)

    It was my girlfriend’s birthday and she really likes Michael Jackson. I think this is the best charcoal I’ve ever done. 🙂

    michael

  • Dot – Graphic Novel Character Design (4/1/2013)

    Dot is one of the main characters in a sci-fi graphic novel that I’ve been working on as a side project. The story largely inspired my thesis, ROB3115, a graphic short story about a robot. The piece is interactive and is affected in real time by the reader’s brainwaves.

    Reel_illustrations_2

  • Brain Interface Lab (3/29/2013)

    I recently founded the Brain Interface Lab with some colleagues from Parsons MFA Design & Technology and Columbia University. The lab is dedicated to supporting open-source software and hardware development of brain-computer interfaces. Check out our website and all of the awesome stuff that was created during our first big event, Hack-A-Brain.

  • audioBuzzers – Audio Visualizer (Unity) (3/6/2013)

    Summary

    This is a Unity-built audio visualizer of the song Major Tom, covered by the Shiny Toy Guns.

    Project Files

    The Web Player: http://a.parsons.edu/~russc171/UnityHW/AudioBuzzers_2/AudioBuzzers_2.html

    The Unity Project: http://a.parsons.edu/~russc171/UnityHW/hw_wk5_audioBuzzers.zip

    Screenshot

    Screen Shot 2013-03-06 at 5.11.50 PM

  • Demo Reel (3/6/2013)

    DEMO REEL BREAKDOWN

    DRB

  • Plasma Ball Concentration Game (openFrameworks + Neurosky’s EEG Mindset) (12/21/2012)

    Project Summary

    This project relates to the brain-computer interface work I’ve been doing for my thesis. Since I will soon be creating generative animations that respond to brain activity as part of a digital graphic novel, I wanted to prototype a visually complex animation dependent on a person’s brain activity. This project was written in openFrameworks and uses a NeuroSky MindSet to link a player’s attention level to the intensity of electricity being generated from a sphere in the middle of the screen. The meat of the code is a recursive function that creates individual lightning strikes at a frequency inversely proportional to the attention parameter calculated by the NeuroSky EEG headset. The project was visually inspired by the Tesla coil and those cool electricity lamps that were really popular in the 90s (see below).
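
    Recursive lightning like this is commonly built with midpoint displacement: split a segment at a randomly offset midpoint and recurse with half the offset. The original is written in openFrameworks/C++; this Python sketch only illustrates the technique and is not the project’s actual code:

```python
import random

def lightning(start, end, displace=40.0):
    """Recursively build a jagged lightning path via midpoint displacement.

    Each call splits the segment at a randomly offset midpoint and recurses
    with half the displacement, doubling the level of detail each time.
    """
    if displace < 2.0:                      # base case: offsets negligible
        return [start, end]
    (x1, y1), (x2, y2) = start, end
    mid = ((x1 + x2) / 2 + random.uniform(-displace, displace),
           (y1 + y2) / 2 + random.uniform(-displace, displace))
    left = lightning(start, mid, displace / 2)
    right = lightning(mid, end, displace / 2)
    return left[:-1] + right                # drop the duplicated midpoint

bolt = lightning((0.0, 0.0), (200.0, 0.0))
print(len(bolt))  # 33 points with the default starting displacement
```

    Redrawing bolts like this many times per second, at a rate tied to the attention value, gives the flickering plasma-ball effect.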

    Once the connection between the Neurosky headset and the user’s computer has strong connectivity, the user can press the ‘b’ key (for brain) to link their EEG with the plasma ball. At any point the user can press the ‘g’ key (for graph) to see a HUD that displays a bar graph of their attention value on a scale from 0-100. The graph also shows the connectivity value of the device and the average attention value, calculated over the previous 5 seconds, being used to dictate the frequency of the electricity.
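
    The 5-second attention average is just a rolling mean over the headset’s roughly once-per-second attention updates. A hypothetical sketch (the class and update rate are assumptions, not the project’s actual code):

```python
from collections import deque

class AttentionSmoother:
    """Rolling average of NeuroSky attention values (0-100) over a window."""
    def __init__(self, window_seconds=5, updates_per_second=1):
        # NeuroSky's attention value updates roughly once per second.
        self.values = deque(maxlen=window_seconds * updates_per_second)

    def update(self, attention):
        self.values.append(max(0, min(100, attention)))  # clamp to 0-100
        return sum(self.values) / len(self.values)

smoother = AttentionSmoother()
for reading in (40, 60, 80):
    avg = smoother.update(reading)
print(avg)  # 60.0
```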

    In order to get this application working on your computer, you must first download and install the NeuroSky ThinkGear Connector. You should be able to get it working with any Bluetooth-enabled NeuroSky device; I’ve documented how to do so in the readme file on my GitHub. You can get my code for the project on my GitHub page here: https://github.com/crussoma/conorRussomanno_algo2012/tree/master/Conors_Final

    Also, if you just want to see the recursive electricity code working independent of a person’s EEG, download and install the app lightningBall (not lightnightBall_brain) from my github.

    Project Video

    To see this project in action check out my demo reel and jump to 35s.

    Visual Inspiration

    lightnightBall

    lightnightLamp

    Screenshots

    Screen Shot 2013-03-06 at 5.29.23 PM

    Screen Shot 2013-03-06 at 5.29.00 PM

    References

    My code uses some of the logic and algorithms from Esteban Hufstedler’s Processing sketch: http://www.openprocessing.org/sketch/2924

    Additionally, a big shout-out to Akira Hayasaka for writing the NeuroSky openFrameworks addon that I used to pull this off: https://github.com/Akira-Hayasaka/ofxThinkGear

  • ‘Wetlands’ Architectural Renders (12/20/2012)

    Project Summary

    I spent the past 6 weeks working with the amazing and progressive artist Mary Mattingly on her project titled Wetlands. Most of her work explores the complex relationship between people and the Earth. Wetlands, currently in the design phase, is a self-sustained living environment that floats in the rivers outside of Philadelphia. The structure will be a low-cost floating barge with various components that explore DIY techniques of sustainability.

    My Role

    I worked with two other artists to create an architectural design for the structure that satisfied its functional and design constraints. I helped with the concept drawings and took the lead on creating 3D renders of the design.

    Renders

    Project Presentation PDF: wetlands

  • Please Vote For An Awesome EEG Project! (10/26/2012)

    Please take 10 seconds to vote for my New Challenge application:

    Despite being rather silent on this blog recently, I’ve actually been quite busy. My ongoing thesis at Parsons MFA Design & Technology is an exploration of practical applications of wearable brain-computer interfaces. More on that to come.

    Recently, some fellow designers, engineers, and researchers and I applied for an award of up to $10K to explore whether wearable BCIs could be used to find complementary or alternative solutions for people suffering from attention disorders such as ADHD. If you support this cause, please click on the image above or the following link and click the “vote” button. You could comment here, but it would be better to comment on the application page itself, to show the judges that people truly do care about this cause.

    The application is as follows:

    Project Title: Brain Design Lab – Finding Alternative Approaches to Addressing ADHD

    People Involved:

    • Conor Russomanno (Director) – Conor is currently a second-year student in Parsons School of Design’s MFA Design & Technology program. He did his undergraduate degree in engineering at Columbia University and has been working with brain-computer interfaces for the past year. Check out his website at conorrussomanno.me.
    • Kristen Kersh – Candidate for an MFA in Design & Technology at Parsons School of Design; holds a master’s in Neuroscience and Education from Harvard University
    • James Ramadan – Holds dual majors in biology and statistics from the University of Virginia; currently does research in statistical analysis of quantitative EEG
    • Amy Burns – Award-winning reporter who has spent more than 17 years in the multimedia industry, covering a diverse range of topics through the written word, social media, and the power of video
    • Other members of the Brain Design Lab (our website, braindesignlab.com, is currently being built)

    The Problem

    Our brains are dependent on the stimuli provided by our environment. Neuroplasticity is the notion that our neurons can be molded and repurposed based on our experiences, even after critical stages of development. Currently, elite academic institutions such as Harvard, Columbia, and MIT are using functional magnetic resonance imaging (fMRI), magnetic resonance imaging (MRI), and electroencephalography (EEG) to research the brain’s ability to develop and change in response to stimuli. These studies have produced important findings with regard to a wide range of neurological diseases, traumatic brain injuries, and learning. In turn, these findings are being translated and applied to improved techniques in medicine, therapy, and education.

    One of the main shortcomings of interfacing the brain is the difficulty of attaining data outside the confines of a laboratory setting. There are very few studies done with a patient in the context of their normal environment, looking at how their home and the things they eat, smell, see, hear, and touch affect the activity within their brain. Understandably, this is a very large challenge to address. If we are honored with receiving funds from the New Challenge competition, we intend to contribute to this pervasive challenge by addressing the issues of one of its subcommunities: people suffering from attention disorders that affect their ability to focus and learn.

    In 2007, the Centers for Disease Control and Prevention reported that 8.4% of American children aged 3-17 had at some point been diagnosed with ADHD. Roughly 50% of children with attention disorders continue to experience issues as they progress into adulthood, and almost 60% of people diagnosed with these disorders are prescribed medication in an attempt to address the symptoms. It is vital that researchers continue to explore alternative and complementary methods for addressing attention-related disorders, and do not rely entirely on prescription medication to resolve the issue. Additionally, we believe that solutions to these problems have the potential to extend beyond the scope of individuals diagnosed with ADHD, and could be used by undiagnosed individuals trying to enhance their level of focus, learning ability, and productivity. It is this ubiquitous issue that we intend to examine.

    Our Solution

    To address this problem, my team of designers, engineers, and researchers has come together to found the Brain Design Laboratory (BDL). The goal of this community is to design, build, test, and rebuild non-invasive neurofeedback platforms that allow users to record environmental conditions over prolonged periods of time, while simultaneously tracking brain activity. In order to explore alternative techniques to addressing ADHD, we want to analyze the data that is recorded by these systems.

    The systems will consist of a non-invasive headset that wirelessly sends brainwave data to a mobile phone and a central server, as well as a mobile application that tracks environmental stimuli both actively and passively. Passive stimuli will include variables such as location, noise, and movement, captured via GPS, audio inputs, and accelerometers. Actively recorded stimuli will include variables such as diet, activities, and moods, and will be input manually by the user. We believe this system will provide invaluable insight into how environmental stimuli correlate with variations in levels of attention. We will reach out to find user groups willing to test the platform. Eventually we hope to provide real-time feedback to the user about how their environment is affecting their level of attention.
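
    The passive/active split described above maps naturally onto a single time-stamped record pairing EEG with context. A hypothetical sketch of such a record (all field names are assumptions for illustration, not an implemented schema):

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class StimulusSample:
    """One time-stamped sample pairing brainwave data with its context."""
    timestamp: float
    eeg_uv: List[float]                                # raw EEG window
    # Passive stimuli: captured automatically by phone sensors.
    gps: Optional[Tuple[float, float]] = None          # (lat, lon)
    noise_db: Optional[float] = None
    accel: Optional[Tuple[float, float, float]] = None  # (x, y, z)
    # Active stimuli: entered manually by the user.
    diet: List[str] = field(default_factory=list)
    activities: List[str] = field(default_factory=list)
    mood: Optional[str] = None

sample = StimulusSample(timestamp=0.0, eeg_uv=[1.2, -0.4],
                        noise_db=42.0, mood="focused")
```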

    Currently, commercial EEG is used primarily for stationary recording and interaction, and does not serve as a good system for prolonged recording of brain activity. Some of the major shortcomings include comfort and attention to aesthetics. We believe that our diverse team of designers and engineers, with experience in neuroscience, electrical engineering, and fashionable technology, can provide a new outlook on these problems, creating a system that is both wearable and functional. Lastly, we don’t want to just build technology; we strive to turn BDL into an open community of designers, researchers, patients, parents, and other organizations dealing with this problem.

    Rough Budget

    Item – Cost – Rationale

    • 20× NeuroSky ThinkGear chips – $35 each – The NeuroSky ThinkGear chips (http://neurosky.com/Business/ThinkGearChipsets.aspx) are commercial
    • Electronics – $2,000 – Bluetooth modules, Android testing platforms, electrodes, wires
    • Materials – $500 – Garments, materials, and accessories for designing and building wearable devices, including fabric, sewing equipment, hats, etc.
    • Website – $500 – We will use this money to establish our validity as an organization so that we can reach out to potential user groups for testing
    • Contingency – $1,000 – Miscellaneous expenditures

    Our Qualifications

    I first began trying to address this issue last spring when I designed and built a baseball cap with a sensor for recording brainwaves. To accompany the hat, I developed an Android application that received and recorded the user’s EEG, allowing for retroactive analysis of the data. The application also allowed the user to record a variety of moods and daily activities, the intention being to use quantitative brain activity to find new correlations between the two. For more information about the project refer to: http://conorrussomanno.me/2012/06/19/interactive-android-application-for-eeg-biofeedback/

    This fall, with the support of Sven Travis, former dean of the Art, Media and Technology department at Parsons The New School for Design, I founded the Brain Design Lab (BDL), a community focused on finding practical applications for brain-computer interfaces. Since its inception the community has grown and now has members both inside and outside of The New School. Some of BDL’s most prominent members include a recent graduate of Harvard’s Neuroscience and Education M.S. program, a University of Virginia graduate with a double major in biology and statistics, an award-winning journalist whose son suffers from an incredibly rare, undiagnosed neurological disorder, and a graduate of Columbia University’s engineering program.

    Recently we received $1,500 from the New School Student Activities Finance Committee to host a development jam titled Hack-A-Brain. The goal of the event is to explore the potential of various front-line commercial EEG devices while introducing New School students to the emerging industry of brain-computer interfacing (BCI). The Brain Design Lab has already connected with a number of individuals and organizations involved in the industry. Now we are looking to find additional support, make new connections, and apply novel design techniques to problems related to the brain. We want to start by building user feedback applications for addressing attention disorders such as ADHD.


  • ABC No Rio – An Illustrated Short Story Prototype (10/20/2012)

    I collaborated with two other artists, Tharit Firm Tothong and Giselle Wynn, on the creation of this illustrated short story for a class project. The piece pays tribute to ABC No Rio, an art gallery and concert space in the Lower East Side that has been in operation since the early 80s and was very politically active during the late 80s and early 90s, acting as a sanctuary for society’s misfit demographics and taking a strong stance against NYC’s heavy gentrification at the time. The piece imagines a fictional narrative from the point of view of a poor musician living in the slums of an overpopulated and depressed urban setting.

    The piece is composed of unpolished illustrations, done by myself and Giselle, as well as a collection of photographs of authentic artwork from within the walls of ABC No Rio itself, taken by Firm. Firm also oversaw the design and layout of the composition.

  • Bull’s Eye – Hand-drawn Animation (9/15/2012)

    This hand-drawn animation is of an archer readying and firing his bow:

  • Futuristic Flyover (8/26/2012)

  • The Locket, Directed by Carillon Hepburn (8/19/2012)

    This amazing short film was written, directed, edited, and starred in by my inspiring little sister, Carillon Smith (aka Carillon Hepburn). She did it all in just three weeks, during a summer film intensive at Virginia Commonwealth University. The flick is a dynamic, mysterious, and gripping drama that touches on themes of teen passion and self-discovery.

    Video: http://www.youtube.com/watch?v=bH8suw2kXi4

  • Not So New News! (8/19/2012)

    Just found out that my friend Jeremy and I made it into the New School newspaper last May, after being asked what our plans were for the summer! It’s amusing to compare a prior perception of a future achievement to an ex post facto critique of the same. The “graphic novel” didn’t get written/illustrated, but I’d say that it’s underway. Additionally, it’s looking more and more likely that the Brain Cap will be the primary inspiration for my upcoming thesis. Though I didn’t do everything I wanted to this summer, I had many unexpected successes. I think it’s important to have a plan, but just as important to be willing to deviate from it.

  • Rooftop Chilling w/ Shades (7/25/2012)

    “Life moves pretty fast. If you don’t stop and look around once in a while, you could miss it.” – Ferris Bueller

    And sometimes it looks better with sunglasses on. 🙂

  • Subway Paternity (7/24/2012)

    I took this photo on the way home from class one night as I was riding the 4/5 train south from Union Square. I couldn’t resist stealthily snapping this shot of father and son. After taking the picture, a woman on the other side of me smiled so widely I could see her from the corner of my eye. I turned to her and immediately knew this was the mother of the child. After fumbling over my words, I finally said, “I couldn’t help it; look at them!”

    “Thankfully, I get to every day,” she replied.

Talks & Workshops

Contact
