SPLIT BRAIN ROBOTICS
Audience members are invited to try to control two giant, scary robots with their brainwaves! It won’t be easy, but can you make them kiss? Fight? When two audience members accomplish this feat, everyone is treated to a show of lasers, fog, sound, video, and other cool chaos.
Performances: Friday, April 7 through Sunday, April 9, 2017, 8:00 P.M.
Sunday April 9 Matinee 1:00 P.M.
The Lab 2948 16th St.
San Francisco, CA 94103
email@example.com @thelabsf / #thelabsf
The Lab is wheelchair accessible and is located 1/2 block from the 16th and Mission BART.
Video of the Project:
Live streaming brain data runs the two robots! Volunteers’ (your!) thoughts are brought to life through robotic actions.
This performative installation is a collaborative project of Kal Spelletich (artist, robot maker), Masahiro Kahata (brainwave control hardware/software), and Mitch Altman (integration hardware and software).
Spelletich, Kahata, and Altman are interested in exploring and critiquing the promise that new ‘advances’ in technology will solve our problems and fulfill our dreams. They ask:
Can we translate human thoughts and emotions into robot actions?
Can this demonstrate new ways to instill consciousness? Mindfulness?
Can we create empathy for a robot?
Can a robot respond to, represent or convey one’s inner desires?
What are the promises, benefits, and dangers of technology?
SOME VIDEO LINKS:
Stop Motion 24 sec.
Mitch Altman is a San Francisco-based hacker and inventor, best known for inventing the TV-B-Gone remote control, a keychain that turns off TVs in public places. He was also co-founder of 3ware (a successful Silicon Valley startup) and did pioneering work in Virtual Reality at VPL Research in the mid-1980s. He has contributed to MAKE Magazine and other magazines. For several years he has traveled around the world giving talks and leading workshops, teaching people to make cool things with microcontrollers and teaching everyone to solder. He promotes hackerspaces and open source hardware wherever he goes. He is a co-founder of Noisebridge hackerspace, and is President and CEO of Cornfield Electronics.
Masahiro Kahata began making electrical systems with vacuum tubes over 50 years ago in Sapporo, Japan. In the 1960s, he started to use transistors and ICs. In the 1970s, at Psychic Lab, he built custom stereo and PA systems with op-amps, analog and digital I/O systems with LEDs, and more. He became more interested in UFOs and psychotronics than electronics – psychotronics is the interdisciplinary study of the interaction of matter, energy, and consciousness. In 1983, he developed a brainwave-interactive I/O system with a Lisa computer (the first consumer computer with a graphical human interface) and other computers such as the Apple II, CP/M, and Rockwell machines. Since 2002, Kahata has developed a new coherence function in the IBVA, which shows visually how to synchronize the two sides of the brain. In October 2014, Kahata created the Brain-Duino MetaVolutiON project, an open source EEG system, with Metamind in Berlin, Germany.
Kal Spelletich explores the interface of humans and robots, using technology to put people back in touch with real-life experiences. His work is interactive, requiring participants to enter or operate his pieces, often against their instincts of self-preservation. He probes the boundaries between fear, control, and exhilaration. He has exhibited at the Catharine Clark Gallery, Gallery Maeght, the de Young Museum, and SFMOMA, all in San Francisco, as well as at galleries and events all over the world. He was awarded San Francisco Arts Commission funding for 2016.
The artist would like to thank Jonathan Foote, Willi Döring, Silver Kuusik, and Robert Langer for their help with this project. Kal Spelletich: Split Brain Robotics is supported by generous grants from the San Francisco Arts Commission and The Zellerbach Foundation.
-a robot brain meld using biocompatible sensors, questioning the very technology that purports to save us.
An interactive, audience-participatory performance installation with two large, identical robots that extend from 8 to 16 feet tall, each controlled by the left- or right-side brainwaves of audience participants. The idea is for volunteers to use their brainwaves to make the two robots move, collaborate, interact with each other, and even kiss! When they do “correctly” interact, symbolic and metaphoric events will happen, activating lights, fog, other robots, sounds, and fire.
THIS PROJECT HAS BEEN FUNDED BY THE SAN FRANCISCO ARTS COMMISSION.
INDIVIDUAL ARTIST COMMISSIONS
AND: The Zellerbach Foundation!
A TALK MITCH ALTMAN AND I ARE GIVING ON THIS PROJECT:
The performance will consist of 4 acts.
- Pre-show with audience activated prototype robots.
- Artist’s discussion and demonstration.
- Break with student robots serving snacks and drinks.
- Audience operating robots.
The performance will start with a pre-show incorporating two smaller prototype robots already built. They are operated by touch, heartbeat, and skin conductivity. They are 6 feet tall with 4 degrees of movement. Anyone can operate them. They will be spot-lit on a small stage. The audience will be able to walk up to them and operate them at will before and after the exhibit.
An observation and review of Split Brain Robotics by Piero Scaruffi
You travel around the world and you find people from all races and all education levels who marvel at the creativity and innovation of the Bay Area (better known abroad as “Silicon Valley”). But there’s a story of Silicon Valley that is rarely heard. For example, the first general-purpose robot was built in Silicon Valley in 1969 (Shakey, at SRI Intl), but Silicon Valley did very little in robotics until the late 2000s. Then suddenly Silicon Valley became the epicenter of a major revolution in robotics. Who kept robotics alive? You’d have a hard time finding a straight line if you only looked at the high-tech world. But look behind (below? above?) the tech world and you’d find the Survival Research Labs, Seemen, and many other deranged artists/inventors who built machines for some very unlikely kinetic sculpture. Kal Spelletich was and is one of them.
Or think of the hackerspace movement that is now sweeping the entire world. The USA (and the entire high-tech world) was plunging into the Great Recession in 2008. Investors were running away from high-tech startups. Some were talking about the death of the World Wide Web because of the distortion introduced by the narrow minds of the corporate world. The first hackerspaces defended an ideal of what technology should be and what it should do for ordinary people: not an emanation of Wall Street, but an emanation of the community. San Francisco pioneered this movement with Citizen Space in 2006 and Noisebridge in 2007. Mitch Altman was the co-founder of Noisebridge, and a veteran of Jaron Lanier’s virtual-reality startup VPL Research. These alternative hackerspaces kept the momentum going when the financial world was scared like a baby on a rollercoaster.
We could write a book titled “What saved Silicon Valley?” Those kinetic artists playing with robots and those hackerspaces sheltering the independent engineer represent more than an anarchic subculture and certainly more than an artistic movement: they represent the fusion of science and the humanities that gave us Athens, the Renaissance, the Enlightenment and the Victorian boom.
It is therefore a pleasure to find Spelletich collaborating with Altman and Japanese brain-wave pioneer Masahiro Kahata. Their project wants to plug into your brain, pick up your waves, split them between the right and left hemispheres, and use them to guide two robots, one representing your right brain (the creative brain) and the other representing your left brain (the rational brain). It would be easy to categorize this project as “three mad men and two robots”, and it would be literally true. But, as is often the case in the Bay Area, the three men and the two robots constitute a team conducting acute research into the human condition. Neuroscience is telling us that we are not one person but the result of the collaboration between at least two selves (and possibly more). Roger Sperry’s pupil Michael Gazzaniga has been the foremost authority in investigating the multi-brain architecture of the human brain, and every year there seem to be new details on how we are not who we think we are.
Are you sure that your two “you’s” collaborate? Each self cannot read the thoughts of the other self, but in the end the collaboration between these two selves is “you”: your daily behavior, your profound thoughts, your pain and your laughter. Far from being only a funny joke, this split-brain project can be a terribly serious investigation into the nature of… you.
Scientists have already been successful in designing devices that transmit a disabled person’s thought to a limb (typically an artificial arm). But which thoughts exactly are we transmitting? The thoughts of which of your brains?
How do the two robots that obey your two brains interact? Can “you” control their interaction and make them love each other?
“Split-Brain Robotics” will be the most interesting (and revealing) experiment of your life. Hmmm…. yours? Whose…?
By Piero Scaruffi
Piero Scaruffi is a cognitive scientist who was the manager of the Olivetti Artificial Intelligence Center in Silicon Valley, held visiting scholarships at several academic centers (notably Harvard and Stanford universities), lectured on three continents, and published several books on Artificial Intelligence and Cognitive Science.
- Artist’s discussion and demonstration.
The three artists will demonstrate the main robots for fifteen minutes, explaining how we made them, why, and what we hope to accomplish. The audience can be seated during the performance or stand on the sides.
A single volunteer’s left brain controls one robot arm; their right brain controls the second robot.
Alternatively, using two volunteers, one volunteer’s left brain controls one robot
and a second participant’s right brain controls the other. Can they make the robots kiss? Interact? Dance? Fight?
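In rough outline, driving one robot per hemisphere could be wired up as follows. This is a minimal sketch only, assuming a simple band-power estimate and a hypothetical calm-level threshold; none of these function names or numbers come from the project’s actual software.

```python
# Hypothetical sketch: map left/right-hemisphere EEG activity to two robots.
# All names and thresholds here are illustrative assumptions, not the
# project's real code.

def band_power(samples):
    """Mean squared amplitude of a window of EEG samples (a crude power estimate)."""
    return sum(s * s for s in samples) / len(samples)

def to_motor_command(power, calm_level=50.0):
    """Scale a power reading into a motor speed in [-1.0, 1.0] around a calm baseline."""
    speed = (power - calm_level) / calm_level
    return max(-1.0, min(1.0, speed))

def drive_robots(left_window, right_window):
    """Left-hemisphere signal drives robot A; right-hemisphere signal drives robot B."""
    cmd_a = to_motor_command(band_power(left_window))
    cmd_b = to_motor_command(band_power(right_window))
    return cmd_a, cmd_b
```

With two volunteers, the same function would simply be fed one hemisphere’s window from each person instead of both from one.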
There are two large, identical split-brain robots on a stage facing each other, spotlighted in a darkened space with fog swirling around them. They can move up and down, left and right, and forward and back. Each robot has a pair of “hands” on its tip. They will be 16 feet tall at their full extension.
At the foot of the stage, on a short riser, will be the electronics and a workstation the artists will operate. We will sit in front of each robot wearing the headset with EEG brainwave-monitoring electrodes on it.
Each robot is equipped with a microphone amplifying its movement sounds, giving it a “voice”. The sounds are modulated and played through a P.A., giving the robots loud, unique, ever-evolving voices.
The robots have lights, lasers, and video cameras on them. The two onboard live video feeds are projected on large screens behind each robot, catching glimpses of the audience and the volunteers operating them: a sort of robot’s-eye view of what each robot can see. The spotlight and laser on each robot further illuminate the space as they are activated. When the robots interact and, for instance, kiss, lights, lasers, fog, and explosions of sound will happen.
Break Time/Student Work 15 minutes.
Kal Spelletich has been making art and teaching art, technology, and robotics for 27 years in San Francisco. Several of his students’ works will be highlighted during the break. They will be “serving” coffee, snacks, and refreshments. These students of Kal’s have interpreted some of his ideas and run with them.
Audience operating robots 45 minutes.
The audience will be invited onstage to don the EEG headsets and operate the robots. At the foot of the stage, on a short riser, will be the electronics where audience members don the EEG headsets.
An audience member will sit in front of each robot wearing the headset with EEG brainwave-monitoring electrodes on it. People will feel the power of their thoughts moving a robot more than twice their size. They will have fun and be afraid. They will be performing live with the robots, dealing with stage fright. The robots will roar and whisper. They will lunge and barely move. They will intimidate with their all-seeing panopticon video eyes and spotlights.
There will be a spectrum of interactions with the volunteers who operate the robots and the live audience, sometimes meditative, sometimes raucous and rowdy.
Volunteers will be asked to peacefully meditate, and to think kind and even nasty thoughts. They will be prodded via small video monitors on a table in front of them displaying a wide spectrum of imagery.
There will be audience suggestions and questions to trigger people’s thoughts as they are hooked up to the robots: a dialogue between the audience, the artists, and the volunteers. All of this will affect the output of the volunteers’ brainwaves and the robots’ responses. For instance, if an audience member becomes agitated or excited, a feedback loop can begin, where the robots respond in an agitated and even aggressive way. Can the loop be stopped? Can mind master the machine? We will also have a reverse setting where the opposite happens: people are calm and relaxed and the robots are agitated and aggressive.
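The feedback loop and its reverse setting can be sketched abstractly. This is an illustrative toy model, not the installation’s control code: “agitation” is just a number derived from the EEG reading, and the coupling constant is an assumption standing in for how strongly the robot’s behavior feeds back into the volunteer.

```python
# Hypothetical sketch of the agitation feedback loop described above.
# In reverse mode the robots respond calmly to agitation and
# aggressively to calm.

def robot_response(agitation, reverse=False):
    """Map a volunteer's agitation level (0..1) to a robot intensity (0..1)."""
    level = max(0.0, min(1.0, agitation))
    return 1.0 - level if reverse else level

def feedback_step(agitation, robot_intensity, coupling=0.5, reverse=False):
    """One loop iteration: the robot's prior behavior feeds back into the volunteer.

    A coupling near 1.0 lets agitation escalate toward the robot's intensity;
    a calm volunteer (or the reverse setting) can damp the loop.
    """
    new_intensity = robot_response(agitation, reverse)
    # The audience member reacts partly to the robot's previous intensity.
    new_agitation = (1 - coupling) * agitation + coupling * robot_intensity
    return max(0.0, min(1.0, new_agitation)), new_intensity
```

Iterating `feedback_step` with `reverse=False` and high coupling drifts toward mutual escalation, which is exactly the loop the performance asks whether mind can stop.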
Physics does not change the nature of the world it studies, and no science of behavior can change the essential nature of man, even though both sciences yield technologies with a vast power to manipulate their subject matters.
– B.F. Skinner
We have two sides of our brain. The left side is said to be dominated by logic, analysis, and reasoning, like a scientist’s. The right side is said to be dominated by intuition and creativity, like an artist’s.
The two sides need to cooperate for us to function. There are theories that different cultures’ left and right sides operate differently, and that even different sexes respond uniquely.
Some scientists dispute that the brains are split this clearly.
We are hoping that this performance can explore whether the two sides of the brain actually define these thought and intellectual processes.
We are exploring new ways to use brain-wave reading technology in an experiment to improve people’s lives and exemplify the poetry of the mind. The robots are not programmed, they are responding live to human brain waves. Each robot will respond uniquely to each participant’s brainwave input signals. Robot actions will be unique to the individual or group sensed.
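One way each robot’s response could be made unique to each participant, sketched under assumptions (this is illustrative only, not the system’s actual method): record a short calibration baseline for every volunteer and normalize their live readings against it, so the same raw signal level moves the robot differently for different people.

```python
# Hypothetical per-participant calibration sketch (illustrative only):
# each volunteer's live EEG reading is normalized against their own
# baseline, so identical raw values drive the robot differently
# for different people.

class ParticipantProfile:
    def __init__(self, baseline_readings):
        n = len(baseline_readings)
        self.mean = sum(baseline_readings) / n
        variance = sum((r - self.mean) ** 2 for r in baseline_readings) / n
        # Guard against a perfectly flat baseline (std of zero).
        self.std = variance ** 0.5 or 1.0

    def normalize(self, reading):
        """Z-score of a live reading relative to this participant's calm baseline."""
        return (reading - self.mean) / self.std
```

A profile built during a quiet minute before the performance would then act as the “fingerprint” against which that volunteer’s live signal is interpreted.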
Left-right brain research is perhaps a start at understanding the way the brain divides learning tasks between verbal and visual, analytical and global, logical and creative.
For example, a person who is “left-brained” is often said to be more logical, analytical, and objective, while a person who is “right-brained” is said to be more intuitive, thoughtful, and subjective.
In psychology, the theory is based on what is known as the lateralization of brain function. So, does one side of the brain really control specific functions? Where does creative thinking come from? Today, neuroscientists know that the two sides of the brain work together to perform a wide variety of tasks and that the two hemispheres communicate through the corpus callosum.
The left hemisphere specializes in picking out the sounds that form words and working out the syntax of the words, for example, but it does not have a monopoly on language processing. The right hemisphere is actually more sensitive to the emotional features of language, tuning in to the slow rhythms of speech that carry intonation and stress.
Language tends to be on the left, attention more on the right.
Are people either left-brained or right-brained? Will genders, races, gay and straight people respond differently? Will an artist operate the robots differently than a scientist? Children as compared to adults? What about different sexual identities?
Can we build an interface to trigger robots that can read viewers’ auras, vibe, or character? Can a robot respond to one’s individuality? Can we meld two minds into one, creating a collaborative response? We are exploring new forms of interpersonal communication through touch, force-feedback technology, intimacy, social interaction via technology, machines and robots, fear, play, human-machine interaction, rites of passage, and empowerment.
While over-generalized and overstated by popular psychology and self-help texts, understanding your strengths and weaknesses in certain areas can help you develop better ways to learn and study. For example, students who have a difficult time following verbal instructions (often cited as a right-brain characteristic) might benefit from writing down directions, developing better organizational skills, and manually committing things to memory.
Historically, film has visited the concept of man and machine, from Metropolis to the opening scene of Modern Times, with Charlie Chaplin manically and hilariously attempting to keep up with an assembly line while he is prodded to quicken his pace. Consumed by his work, Chaplin becomes physically integrated into the machinery, his body twisting and turning within the conveyor belts and gears. Still imprinted with his work, Chaplin leaves the factory and screws the bolts on a fire hydrant. When the line is halted at lunchtime, his spastic twitching signals his disorientation and loss of control of his body. With Chaplin physically and cognitively consumed by his labor, he faces an existential, yet comic, crisis in its absence as he shifts from machine back to man.
Left-brainers brag about their math skills and right-brainers tout their creativity. The brain’s right hemisphere controls the muscles on the left side of the body, while the left hemisphere controls the muscles on the right side of the body. When you wink your right eye, that’s the left side of your brain at work. Because of this criss-cross wiring, damage to one side of the brain affects the opposite side of the body. The brain carefully balances and assigns control of certain functions to each side. It is nature’s way of ensuring that the brain ultimately splits up tasks to maximize efficiency. Most people are right-hand dominant, which is actually controlled by the left side of the brain. Brain asymmetry is essential for proper brain function. It allows the two sides of the brain to become specialized, increasing their processing capacity and avoiding situations of conflict where both sides of the brain try to take charge.
With this performance piece we are interested in exploring, honoring AND critiquing transhumanism and the promise that new ‘advances’ in information and communication technologies will solve our problems and fulfill our dreams — that all we need to do is update, upgrade and replace our devices. We tend to consider how technology and machines alter our bodies from the consumer’s end, as with our daily use of wearable or smart devices (watches, fitness trackers, phones, and more) and the more fantastic, cutting-edge neuroprosthetics and artificial organs. But, can we translate human emotions into robot actions?
Can this demonstrate to us a new way to instill consciousness?
Can we create empathy for a robot?
What are robots best at? What are humans best at? What about the grey area where these two things are evolving?
Is there a role for hybrid human machine systems?
Can a robot respond to, represent or convey one’s inner desires?
As an artist, I work with technology and the body, subjecting the audience to different real-life events, as opposed to virtual ones. This project will explore this theme further, and also look at prosthetic augmented bodies, machines with human-like social interactions, sociable robotics, hybrid human-machine systems, out-of-body/split-body experiences, and phantom limbs. What are the limits of how much a person is prepared to submit to external forces, and how far can s/he allow a machine to intrude on the body? Who is in control, the viewer or the viewed, man or the machine? I am inspired by Futurism and interactive performance art from the 1970s: Chris Burden, Piero Manzoni, Joseph Beuys, Bill Viola, Yves Klein, Jean Tinguely, Marcel Duchamp, Rebecca Horn, Edward Kienholz, Antennae Theater, Survival Research Laboratories, George Coates Theater, Leonardo da Vinci’s performances, and Laurie Anderson. I am also inspired by sensors and interfaces that can be used in home care, companion technology, entertainment, medical care, and spiritual and mental health.
I am exploring experiential and contemplative approaches to the understanding of technology. This work bridges the gap between humanity and technology. The development of a multi-sensor output could be applied in various fields including health care and entertainment. I believe that this work can provide life support for the planet.
This performance explores creativity by analyzing and harvesting brain waves and brain function. It is theatrical poetry with live humans and robots. It questions and critiques the role of technology in our lives and in the arts. Who is in charge, mankind or the machine?
I request funding to develop robots and EEG sensors that can essentially read an individual’s mental vibe, character, and energy, deriving a unique fingerprint from each volunteer.
This funding would help with the advancement of my art and allow further research in my field.
Economic constraints keep us from progressing to the degree that we envision and limit the scope of our work. With the requested funding, we would be able to create, exhibit and document the work as we envision it (thoroughly developed ideas, larger scale, more sophisticated and technologically advanced, better and more efficient equipment and materials).
To be frank, we feel stuck in terms of what we can further create in this manner. We have made a large body of work at a certain scale and technological level. Yet, to go further and to keep growing, some support is needed. This grant will bring our vision to fruition.
There is an inevitable divergence, attributable to the imperfections of the human mind, between the world as it is and the world as men perceive it. – J. William Fulbright