INTERFACING TO THE BRAIN
... yes, it is really happening!

by DR DAVID MADDISON

While interfacing to the human brain might seem the stuff of science fiction, there is much work being done in this area, as well as work on animals and insects. You can even do it yourself and it can have many practical aspects.

Science fiction is full of scenarios in which a person’s own brain is interfaced directly to a computer or a machine (or another person) and is used to interact with, or control, it.

Examples include the people in The Matrix trilogy, the Daleks in Dr Who and the Borg in Star Trek.
And who can forget that 1983 sci-fi film Brainstorm, the
whole theme of which was the development, use (and misuse) of a Brain-Computer Interface (BCI). Brainstorm can be viewed on YouTube at http://youtu.be/cOGAEAJ4xJE
In this article we will primarily focus on methods of interfacing the human brain with computers and machines,
so-called brain-computer interfaces or BCIs. Australia is
a world leader in the Cochlear implant but these devices
do not interface directly to the brain. Rather, they connect
to existing nerve fibres and are in the related category of
neuroprosthetics.
A brain-computer interface can be defined as a system
for reading information from the brain to enable control of
a machine or the transmission of communication or thought. It is also a system for feeding information into
the brain to enable the brain to interpret a sensation from
some external sensory device.
In other words, information is transmitted between the brain and a machine without the engagement of the usual senses, the peripheral nervous system or limbs.
Reading the brain
To interface the brain to a computer, information has
to be first read from the brain. There are several means by
which information can be acquired from a brain for the
purpose of brain-computer interfacing.
Electroencephalography (EEG) has the advantage that it
is relatively cheap and simple to do and can provide useful information in a clinical setting. It is also non-invasive and so is amenable to a wide variety of brain-computer interfacing techniques, provided useful information can be obtained. A distinct advantage is that changes in brain activity can be read very rapidly compared with slower methods that rely on a change of blood flow, such as functional magnetic resonance imaging (fMRI).
[Photo caption: An EEG headset, as used in a clinical setting. Worldwide, the location of EEG electrodes is standardised according to the so-called 10-20 system, whereby electrodes are positioned according to anatomical landmarks. Results from different researchers will therefore correspond to the same electrode locations (there are also higher-resolution electrode placement schemes, such as the 10-5 system). In clinical applications typically 19 electrodes are used, plus an earth and a system voltage reference. The voltages measured are of the order of microvolts and are amplified by 1,000 to 100,000 times.]

EEG also has a number of disadvantages. A scalp-recorded EEG represents a coarse measure of brain activity due to the poor electrical conduction and thickness of the skull and the consequent dispersion of electrical signals.
It only measures the collective excitation of large numbers
of neurons behaving in a synchronised manner that also
happen to be oriented in the correct direction to provide an
electrical signal that conducts toward the scalp. Individual
neurons or small groups of neurons cannot be read directly.
The EEG output consists of rhythmic signals in various
frequency ranges and also transient activity. Typically (but
not always) these rhythmic signals are classified in terms
of a number of frequency bands. These are usually Delta
(<4Hz), Theta (4-7Hz), Alpha (8-15Hz), Beta (16-31Hz),
Gamma (32Hz and above) and Mu (8-12Hz). Each band is associated with particular biological functions and activities in the brain.
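As a rough illustration of how these bands are extracted in software, the short Python sketch below estimates the power in each band from a block of EEG samples using a standard spectral estimate (Welch’s method, from the SciPy library). The sample rate and band edges are illustrative assumptions, not fixed requirements.

# Band-power estimate for the EEG bands listed above. A sketch only:
# 'samples' would come from a real EEG amplifier; here we synthesise
# a signal with a strong alpha rhythm to test the idea.
import numpy as np
from scipy.signal import welch

FS = 250  # assumed sample rate in Hz

BANDS = {
    "delta": (0.5, 4), "theta": (4, 7), "alpha": (8, 15),
    "beta": (16, 31), "gamma": (32, 45),  # upper edge limited by FS/2
}

def band_powers(samples, fs=FS):
    """Return the mean spectral power in each EEG band."""
    freqs, psd = welch(samples, fs=fs, nperseg=fs * 2)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

# Synthetic test: 10 seconds of noise plus a 10Hz (alpha) oscillation.
t = np.arange(0, 10, 1 / FS)
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(t.size)
print(band_powers(eeg))  # the alpha figure should dominate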
Electrocorticography
Electrocorticography (ECoG) is a form of EEG in which
the electrodes are placed on the surface of the brain (the cerebral cortex). It has the advantage that much higher spatial resolution can be obtained and much smaller groups of neurons can be sampled. Different types of electrodes
can also be used.
Of course, it has the distinct disadvantage that it is invasive and requires the skull to be opened. For the purposes of brain-computer interfacing it would only be done (at this point in time) for life-critical applications such as enabling a
quadriplegic to operate a robotic arm or wheelchair.
Electrical activity in the brain
The brain consists of specialised cells called neurons
and glial cells. The neurons are the cells responsible for
information processing while the glial cells mostly have
support roles.
Neurons are electrically active and can communicate
with other cells in the brain by a branched conducting
fibre called an axon that extends from the body of the cell
and which can communicate with many other nearby or
far away neurons. Neuron-to-neuron communication constitutes the essence of how the brain works.
The architecture of this connectivity between neurons is known as a neural network.

[Photo caption: Research conducted at the Brain Institute at the University of Utah showing three types of electrocortical arrays in simultaneous use. The numbered electrodes are part of an ECoG array sitting on the surface of a human brain, the green wires terminate in a micro-ECoG grid and the black square with the gold-coloured electrodes is a “Utah Electrode Array” (UEA), which has an even finer resolution than the micro-ECoG grid. In this work the electrodes are used to discover and remove areas of the brain responsible for epileptic seizures, but data read from such electrodes can also be used, for example, to convert speech-related brain signals into words, to control machinery such as a robot arm, a wheelchair or even an aircraft, or in any other application requiring a direct brain-computer interface. Note that while it is obviously an invasive procedure to have such electrodes implanted beneath the skull, these particular electrodes sit on the surface of the brain and do not penetrate it, where damage might be done to sensitive areas.]

[Diagram caption: Synaptic transmission of information, showing the neuron body (soma) and attached dendrites and axons. Information enters a neuron via a dendrite and leaves via an axon. Neurotransmitter molecules pass across the synaptic gap. Each electrical impulse will cause a connected neuron to be either excited or inhibited. The collective excitation or inhibition of very large numbers of neurons is what is detected in an EEG signal.]
The way axons transmit electrical signals is by means of
electrochemical pulses involving sodium and potassium
ions being transported in different directions through the
neural cell membrane. These electrochemical pulses are
known as action potentials and typically last less than one
millisecond and propagate at speeds of 1 to 100 metres
per second.
Some neurons are inactive most of the time while others may be constantly active and fire at a rate of 5 to 50
times per second. A neuron’s axon is connected to other
neurons via junctions called synapses which make contact
with another part of the neuron’s body called the dendrite.
There is a very high level of connectedness; each axon may have many thousands of synaptic connections to neurons or
possibly other cell types.
According to the latest estimates the human brain has an
average of 86 billion neurons and 100 trillion synapses. The
axons are the “wires” that connect most of the functional
elements of the brain with each other.
Once an electrical signal or action potential arrives at a
synapse, specialised chemicals known as neurotransmitters are released and these bind with the target neuron or
other cell.
Many different neurotransmitters exist (around 100 have
been identified so far) and can exert many different simple or complex influences on the target (or post-synaptic) neuron, but fundamentally each will cause the post-synaptic neuron to be either inhibited or excited.
As each neuron is connected to large numbers of other
neurons, the balance of inhibitory and excitatory signals it receives determines whether it fires and passes information on to the next neurons in the network, and so on.
Many of these synaptic junctions are dynamically reconfigurable by changing the nature of the signals that travel
through them and are thought to be involved in learning
and memory.
Since the connections are not “set in stone”, some reconfiguration of the brain is possible and this is the basis
of neuroplasticity, the ability of the brain to reconfigure
itself to compensate for damage.
This plasticity has only been seriously recognised in
recent years and suggests that electrode placement for the purpose of brain-computer interfacing is not especially critical: with sufficient training, the brain will eventually learn to control an interface no matter where (within reason) it is located.
Electrical signals in the brain or action potentials are
the way neurons communicate with each other. Action
potentials are subject to some basic but important rules.
Firstly, there is a minimum threshold voltage below which
no signal will be propagated along an axon, so electrical “noise” will not cause spurious signals to propagate.
Secondly, it is “all or nothing”; each action potential has
the same strength, independent of the strength of a stimulus.
Thirdly, there is a refractory period after the action potential
in which no further action potentials can be generated. This
helps ensure that the action potential propagates in only
one direction and not back to its point of origin.
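These three rules are captured neatly by the classic “leaky integrate-and-fire” model used in computational neuroscience. The Python sketch below is a toy simulation with illustrative parameter values, not measured biological ones, but it demonstrates the threshold, the identical spikes and the refractory period:

# Leaky integrate-and-fire neuron illustrating the three rules above.
DT = 0.1e-3        # simulation time step: 0.1ms
TAU = 10e-3        # membrane time constant (10ms)
V_REST = -70e-3    # resting potential, volts
V_THRESH = -55e-3  # rule 1: no spike below this threshold
T_REFRACT = 2e-3   # rule 3: no new spike during this window

def simulate(input_current, r_membrane=1e7):
    """Return spike times for a list of injected current samples."""
    v, spikes, refract_until = V_REST, [], -1.0
    for step, i_in in enumerate(input_current):
        t = step * DT
        if t < refract_until:
            v = V_REST                  # clamped while refractory
            continue
        # Leaky integration of the injected current.
        v += DT / TAU * (V_REST - v + r_membrane * i_in)
        if v >= V_THRESH:
            spikes.append(t)            # rule 2: every spike is identical;
            v = V_REST                  # only its timing carries information
            refract_until = t + T_REFRACT
    return spikes

# 100ms of constant 2nA input produces a regular spike train.
spikes = simulate([2e-9] * int(0.1 / DT))
print(len(spikes), "spikes in 100ms")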
Most people are familiar with the terms “grey matter”
and “white matter”. If one takes a cross-section of a human
brain, it will be seen that the outer layers are dark in colour
(grey) while the inner parts are light in colour (white). The
difference arises from the fact that axons are lighter in colour due to their insulating myelin sheaths while neurons
are darker in colour. These colour differences show that
the outer parts of the brain contain mostly neurons and
the inner parts of the brain contain mainly axons or the
“wiring” of the brain.
Non-invasive brain interface
While EEG and other methods can be used to read information from the brain, the information has to be meaningful
and somehow express the subject’s intent if they are to do
something useful like control a machine. Like any new
task, practice is necessary so that the appropriate synaptic connections can be strengthened in order to learn the
desired behaviour. The following methods describe ways
BCI devices can be controlled without intrusive implanted
electrodes.
An EEG signal can be influenced by imagined movements, and with biofeedback methods an individual can learn, over many training sessions, to alter the signal in a way that can be detected and used to drive a machine.
Silent vocalisation of words can also be sensed and used
to drive the interface.
The Steady State Visual Evoked Potential (SSVEP) is
a control system whereby a subject looks at one or more
flashing screens or symbols. The signal from the flash is
relatively easy to detect in an EEG signal and the intent
of the subject can be inferred from the frequency of the
flashing area they are looking at. It may be annoying for
people to use, however.
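As a sketch of how SSVEP detection can work in software, the Python fragment below simply asks which of the known flicker frequencies has the most power in the EEG spectrum. The candidate frequencies, command names and sample rate are invented for illustration:

# SSVEP detection sketch: the screen regions flash at known rates, and
# the attended rate shows up as a spectral peak in the EEG over the
# visual cortex.
import numpy as np

FS = 250                       # assumed EEG sample rate (Hz)
TARGETS = {7.0: "left", 9.0: "right", 11.0: "select"}  # flicker rates

def detect_target(eeg, fs=FS):
    """Return the command whose flicker frequency has the most power."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), 1 / fs)
    def power_at(f):
        # Sum power in a narrow window around f and its 2nd harmonic.
        win = lambda f0: spectrum[(freqs > f0 - 0.3) & (freqs < f0 + 0.3)].sum()
        return win(f) + win(2 * f)
    return TARGETS[max(TARGETS, key=power_at)]

# Synthetic 4-second trial: subject attends the 9Hz target.
t = np.arange(0, 4, 1 / FS)
eeg = np.sin(2 * np.pi * 9 * t) + 0.8 * np.random.randn(t.size)
print(detect_target(eeg))  # expect "right"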
The P300 wave (now known more specifically to consist of two waves, the P3a and P3b) is another way information can be read from the brain. These waves occur after a low-probability event is observed and recognised among a series of “standard” events. They are useful for brain-computer interfacing because they are relatively consistent across most people and their use can be learned with minimal training.
One example of using this brainwave for communication in the disabled is the use of a P300 matrix speller. A
test subject is presented with a 6x6 matrix of letters and
numbers and individual rows and columns are illuminated
in a pseudo-random manner. The subject selects a letter by
concentrating on the character they want and their P300
wave is detected at that time.
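The selection logic itself is straightforward, as this Python sketch suggests: average the EEG epochs recorded after each row and each column flash, then pick the row and column whose averages show the strongest deflection in the P300 time window. The matrix layout, sample rate and scoring window are illustrative assumptions:

# P300 matrix-speller scoring sketch. row_epochs[i] and col_epochs[j]
# are (n_flashes, n_samples) NumPy arrays of post-flash EEG recorded
# when row i or column j was illuminated.
import numpy as np

MATRIX = np.array([list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
                   list("STUVWX"), list("YZ1234"), list("56789_")])

def p300_score(epochs):
    """Mean amplitude 250-450ms after the flash (at 250 samples/s)."""
    avg = epochs.mean(axis=0)   # averaging suppresses unrelated activity
    return avg[62:112].mean()   # samples 62-112 span 250-450ms

def pick_character(row_epochs, col_epochs):
    row = max(range(6), key=lambda i: p300_score(row_epochs[i]))
    col = max(range(6), key=lambda j: p300_score(col_epochs[j]))
    return MATRIX[row, col]

Averaging over repeated flashes is what makes the scheme work: the P300 is small relative to the background EEG, but it is time-locked to the flash while the background activity is not.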
Using this method with a scalp EEG results in letter selection rates of 1.4 to 4.5 characters per minute. Peter Brunner and others increased this to 17 characters per minute in 2011 with an implanted 96-electrode
array. Hybrid systems have also been developed combining
the SSVEP mentioned above and the P300. See YouTube
video http://youtu.be/08GNE6OdNcs “Emotiv BCI2000
Video.mp4”.
Writing information to the brain
Mentioned above were several methods that could be
used to read information from the brain. It is also possible
to “write” information to the brain. This can be done via
implanted electrode arrays; transcranial magnetic stimulation (TMS), where a powerful magnetic field is pulsed through the skull; or focused ultrasound (FUS), where a focused ultrasound beam is transmitted through the skull.
All these methods excite groups of neurons within their
field of influence and cause them to fire.
The earliest experiments with interfacing animal brains
to machines happened in 1969. The experiment was by E.E.
Fetz at the University of Washington School of Medicine
in Seattle and involved training a monkey to move a biofeedback meter needle by activating neurons in its motor
cortex, the region of the brain responsible for the execution
of movement. The activity of these neurons was read from
an implanted tungsten micro-electrode.
Following work by Fetz in interfacing a monkey brain to
a machine, in the 1980s Apostolos Georgopoulos at Johns
Hopkins University found a mathematical relationship
between the electrical signals from motor cortex neurons
and the direction the animal wished to move.
This led to the development of computer models that relate movement to neural signals and are the basis of models
that now translate complex neural signals into commands
to operate machines such as robot arms.
Monkey controls robot arm
Professor Miguel Nicolelis from Duke University in North
Carolina was the first to interface a monkey brain to a robot
arm which it could move.
By 2000 the group had managed to reproduce a monkey’s
arm motion in a robot arm by monitoring neural signals from the monkey. The monkey had no direct control over the arm; it just reproduced its movements.

[Photo caption: A monkey using a brain-controlled robotic arm to grab food to feed itself. The monkeys were able to effortlessly control the robot arm as though it were a natural part of themselves.]
Subsequently, monkeys were first trained to reach for and grab objects on a computer screen using a joystick. This joystick also controlled a robot arm which the monkeys could not see. They learned the simple task of moving things in two dimensions on the computer screen before being shown the actual robot arm, which could move in three dimensions and which the monkeys then learned to control.
In this work an electrode array monitored around 50 to 200 neurons in an area of the motor cortex.
Other groups have done similar work; in 2008 a group led by Andrew Schwartz at the University of Pittsburgh interfaced a monkey to a robot arm with an electrode array which recorded signals from 15-30 neurons and which
enabled the monkey to feed itself.
A video of a monkey operating a robot arm can be seen
at http://youtu.be/gnWSah4RD2E “Monkey controls robotic
arm with brain computer interface”.
Visual imagery from the brain
Although the stuff of science fiction, scientists are making good progress in reading visual imagery from inside the brain. Examples include reading images seen by the eye directly from the brain and also determining some content of dreams.
Open-source brain-computer interface

There is a successfully funded Kickstarter project called OpenBCI to develop an open-source platform to enable anyone with an interest to monitor their own or another person’s brainwaves via a wearable EEG monitor, with a view to developing products controlled by the brain. Each board supports eight electrodes but these can be daisy-chained together to increase the electrode count.

Apart from the electronics and software there is also a 3D-printable headset to mount the electronics package. See http://openbci.com/
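For readers who obtain one of these boards, the Python sketch below shows roughly how its serial data stream might be read using the pyserial library. The 33-byte packet layout assumed here (0xA0 header, sample counter, eight 24-bit big-endian channel values, 0xC0 trailer) reflects our understanding of the board’s streaming format; treat it as an assumption and check the official OpenBCI documentation before relying on it.

# Minimal OpenBCI stream reader sketch (pip install pyserial).
import serial

PORT = "/dev/ttyUSB0"   # port name is system-dependent

def int24(b):
    """Convert a 3-byte big-endian two's-complement value to int."""
    v = (b[0] << 16) | (b[1] << 8) | b[2]
    return v - (1 << 24) if v & 0x800000 else v

with serial.Serial(PORT, 115200, timeout=1) as ser:
    ser.write(b"b")                    # 'b' = start streaming
    while True:
        if ser.read(1) != b"\xa0":     # hunt for the header byte
            continue
        packet = ser.read(32)          # rest of the 33-byte packet
        if len(packet) < 32 or packet[-1] != 0xC0:
            continue                   # incomplete/corrupt packet
        channels = [int24(packet[1 + 3 * i: 4 + 3 * i]) for i in range(8)]
        print(packet[0], channels)     # sample counter + 8 raw counts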
[Photo caption: Image (top row) presented to a cat and reconstruction (bottom row) of that image as read from the brain using electrodes implanted in a region of the brain that processes visual information.]
In one of the first demonstrations of reading visual imagery from a brain, a cat had electrodes implanted in its brain and was made to watch various scenes. The data from the electrodes was processed with some basic mathematical filtering and the original image was reconstructed.
It certainly seems from the reconstructed images, however, that the animal imposed its own cat-like interpretation on the features of the human face.
This work was done in 1999 at the University of California, Berkeley by a research team led by Professor Yang
the fact that electrodes needed to be implanted on the brain.
Apart from cats, visual imagery has also been read from
human brains. This work was done in 2011 by scientists
at the University of California, Berkeley led by Professor Jack Gallant. In this case non-invasive functional magnetic
resonance imaging (fMRI) techniques and computational
modelling were used to read and interpret brain activity.
Subjects watched video clips and the moving images were
read from their brains.
To extract this video information from the brains of
experimental subjects, they had to lie still inside an fMRI
machine while watching two different sets of trailers from
Hollywood movies. The fMRI machine was used to measure
the blood flow through the visual cortex of the brain which
is the part responsible for vision. The fMRI data was then
broken down into three-dimensional versions of pixels
known as “voxels”.
One of the researchers said “We built a model for each
voxel that describes how shape and motion information in
the movie is mapped into brain activity”. As the video was
being played to the subject the change in each voxel, corresponding to changes in brain activity in that region, was
correlated with the video image being presented at the time.
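Conceptually, a “model for each voxel” is a regression problem, as the hedged Python sketch below suggests: learn a linear mapping from the movie’s visual features to each voxel’s measured response. The data here is entirely synthetic and the published work used much richer motion-energy features, so this only illustrates the shape of the computation:

# Per-voxel encoding-model sketch: linear (ridge) regression from
# movie features at each timepoint to each voxel's fMRI response.
import numpy as np
from sklearn.linear_model import Ridge

n_timepoints, n_features, n_voxels = 600, 200, 1000

# Hypothetical training data: features of the movie per timepoint,
# and the measured response of each voxel at those timepoints.
features = np.random.randn(n_timepoints, n_features)
voxel_responses = np.random.randn(n_timepoints, n_voxels)

# One ridge regression fitted jointly for all voxels (equivalent to
# an independent linear model per voxel).
model = Ridge(alpha=10.0).fit(features, voxel_responses)

# Given new brain activity, candidate clips can then be ranked by how
# well their predicted voxel pattern matches the measured one.
predicted = model.predict(np.random.randn(1, n_features))
print(predicted.shape)  # (1, n_voxels)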
A problem with using fMRI for this type of work is that the blood flow which fMRI measures changes much more slowly than the electrical neural signals. This problem was overcome by the development of a two-stage model that separately describes the neural signals and blood flow.
However, the scientists who did this work were careful
to point out at the time that the technology to read people’s
thoughts is many decades away. A video of the experiment can be seen at http://youtu.be/nsjDnYxJ0bo “Movie reconstruction from human brain activity”.

Reading the subject matter of dreams
Japanese researchers Yukiyasu Kamitani and colleagues
at the Advanced Telecommunications Research Institute
International in Kyoto, Japan have been working on reading
the subject matter of people’s dreams.
In work published in 2013 they showed that they could
tell what a person was dreaming about. The research involved asking volunteers to have a mid-afternoon nap in
an fMRI machine and when they had reached the earliest
stages of sleep (stage 1 or 2) they were woken and asked
to give a verbal report of what they were dreaming about.
This was repeated at least 200 times for each subject.
Next, these verbal dream reports were analysed by researchers who reduced them to key words and concepts.
Researchers next went online to build a vast visual database
of images that most closely corresponded to the subject
matter of the verbal reports provided by the dreamers.
Researchers then did further fMRI scans on the dreamers while they were awake and asked them to watch the
images that had been collected that corresponded to the
subject matter reported from their 200 plus dream sessions.
This enabled brain activity patterns to be read from that
individual that corresponded to the visual imagery they
were watching. These activity patterns were used to train
a decoder computer to correlate patterns of brain activity
with certain types of visual imagery.
After the decoder was trained, measured brain activity could be fed in and correlated with the visual imagery known to produce that pattern, and thus the subject matter of the dream could be predicted.
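In machine-learning terms the decoder is an ordinary pattern classifier. A minimal Python sketch, with synthetic data standing in for real fMRI scans and invented category labels, might look like this:

# Conceptual dream-decoder sketch: train a classifier on fMRI
# patterns recorded while the subject views images of known
# categories, then apply it to activity recorded just before waking.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
CATEGORIES = ["car", "person", "building", "food"]

# 80 training scans per category, 500 voxels each (synthetic).
X_train = rng.normal(size=(320, 500))
y_train = np.repeat(CATEGORIES, 80)

decoder = LinearSVC().fit(X_train, y_train)

# "Dream" scan captured just before waking the subject:
dream_scan = rng.normal(size=(1, 500))
print(decoder.predict(dream_scan))  # a coarse category, e.g. "car"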
The predictive capacity of the system was quite coarse.
For example, it could tell if someone was dreaming of driving in a car but not what type of car. Also, the decoder has to
be trained individually for each person. It cannot be used to
read dream subject matter without individualised training.
See YouTube video http://youtu.be/inaH_i_TjV4 “Dream
decoding from human brain”.
Transmitting thoughts from one person to
another
In early 2014, a team led by Alvaro Pascual-Leone, Director of the Berenson-Allen Center for Noninvasive
Brain Stimulation at Beth Israel Deaconess Medical Center
(BIDMC) and Professor of Neurology at Harvard Medical
School in Boston succeeded in reading a thought from one
person and transmitting it to another person 8,000km away
via the Internet.
Together with researchers in France and Spain, the
thoughts of a person in India were transmitted to a person
in France. The words transmitted were the greetings “hola”
and “ciao”. In reality it was not words that were transmitted but a binary code. The sender evoked imagery of using
either their hands or feet. The brainwaves of the sender in
India were read by an EEG and it was determined if they
were imagining using either their hands or feet.
Hands corresponded to a “0” and feet to a “1”. The chosen
number was transmitted over the Internet to France and
the receiver’s brain was stimulated via the process of transcranial magnetic stimulation (TMS). The TMS stimulation
was interpreted as a flash of light (phosphene) for a 1 and
no flash for a 0 and thus the simple message was decoded.
Connecting two rat brains together
The brains of two rats were electronically linked such that
what one rat did was duplicated by another rat at a distant
site. A team led by Miguel Nicolelis of Duke University in
North Carolina and collaborators in Brazil published this
work in early 2013.
One rat called the “encoder” learned various tasks and
signals from a cortical micro-electrode array implanted in
it were monitored. The electrical signals from the encoder
rat’s brain were then transmitted to the same area of a
“decoder” rat’s brain.
The encoder’s electrode arrays consisted of 32 electrodes connected to the rat’s primary motor cortex, the area of the brain responsible for movement. The decoder rats had
4 to 6 micro-stimulation electrodes implanted in the same
area.
When the decoder rat received signals from the encoder
rat’s brain it interpreted the action meant by those signals
and performed the same task (pressing the same lever) as
the encoder rat. Even when the decoder rat was untrained and unfamiliar with the task, it would press the correct lever around two thirds of the time which, while not perfect, is still a remarkable result.
The encoder rat was located in Brazil while the decoder
rat was located in the USA. A video of the experiment can
be seen at http://youtu.be/w_qbkYDlhDY “Brain-to-brain
interface transmits brain activity directly from one rat to
another”.
Human-to-animal control
Transmitting a thought from one person to another is
impressive but so too is transmitting a command from a person to an animal. Seung-Schik Yoo of Harvard Medical School in Boston led the team. A person was connected
to an EEG machine and used the technique of steady state
visual evoked potential (SSVEP) to trigger a signal for a rat
to move its tail. The rat’s brain was stimulated in the area
that controls tail movement by the technique of focused
ultrasound (FUS) and the rat moved its tail.
The experiment can be seen at https://www.youtube.
com/watch?v=VaJjHgyHnEc “Human moves rat’s tail with
thoughts alone”. See also http://youtu.be/TpFdM_e76Fw
“LEGO goes with the brain: A robot remotely controlled with steady-state visual evoked potentials” (sic), in which a robot is controlled by a person using SSVEP techniques.

[Photo caption: Still images taken from video showing the presented image (top) and the corresponding image read from a human brain using functional magnetic resonance imaging (fMRI). (From http://spectrum.ieee.org/geek-life/tools-toys/this-is-your-brain-on-fmri)]
Human-to-human control
Researchers at the University of Washington have enabled one person to control motion in another person. The
first person thought of an action to move their hand to
press a button but did not actually move their hand. The
electrical activity in the brain associated with this intention was recorded with an EEG headset and transmitted
via the Internet.
The brain of a receiving subject was stimulated via the
process of transcranial magnetic stimulation (TMS) which
induced an electrical signal in the brain of the subject over
an area responsible for hand movement causing them to
physically move their hand to press a button.
This may sound scary in some senses but it is important
to note that this work is currently at a very basic level and
there is no indication that mass mind control or robot-like
zombie people will be walking our streets any time soon.
See http://youtu.be/rNRDc714W5I “Direct Brain-to-Brain
Communication in Humans: A Pilot Study”.
Human vision & movement
An obvious application for interfacing the brain is to
provide vision for blind people. Retinal implants (“bionic
eyes”) are one such approach but if this is not suitable
the vision areas of the brain can be stimulated directly.
Data from a camera is processed and sent to an electrode
array implanted on the visual cortex of the brain. Where this has been done, the subjects have gained some limited level of functionality enabling them to do basic tasks; even driving a car slowly in a car park was demonstrated in one instance.

[Diagram caption: Scheme by which a thought was transmitted from one person to another over the Internet. From http://abcnews.go.com/Technology/scientists-transmit-thoughts-brain/story?id=25319813]
BCIs have been used to help disabled people control
computer cursors for communication, wheelchairs and robotic arms to help them with household tasks. See YouTube
videos http://youtu.be/mJQ0HqThU4c “Two-Dimensional
Cursor Control Using EEG”, http://youtu.be/qQ7AJnVKc_g
“Mind Typing and PC Control with Brain-Computer Interface (BCI)”, http://youtu.be/gvR0kHm9fwo “BCI driving a
wheelchair” and http://youtu.be/76lIQtE8oDY “One Giant
Bite: Woman with Quadriplegia Feeds Herself Chocolate
Using Mind-Controlled Robot Arm”.
Neurogaming
Neurogaming is a new computer gaming modality where
characters and games are controlled by BCI technology
as well as other sensors such as heart rate monitors, eye-trackers and sensors to detect muscle movement. Such
technology can also be used for virtual reality training for
different professionals and has also been suggested for the
treatment of various disorders such as PTSD, ADHD and
other behavioural and cognitive disorders.
Future uses
Anything that requires human input for control is open
to the possibility of direct control via a brain-computer
interface. For precise and high levels of control it may be
necessary to have implanted electrode arrays since at the
moment scalp EEG readings are fairly coarse in nature
although if training with EEG headsets started at a young
age, better results might be achievable.
The military also have some interest in controlling fighter
jets and other machines with the mind (whether the pilot
is in the cockpit or a remote operator). Firefox (1982) was a science fiction movie featuring an aircraft with a
mind-controlled weapons system but the English-speaking
pilot tasked to retrieve the plane could not get it to work
until he realised he had to think in Russian, not English.
Brain-controlled toys
A number of toys have been produced or are under
development which are controlled by the brain. One such
toy is a radio-controlled helicopter called the Puzzlebox
Orbit, which is controlled via a NeuroSky EEG headset
(see below).
Instructions for a do-it-yourself conversion of a cheap radio-controlled helicopter to mind control using consumer EEG headsets are described at http://www.instructables.com/
id/Brain-Controlled-RC-Helicopter/
Note that in the right-hand column of that web page you will see links to other brain-control DIY projects.
Consumer EEG headsets
Interfacing the brain is not just restricted to laboratories.
There are a large number of consumer grade EEG headsets
available for the purpose of brain computer interfacing.
They are all capable of measuring a number of mental states
and some can measure facial muscle movement and eye
movement as well. A full description of these devices is
not possible here but you may wish to research them yourself. These devices have between 1 and 14 electrodes.
Some of these headsets are also appropriate for professional
use and research.
The devices include: Emotiv EPOC, Emotiv Insight,
HiBrain, iFocusBand, Mindball, Mindflex, MindSet, MindWave, Muse, MyndPlay BrainB, Neural Impulse Actuator (discontinued; detected muscle movement only),
NeuroSky, OpenBCI (this is an open hardware project,
see box), Star Wars Force Trainer (discontinued), Xwave
headset (discontinued) and Xwave Sonic (discontinued).
Of particular interest, Emotiv Systems is a Sydney-based company with international offices, founded by former Young Australian of the Year Tan Le. For an overview
of some features of one of the Emotiv headset models see the
YouTube video at http://youtu.be/bposG6XHXvU “Emotiv’s
New Neuro-Headset”.
A lot of open-source software has been developed to
support the output of some of these and other EEG devices.
An example is OpenViBE, which is a general purpose and
highly capable software platform for real-time acquisition,
processing and classification of brain waves for all aspects of brain-computer interfaces, including biofeedback, robot interfacing, diagnosis and game control.
OpenViBE can be used by anyone even if they are not
familiar with programming. Several open-source Matlab
toolboxes have also been developed for interpreting data
from various EEG devices.

[Diagram caption: Scheme for brain-to-brain interface with human subjects. A sender imagines hand movement to press a fire button but does not actually move his hand. The intent to press the button is detected via EEG signals and transmitted via the Internet. The receiving person’s brain is then stimulated via transcranial magnetic stimulation (TMS), causing them to press a button. (From www.washington.edu/news/2013/08/27/researcher-controls-colleagues-motions-in-1st-human-brain-to-brain-interface/)]
With any EEG device, unwanted electrical noise picked up from muscles can be a problem, so a special effort has to be made to avoid unwanted movement, especially of the face, when using them.
SILICON CHIP readers may be interested in experimenting
with some of these devices and software tools.
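A sensible first experiment is basic filtering. The Python sketch below applies a 50Hz notch for mains hum plus a 1-40Hz band-pass, which also attenuates much of the high-frequency muscle (EMG) noise mentioned above; the cut-off frequencies are conventional choices rather than requirements.

# First-pass cleanup for a consumer EEG signal using SciPy filters.
from scipy.signal import butter, filtfilt, iirnotch
import numpy as np

FS = 250  # assumed sample rate (Hz)

def clean_eeg(raw, fs=FS):
    # 50Hz notch for mains hum (use 60Hz in 60Hz-mains countries).
    b, a = iirnotch(50.0, Q=30.0, fs=fs)
    x = filtfilt(b, a, raw)
    # 1-40Hz Butterworth band-pass.
    b, a = butter(4, [1.0, 40.0], btype="band", fs=fs)
    return filtfilt(b, a, x)

noisy = np.random.randn(FS * 10)   # stand-in for 10 seconds of raw EEG
print(clean_eeg(noisy).shape)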
Many of these devices can be connected to smartphones for purposes such as meditation, biofeedback or playing games (neurogaming), as well as for assisting the disabled to communicate, research, software usability testing and so-called neuromarketing, where a person’s reaction to advertising material is monitored.
BCI2000
In addition to the open-source software mentioned above to analyse EEG signals, BCI2000 (www.schalklab.org/research/bci2000) is an open-source suite of software for all aspects of brain-computer interface research and can be used for data acquisition, stimulus presentation and brain
monitoring applications.
It is free for non-profit and educational use and supports
numerous types of instrumentation and runs on Windows,
OS X and Linux.
It has been under development since 2000 by the Brain-Computer Interface R&D Program at the Wadsworth Center
of the New York State Department of Health in Albany,
New York with substantial contributions from various
other groups.
BCI2000 is designed to easily interface with various
equipment and software in real time via a network-based
interface so that, for example, a robot arm running its own
software could be made to be easily controlled by neural
signals processed by BCI2000. In addition, Matlab scripts
can be executed within BCI2000.
An additional benefit of BCI2000 is that all data is stored in a standardised format, along with the system configuration and event markers, so that it can easily be shared with
other researchers.
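As a hedged sketch of what such an external program might look like, the Python fragment below listens for control values on a UDP socket. Our understanding is that BCI2000 can be configured to send states as plain-text “Name Value” lines over the network, but the port number, state name and line format used here are assumptions to be checked against the BCI2000 documentation, and drive_robot_arm() is purely hypothetical.

# Sketch of an external program receiving control signals from
# BCI2000 over UDP; protocol details are assumptions (see above).
import socket

def drive_robot_arm(value):
    print("control value:", value)    # stand-in for real actuator code

UDP_ADDR = ("127.0.0.1", 20320)       # example address set in BCI2000

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(UDP_ADDR)

while True:
    data, _ = sock.recvfrom(1024)
    for line in data.decode("ascii", errors="ignore").splitlines():
        name, _, value = line.partition(" ")
        if name.startswith("Signal"):  # e.g. "Signal(0,0) 0.43"
            drive_robot_arm(float(value))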
To see an example of BCI2000 in use see http://youtu.be/
suKTlrzaU9g “Playing the Game ‘Pong’ with EEG”. Here
a 32-channel EEG is acquired and analysed from each of
two subjects to extract control signals which move the
electronic game paddles.
Ethical issues
As with any new technology certain ethical issues need
to be considered, especially with intrusive brain interfaces
such as cortical electrode arrays. While few would question the need for such intrusive interfaces in life-critical
applications such as controlling a wheelchair or robot
arm, one might question the appropriateness of such an
interface for a non-critical application such as connecting
to the Internet.
On the other hand, many would argue that a person is
entitled to do as they will with their own body as long as
that person pays for it.
Other issues relate to the reversibility or otherwise of
intrusive BCI interface procedures. Most implants, no matter what type, leave some sort of permanent impact on the
body and may not be removable without doing damage.
What issues arise if better models of interface are developed
and old ones need to be removed?
Cyborg Roaches!
We make no judgement on the ethics of doing this, but some people have built their own remote-controlled living cockroaches with parts from a kit, as featured in this video: http://youtu.be/V2zNOP6RqRk “Amazing! Real Creating a Cyborg Cockroach (Bugs Robot)”.
It is not known whether this would work with typical Australian cockroaches.
It is not a joke! Note that the developer does consider
ethical issues and addresses them on their web page at
https://backyardbrains.com/products/roboroach
Alternative therapies also need to be considered. For example, with advances in stem cell research it is conceivable that in the near future spinal cords could be repaired, making an electrode implant for brain control of a wheelchair unnecessary (though people already in receipt of such implants might be able to re-purpose them).
Conclusion
Brain-computer interfacing has an exciting future and it
is likely that the first major uses will be to assist disabled
people to communicate and move.
Neurogaming, like much computer gaming, is likely to
have many spin-offs such as virtual reality and treatment
of various disorders. Later developments might include
control of cars, aircraft and many other machines as well.
Some people may consider the technology “inhuman”
and may choose to preserve what they see as their humanity. Controlling animals with BCI may bring many benefits, such as in search and rescue, but may also raise ethical
challenges.
Neuroplasticity ensures that most people should be able to learn to use a BCI and most likely do useful things with non-intrusive BCIs such as EEG headsets. Other ethical challenges are raised by the appropriateness of the technology for certain uses and by its cost.
In the medium-to-long-term future, the rights of people not to have their minds read (should that prove to be possible) need to be seriously considered. BCI is potentially
very useful for the disabled but biological cures using stem
cells for conditions such as a severed spinal cord may be
better and not far off.
The nightmare scenarios from science fiction seem a long way off, if they happen at all. SC
YouTube videos of interest:
Visual Image Reconstruction from Human Brain:
http://youtu.be/daY7uO0eftA
A Remote Controlled Rat: http://youtu.be/G-jTkqHSWlg
Cyborg insects: http://youtu.be/dSCLBG9KeX4
Computer records animal vision in Laboratory – UC Berkeley:
http://youtu.be/piyY-UtyDZw