Final part of this fascinating series by Dr David Maddison

The BIONIC EYE

In the first part of this series, Dr David Maddison looked at the history and recent advances in the “holy grail” of vision impairment research – allowing the blind to see. There’s a lot of work currently going on in search of that lofty goal ...
There are several ongoing bionic eye research projects around the world. They mostly involve retinal implants or cortical implants; in Australia, there is one of each type of device under development.
Beyond implants, and also described here, there are sensory substitution devices such as the Brainport, the Eyeborg and The vOICe. The learned skill of human echolocation, which requires no hardware whatsoever, is also described.
Sensory substitution is the process whereby one sense, such as sight, is replaced with another sense, such as touch. A simple example of this is a blind person’s cane.
There is not room in this article to discuss all the projects under development, so some representative cases are discussed below, including retinal implant devices that are either in clinical trial or commercially available. The two Australian projects will be discussed in greater detail. Devices not described here because they have not yet reached clinical trial include Stanford University’s “Photovoltaic Retinal Prosthesis”, Nano Retina (Israel), the Boston Retina Implant Project and various groups in Japan.
Retinal and cortical implants

ARGUS II
The Argus II, manufactured by US company Second Sight (www.secondsight.com), is one of only two retinal implants currently approved by regulators and commercially on the market. It has a 10x6 electrode array. While a pioneering device, the best visual acuity achieved with it so far is 20/1260, which is still legal blindness according to the WHO standard, under which vision worse than 20/500 is classed as blindness (normal vision being 20/20). Nevertheless, it does improve the quality of life for its users. The Argus III device is under development and will have 200 electrodes.
Among the activities that patients have reported being able to undertake with the device are:
• Locate doors, windows, elevators;
• Follow a pedestrian crossing across a street;
• Avoid obstacles;
• Find utensils on a table or when serving food;
• Locate coins;
• Track the motion of a napkin when cleaning;
• Sort light and dark clothes;
• Locate people in front of them (but not see the details of a face);
• Track a ball; track players on a field;
• Locate an overhead light in an entrance way;
• Locate the light of a candle or light bulb; and
• Watch fireworks.

Argus II retinal implant showing its location within and on the eye. In addition to the implant, a patient also wears glasses containing a video camera, plus a video processing unit worn on the belt.

Schematic view of the BVA high-acuity device with 256 electrodes. The illustration at left shows the location of the device within the eye; at right is an exploded view of the device.

74 Silicon Chip
siliconchip.com.au
Bionic Vision Australia
This national consortium of researchers from the Bionics
Institute, the Centre for Eye Research Australia, NICTA, the
University of Melbourne and the University of New South
Wales is developing a retinal implant.
The main purpose of the BVA device is initially to help people with retinitis pigmentosa and age-related macular degeneration.
It consists of a camera and vision processor device as
well as the retinal implant which receives signals wirelessly from the vision processor.
An objective with the BVA device was to preserve any minor residual vision that someone may have and to minimise damage to the retina. Devices implanted in the epiretinal or subretinal spaces (directly above or directly below the retina) can have problems that lead to deterioration of what little retinal function may be left, so BVA decided to use the suprachoroidal space for its implant.
Utilising the suprachoroidal space, a world first by BVA, provides a “buffer” between the electrodes and the neural tissue. This is analogous to the way cochlear devices are implanted, and is why they have long-term stable performance.
The first implant of an experimental device by this group was conducted in 2012. It was an early prototype 22-electrode device, which was implanted in the suprachoroidal space of three patients as part of a clinical trial that was successfully completed. The purpose of this device was to enable a vision processor to be developed based on feedback from the patients, in order to allow optimisation of the stimulation algorithms. See videos of a patient using the device: “Dianne Ashworth 12 months on, 2013” at https://youtu.be/jQEZiAuJ_AE and “Dianne Ashworth bionic eye prototype testing, 2014”, https://youtu.be/6EmleCs0KGY
The prototype devices enabled the trial patients to identify basic shapes, letters and numbers – tasks not possible with whatever residual vision they had. The devices were removed at the conclusion of the trial in August 2014.
Three devices are currently under development. The first to be developed for production is a 44-electrode device based on the prototype 22-electrode device, expected to enter clinical trial in mid-2015.
A wide-view device with 98 electrodes is also being developed; it has hexagonal electrodes, which enable more effective stimulation at high electrode density.
Beyond that, a 256-electrode high-acuity device is under development, but currently there is not sufficient funding to continue this development.
The 256-electrode device will have its electrodes so closely spaced that they will need to be in much closer contact with the neural tissue, and electrode stimulation from the suprachoroidal space will not be suitable. So part of the device will be placed epiretinally, despite some disadvantages with that location, as described above. There are future plans to expand the electrode count to 1,000.
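The practical effect of electrode count on what a patient might perceive can be approximated by block-averaging an image down to the corresponding number of phosphenes, in the spirit of the simulated phosphene images BVA has published. The following is only an illustrative sketch in plain Python; the grid shapes and test image are assumptions, not part of any BVA software:

```python
def phosphene_downsample(image, out_rows, out_cols):
    """Block-average a greyscale image (a list of rows, values 0-255)
    down to an out_rows x out_cols grid of 'phosphenes'."""
    in_rows, in_cols = len(image), len(image[0])
    result = []
    for r in range(out_rows):
        row = []
        for c in range(out_cols):
            # The block of source pixels this phosphene covers
            r0, r1 = r * in_rows // out_rows, (r + 1) * in_rows // out_rows
            c0, c1 = c * in_cols // out_cols, (c + 1) * in_cols // out_cols
            block = [image[i][j] for i in range(r0, r1) for j in range(c0, c1)]
            row.append(sum(block) / len(block))
        result.append(row)
    return result

# A 32x32 test image: a bright square on a dark background
img = [[255 if 8 <= i < 24 and 8 <= j < 24 else 0
        for j in range(32)] for i in range(32)]

low = phosphene_downsample(img, 4, 4)    # 16 "phosphenes"
high = phosphene_downsample(img, 8, 8)   # 64 "phosphenes"
```

Even at 64 phosphenes the square is clearly recognisable, while at 16 only a coarse bright region survives – which is why the jump from 44 to 256 to 1,000 electrodes matters so much.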
Novel approaches to electrode fabrication are required for such high electrode counts. It so happens that deposited diamond film is very biocompatible and can also be doped to provide electrical conductivities ranging from that of an insulator to that of a conductor. The device will effectively be hermetically sealed in a diamond “box”.
It is expected that the high degree of biocompatibility of diamond will minimise the problems that conventional devices have in the epiretinal location.
For further details of the high-acuity device see the video with Professor Steven Prawer: “The Diamond Bionic Eye” – https://youtu.be/jOokLf3frwE

Simulated phosphene patterns and images for Bionic Vision Australia devices: an image of an old Melbourne tram at 16 phosphenes, 64 phosphenes and 1,000 phosphenes (pixels), the original image, and “Bionic eye” text at 1,000-phosphene resolution.

July 2015 75

Monash Vision direct to brain bionic eye
Monash Vision Group (MVG) is a collaboration between Monash University, Alfred Health, MiniFAB and Grey Innovation, under the leadership of Professor Arthur Lowery.
The device under development is a cortical implant. It is intended for people with non-functional retinas, damaged optic nerves or missing eyes, who are not candidates for a retinal implant, but it can also be used to provide vision where blindness has occurred for a variety of other reasons. The implant is expected to enter clinical trials in one year.
The device will consist of an electrode array or “tile” implanted on the visual cortex V1 area of the brain. That tile will receive wireless signals from a digital processor attached to the side of a user’s eyeglasses. The glasses will also contain a video camera to visualise what the user is looking at.
The implanted tile is 9mm x 9mm in size and contains 43 2mm-long platinum-iridium electrodes, corresponding to a 43-pixel image. On the back of the tile is the wireless receiver and processing circuitry, containing 650,000 transistors and 43 digital-to-analog converters.
Each electrode is individually addressable and configurable with a variety of parameters, to ensure that each electrode performs optimally and that the settings can be changed to support improved stimulation algorithms as they are developed.
Note that with implanted electrode arrays, whether they be in the retina or the visual cortex, there is a minimum practical spacing, since current from one electrode will stimulate the tissue around adjacent electrodes if the spacing is too small. Additionally, if the electrodes are spaced too closely together there are too many electrodes for too little neural tissue.
Multiple tiles will be implanted to improve resolution; it is intended that up to 11 tiles will be implanted in a patient, giving a resolution of 473 pixels. Multiple tiles will be used because it is difficult to fabricate a single larger device with the required curvature to conform to the brain, quite apart from the fact that each brain has a slightly different shape.
The precise location in which each tile is to be implanted is determined using functional magnetic resonance imaging, fMRI (see Interfacing To The Brain, SILICON CHIP, January 2015), to find the area of the visual cortex associated with high-resolution vision from the fovea. When the area is located, a check is made to ensure that no major blood vessels will be penetrated, and the tile is then pushed down, allowing the electrodes to penetrate into the brain to their full depth of 2mm.
The location into which the electrodes penetrate is a part of the V1 visual cortex called layer 4, the area of V1 that receives most of the input from the lateral geniculate body.

Electrode array (tile) of the Monash Vision Group device which, during installation, is pushed down onto the surface of the brain such that the 43 electrodes enter layer 4 of the V1 area of the visual cortex.

Back side of the 9x9mm “tile”, showing its control circuitry, which contains 650,000 transistors and 43 digital-to-analog converters. The entire implant is hermetically sealed.

Headset of the Monash Vision Group device, containing a camera, video processor and wireless coupling to connect to the implanted tile.

Pixium Vision
French company Pixium Vision (www.pixium-vision.com/en) has a retinal implant, the IRIS device, which in its commercial version will have 150 electrodes and is currently undergoing clinical trials. Its PRIMA system will have up to several thousand electrodes and will begin clinical trials in 2016.

Retina Implant AG
German company Retina Implant AG (http://retinaimplant.de/en/default.aspx) has a retinal implant device called the Alpha IMS which has received European regulatory approval for marketing. It has 1,500 photodiodes and matching stimulation electrodes in a 3x3mm package. The photodiodes eliminate the need for an external camera.

SENSORY SUBSTITUTION DEVICES AND TECHNIQUES

Seeing with your tongue – Brainport
Brainport (www.wicab.com/en_us/) does not directly connect with the nervous system but is an assistive technology that allows people to see via sensory substitution. Brainport uses a video camera to generate a pattern on a device that the user puts on their tongue.
It uses an array of 400 points to generate a pattern on the tongue corresponding to a visual image. Users eventually learn to interpret the sensation on the tongue as sight via the process of neuroplasticity, whereby the brain rewires itself to accommodate new ways of working.
(See videos of this device in use: Brainport Vision Device helps a blind man “see”, https://youtu.be/xNkw28fz9u0 and Emilie Gossiaux painting with the BrainPort, https://youtu.be/1xYi9oZMVWI).

Brainport device, showing the processing unit, the eyeglasses with camera, and the plate that is placed in the mouth to stimulate the tongue with visual information.

Neil Harbisson, said to be the world’s first cyborg, who can hear colours with his prosthesis.
Seeing colour with sound – Eyeborg
It is not a bionic eye in the sense that it is not interfaced with the visual system. Artist Neil Harbisson was born with an extremely rare vision disorder called “achromatopsia”, or total colour blindness, and can only see in shades of grey. He has had a device made for him that converts colours to sound and even lets him “see” in the infrared and ultraviolet.
The Eyeborg can convert 360 colours into sounds and
can indicate colour saturation via volume level. The user
has a choice of perceiving colour via either a logarithmic
or non-logarithmic sound scale.
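A hue-to-frequency mapping of this general kind can be sketched in a few lines. To be clear, this is not Harbisson’s actual mapping (which the article does not specify); the frequency range and the logarithmic law here are illustrative assumptions only:

```python
import math

def hue_to_frequency(hue_deg, f_low=200.0, f_high=800.0, logarithmic=True):
    """Map a hue angle (0-360 degrees) to an audio frequency in Hz.

    With logarithmic=True the 360 hues are spread evenly in pitch
    (equal musical intervals); otherwise evenly in Hz.
    """
    t = (hue_deg % 360) / 360.0
    if logarithmic:
        return f_low * (f_high / f_low) ** t
    return f_low + (f_high - f_low) * t

def saturation_to_volume(saturation):
    """Map colour saturation (nominally 0-1) to an amplitude (0-1)."""
    return max(0.0, min(1.0, saturation))
```

Note how the two scales differ: on the logarithmic scale a hue of 180° lands at the geometric mean (400Hz here), while on the linear scale it lands at the arithmetic mean (500Hz) – which is the choice the Eyeborg offers its user.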
Neil says that with his infrared detection capability he can sense whether there are movement detectors in a room or if someone points a remote control at him, and with his ultraviolet sensing ability he can determine whether or not it is a good day to sunbathe!
Neil used to wear the device but has recently (since March 2014) had it permanently attached to his skull. This enables more nuanced hearing, as the sounds are transmitted through his skull to his ears.
The “antenna” – the stalk onto which the camera is mounted – also has Bluetooth and WiFi capability, so he can send and receive images; he is able to “hear” images sent to him. To charge the device he plugs it into a USB port, and a charge of a few hours lasts three to four days; however, he wants to develop methods of charging the device from his body.
In 2004 he was declared by the media to be the world’s first cyborg. After a long battle with the UK Passports Office, which initially refused to allow a passport photograph with the device attached, he won the right to be photographed with the device after arguing that it was part of his body. He is also now an advocate of cyborg rights.
The Eyeborg has also been developed as a wearable, non-implanted device and donated to blind communities to enable them to have a sense of colour. Neil helps people become cyborgs via his Cyborg Foundation (www.cyborgfoundation.com; there is a video on that site as well), which has also donated Eyeborgs to the blind.
If you want to experience hearing colours as sound, there is a free Android app to enable this, with an Apple iOS app under development: www.eyeborgapp.com
For a talk by Neil Harbisson, see The Human Eyeborg: Neil Harbisson at TEDx Gateway, https://youtu.be/d_mmwrbDGac. The Eyeborg development site is at www.eyeb.org; it is written in Catalan. Google may be able to translate it, but the translation process did not work at the time of writing.
There is also an unrelated Eyeborg project at http://eyeborgproject.com/, which is essentially a video camera mounted within an eye socket, with no integration with the body. There is a descriptive video at that link which also looks at other advanced prosthetic devices.
Seeing with sound – The vOICe
There is a project to enable blind people to see with sound by converting camera images into sounds (“soundscapes”) which the user learns to interpret.
The vertical axis of an image is converted into frequency
and the horizontal axis into time and stereo panning as the
software scans across the image to create the soundscapes.
The technology is the invention of Dutch engineer, Dr Peter
B.L. Meijer.
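The column-by-column scanning scheme can be sketched as follows: pixel row sets frequency, column sets time and stereo pan, and brightness sets loudness. The exact parameters of The vOICe differ from this simple mapping; the frequency range, sample rate and mixing here are illustrative assumptions only:

```python
import math

def image_to_soundscape(image, duration=1.0, rate=8000,
                        f_low=500.0, f_high=5000.0):
    """Convert a greyscale image (list of rows, values 0-1, row 0 at the
    top) into a stereo 'soundscape': a list of (left, right) samples.

    Each image column becomes a time slice; within a slice, every bright
    pixel contributes a sine tone whose frequency rises with height, and
    the slice is panned left-to-right as the scan proceeds.
    """
    rows, cols = len(image), len(image[0])
    samples_per_col = int(duration * rate / cols)
    sound = []
    for c in range(cols):
        pan = c / (cols - 1) if cols > 1 else 0.5   # 0 = left, 1 = right
        for n in range(samples_per_col):
            t = (c * samples_per_col + n) / rate
            s = 0.0
            for r in range(rows):
                if image[r][c] > 0:
                    # Top rows map to high frequencies
                    frac = 1.0 - r / (rows - 1) if rows > 1 else 0.5
                    f = f_low * (f_high / f_low) ** frac
                    s += image[r][c] * math.sin(2 * math.pi * f * t)
            s /= rows
            sound.append(((1 - pan) * s, pan * s))
    return sound

# A tiny 4x4 image with one bright pixel at the top-left: it should
# produce a brief high tone, entirely in the left channel, then silence.
img = [[1.0 if (r, c) == (0, 0) else 0.0 for c in range(4)] for r in range(4)]
stereo = image_to_soundscape(img, duration=0.01, rate=8000)
```

A rising bright line in the image therefore becomes a rising tone, exactly as the app description later in this article puts it.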
It is hoped that with sufficient training users will be able
to learn to interpret – and perhaps even experience – the
soundscapes as sight.
The technology is called The vOICe (Why? “Oh, I see!”) and is privately owned intellectual property, but it is supplied free to non-commercial users.

Original camera image (left) and image reconstruction from The vOICe “soundscape”, giving an idea of the resolution that might be seen by a skilled user of the technology.
Users (and that includes SILICON CHIP readers who are interested in experimenting with this!) are able to assemble and configure their own set-ups from commercially available equipment. Windows and Android devices are currently supported, and suitable augmented-reality glasses for convenient hands-free use may soon become available.
A relatively high resolution compared with retinal and
cortical implants is theoretically possible for those that
learn to interpret the soundscapes.
Soundscapes are generated at a resolution of 176x64 pixels (ie, over 11,000 pixels) for a one-second soundscape. However, due to hearing limitations the real
resolution could be somewhere between 1,000 and 4,000
pixels for complex images, similar to between a 32x32 and
a 64x64 pixel array as shown in the illustration in Part 1
of this feature.
Hearing limitations are in part the result of a general
frequency-time uncertainty in sound: there is a fundamental
limit to how well one can simultaneously extract frequen-
cies and time points of sound elements in arbitrary complex
sounds. However, someday in the future it may be possible
to overcome this limit by skipping over-the-air soundscapes
altogether, using the same scanning and panning scheme
of The vOICe to directly stimulate nerves in the cochlea
with high resolution cochlear implants.
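The limit referred to is the Gabor uncertainty relation between a signal’s duration and its frequency spread. For standard deviations in time and (ordinary) frequency:

```latex
\sigma_t \, \sigma_f \;\ge\; \frac{1}{4\pi},
\qquad \text{and in practice} \qquad
\Delta f \,\sim\, \frac{1}{\Delta t}
```

As a rough worked example (the numbers are illustrative, derived from the one-second, 176-column soundscape described above): each column lasts about Δt ≈ 1s/176 ≈ 5.7ms, so tones much closer together than Δf ≈ 1/0.0057s ≈ 175Hz are hard to separate within a single column, no matter how well trained the listener – which is why the effective resolution falls below the nominal 176x64 grid.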
This device has the advantage that it is not implanted
and therefore there is no risk of medical complications
from surgery, device failure or foreign body reactions. It is
very low in cost and has a high resolution comparable to
or better than current implanted devices.
Moreover, neuroscience research has shown that the
visual cortex of blind users over time gets recruited for
processing sound (and touch).
In one experiment at Harvard Medical School in Boston,
temporarily disrupting activity in the visual cortex of an
experienced late-blind user of The vOICe with a technique
called TMS (Transcranial Magnetic Stimulation) also disrupted the visual interpretation of soundscapes of objects.
In other experiments it was shown that a brain area
called LOtv (lateral-occipital tactile-visual area, which is
activated by shapes that are seen or touched but not by
natural sounds) became responsive to soundscapes that
encoded object shapes.
The Holy Grail is now to devise efficient training paradigms that not only bring improvements in functional vision but that for late-blind users also reliably lead to “truly
visual” percepts from soundscapes.
There is a very extensive and detailed web site describing the technology, along with demonstrations, at www.seeingwithsound.com. Also see a somewhat-dated video on the technology featuring the inventor, Seeing with Sound (sensory substitution for the blind), https://youtu.be/I0lmSYP7OcM, a recent video, The vOICe Lets The Blind See With SOUND!, https://youtu.be/MjMhvfC1LTY, and Grasping objects with The vOICe (sensory substitution for the blind), https://youtu.be/XuosPzluCRg
Human echolocation
Certain individuals have developed a method of sensory perception not normally found in humans, and that is echolocation.
INCORPORATING THE RETINAL CODE
Much retinal implant research has focused on improving the
devices’ electrode count, apart from mechanical, electronic and
bio-compatibility issues. There is also another important factor to
be taken into account.
Recall that the retina itself processes visual data before the information is sent back to the brain via the ganglion cells. Whatever
processing takes place is important in how the brain interprets the
visual data. With a retinal implant this processing step is typically left
out and the ganglion layer is directly stimulated via the prosthesis.
While the specifics of the coding done by the retina are too difficult to understand from first principles at this time, it is possible in a research environment to determine what code is output from the eye (in the form of pulse trains) for a certain input stimulus such as a face, for example. Without knowing what is actually happening in the eye, researchers have reverse-engineered the output code to match what the eye does.
When a visual stimulus is encoded the way it is done naturally
in the eye and then presented to the prosthetic device, a superior
result is achieved compared with when no encoding is done.
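The idea of an encoder that mimics retinal preprocessing can be illustrated with a linear-nonlinear model of the kind commonly used in this research: convolve the image with a centre-surround filter (a ganglion cell’s receptive field), then pass the result through a saturating nonlinearity to get firing rates. This sketch is only illustrative of the concept; the filter sizes and the sigmoid nonlinearity are assumptions, not the published encoder:

```python
import math

def dog_kernel(size=5, sigma_c=0.8, sigma_s=2.0):
    """Difference-of-Gaussians (centre-surround) kernel, a standard model
    of a retinal ganglion cell's receptive field, normalised to zero sum
    so that a uniform field produces no net drive."""
    half = size // 2
    k = [[math.exp(-(i*i + j*j) / (2 * sigma_c**2)) / (2 * math.pi * sigma_c**2)
          - math.exp(-(i*i + j*j) / (2 * sigma_s**2)) / (2 * math.pi * sigma_s**2)
          for j in range(-half, half + 1)] for i in range(-half, half + 1)]
    mean = sum(map(sum, k)) / size**2
    return [[v - mean for v in row] for row in k]

def encode(image, kernel):
    """Linear stage (convolution) followed by a saturating nonlinearity,
    giving a 'firing rate' (0-1) for each output cell."""
    half = len(kernel) // 2
    rows, cols = len(image), len(image[0])
    rates = []
    for r in range(half, rows - half):
        out = []
        for c in range(half, cols - half):
            lin = sum(kernel[i + half][j + half] * image[r + i][c + j]
                      for i in range(-half, half + 1)
                      for j in range(-half, half + 1))
            out.append(1.0 / (1.0 + math.exp(-8.0 * lin)))  # sigmoid
        rates.append(out)
    return rates

# A uniform field gives the background rate of 0.5 everywhere; a bright
# spot drives the cell over its centre up, and nearby cells (whose
# inhibitory surround it falls in) down.
flat = [[0.5] * 9 for _ in range(9)]
spot = [[1.0 if (r, c) == (4, 4) else 0.0 for c in range(9)] for r in range(9)]
flat_rates = encode(flat, dog_kernel())
spot_rates = encode(spot, dog_kernel())
```

The point of the research described above is that driving a prosthesis with rate patterns like these, rather than with the raw image, better matches what the brain expects to receive from the ganglion cells.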
A question that might be asked is, if this natural processing is
not encoded in device hardware, will the brain be able to learn to
do this processing itself via the process of neural plasticity?
An explanation of this research in more detail is available at
Sheila Nirenberg: A prosthetic eye to treat blindness https://youtu.
be/Aa2JfigaNcs
A) Original image presented to the eye; B) image reconstructed from the encoder; C) image reconstructed from the retina’s output with encoded data; D) image reconstructed from the retina’s output without the use of an encoder.
Diagram credit: from Nirenberg and Pandarinath, http://physiology.med.cornell.edu/faculty/nirenberg/lab/papers/PNAS-2012-Nirenberg-1207035109.pdf
This is a form of sensory substitution where one sense is developed to replace another lost sense.
Echolocation, or sonar, is the method by which bats,
toothed whales and dolphins and some other animals
“see” in certain environments for navigation and hunting.
They do this by emitting a sound and then listening for the echo, which gives them information about the range of an object and its texture. In addition, the direction of an object can be determined, as with normal hearing, by the difference in arrival time of the reflected sound at each of the two ears.
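The direction-from-arrival-time computation just described is simple trigonometry. Assuming a distant source, a speed of sound c and an ear separation d (the values below are illustrative), the interaural time difference ΔT gives the bearing angle via sin(θ) = c·ΔT/d:

```python
import math

def bearing_from_itd(delta_t, ear_distance=0.21, speed_of_sound=343.0):
    """Bearing of a sound source from the interaural time difference.

    delta_t: arrival-time difference between the two ears, in seconds
             (positive = the sound reaches the nearer ear first).
    Returns the angle from straight ahead, in degrees, using the
    far-field approximation sin(theta) = c * delta_t / d.
    """
    x = speed_of_sound * delta_t / ear_distance
    x = max(-1.0, min(1.0, x))   # clamp for delta_t at the physical limit
    return math.degrees(math.asin(x))

# Sound arriving from straight ahead: no time difference, zero bearing
straight = bearing_from_itd(0.0)
# A 0.3ms difference puts the source roughly 29 degrees to one side
side = bearing_from_itd(0.0003)
```

Note how small the quantities involved are: with ears about 21cm apart, the maximum possible time difference is only d/c ≈ 0.6ms, yet the auditory system resolves differences far smaller than that.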
The direction of the outgoing beam can also be altered
up and down enabling a three-dimensional view of the
environment that is akin to vision.
A bat’s sonar system has a surprisingly high resolution
and can resolve points that are as little as 0.3mm apart.
There are several people on record who have managed
to train themselves to use echolocation. They do this by
using their tongues to make a loud click and listening for
an echo in the same way as echo-locating animals. Since
humans do not have the specialised apparatus for making
sounds or analysing them in the same way as animals, it
is not likely they can see as well with sound as animals
do – but they can nevertheless develop a useful picture of
their world.
Remarkably, blind people who have developed an echolocation ability have been found to be using the visual cortex
of the brain, normally responsible for vision, for processing
the acoustic information about the environment rather than
the parts of the brain normally used for hearing.
There is a video of a man who, among other remarkable achievements, is able to ride a bicycle and do solo hikes in the forest using echolocation: Human echolocation – Daniel Kish, “Batman”: https://youtu.be/A8lztr1tu4o. See also Human echolocation-1, https://youtu.be/GVMd55j2EXs and Human echolocation demonstration-2, https://youtu.be/3pM6YYDjb4o. This same individual is teaching other people the technique of human echolocation: teaching the blind to navigate the world using tongue clicks – Daniel Kish at TEDxGateway 2012, https://youtu.be/ob-P2a6Mrjs
Biological solutions
Apart from electronic solutions to blindness, biological
cures are also under investigation. One example is whole
eye transplants which are currently under development.
In an eye transplant, by far the biggest challenge is connecting the optic nerve, but significant developments are currently being made in the area of nerve regeneration.
Another promising area of research is to inject human
embryonic stem cells into the eye. Such therapy has been
used with some success to treat age-related macular degeneration (AMD) or Stargardt’s macular dystrophy. Gene
therapy is also under investigation.
In the medium to long term future it may even become
possible to grow spare body parts from one’s own genetic
material.
Conclusion
Great advances have been made in bionic vision and
vision via sensory substitution. Much of this can be attributed to continued advances in microelectronics, computer
processing power, materials science and a continued improvement in understanding how the brain works.
The realisation that neuroplasticity can effectively rewire the brain allows for alternative approaches to vision using different sensory inputs such as sound and touch, and the possibility that such methods will lead to a very real sense of sight should not be excluded, since neuroplasticity allows non-visual data to be mapped to the visual cortex as though it were real vision.
Great challenges still exist, especially with resolution; however, even vision at much lower resolution than natural sight can still lead to profound improvements in a vision-impaired person’s life.

Have an Android device?
Then try teaching yourself to see with sound using the free app from Google Play: https://play.google.com/store/apps/details?id=vOICe.vOICe
A rising bright line gives a rising tone, bright specks give short beeps, the folds in your curtains and the books on your bookshelf yield rhythms, and the bright rectangle of a window sounds like a noise burst. The dark rectangle of a door opening gives a “gap” in the noise of the surrounding wall. Just experiment and push your perceptual limits.
SC