By Dr David Maddison
ARGUS-IS
Wide Area Persistent Surveillance System
Developed by BAE Systems, ARGUS-IS combines a 1.8-gigapixel
camera with advanced software, enabling it to find, monitor and
track multiple targets over a wide geographic area. Here’s a look
at its capabilities.
ARGUS PANOPTES or Argos is the "all-seeing" giant from Greek
mythology, often described as having
multiple or even one hundred eyes.
The Ancient Greek epic poem Aegimius states, in fragment five: “And
[Hera] set a watcher upon her [Io],
great and strong Argos, who with four
eyes looks every way. And the goddess
stirred in him unwearying strength:
sleep never fell upon his eyes; but he
kept sure watch always”.
It would be difficult to come up with
a more fitting name for ARGUS-IS,
BAE Systems’ Autonomous Real-Time
Ground Ubiquitous Surveillance Imaging System. This is a 1.8-gigapixel
aerial surveillance system that offers
an unprecedented ability to find,
monitor and track multiple simultaneous targets of interest over a wide area
in real time.
ARGUS-IS was developed as an
airborne “Wide Area Persistent Surveillance System” and is designed
to monitor areas of interest for many
hours, days, months or even years.
The system is intended for use on the battlefield and over locations such as cities and towns where insurgents (or others) might live and conduct terrorist activities. It "sees and records all". For example, if a terrorist or other enemy attack occurs and the enemy activity has not previously been detected, it is possible to review the recorded video data to see where the enemy came from and then take appropriate action against them at the originating location.
In operation, the system can resolve
objects as small as 15cm from 5300
metres altitude (and possibly smaller
objects with image processing). The
project was initiated in 2007 by the USA's Defence Advanced Research Projects Agency (DARPA), which awarded BAE Systems US$18.5 million, and the system was
first flight-tested in February 2010. Operational status with the US Air Force was achieved on July 1st, 2014, when the ARGUS-IS sensor was incorporated in a "Gorgon Stare" pod aboard a General Atomics MQ-9 Reaper unmanned aerial vehicle (UAV).

368 of these 5-megapixel sensors are used in the ARGUS-IS system. Note the non-pixel elements and the package and interconnecting pins around the edge of this chip, explaining why ARGUS-IS requires a mosaic of four arrays to provide full sensor coverage.
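The quoted 15cm-from-5300m resolution can be sanity-checked with simple pinhole-camera geometry. This is a back-of-envelope sketch only: the 2.2µm pixel pitch is taken from the BAE feature summary later in this article, and ideal optics (no diffraction or atmospheric effects) is an assumption made purely for illustration.

```python
# Back-of-envelope check of the quoted ARGUS-IS ground resolution.
# Assumptions (for illustration only): ideal pinhole optics, nadir viewing,
# no diffraction or atmospheric effects.

altitude_m = 5300        # quoted operating altitude
gsd_m = 0.15             # quoted ground sample distance (15 cm)
pixel_pitch_m = 2.2e-6   # 2.2 um pixels (from the BAE feature summary)

# Angular size of one ground sample as seen from the aircraft
angle_rad = gsd_m / altitude_m
print(f"Angle per pixel: {angle_rad * 1e6:.1f} microradians")   # ~28 urad

# Focal length needed so that one 2.2 um pixel spans that angle
focal_length_m = pixel_pitch_m / angle_rad
print(f"Implied focal length: {focal_length_m * 1000:.0f} mm")  # ~78 mm
```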
Traditional surveillance platforms
have a “drinking straw” view of what
is beneath them and have to swivel
and zoom to view targets of interest.
In addition, multiple passes over a
target of interest might be required to
gather the required data. The wide-area context of any activity is often not
seen because only a zoomed-in view
is available. Most traditional platforms
also acquire only limited amounts of
data and are not designed for persistent
surveillance or automated analysis of
the recorded data.
By contrast, a Wide Area Persistent
Surveillance System such as ARGUS-IS can monitor large areas and even
an entire city of up to 100 square
kilometres and “stare” at the scene.
Because of its extremely high sensor
resolution, it’s unnecessary to physically zoom or swivel the image sensor.
Instead, targets of interest can be seen
by electronically zooming in on parts
of the image. In addition, multiple
targets can be seen and monitored simultaneously on the electronic image.
Airborne platforms for the imaging system can include a variety of
conventional unmanned aerial vehicles (UAVs or “drones”), aerostats
(balloons or powered airships) and
conventional aircraft, although its
primary mission role is long term
surveillance. Thus, an aerial platform with a long "loiter time" capability is preferred. Solar-powered UAVs now in development could even provide continuous surveillance for years at a time.

This official US Air Force photograph, taken at Kandahar airfield in Afghanistan on August 18th this year, shows a General Atomics MQ-9 Reaper with two sensor pods. One is labelled "EO", meaning electro-optical, and the other is labelled "IR", meaning infrared. It is postulated that the EO pod contains an ARGUS-IS sensor. The other pod might contain the infrared version, designated ARGUS-IR.

The Boeing SolarEagle is a solar and hydrogen fuel-cell powered aircraft which is under development and is designed to stay airborne for indefinite periods. It might be considered an "atmospheric satellite". In the near future, a platform such as this could be used for ARGUS-IS persistent surveillance. Other aircraft such as airships are also under development for this purpose.
Making the sensor
Clearly, making a single image sensor with 1.8-gigapixel resolution would be a difficult task with present technology. At the moment, the highest-resolution image sensor available is the Gpixel GMAX3005, a monochrome sensor with 150-megapixel resolution that is capable of 10 frames per second in full-frame mode.
So how was a colour image with more than 12 times the resolution of the current highest-resolution monochrome chip achieved? The remarkable way in which this was done goes back to the initial requirements for the system: it had to be done both quickly and relatively cheaply.
Necessity being the mother of invention, scientists and engineers decided
that the best way to achieve this resolution was to use off-the-shelf civilian
technology of the type that almost everyone carries in their pocket – image sensors from mobile phones. Basically, 368 5-megapixel mobile phone sensors were used to make a composite focal plane array (CFPA) capable of imaging at up to 12 frames per second (although some literature says 10).

This diagram shows how each of the arrays is overlapped to provide a single, seamless mosaic image. Note that each array is offset by exactly the width of one sensor element in either the vertical or horizontal direction.

Functional elements of the ARGUS-IS system from an earlier implementation. Note that the area monitored is now around 100km², not 40km² as shown here, and the robotic helicopter platform has been discontinued. Each individual yellow square represents a surveillance target. There is both on-board and ground-station data processing.
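Putting some numbers on the sensor array just described gives a quick sanity check of the headline figure. This is pure arithmetic, using only the numbers quoted in this article:

```python
# Total pixel count of the ARGUS-IS composite focal plane array,
# using the figures quoted in the article.

sensors = 368                  # 5-megapixel mobile-phone sensors
pixels_per_sensor = 5_000_000

total_pixels = sensors * pixels_per_sensor
print(f"Total: {total_pixels / 1e9:.2f} gigapixels")   # ~1.84, quoted as 1.8
```

That works out to about 1.84 gigapixels, consistent with the quoted 1.8-gigapixel figure; presumably some of each sensor's pixels are sacrificed to the overlap between the four arrays described below.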
However, there is the obvious problem of what happens at the edges of these individual sensors, where there are no imaging elements. If a single array were used, there would be vertical and horizontal blank lines in the image, corresponding to the chip edges.
This problem was solved by using not one but four CFPAs, each containing 92 5-megapixel sensors, which combine to make up the 368 sensors mentioned above. These four arrays are arranged so that the blank areas of each array are overlapped by the active areas of the others, giving 100% coverage. This provides a seamless image mosaic with no operator intervention required.
Naturally, there would have been significant engineering challenges in getting all these chips correctly aligned.
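The four-array trick can be illustrated with a toy coverage simulation. The 80% linear fill factor and half-pitch offsets below are invented for the illustration (the article only states that each array is offset by one sensor width); the point is simply that the union of four offset arrays, each with dead borders, leaves no gaps.

```python
import numpy as np

# Toy model of the four-CFPA mosaic: each array is a grid of chips whose
# active area covers only part of each grid cell (the rest is packaging).
# Four copies of the grid, offset from one another, together cover everything.
# The 80% linear fill factor and half-pitch offsets are illustrative only.

pitch = 100          # grid cell size in arbitrary units
active = 80          # active (imaging) width within each cell
offsets = [(0, 0), (pitch // 2, 0), (0, pitch // 2), (pitch // 2, pitch // 2)]

y, x = np.mgrid[0:1000, 0:1000]          # a patch of the scene
covered = np.zeros_like(x, dtype=bool)

for ox, oy in offsets:
    covered |= (((x - ox) % pitch) < active) & (((y - oy) % pitch) < active)

print(f"Scene coverage: {covered.mean() * 100:.1f}%")   # 100.0% for these numbers
```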
Analysing the data
ARGUS-IS generates a staggering
amount of data – up to several terabytes per minute in fact. This creates
serious challenges in transmitting, storing, managing and processing it in order to extract meaningful information. It is simply
not feasible for people to review the
recorded video data because of the
large amount of data continually being acquired. In fact, this surveillance
technology has exceeded the ability of human analysts to use all the data generated.

Block diagram of the ARGUS-IS system. Unfortunately, very few technical details have been released about the system but there is some fascinating engineering commentary, based on what few facts are publicly known along with some intelligent guesses, at http://ambivalentengineer.blogspot.com.au/2012/08/argusis.html
With ARGUS-IS, it is easily possible
to generate petabytes (one petabyte is
1000 terabytes) of data every day. At
the moment, the world’s largest hard
drive is Western Digital’s helium-filled
10-terabyte model, so a few hundred
of these would be needed to store all
the data that might be generated in
just one day.
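The storage figures quoted above are easy to reproduce. The bytes-per-pixel value below is an assumption for illustration (raw sensor data is typically around one byte per photosite before any compression); everything else comes from the article.

```python
# Rough raw data rate for ARGUS-IS, using the article's figures.
# Assumption for illustration: 1 byte per photosite of raw (uncompressed) data.

pixels_per_frame = 1.8e9
frames_per_second = 12
bytes_per_pixel = 1

rate_bytes_s = pixels_per_frame * frames_per_second * bytes_per_pixel
per_minute_tb = rate_bytes_s * 60 / 1e12
per_day_pb = rate_bytes_s * 86400 / 1e15

print(f"{per_minute_tb:.1f} TB per minute")                       # ~1.3 TB/min
print(f"{per_day_pb:.2f} PB per day")                             # ~1.9 PB/day
print(f"{per_day_pb * 1e15 / 10e12:.0f} x 10 TB drives per day")  # ~187 drives
```

With higher bit depths or full colour processing the raw rate climbs further, which is consistent with the "up to several terabytes per minute" figure quoted above.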
Somehow the huge amount of raw video data must be analysed and turned into knowledge. Furthermore, the data must be reduced to a reasonable size, since it might be stored for days, months or years.
Civilian Gigapixel Imagery
The first gigapixel digital stitched image is thought to have been made by a hobbyist in 2003. The web page for this achievement is at http://www.tawbaware.com/maxlyons/gigapixel.htm
The image was made up from 196 individual pictures taken with a 6-megapixel digital camera. Interestingly, this is not greatly different from ARGUS-IS imagery, which effectively comprises 368 individual pictures taken using 5-megapixel cameras.
Civilian gigapixel imaging projects are becoming very popular. Just Google "gigapixel photography" and you will see the large number of projects and companies involved with this exciting new era of photography. Such images are becoming known as "gigapans". Note, however, that these images are generated by stitching hundreds of smaller images from regular cameras, whereas ARGUS-IS generates 1.8-gigapixel images natively with a single photograph (or, more correctly, a single frame of a video image).
The largest stitched image currently in existence is a 681-gigapixel image of the Moon taken over a four-year period. You can see the zoomable image for yourself at http://lroc.sese.asu.edu/images/gigapan/
Also of interest is the Seattle Gigapixel ArtZoom project, which was sponsored by Microsoft and celebrates the arts in Seattle, Washington, USA. It's in the form of an interactive image at http://gigapixelartzoom.com/ Terapixel images have also been created.
You can make your own gigapan images. Just Google "make your own gigapan" to see the wide range of equipment, websites and software available to assist in doing this. One of the most basic pieces of equipment is a motorised mount that automatically steps a camera through a series of pan and tilt positions to cover a scene.
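For readers tempted to try it, the amount of shooting involved is easy to estimate. The 10% frame overlap below is a typical stitching figure chosen for illustration, not something specified by the sources above.

```python
# Estimate how many photos a DIY gigapixel panorama needs.
# Assumption for illustration: 10% overlap between adjacent frames in each axis.

target_pixels = 1e9          # a 1-gigapixel result
camera_megapixels = 6        # a 6-megapixel camera, as in the 2003 image
overlap = 0.10               # fraction of each frame shared with its neighbour

useful_pixels_per_shot = camera_megapixels * 1e6 * (1 - overlap) ** 2
shots = target_pixels / useful_pixels_per_shot
print(f"About {shots:.0f} shots")   # ~206, close to the 196 actually used
```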
This is done using an advanced software suite known as "Persistics", developed by the Lawrence Livermore National Laboratory in the USA under a DARPA contract. Persistics takes the
data stream from ARGUS-IS (or other
surveillance platforms) and extracts
relevant information such as the movement of people (or “dismounts”, in
military jargon) and vehicles. At the
same time, it compresses non-changing
background information, such as
stationary objects or geographical
features, by up to 1000 times. No data
is lost with this approach and indeed,
sub-pixel resolution can be achieved
for either background objects or moving people or vehicles. For ARGUS-IS, a pixel corresponds to an area on the ground ranging from less than one square metre to several square metres.
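Persistics itself has not been published, but the basic idea of separating a largely static background from a small number of moving objects can be illustrated with an off-the-shelf background-subtraction routine. This sketch uses OpenCV's MOG2 model on an ordinary video file; it is a generic illustration of the principle, not the algorithm used by Persistics.

```python
import cv2

# Illustration of background/foreground separation on ordinary video:
# a running statistical model of the background lets the static majority
# of the scene be described very compactly, while moving objects are
# flagged for closer attention. (Generic technique, not Persistics itself.)

cap = cv2.VideoCapture("aerial_clip.mp4")          # any video file
bg_model = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

frame_no = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame_no += 1
    fg_mask = bg_model.apply(frame)                # non-zero where pixels changed
    moving_fraction = (fg_mask > 0).mean()
    if frame_no % 30 == 0:
        print(f"frame {frame_no}: {moving_fraction * 100:.2f}% of pixels moving")

cap.release()
```

In a persistent-surveillance context, the unchanged background pixels are exactly the ones that compress heavily, while the small moving fraction is where the analysis effort is concentrated.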
The Persistics software suite can not
only operate in near real-time mode
but also in a "forensics" mode, where past data is analysed to detect the precursors to an adverse event. For example, in the event of a terrorist act, the video can be reviewed in reverse to establish the original location of the terrorists who conducted the act.

This rendering of the ARGUS-IS imaging head shows the four lenses associated with the composite focal plane arrays and the 6-axis gimbal mechanism.
Persistics also employs advanced
analysis algorithms that enable it to
“stare” at people and vehicles of interest for extended periods of time and
thus automatically detect anomalies
that the system might be programmed
to look for. For example, a person of
interest might deviate from a regular
route and drive somewhere that they
might not normally go. This behaviour
could be detected, tracked and an
alarm issued to the system’s operators.
A preselected target of interest can
also be programmed into the system
and an alert issued if the person or
vehicle is found. And thousands of
targets of interest over a 100km2 area
can be simultaneously tracked.
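The "deviation from a regular route" example above boils down to comparing a new track against previously observed ones. A minimal sketch of that idea follows; the fixed distance threshold and the flat-earth metres-per-degree conversion are simplifying assumptions for illustration, and this is not BAE's or LLNL's actual method.

```python
import numpy as np

# Flag points on a new track that stray too far from a previously observed
# "regular" route. Simplified flat-earth geometry and a fixed threshold are
# assumptions for illustration only.

METRES_PER_DEG = 111_000          # rough conversion at mid latitudes
THRESHOLD_M = 200                 # how far off the usual route counts as odd

regular_route = np.array([(0.000, 0.000), (0.001, 0.001), (0.002, 0.002)])
new_track     = np.array([(0.000, 0.000), (0.001, 0.004), (0.002, 0.002)])

for lat, lon in new_track:
    # distance from this point to the nearest point on the regular route
    d = np.min(np.hypot(regular_route[:, 0] - lat,
                        regular_route[:, 1] - lon)) * METRES_PER_DEG
    if d > THRESHOLD_M:
        print(f"Anomaly: point ({lat}, {lon}) is {d:.0f} m off the usual route")
```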
ARGUS-IS data can be processed to generate 3D models of areas of interest.

A typical imaging result from ARGUS-IS taken over Quantico, Virginia, USA. The central image is the 1.8-gigapixel picture and each of the small inset windows (except for the top-left image and the helicopter) represents an area of interest that is a zoom into the high-resolution image. The top-left image shows the general area on Google Earth, with the dark mosaic image in the centre of the diagram representing the ARGUS-IS picture. The helicopter at bottom left is a Blackhawk carrying an ARGUS-IS pod. To give a rough idea of the size of such an image, if it were square (which it isn't, but we are approximating) it would be 42,426 x 42,426 pixels. In the central ARGUS-IS image, the contribution of the individual sensing elements is evident from the slight brightness variations at their edges.

The advanced algorithms used in the data analysis make extensive use of another consumer item – high-powered
graphics chips (GPUs), as fitted to PCs
for computer games. These are used
because of their ability to quickly
process large amounts of graphics data.
Some of the processing power is used
to detect and compress non-changing
data, while the remainder is used to
find and analyse targets of interest.
Before the Persistics software can analyse the received video, it must first pre-process the imagery. A technique known as "pixel-level dense image correspondence" is used to carry out the following steps:
• Stabilise the video to remove the effects of vibration of the platform that acquired it (a simple illustration of this step appears after this list);
• Remove or account for any parallax errors due to processing images from multiple passes by a surveillance platform, providing a "straight down" view and thus making it much easier to identify moving targets of interest;
• Compress non-changing background imagery;
• Detect moving people or vehicles of interest and provide sub-pixel resolution of such objects;
• Provide seamless stitching of adjacent images taken by different cameras to make a large virtual image;
• Improve the signal-to-noise ratio; and
• Account for exposure variations in different parts of the image or due to the use of different cameras.
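The first of those steps, stabilisation, can be illustrated with a few lines of OpenCV. Fitting a single rigid transform per frame is a simplification chosen for the sketch; the real system must also deal with parallax and exposure differences, as the list above notes, and this is not the Persistics implementation.

```python
import cv2

# Minimal stabilisation sketch: track corner features from one frame to the
# next, estimate the rigid motion between them and warp each frame back onto
# its predecessor. A per-frame rigid fit is a simplification only.

cap = cv2.VideoCapture("shaky_clip.mp4")        # any video file
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400,
                                  qualityLevel=0.01, minDistance=8)
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    good = status.reshape(-1) == 1

    # Transform mapping current-frame coordinates back to previous-frame ones
    m, _ = cv2.estimateAffinePartial2D(nxt[good], pts[good])
    h, w = gray.shape
    stabilised = cv2.warpAffine(frame, m, (w, h))

    # 'stabilised' is now registered to the previous frame and could be passed
    # to later steps (background compression, target detection, stitching)
    prev_gray = gray

cap.release()
```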
Persistics can also automatically
apply computing power where it is
needed. For example, the software
might detect a moving enemy convoy
and devote extra resources to tracking that, or it might observe an area
surrounding friendly troops to detect
any nearby threats. The data can be
integrated with existing maps and
other metadata of interest.
Questions & answers
Examples of questions that Persistics might answer include:
• Where was a particular vehicle between 10am and 11am this morning?
• What vehicles and people visited
this building in the last eight hours?
• What places has this person visited
in the last two days?
• What are the origins of the group
of vehicles assembled at this location?
• Over the last four weeks, how often did the person living at this location visit this suspicious building?
• What time does this person normally leave their house in the morning?
• What persons or vehicles previously visited the location of a terrorist attack?

Persistics software can be used to track vehicles. In this example, the track of a suspect vehicle is shown in green when a visit is made to a suspicious building, outlined in the red box. Note also a vehicle track in mauve. Persistics can also generate a road-traffic pattern, as shown at upper right. Here, the width and brightness of the lines representing roads are shown in proportion to the traffic flow observed. Any deviation from regular patterns of traffic flow might constitute an anomaly that requires further investigation. The spurious tracks shown are errors generated by unusual sun angles and other causes. Showing individual vehicle tracks is an effective means of data compression: apart from being easy for an analyst to see, a conventional view of a vehicle of interest would not provide direct track information and would consist of hundreds of individual pictures with only the position of the vehicle changing a small amount between images.
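Each of the questions listed above is essentially a query over an archive of time-stamped object tracks. The toy record layout below (object ID, timestamp, position) is invented purely for illustration and says nothing about how Persistics actually stores its data.

```python
from datetime import datetime

# Toy track archive: (object_id, timestamp, latitude, longitude).
# The layout is invented purely to illustrate the kind of query involved.
archive = [
    ("vehicle_17", datetime(2014, 12, 1, 10, 5),  39.7589, -84.1916),
    ("vehicle_17", datetime(2014, 12, 1, 10, 40), 39.7702, -84.1750),
    ("vehicle_23", datetime(2014, 12, 1, 10, 20), 39.7611, -84.2000),
]

def where_was(obj, start, end):
    """Answer: where was this object between two times?"""
    return [(t, lat, lon) for o, t, lat, lon in archive
            if o == obj and start <= t <= end]

print(where_was("vehicle_17",
                datetime(2014, 12, 1, 10, 0),
                datetime(2014, 12, 1, 11, 0)))
```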
As a result of this, direct relationships and patterns of activities can be
established between people, vehicles,
buildings, events, times and locations.
Also, since everything is recorded
and archived, patterns of activity can
be established, even for people not
currently under suspicion if it is later
determined that they were involved in
terrorist or other enemy activity.
Unfortunately, no video demonstrating the Persistics/ARGUS-IS ability to track people seems to have ever been released; only a few still images are available. However, some sense of what might be possible can be gained from a video released for the VIRAT system (Video and Image Retrieval and Analysis Tool) – see http://youtu.be/LkueCrzzRrk or search YouTube for "DARPA Video and Image Retrieval and Analysis Tool (VIRAT)".
Table 1: VIRAT Detection Tasks
Single Person: Digging, loitering, picking up, throwing, exploding/burning, carrying, shooting, launching, walking, limping, running, kicking, smoking, gesturing.
Person-Person: Following, meeting, gathering, moving as a group, dispersing, shaking hands, kissing, exchanging objects, kicking, carrying together.
Person-Vehicle: Driving, getting in (out), loading (unloading), opening (closing) boot, crawling under car, breaking window, shooting/launching, exploding/burning, dropping off, picking up.
Person-Facility: Entering (exiting), standing, waiting at checkpoint, evading checkpoint, climbing atop, passing through gate, dropping off.
Vehicle: Accelerating (decelerating), turning, stopping, overtaking/passing, exploding/burning, discharging, shooting, moving together, forming into convoys, maintaining distance.
Other: VIP activities (convoy, parade, receiving line, troop formation, speaking to crowds), riding/leading animal, bicycling.
VIRAT might be considered as a companion software suite to ARGUS-IS and Persistics and could be used to
scan through data from ARGUS-IS and
other surveillance platforms. VIRAT is
intended specifically to automatically
look for suspicious types of behaviour
which Persistics does not currently do.
From the original program documentation for the project (BAA08-20), VIRAT
ideally looks for the types of activities
listed in Table 1.
In short, VIRAT looks for short-term
activities which might be suspicious
in small geographic areas. For a bigger
picture, there is PerSEAS or “Persistent Stare Exploitation and Analysis
System”. This system also looks for
suspicious patterns of behaviour but
over a larger geographical area and
over much longer periods of time to
detect possible threats. Algorithms
from VIRAT provide some of the underlying capabilities within PerSEAS.
The US military is not the only
organisation that’s interested in the
data provided by Persistics, VIRAT,
PerSEAS and ARGUS-IS. Agencies
such as the Department of Homeland
Security also have an interest and the
data can also be used to check compliance with international nuclear treaties by monitoring buildings where suspicious activities might be taking place. Other obvious uses for this technology include border protection and safeguarding important infrastructure such as power plants.

Table 2: Summary Of ARGUS-IS Features*
Number of video windows: more than 100 user-defined windows can be simultaneously observed by operators.
Video tracking: the system can track "dismounts" (people) or vehicles as chosen by the operator, or it can initiate automated tracking.
Full field-of-view (FOV) vehicle motion detection: automated moving target indicators over the full 60° field of view.
Forensic archive: 2.5-3.3 frames of effectively lossless JPEG2000 archive in standard NITF metadata.
Unmanned-air-systems compatible: size, weight, power and function compatible with unmanned or manned fixed-wing, rotary and airship platforms.
Archive access: real-time forensic reach-back capability; thumbnails and metadata for ~40,000 targets.
Full FOV mosaics: calibrated and stitched full-FOV mosaic at a user-defined update rate for background context.
Gimbal assembly: six-axis stabilised.
Focal plane array assembly: four colour (Bayer) CFPAs, each containing 92 x 5-megapixel 2.2μm-pixel focal plane arrays.
Interface: data access and retrieval is provided through a standard web-service interface.
* From BAE Systems brochure.
Future developments
Methods of analysing patterns of
behaviour that are outside the bounds
of established cultural or social norms,
and which might indicate possible terrorist activity, are also under development. Research is also under way to use
Persistics data to develop 3D models of
areas of interest. The companion data
analysis suites VIRAT and PerSEAS
will also continue development.
YouTube Video
A YouTube video of the ARGUS-IS system, titled "Spy Drone Can See What You are Wearing From 17,500 Feet", is available at http://youtu.be/AHrZgS-Gvi4. This is an extract from the Nova program "Rise of the Drones" (released on PBS on January 23, 2013), which can be seen at http://youtu.be/IOzCiCl05Ec
On the hardware side, an infrared
version of ARGUS is also under development, designated ARGUS-IR.
ARGUS-IR is another DARPA-funded
project and is designed to address the
problem of current IR sensors which
have a narrow field of view, limited
resolution and a low frame rate.
ARGUS-IR will operate much like ARGUS-IS, the objective being to track dismounted targets at night. It will
use at least 130 independent video
streams for tracking individual targets
of interest. The concept will be similar
to ARGUS-IS in that it will use many
smaller sensors formed into one large
virtual sensor mosaic.
Civil liberties concerns
Some civil liberties groups in the
USA have expressed concern that this
technology might be used to monitor
American civilians. In particular, the
activities of civilians could be recorded and archived by the government,
even if they have done nothing wrong.
At the moment, ARGUS-IS is not used in any civilian environment in the USA. However, amid much controversy, the technique of Wide Area Persistent Surveillance is under consideration for use in Dayton, Ohio.

Believed to be a demonstration of the technology rather than the enactment of a real crime, this example from Dayton, Ohio shows how Wide Area Persistent Surveillance could be used to track a suspect vehicle. In this case, the observing platform was a manned aircraft and neither ARGUS-IS nor Persistics was used. The company involved with this technology is Persistent Surveillance Systems.
Conclusion
ARGUS-IS represents an exciting
new technology for use in intelligence
gathering and the fight against global
terrorism. However, the ability to “see
all and know all” also has potential
civil liberties concerns if the technology is used inappropriately by
governments, so it is important that
people know this technology exists
and understand what it can do.
To their credit, the US Government
did make information on ARGUS-IS
known to the general public and it is
up to all free people to ensure it is only
SC
used for its intended purpose.