Freeze motion in the movies – how it’s done

Freeze motion is an effect which appears to have taken the motion picture world by storm. Here we take you behind the scenes to show how it is done and, surprise, surprise – it is not done by computer-generated effects.

By BARRIE SMITH
If you’ve seen The Matrix films you’ll know the effect: the action freezes and the camera tracks around the subject, usually with Keanu Reeves, skirts akimbo and eerily aloft, while dishing it out to the evil forces.
Or it may be a bullet, stopped dead, camera moving around it. If only levitation and suspension of the element of time were so easy!
When viewed on the big screen, the effect is riveting. And these days, when big budget films appear to be absolutely chockers with computer-generated imagery, it’s refreshing to find this frozen-moment effect was perfected some 20 years ago by English visual artist Tim MacMillan and essentially uses well-proven photographic processes.
However, the principle of capturing an event in rapid, successive
frames goes farther back to the days
of Eadweard Muybridge, who shot
his famous horse walking/trotting/
cantering/galloping sequence (and
many others) with an array of still
cameras.
Camera array
Which brings us to the term which best describes the principal item of hardware, the ‘camera array’.
Put simply, the array is a firing
line of still cameras, fixed to a sturdy
metal bar or truss and curved in an arc
around the subject. When the subject
reaches a critical point in the frame,
the cameras are fired either in unison
or in very close succession (typically
10 milliseconds apart).
If fired simultaneously, the effect is
christened ‘frozen moment’ or ‘temps
mort’ (‘dead time’); if in rapid succession, the name ‘flow motion’ is
employed.
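To make the two firing modes concrete, here is a minimal sketch of the timing logic. It is not Mark Ruff’s controller; fire_camera() is a hypothetical stand-in for whatever hardware trigger a real rig would use.

import threading
import time

def fire_camera(index):
    # Hypothetical stand-in for the hardware trigger of camera 'index'.
    print(f"camera {index:02d} fired at {time.perf_counter():.4f} s")

def frozen_moment(num_cameras=30):
    # 'Frozen moment' / 'temps mort': all shutters released together.
    threads = [threading.Thread(target=fire_camera, args=(i,))
               for i in range(num_cameras)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

def flow_motion(num_cameras=30, interval_s=0.010):
    # 'Flow motion': shutters released in rapid succession, about 10ms apart.
    for i in range(num_cameras):
        fire_camera(i)
        time.sleep(interval_s)

frozen_moment()
flow_motion()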
When the succession of frames is
retrieved from the still cameras and
collated together into a recognisable
motion picture sequence, we get a
‘movie’. The action is frozen (the
shutter speed is often as fast as 1/1000
second or more) and sharp. But when
the movie is run, the camera appears
to be tracking around the subject.
An effective use of the effect is to
edit it into a normal, 24 frame/second
sequence shot; a motion picture camera is placed at either end of the still
camera array and lined up to match
the framing of the first or last of the
multiple cameras.
So the frozen moment may follow
a normal speed action, precede it or
even be used in the middle of the 24
frame/second sequence.
Melbourne’s Mark Ruff confesses
to being “obsessed with this image
technique” and has spent five years or
more perfecting his own system. He is quick to clear up any confusion: he is not connected in any way with the team that created the marvellous effects in The Matrix films; these were achieved with a film-based system, brought to Australia by Manex.
Film was also the basis of Mark’s first array, which fired its first rapid shots in 1997.
The inspiration came from seeing the BBC Natural History Unit series The Human Body, which employed Tim MacMillan’s Time Slice Camera: “I thought this guy is a hero for developing such a system.” The Time Slice Camera holds one length of film within its casing and has a longitudinal array of lenses and shutters.
Mark admits “this camera array has
certain limitations”, so MacMillan
invited Mark to Scotland to shoot a
job: “Rather than Tim spend a lot of
time and money on an array, he got me
[and the gear] over to do the job and
he essentially directed the time slice
component.”
Flare Obstacle
In common with some US systems,
Mark Ruff’s first approach employed
60 Pentax film SLR cameras and Sigma
lenses. It worked. But there were many
problems, mostly related to the build
of the cameras and lens quality — flare
was an obstacle — and even the shutter misfired on occasions when wear
crept in.
Moreover, the system was unwieldy in the post process. Not only did a cassette of 35mm film have to be loaded into each camera pre-shoot, the exposed films then needed processing and scanning to become a digital image file. The frames then had to be recorded onto 35mm motion picture negative and a print made.
This took two days, before you could even screen the sequence! Making tests was often as costly as the final shoot.

The principles involved in “freeze frame” photography go right back to the days of the celebrated Eadweard Muybridge (shown above right) and his amazing (for the time) “Horse in Motion” series of photographs. These were taken in 1877 as a result of an earlier wager as to whether all four of a trotting horse’s legs were ever off the ground at the same time. (He proved they were!) His work in stop-action series photography led to his invention of the “zoopraxiscope,” a primitive motion-picture machine which recreated movement by displaying individual photographs in rapid succession.

In this shot a 35mm Arriflex 435 motion picture camera is placed at the start of the still camera array and lined up to match the framing of the first of the multiple cameras and ‘hand over’ the action (moving left to right on the screen) to it.
Digital to go!
As many amateur snap shooters have found, going digital will not necessarily save you money. Mark Ruff figures his move to digital cost him ten times as much as the film approach but he describes the difference as “chalk and cheese”.
His current digital system is based on 30 Canon EOS 10D digital SLR cameras, complete with 30 Canon f3.5-4.5/24-85mm zoom lenses. If you walk into a camera shop, a single camera and lens will cost over $4000.
A digital rig, complete with 30 cameras, lenses and firing infrastructure
can be set up ready to shoot within
an hour. Doing a test is virtually free
— aside from time. If a problem does
arise, a re-shoot can be done immediately.
And as for post processing, the time
from shoot to sequence preview-ready
can be as short as 30 minutes. The
client can then give an OK on the
spot. At this point the digital to film
transfer has yet to be made but these
days film editing is computerised so
the digital sequence can be cut into the
main edit and the final recording to
film done when all the other material
is conformed.
At the moment, Mark’s ‘firing line’ can only shoot frozen moment sequences. He feels that this type of action “can be handled in more ways than a flow motion event. A non-linear ‘temps mort’ effect can be ping-ponged and/or zoomed into repeatedly to increase screen duration. With flow motion the action can only go in one direction.” More cameras are simply needed for flow motion.
A brace of 36 cameras is now available while 42 cameras is about the
maximum the current infrastructure
can handle, based mainly on the truss,
which is nine metres long.
With a computer algorithm called ‘sharp interpolation’, partly developed by Tim MacMillan, it is possible to create inter-frames; a 36-camera system could then produce a 72-frame sequence or even more and deliver an on-screen 3-second sequence.

Time vs Speed
The frozen moment effect simulates a motion picture camera moving at great speed. However, in the real world it is impossible to move a film camera at these speeds. The calculations are based on a rig on a 9-metre long truss, with the shutter speed (time is frozen) at 1/1000 second. The ‘window of time’ is one millisecond. All cameras fire in this brief moment, so it is like travelling nine metres in one millisecond: 9km/second or 32,400 kilometres/hour.
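The sidebar’s figures are easy to verify. The short sketch below is my own arithmetic, not Mark’s software; it reproduces the effective camera speed and, assuming one interpolated in-between frame per camera and 24 frame/second playback, the 3-second screen duration.

TRUSS_LENGTH_M = 9.0              # length of the camera truss
WINDOW_S = 1.0 / 1000             # all cameras fire within one millisecond

speed_m_per_s = TRUSS_LENGTH_M / WINDOW_S                 # 9000 m/s
print(f"Effective camera speed: {speed_m_per_s / 1000:.0f} km/s "
      f"({speed_m_per_s * 3.6:.0f} km/h)")                # 9 km/s, 32,400 km/h

CAMERAS = 36
frames = CAMERAS * 2              # one 'sharp interpolation' frame per real frame
print(f"{frames} frames at 24 frames/second = {frames / 24:.0f} seconds on screen")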
Normally, the cameras are spaced
20cm apart, lens centre to lens centre;
this is governed by the space necessary
at the camera’s side to insert and remove the CompactFlash memory card.
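A quick geometry check (my arithmetic, not from the article): at 20cm centre to centre, the 42-camera maximum mentioned earlier spans 8.2 metres, so the spacing and the nine-metre truss are consistent.

CAMERAS_MAX = 42
SPACING_M = 0.20                          # lens centre to lens centre
TRUSS_M = 9.0
span = (CAMERAS_MAX - 1) * SPACING_M      # first lens centre to last lens centre
print(f"{CAMERAS_MAX} cameras span {span:.1f} m of the {TRUSS_M:.0f} m truss")   # 8.2 m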
Initially, it took Mark about a week to
get the system up and running, plus
a further month to reach its current
form. He admits the “previous four
years of R&D helped of course — as I
knew exactly what to do.”
What also must have helped was
a degree in physics, a Bachelor of
Applied Science (Photography) from
RMIT and nearly a decade of real
experience as a technical director for
Melbourne’s Channel Ten. Mark has
also owned a business/studio servicing commercial/advertising photography for almost ten years and been
an ad agency staff photographer for
three years.
He remembers RMIT taught him “how to ‘think’ about taking a photo rather than just teaching you ‘how’ to take a photo”.
From camera to the Mac
After a sequence is shot, all the CompactFlash cards are removed from the cameras and the images downloaded into a Macintosh G4 laptop:
“An AppleScript sorts all the images
into appropriate takes (taking about 30
seconds) and positional stabilisation
achieved within minutes. Results can
then be burnt to DVD as data and/or
QuickTime files. It is therefore possible
to shoot, do the necessary post and
deliver to client all in the one day.”
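As an illustration of this step, here is a hypothetical sketch in Python rather than the AppleScript Mark actually uses. The folder layout, with one folder per CompactFlash card holding that camera’s frames in shooting order, is an assumption.

from pathlib import Path
import shutil

# Assumed layout: one folder per camera card (card_01 ... card_30), numbered in
# position order along the truss, each holding that camera's frames in shooting order.
SOURCE = Path("downloads")
DEST = Path("takes")

def sort_into_takes():
    for position, card in enumerate(sorted(SOURCE.glob("card_*")), start=1):
        for take, frame in enumerate(sorted(card.glob("*.jpg")), start=1):
            take_dir = DEST / f"take_{take:02d}"
            take_dir.mkdir(parents=True, exist_ok=True)
            # Name frames by camera position so each take plays back along the array.
            shutil.copy2(frame, take_dir / f"frame_{position:02d}.jpg")

sort_into_takes()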
There are registration problems connected with so many shots taken by so
many different cameras.
One disadvantage of a digital camera
is that the CMOS image sensor does not
consistently align with the camera’s
viewfinder screen: According to Mark,
“It does not matter how accurate you
are in an optical alignment, the pixels
will never be in exactly that same spot
you look at. It’s around 20-40 pixels
between each camera.”
But this aside, he added, it is gratifying that all images are registered so, “once a stabilisation path has been executed for that camera set-up, it applies for all takes. These framing, scaling, and rotational errors can be minimised (eliminated) with some clever software.”

Bike Sequence: in this series of shots taken with the technique described in this feature, you can see how much the background changes with respect to the bike rider, who appears to be moving in slow motion. The sequence runs down the columns.
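Mark’s point that a stabilisation path is measured once per camera position and then re-used for every take can be illustrated with a rough sketch. This is not his software; the OpenCV-based approach, the file names and the correction values below are all assumptions.

import json
from pathlib import Path
import cv2

# Assumed file of per-camera corrections, measured once from a reference take:
# {"frame_01": {"dx": 12.0, "dy": -8.0, "angle": 0.15, "scale": 1.002}, ...}
CORRECTIONS = json.loads(Path("stabilisation_path.json").read_text())

def correct(image, dx, dy, angle, scale):
    # One affine matrix: rotation and scale about the frame centre, plus a pixel offset.
    h, w = image.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2, h / 2), angle, scale)
    m[0, 2] += dx
    m[1, 2] += dy
    return cv2.warpAffine(image, m, (w, h))

for take_dir in sorted(Path("takes").glob("take_*")):
    for frame_path in sorted(take_dir.glob("frame_*.jpg")):
        params = CORRECTIONS[frame_path.stem]      # same correction for every take
        fixed = correct(cv2.imread(str(frame_path)), **params)
        cv2.imwrite(str(frame_path.with_name("stab_" + frame_path.name)), fixed)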
Jobs done
So far, Mark’s array has been used
to capture frozen moment sequences
in TV commercials for Toohey’s, Eveready batteries, the Nine Network
plus work for an Arnott’s corporate
video and various short films.
Mark is also a regular visitor to
India’s Bollywood, shooting TV commercials (one with cricketer Sachin
Tendulkar in Mumbai) and an Indian-produced, Tasmanian-located feature,
entitled ‘Boys’, which he describes as
“a fantasy dream sequence … three
set ups a day in different locations for
seven days.”
He has also “collaborated with
Dayton Taylor from Time Tracks who
operates another version of a multiple
lens camera. We worked on a BMW
shoot in Hollywood together.”
What’s Next?
Design work is just about complete to do the following:
• Control all camera settings (ISO
setting, colour temperature, shutter
speed, lens aperture etc) from the one
CPU. This is expected to be much
quicker than a number of people
manually adjusting cameras.
• Preview downloading of the images could be achieved almost instantly upon exposure, by hooking into a PAL (or NTSC) video signal output from the camera. This means two things: on a shoot, a client could see high-res results instantly; and near real-time broadcast playback could be made for various events, particularly sporting, as part of a super slow-motion replay. This can be done within five seconds.

The EOS 10D
While most digital SLR cameras have an image sensor that is half the area of the normal 35mm still film frame, by good fortune this is almost exactly the size of the motion picture frame. So data from a digital SLR has more than enough resolution for a movie, whether it be 4:3, 16:9 or even 2.35 (CinemaScope) aspect ratio.
The EOS 10D has 3072 x 2048 pixels available in its 22.7 x 15.1mm CMOS sensor. The camera also has a PAL/NTSC video output, so tapping into this for a video preview is possible.
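To put the sidebar’s numbers in context, the sketch below (my own calculation, not from the article) works out how much of the EOS 10D’s 3072 x 2048 sensor survives a crop to the common movie aspect ratios; every result comfortably supports the point that the sensor has more than enough resolution for a movie frame.

SENSOR_W, SENSOR_H = 3072, 2048          # EOS 10D pixel dimensions
sensor_ratio = SENSOR_W / SENSOR_H       # 1.5

for name, ratio in [("4:3", 4 / 3), ("16:9", 16 / 9), ("2.35 (CinemaScope)", 2.35)]:
    if ratio >= sensor_ratio:
        w, h = SENSOR_W, int(SENSOR_W / ratio)    # wider target: crop top and bottom
    else:
        w, h = int(SENSOR_H * ratio), SENSOR_H    # narrower target: crop the sides
    print(f"{name:20s} crop: {w} x {h} pixels")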
Mark has already conducted a trial of a video replay using a motor car race. At the moment he is bullish about the rig and its capabilities.
He is confident “there are no limitations at the moment — other than
the lack of an open cheque book to
implement all the options possible.
Even underwater is possible and an
outer space project should seem easy
without that gravity thing.”
Contact:
Mark Ruff Photography.
Office 03 9887 9364.
Mobile 0412 990 125.
Office at F.S.A. as well – contact
Russell Cunningham 02 9360 5800
Web site: www.ruffy.com
Email: ruffy<at>ruffy.com