Techno Talk
We both have truths, are mine the same as yours?
Max the Magnificent
We live in interesting times. Not so long ago, everyone living somewhere like the UK was exposed to
the same sources of news, and those sources did their best to maintain the integrity of their content. If
only we could say the same today.
There’s an old expression that ‘History is written by the victors.’
This means that, although certain
things may or may not have happened,
the eventual record of what occurred is
documented by people with power and
influence who at best were there and saw
it (if we’re lucky) or who just heard or
read about it (if we’re not). Either way,
they cast their own interpretation of what
happened under the influence of social,
political and personal considerations.
There’s also the fact that most of us
don’t have great memories to begin
with. Actually, that’s not strictly true
because my dear old mother’s mind is
so sharp that she sometimes remembers
things that haven’t even happened yet.
Be that as it may, in many respects the way our biological memory systems work is awesome overall; it’s just that most of us aren’t tremendously talented with respect to remembering specific happenings. Things are further complicated by the fact that our memories
are coloured by our emotional state at
the time an event takes place. And none
of this is aided by the fact that the way
our minds work means that every time
we recall something, we inadvertently
tweak that memory and end up storing
a slightly revised version.
As a result, since the dawn of time,
people who experience the same event
may recall it in very different ways. I
offer myself as living proof. Whenever
I do anything with my wife (Gina the
Gorgeous), sometime later, when she
tells the tale of our adventure to a third
party – like her mother, for example – it’s
as though I attended a completely different event (usually one for which my
presence was surplus to requirements).
That may be one of the reasons people were so excited by the advent of
photography, as epitomised by the oldest surviving photograph of an image
formed in a camera, which was created
by the French inventor Joseph Nicéphore
Niépce circa 1827. By the 1850s, printed photographs were starting to become
available to the general public. At that
time, most people assumed that a photograph offered a faithful depiction of a
scene, which led to sayings like, ‘The
camera never lies’.
As we now know, of course, the camera rarely tells the truth. Some people
lay the blame on the creators of digital image editing applications like
Photoshop. In response, I would suggest that these simple souls search for
‘Cottingley Fairies’ on Google.
Dire doings being done
Most of the time, photos are altered for
artistic or commercial effect. It’s unfortunate that some nefarious scoundrels
modify media with maleficent motives.
All of which leads us inexorably to the
topic of deepfakes (a portmanteau of
‘deep learning’ and ‘fake’). This involves using artificial intelligence (AI)
and machine learning (ML) to manipulate or generate synthetic media in
the form of audio and visual content.
For example, take a tool like Lyrebird – https://bit.ly/3NGDpYr – which can listen to someone talking (or to a recording of someone talking) for a couple of minutes and extract a ‘digital signature’ of that voice. Later, using a text-to-speech mode, it’s possible to create a recording of that person apparently saying whatever you want them to say.
I’m sure this capability will come in
handy for all sorts of legitimate purposes. What could possibly go wrong?
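To give a feel for how little code such a cloning step involves these days, here is a minimal sketch. It isn’t Lyrebird itself: it assumes the open-source Coqui TTS Python package and its XTTS voice-cloning model, and the file names are invented for the example.

# Minimal voice-cloning sketch (an assumption-laden example, not Lyrebird's tool).
# Assumes the open-source Coqui TTS package: pip install TTS
from TTS.api import TTS

# Load a multilingual voice-cloning model (the model name is an assumption;
# check the package's model list for what is currently available).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A couple of minutes of the target speaker is enough to capture the
# 'digital signature' of the voice and re-use it for arbitrary text.
tts.tts_to_file(
    text="Anything you would like the speaker to appear to say.",
    speaker_wav="reference_speaker.wav",  # hypothetical recording of the target voice
    language="en",
    file_path="cloned_output.wav",
)

Feed it a short reference recording and a line of text, and out comes an audio file of ‘that person’ saying words they never actually said.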
Where things start to get really scary
is when an AI is used to analyse videos of someone talking. Using natural
language processing (NLP) to understand what’s being said (one part of
understanding the emotional content
of the message), the AI can analyse
the video frame by frame and associate phonemes (units of sound that
distinguish one word from another)
with lip movements and other facial
phenomena (blinking, squinting, grinning, grimacing…).
Later, if presented with an audio
recording (a recording that could be
generated by a tool like Lyrebird), the
AI can generate a frame-by-frame photorealistic video of that person saying
whatever you want them to (appear to)
be saying.
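For the curious, here is a toy Python sketch of the shape of that pipeline. Every helper in it is a placeholder standing in for what would, in practice, be a large trained model, and the phoneme-to-mouth-shape table and file name are invented for the example.

# Conceptual sketch of an audio-driven 'talking head' pipeline.
# The helpers below are placeholders, not a real deepfake library.
from dataclasses import dataclass

@dataclass
class Phoneme:
    symbol: str   # e.g. 'HH', 'AH'
    start: float  # seconds
    end: float

def extract_phonemes(audio_path: str) -> list[Phoneme]:
    # In reality a speech model would return timed phonemes here.
    return [Phoneme("HH", 0.00, 0.08), Phoneme("AH", 0.08, 0.20)]  # dummy data

# Phoneme-to-viseme table: which mouth shape goes with which sound,
# learned frame by frame from real footage of the target person.
VISEMES = {"HH": "slightly open", "AH": "wide open"}

def render_frame(mouth_shape: str) -> str:
    # In reality a generative model would warp the target face here,
    # matching lips, blinks and other facial cues to the sound.
    return f"frame<{mouth_shape}>"

def synthesise_video(audio_path: str) -> list[str]:
    # Drive the face frame by frame from the (possibly cloned) audio track.
    return [render_frame(VISEMES.get(p.symbol, "closed"))
            for p in extract_phonemes(audio_path)]

print(synthesise_video("cloned_output.wav"))

A real system would, of course, replace those stubs with the frame-by-frame models described above; the point is simply that the audio track drives the face, one phoneme at a time.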
Fake news!
One of the things I didn’t fully appreciate when I lived in England was the
quality of the news. After spending
the past 33 years of my life listening
to the drivel they call news here in the
US, hearing the dulcet tones of a BBC
news announcer explaining what’s
actually happening without offering
their personal interpretation brings a
little tear to my eye. The news channels over here are so far apart, and
some are so divorced from reality, that
reports of the same happening suggest
the events they describe occurred on
different planets.
The same thing happens with newspapers. I know that the tabloids in the
UK tend to lean one way or the other, politically speaking. As I recall, however,
at least you used to get the impression
they were reporting the same thing, albeit from their own perspective (I hope
this is still the case). Once again, things
are more dire here in the US.
And then there’s the fact that many
people now obtain their news from social media channels whose algorithms
are tuned to keep on feeding each viewer more of what they like, which boils
down to them only seeing things that
reinforce what they already believe to
be true.
Things are further confused by the
advent of generative AIs like ChatGPT,
whose models are trained on open-source material that may have inbuilt
biases and inaccuracies. As Pontius
Pilate says (well, warbles) to Jesus in
Andrew Lloyd Webber’s Jesus Christ
Superstar, ‘And what is “truth”? Is truth
unchanging law? We both have truths.
Are mine the same as yours?’
Is there an answer to all this?
Personally, I’m longing for the day when my AI-powered augmented reality (AR) glasses highlight misinformation in text (with associated links to validated sources), add pop-up information bubbles to correct misrepresentations on television, and pre-emptively whisper ‘they’re lying’ in my ear whenever a politician opens his or her mouth.
(What? Cynical? Moi?)