Silicon Chip, Australia's electronics magazine, August 2022
SPY-DER
A 3D-PRINTED DIY ROBOT
By Arijit Das
SPY-DER is a speech and web-controlled surveillance spider robot.
It walks like a spider and acts as a spy using its camera, hence the
name “SPY-DER”. The best aspect of it is that you can make it yourself
using some 3D-printed parts, a bunch of servos and some low-cost
off-the-shelf electronic modules!
You can control this robot in two ways: using voice commands or its web-based control interface. For example, I have nicknamed mine “Bumblebee”. Whenever I call it by that name, it starts listening
to me, and it will then act on voice commands. I am using
two main technologies to enable this: hot-word or wake-word detection, and speech recognition.
The speech recognition also involves intent detection,
so that I can give it the same command in different ways.
For example, if I say “wave your hands” or “say hello”,
either way, it will wave its legs.
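The idea of mapping several spoken phrasings onto one intent, and one intent onto one action, can be sketched in a few lines of Python. The phrase, intent and action names below are hypothetical illustrations, not the actual labels used in my trained model:

```python
# Sketch of speech-to-intent dispatch: many phrasings, one intent, one action.
# All names here are illustrative assumptions, not the real model's labels.
PHRASE_TO_INTENT = {
    "wave your hands": "wave",
    "say hello": "wave",
    "move forward": "forward",
    "walk ahead": "forward",
}

INTENT_TO_ACTION = {
    "wave": "do_wave",     # wave the front legs
    "forward": "do_walk",  # walk forwards
}

def dispatch(phrase):
    """Resolve a spoken phrase to the robot action it should trigger."""
    intent = PHRASE_TO_INTENT.get(phrase.lower().strip())
    return INTENT_TO_ACTION.get(intent, "ignore")
```

In the real robot, Rhino does the phrase-to-intent part for you; the table above just shows why “wave your hands” and “say hello” end up doing the same thing.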
For the web control part, one can simply open a particular URL in any browser and use it to control the SPY-DER.
The web-based interface contains all the control options
as buttons. You can open another URL to watch the live
video feed from this robot’s camera.
You can see a short demonstration video that shows what
SPY-DER can do at https://youtu.be/3edXTxIZ_2U
Developing SPY-DER
Initially, I built a simple Bluetooth-controlled spider
robot using an Arduino Nano, but it could only be controlled using an Android or iOS app. Thus, I added speech
recognition, web control and surveillance features.
Implementing all these features using an Arduino was
impossible; I needed a small computer. That’s why I decided
to add the Raspberry Pi Zero. The whole system could have
been implemented using just the Raspberry Pi Zero, but
it would be too time-consuming to rewrite all the spider
movement control code.
So I decided to keep the Arduino and add the Raspberry
Pi and have them communicate over a serial link. The
Arduino controls all the spider’s movements while the
Raspberry Pi sends commands to the Arduino. This also
means that I don’t have to worry about the Raspberry Pi
being so busy doing speech recognition that it loses control of the limbs!
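The article doesn't spell out the exact serial protocol, so here is a minimal sketch under one assumption: each action is a single lowercase command word, newline-terminated, sent from the Pi to the Arduino. The command names themselves are taken from the robot's feature list:

```python
# Sketch of framing commands for the Pi-to-Arduino serial link.
# The newline-terminated single-word framing is an assumption for
# illustration; the real Arduino sketch defines its own command set.
VALID_COMMANDS = {"stand", "forward", "backward", "left", "right", "wave", "dance"}

def frame_command(cmd):
    """Return the bytes to write to the serial port for one command."""
    cmd = cmd.strip().lower()
    if cmd not in VALID_COMMANDS:
        raise ValueError(f"unknown command: {cmd!r}")
    return (cmd + "\n").encode("ascii")
```

On the Pi, something like pyserial's `serial.Serial("/dev/serial0", 9600).write(frame_command("wave"))` would put those bytes on the wire; the port name and baud rate there are also assumptions.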
Fig.1: this diagram shows all the wiring
required for the SPY-DER robot. The
order in which the servos are connected
is important; see Fig.2, and note that the
wire colour coding can vary between
models. Also, be careful to check the
labelling on the other modules as they
might not precisely match what we’ve
shown.
All the Raspberry Pi code is written in Python. For the
web-based control part, I used the Flask framework and
built the web page using HTML, CSS and jQuery. For the
live video streaming, I used RPi-Cam-Web-Interface (see
https://elinux.org/RPi-Cam-Web-Interface) because it has
very low latency.
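To give an idea of scale, a button-style Flask control page only takes a few lines. This is a minimal sketch, not the project's actual web_control.py; the route names and page markup are assumptions, though the real interface does serve on port 5010:

```python
# Minimal Flask sketch of a button-style robot control page.
# Route names and markup are illustrative assumptions; the real
# web_control.py lives in the project's GitHub repository.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    # One link per action; each one hits a /cmd/<name> URL.
    return ('<a href="/cmd/forward">Forward</a> '
            '<a href="/cmd/wave">Wave</a>')

@app.route("/cmd/<name>")
def cmd(name):
    # On the real robot, this is where the command would be
    # written to the Arduino over the serial link.
    return f"sent {name}"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5010)
```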
For speech recognition and hot word detection, I used
Picovoice (https://picovoice.ai/) and modified the code in
Python. I tried using local speech recognition, but as the
RAM and processing power of the Raspberry Pi Zero is very
limited, the accuracy was not that good, and the latency
was also very high.
The robot's physical parts are based on an existing robot that I found at thingiverse.com/thing:2901132 (but it has since been removed).
I redesigned a few parts in TinkerCAD (www.tinkercad.com/) and made all of the relevant parts available online at thingiverse.com/thing:4815137. I 3D-printed all those parts
using an Ender 3 3D printer (see Photos 1 & 2).
Starting assembly
If you prefer to watch a video, I have made one just over an hour long going over the project in detail at https://youtu.be/KkZiZggtvIU, which is definitely worth watching before you start assembly. Also see the parts list later in the article for what you will need to build it.
I have created another video just under 30 minutes long
that concentrates on the steps for building SPY-DER, which
you can view at https://youtu.be/fnMmnd9k6q8
Step 1 – 3D printing the parts
First, if you haven’t already done so, print all the 3D
parts that make up the robot.
Step 2 – attach the servo motors
Next, you need to attach the twelve SG90 servo motors using M2 screws, as shown in Photos 3 & 4. Four of the
12 servo motors connect to the body while the other eight
connect to the legs. Attach them with screws, but don’t
add the ‘horns’ yet.
Plastic gear servo motors are used for this project as the
robot is pretty light.
I have some details on attaching the servos, along with the following Steps 3, 4, 5 & 6, in the video at https://youtu.be/fnMmnd9k6q8
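As background to the calibration in Step 6, hobby servos like the SG90 are positioned by pulse width at roughly 50Hz: nominally 500µs for 0°, 1500µs for the 90° centre and 2500µs for 180°. Real units vary from those nominal endpoints, which is exactly why calibration happens before the horns go on. A quick sketch of that nominal relationship (the endpoint values are typical SG90-class assumptions):

```python
# Angle-to-pulse-width relationship for a typical SG90-style servo.
# The 500-2500 microsecond endpoints are nominal assumptions; real
# servos vary, which the calibration step compensates for.
def pulse_width_us(angle, min_us=500, max_us=2500):
    """Pulse width in microseconds for a servo angle of 0-180 degrees."""
    if not 0 <= angle <= 180:
        raise ValueError("angle must be 0-180 degrees")
    return min_us + (max_us - min_us) * angle / 180
```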
Step 3 – join the body parts
Then connect all the 3D-printed body parts through the servo motors – see Fig.2. Don’t attach the horns just yet.
Step 4 – connect the battery and BMS
As the power requirements of the 12 servo motors are pretty high, I used two 18650 Li-ion cells in series. The Arduino, servo motors and Raspberry Pi all require a 5V DC supply. An LM2596 buck converter is used to convert the 7-8V output of the battery to a regulated 5V, which is then fed to all the components. For safety, a battery management system (BMS) is also used.
Fig.1 shows how these parts are connected, including some other parts we’ll get to shortly.
Make sure that when you join these, the assembly can still fit within the robot’s body, as inserting it is the next step. Photo 5 shows how I wired these parts up (including the on/off toggle switch), while Photo 6 shows it installed in the robot body. Note how the servo power/control leads have been fed into the main cavity.
Step 5 – setting up the Arduino
First, plug the Arduino Nano into the socket on the Prototype Shield – make sure it’s the right way around. Next, plug all the servo motors into the headers on the I/O shield. While attaching the servo motors, make sure you have attached them according to the numbering shown in Figs.1 & 2 and with the black wires to the side marked “G” (for ground).
The I/O shield also needs to be wired up to the power supply which powers the servo motors and the Arduino. Make sure the power switch is off when you connect it.
Photo 7 shows what the Nano looks like once placed inside the robot’s body.
Step 6 – servo calibration
Now you need to upload the code to the Arduino Nano. The code is available to download from https://github.com/Arijit1080?tab=repositories (a copy is also available from the Silicon Chip website).
The first step is to calibrate the robot legs. The program to do this is in the “Legs” folder (named “legs.ino”).
Before calibrating the servo motors, check that their connections are correct and they are appropriately powered. After running the legs.ino calibration sketch, screw on the horns that hold the legs to the body.
Step 7 – initial functional testing
To check the basic functionality of the robot, there is another sketch named “program1.ino” in the program1 folder of the GitHub repository. After uploading this, when you power the robot up, it will automatically start testing all the features in the following order:
• Stand up
• Move forward
• Move backwards
• Move left
• Move right
• Hand wave
• Dance
Any deviations from the above movements need to be checked, as they suggest an incorrect connection or a component that is not working properly. To know more about this and the last step, you can watch my videos.
Step 8 – uploading the final Arduino code
Now you can upload the final code to the Arduino; this version works with the Raspberry Pi. The code is available from siliconchip.com.au/link/abd3 (and the Silicon Chip website). Upload the “SPY-DER_Arduino.ino” file to the Arduino.
This program takes commands from the Raspberry Pi and acts accordingly.
Step 9 – preparing the Raspberry Pi
Start by installing the latest version of the Raspbian operating system on the Raspberry Pi. You can use SSH or a direct HDMI connection while working with the Raspberry Pi.
Step 10 – Raspberry Pi microphone & camera
The Raspberry Pi needs to have the mic, camera and logic level shifter attached, as shown in Photo 8.
The logic level shifter is needed to interface the 5V Arduino with the 3.3V I/Os on the Raspberry Pi. The Arduino connects to the 5V (“HV”) side of the level shifter while the Pi goes to the 3.3V (“LV”) side.
I have a general video about using a level shifter like this for serial communication between different boards at https://youtu.be/e04br5J4UpQ
To connect a microphone to the Raspberry Pi Zero, there are three options:
1) Connect a USB microphone using an OTG cable
2) Connect a microphone with a 3.5mm jack plug using a Raspberry Pi sound card and OTG cable
3) Use a Raspberry Pi audio HAT.
I suggest you connect a USB microphone using an OTG cable, as I did. The Raspberry Pi supports most standard USB microphones.
For the camera, use a standard Raspberry Pi camera (www.raspberrypi.com/products/camera-module-v2/) and plug it in as per the instructions. I have a video on using the Raspberry Pi Camera with a Raspberry Pi Zero at https://youtu.be/oo0A_yRrIxQ
Step 11 – setting up the Raspberry Pi
The remaining setup steps are as follows:
Fig.2: match these servo
numbers up with the
connections shown in Fig.1.
1) Set up VNC Connect on the Raspberry Pi so that you
can remotely access and control it from your computer.
2) Switch on the camera in the settings or use raspi-config from the command line. Check that the camera works; the raspistill utility can be used to test it.
3) Enable the microphone and then test recording from
the terminal. You might need to modify the “.asoundrc”
file to set up the mic.
4) Test serial communications between the Raspberry
Pi and Arduino.
5) Clone all the code from my GitHub repo (siliconchip.
com.au/link/abd3) onto the Raspberry Pi (say, into the
home folder).
6) Clone the Picovoice (https://picovoice.ai/) repository
from https://github.com/Picovoice/picovoice and then
launch the Picovoice program in my GitHub repository
(see the README file).
7) Install RPi-Cam-Web-Interface for video streaming. You can get it from https://elinux.org/RPi-Cam-Web-Interface and see the video at https://youtu.be/yzpqEw1kEGo for more details.
8) Train the Rhino speech-to-intent model so that for a
single task, you can use different commands; Rhino is contained in the Picovoice repository.
To train the model, open a web browser, go to https://console.picovoice.ai/rhn and input different kinds of commands and their intents – see Screen 1. Depending on the intents you use here, you need to change the “picovoice_demo_mic.py” file.
After writing down all the commands and intents, follow the prompts on the webpage to train the model by
using the microphone, then upload the trained model to
the Raspberry Pi.
9) For web control, you need to install the Flask
framework in Python; all the Python & HTML files are in
my repository.
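Regarding the “.asoundrc” file mentioned in point 3 above, a typical minimal version for a USB microphone looks like the following. The card and device numbers (hw:1,0 here) are an assumption for a Pi with one USB audio device; check yours with arecord -l:

```
pcm.!default {
    type asym
    capture.pcm "mic"
}
pcm.mic {
    type plug
    slave {
        pcm "hw:1,0"
    }
}
```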
Step 12 – finishing the build & controlling the robot
Fit everything inside the body (Photo 9) and glue the
microphone and camera into the holes provided in the lid
(Photo 10). Attach the lid, power it up and then use VNC to
connect to the Raspberry Pi wirelessly from your computer.
Photo 11 shows the completed robot with the lid attached.
To start the web control interface, open a console inside
the SPY-DER GitHub repository root folder and enter the
following commands:
cd web_control
python3 web_control.py
After running these commands, you can access the
web control interface from any browser using the URL
http://<raspberry_pi_ip_address>:5010 (insert the current
IP address of your Raspberry Pi) – see Screen 2. From here,
the robot can be controlled using all those buttons. You can
modify the control interface by changing the code in the
“web_control” folder.
Step 13 – speech control
Go into the “picovoice” folder to run the speech control system. There are three files there you will need. The first one is the main code file, named “picovoice_demo_mic.py”. Modify this code according to your speech-to-intent model training.
The next file needed is the Porcupine keyword file. This contains the keyword that you will use to call the robot. There are many pre-trained keyword files available in the Picovoice repository; you can choose any of the keywords to use as your robot’s wake word.
Finally, you need the speech-to-intent model, which you have already trained and downloaded. Then you can run the code with these two files using the following commands:
cd picovoice
python3 demo/python/picovoice_demo_mic.py \
--keyword_path resources/porcupine/resources/keyword_files/raspberry-pi/bumblebee_raspberry-pi.ppn \
--context_path your_rhino_model
In this example command, I have used “bumblebee_raspberry-pi.ppn” as the keyword file, so “bumblebee” is the wake word for my robot.
Step 14 – video streaming
You can enable live video streaming either using voice
commands or the web control interface. After turning on
the live video surveillance, to access it, open the URL
http://<raspberry_pi_ip_address>:80 in a web browser.
Conclusion & future improvements
There is plenty of room for modifications to this project.
For example, if a local speech recognition system could
be designed that would perform well on a Raspberry Pi,
that would speed up its response to voice commands and
remove the need for an internet connection.
The open-source Snowboy hot-word detector is another option that works pretty well on the Raspberry Pi. Image processing features like object detection, face recognition etc could also potentially be added to this project. Maybe I will upgrade it in the future!
SC
Parts List – SPY-DER Robot
3D printed robot parts
1 Arduino Nano microcontroller module
1 Raspberry Pi Zero W embedded computer
1 Raspberry Pi camera
1 5V to 3.3V logic-level shifter
[AliExpress siliconchip.au/link/abdk]
1 Nano 3.0 Prototype Shield
[AliExpress siliconchip.au/link/abdl]
12 SG90 mini servo motors
[AliExpress siliconchip.au/link/abdm]
1 LM2596-based buck converter module
[Silicon Chip Cat SC4916]
1 Lithium-ion 2S battery (nominally ~7.4V)
[eg, from Hobby King or two 18650 Li-ion cells in
series]
1 Li-ion 2S battery management system
1-2 bright LEDs (eg, 5mm blue types, for eyes)
1-2 current-limiting resistors for LEDs (eg, 220Ω 1/4W)
1 USB microphone
1 USB OTG Micro-B cable or adapter
1 SPST/SPDT switch (eg, toggle or slide) rated 5A DC
4 M2 x 50mm machine screws and nuts
1 pack of DuPont jumper wires (mostly short female-female types)
36 No.2 x 6mm self-tapping screws (may be included
with servos)
various lengths and colours of medium-duty hookup
wire
► Screen 1: the Picovoice Rhino
training console. Here you
can teach it how you say the
different words that you will
later use to control the robot.
You’ll need to sign up for
an account on the Picovoice
website to allow you to do this.
► Screen 2: the SPY-DER web control interface is quite simple, and all the functions of the buttons are pretty obvious. This works in parallel with voice control, assuming you have voice control up and running.