
Gauss the Machine Learning Robot Platform

This is a friendly DIY robot, made for a class at my university, that uses Matlab and Python to process images and sound.

In my fourth year studying Electronics Engineering I decided to make a robot for a class I attended last semester. The class was called Theory of Systems and Signals, and there I became fascinated by the things we can accomplish with simple mathematical operations powered by the computing capabilities of programs such as Matlab or programming languages such as Python.
The main applications developed include machine learning for autonomous driving on a circuit, voice recognition for remote control and telematics, augmented reality with QR code recognition, and face recognition. An Android app was also developed to control the robot.

The robot is based on a Raspberry Pi model B+ configured with a custom Wheezy OS. It has a WiFi card for creating its own network for remote connections and remote control. This allows a computer to connect to the board and access the telematics data held by the board and the Arduino. The connection is SSH based, and updates can be sent to the board via a complementary TCP socket.

The Raspberry Pi clearly doesn't have the computing power required for what I want to achieve (autonomous driving and image processing), so the way to go is to process everything on an external computer and then let the robot act. The main robot application runs on a computer outside the robot, and the robot in turn acts as a "puppet" for the machine. This way we can also control the robot remotely with a joystick and acquire the telematics data required for autonomous driving. This solves the issue of processing power at a relatively low cost, enabling us to test much more advanced algorithms such as the VGG16 neural architecture.
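As a minimal sketch of this puppet scheme (the port number and the JSON command format are my own assumptions, not the project's actual protocol), the robot side can listen on a TCP socket and apply whatever wheel-speed commands the computer sends:

```python
import json
import socket

def serve_commands(host="0.0.0.0", port=5005, handler=print):
    """Robot side: accept one controller connection and apply each
    newline-delimited JSON command, e.g. {"left": 120, "right": 110}."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn, conn.makefile("r") as stream:
            for line in stream:
                handler(json.loads(line))  # forward to the Arduino here

def send_command(host, port, left, right):
    """Computer side: send one wheel-speed command to the robot."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((host, port))
        cli.sendall((json.dumps({"left": left, "right": right}) + "\n").encode())
```

In practice the heavy lifting (image processing, the neural net) happens on the computer, and only small command frames like these cross the WiFi link.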

The robot is capable of delivering telematics information on the following aspects:

  • Video Feed of RGB images at 60 FPS with a resolution of 1080 by 720
  • Wheels Speed based on PWM pulse frequency
  • Servo Angles and Target Position for the camera
  • Mono Audio input from the Camera's Microphone
  • Any information contained in the Raspberry Pi, such as stored images, files and GPIO data
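As an illustration of the wheel-speed item: if the pulse frequency comes from a per-revolution pulse count, linear speed falls out of the geometry. The pulse count and wheel diameter below are illustrative placeholders, not the robot's real parameters:

```python
import math

def wheel_speed_mps(pulse_freq_hz, pulses_per_rev=20, wheel_diameter_m=0.065):
    """Convert a measured pulse frequency to linear wheel speed in m/s:
    revolutions per second times wheel circumference."""
    revs_per_sec = pulse_freq_hz / pulses_per_rev
    return revs_per_sec * math.pi * wheel_diameter_m
```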

Autonomous Driving

For the autonomous driving part we mainly used a headless VGG16 network. This means we deleted the dense network that classifies the features of the images and replaced it with our own, which in turn gives the desired wheel speeds as outputs. For training, a circuit was drawn on the floor of my apartment using strips of paper, and the robot was configured for remote control. I piloted the robot through different circuits, making sure to change the layout every few passes. The result was a heavy dataset containing the images and the recorded wheel speeds at each frame. We trained the neural net on the GPUs available at my university for a couple of hours. The result is a really good approximation to autonomous driving, with a fairly reasonable speed considering the latency of the network transfer of images and of processing. Every frame took 80 ms to process, with resized black-and-white images. Future tests would include a faster GPU for RGB processing and more complete neural networks.
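A sketch of that architecture in Keras (the input size, layer widths and loss are my assumptions; the project's exact head isn't documented here): take VGG16 without its classifier, freeze the convolutional base, and bolt on a small regression head that outputs the two wheel speeds.

```python
import tensorflow as tf

def build_driver(input_shape=(128, 128, 3), weights=None):
    """VGG16 convolutional base plus a custom regression head for
    wheel speeds. Pass weights="imagenet" to start from pretrained
    features (requires a 3-channel input)."""
    base = tf.keras.applications.VGG16(
        include_top=False, weights=weights, input_shape=input_shape)
    base.trainable = False  # train only the new head
    model = tf.keras.Sequential([
        base,
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(256, activation="relu"),
        tf.keras.layers.Dense(2),  # left and right wheel speeds
    ])
    model.compile(optimizer="adam", loss="mse")
    return model
```

Training is then plain supervised regression: frames as inputs, the recorded wheel speeds at each frame as targets.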

Face Recognition

Another application, or use case, for the robot was recognizing people's faces. This was the main attraction when I presented the robot at a congress representing our university. People stopped for a minute and approached our poster because, as they said, "I saw that robot looking at me". The algorithm was the well-known face detection approach using Haar-like features, coded in Python. Once it locates a face, the servos move to center the face in the camera. Simple but quite effective, and apparently friendly to many people. Simple interactions like this one help to ease the relationship between humans and robots by making sure the human feels comfortable in the presence of the robot, deeming the robot approachable rather than alien.
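A sketch of that loop (the gain, step limit and servo mid positions are my own illustrative values, not the project's tuned ones), using OpenCV's stock Haar cascade and a proportional nudge toward the frame center:

```python
def servo_step(face_center, frame_center, gain=0.05, max_step=5.0):
    """Proportional step in degrees that nudges a servo so the detected
    face center drifts toward the frame center, clamped to max_step."""
    step = gain * (face_center - frame_center)
    return max(-max_step, min(max_step, step))

def track_faces():
    """Hedged sketch of the detection loop; requires opencv-python
    and a camera, so it is not exercised automatically."""
    import cv2
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)
    pan, tilt = 90.0, 90.0  # assumed servo mid positions
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, 1.3, 5)
        if len(faces):
            x, y, w, h = faces[0]
            pan += servo_step(x + w / 2, frame.shape[1] / 2)
            tilt += servo_step(y + h / 2, frame.shape[0] / 2)
            # send pan/tilt to the Arduino over serial here
```

The sign of each step depends on how the servos are mounted, so in a real build you would flip the gain per axis as needed.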

  • 1 × Raspberry Pi B+
  • 1 × Arduino Pro Mini
  • 1 × Custom PCB
  • 2 × DC Motor Wheels
  • 2 × S3003 Servo Motors

View all 9 components

  • 1
    Step 1

    So I needed to set up the robot to be wireless, to have a camera, and to drive 2 DC motors and 2 servos while reading the input from the sensors and camera. Eeeeasy.

    For the wireless part I needed the robot to move freely without Ethernet cables, and the best way I found to do this is to make a WiFi hotspot on the Raspberry Pi itself. I'll leave links to how I set this (and the next few things) up at the bottom of this project page. After that, the computer can connect to the Raspi with no effort and establish an SSH connection with the Pi. The bonus here is that the Pi always gets the same IP address, so I didn't have to modify the script every time it runs.

    For the camera I first saw the example of the Raspi Cam on the Mathworks page, but honestly, in my country the Raspi Cam costs as much as the devboard, so I opted for a webcam that I knew was compatible. Now I needed to get images from a camera not supported by Matlab's original functions. The workaround here is to set up an IP camera on port 8080 of the Pi and read the images with Matlab's IP Cam support package (I'm still trying to do this; right now I use the motion program but I get a 2-second delay, so I'm leaning toward mjpg-streamer, but I couldn't configure it quite right yet). It's worth mentioning that if you have a Raspi Camera this setup is very straightforward and doesn't require this much configuration.
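For reference, a hotspot of this kind typically boils down to a hostapd configuration along these lines (the SSID and passphrase are placeholders; the linked guide covers the dnsmasq and static-IP side that keeps the Pi's address fixed):

```ini
# /etc/hostapd/hostapd.conf — minimal access-point sketch
interface=wlan0
driver=nl80211
ssid=GaussRobot
hw_mode=g
channel=7
wpa=2
wpa_key_mgmt=WPA-PSK
wpa_passphrase=ChangeMe123
```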

    As every blog on the internet points out, the Raspi doesn't handle PWM very well. The response I always get on the forums is that the core frequency of the Raspi can fluctuate, generating discrepancies in the PWM output signals. So I can't drive the wheels or the servos with it. My solution: use an Arduino. I made a simple communication standard between the two, sending matrices from Matlab over serial and adapting voltages with a two-resistor voltage divider, so that was done. After a couple of hours in Eagle CAD I had designed a Raspberry hat that holds the Arduino Pro Mini and regulates the voltage to 5 V for both devices. The board also has connections for all the pins of both devboards, an L293D as a DC motor driver, a voltage regulator, an ICSP (In-Circuit Serial Programming) pinout, and servo and DC motor output pins.
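The exact serial format isn't documented here, but a matrix-over-serial link like the one described could be framed roughly like this (the sync byte, field layout and baud rate are my assumptions, not the project's real protocol):

```python
import struct

HEADER = 0xA5  # arbitrary sync byte, not the project's real one

def encode_command(left, right, pan, tilt):
    """Pack wheel speeds and servo angles into a fixed 9-byte frame:
    one header byte followed by four little-endian int16 values."""
    return struct.pack("<Bhhhh", HEADER, left, right, pan, tilt)

def decode_command(frame):
    """Inverse of encode_command; raises ValueError on a bad header."""
    header, left, right, pan, tilt = struct.unpack("<Bhhhh", frame)
    if header != HEADER:
        raise ValueError("bad sync byte")
    return left, right, pan, tilt

# On the computer side you would write such frames to the port, e.g.:
#   import serial  # pyserial
#   port = serial.Serial("/dev/ttyUSB0", 115200)
#   port.write(encode_command(120, 110, 90, 90))
```

A fixed-size frame with a sync byte keeps the Arduino-side parser trivial: wait for the header, then read a known number of bytes.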

    In the middle of all this electronics and software mess I managed to design a chassis in AutoCAD, and tested its structural stability and part clearance with Autodesk Inventor (I did this because, as I mentioned, the previous robot had a tendency to tear itself apart, and I used an easily breakable material on it, not a great combination after all; but as I said, I'm still working on it). The day after, I had my chassis cut at my local laser cutting workshop, and by that afternoon it was complete (this was on August 9th).

    This is my first update on the project, and hopefully I'll have more to present to you guys in a couple of weeks. The finish date of this project is the 1st of March according to my university, but that does not mean I'll be done with it; I told you we can make a lot of things with this thing, and I'm sure I won't be done by March. You can follow updates on my work via Instagram (@ich_heisse_augusto), and feel free to tweet me (@Fre4k4zoid). Lastly, I'll leave you with some reference links down below. Thanks for reading :)

    Raspi Matlab Getting Started Guide

    http://www.mathworks.com/help/supportpkg/raspberrypiio/examples/getting-started-with-matlab-support-package-for-raspberry-pi-hardware.html

    Raspi as Hotspot Setup

    http://raspberrypihq.com/how-to-turn-a-raspberry-pi-into-a-wifi-router/

    Raspi as IP Cam Setup

    https://pimylifeup.com/raspberry-pi-security-camera/

  • 2
    Step 2

    Today, 30/11/2016, I just finished compiling mjpg-streamer for the Raspi. It took me quite some time to find a good guide on how to do so, but yesterday I found this link (https://www.raspberrypi.org/forums/viewtopic.php?f=91&t=100818) which explains it perfectly. This allowed me to have a continuous image stream over WiFi with a latency of at most 100 ms. Next week I'll be working on the face tracking algorithm, since I discovered it wasn't behaving correctly.
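On the receiving side, an MJPEG stream like the one mjpg-streamer serves is essentially JPEG images back to back, so each frame can be sliced out between the JPEG start (FF D8) and end (FF D9) markers. A minimal parser sketch (buffer handling simplified; a real client would read from the HTTP socket incrementally and keep leftover bytes):

```python
JPEG_SOI = b"\xff\xd8"  # start-of-image marker
JPEG_EOI = b"\xff\xd9"  # end-of-image marker

def extract_jpeg_frames(buffer):
    """Yield every complete JPEG frame found in a raw MJPEG byte buffer."""
    start = buffer.find(JPEG_SOI)
    while start != -1:
        end = buffer.find(JPEG_EOI, start + 2)
        if end == -1:
            break  # incomplete frame; wait for more data
        yield buffer[start:end + 2]
        start = buffer.find(JPEG_SOI, end + 2)
```

Each yielded byte string is a standalone JPEG that can be decoded with any image library; with the Pi camera route, Matlab's IP Cam support package does this for you.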

