
Getting started with ROS

Summary of projects to dive (back) into the world of robotics and learn ROS (Robot Operating System)


The past

Back in 2005 I first came into contact with robotics when I built a small robot from scratch. The main structure was a wooden chassis, with two stepper motors and wooden wheels for propulsion. An 8-bit Microchip microcontroller was used to control the robot. Equipped with five self-built IR obstacle sensors and a bumper sensor on the front, the robot was able to drive around and avoid obstacles.

First robot

The next step was to implement navigation in software. At that time LIDAR sensors were still too expensive to even think about using them. Even ultrasonic distance sensors were too pricey for the student I was back then. This led me to implement dead reckoning for navigation. With stepper motors it was easy to calculate the traveled distance as well as the rotation angle when avoiding obstacles. The software used this information to maintain a vector pointing from the current robot position back to the starting point, as well as the robot's pose relative to that point. Equipped with the updated software, the robot drove around for a given time, avoiding obstacles on its way. When the time was up, the mission was to return to the starting point. Since the software did not store any history of moves, the robot tried to follow a straight line along the calculated home vector. If there was an obstacle in the path, the robot drove around it, updated the home vector and continued on its way home. The return mission was stopped when the robot arrived within a predefined radius of the starting point.

My next intention was to implement a map as well as the algorithms for the robot to build it. Unfortunately I never found the time to do it, although I always kept an eye on what was going on in the field of robotics. Recently I discovered ROS and I am excited about its possibilities and the ecosystem of framework, debugging, visualization and simulation tools it offers. So here's what this Hackaday.io project is about:

  • Learn to use ROS
  • Learn SLAM for navigation using ROS
  • Learn to use camera-based object recognition in conjunction with ROS
  • Either revive my old robot or build a new one
  • and hopefully much more...

Learning to use ROS

The internet is full of valuable content for learning ROS, ranging from the ROS Wiki with its documentation and tutorials to paid learning programs such as those offered by The Construct. Also available on that site is a podcast hosted by Ricardo Tellez, which is quite interesting to listen to because it provides information about how to use ROS, interviews with people incorporating ROS into products, and interviews with ROS maintainers.

My approach to learning ROS is based on these resources, accompanied by building a small prototype, as the interaction of hardware and software is what motivates me the most. The hosts for ROS will be a PC running Ubuntu as well as a Raspberry Pi 3. For the prototype I will use an STM32 MCU connected to the host via USB, with the rosserial package handling the communication.

Learning SLAM for navigation in ROS

LIDAR sensors are surprisingly affordable nowadays, ranging from complete modules like the SLAMTEC RPLIDAR for ~400 € to sensors like the Garmin LIDAR-Lite for ~150 €. The latter requires either a motorized assembly or the robot itself to turn in order to perform 360° scans. While these sensors are readily available, I am taking a different route for now. Convinced that one learns the most when building from scratch, and wanting to use what I have in my parts bin, I will build my own distance sensor. I would like to highlight that the goal is to learn, not to design a low-cost solution to save money.

Part list:

  • STM32F103 BluePill MCU module, which may eventually be replaced by an ESP32 to test a node connection via TCP
  • Sharp GP2Y0A21YK0F distance sensor (range according to the data sheet between 10 cm and 80 cm)
  • Nema 17 stepper motor
  • Watterott TMC2100 SilentStepStick to drive the stepper motor
  • Everlight ITR8102...

MCU_STMD_PCB_Milled.jpg

PCB directly after milling. Still needs some cleaning. Obviously I need to replace the end mill with a new one.


Sensor assembly and electronics.png

Sensor assembly on stepper motor + electronics


Sensor assembly.png

Sensor assembly on stepper motor


  • ROS finally

    cc.lexow, 01/28/2019 at 09:32

    In this project log I will describe how I got my distance sensor assembly to work with ROS and how I displayed the scan data using RViz. In addition to this project log I prepared a short video showing the distance sensor assembly performing a 360° distance scan.

    Test setup

    The test setup consists of:

    • a PC running Ubuntu 18.04 with ROS Melodic Morenia installed
    • the Sharp IR distance sensor assembly connected to the PC via USB
    • an "arena" made from cardboard for the side walls and black paper for the floor

    ROS

    The ROS framework / middleware provides a rich toolset to develop, debug and analyze robotic applications, of which my example uses just a small subset. The primary goal is to learn how to prepare the distance scan data for use in ROS and, once the data is available on the PC, how to display it with the tool RViz. My setup consists of a sensor (IR distance sensor), an actuator (stepper motor) and a debug tool (RViz).

    Data exchange within ROS is done using messages, which are sent by publishing to topics and received by subscribing to topics. These messages can either be taken from a predefined set or newly defined if none of the existing ones are appropriate. The same holds for topics, although the typical use case is to define application-specific topics. Thus the distance sensor assembly publishes its gathered data on one topic and subscribes to another topic over which it receives commands to control its behavior. These commands are generated on the PC. Instead of creating a ROS package I chose to use the program rostopic to publish the command messages. For ease of use I wrote a small GUI in Python which encapsulates the calls to rostopic for the different commands. The package rosserial provides the interface between the PC and the embedded system and enables the latter to publish and/or subscribe to topics. It is integrated as a library into the software of the sensor assembly and used as an executable on the PC.
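    A minimal sketch of such a wrapper is shown below. The topic name, message type and command strings are assumptions for illustration only, not necessarily the ones used in the project.

```python
#!/usr/bin/env python
# Minimal sketch of a GUI wrapping 'rostopic pub' calls via subprocess.
# Topic name (/scanner/command), message type (std_msgs/String) and the
# command strings are assumptions, not taken from the actual project.
import subprocess
import Tkinter as tk  # 'tkinter' on Python 3

def send_command(command):
    # 'rostopic pub -1' publishes a single message and exits
    subprocess.call(["rostopic", "pub", "-1",
                     "/scanner/command", "std_msgs/String",
                     "data: '%s'" % command])

root = tk.Tk()
root.title("Scanner control")
for label in ("start_scan", "stop_scan"):
    tk.Button(root, text=label,
              command=lambda c=label: send_command(c)).pack(fill="x")
root.mainloop()
```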

    Embedded System

    For the integration of rosserial into my embedded software I took inspiration and source code from the GitHub repository of Itamar Eliakim. Accordingly, I converted my MCU project from C to C++, as rosserial is written in C++. Furthermore, I am using FreeRTOS to organize the program flow. As described in a previous project log, a distance scan value is obtained by sampling the analog sensor output 100 times at a sample rate of 1 kHz. This is now achieved by timer-triggered A/D conversions with DMA transfer of the results to an array in memory. Currently these samples are averaged to provide one distance scan value. A further use could be to calculate statistics in order to evaluate the reliability of the sensor output, which is beyond the scope of this project. The averaged A/D value is transformed into a distance in meters using linear interpolation; a look-up table with values from my previous sensor qualification measurements provides a total of 40 supporting points for the calculation.
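    A minimal sketch of the averaging and interpolation step follows. The actual firmware runs in C++ on the STM32; the table below contains placeholder values, not my calibration data.

```python
# Sketch of averaging the ADC samples and converting the result to a
# distance via a look-up table with linear interpolation. Placeholder
# calibration values; the real firmware is C++ on the STM32.

# (ADC average, distance in m) pairs, sorted by falling ADC value
# (the Sharp sensor outputs a higher voltage for nearer objects)
LOOKUP = [(3000, 0.07), (2200, 0.12), (1400, 0.25), (900, 0.40), (600, 0.55)]

def adc_to_distance(samples):
    avg = sum(samples) / float(len(samples))      # average of the 100 samples
    for (a_hi, d_near), (a_lo, d_far) in zip(LOOKUP, LOOKUP[1:]):
        if a_lo <= avg <= a_hi:                   # bracketing pair found
            t = (avg - a_lo) / (a_hi - a_lo)      # interpolation factor 0..1
            return d_far + t * (d_near - d_far)   # linear interpolation
    return float('nan')                           # outside the calibrated range
```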

    Performing one distance scan takes approximately 450 ms in the current implementation, mainly due to positioning and A/D conversion, which is quite slow. This led me to design the following measurement procedure as a compromise between angular resolution and the duration of a full 360° scan. In general the stepper motor turns the sensor by 2.25° between two distance scans. The total angular resolution is enhanced to 1.125° by interleaving the scan positions of the clockwise and counterclockwise passes. Turning the sensor in the clockwise direction covers the angles 0°:2.25°:357.75°, while the angles 358.875°:-2.25°:1.125° are scanned in the counterclockwise direction. Since one pass consists of 360° / 2.25° = 160 positions at roughly 450 ms each, it takes 72 seconds to get a coarse overview of the environment, which is then refined after another 72 seconds.

    Distance scan measurement principle
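    A small sketch of the interleaved angle sequence and the resulting timing (only the step sizes are taken from the description above; the rest is illustrative):

```python
# Sketch of the interleaved scan pattern: a clockwise pass in 2.25° steps
# followed by a counterclockwise pass offset by 1.125°.
STEP = 2.25                                  # degrees between measurements
POSITIONS = int(360 / STEP)                  # 160 positions per pass

cw = [i * STEP for i in range(POSITIONS)]             # 0.000, 2.250, ... 357.750
ccw = [358.875 - i * STEP for i in range(POSITIONS)]  # 358.875, 356.625, ... 1.125

# merging both passes yields 320 positions, i.e. 1.125° resolution
print(len(cw) + len(ccw), "positions in total,",
      POSITIONS * 0.45, "s per pass at ~450 ms per measurement")
```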

    Measurement results are provided to ROS using the LaserScan message out of...
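    For illustration, here is a rough rospy sketch of how a sensor_msgs/LaserScan can be filled for one scan. The actual publishing happens on the MCU via rosserial; the topic name, frame id and range limits below are assumptions.

```python
#!/usr/bin/env python
# Rough sketch of publishing a sensor_msgs/LaserScan for one 360° scan.
# Topic name, frame id and range limits are assumptions for illustration;
# the project publishes the message from the MCU via rosserial.
import math
import rospy
from sensor_msgs.msg import LaserScan

rospy.init_node("scan_sketch")
pub = rospy.Publisher("scan", LaserScan, queue_size=1)

def publish_scan(ranges_m):
    msg = LaserScan()
    msg.header.stamp = rospy.Time.now()
    msg.header.frame_id = "laser_frame"       # assumed frame name
    msg.angle_min = 0.0
    msg.angle_max = 2.0 * math.pi
    msg.angle_increment = math.radians(2.25)  # one coarse pass
    msg.scan_time = 72.0                      # duration of one pass in s
    msg.time_increment = msg.scan_time / len(ranges_m)
    msg.range_min = 0.10                      # limits from the Sharp data sheet
    msg.range_max = 0.80
    msg.ranges = ranges_m                     # list of distances in meters
    pub.publish(msg)
```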


  • First 360° Scans

    cc.lexow, 10/29/2018 at 10:41

    The current software implementation allows performing 360° scans with a rotation of 0.45° between two measurements. At each position the voltage output of the distance sensor is sampled 100 times at a sample rate of 1 kHz. The average value of these samples, together with the current angle position, is transferred via USB to the PC. The incoming data stream is stored in a CSV file for later analysis in Octave. Black paper on the floor and walls made from cardboard form the test environment, in which the distance sensor assembly can be placed at arbitrary positions.

    Test Environment

    I performed a first scan using this setup with the sensor assembly placed at the center position. The generated CSV file contains one tuple of current angle and object distance per line, [alpha, d], which is the measurement result in polar coordinates. A transformation to Cartesian coordinates is necessary to plot the data as a 'map'.

    First the angle alpha in degrees needs to be converted to radians:

        alpha_rad = alpha * pi / 180

    Next the x and y coordinates can be obtained using the following formulas:

        x = d * cos(alpha_rad)
        y = d * sin(alpha_rad)
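    A minimal Python sketch of this transformation (my actual analysis runs in Octave; the CSV file name is just a placeholder):

```python
# Transform the [alpha, d] tuples from the scan CSV into Cartesian (x, y)
# points. File name is a placeholder; the real analysis was done in Octave.
import csv
import math

points = []
with open("scan.csv") as f:
    for alpha_deg, d in csv.reader(f):        # one [alpha, d] tuple per line
        alpha = math.radians(float(alpha_deg))
        points.append((float(d) * math.cos(alpha),   # x
                       float(d) * math.sin(alpha)))  # y
```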

    The graphs below show the measurement data before and after the coordinate transformation.

    Raw output of 360° distance scan
    Plot of raw data - distance over scanning angle
    360° Distance scan without obstacle
    Plot after transformation to Cartesian coordinates

    This is a very satisfying result for a first try as the generated 'map' reflects the rectangular shape of the test environment.

    Putting an obstacle into the test environment leads to the following graph, from which two observations can be made.

    Test environment with object
    360° Distance scan without and with obstacle

    First, the sensor assembly shows good reproducibility of the distance scans: the two consecutive measurements match almost completely in the range without an obstacle. The second observation concerns how the distance scan varies in the transition between the obstacle in the foreground and the wall behind it, leading to larger gaps between the measurements. I will look into that later to see whether this information can be used to improve the mapping or to implement an adaptive scan resolution algorithm.

    Next up: making the scan angle resolution configurable and learning how to use this data in ROS.

  • Qualification of the Sharp Distance Sensor

    cc.lexow, 10/24/2018 at 09:24

    Electrical characteristics

    The supply voltage to the sensor is stabilized using a 22 µF capacitor. Nevertheless, there are still voltage spikes visible on the output signal. An RC low-pass filter reduces these spikes significantly.

    Sensor output voltage

    The filtered voltage is fed to the 12-bit A/D converter within the STM32 MCU and sampled at a rate of 1 kHz. The data sheet states a measurement duration of 38.3 ms ± 9.6 ms, which led me to average 100 samples (= 100 ms) for one distance measurement, so that the averaging window covers at least one full measurement cycle.

    At http://sim.okawa-denshi.jp/en/CRlowkeisan.html I found a nice tool to calculate RC low-pass filters and analyze their behavior.
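    For a quick sanity check of such a filter, the cutoff frequency of a first-order RC low-pass is f_c = 1 / (2 * pi * R * C). A minimal sketch (the component values are placeholders, as the actual R and C of my filter are not listed here):

```python
# Cutoff frequency of a first-order RC low-pass: f_c = 1 / (2 * pi * R * C).
# The values below are placeholders, not the components used on the board.
import math

def rc_cutoff_hz(r_ohm, c_farad):
    return 1.0 / (2.0 * math.pi * r_ohm * c_farad)

print(rc_cutoff_hz(1e3, 1e-6))  # e.g. 1 kOhm and 1 uF -> ~159 Hz
```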

    Output characteristic curve

    The data sheet of the distance sensor contains the characteristic curve of the sensor response to an object at a given distance in front of the sensor. One option is to digitize this curve using e.g. WebPlotDigitizer and use the extracted data to calculate the object distance. Nevertheless, it is still a good idea to perform your own measurements to evaluate the sensor characteristic. In addition, this shows how the mechanical and electrical integration as well as the environment affect the measurements.

    The setup for this test consists of a black floor and a sheet of white paper as a reference object, which was moved stepwise from 1 cm to 80 cm distance relative to the sensor.

    Measurement setup

    At each position the MCU software takes 10k samples and sends them via USB to a PC. Octave, an open-source equivalent to MATLAB, is used on the PC to analyze the measurement data.

    With the given setup the distance sensor has the following output characteristic curve:

    Sharp Distance Sensor Output Curve

    Below ~7 cm and above ~55 cm the sensor has ambiguous output voltages. This limits the usable measurement range to [7..55] cm. In software I will use this measurement data to implement a look-up table and apply linear interpolation for the distances in between.

    The next step is to implement the motor control in software to perform 360° scans.
