The past
Back in 2005 I first got in contact with robotics when I built a small robot from scratch. The main structure was a wooden chassis with two stepper motors and wooden wheels for propulsion, controlled by an 8-bit Microchip microcontroller. Equipped with five self-built IR obstacle sensors and a bumper sensor on the front, the robot was able to drive around and avoid obstacles.
The next step was to implement navigation in software. At that time LIDAR sensors were still far too expensive to even think about using, and even ultrasonic distance sensors were too pricey for the student I was back then. This led me to implement dead reckoning for navigation. With stepper motors it was easy to calculate the traveled distance as well as the rotation angle when avoiding obstacles. The software used this information to maintain a vector pointing from the current robot position back to the starting point, as well as the robot's pose relative to that point. Equipped with the updated software, the robot drove around for a given time, avoiding obstacles on its way. When the time was up, the mission was to return to the starting point. Since the software did not store any history of moves, the robot simply tried to follow the straight line given by the calculated home vector. If there was an obstacle on the path, the robot drove around it, updated the home vector and continued on its way home. The return mission ended when the robot arrived within a predefined radius around the starting point.
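The dead-reckoning scheme described above can be sketched in a few lines of Python. The geometry constants and function names below are made-up placeholders for illustration, not the original robot's parameters:

```python
import math

# Hypothetical robot geometry (placeholder values, not the original build's)
WHEEL_DIAMETER_M = 0.06   # wooden wheel diameter
STEPS_PER_REV = 200       # full steps per motor revolution
WHEEL_BASE_M = 0.15       # distance between the two wheels
STEP_DIST_M = math.pi * WHEEL_DIAMETER_M / STEPS_PER_REV

def update_pose(x, y, theta, steps_left, steps_right):
    """Advance the pose estimate from counted stepper steps
    (differential-drive dead reckoning)."""
    d_left = steps_left * STEP_DIST_M
    d_right = steps_right * STEP_DIST_M
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / WHEEL_BASE_M
    # Use the mid-segment heading for a slightly better straight-line estimate
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

def home_vector(x, y):
    """Distance and bearing from the current position back to the start (0, 0)."""
    return math.hypot(x, y), math.atan2(-y, -x)
```

After each move the robot only needs the current `(x, y, theta)` to compute the home vector, which is exactly why no history of moves had to be stored.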
My next intention was to implement a map as well as the algorithms to let the robot build it. Unfortunately I never found the time to do it, although I always kept an eye on what was going on in the field of robotics. Recently I discovered ROS, and I am excited about its possibilities and the ecosystem of framework, debugging, visualization and simulation tools it offers. So here’s what this Hackaday.io project is about:
- Learn using ROS
- Learn SLAM for navigation using ROS
- Learn using camera based object recognition in conjunction with ROS
- Either revive my old robot or build up a new one
- and hopefully much more...
Learning to use ROS
The internet is full of valuable content for learning ROS, ranging from the ROS Wiki with its documentation and tutorials to paid learning programs like those offered by The Construct. The Construct also runs a podcast hosted by Ricardo Tellez which is quite interesting to listen to, as it covers how to use ROS and features interviews with people incorporating ROS into products as well as with ROS maintainers.
My approach to learning ROS is based on these resources, accompanied by building a small prototype, as the interaction of hardware and software is what motivates me the most. The host for ROS will be a PC running Ubuntu as well as a Raspberry Pi 3. For the prototype I will use an STM32 MCU connected to the host via USB, communicating through the rosserial package.
Learning SLAM for navigation in ROS
LIDAR sensors are surprisingly affordable nowadays, ranging from complete modules like the SLAMTEC RPLIDAR for ~400 € down to single-beam sensors like the Garmin LIDAR-Lite for ~150 €. The latter requires either a motorized assembly or the robot itself to turn in order to perform 360° scans. While these sensors are readily available, I am taking a different route at first. Convinced that one learns the most when building from scratch, and wanting to use what I have in my parts bin, I will build my own distance sensor. I would like to highlight that the goal is to learn, not to design a low-cost solution to save money.
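Assembling a 360° scan with a rotating single-beam sensor boils down to pairing each stepper position with a range reading. A minimal sketch of that loop, where the step count and the `read_range`/`step_motor` callbacks are assumptions for illustration rather than details of the actual build:

```python
import math

STEPS_PER_REV = 200  # assumed full steps per stepper revolution

def take_scan(read_range, step_motor):
    """Rotate the sensor one full turn, collecting (angle, range) pairs
    and converting them to Cartesian points in the sensor frame."""
    points = []
    for step in range(STEPS_PER_REV):
        angle = 2.0 * math.pi * step / STEPS_PER_REV
        r = read_range()        # distance in metres from the IR sensor
        if r is not None:       # skip readings outside the sensor's range
            points.append((r * math.cos(angle), r * math.sin(angle)))
        step_motor()            # advance the stepper by one step
    return points
```

Because a stepper moves in exact, repeatable increments, the angle of each reading comes for free from the step counter, with no extra encoder needed.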
Part list:
- STM32F103 BluePill MCU module, possibly to be replaced later by an ESP32 to test a node connection via TCP
- Sharp GP2Y0A21YK0F distance sensor (range according to the data sheet between 10 cm and 80 cm)
- Nema 17 stepper motor
- Watterott TMC2100 SilentStepStick to drive the stepper motor
- Everlight ITR8102...
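The GP2Y0A21YK0F outputs an analog voltage that falls off nonlinearly with distance, so the ADC reading has to be linearized in software. A common approach is a power-law fit; the sketch below uses ballpark constants from hobbyist calibrations, not figures from Sharp's datasheet, so a real build should calibrate against known distances:

```python
def gp2y0a21_distance_cm(voltage):
    """Approximate distance from the sensor's analog output voltage
    using a power-law fit. The constants are rough assumptions;
    calibrate per sensor against known distances."""
    if voltage <= 0.4:                # below the far end of the usable range
        return None
    d = 27.86 * voltage ** -1.15      # ballpark fit, not a datasheet formula
    if d < 10.0 or d > 80.0:          # datasheet range: 10 cm to 80 cm
        return None
    return d
```

Rejecting readings outside the 10–80 cm window matters because the sensor's output is ambiguous below its minimum range: objects closer than 10 cm produce the same voltages as objects farther away.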