
Phoebe TurtleBot

DIY variant of ROS TurtleBot for <$250 capable of simultaneous localization and mapping (SLAM)

The open source Robot Operating System has two standard platforms: the very expensive PR2, and the "low cost" TurtleBot. Sadly, at several thousand dollars, the TurtleBot was only "low cost" relative to the six-figure PR2. Recent developments have reduced the price of admission to as low as $550 (TurtleBot 3 Burger), but resourceful hackers can build their own for even less. Phoebe is one way to build a TurtleBot variant on a budget that still has enough sensors for autonomy.

It is easy to build a very basic robot that runs the open source Robot Operating System: just put a motor controller on a Raspberry Pi and connect them to motors that turn wheels. However, such a basic robot without sensors is unaware of its surroundings. The power of the ROS ecosystem is in its vast library of software for intelligent robot behavior, and to tap into that power, a robot needs to provide the sensory input those software pieces require.

As the entry-level ROS platform, a TurtleBot offers two streams of sensor data:

  1. Encoders that precisely count wheel rotation. We can estimate a robot's position if we know how far each wheel has traveled. This information is commonly presented under the name "/odom" (short for odometry). (A sketch of this dead-reckoning math follows this list.)
  2. A laser distance scanner (LDS) that continuously spins a LIDAR module to measure distance to obstacles all around the robot. This information is commonly presented under the name "/scan".
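
For the curious, here is a minimal Python sketch of the differential-drive dead reckoning behind "/odom". It is illustrative only: the function name, units, and wheel_base value are my own assumptions, not code from Phoebe's repository.

    import math

    def update_pose(x, y, theta, d_left, d_right, wheel_base):
        """Integrate one odometry step from per-wheel travel distances.

        d_left / d_right: distance each wheel rolled since the last update
        (encoder ticks * meters per tick); wheel_base: distance between
        the two drive wheels. Units: meters and radians.
        """
        d_center = (d_left + d_right) / 2.0        # travel of robot center
        d_theta = (d_right - d_left) / wheel_base  # change in heading
        # Advance along the average heading over this step.
        x += d_center * math.cos(theta + d_theta / 2.0)
        y += d_center * math.sin(theta + d_theta / 2.0)
        theta += d_theta
        return x, y, theta

Accumulated over many small steps, this estimate drifts, which is one reason SLAM corrects odometry against laser scans.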

Providing this sensor data, in addition to motor control, is the minimum requirement to access a range of publicly available algorithms, such as those that allow a robot to gain awareness of its surroundings. This task of mapping the robot's surroundings while locating the robot within its own map is called simultaneous localization and mapping, or SLAM.

This project, Phoebe, is a DIY TurtleBot variant that provides odometry and laser scanning data similar enough to the official TurtleBot's that Phoebe can run all the same ROS autonomy modules a TurtleBot can. If all the parts had to be purchased, they would add up to around $200-$250; however, I chose these components because they were already on hand in my parts bin. For a more detailed breakdown of component cost, see the build log entry "Cost to Build Phoebe From Scratch".

This "Parts Bin TurtleBot", acronym "PB TB", is called "Phoebe" following tradition set by R2-D2 ("Artoo")

Phoebe Spine.stl

STL for Phoebe's main backbone structure. Will need to be printed with support.

Standard Tessellated Geometry - 1.57 MB - 10/16/2018 at 18:58


Phoebe 1.0.step

STEP file exported from Onshape CAD. (See links section.)

step - 2.43 MB - 10/16/2018 at 18:58


Phoebe Electronics Tray.stl

STL for Phoebe's electronics tray, holding the Raspberry Pi 3, Roboclaw motor controller, and 5.0V and 3.0V voltage regulators. Can be printed with the sloped bottom side facing the print bed.

Standard Tessellated Geometry - 817.37 kB - 10/16/2018 at 18:58


Phoebe LIDAR Front Support.stl

STL for a bracket that clips onto the backbone structure and supports the front of the Neato LIDAR unit. Designed to be printed upside down without supports.

Standard Tessellated Geometry - 122.84 kB - 10/16/2018 at 18:58


  • 1 × Raspberry Pi 3 - Responsible for running ROS nodes to interface with motors and sensors.
  • 1 × microSD card for Raspberry Pi 3 - Main system storage for the Raspberry Pi 3.
  • 1 × Battery output DC to 5V DC converter - Power source for the Raspberry Pi 3. I've had good experience with cheap MP1584 step-down converters from Amazon.
  • 1 × Neato XV-11 (or similar) robot vacuum laser distance scanner - Units salvaged from broken vacuums are available on eBay; search for "Neato LIDAR".
  • 1 × Battery output DC to 3V DC converter - The Neato LIDAR's scanning motor requires 3V to spin at the correct speed. (3.3V is too fast.) Alternatively: use a DC motor speed controller to explicitly control the motor's spin speed.

View all 11 components

  • Version 1.0 Complete

    Roger • 10/14/2018 at 19:48 • 0 comments

    I started the Phoebe project with the goal of building something to apply what I’ve learned about ROS. Get some hands-on experience, learning the ropes. Now that Phoebe can map and autonomously navigate its environment, it is a good place to pause and evaluate potential paths forward. (Also: I have other demands on my time so I need to pause my Phoebe work anyway… and now is a great time.)

    Option #1: Better Refinement

    Phoebe can map its surroundings and then, using that map, navigate the environment. This level of functionality is on par with the baseline functionality of TurtleBot 3, though neither the mapping nor the navigation is quite as polished as on a TurtleBot built by people who know what they are doing. To get there, Phoebe's ROS modules need their parameters tuned to improve performance. There are also small bugs hiding in the system that need to be rooted out. I'm sure the ~100ms timing difference mystery is only the tip of the iceberg.

    Risk: This is "the hard part" of not just building a robot, but building a good robot. And I know myself. Without a clear goal and visible progress towards that goal, I'm liable to get distracted or discouraged, trailing off without ever really accomplishing anything.

    Option #2: More ROS Functionality

    I had been disappointed that the SLAM and navigation tutorials I've seen to date require a human to direct robot exploration. I had thought automated exploration would be part of SLAM, but I was wrong. Thanks to helpful comments by Humpelstilzchen (who is building a pretty cool ROS robot too) I've now learned autonomous exploration is built on top of SLAM and navigation.

    So now that Phoebe can do SLAM and can navigate, adding one of the autonomous exploration modules would be the obvious next step.

    Risk: It’s another ROS learning curve to climb.

    Option #3: More Phoebe Functionality

    Phoebe has wheel encoders and a LIDAR as input, and it might be interesting to add more. Ideas have included:

    • Obstacle detection to augment LIDAR, such as
      • Ultrasound distance sensor.
      • Infrared distance sensor (must avoid interference with LIDAR).
      • Bumpers with microswitches to detect collision.
    • IMU (inertial measurement unit).
    • Raspberry Pi camera or other video feed.

    Risk: Over-complicating Phoebe, which was always intended to be a minimal-cost baseline entry into the world of ROS, following in the footsteps of the ROS TurtleBot.


    Options 1 and 2 take place strictly in software, which means the mechanical chassis will remain untouched.

    Option 3 changes Phoebe hardware, and that would start deviating from TurtleBot. There’s value in being TurtleBot-compatible and hence value in taking a snapshot at this point in time.

    Given the above review, I declare the mechanical construction project of Phoebe the TurtleBot complete for version 1.0. As part of this, I've also updated the README file on Phoebe's Github repository to describe its content. Because I know I'll start forgetting!

  • ROS Navigation Up and Running

    Roger • 10/13/2018 at 17:29 • 0 comments

    I've been making progress (slowly but surely) through the ROS navigation stack tutorial to get it running on Phoebe, and finally reached the finish line.

    After all the configuration YAML files were created, they were tied together into a launch file as parameters to the ROS node move_base. For now I'm keeping the pieces in independent launch files, so move_base is run independently of Phoebe's chassis functionality launch file and AMCL (launched using its default amcl_diff.launch).
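
    For illustration, here is roughly what such a move_base launch file looks like, following the pattern in the ROS navigation tutorial. The package name "phoebe" and the param file paths are my assumptions for this sketch; see Phoebe's Github repository for the real files.

        <launch>
          <node pkg="move_base" type="move_base" respawn="false" name="move_base" output="screen">
            <!-- common costmap values are loaded twice, once per costmap namespace -->
            <rosparam file="$(find phoebe)/param/costmap_common_params.yaml" command="load" ns="global_costmap" />
            <rosparam file="$(find phoebe)/param/costmap_common_params.yaml" command="load" ns="local_costmap" />
            <rosparam file="$(find phoebe)/param/local_costmap_params.yaml" command="load" />
            <rosparam file="$(find phoebe)/param/global_costmap_params.yaml" command="load" />
            <rosparam file="$(find phoebe)/param/base_local_planner_params.yaml" command="load" />
          </node>
        </launch>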

    After they were all running, I created a new RViz configuration to visualize the local costmap and the AMCL particle cloud. And it was a huge mess! I was disheartened for a few seconds before I remembered seeing a similar mess when I first looked at navigation on a Gazebo simulation of TurtleBot 3 Burger. Before anything would work, I had to set the initial "2D Pose Estimate" to locate Phoebe on the map.

    Once that was done, I set a "2D Nav Goal" via RViz, and Phoebe started moving! Looking at RViz I could see the map along with LIDAR scan plots and Phoebe's digital representation from URDF. Those are all familiar from earlier. New to the navigation map is a planned path plotted in green, taking into account the local costmap in gray. AMCL contributed the rest of the information on screen, with individual estimates drawn as little yellow arrows and the estimated position in red.

    [Image: Phoebe Nav2D]

    It’s pretty exciting to have a robot with basic intelligence for path planning, and not just a fancy remote control car.

    Of course, there's a lot of tuning to be done before things actually work well. Phoebe is super cautious and conservative about navigating obstacles, exhibiting a lot of halting and retrying behavior in narrower passageways even when there is still 10-15 cm of clearance on each side. I'm confident there are parameters I could tune to improve this.

    Less obvious is what I need to adjust to increase Phoebe's confidence in relatively wide-open areas, where Phoebe would occasionally brake to a halt and hunt around a bit before resuming travel even when there's plenty of space. I didn't see an obstacle pop up on the local costmap, so it's not clear what triggered this behavior.

    (Cross-posted to NewScrewdriver.com)

  • ROS Navigation Stack Setup

    Roger • 10/12/2018 at 18:26 • 0 comments

    Section 1 “Robot Setup” of this ROS Navigation tutorial page confirmed Phoebe met all the basic requirements for the standard ROS navigation stack. Section 2 “Navigation Stack Setup” is where I need to tell that navigation stack how to run on Phoebe.

    I had already created a ROS package for Phoebe earlier to track all of my necessary support files, so getting navigation up and running is a matter of creating a new launch file in my existing directory for launch files. To date all of my ROS node configuration has been done in the launch file, but ROS navigation requires additional configuration files in YAML format.

    First up in the tutorial were the configuration values common to both the local and global costmaps. This is where I saw the robot footprint definition; a little sad it's not pulled from the URDF I just put together. Since Phoebe's footprint is reasonably close to a circle, I went with the robot_radius option instead of declaring a footprint with an array of [x,y] coordinates. The inflation_radius parameter sounds like an interesting one to experiment with later, pending Phoebe performance. The observation_sources parameter is interesting – it implies the navigation stack can utilize multiple sources simultaneously. I want to come back later and see if it can use a Kinect sensor for navigation. For now, Phoebe has just a LIDAR so that's how I configured it.
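
    As a sketch, that costmap_common_params.yaml looks roughly like the following. The radius and ranges here are illustrative guesses rather than Phoebe's exact tuned numbers, and the sensor frame name is an assumption:

        obstacle_range: 2.5
        raytrace_range: 3.0
        robot_radius: 0.12          # rough guess at Phoebe's radius in meters
        inflation_radius: 0.25
        observation_sources: laser_scan_sensor
        laser_scan_sensor: {sensor_frame: neato_lidar, data_type: LaserScan, topic: scan, marking: true, clearing: true}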

    For global costmap parameters, the tutorial values looked equally applicable to Phoebe so I copied them as-is. For the local costmap, I reduced the width and height of the costmap window, because Phoebe doesn't travel fast enough to need to look at 6 meters of surroundings, and I hoped reducing it to 2 meters would reduce computation workload.
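
    The local costmap file then differs from the tutorial mainly in that window size. Again a sketch, with the other values taken straight from the tutorial:

        local_costmap:
          global_frame: odom
          robot_base_frame: base_link
          update_frequency: 5.0
          publish_frequency: 2.0
          static_map: false
          rolling_window: true
          width: 2.0      # reduced from the tutorial's 6.0
          height: 2.0     # reduced from the tutorial's 6.0
          resolution: 0.05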

    For base local planner parameters, I reduced the maximum velocity until I have confidence that Phoebe isn't going to get into trouble speeding. The key modification from the tutorial values is changing holonomic_robot from true to false. Phoebe is a differential drive robot and can't strafe sideways as a true holonomic robot can.
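
    In base_local_planner_params.yaml that works out to something like this sketch; the velocity cap is my own conservative placeholder:

        TrajectoryPlannerROS:
          max_vel_x: 0.25            # deliberately slow for now
          min_vel_x: 0.05
          max_vel_theta: 1.0
          min_in_place_vel_theta: 0.4
          acc_lim_x: 2.5
          acc_lim_y: 2.5
          acc_lim_theta: 3.2
          holonomic_robot: false     # differential drive can't strafe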

    The final piece of section 2 is AMCL configuration. Earlier I tried running AMCL on Phoebe without specifying any parameters (using defaults for everything) and it seemed to run without error messages, but I don't yet have the experience to tell good AMCL behavior from bad. Reading this tutorial, I see the AMCL package has pre-configured launch files. The tutorial called up amcl_omni.launch. Since Phoebe is a differential drive robot, I should use amcl_diff.launch instead. The RViz plot looks different than when I ran AMCL with all default parameters, but again, I don't yet have the experience to tell if it's an improvement or not. Let's see how this runs before modifying parameters.
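
    If I understand the package layout correctly, that pre-configured launch file can be started directly by name:

        roslaunch amcl amcl_diff.launch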

    (Cross-posted to NewScrewdriver.com)

  • ROS Navigation Requirements

    Roger • 10/11/2018 at 20:10 • 0 comments

    Now that basic coordinate transform frames have been configured with the help of URDF and robot state publisher, I moved on to the next document: the robot setup page. This one is actually listed slightly out of order on the ROS navigation page, third behind the Basic Navigation Tuning Guide. I had started reading the Tuning Guide and saw that its introduction assumes people have read the robot setup page. It's not clear why they are listed out of order, but clearly robot setup needs to come first.

    Right up front in Section 1 "Robot Setup" was a very helpful diagram labelled "Navigation Stack Setup" showing the major building blocks for an autonomously navigating ROS robot. Even better, these blocks are color-coded by source. White blocks are part of the ROS navigation stack, gray parts are optional components outside of that stack, and blue indicates robot-specific code to interface with the navigation stack.

    This gives me a convenient checklist to make sure Phoebe has everything necessary for ROS navigation.

    • Sensor source – check! Phoebe has a Neato LIDAR publishing laser scan sensor messages.
    • Base controller – check! Phoebe has a Roboclaw ROS node executing movement commands.
    • Odometry source – check! This is also provided by the Roboclaw ROS node reading from encoders.
    • Sensor transforms – check! This is what we just updated, from a hard-coded published transform to one published by robot state publisher based on information in Phoebe’s URDF.

    That was the easy part. Section 2 was more opaque to this ROS beginner. It gave an overview of the configuration necessary for a robot to run navigation, but the overview assumes a level of ROS knowledge that’s at the limit of what I actually have in my head right now. It’ll probably take a few rounds of trial and error before I get everything up and running.

    (Cross-posted to NewScrewdriver.com)

  • Phoebe Digital Avatar in RViz

    Roger • 10/10/2018 at 19:37 • 0 comments

    Now that Phoebe's URDF has been figured out, it has been added to the RViz visualization of Phoebe during GMapping runs. Before this point, Phoebe's position and orientation (called a 'pose' in ROS) were represented by a red arrow on the map. That was sufficient to get us this far, but a generic arrow is not enough for proper navigation because it doesn't represent the space occupied by Phoebe. Now, with the URDF, the volume of space occupied by Phoebe is also visually represented on the map.

    This is important for a human operator to gauge whether Phoebe can fit in certain spaces. While I was driving Phoebe around manually, it was a guessing game whether the red arrow would fit through a gap. Now with Phoebe's digital avatar on the map, it's a lot easier to gauge clearance.

    I’m not sure if the ROS navigation stack will use Phoebe’s URDF in the same way. The primary reason the navigation tutorial pointed me to URDF is to get Phoebe’s transforms published properly in the tf tree using the robot state publisher tool. It’s pretty clear robot footprint information will be important for robot navigation for the same reason it was useful to human operation, I just don’t know if it’s the URDF doing that work or if I’ll end up defining robot footprint some other way. (UPDATE: I’ve since learned that, for the purposes of ROS navigation, robot footprint is defined some other way.)

    In the meantime, here’s Phoebe by my favorite door to use for distance reference and calibration.

    And here's the RViz plot, with a digital representation of Phoebe by the door, showing the following:

    • LIDAR data in the form of a line of rainbow colored dots, drawn at the height of the Neato LIDAR unit. Each dot represents a LIDAR reading, with color representing the intensity of each return signal.
    • Black blocks on the occupancy map, representing space occupied by the door. Drawn at Z height of zero representing ground.
    • Light gray on the occupancy map representing unoccupied space.
    • Dark gray on the occupancy map representing unexplored space.

    (Cross-posted to NewScrewdriver.com)

  • Fixing Functional Problems in URDF

    Roger • 10/09/2018 at 21:36 • 4 comments

    Once I had a decent-looking URDF for Phoebe up and running, I added it into the Phoebe launch files and started working on the problems exposed by putting it to work.

    The first problems were the drive wheels. Visually, they were stuck at the origin and didn't move with the rest of the robot. Looking through error messages, I realized ROS expected me to read wheel encoder values and publish them as joint states. Since I hadn't done so, the wheels (attached with "continuous" joints) didn't know their location. Until I get around to processing wheel encoder values, I changed the joint type to "fixed" to attach the wheels to the chassis.

    Looking at the model from multiple angles, I realized I forgot the caster wheel. Since it’s not driven, it is represented as a simple sphere and also attached via a fixed joint.

    That was enough to start driving around as a single unit, but the robot movement in RViz was reversed front/back relative to the LIDAR data plot. This was caused by the fact I forgot to tell ROS the LIDAR is pointed backwards on the robot. Once I had done so, the 180 degree yaw became visible in the object axis visualization: the LIDAR's X-axis (red cylinder) points backwards instead of forwards like all the other axes.

    The final set of changes might be more cosmetic than functional. When reading about differential drive robots in ROS, it was brought up several times that the robot's X/Y origin (base_link) needs to be lined up with the pivoting axis of the robot. However, it wasn't clear where the Z axis is supposed to be. Perhaps this is different for each ROS mapping module? The algorithm hector_slam defines several frames, but they don't appear to be supported by gmapping.

    I first defined Phoebe's origin as the center point between its two drive wheel axles. When rendered in RViz, this means the Z plane intersects the middle of the robot. It seems to work well, but the visualization looks a bit odd. Intuitively I want the Z plane to represent the ground, so I decided to drop the robot origin to ground level. In the object visualization, this is visible as the purple arrow heads all pointing at a center point below the robot. If I learn later this was a bad move, I'll have to change it back.
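
    Put together, the relevant URDF joints now look roughly like this sketch. All xyz offsets here are illustrative placeholders, not Phoebe's measured dimensions; see the Github repository for the real file:

        <!-- base_link origin at ground level, centered between the drive wheel axles -->
        <joint name="wheel_left_joint" type="fixed">  <!-- "continuous" once joint states are published -->
          <parent link="base_link"/>
          <child link="wheel_left"/>
          <origin xyz="0 0.09 0.03"/>
        </joint>
        <joint name="caster_joint" type="fixed">      <!-- undriven caster, a simple sphere -->
          <parent link="base_link"/>
          <child link="caster"/>
          <origin xyz="-0.09 0 0.02"/>
        </joint>
        <joint name="lidar_joint" type="fixed">
          <parent link="base_link"/>
          <child link="neato_lidar"/>
          <origin xyz="-0.05 0 0.12" rpy="0 0 3.14159"/>  <!-- LIDAR mounted pointing backwards -->
        </joint>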

    All these changes combined gave me a minimal Phoebe URDF suitable for RViz visualization of Phoebe's behavior.

    (Cross-posted to NewScrewdriver.com)

  • Describe Phoebe For ROS Using URDF

    Roger • 10/08/2018 at 20:57 • 0 comments

    Now that I've decided to bring up the ROS navigation stack for Phoebe, where do I start? Well, the ROS Wiki page for the subject is always a good place to start, as they tend to have a tutorial for the subject. ROS navigation is no exception.

    The first recommended page is actually a familiar sight - the brief overview on tf was required reading back when I first assembled the chassis. At the time, I could get away with a very simple static publisher, because I just had to tell ROS how and where my Neato LIDAR is mounted on my robot chassis. But now I need to advance to the next step and publish robot state. And this means describing Phoebe in more detail for ROS using an XML syntax called URDF (Unified Robot Description Format).

    So in order to bring up ROS navigation on Phoebe, the navigation wiki page has pointed me to robot state publisher and also the ROS URDF Tutorial. To learn one thing I had to learn another - the typical bootstrap process when learning something new.
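
    The robot state publisher side is a short launch file. As a sketch (the package name "phoebe" and URDF path are my placeholders):

        <launch>
          <!-- load the URDF onto the parameter server, then publish its transforms -->
          <param name="robot_description" textfile="$(find phoebe)/urdf/phoebe.urdf"/>
          <node name="robot_state_publisher" pkg="robot_state_publisher" type="robot_state_publisher"/>
        </launch>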

    For the purposes of robot physics simulation, the robot should be described using very basic geometry: a combination of rectangular solids, cylinders, and spheres. This keeps the computation workload for collision detection simple. While the visual representation can be more complex than the collision detection representation, it doesn't have to be. So for this first draft, I'll just do a super simple Phoebe for visual representation, suitable for use in collision calculations if I get into that later.

    I started with Phoebe's Onshape CAD file.

    Taking the critical dimensions, I created a simplified version in Onshape CAD using just rectangular boxes and cylinders. This simplified model is fairly straightforward to translate into URDF.

    By measuring the dimensions in CAD, I could declare a few primitives in URDF and see what they look like in RViz for comparison against CAD. Once the visual appearance is roughly correct, it's time to tune the details and make sure they work for ROS functional purposes.
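
    A first-draft URDF along these lines is only a few elements. The dimensions below are illustrative, not Phoebe's real measurements:

        <?xml version="1.0"?>
        <robot name="phoebe">
          <!-- chassis approximated as a rectangular box -->
          <link name="base_link">
            <visual>
              <geometry><box size="0.20 0.15 0.05"/></geometry>
            </visual>
          </link>
          <!-- one drive wheel as a cylinder, rolled 90 degrees so its axis points sideways -->
          <link name="wheel_left">
            <visual>
              <origin rpy="1.5708 0 0"/>
              <geometry><cylinder radius="0.03" length="0.02"/></geometry>
            </visual>
          </link>
          <joint name="wheel_left_joint" type="fixed">
            <parent link="base_link"/>
            <child link="wheel_left"/>
            <origin xyz="0 0.09 0"/>
          </joint>
        </robot>

    If memory serves, something like "roslaunch urdf_tutorial display.launch model:=phoebe.urdf" is a quick way to eyeball a URDF like this in RViz.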

    (Cross-posted to NewScrewdriver.com)

  • Next Project Goal: ROS Navigation

    Roger • 10/08/2018 at 00:36 • 2 comments

    When I started working on my own TurtleBot variant (before I even decided to call it Phoebe) my intention was to build a hardware platform to get first-hand experience with ROS fundamentals. This project page's subtitle declared it a ROS robot for <$250 capable of SLAM. Now that Phoebe can map surroundings using the standard ROS SLAM library 'gmapping', that goal has been satisfied. What's next?

    One disappointment I found with existing ROS SLAM libraries is that the tutorials I've seen (such as this and this) expect a human to drive the robot during mapping. I had incorrectly assumed the robot would autonomously explore its space, but "simultaneous localization and mapping" only promises localization and mapping – nothing about deciding which areas to map, or how to go about it. That is left to the human operator.

    When I played with SLAM code earlier, I decided against driving the robot manually and instead invoked an existing module that takes a random walk through available space. A search on the ROS Answers web site for something more sophisticated than a random walk turned up multiple pointers to the explore module, but that code hasn't been maintained since ROS "groovy", four versions ago. So one path forward is to take up the challenge of either updating explore or writing my own explorer.

    That might be interesting, but once a map is built, what do we do with it? The standard ROS answer is the robot navigation stack. This collection of modules is what gives a ROS robot the ability to plan a path through a map, watch its progress through that plan, and update the plan in reaction to unexpected elements in the environment.

    At the moment I believe it would be best to learn about the standard navigation stack and get it up and running on Phoebe. I might return to the map exploration problem later, and if so, seeing how map data is used for navigation will give me better insights into what would make a better map explorer.

    (Cross-posted to NewScrewdriver.com)

  • Phoebe Accessory: HDMI Plug

    Roger • 10/05/2018 at 19:47 • 0 comments

    In most ROS demonstrations, the robots are running through a pristine laboratory environment. Phoebe is built to roam my home, which is neither a laboratory nor pristine. This became a problem when Phoebe ran across some dust bunnies and picked them up with its leading edge.

    When choosing an orientation for Raspberry Pi 3 on Phoebe’s electronics tray, I chose to make the HDMI port accessible so I could connect a monitor as necessary. This resulted in that port facing forward along with the micro-USB power port and the headphone jack. All three of these ports were plugged up with debris when Phoebe explored some paths less well-traveled.

    After I cleaned up the mess, all three ports appeared to work, but I was worried about Phoebe encountering some less fluffy obstacles. The audio jack was not a high priority, as Raspberry Pi default audio is notoriously noisy and I haven't needed it. The power jack could be easily bypassed by sending power via the GPIO pins (as I'm doing right now). That leaves the HDMI port, which would be quite inconvenient to lose. If I needed a screen on a Pi with a damaged HDMI port, I'd have to buy or borrow a screen that uses the alternate DSI port, like the official Raspberry Pi touchscreen.

    Fortunately, there are little plastic plugs that come with certain HDMI peripherals for protection during shipping. In my case, I had a small red HDMI plug that came with my MSI video card. I installed it on Phoebe’s Raspberry Pi to protect the HDMI port against future debris encounters. Now Phoebe has a red nose. If it should glow I might have to rename my robot to Rudolph the Red Nosed Robot.

    But it doesn’t glow, so Phoebe won’t get a name change.

    (Cross-posted to NewScrewdriver.com)

  • Phoebe Accessory: Battery Voltage Monitor

    Roger • 10/04/2018 at 23:01 • 0 comments

    And now, a few notes on some optional accessories. These aren't required for anyone building their own Phoebe, but are nice to have.

    The first item is a battery voltage meter and alarm. While Phoebe can monitor battery voltage in software via the Roboclaw API, I also wanted an always-available physical readout of battery voltage. On Sawppy I thought I just needed to show the battery's output voltage, but the number is only good if I can read it. During Sawppy's all-day outing at JPL, California sunlight was too bright to read the number, and I couldn't tell when my battery dropped below the recommended level for lithium chemistry batteries.

    Searching for a better solution, I found these battery voltage alarms. Not only do they display voltage; when the level gets too low, they also sound a buzzer. Judging by the product description, these were designed for remote-control aircraft, where it's not convenient to read a small number up in the air.

    The downside is that the alarm is designed to be audible while up in the air and buried inside a fuselage. When it is on the ground and right in front of my face, it is a piercing shriek. Which isn't so bad if it only occurs during low battery... but it also sounds a test beep when I first plug it in. It is loud. Very loud. To save my eardrums, the alarm buzzer has been muffled with some cotton pulled from a cotton swab. It's still loud, but no longer gives me a headache afterwards.

    I've also soldered a JST-XH connector onto the unpolarized input pins to fit my battery's balance charging plug. Having a polarized connector helps make sure I don't plug the battery in backwards. Those exposed pins are also a short-circuit risk, which I crudely mitigated by wrapping a layer of servo tape around them. Finally, servo tape secures the alarm to Phoebe's backbone.

    Now I can drive Phoebe around the house, even out of sight, confident that if I ever run the battery too low I'll be notified with an alarm.

    (Cross-posted to NewScrewdriver.com)

View all 30 project logs


Discussions

Tom Coyle wrote 03/10/2022 at 21:15

A very nice beginner project for ROS. You ended your project log a little over three years ago. Do you plan to update it anytime in the future?

Here is a bigger version of Phoebe: https://github.com/chrisl8/ArloBot which is based on the discontinued Parallax Arlo robot chassis. 

I am presently thinking about building an "off the shelf" version of Arlo using a 14 in diameter chassis, a Rpi 4B/4GB, 2X7 RoboClaw, Pololu D37 motors with encoders, Slamtec A1M8 RPLidar, and Li-Ion batteries.

Regards,

TCIII


Roger wrote 03/14/2022 at 23:27

My eventual ambition is to repeat this learning exercise with ROS2. It will be a more extensive project because (last I looked) nobody has yet written ROS2 nodes for the Neato LIDAR or RoboClaw motor controller so I would also have to either port the ROS1 code Phoebe is using or rewrite them from scratch.

I want to see if performance would improve with use of Raspberry Pi 4 with more RAM, as I saw some virtual memory swapping with the 1GB RAM of a Pi 3. Especially when occupancy map resolution is increased.

But all that will have to wait until the global electronics disruption shakes out and I could buy more Pi at reasonable prices. My Phoebe is currently without a brain as I've repurposed that Pi 3 to another project.


Tom Coyle wrote 03/15/2022 at 19:53

Hi Roger,

Thanks for the response, much appreciated. 

The ArloBot creator originally used a laptop, but I have substituted a Rpi 4B/4GB and a LattePanda 4GB/64eMMC (x86) in place of the laptop with satisfactory results. The ArloBot creator has also discussed moving to ROS2. He is also moving to a RoboClaw 2X7 motor controller, in place of the discontinued Parallax DHB-10 motor controller, to be used in a smaller chassis than the ArloBot. 

You might want to consider using either the Slamtec A1M8 RPLidar or the EAI YDLIDAR X4 in place of the Neato LIDAR.

Please keep me updated with your path forward once you get going again.

TCIII


Dongwon Lee wrote 10/15/2018 at 07:41

Wow, awesome work!! Actually, I had made your Sawppy and needed ROS, so I thought TurtleBot 3 was the best starter... but it was expensive. Your Phoebe is the best starter kit for ROS.

Do you have plan to share STL files?

Thanks Roger !!! 


Roger wrote 10/16/2018 at 19:05

You can generate STL files from the Onshape CAD project, found in the 'links' section of this project page. I expect builders of their own parts-bin TurtleBots will want to modify the design to fit whatever is available in their own parts bins.

But for people who want to build Phoebe just as I've built mine, I've generated and uploaded STL files to the 'files' section. Bonus: while I was at it, I also exported a STEP version of Phoebe's 3D printed parts.

Happy building!

(And if you've built Sawppy, please post pictures and/or information on Sawppy's project page. I'd love to see it.)


Humpelstilzchen wrote 09/09/2018 at 07:13

Hi,

which laser scanner do you use on this budget?


Roger wrote 09/09/2018 at 17:36

Thanks for finding my project before I even had a chance to fill out all the details! I'll be posting more information soon, but I'm happy to answer your specific question now: I'm using a laser scanner salvaged from a Neato robot vacuum. Search on eBay for "Neato LIDAR" and you should see several options with "Buy It Now" prices in the $50-$75 range.


Humpelstilzchen wrote 09/09/2018 at 17:38

thanks, I'll patiently wait for more details then.

