
Twerk Lidar Robot

An insect-quad style robot with an IMU plus ToF lidar sensor for mapping and navigation.

The name is a gimmick, but it describes what the robot actually does: it "twerks" to scan the environment with a single-point lidar beam. The onboard IMU tracks the robot's orientation and is used to figure out where things are and how far away they are as the robot moves.

About

This project is a self-directed learning exercise across several subjects for me. I don't have much experience with IMUs, and I have this personal fantasy of autonomous ground drones that map things and work on their own. This will also stream telemetry and use 3D rendering via ThreeJS, which is also pretty new to me.

Components

  • Teensy 4.0
  • MPU-9250
  • VL53L0X (ToF laser ranging sensor)
  • ESP-01
  • 18650 battery
  • MPS mEZD41503A-A 5V @ 3A (step-up converter)
  • 12 x 9g servos
  • TFmini-s Lidar

Unit cost $100.00 (blue servos, no TFmini-s lidar)

This project is ongoing; as of 01/20/2022 I've just completed the physical design/build and the wiring of everything. I now have to actually do the mapping/3D telemetry work, which is all doable thanks to glTF export and ThreeJS, plus the data streams off the robot through a web socket where a web interface can consume it.

Scanning example


NOTE
This project is still in development and is not really intended to be reproduced, due to the complexity of soldering the proto board. It's not so much complexity as it is a PITA. Also the project generally sucks; it needs more planning and better parts.

  • A web-based simulator with ThreeJS (proof of concept)

    Jacob David C Cunningham, 11/13/2022 at 20:24

    Since you can import SketchUp models into ThreeJS using glTF export (Khronos), it seems like you ought to be able to make a 3D visualizer for the robot.

    I prototyped this before, but this enhancement brings in individual parts that can be moved independently, so you could do a walking cycle. Unfortunately I still have not learned things like inverse kinematics (if they apply) or how to automatically move the legs for gaits, so these movements are manually programmed.

    Some notes:

    • PerspectiveCamera: the third parameter is the camera frustum near plane; set it very close, e.g. 0.01, so nearby geometry still renders
    • the lighting controls how the colors come out
    • each glTF has meta info, including its model (scene), which you can translate or rotate

    Video below

  • When your own code annoys you

    Jacob David C Cunningham, 10/24/2022 at 01:10

    This weekend I worked on the ToF scanning and IMU sampling. Overall I have not really landed on anything concrete. I need to study up on basic physics. I also need to work out the beam sampling more, specifically for overhead scans, since if the servo wires get caught on anything they pull the board off the IMU pins.

    Anyway, I'm getting closer to starting from scratch, ha... not entirely, because I'm still not at the point where I'm using real robotics code, e.g. inverse kinematics... all my gaits/servo positions are manually coded, which sucks when you have to change them and the motions don't propagate.

    I have a lot of random bits and pieces of existing code I can pull eg:

    • getting sensor values (IMU, ToF, TFmini-S), battery voltage
    • comms (ESP-01 websocket to web interface)
    • 3D GUI eg. exported model from SketchUp to ThreeJS (could make a simulator)

    But what I have right now is just random crap thrown together... bad code.

    I did improve the navigation code, but I kind of just half-assed it and lost track of the different edge cases/logic for the cone scanning.

    The goal was to build this thing, have it safely navigate on its own and plot a rough map of where it's been. Did I accomplish that? Kind of... I'm lacking on the IMU side (true values).

    That will be easier to do with the wheeled robot (Floating Navigation Sensor Assembly project) since it moves more smoothly than a legged robot.

    These projects are a time sink. I don't know when it will be "done".

    There is something new I did, I designed and printed a basic battery clip, since the battery does fall out eventually from all the twerking the robot does.

    I'll focus on getting this into a "done" state with the hard-coded version. It's a good physical platform for writing better code.

    IMU video

    ToF scanning video

  • You're back on the water, boys.

    Jacob David C Cunningham, 10/16/2022 at 21:36

    Life Aquatic quote

    TLDR;

    started working on this again, no real progress yet on improved navigation

    So... I think I had a burnout moment, I could not do anything "productive" for about 3 weeks. But now I'm back!

    To recap, the Twerk Lidar Robot (TLR) used to have just one power supply for everything, so when the power spiked or sagged, the Teensy 4.0 (the robot's brain) would die. Adding a second supply (both fed from the same battery), currently just for the Teensy, helps it greatly. Now, if it runs into a wall, it probably won't die from a current spike; rather, it will die from destroying itself. So the navigation has to be good.

    I've just been working on random stuff, but I think it's time to start putting it together. One unfinished aspect is the long-scan data send, which pushes a lot of data through my janky polling serial-to-websocket connection. There is no send acknowledgement, so if the data doesn't arrive completely it's wasted effort, since the transmitted data can't be plotted/used.
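    To make that concrete, here is a minimal sketch (not the robot's actual code) of what a chunked send with per-chunk acknowledgement could look like on the Teensy side. It assumes the ESP-01 bridge hangs off Serial1 and that the web side echoes back "ACK:<index>" followed by a newline for each chunk; those names and the 240-char chunk size are just illustrative.

    ```cpp
    const size_t CHUNK_SIZE = 240;  // stay under the ~250 char send limit

    // Wait for "ACK:<index>\n" to come back over the ESP-01 serial bridge.
    bool waitForAck(uint16_t index, uint32_t timeoutMs) {
      String expected = String("ACK:") + index;
      String buffer;
      uint32_t start = millis();
      while (millis() - start < timeoutMs) {
        while (Serial1.available()) {
          char c = Serial1.read();
          if (c == '\n') {
            if (buffer == expected) return true;
            buffer = "";
          } else {
            buffer += c;
          }
        }
      }
      return false;  // timed out, caller should resend
    }

    // Send a large payload in indexed chunks, resending any chunk that isn't acknowledged.
    bool sendScanData(const String &payload) {
      uint16_t totalChunks = (payload.length() + CHUNK_SIZE - 1) / CHUNK_SIZE;
      for (uint16_t i = 0; i < totalChunks; i++) {
        size_t from = (size_t)i * CHUNK_SIZE;
        size_t to = from + CHUNK_SIZE;
        if (to > payload.length()) to = payload.length();
        String chunk = payload.substring(from, to);
        bool acked = false;
        for (uint8_t attempt = 0; attempt < 3 && !acked; attempt++) {
          // Prefix each chunk with its index and the total so the web side can reassemble.
          Serial1.print(String(i) + "/" + String(totalChunks) + ":" + chunk + "\n");
          acked = waitForAck(i, 1000);
        }
        if (!acked) return false;  // give up after three tries on one chunk
      }
      return true;
    }
    ```

    With something like this, a dropped chunk only costs a retry instead of the whole scan.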

    The other issue is that the TFmini-S takes longer per sample than the purple sensor (the VL53L0X). But the single-point "lidar" is more accurate with a narrower beam, so I still want to use it.

    With all of that in mind, the work for today will be to come up with a robust navigation system.

    The other issue is that there's no multitasking... what I can probably rely on is the CPU speed, since it is insane, e.g. 600 MHz.

    I forget how much electronics is on this thing.

    Long video

  • Servo jitters (how to waste time)

    Jacob David C Cunningham, 09/05/2022 at 21:11

    So after I upgraded the servos from the cheap blue 9g ones to MG90Ds, there was a new startup jitter problem. The servos would flip out, and it would take several on/off cycles to get the robot into its booted state (standing upright).

    I thought I could fix it by waiting until the Teensy was turned on and then turning on the servos separately, but since everything uses the same power supply, turning on the servos resets the Teensy; the Teensy then loses control while it reboots, and that makes the servos go crazy. When you turn the servos on without the Teensy being on, they don't do anything...

    But yeah... I wasted about 4 hours rebuilding this MOSFET high-side switch thing, and in the end I just used another power supply. Unfortunately (wastefully) it only powers the Teensy... but anyway, it gets the job done. This robot is a learning one, not a great design, but I can move forward with it.

    I also tried super caps in parallel with the power supply I'm using, and that was a bad idea (not helpful; too much capacitance, and the supply already has capacitance built in). I thought they would act as a "bank" providing more current to draw from, but nope.

    I'm also pleased that the robot can stand up from laying flat on the ground... these servos were worth the money.

    I forgot how super slow the TFmini-S sampling is... I'll need to change that, maybe sample continuously and store the readings with markers or something, vs. an individual read for every degree, since a full scan takes something like 5 minutes.
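    Here is a rough sketch (not the project's code) of that continuous-sampling idea: sweep the pan servo slowly and record every reading the TFmini-S produces, tagged with the commanded servo angle, instead of stopping for a blocking read at each degree. readTfMiniCm() is a hypothetical non-blocking helper (a parser like the one sketched in the "Actual Lidar (?)" log further down), and the buffer size and timing are made up.

    ```cpp
    #include <Servo.h>

    int readTfMiniCm();  // hypothetical: latest distance in cm, or -1 if no new frame yet

    struct Sample {
      uint8_t panAngle;     // degrees commanded to the pan servo
      uint16_t distanceCm;  // TFmini-S reading at (roughly) that angle
    };

    Servo panServo;
    Sample samples[512];
    size_t sampleCount = 0;

    void sweepAndSample() {
      sampleCount = 0;
      for (uint8_t angle = 0; angle <= 180; angle++) {
        panServo.write(angle);
        uint32_t settleUntil = millis() + 15;  // ~15 ms per degree keeps the sweep moving
        while (millis() < settleUntil) {
          int cm = readTfMiniCm();             // grab whatever frames arrive while moving
          if (cm >= 0 && sampleCount < 512) {
            samples[sampleCount++] = { angle, (uint16_t)cm };
          }
        }
      }
    }
    ```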

    Oh, and it's trying to transmit data, dang, I forgot what this thing was doing the last time I worked on it.

    It also failed to transmit all the data, since there is no check/acknowledgement back from the web to the robot to make sure all the data was received.

  • Baby's got a new set of feet (MG90D upgrade)

    Jacob David C Cunningham, 07/29/2022 at 04:18

    Since the MG90Ds are taller than the blue 9g servos, I have to update the design and reprint ALL of the leg joints... I considered making the outer joint able to swing all the way around the middle joint, but I don't think there's much to gain in doing so, and extending it would change the gaits. So, in order not to affect the gaits, I will just extend the bases.

    Thankfully most of the dimensions are the same though, so it's really just a height mismatch. I also had to update the servo boots since the wires are flush with the base of the servo for the MG90D.

    It'll take me a few days to print these (starting 07/26). I'll just run the prints after work.

    Print times (x4)

    • servo boot 30 mins (x2)
    • inner joint 46 mins
    • middle joint 60 mins
    • outer joint 51 mins

    So it's about 3.5hrs per leg or 14hrs total

    Patiently waiting for new legs

    Got one... already the servo positions changed, will need to check the gaits again

    This particular leg is short in the area circled in red, but I updated the rest. I was having problems with the measurements because the servo horn on the MG90D is not flush like the blue servo's, so it adds extra height, and the case of the MG90D is also a little taller. I will get this done by Friday night, and then I can build the floating sensor assembly project over the weekend.

    Thursday 07/28/2022 it's done

    Bad focus here, still a photography noob.

    I need to reprogram it all again, since the servo starting positions changed and I have not finished that abstracted gait mod.

    But man... it's so stiff and stable, I'm pumped. Worth it. It's crazy like I'm not even sure if it's on because the servos don't make any sound/barely deflect from the weight of the robot.

  • Goodbye stranger, it's been nice

    Jacob David C Cunningham, 07/24/2022 at 17:41

    It was early on a Saturday when I stripped the pads off my TFmini-S lidar...

    This was an "end goal" setback, but I ordered more. I have another project in mind that will use one of these as well, this "floating sensor assembly". I bought bearings for it, so that project is coming. It's like a detachable compute/navigation unit that you can stick onto things (assuming they're bridged together wirelessly). Anyway, I thought the resistors on the board led to the connectors, but I guess not, since I probed/connected them and it didn't fix my TFmini-S.

    Today I made this work: it goes from the robot straight to web mesh plotting, vs. me copy-pasting from the serial monitor into a Google spreadsheet and spending 20 minutes or so massaging the data for the mesh-plot ingest. It takes time to transmit the data though, due to the janky 1-second back-and-forth poll. Each send is also limited to 250 chars, so yeah, it takes a bit to send, say, 12 KB+ of data to plot below.

    Oof... I just reformatted my 3TB external drive by accident, thinking it was the camera SD card, dang... a lot of files, years old, gone. Luckily I still have all the videos I recorded for this project.

    Short video

    It took me about 14hrs+ to get this done, yeah one of those days in the zone autozone.

    Long video

    Cat

  • Actual Lidar (?)

    Jacob David C Cunningham, 07/17/2022 at 20:32

    Maybe not true lidar but the FOV is much smaller (3.6 degrees), better for my case.

    Edit: actually it's 2 degrees FOV which is even better

    So for a long time I assumed that the VL53L0X is a lidar sensor. Some sources say it is, most don't. But the 25-degree FOV is not ideal for something that is supposed to be "lidar", i.e. a laser/single point. The TFmini-S is something I already had from another project which I stopped working on.

    So I stole that sensor and put it on here. I forgot about the UART/I2C stuff... on the left side I'm using a Pi to do all the computation, with an Arduino just to control the servos (less problematic on boot). In that particular code base I had found some sample code from a raspberrypi.com thread about the TFmini-S, written for serial. Initially (yesterday) I tried to use the Benewake GUI to switch the sensor from UART to I2C using the hex commands mentioned in that thread, but I kept getting "wrong command". I decided, before I went too deep, to look at how I got it working before on the Raspberry Pi using the sample code mentioned above. So I figured out how to convert that Python serial write/read code to Arduino, and it worked out.
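    For reference, the converted read ends up looking roughly like this. It is a minimal sketch based on the TFmini-S's documented 9-byte UART frame (0x59 0x59 header, little-endian distance, checksum); the Serial2 wiring and the default 115200 baud are assumptions, not necessarily how my board is hooked up.

    ```cpp
    // Parse TFmini-S frames byte by byte without blocking the rest of the loop.
    // Returns the latest distance in cm, or -1 if no complete, valid frame is ready.
    int readTfMiniCm() {
      static uint8_t frame[9];
      static uint8_t idx = 0;
      while (Serial2.available()) {
        uint8_t b = Serial2.read();
        if (idx == 0 && b != 0x59) continue;               // hunt for the first header byte
        if (idx == 1 && b != 0x59) { idx = 0; continue; }  // second header byte must match
        frame[idx++] = b;
        if (idx == 9) {
          idx = 0;
          uint8_t checksum = 0;
          for (uint8_t i = 0; i < 8; i++) checksum += frame[i];
          if (checksum == frame[8]) {
            return frame[2] | (frame[3] << 8);             // distance, low byte first
          }
        }
      }
      return -1;
    }

    void setup() {
      Serial.begin(115200);
      Serial2.begin(115200);  // TFmini-S default baud rate
    }

    void loop() {
      int cm = readTfMiniCm();
      if (cm >= 0) Serial.println(cm);
    }
    ```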

    Initially I was going to just have the TFmini-S on its own, but I decided to keep both sensors since I have the pins and I already know how to interface with both of them. The main reason I wanted both is scanning at close range: the TFmini-S is only good from 1 ft (30 cm) and beyond and can't measure anything less than 12 inches, whereas the VL53L0X ToF ranging module can measure below 12".
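    Something like the following is one way the two readings could be combined, trusting the VL53L0X up close and the narrower-beam TFmini-S beyond ~30 cm. readVl53Mm() and readTfMiniCm() are stand-ins for the real read functions, and the cutoff is just the datasheet minimum range.

    ```cpp
    const int TFMINI_MIN_CM = 30;  // TFmini-S can't measure closer than about 30 cm

    int readVl53Mm();    // hypothetical: VL53L0X reading in mm (-1 if invalid)
    int readTfMiniCm();  // hypothetical: TFmini-S reading in cm (-1 if invalid)

    // Return a fused distance in mm, preferring whichever sensor is valid at that range.
    int readDistanceMm() {
      int tfCm = readTfMiniCm();
      if (tfCm >= TFMINI_MIN_CM) return tfCm * 10;              // far: use the narrow beam
      int vlMm = readVl53Mm();
      if (vlMm >= 0 && vlMm < TFMINI_MIN_CM * 10) return vlMm;  // close: use the ToF module
      return -1;  // nothing reliable at this moment
    }
    ```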

    You can see a comparison here of the difference in scanning area: the TFmini-S has a FOV of 3.6 degrees vs. 25 degrees for the VL53L0X. I will have to rework all of the code to change the dimensions/scanning process. Also, the robot is a little heavier now; it weighs 10.5 oz.

    According to the torque diagram from this video, these servos should manage 7.8 oz at 1.5 in, which is beyond the weight of this robot per leg (10.5 oz spread over four legs is only about 2.6 oz each); however, I think the servos are still too weak. I was also thinking I'll eventually need to incorporate the IMU into the servo motions to try and keep the robot steady so it doesn't wobble.

    The problem is the single-threaded aspect; it's hard to mix different domains together in real time. The Teensy 4.0 is fast (600 MHz), so maybe in "real time" it's not noticeable even if, per degree moved, I'm pinging the IMU to check and make the robot more stable/self-balancing.

    I really like these "eyes" though, the Lidar module lenses. It makes it look like a jumping spider. Also the pins on the VL chip look like little teeth in the right lighting.

    I was thinking I could send the measured points and angles to do the plotting on the web... the websocket is pretty good at sending the characters; it's just a matter of chunking the data and ensuring it all makes it to the web. I want to make the scanning really good by combining the sensors and doing tests against physical objects, so I know what I can expect performance-wise.
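    For the plotting side, each pan/tilt angle plus a distance reading has to become a 3D point at some stage (on the robot or on the web). Here is a small sketch of that conversion; the conventions (pan rotates about the vertical axis, tilt is elevation above horizontal, sensor at the origin) are assumptions rather than the project's actual frame.

    ```cpp
    #include <math.h>

    struct Point3 {
      float x, y, z;  // millimeters, in the robot's frame
    };

    // Spherical-to-Cartesian conversion for one scan sample.
    Point3 toPoint(float panDeg, float tiltDeg, float distanceMm) {
      float pan  = panDeg  * 3.14159265f / 180.0f;
      float tilt = tiltDeg * 3.14159265f / 180.0f;
      Point3 p;
      p.x = distanceMm * cosf(tilt) * cosf(pan);
      p.y = distanceMm * cosf(tilt) * sinf(pan);
      p.z = distanceMm * sinf(tilt);
      return p;
    }
    ```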

    Anyway it's just more work... opportunity to write better code.

    Long video

  • Read the manual

    Jacob David C Cunningham, 07/12/2022 at 03:58

    So yeah... while discussing the sensor with someone, they pointed out that it doesn't shoot a straight line/laser; it actually projects a cone with a 25-degree FOV, so it looks like the above. What I had assumed in the past is that the beam is a line, and I was panning every degree to make 5 roughly 2D slices. But it actually looks something like the above. So at the moment I am scanning every 4th degree (might lower that to every 3rd just because) plus the outer ends. This makes the scanning process much faster.
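    The stepped scan itself is simple; a rough sketch of the loop (with made-up angle range, settle delay, and a hypothetical readDistanceMm() helper like the fused-read sketch in the "Actual Lidar (?)" log above) could look like this:

    ```cpp
    #include <Servo.h>

    Servo panServo;
    const uint8_t PAN_MIN  = 20;   // made-up mechanical limits of the pan sweep
    const uint8_t PAN_MAX  = 160;
    const uint8_t PAN_STEP = 4;    // roughly one cone-width per step

    int readDistanceMm();  // hypothetical fused sensor read

    void coneScan() {
      for (uint8_t angle = PAN_MIN; ; angle += PAN_STEP) {
        if (angle > PAN_MAX) angle = PAN_MAX;  // always sample the outer end exactly
        panServo.write(angle);
        delay(120);                            // let the servo settle before reading
        Serial.print(angle);
        Serial.print(',');
        Serial.println(readDistanceMm());      // angle,distance pairs for later plotting
        if (angle == PAN_MAX) break;
      }
    }
    ```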

    I am also reworking the web interface... it is still crap because I'm not using an SPA or any pattern/standards; I'm just injecting things as needed. But you can see a demo of the upside-down alerting in the short video below.

    Anyway, I'll keep working on this. One thing I'll add is a backward gait and a crab gait (left/right) so that it can get away from a wall (this is a problem), and for this to work I will need to use the IMU to make sure it's actually moving/not stuck based on expected IMU values.

  • Moving forwards in reverse

    Jacob David C Cunningham, 07/04/2022 at 21:38

    Greetings, sire

    This is mostly a code refactor update, however there are some new thoughts/directions I'll mention.

    Initially I was driven to refactor the code in hopes that I could easily mirror the existing gaits to get full 2D control, e.g. the missing move-backwards/turn-right. However... this did not go as planned. Mirroring the default stance from >= to =< made the robot also move forward... so I think ultimately I'll have to figure out how to turn without doing that mirroring, or improve the mirroring so it doesn't move forward. Similarly, I wanted to make the gaits less hardcoded (did not happen) by simply specifying the degree coverage to move and a direction.

    With regard to refactoring, what I ended up doing was changing the direct Servo bindings to a bunch of struct instances (legs) so that I could attach additional info to each servo, like the max/min position based on its geometry. Then I added a safeServoWrite function to guard any incoming servo write commands. Largely, though, the refactoring was just moving code around and organizing the files.
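    The struct-plus-guard idea, sketched minimally (the pin number and limits here are made up, not the robot's real geometry):

    ```cpp
    #include <Servo.h>

    struct LegServo {
      Servo servo;
      uint8_t pin;
      uint8_t minDeg;   // mechanical limit derived from the leg geometry
      uint8_t maxDeg;
      uint8_t lastDeg;  // last commanded position
    };

    LegServo frontLeftInner;

    // Clamp every command to the servo's safe range so a bad gait value
    // can't drive a joint into the frame.
    void safeServoWrite(LegServo &s, int target) {
      if (target < s.minDeg) target = s.minDeg;
      if (target > s.maxDeg) target = s.maxDeg;
      s.servo.write(target);
      s.lastDeg = (uint8_t)target;
    }

    void setupLegs() {
      frontLeftInner.pin = 2;      // made-up pin
      frontLeftInner.minDeg = 20;  // made-up limits
      frontLeftInner.maxDeg = 160;
      frontLeftInner.lastDeg = 90;
      frontLeftInner.servo.attach(frontLeftInner.pin);
      safeServoWrite(frontLeftInner, frontLeftInner.lastDeg);
    }
    ```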

    As part of the min/max positions, I added a manual web-based command to target specific servos and move them... this is not really hard, just sending/parsing a string over the existing websocket connection. What sucks is the single-threaded aspect of Arduino, so I have to rethink how to structure my code so that I can do the main/general processing but also listen for incoming web commands... because right now the web commands are blocked during the navigation/motion part of the main loop. I could also do a scheduler, where the messages stay pending (are not removed) until the robot receives them and responds.
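    One common way around the blocking problem, sketched here with hypothetical handleCommand()/stepGait() stand-ins: keep loop() non-blocking with millis() timers and drain the command stream on every pass, so web commands get handled between small slices of motion work.

    ```cpp
    void handleCommand(const String &cmd);  // hypothetical: parse e.g. "servo:3:120" from the web
    void stepGait();                        // hypothetical: advance the current gait by one small step

    String pendingCmd;
    uint32_t nextGaitStepAt = 0;
    const uint32_t GAIT_STEP_MS = 20;

    void loop() {
      // 1) Always drain the serial bridge to the ESP-01 first so commands aren't starved.
      while (Serial1.available()) {
        char c = Serial1.read();
        if (c == '\n') {
          handleCommand(pendingCmd);
          pendingCmd = "";
        } else {
          pendingCmd += c;
        }
      }

      // 2) Do one small slice of motion work, never a long blocking sequence.
      if (millis() >= nextGaitStepAt) {
        stepGait();
        nextGaitStepAt = millis() + GAIT_STEP_MS;
      }
    }
    ```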

    Continuing on... while I was letting it roam around it flipped over (not for the first time), which gave me the idea to implement a "help, I'm stuck" feature that pings the web. Similarly, if it dies from too much current draw (like running into something), it will notify the web interface that this is happening. Continuing with that thought, the progress/motion measured by the IMU will also be factored in. It's hard because the robot shifts back and forward, so getting a "forward distance covered" value out of that which is positive and reliable is tricky, although it does work; I did some sample work on it at some point. Generally I want to make the code make more sense and have a better structure. I'll discuss that in the long video below.
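    As a very rough sketch of the stuck check (everything here is hypothetical: the helper names, the threshold, even whether raw acceleration magnitude is the right signal), the structure would be: run a gait cycle, accumulate how much the IMU says the body actually moved, and flag "stuck" if it's below some tuned minimum. A real version would need gravity compensation and filtering.

    ```cpp
    #include <math.h>

    float readAccelX();     // hypothetical MPU-9250 reads, gravity-compensated, in g
    float readAccelY();
    void notifyWebStuck();  // hypothetical: pushes a "help I'm stuck" message over the websocket bridge

    // Blocking here for brevity; sample horizontal acceleration while one gait cycle runs
    // and decide whether the robot actually moved.
    bool movedDuringGaitCycle(uint32_t cycleMs) {
      const float MIN_MOTION = 0.8f;  // completely made-up threshold, would need tuning
      float accumulated = 0.0f;
      uint32_t start = millis();
      while (millis() - start < cycleMs) {
        float ax = readAccelX();
        float ay = readAccelY();
        accumulated += sqrtf(ax * ax + ay * ay);
        delay(10);  // ~100 Hz sampling
      }
      return accumulated > MIN_MOTION;
    }

    void checkForStuck() {
      if (!movedDuringGaitCycle(1000)) {
        notifyWebStuck();
      }
    }
    ```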

    I'm also working on the actual web interface, not just random bits. I don't really have a solid design in mind yet. I am thinking tabbed interfaces and then I want an event stream (from the socket bridge).

    Mostly this robot is problematic because I can't trust it to be on its own, since its sensing ability is crude and it will damage itself by running into things or burning out a servo. And that is the overarching goal/design/purpose of this thing: to self-navigate and then stream its state to the web.

    In the long run this robot is flawed because of the manual programming of the servos... every time the servos are adjusted, the change has to be propagated into several places (motion and pan/tilt gaits), and I have to trace it to make sure the servos do not jump/skip positions, for smooth operation.

    Long video below

    Closing

    I'm trying to keep up the "pump" (interest) to keep working on this project. I'm also learning other technology in my spare time, but I'd like to "close" this out, though I don't think I will until I design/build a better one. There is still so much work to do and more things to dive into and improve.

  • Import 3D model into ThreeJS via glTF

    Jacob David C Cunningham, 05/29/2022 at 19:55

    So just a little update on myself: I'm currently on the job hunt, so I'll be delayed in posting any significant progress on this project. My aim is to be employed within June. I've been working on a startup that has not taken off, and I'm broke, ha.

    Anyway, here is the update: I checked in the past whether this could be done, and it turns out it is possible. In SketchUp I export the model (the entire thing right now) via a glTF export plugin, then I load it in ThreeJS.

    The full way to do this is to split the model into parts, import them individually, position them for the "standing" pose, and then program in the gaits virtually to show the walking animation as the robot moves in real life. Of course the motion is not tied to the real thing, other than whether it is walking or not.

    Another thing I'd like to address is to make the gaits dynamically set eg. by inverse kinematics like a real robot vs. manually programming in the gaits. Then it would be nice to be able to easily change the direction it moves as it walks if it's turning slowly to the left for example.

    Relevant code

    Note that I just got this to work in like 30 minutes, so it sucks, but it does work. I'll have to work on the lighting too, as it is a bit dark. It is gonna suck programming the parts to move via raw JS and ThreeJS, haha... you're probably supposed to animate it in Blender or something and then export that into JS. I'll probably do a lazy/generalized version where the entire leg moves vs. each joint.

