
Modular Humanoid Robot "MARB"

Meet MARB, a modular humanoid robot you can build yourself

"I happen to believe that robotics will be bigger than the Internet." Marc Raibert, CEO and founder of Boston Boston Dynamics.
MARB will protect your home, support you in your everyday life and be your companion. The ambition of this project is to enable robotic computing for every household.

The main structure (of the first prototype) is based on popular MakerBeam parts as well as some custom designed 3-D printed, machined and laser cut parts. Each arm has 4 DOF, the head 2 DOF. Only digital servos with brushless motors are used. The head consists mainly of a SunFounder 24x8 LED dot matrix module, called EMO. The robot base uses 2 angular geared DC motors, supported by 2 caster wheels, so that it can turn in any direction and within its own length. It is controlled by several Arduino microcontrollers; the main brain will probably be a Raspberry Pi. Besides describing how to build this robot, the project will address artificial intelligence and provide all the source code.

More details will follow in the build logs.

This project is published under the MIT license.

Some of the 3-D printable files:

• Switch mount_1.stl: print 1x
• Vision module.stl: mount for the Sparkfun APDS-9960 breakout, print 1x
• Sensor module.stl: mount for the sensor breakouts, print 1x


  • Algorithm bits

    M. Bindhammer, 06/03/2018 at 16:27

    In this log I will discuss some algorithm ideas I have in mind for MARB. Some are easy, some require higher math.

    1. Weather forecast

    MARB will not be connected to the internet. Call me old-fashioned, but fetching the weather forecast from an online service like weather.com has nothing to do with AI. So how can MARB forecast the weather on its own?

    In the old days we used analog barometers like the following:

    With a digital barometer like the BMP180 and a few if-statements we can determine the current weather. By storing the previously measured atmospheric pressure and comparing it with the current one, we can also tell whether the weather is improving or deteriorating. To forecast the weather for the next couple of days, we can use so-called Markov chains. The probabilities of the weather conditions (rainy or sunny), given the weather on the preceding day, are represented by the transition matrix:
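    A reconstruction of the transition matrix from the probabilities quoted in the next paragraph (columns refer to today's weather, rows to tomorrow's; first entry sunny, second rainy):

```latex
P = \begin{pmatrix} 0.9 & 0.5 \\ 0.1 & 0.5 \end{pmatrix}
```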

    The matrix P represents the weather model in which the probability that a sunny day is followed by another sunny day is 0.9, and that a rainy day is followed by another rainy day is 0.5. Markov chains are usually memoryless: the next state depends only on the current state and not on the sequence of events that preceded it. But we can change the initial state if we have other experience to model the weather or something else. One more note: the columns must sum to 1 because P is a stochastic matrix.

    The weather on the current day 0 is either sunny or rainy. This is represented by two state vectors: 100% sunny weather and 0% rainy weather, or 0% sunny weather and 100% rainy weather. The general rule for day n is to multiply the transition matrix with the state vector of day n-1, or equally to multiply the n-th power of P with the initial state vector.

    This leaves us with the question of how to compute these matrix multiplications for our example. Let us assume the weather on day 0 is sunny; then the weather on day 1 can be predicted as follows:
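    Reconstructed from the values above, the state vectors, the general rule and the day-1 step read:

```latex
x_0^{\text{sunny}} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \qquad
x_0^{\text{rainy}} = \begin{pmatrix} 0 \\ 1 \end{pmatrix}, \qquad
x_n = P\,x_{n-1} = P^{\,n} x_0
```

```latex
x_1 = P\,x_0 = \begin{pmatrix} 0.9 & 0.5 \\ 0.1 & 0.5 \end{pmatrix}
\begin{pmatrix} 1 \\ 0 \end{pmatrix}
= \begin{pmatrix} 0.9 \\ 0.1 \end{pmatrix}
```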


    Hence, the probability that day 1 will be sunny is 0.9.

    Applying the general rule again, the weather on day 2 can be predicted by:
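    Again reconstructed, multiplying P with the day-1 state vector:

```latex
x_2 = P\,x_1 = \begin{pmatrix} 0.9 & 0.5 \\ 0.1 & 0.5 \end{pmatrix}
\begin{pmatrix} 0.9 \\ 0.1 \end{pmatrix}
= \begin{pmatrix} 0.81 + 0.05 \\ 0.09 + 0.05 \end{pmatrix}
= \begin{pmatrix} 0.86 \\ 0.14 \end{pmatrix}
```

    So the model gives a probability of 0.86 for a sunny day 2.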

    2. Intelligent obstacle avoidance

    Today I spent half the day writing basic code for an obstacle avoidance algorithm, mainly a function to store all the distances of the ultrasonic sensors in an array and functions to represent all possible movements of the robot in the plane, taking the PWM duty cycle (maybe based on temperature, i.e. entropy, or remaining battery power) and other things into account. Now I have to think about how MARB could avoid obstacles in an intelligent way. I have some ideas; let's see how it goes...

    Let every sensor Si of the sensor array represent one bit of a 7-bit binary number S0 S1 S2 S3 S4 S5 S6. Let Si = 1 if the sensor detects an obstacle and 0 otherwise.


    Now we can build a decision tree for an obstacle avoidance strategy and assign a movement behavior of the robot to each of the 128 possible combinations, binary 0000000 to 1111111 or decimal 0 to 127. Further reading on a similar idea: Obstacle avoidance design for a humanoid intelligent robot with ultrasonic sensors.

    A first code snippet converts all obstacle occurrence possibilities into a decimal number between 0 and 127. I am sure this could be coded much more elegantly, but for now I want to focus on the functionality; I will work on the beauty later...
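    As the original snippet is not reproduced here, a minimal sketch of the idea follows, assuming seven HC-SR04 sensors on hypothetical trigger/echo pins and an assumed obstacle threshold of 30 cm:

```cpp
// Minimal sketch of the 7-bit obstacle encoding (assumed pin numbers and threshold).
const uint8_t NUM_SENSORS = 7;
const uint8_t TRIG_PIN[NUM_SENSORS] = {22, 24, 26, 28, 30, 32, 34};  // hypothetical
const uint8_t ECHO_PIN[NUM_SENSORS] = {23, 25, 27, 29, 31, 33, 35};  // hypothetical
const float OBSTACLE_CM = 30.0;                                      // assumed threshold

float distance_cm[NUM_SENSORS];

void setup() {
  Serial.begin(9600);
  for (uint8_t i = 0; i < NUM_SENSORS; i++) {
    pinMode(TRIG_PIN[i], OUTPUT);
    pinMode(ECHO_PIN[i], INPUT);
  }
}

// Store all distances of the ultrasonic sensors in an array.
void readDistances() {
  for (uint8_t i = 0; i < NUM_SENSORS; i++) {
    digitalWrite(TRIG_PIN[i], LOW);
    delayMicroseconds(2);
    digitalWrite(TRIG_PIN[i], HIGH);
    delayMicroseconds(10);
    digitalWrite(TRIG_PIN[i], LOW);
    // Echo pulse length in microseconds, divided by 58, gives roughly the distance in cm.
    unsigned long duration = pulseIn(ECHO_PIN[i], HIGH, 30000UL);
    distance_cm[i] = duration / 58.0;  // 0 means no echo within the timeout
  }
}

// Convert the obstacle pattern S6...S0 into a decimal number between 0 and 127.
uint8_t obstacleCode() {
  uint8_t code = 0;
  for (uint8_t i = 0; i < NUM_SENSORS; i++) {
    if (distance_cm[i] > 0 && distance_cm[i] < OBSTACLE_CM) {
      code |= (1 << i);  // sensor Si sets bit i
    }
  }
  return code;
}

void loop() {
  readDistances();
  Serial.println(obstacleCode());
  delay(100);
}
```

    The returned code can then serve as an index into a 128-entry table of movement behaviors.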


  • Vision module

    M. Bindhammer, 06/03/2018 at 12:42

    An APDS-9960 RGB and gesture sensor acts as low-level machine vision to recognize gestures, colors of objects and distances. The Samsung Galaxy S5 actually uses the same sensor. A bi-directional logic level converter is needed for 5 V microcontrollers, because the APDS-9960 works only with a VCC of 2.4-3.6 V.

    Wiring diagram:

    Mounting bracket design:

    I am using this bi-directional level shifter from Adafruit. Those breakouts usually have no mounting holes, so I added a cavity on the back of the mounting bracket, where the level shifter will be glued in place.

    A lot of work went today into the vision module. I glued the level shifter into place with 2-component epoxy adhesive, wired the level shifter and the APDS-9960 breakout and assembled the module on MARB's head.

    I also tested two APDS-9960 libraries, one from Sparkfun and one from Adafruit. The Sparkfun library is much better regarding gesture control. It senses every movement correctly at distances of up to 20 cm, while the Adafruit one fails very often even at much shorter distances.

    I will also try to detect the colors of close objects, using the k-nearest neighbors algorithm in an n-dimensional Euclidean space. More on that soon in the log Algorithm bits.
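    In the meantime, a minimal sketch of how such a classifier could look on the Arduino, assuming a small hand-labeled set of normalized RGB readings (the training values, class labels and k = 3 are made-up placeholders):

```cpp
// Minimal k-nearest-neighbors color classifier sketch (assumed training data, k = 3).
#include <math.h>

struct Sample {
  float r, g, b;    // normalized RGB reading from the APDS-9960
  uint8_t label;    // 0 = red, 1 = green, 2 = blue (placeholder classes)
};

// Hand-labeled training samples (made-up values for illustration).
const Sample TRAINING[] = {
  {0.80, 0.10, 0.10, 0}, {0.70, 0.15, 0.12, 0},
  {0.10, 0.75, 0.15, 1}, {0.12, 0.70, 0.20, 1},
  {0.10, 0.15, 0.80, 2}, {0.15, 0.10, 0.70, 2},
};
const uint8_t NUM_SAMPLES = sizeof(TRAINING) / sizeof(TRAINING[0]);
const uint8_t NUM_CLASSES = 3;
const uint8_t K = 3;

// Classify an RGB reading by majority vote among the K nearest training samples.
uint8_t classifyColor(float r, float g, float b) {
  float dist[NUM_SAMPLES];
  for (uint8_t i = 0; i < NUM_SAMPLES; i++) {
    float dr = r - TRAINING[i].r, dg = g - TRAINING[i].g, db = b - TRAINING[i].b;
    dist[i] = sqrt(dr * dr + dg * dg + db * db);  // Euclidean distance in RGB space
  }
  uint8_t votes[NUM_CLASSES] = {0};
  bool used[NUM_SAMPLES] = {false};
  for (uint8_t n = 0; n < K; n++) {
    int best = -1;
    for (uint8_t i = 0; i < NUM_SAMPLES; i++) {
      if (!used[i] && (best < 0 || dist[i] < dist[best])) best = i;
    }
    used[best] = true;
    votes[TRAINING[best].label]++;
  }
  uint8_t winner = 0;
  for (uint8_t c = 1; c < NUM_CLASSES; c++) {
    if (votes[c] > votes[winner]) winner = c;
  }
  return winner;
}

void setup() {
  Serial.begin(9600);
  Serial.println(classifyColor(0.11, 0.72, 0.17));  // should report class 1 (green)
}

void loop() {}
```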

  • Sensor module

    M. Bindhammer, 05/25/2018 at 14:31

    MARB will be equipped with a bunch of sensors to sense time, humidity, temperature, barometric pressure, ambient light, gas concentration, orientation and acceleration. So far I have tested a DHT11 based hygrometer, a TEMT6000 ambient light sensor and a MiCS-5524 based gas sensor.

    The gas sensor can easily detect an open bottle of ammonia solution.

    Testing a DS1302 RTC breakout from Sunfounder:

    After tinkering a while with many different sensors, I decided to use the following sensors for now:

    • DHT11 breakout humidity sensor 20-80% (5% accuracy)
    • BMP180 breakout barometric pressure sensor (300-1100 hPa, 0.03 hPa resolution)
    • Dallas 18B20 breakout temperature sensor (-55 to +125 °C)
    • TEMT6000 breakout light sensor (illuminance range 1-1000 lux, spectral response closely emulates the human eye)
    • DS3231 precision RTC breakout
    • MiCS5524 CO, alcohol and VOC gas sensor breakout
    • LSM303DLHC compass and accelerometer breakout

    Resulting sensor module design:


    3-D printed part in the mail:

    Sensors assembled:

  • Ultrasonic sensor module

    M. Bindhammer, 05/24/2018 at 09:57

    The ultrasonic module consists of 7 HC-SR04 distance sensors and a 3-D printed enclosure. The module will be mounted on the front of the robot base, a second one can be mounted on the backside if necessary. So far I finished the 3-D drawing of the enclosure.

    3-D printed part from Shapeways:

    7 HC-SR04 mounted:

    I also designed an extension shield for the two Arduino MEGAs to have better access to the remaining I/O and analog pins:

    Populated extension shields for the two Arduino MEGAs MARB has on board:

    Wired and tested ultrasonic sensor module:

  • Videos!

    M. Bindhammer, 05/21/2018 at 18:17

    All videos of MARB will be posted here!

    The first video shows nothing exciting, just very basic arm movements, emojis, and a TFT, SD card and bitmap test:


    Again nothing fancy, testing MARB's motor driver. Besides wiring the ultrasonic sensors, I need to connect the compass breakout to do some advanced obstacle avoidance. Stay tuned.


    MARB is doing a first experiment of decision tree based obstacle avoidance:


    MARB performing rudimentary vision and a corneal reflex:


    MARB performing color sensing and speech. Besides this, I tested I²C communication between two Arduinos.

  • Power supply

    M. Bindhammer, 05/11/2018 at 20:36

    Time for some considerations about the power supply of MARB. I am using a 9.6 V, 5000 mAh NiMH racing pack battery. Obsolete, you will think. Why not use a Li-Ion or Li-Po battery? I can tell you: NiMH batteries are much easier to handle than Li-Ion or Li-Po batteries. They forgive mistakes Li-Ion or Li-Po batteries won't. They may have less energy capacity and a memory effect, but they normally don't burn or explode when treated the wrong way.

    Battery mounted on the robot:


    3-D printed bracket for the ON/OFF switch:


    Another 3-D printed part for the power distribution and motor control module:

    Assembled and partly wired module. The LM2596-based voltage regulator powers the microcontrollers; the other step-down DC-DC converter powers the servos.

  • Human Machine Interface

    M. Bindhammer, 05/03/2018 at 20:40

    The main human machine interface will be a DFRobot 2.8" TFT touch shield on an Arduino MEGA (it works fine on the Arduino UNO and MEGA after setting the 3 tiny SMD switches on the back of the PCB and using the SPI, Adafruit GFX and Adafruit ILI9341 libraries) and a custom designed shift-register based keyboard PCB, which I still need to populate and test.

    Three parts to 3-D print: the TFT and Arduino MEGA bracket, the keyboard bracket and the TFT cover.
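    A minimal sketch of bringing the display up with the SPI, Adafruit GFX and Adafruit ILI9341 libraries; the chip-select and data/command pin numbers below are placeholders that depend on the shield's switch settings:

```cpp
// Minimal display test, assuming hypothetical chip-select and data/command pins.
#include <SPI.h>
#include <Adafruit_GFX.h>
#include <Adafruit_ILI9341.h>

#define TFT_CS 10   // placeholder, depends on the shield configuration
#define TFT_DC 9    // placeholder

Adafruit_ILI9341 tft = Adafruit_ILI9341(TFT_CS, TFT_DC);

void setup() {
  tft.begin();                       // initialize the ILI9341 controller over hardware SPI
  tft.fillScreen(ILI9341_BLACK);
  tft.setTextColor(ILI9341_WHITE);
  tft.setTextSize(2);
  tft.setCursor(10, 10);
  tft.println("MARB HMI test");
}

void loop() {}
```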

    Finally I populated the keyboard PCB and tested it:

    Printed parts:

    Touch screen module completed:

    Keyboard module completed:

    As for my previous robot project Murphy, I am using a 2 W, 8 Ω miniature loudspeaker, an EMIC-2 text-to-speech module, a TPA2005D1 breakout board and a custom designed carrier board:

    Populated:

    Wiring diagram:

    Important notes: the EMIC-2 does not work properly with Arduino hardware serial; use the SoftwareSerial library instead. The potentiometer is not really necessary, as the volume can be adjusted in software.
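    A minimal sketch of driving the EMIC-2 over SoftwareSerial; the RX/TX pin assignment is a placeholder, while the 9600 baud rate, the ':' ready prompt and the 'S' say-text command follow the EMIC-2 documentation:

```cpp
// Minimal EMIC-2 text-to-speech test over SoftwareSerial (assumed pin assignment).
#include <SoftwareSerial.h>

const uint8_t EMIC_RX = 10;  // Arduino pin connected to EMIC-2 SOUT (placeholder)
const uint8_t EMIC_TX = 11;  // Arduino pin connected to EMIC-2 SIN  (placeholder)

SoftwareSerial emic(EMIC_RX, EMIC_TX);

// Block until the EMIC-2 sends its ':' ready prompt.
void waitForEmic() {
  while (emic.read() != ':') {
    ;  // keep polling
  }
}

void setup() {
  emic.begin(9600);                     // EMIC-2 default baud rate
  emic.print('\n');                     // trigger the ready prompt after power-up
  waitForEmic();
  emic.print("SHello, I am MARB.\n");   // 'S' command: speak the following text
  waitForEmic();                        // returns once speaking has finished
}

void loop() {}
```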

    Speech module design:


    Again, Shapeways has done a great job printing this part:


    EMIC-2, TPA2005D1 breakout board and loudspeaker assembled and attached to the robot:


  • Propulsion system of the robot

    M. Bindhammer, 05/01/2018 at 15:01

    I will use these angular geared motors to propel the robot:

    I designed matching mounting plates to attach the motors to the 15 x 15 mm MakerBeam T-nut profiles and laser cut them from 3 mm stainless steel plates. The DXF file will be provided later in the file section.


    Steel plates attached to the motors and tested with a Pololu Dual VNH5019 Motor Driver Shield.
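    A minimal test along the lines of Pololu's demo sketch for that shield, simply ramping one motor up and down (the library accepts speeds from -400 to 400):

```cpp
// Minimal motor test with the Pololu DualVNH5019MotorShield library.
#include "DualVNH5019MotorShield.h"

DualVNH5019MotorShield md;

void setup() {
  md.init();  // configure the shield's pins
}

void loop() {
  // Ramp motor 1 up and back down again.
  for (int speed = 0; speed <= 400; speed += 10) {
    md.setM1Speed(speed);
    delay(20);
  }
  for (int speed = 400; speed >= 0; speed -= 10) {
    md.setM1Speed(speed);
    delay(20);
  }
}
```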

    Beefy aluminum wheel, CNC'd by a colleague in his workshop. Heavy, but this will help to keep the center of gravity close to the ground, because the robot is very tall. The tire is a 10 mm diameter O-ring. Next to it is an off-the-shelf wheel hub for an 8 mm motor axle. The bore hole in the center of the wheel is there to position wheel and hub perfectly before fixing them with six M5 screws.

    Motors attached to base.

    Wheels attached to the motors:


    To attach the caster wheels to the MakerBeam system, I CNC'd two mounting brackets:

    It is important that the caster wheels have a few millimeters of ground clearance, otherwise the robot cannot overcome even small obstacles like a door sill.



Discussions

ActualDragon wrote 05/04/2018 at 19:36

oh, right. XD i saw that, idk why it didn't register.


M. Bindhammer wrote 05/04/2018 at 18:34

Wheels later, see Propulsion system of the robot


ActualDragon wrote 05/03/2018 at 21:29

Love this project! Thanks for sharing. Any chance for legs/wheels/tracks?

