
Robotic Painting Arm (made from cardboard)

This is a robotic arm that will be able to paint pictures. It is made from cardboard, scraps, and servos (a quarantine project).

I am writing a program that uses GANs (which will most likely be processed in RunwayML, as my laptop doesn't have a suitable graphics card) and, separately, Markov chains (which I have written myself) to create unique images, which will then be painted by the robotic arm I have constructed out of cardboard. I have also wired in a MaxSonar EZ so the arm can use echolocation to determine depth.
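To give a flavor of the Markov chain half (the GAN half lives in RunwayML), here is a toy sketch of the idea in C++: a few palette colors as states, a hand-tuned transition matrix, and a walk across a pixel grid. This is not my actual generator, just a minimal illustration, and the weights are made up:

```cpp
// Toy Markov-chain image generator: each pixel's color depends on the
// previous pixel's color via a transition matrix. The heavy diagonal
// weights make runs of the same color, which gives blobby textures.
#include <cstdio>
#include <random>
#include <vector>

int main() {
    const int W = 32, H = 32;
    const int NUM_COLORS = 6;  // matches the six-color palette idea

    // transitions[i][j] = relative chance of moving from color i to j
    std::vector<std::vector<double>> transitions = {
        {6, 1, 1, 1, 1, 1}, {1, 6, 1, 1, 1, 1}, {1, 1, 6, 1, 1, 1},
        {1, 1, 1, 6, 1, 1}, {1, 1, 1, 1, 6, 1}, {1, 1, 1, 1, 1, 6},
    };

    std::mt19937 rng{std::random_device{}()};
    int state = 0;
    std::vector<int> pixels(W * H);
    for (int i = 0; i < W * H; ++i) {
        std::discrete_distribution<int> next(transitions[state].begin(),
                                             transitions[state].end());
        state = next(rng);
        pixels[i] = state;
    }

    // Dump as characters so you can eyeball the texture in a terminal.
    const char glyphs[NUM_COLORS + 1] = ".:-=+*";
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) putchar(glyphs[pixels[y * W + x]]);
        putchar('\n');
    }
    return 0;
}
```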

I am fairly new to tinkering, coding, and GitHub, but I am really giving it my all.

I would really like this to be an accessible project, so that anyone with an interest in robotics and machine learning can have their own AI artist, at low cost and with no specialized tools: no 3D printer, no laser cutter, not even a full wood shop. Of course, with those you could expand on this, but as it stands it really lends itself to anyone interested.

All the cardboard is hand-cut with a pen knife and hot-glued together, with some screws here and there. You will need a soldering iron, though you might even be able to get away with really long jumper wires. The most expensive part is the MaxSonar EZ1, which I've made optional.


In the pictures you can see the completed robotic arm; it is fairly large. The other pictures include an example of the computer vision I've coded for the program to determine which colors to paint and where (right now it is extremely simplified, using only six colors, for ease of coding; I will expand on this as I go). That's the image that looks 8-bit; it was a picture I took with my webcam (so it's actually a self-portrait!).
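For anyone curious what that color-reduction step looks like, here's a rough openFrameworks sketch of the nearest-palette-color idea. The palette values are placeholders, not my exact six colors:

```cpp
// Rough sketch of the "reduce a webcam frame to six colors" step in
// openFrameworks. Palette entries are placeholders.
#include "ofMain.h"
#include <limits>

ofColor nearestPaletteColor(const ofColor& c) {
    static const std::vector<ofColor> palette = {
        ofColor(255, 0, 0),   ofColor(255, 255, 0),   ofColor(0, 0, 255),
        ofColor(0, 128, 0),   ofColor(255, 255, 255), ofColor(0, 0, 0),
    };
    ofColor best = palette[0];
    float bestDist = std::numeric_limits<float>::max();
    for (const auto& p : palette) {
        // squared RGB distance is enough to pick the closest swatch
        float dr = float(c.r) - p.r, dg = float(c.g) - p.g, db = float(c.b) - p.b;
        float d = dr * dr + dg * dg + db * db;
        if (d < bestDist) { bestDist = d; best = p; }
    }
    return best;
}

void quantize(ofImage& img) {
    ofPixels& px = img.getPixels();
    for (int y = 0; y < (int)px.getHeight(); ++y)
        for (int x = 0; x < (int)px.getWidth(); ++x)
            px.setColor(x, y, nearestPaletteColor(px.getColor(x, y)));
    img.update();  // push the modified pixels back to the texture
}
```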

Another picture shows what the end results might look like: generated images produced by running the computer-vision image through the Coco GAN in RunwayML.

And of course there's the electronics diagram, made using Fritzing.

I really hope other people find this helpful and give it a try!

  • 3 × MG996R servo. These do the heavy lifting: turning the base and moving the arms.
  • 2 × KY66 Longruner servo. These do the precision work: moving the paintbrush.
  • 1 × Arduino Uno
  • 1 × MaxSonar EZ1. I haven't implemented this yet, but it will calculate the distance from the paintbrush to the canvas (a quick read-out sketch follows this list). It's probably not even necessary, to be honest.
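
If you do pick up the MaxSonar, here's a quick standalone Arduino sketch for sanity-checking it over its analog output before wiring it into anything (my project actually reads it through Firmata, and the pin choice here is an assumption):

```cpp
// Read an LV-MaxSonar-EZ1 on its analog (AN) output. The AN pin scales
// at roughly Vcc/512 per inch, so at 5 V each ADC count is about half
// an inch. Standalone test sketch, not part of the main project code.
const int SONAR_PIN = A0;  // assumption: sensor AN pin wired to A0

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(SONAR_PIN);  // 0..1023
  float inches = raw / 2.0;         // ~2 ADC counts per inch at 5 V
  Serial.print("distance: ");
  Serial.print(inches);
  Serial.println(" in");
  delay(100);
}
```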

  • Servo Control Issue

    Weff • 07/08/2020 at 08:14 • 0 comments

    I've modified the larger servos by adding analog feedback wires to the potentiometers inside them, so now they are more accurate. There are tons of tutorials on how to do this, or you can buy servos that already have a feedback wire. This means I will have to update my circuit diagram (which I will), but for now I'm just running these extra wires to A1 and A2. It also means I will update the code on my GitHub to use this feedback (another thing I will do). By the way, my GitHub account is Weef Teef (in my last log it was autocorrected and therefore misspelled).
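    Until that GitHub update lands, the openFrameworks side of reading those feedback wires looks roughly like this. The calibration numbers are placeholders you'd measure yourself, and it assumes an ofArduino that's already connected over Firmata (see the boilerplate sketched in the 06/25 log below):

```cpp
// Reading the modded servos' feedback pots (wired to A1/A2) through
// Firmata. Calibration constants are placeholders: record the raw
// readings at known angles and fit your own.
#include "ofMain.h"

void enableFeedback(ofArduino& ard) {
    ard.sendAnalogPinReporting(1, ARD_ANALOG);  // lower-arm servo pot on A1
    ard.sendAnalogPinReporting(2, ARD_ANALOG);  // upper-arm servo pot on A2
}

float feedbackAngle(ofArduino& ard, int analogPin) {
    int raw = ard.getAnalog(analogPin);  // 0..1023 from the internal pot
    // Placeholder calibration: raw 130 at 0 degrees, raw 540 at 180.
    return ofMap(raw, 130, 540, 0, 180, true);  // clamped to valid range
}
```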

  • Now, more follow-along-able!

    Weff • 06/30/2020 at 08:13 • 0 comments

    June 30

    I added a circuit diagram to the images associated with the project; hopefully with this and the code posted on GitHub the build can be easily replicated. (Sorry I didn't do this sooner! I probably should have done it first thing.)

    Second, I've gotten the arm to point to locations on the real-world canvas (RWC) by clicking inside a pseudo-canvas on the screen. The next step will be to tie this function to my image-recognition code. So far the RWC needs to be at a pre-determined distance for the kinematics to work out, but I do have the MaxSonar EZ1 readings being read in and displayed; these will be used to determine the distance in the future. (I was also thinking of writing a setup function that builds a kind of "heat map" of distances. The reasons I would do this: 1. it's totally cool and would make the project more versatile, and 2. I don't think it will take much extra coding beyond the original intent to get there.) The "hand" is also auto-leveling now, so wherever it reaches on the RWC the paintbrush stays level to the ground. One next step will be to add brush strokes tied to the image content once the brush arrives at the correct position.
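    For reference, the math behind turning a clicked (x, y) into joint angles for a two-link arm is the standard law-of-cosines inverse kinematics. This is the textbook version, not my exact code, and L1/L2 are placeholder link lengths:

```cpp
// Two-link inverse kinematics: given a target (x, y) in arm coordinates
// and link lengths L1/L2, find the shoulder and elbow angles.
#include <cmath>

struct JointAngles { float shoulder, elbow; bool reachable; };

JointAngles solveIK(float x, float y, float L1, float L2) {
    JointAngles out{0, 0, false};
    float d2 = x * x + y * y;  // squared distance to the target
    float d  = std::sqrt(d2);
    if (d > L1 + L2 || d < std::fabs(L1 - L2)) return out;  // out of reach

    // Elbow angle from the law of cosines (0 = arm fully straight).
    out.elbow = std::acos((d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2));

    // Shoulder = angle to target, minus the offset the bent elbow adds.
    out.shoulder = std::atan2(y, x) -
                   std::atan2(L2 * std::sin(out.elbow),
                              L1 + L2 * std::cos(out.elbow));
    out.reachable = true;
    return out;  // radians; convert to degrees before sending to servos
}
```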

    One note that seems important to mention: the servos aren't terribly accurate, specifically the MG996Rs. I'm looking into ways of reducing the margin of error (adding more power, plus a handful of other ideas I've found; I will report any failures/successes). The reason I'm not using stepper motors is... I didn't have any. If I have to break down and buy one, I'll of course mention it, but one of the tenets of this project is: make a robotic arm that anyone interested in robotics could build for as little cost as possible, and stepper motors tend to cost a bit more, especially at the size I would need. But! I also don't want a wildly inaccurate arm. To be continued!
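    One cheap software-side idea I'm considering (untested on this arm, so take it as a sketch only): ease each servo toward its target instead of commanding the jump all at once, so gear slop has less to grab onto. The pin is a placeholder, and it assumes the usual connected ofArduino:

```cpp
// Ease the commanded angle toward the target so the servo never slams
// between positions. Untested sketch; pin 9 is a placeholder.
float currentAngle = 90;  // track the last commanded angle

void easeServoToward(ofArduino& ard, float targetAngle) {
    const float easing = 0.15f;  // 0..1, smaller = smoother but slower
    currentAngle += (targetAngle - currentAngle) * easing;
    ard.sendServo(9, (int)ofClamp(currentAngle, 0, 180));
}
```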

    Lastly, I will likely post two sets of code for this project: a simplified version for manually moving the arm (a good place to start anyway, for calibration, testing, and fun) and the more fully developed code that includes the art-making processes. The former should be up within a matter of days, and the latter perhaps by the end of next month (if I have a really productive month).

  • (some) Code is now on GitHub

    Weff • 06/25/2020 at 09:02 • 1 comment

    Just a quick update:

    I've added the code I'm using to control the arm (as well as the calibration displays) to my GitHub: Week Teef. It still needs loads of work, as I'm still figuring things out, but it DOES control the arm with (reasonable) accuracy. I don't know how user-friendly the code is yet; I'm kind of flying by the seat of my pants. I will go back later and make sure the code makes sense, with notes everywhere and full instructions. But! If anyone has questions before then, I'd be more than happy to answer them.

    Other info: it's written in openFrameworks, with the StandardFirmata sketch loaded onto my Arduino Uno.
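    For anyone who hasn't wired openFrameworks to Firmata before, the connection boilerplate looks roughly like this. The serial port name and servo pin are placeholders (check what your Uno enumerates as):

```cpp
// Minimal openFrameworks + StandardFirmata boilerplate for driving one
// servo on an Uno. Port name and pin are placeholders.
#include "ofMain.h"

class ofApp : public ofBaseApp {
public:
    ofArduino ard;
    bool ready = false;

    void setup() override {
        ard.connect("/dev/ttyACM0", 57600);  // e.g. "COM3" on Windows
        ofAddListener(ard.EInitialized, this, &ofApp::onArduinoReady);
    }

    void onArduinoReady(const int& version) {
        ard.sendDigitalPinMode(9, ARD_SERVO);  // attach a servo on pin 9
        ready = true;
    }

    void update() override {
        ard.update();                     // keep Firmata messages flowing
        if (ready) ard.sendServo(9, 90);  // hold mid-position as a test
    }
};
```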

    Feel free to let me know if I'm forgetting or missing anything.

  • First update

    Weff • 06/20/2020 at 19:20 • 0 comments

    June 20:

    Disclaimer: as I am new to open-source projects, GitHub, and coding, there is a chance that I will forget to mention critical things and go into too much detail about useless stuff. Hopefully you will bear with me while I make adjustments!

    So! I've been working on this for a few weeks, and so far I've gotten the robotic arm running from openFrameworks, using mouseX and mouseY to manipulate the arm (see the sketch below). I'm not sure if anyone would even be interested in that code (it's perhaps nothing special, but not entirely useless!), but I will try to post it to my GitHub for people to use. I'm also not sure whether it would be helpful to show exactly how I made the arm out of cardboard, but of course I'm willing to do so if it is.
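    That mouse control boils down to two ofMap calls per frame, something like this (pin numbers are placeholders, and it assumes the Firmata setup from the log above):

```cpp
// Map screen coordinates to servo angles each frame. Assumes the
// connected ofArduino (ard) and ready flag from the Firmata boilerplate.
void ofApp::update() {
    ard.update();
    if (!ready) return;
    int lower = ofMap(mouseX, 0, ofGetWidth(),  0, 180, true);
    int upper = ofMap(mouseY, 0, ofGetHeight(), 0, 180, true);
    ard.sendServo(9,  lower);   // lower-arm servo (placeholder pin)
    ard.sendServo(10, upper);   // upper-arm servo (placeholder pin)
}
```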

    Moving on: I've also managed to get the openCV library to take an image from my webcam and translate it into one of six colors plus a default color. In the near future I'm going to use OSC messages to connect automatically to RunwayML for the GAN alterations (again, I'm using Runway because my laptop isn't suited to that).
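    Runway's exact OSC address space depends on the model you have running, so treat the host, port, and address below as placeholders; the ofxOsc plumbing itself is just this:

```cpp
// Send a message toward RunwayML with the ofxOsc addon. Host, port,
// and address are placeholders: Runway shows the real values for the
// model you're running.
#include "ofxOsc.h"

ofxOscSender runway;

void setupOsc() {
    runway.setup("localhost", 57100);  // assumed Runway OSC input port
}

void sendFrameToRunway(const std::string& imagePath) {
    ofxOscMessage m;
    m.setAddress("/runway/input");  // placeholder address
    m.addStringArg(imagePath);
    runway.sendMessage(m, false);   // false = don't wrap in a bundle
}
```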

    I had been using 4 AA batteries to run all five servos (KY66 ×2 and MG996R ×3), but I don't think that's going to work long-term, so now I've got a rechargeable 6 V battery pack.



Discussions

Mike Szczys wrote 06/22/2020 at 19:11 point

Great first prototype. It works remarkably well for something made from cardboard. What are your plans for a controller for this?


Weff wrote 06/23/2020 at 08:40 point

First, thank you for the like and follow! Btw, you've got some great projects; I'm especially interested in reverse-engineering stuff like your Custom Display for Elliptical Trainer.

I'm surprised how durable cardboard can be. I will say I inserted some wooden coffee stirrers into the cardboard where I thought it might need extra support (not many though! A total of 4).

Right now the arm is controlled by moving a mouse hooked up to my laptop, in coordination with the left and right buttons. Moving the cursor in x,y moves the lower and upper arms; then, while a button is held, the x,y positions move the 'hand' and brush while the rest stays static.

Ultimately, the plan is to generate a JPEG image and have the arm paint that picture onto a canvas, controlled by openFrameworks in conjunction with Firmata. I'm including the echolocation readings so the whole thing can be dropped in front of any surface, self-calibrate, and then paint away.

That being said, since I started the project I've noticed people using Bluetooth PlayStation controllers for their projects. Once I've finished my initial goal, I'd like to see if I could do something similar.

**I've been watching tutorials on how to add things to GitHub, and will attempt to upload the mouse-control code I've written so far. Then all anyone will need is 5 servos and some cardboard!

