
UV Sanitizing Autonomous Robot

A cost-effective robotic solution for surface sanitization in the home

Ultraviolet germicidal irradiation (UVGI) is a disinfection method that uses short-wavelength ultraviolet C light to kill or inactivate microorganisms by destroying nucleic acids and disrupting their DNA, leaving them unable to perform vital cellular functions. UVGI is used in a variety of applications, such as food, air, and water purification, and to treat biologically contaminated surfaces. Recent studies have shown that short-wave UV radiation is capable of eliminating the COVID-19 virus at the hospital level, improving the cleanliness of intensive care areas, general medicine rooms, and individual rooms. COVID-19 continues to spread around the world, with around 246 million confirmed cases and more than 5 million deaths across almost 200 countries. Health is a priority today, and this innovative idea helps us thrive in the new normal: it is cost-effective and useful for sanitizing the whole home, or individual objects, using artificial intelligence and voice commands.

A video summarizing the journey of this project:

In Motion Version, specific goals:

  • 3D printing the parts that will be used to assemble the Autonomous Robot;
  • Mounting the chassis: "4WD Robot Car";
  • Connecting the "Alexa Echo Dot" to the ESP32-WROOM-32 board to transmit voice commands to the robot;
  • Calculating neural networks with Python, to be used on the Arduino board to control the robot;
  • Using a PID controller to control the speed and turning of the robot; and
  • Using a Tesla coil to light a UV lamp.

Not in Motion Version, specific goals:

  • Developing a reflector to concentrate the energy of the UV lamp on an object, such as a backless stool;
  • Making the Haar Cascade Classifier of a backless stool with OpenCV and Python; and
  • Detecting and aiming the reflector at the backless stool using the Cascade Classifier and the Raspberry Pi board with its camera.

opencv_cascade_classifier.zip

Training of the OpenCV cascade classifier used in the second version of this project.

x-zip-compressed - 9.96 MB - 10/11/2021 at 23:26


stl_files.zip

All STL files used in both versions of this project

x-zip-compressed - 107.96 kB - 10/11/2021 at 23:22


schematic_diagrams.zip

All schematic diagrams used in both versions of the UV Sanitizing Autonomous Robot

x-zip-compressed - 1.14 MB - 10/11/2021 at 23:35


  • 1 × Amazon Alexa Echo Dot
  • 1 × Raspberry Pi 3B+
  • 1 × Arduino UNO
  • 1 × ESP32-WROOM-32
  • 1 × Arduino Pro Mini

View all 17 components

  • Step 11. Project Report Updated

    Guillermo Perez Guillen • 10/12/2021 at 03:49 • 0 comments

    Next, I show you the project wrap-up and lessons learned:

    --> Introduction

    • Here I present the problem of infected surfaces in the home workplace; in particular I am talking about COVID-19 and similarly dangerous viruses.
    • The pandemic doesn't allow us to move freely, and this solution has the advantage that the robot can move autonomously while obeying voice commands. This robot can also help a person with a disability to disinfect their tools, work areas, food, packages sent by courier, etc.
    • Finally, I show you a price comparison of my solution with products that already exist on the market.

    --> Printing Custom Parts

    • In the first version of this project I show you the parts that need to be printed with a 3D printer. These files are attached to my GitHub code repository: uv-sanitizing-autonomous-robot

    --> Assembling 4WD Robot Car Chassis

    • Next, I show you with images how to assemble the printed parts, the sensors, and the programming boards on the "4WD Robot Car" chassis.

    --> ECHO DOT & ESP32-WROOM-32, part 1

    • In this chapter, I show you the technical specifications of the "Alexa Echo Dot" device and the ESP32-WROOM-32 board, how this project works, and the installation of the FauxmoESP library as a prerequisite.

    --> ECHO DOT & ESP32-WROOM-32, part 2

    • Here you will find the schematic diagram, the code that is uploaded to the ESP32-WROOM-32 board, and how to configure the Android application to connect the "Alexa Echo Dot" device with this board.

    --> Neural Networks

    • Here I explain how to design the neural network, how to improve the neural network, and how to get the coefficients of the neural network. The programming language used is Python.

    --> PID Controller

    • Now it is time to add a PID controller so the autonomous robot can control its speed. For example, the robot avoids hitting an object on its left side by increasing the speed of its left wheels and reducing the speed of its right wheels.

    --> Tesla Coil & UV Lamp

    • Here I explain what a Tesla coil is, and what a UV lamp is. I also show you the schematic diagram and code of a UV meter with Arduino (optional).

    --> Test and Conclusion

    • Finally, here I run tests with the prototype and draw conclusions about its performance. I also describe the difficulties and possible solutions.

    --> All Hardware and Software Updated: UV Reflector & OpenCV  

    • I've worked on a new version of my project, and I will show you its progress. Here I've added a reflector to concentrate the UV light, and object recognition through OpenCV and the camera of my Raspberry Pi. The robot stays in place; it only moves and aims the reflector at the desired object.

    --> Challenges

    • I'm satisfied, but there are still things to do, e.g.: add two or three UV lamps and an upper platform to hold them, add more batteries, add more voice commands, add more object recognition, and try TensorFlow, machine learning, etc.

  • Step 10. All Hardware and Software Updated: UV Reflector & OpenCV

    Guillermo Perez Guillen • 10/11/2021 at 17:34 • 0 comments

    This is a nice project and I have updated it. Below you can see the specific goals:

    • Developing a reflector to concentrate the energy of the UV lamp on an object: a backless stool;
    • Making the Cascade Classifier of a backless stool or another object; and
    • Using OpenCV on the Autonomous Robot to locate the position of the backless stool and aim the reflector at this object.

    Notes: 

    • It's important to clarify that in this new version the autonomous robot doesn't drive around; it only aims the UV radiation at the desired target when we give it the voice command.
    • As in the first version of this project, the estimated price of the hardware components is approx. $369 USD.

    UV Reflector

    We will print several parts that will be used to assemble the UV Reflector on the "4WD Robot Car" chassis. In the figure below I show you the UV Reflector holder - part 1.

    And the UV Reflector holder - part 2.

    Below I show you the piece that helped me make the UV reflector (you must print 4 pieces and cover them with aluminum foil).

    Below, I show you all the assembled parts. Note: Fix the servo as shown in the picture.

    Finally, below I show you how to mount the UV reflector on the chassis of the autonomous robot.

    Close-up of the image where we can see the Arduino Pro Mini board

    Mounting the Raspberry Pi camera.

    A view from the opposite side, where we can see the Raspberry Pi board.

    OpenCV

    A nice tutorial for installing OpenCV on my Raspberry Pi is: https://pimylifeup.com/raspberry-pi-opencv/
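
    After following the tutorial, a quick check confirms the installation worked (a minimal sketch, assuming OpenCV was installed for Python 3):

    # Print the installed OpenCV version; an ImportError means the install failed
    import cv2
    print(cv2.__version__)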

    The steps to make the classifier are shown below:

    • A --> Collecting Image Database
    • B --> Arranging Negative Images
    • C --> Crop & Mark Positive Images
    • D --> Creating a vector of positive images
    • E --> Haar-Training
    • F --> Creating the XML File
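
    Once the XML file has been created (step F), a minimal sketch like the one below can sanity-check it before using it on the robot. Here backless_stool.xml matches the detection script later in this log, and test.jpg is a hypothetical photo of the stool:

    # Load the trained cascade and run it on a single test image
    import cv2

    cascade = cv2.CascadeClassifier('backless_stool.xml')
    image = cv2.imread('test.jpg')                      # hypothetical test photo
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    found = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    print('detections:', len(found))                    # expect >= 1 on a positive image
    for (x, y, w, h) in found:
        cv2.rectangle(image, (x, y), (x + w, y + h), (255, 0, 0), 2)
    cv2.imwrite('result.jpg', image)                    # inspect the boxes offline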


    Schematic Diagram

    Now, we must assemble our schematic diagram as shown in the figure below.

    I also made my own cable for connections between Raspberry Pi, Arduino board and the battery:

    Codes:

    On my Raspberry Pi board, I run the following code: uv_autonomous_robot.py

    # AUTHOR: GUILLERMO PEREZ GUILLEN
    # import the necessary packages
    from picamera.array import PiRGBArray
    from picamera import PiCamera
    import time
    import cv2
    import serial
    import struct
    
    a = 0
    b = 0
    x1 = 0
    y1 = 0
    ser = serial.Serial('/dev/ttyUSB0', 9600)
    
    # Load a cascade file for detecting the backless stool
    backless_stool_cascade = cv2.CascadeClassifier('backless_stool.xml')
    
    # initialize the camera and grab a reference to the raw camera capture
    camera = PiCamera()
    camera.resolution = (640, 480)  # assumed frame size; the x1/y1 scaling below matches it
    rawCapture = PiRGBArray(camera, size=(640, 480))
    
    # allow the camera to warmup
    time.sleep(0.1)
    count = 0
    
    # capture frames from the camera
    for frame in camera.capture_continuous(rawCapture, format="bgr", use_video_port=True):
            image = frame.array
            gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
            backless_stool = backless_stool_cascade.detectMultiScale(gray, 1.3, 5)
            for (x,y,w,h) in backless_stool:
                    a = int((2*x+w)/2)  # horizontal center of the detection
                    b = int((2*y+h)/2)  # vertical center of the detection
                    x1 = int(a/3.66)    # scale pixel position toward the servo range
                    y1 = int(b/2.55)
                    ser.write(struct.pack('>BB', x1, y1))  # two bytes to the Arduino
                    cv2.rectangle(image, (x,y), (x+w,y+h), (255,0,0), 2)
                    count += 1
    
            # show the frame and read the keyboard
            cv2.imshow("Frame", image)
            key = cv2.waitKey(1) & 0xFF
    
            # clear the stream in preparation for the next frame
            rawCapture.truncate(0)
    
            # if the `q` key was pressed, break from the loop
            if key == ord("q"):
                    break
    

    This code finds the horizontal and vertical position of the center of the first detected object (the backless stool). Then I send the data through the serial port (ttyUSB0) to the Arduino board.
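
    As a quick worked example of the scaling (assuming the 640×480 frame size set above): a detection centered at a = 640 gives x1 = int(640/3.66) = 174, and b = 480 gives y1 = int(480/2.55) = 188, so each value fits in one of the two unsigned bytes the script packs:

    import struct
    
    # The same two-byte, big-endian framing the Pi sends and the Arduino reads back
    payload = struct.pack('>BB', 174, 188)
    print(payload)  # b'\xae\xbc'

    On my Arduino Pro Mini board, I must load the next code: arduino_pro_mini.ino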

    // AUTHOR: GUILLERMO PEREZ GUILLEN
    
    #include <Servo.h>
    int data_x = 0;
    int data_y = 0;
    
    Servo myservo_x;
    Servo myservo_y;// create servo object to control a servo
    
    void setup() {
      Serial.begin(9600);
      myservo_x.attach(9);  // attaches the servo on pin 9 to the servo object
      myservo_y.attach(10);
      myservo_x.write(900);  // Servo.write() treats values >= 544 as microseconds
      myservo_y.write(...
    Read more »

  • Step 9. Test and Conclusion

    Guillermo Perez Guillen • 08/25/2021 at 19:02 • 0 comments

    You can see the tests with the robot in the video below:

    Conclusion:

    At the end of this project, I can say that I achieved all my goals, although it was not easy to get everything working:

    • I had to connect the ultrasonic sensors to the ESP32-WROOM-32 board, because the Arduino board couldn't handle everything on its own and it would have caused trouble;
    • I made several attempts to achieve a stable neural network; I even removed 14 of the 64 possible input combinations from the table in section five, since these combinations were unlikely to occur, for example when all inputs are 1;
    • I had to reduce the speed of the gearmotors experimentally so that the robot had time to predict the best decision, but I couldn't reduce it too much because the gearmotors stall at very low speed;
    • I had to find the right distance for the mini Tesla coil to light the UV lamp; I also had to move the Tesla coil away from the programming boards so that it wouldn't induce voltages in them;
    • I had to use two batteries: the first powers the programming boards and sensors, and the second powers the L298N driver, gearmotors, and Tesla coil.
    • This is a nice prototype that can be upgraded to new versions.

  • Step 8. Tesla Coil & UV Lamp

    Guillermo Perez Guillen08/25/2021 at 19:00 0 comments

    Tesla Coil

    A Tesla coil is an electrical resonant transformer circuit designed by inventor Nikola Tesla in 1891. It is used to produce high-voltage, low-current, high-frequency alternating-current electricity. Tesla experimented with a number of different configurations consisting of two, or sometimes three, coupled resonant electric circuits. Tesla used these circuits to conduct innovative experiments in electrical lighting, phosphorescence, X-ray generation, high-frequency alternating current phenomena, electrotherapy, and the transmission of electrical energy without wires. Reference: https://en.wikipedia.org/wiki/Tesla_coil

                                   Tesla coil close-up

    In this project I'm using this principle to transmit electrical energy to the UV lamp by means of a mini Tesla coil. Thanks to this great invention I have the following advantages:

    • I have saved money on the purchase of a ballast, and an AC converter;
    • The robot is less heavy and less big;
    • I'm not using UV LEDs, which have very low power, and I'm not simulating UV radiation. This is real.

    Where can I get this device? Example: https://www.elecrow.com/mini-diy-tesla-coil-kit.html

                                   Mini DIY Tesla Coil

    UV Lamp

    I'm using a UV lamp of the kind that helps detect the security marks and watermarks included in bills and important documents. This lamp has a power of 6 watts and a lifetime of approximately 8000 hours.

                                   UV lamp

    Assembling Tesla coil and UV lamp, recommendations:

    • Mount the Tesla coil and UV lamp on the back of the autonomous robot.
    • My UV lamp was turned on at a maximum distance of 2 cm from the Tesla coil, so I set them at a distance of 1 cm to ensure the lighting of the UV lamp. You can try something similar.

    UV Meter

    The World Health Organization publishes a practical guide on the UV index in which it explains the health risks of ultraviolet radiation and proposes some protective measures depending on their intensity.

                                   UV index

    This is optional: to measure UV radiation I've built a device using the UVM30A sensor. I show you the electrical diagram in the figure below:

                                   UV Meter

    Code: uv-meter.ino

    //AUTHOR: GUILLERMO PEREZ GUILLEN
    
    #include <Adafruit_GFX.h>     // required by MCUFRIEND_kbv
    #include <MCUFRIEND_kbv.h>    
    #include <TouchScreen.h>
    
    MCUFRIEND_kbv tft;            // TFT display driver object
    
    int16_t BOXSIZE;
    uint16_t ID, currentcolor;
    uint8_t Orientation = 0;    //PORTRAIT
    String UVIndex = "0";
    String Index = " ";  
    
    // Assign human-readable names to some common 16-bit color values:
    #define BLACK   0x0000
    #define CYAN    0x07FF
    #define MAGENTA 0xF81F
    #define YELLOW  0xFFE0
    #define WHITE   0xFFFF
    
    void setup()
    {
      while (!Serial);
      Serial.begin(57600);
    
      tft.reset();
      ID = tft.readID();
      tft.begin(ID);
      tft.setRotation(Orientation);
      tft.fillScreen(BLACK);
    }
     
    void loop()
    {
      float sensorVoltage;
      float sensorValue;
    
      sensorValue = analogRead(A0);                         // UVM30A signal pin (assumed A0)
      sensorVoltage = (sensorValue * (5.0 / 1023.0))*1000;  // voltage in millivolts
    
    ////////////////////////// UV Index
    
      if(sensorVoltage<50.0)
      {
        UVIndex = "0";
        Index = "LOW";
      }
      else if (sensorVoltage>=50.0 && sensorVoltage<227.0)
      {
        UVIndex = "0";
        Index = "LOW";
      }
      else if (sensorVoltage>=227 && sensorVoltage<318)
      {
        UVIndex = "1";
        Index = "LOW";    
      }
      else if (sensorVoltage>=318 && sensorVoltage<408)
      {
        UVIndex = "2";
        Index = "LOW";    
      }else if (sensorVoltage>=408 && sensorVoltage<503)
      {
        UVIndex = "3";
        Index = "MEDIUM";    
      }
      else if (sensorVoltage>=503 && sensorVoltage<606)
      {
        UVIndex = "4";
        Index = "MEDIUM";    
    ...
    Read more »

  • Step 7. PID Controller

    Guillermo Perez Guillen • 08/25/2021 at 18:52 • 0 comments

    A proportional–integral–derivative controller (PID controller) is a control loop mechanism employing feedback that is widely used in industrial control systems and a variety of other applications requiring continuously modulated control. A PID controller continuously calculates an error value e(t) as the difference between a desired set point (SP) and a measured process variable (PV) and applies a correction based on proportional, integral, and derivative terms (denoted P, I, and D respectively). https://en.wikipedia.org/wiki/PID_controller

    In my case I used "PID Example By Lowell Cady" to simulate the behavior of the PID controller; in the figure below you can see the graph, which shows stable behavior as time goes by: https://www.codeproject.com/Articles/36459/PID-process-control-a-Cruise-Control-example

                                   PID controller simulation
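
    The same behavior can be reproduced in a few lines of Python. Here is a minimal sketch of a PID loop driving a simple first-order process toward a set point (the gains and the plant are hypothetical illustrations, not the robot's actual tuning):

    # PID loop on a toy first-order plant: the output converges to the set point
    kp, ki, kd = 0.8, 0.2, 0.1      # hypothetical gains
    sp, pv = 100.0, 0.0             # set point and process variable
    dt = 0.1                        # time step in seconds
    integral = 0.0
    prev_error = sp - pv
    
    for step in range(200):
        error = sp - pv
        integral += error * dt
        derivative = (error - prev_error) / dt
        output = kp * error + ki * integral + kd * derivative
        prev_error = error
        pv += (output - pv) * dt    # crude first-order plant response
        if step % 40 == 0:
            print(f"t={step * dt:4.1f}s  pv={pv:7.2f}")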

    The Autonomous Robot is equipped with 3 analog infrared sensors, which measure the distance to the walls: one in front and two on the left and right sides. To calibrate the distances of the GP2Y0A41SK0F and GP2Y0A51SK0F infrared sensors, you can see the post and my code below: https://www.instructables.com/id/How-to-Use-the-Sharp-IR-Sensor-GP2Y0A41SK0F-Arduin/

    float ir_sensor_left =  6*pow(volts0, -1); // worked out from datasheet graph // GP2Y0A51SK0F - 2 to 15 cm
    float ir_sensor_center = 12.4*pow(volts1, -1); // worked out from datasheet graph // GP2Y0A41SK0F - 4 to 30 cm
    float ir_sensor_right = 5.2*pow(volts2, -1); // worked out from datasheet graph // GP2Y0A51SK0F - 2 to 15 cm

    The autonomous robot is also equipped with 2 ultrasound sensors: 1) the HC-SR04 on the left side, oriented at 45 degrees; and 2) the SRF05 on the right side, oriented at 45 degrees. We then use the two GP2Y0A51SK0F sensors to control the speed of the Autonomous Robot. The robot uses the PID controller to maintain a central distance between the left and right walls. If the robot is near the left wall, it can decrease the speed of the right motors and increase the speed of the left motors, making the robot move to the right and away from the left wall, and vice versa.

                                   PID Controller

    The speeds d0 of the left motors and d1 of the right motors are calculated with the following code:

    dif = analogRead(A3) - analogRead(A0);    // PID CONTROLLER
    error = floor(Kp*(dif)+Kd*(difAnt-dif));    // PID CONTROLLER
    difAnt=dif;    // PID CONTROLLER
    int d0 = constrain(150 - error, 0, 150);//left speed - PID CONTROLLER
    int d1 = constrain(150 + error, 0, 150);//right speed - PID CONTROLLER

    However, the robot's movement could still be unstable due to small timing errors, so we added a derivative correction term to make the movement smoother; the line difAnt = dif; stores the previous error for that purpose. The speeds are then applied as PWM signals to the two pairs of gearmotors:

    analogWrite(ENA, d0);
    analogWrite(ENB, d1);
    digitalWrite(IN1, out1 * HIGH);
    digitalWrite(IN2, out2 * HIGH);
    digitalWrite(IN3, out3 * HIGH);
    digitalWrite(IN4, out4 * HIGH);
    delay(20);

    PWM signal 

  • Step 6. Neural Networks

    Guillermo Perez Guillen • 08/25/2021 at 18:41 • 0 comments

    In this project we create a neural network with Python and copy its weights to a forward-propagation network on the Arduino UNO board, which allows the Autonomous Robot to drive on its own without hitting the walls.

    For this exercise the neural network has 4 outputs: two for each motor pair, since we connect 2 digital outputs of the board to the L298N driver for each pair of car motors (the two motors on the left are electrically linked, and likewise the two on the right). The outputs will be between 0 and 1 (depolarize or polarize the motor).

                                  Neural Networks

    We will have seven inputs:

    • The first input is the activation of the Autonomous Robot that we saw in section 4 (second case of Alexa's voice commands);
    • The next five inputs correspond to the infrared and ultrasound sensors; and
    • The seventh input is the BIAS. All input values are 0 or 1.

    The inputs are assigned with the following logic:

    • The GP2Y0A51SK0F IR sensors on the left and right sides will have a value of 1 if the distance is less than 15 cm, and will have a value of 0 if the distance is greater than 15 cm;
    • The GP2Y0A41SK0F IR center sensor will have a value of 1 if the distance is less than 30 cm, and will have a value of 0 if the distance is greater than 30 cm;
    • Likewise, the HC-SR04 and SRF05 ultrasound sensors will have a value of 1 if the distance is less than 30 cm, and a value of 0 if the distance is greater than 30 cm; and
    • The BIAS will have a value of 1.

    Here we see the changes in the table below:

                              Inputs, Outputs and Actions of the Gearmotors
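
    As a small illustration of this encoding, the sketch below (a hypothetical helper, not part of the robot's code) turns raw distances in cm into the 7-element input vector described above:

    # Encode the sensor distances (cm) into the 7 binary network inputs
    def encode_inputs(active, left_ir, center_ir, right_ir, left_us, right_us):
        return [
            1 if active else 0,          # activation from Alexa voice command
            1 if left_ir < 15 else 0,    # GP2Y0A51SK0F, left
            1 if center_ir < 30 else 0,  # GP2Y0A41SK0F, center
            1 if right_ir < 15 else 0,   # GP2Y0A51SK0F, right
            1 if left_us < 30 else 0,    # HC-SR04, left at 45 degrees
            1 if right_us < 30 else 0,   # SRF05, right at 45 degrees
            1,                           # BIAS
        ]
    
    print(encode_inputs(True, 10.0, 40.0, 20.0, 25.0, 35.0))  # [1, 1, 0, 0, 1, 0, 1]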

    To create our neural network, we will use this code developed with Python 3.7.3: neural-network.py

    import numpy as np
    
    # Activation functions and their derivatives (referenced by the class below)
    def sigmoid(x):
        return 1.0/(1.0 + np.exp(-x))
    
    def sigmoid_derivada(x):
        return sigmoid(x)*(1.0 - sigmoid(x))
    
    def tanh(x):
        return np.tanh(x)
    
    def tanh_derivada(x):
        return 1.0 - x**2
    
    # We create the class
    class NeuralNetwork:
    
        def __init__(self, layers, activation='tanh'):
            if activation == 'sigmoid':
                self.activation = sigmoid
                self.activation_prime = sigmoid_derivada
            elif activation == 'tanh':
                self.activation = tanh
                self.activation_prime = tanh_derivada
    
            self.weights = []
            self.deltas = []
    
            # Assign random values to input layer and hidden layer
            for i in range(1, len(layers) - 1):
                r = 2*np.random.random((layers[i-1] + 1, layers[i] + 1)) - 1
                self.weights.append(r)
            # Assign random values to the output layer
            r = 2*np.random.random((layers[i] + 1, layers[i+1])) - 1
            self.weights.append(r)
    
        def fit(self, X, y, learning_rate=0.2, epochs=100000):
            # I add column of ones to the X inputs. With this we add the Bias unit to the input layer
            ones = np.atleast_2d(np.ones(X.shape[0]))
            X = np.concatenate((ones.T, X), axis=1)
            
            for k in range(epochs):
                i = np.random.randint(X.shape[0])
                a = [X[i]]
                for l in range(len(self.weights)):
                        dot_value = np.dot(a[l], self.weights[l])
                        activation = self.activation(dot_value)
                        a.append(activation)
                #Calculate the difference in the output layer and the value obtained
                error = y[i] - a[-1]
                deltas = [error * self.activation_prime(a[-1])]
                
                # We start in the second layer until the last one (A layer before the output one)
                for l in range(len(a) - 2, 0, -1): 
                    deltas.append(deltas[-1].dot(self.weights[l].T)*self.activation_prime(a[l]))
                self.deltas.append(deltas)
    
                # Reverse
                deltas.reverse()
    
                # Backpropagation
                # 1. Multiply the output delta with the input activations to obtain the weight gradient.             
                # 2. Updated the weight by subtracting a percentage of the gradient
                for i in range(len(self.weights)):
                    layer = np.atleast_2d(a[i])
                    delta = np.atleast_2d(deltas[i])
                    self.weights[i] += learning_rate * layer.T.dot(delta)
    
        def predict(self, x): 
            ones = np.atleast_2d(np.ones(x.shape[0]))
            a = np.concatenate((np.ones(1).T, np.array(x)), axis=0)
            for l in range(0, len(self.weights)):
                a = self.activation(np.dot(a, self.weights[l]))
            return a
    
        def print_weights(self):
            print("LIST...
    Read more »

  • Step 5. ECHO DOT & ESP32-WROOM-32, part 2

    Guillermo Perez Guillen • 08/25/2021 at 18:22 • 0 comments

    According to our schematic diagram, we make the connections of our ESP32-WROOM-32 device.

    Code: esp32-wroom-32.ino

    // AUTHOR: GUILLERMO PEREZ GUILLEN
    
    #include <Arduino.h>
    #include <NewPing.h> // SRF04
    #include <fauxmoESP.h> // Belkin Wemo emulation for Alexa (used below)
    #define ultrasonic_pin_1 4 // SRF04
    #define ultrasonic_pin_2 25 // SRF05
    
    const int UltrasonicPin = 2; // SRF04 
    const int MaxDistance = 200; // SRF04
    
    const unsigned int TRIG_PIN=27; //SRF05
    const unsigned int ECHO_PIN=26; //SRF05
    
    
    NewPing sonar(UltrasonicPin, UltrasonicPin, MaxDistance); // SRF04
    
    #ifdef ESP32
      #include <WiFi.h>
      #define RF_RECEIVER 13
      #define RELAY_PIN_1 12
      #define RELAY_PIN_2 14
    #else
      #include <ESP8266WiFi.h>
      #define RF_RECEIVER 5
      #define RELAY_PIN_1 4
      #define RELAY_PIN_2 14
    #endif
    
    #include <RCSwitch.h>
    
    #define SERIAL_BAUDRATE 115200
    
    #define WIFI_SSID "XXXXXXXXXX"
    #define WIFI_PASS "XXXXXXXXXX"
    
    #define LAMP_1 "lamp"
    #define LAMP_2 "car"
    
    fauxmoESP fauxmo;
    
    RCSwitch mySwitch = RCSwitch();
    
    // Wi-Fi Connection
    void wifiSetup() {
    
      // Connect
      Serial.printf("[WIFI] Connecting to %s ", WIFI_SSID);
      WiFi.begin(WIFI_SSID, WIFI_PASS);
    
      // Wait
      while (WiFi.status() != WL_CONNECTED) {
        Serial.print(".");
        delay(100);
      }
      Serial.println();
    
      // Connected!
      Serial.printf("[WIFI] STATION Mode, SSID: %s, IP address: %s\n", WiFi.SSID().c_str(), WiFi.localIP().toString().c_str());
    }
    
    void setup() {
      pinMode(ultrasonic_pin_1, OUTPUT); // SRF04
      digitalWrite(ultrasonic_pin_1, LOW); // SRF04
    
      pinMode(ultrasonic_pin_2, OUTPUT); // SRF05
      digitalWrite(ultrasonic_pin_2, LOW); // SRF05    
      pinMode(TRIG_PIN, OUTPUT); // SRF05
      pinMode(ECHO_PIN, INPUT); // SRF05
      
      // Init serial port and clean garbage
      Serial.begin(SERIAL_BAUDRATE);
      Serial.println();
    
      // Wi-Fi connection
      wifiSetup();
    
      // Relay outputs
      pinMode(RELAY_PIN_1, OUTPUT);
      digitalWrite(RELAY_PIN_1, LOW);
    
      pinMode(RELAY_PIN_2, OUTPUT);
      digitalWrite(RELAY_PIN_2, LOW);
      
      mySwitch.enableReceive(RF_RECEIVER);  // RF receiver input on pin RF_RECEIVER
    
      // By default, fauxmoESP creates its own webserver on the defined port
      // The TCP port must be 80 for gen3 devices (default is 1901)
      // This has to be done before the call to enable()
      fauxmo.createServer(true); // not needed, this is the default value
      fauxmo.setPort(80); // This is required for gen3 devices
    
      // You have to call enable(true) once you have a WiFi connection
      // You can enable or disable the library at any moment
      // Disabling it will prevent the devices from being discovered and switched
      fauxmo.enable(true);
      // You can use different ways to invoke alexa to modify the devices state:
      // "Alexa, turn lamp two on"
    
      // Add virtual devices
      fauxmo.addDevice(LAMP_1);
      fauxmo.addDevice(LAMP_2);
    
      fauxmo.onSetState([](unsigned char device_id, const char * device_name, bool state, unsigned char value) {
        // Callback when a command from Alexa is received. 
        // You can use device_id or device_name to choose the element to perform an action onto (relay, LED,...)
        // State is a boolean (ON/OFF) and value a number from 0 to 255 (if you say "set kitchen light to 50%" you will receive a 128 here).
        // Just remember not to delay too much here, this is a callback, exit as soon as possible.
        // If you have to do something more involved here set a flag and process it in your main loop.
            
        Serial.printf("[MAIN] Device #%d (%s) state: %s value: %d\n", device_id, device_name, state ? "ON" : "OFF", value);
        if ( (strcmp(device_name, LAMP_1) == 0) ) {
          // this just sets a variable that the main loop() does something about
          Serial.println("RELAY 1 switched by Alexa");
          //digitalWrite(RELAY_PIN_1, !digitalRead(RELAY_PIN_1));
          if (state) {
            digitalWrite(RELAY_PIN_1, HIGH);
          } else {
            digitalWrite(RELAY_PIN_1, LOW);
          }
        }
        if ( (strcmp(device_name, LAMP_2) == 0) ) {
          // this just sets a variable that the main loop() does something about
          Serial.println("RELAY 2 switched by Alexa");
          if (state) {
            digitalWrite(RELAY_PIN_2, HIGH);
          } else {
            digitalWrite(RELAY_PIN_2, LOW);
          }
        }
      });
    
    }
    
    void loop() {
      delay(25);
    ...
    Read more »

  • Step 4. ECHO DOT & ESP32-WROOM-32, part 1

    Guillermo Perez Guillen • 08/25/2021 at 17:48 • 0 comments

                              Alexa Echo Dot

    Echo Dot is a smart speaker that is controlled by voice and connects you to Alexa via a Wi-Fi network. Alexa can play music, answer questions, tell the news, check the weather forecast, set alarms, control compatible smart home devices, and much more.

                                  ESP32-WROOM-32 Pinout

    ESP32-WROOM-32 is a powerful, generic Wi-Fi+BT+BLE MCU module that targets a wide variety of applications, ranging from low-power sensor networks to the most demanding tasks, such as voice encoding, music streaming and MP3 decoding. Datasheet: https://circuits4you.com/wp-content/uploads/2018/12/esp32-wroom-32_datasheet_en.pdf

    Alexa's voice commands:

    • First case: In this project we're going to use and modify an Alexa application to turn a lamp on/off with voice commands. The figure below shows a high-level overview of how the project controls a UV lamp.
    • Second case: It works similarly to the lamp; we use this voice command to start or stop the robot's motion.

    How does it work?

    To control your ESP32 with Amazon Echo, you need to install the FauxmoESP library. This library emulates a Belkin Wemo device, allowing you to control your ESP32 using that protocol. This way, the Echo Dot instantly recognizes the device after the code is uploaded, without any extra skills or third-party services.

    Prerequisites: Installing the FauxmoESP Library

  • Step 3. Assembling 4WD Robot Car Chassis

    Guillermo Perez Guillen • 08/25/2021 at 17:25 • 0 comments

    The chassis I used was the popular "4WD Robot Car Chassis", which is economical and practical since it has two platforms, 4 gearmotors, 4 wheels, and enough holes to mount the devices of our design.

                                  4WD Robot Car Chassis

    How to assemble this kit?

    Assembling autonomous robot, recommendations:

    • On the lower platform mount: Battery, gearmotors with wheels, L298N driver, power switch, and IR distance sensors.
    • On the upper platform mount: Battery, Tesla coil, UV Lamp, Relay, Arduino and ESP32-WROOM-32 boards, and the ultrasonic sensors.
    • On the top platform (Not in Motion version), also mount: Raspberry Pi and Arduino Pro Mini boards, the Raspberry Pi camera, and the reflector with its servo.
    • I used 22 screws in addition to the ones in the chassis kit.

    Now, I show you the parts assembled with their sensors mounted on the 4WD Robot Car Chassis in the figures below:

                                  4WD Robot Car Chassis + L298N driver + Battery

                                  GP2Y0A51SK0F IR Sensor

                              SRF05 Sensor

                                  UV lamp + Tesla coil

                              ESP32-WROOM-32 board holder

  • Step 2. Printing Custom Parts

    Guillermo Perez Guillen • 08/25/2021 at 17:08 • 0 comments

    Now we're going to print several parts that will be used to mount the sensors and the programming boards on the "4WD Robot Car" chassis. In the figures below I show you images of these parts and describe the use of each one.

    Notes:

    • You can get the STL files in the Files section above or on my GitHub account.
    • Software used: FreeCAD and Ultimaker Cura.

                              UV lamp holder

                              IR sensor holder (GP2Y0A41SK0F)

                                  Ultrasound sensor holder

                                  ESP32-WROOM-32 WiFi holder

View all 11 project logs

View all 3 instructions


Discussions

Guillermo Perez Guillen wrote 10/11/2021 at 23:54

This project can help people who work at home. Today we have to sanitize all the tools and work areas, our food, the packages that are sent to us by messenger service, etc.

