
AI-Equipped Wasp (and Asian Hornet) Sentry Gun

A powerful laser guided by cameras will vaporize these pests in flight. Hopefully.

The Asian hornet (Vespa mandarinia / Vespa velutina) is an invasive pest that predates on honey bees. It must be destroyed! It must be eradicated! (Rant over). This project is actually mostly about training neural networks on the Nvidia DIGITS system, using the standard bvlc_googlenet.caffemodel pre-trained model, to 'infer' the location of wasps and hornets. The trained model is then deployed on an Nvidia Jetson TX2, which has an onboard GPU and video camera, and the inference data is spat out via I2C to an Arduino for further processing.

If anybody wants to try and create their own model for inferring the position of wasps with bounding boxes, I've uploaded all the images in .zip format here: https://www.kaggle.com/tegwyntwmffat/european-wasp-vespula-vulgaris-kitti-format
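
For anyone curious what the I2C hand-off to the Arduino might look like, here is a minimal sketch (not the project's actual code) that forwards the centre of one detection from the Jetson to the Arduino. The bus number, slave address and two-byte message format are all assumptions:

    # Hypothetical sketch: forward one bounding-box centre from the Jetson TX2 to the
    # Arduino over I2C. Bus number, slave address and message format are assumptions.
    from smbus2 import SMBus

    ARDUINO_ADDR = 0x08   # assumed I2C slave address configured in the Arduino sketch
    I2C_BUS = 1           # assumed I2C bus exposed on the Jetson TX2 header

    def send_target(bus, cx, cy, frame_w=1280, frame_h=720):
        """Scale the detection centre to 0-255 per axis and send it as two bytes."""
        payload = [int(255 * cx / frame_w), int(255 * cy / frame_h)]
        bus.write_i2c_block_data(ARDUINO_ADDR, 0x00, payload)

    with SMBus(I2C_BUS) as bus:
        # 'detections' stands in for whatever the inference code returns,
        # e.g. a list of (class_id, confidence, x_centre, y_centre) tuples.
        detections = [(0, 0.93, 640, 360)]
        for _, conf, cx, cy in detections:
            if conf > 0.5:
                send_target(bus, cx, cy)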

If anybody wants to support any of our projects, check out our Amazon Wishlist here: https://www.amazon.co.uk/hz/wishlist/ls/3AW6R7V5BVU3R?ref_=wl_share

Patreon donations can be made here: patreon.com/Goa

We all know that bees are under threat from modern farming practices, but did you know about the Asian hornet, slowly spreading north through the united states of Europe, closer and closer to the Land of Dragons (where I live)? The Asian hornet is a voracious predator and will 'hawk' outside bee hives, swooping down on bees in flight on their homecoming, to feed its brood. Then, as the hive becomes more and more weakened by diminishing bee numbers, the hornets will enter the hive and gobble up all the innocent youngsters inside.

So what do we do - give up without a fight? Hell no .... We should fight back against this monstrous predator with everything we've got! In this case we'll be using so-called 'artificial intelligence' to 'infer' the location of the hornet and attempt to destroy it, preferably as dramatically as possible.

Since the hornets have not actually arrived here yet, I'll be attempting to train my machine to detect the common European wasp, Vespula vulgaris, which also predates on the larvae of honey bees but is not quite the threat that the Asian hornet is. Below is an Asian hornet hawking a honey bee:

One of the great things about this project is that anybody can get involved, and no equipment is required other than your regular PC connected to the internet. This is because most of the work involves data processing, such as preparing images and creating labels. The data can then be uploaded to Amazon Web Services (AWS) for processing, and nice-looking graphs such as the one below can be produced:

In fact, some companies and individuals run online competitions with quite substantial prizes for doing this kind of work with other image sets, such as photos of whales, where you'd need to adjust the neural network itself to have a chance of winning.

Full instructions for using AWS and Nvidia DIGITS are given in the logs. NB: AWS do charge a range of fees for their service, but currently (2018) I can train networks in about 1.5 hours at $3.10 an hour with no flat rates (roughly $4.65 per training run), which is IMO cheap considering the equipment used to do this costs maybe thousands of dollars to buy.

Files (all files are also on Kaggle here: https://www.kaggle.com/tegwyntwmffat/european-wasp-vespula-vulgaris-kitti-format):

  • caffemodel - 22.83 MB - 03/03/2019 at 14:44
  • prototxt - 36.85 kB - 03/03/2019 at 14:28
  • caffe_output.log - 907.50 kB - 12/01/2018 at 11:05

  • Bee Hive Tunnel

    Capt. Flatus O'Flaherty ☠05/01/2019 at 16:33 4 comments

    Aluminum box section has been used to produce a tunnel through which bees will travel to get into the hive. So will the wasps / Asian hornets if they want to attack. Bottom right is a 1 watt laser. Middle left is the hole for the camera, protected from bees by a glass microscope slide. Wasps detected by the camera will be zapped by the laser and simply fall out of the tunnel. This apparatus will hopefully be attached to a small beehive tomorrow to capture video footage of the bees inside and create 'null' images for detection.

  • Project Re-Think

    Capt. Flatus O'Flaherty ☠04/20/2019 at 16:15 0 comments

    • Lasers are dangerous when used out in the open.
    • Gimbal is not accurate enough.
    • Camera will probably never pick up wasps/hornets in flight.
    • Variable outside lighting conditions will always be a problem.

    The proposed solution is to police the entrance to the hive by forcing the creatures to travel up the length of a 25 mm aluminum box section which has a camera mounted in the middle, looking inwards. I checked the Logitech C230e focus and it is effective down to 22 mm from the front of the camera, so a 25 mm box is perfect. There's also a 1000 mA laser head available with a 25 x 25 mm front profile. The insects will crawl over the top of the laser and get spotted by the camera as they go. If a wasp or hornet is spotted ..... it gets blasted by the laser. The bees *should* clear out the dead bodies themselves, if they are not completely vaporised for some reason.

    Small LEDs can be mounted at strategic locations within the box tunnel to help the camera. Currently they are green, as those were the only ones I had lying around.

    Here's a camera side view:

    This is with only a couple of LEDs. Once the lighting is sorted, the image will be very nice - perfect for feeding into the object detection system.

  • Comparison Between MobileNet SSD and bvlc_googlenet

    Capt. Flatus O'Flaherty ☠02/18/2019 at 12:33 0 comments

    As mentioned in previous logs, there was something seriously wrong with the model trained using MobileNet SSD …… and I don't know what went wrong …. The training and subsequent model optimisation seemed to run smoothly. I ran out of time and patience, so I have gone back to the Jetson TX2 and made a 'like for like' appraisal using a 'fresh' video from YouTube to test the detection capabilities. It's important to test the network on footage that was NOT used to train it.

    Using an external USB camera, I could see no false positives on the TX2, and it did not pick up on similar insects such as flies. It's also worth noting that the network was trained using quite a few images with honey bees in the background, which will help focus the results away from similar-looking insects.

    The bvlc_googlenet model was trained on Amazon Web Services (AWS) for about 5 hours, 63,300 iterations, with an approx. 2,500-image set at 640 x 640 pixels.
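
    If you just want to poke at the published prototxt / caffemodel pair (see the files list near the top of the page), the hedged sketch below loads them with OpenCV's dnn module and times a single forward pass. The file names are placeholders, no mean subtraction is applied, and it assumes the deploy prototxt contains only standard Caffe layers; turning DetectNet's output into bounding boxes is deliberately left out:

    # Sketch only: load the exported Caffe files (placeholder names) and time one
    # 640 x 640 forward pass with OpenCV's dnn module.
    import time
    import cv2

    net = cv2.dnn.readNetFromCaffe("wasp_deploy.prototxt", "wasp.caffemodel")   # placeholder file names

    frame = cv2.imread("test_frame.jpg")                   # any frame NOT used for training
    blob = cv2.dnn.blobFromImage(frame, 1.0, (640, 640))   # matches the 640 x 640 training size
    net.setInput(blob)

    t0 = time.perf_counter()
    out = net.forward()
    print("forward pass: %.1f ms, output shape: %s" % (1000 * (time.perf_counter() - t0), out.shape))
    # Parsing 'out' into boxes depends on the DetectNet output layers, so it is not shown here.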

  • Possible Servo Upgrade

    Capt. Flatus O'Flaherty ☠01/31/2019 at 13:47 0 comments

    The servos used here are a bit 'rough' in operation and there are much smoother servos available, particularly the Hitec 8XXX and 9XXX series, which claim 12-bit (4096-step) resolution with deadbands of 0.008 µs. They are, however, rather pricey:

    https://www.multiplex-rc.de/produkte/detail/index/sArticle/6645

  • 12 Hours Later

    Capt. Flatus O'Flaherty ☠01/30/2019 at 09:26 0 comments

    Training is going well with loss stats gradually tapering off as the model becomes more accurate. Another 2 days to go.

  • Re-training the Wasp MobileSSD model

    Capt. Flatus O'Flaherty ☠01/29/2019 at 13:15 0 comments

    Since the blob inference times documented in the previous log were a bit disappointing, I thought I'd try retraining the MobileNet-SSD network with no 'pretrained model', as I suspected that model was probably what prevented me from reducing the 'num_classes' training option. All the solverstate files and caffemodels had to be deleted from the 'results' folder, as the software tried to carry on from the previous training run. Initially the software complained that there was no pretrained model, but setting 'pretrain_model = ""' fooled the python script into carrying on regardless. num_classes must be set to a minimum of 2, as there always needs to be a 'don't care' class.
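
    For reference, the two edits described above would look something like this near the top of a Caffe-SSD style training script (the variable names follow the common ssd_pascal.py template and are assumptions, not a copy of the exact script used here):

    # 1. No pre-trained weights: an empty string stops the script complaining and
    #    makes it train from scratch (the old solverstate / caffemodel files in the
    #    results folder still have to be deleted by hand first).
    pretrain_model = ""

    # 2. One real class ('wasp') plus the mandatory background / 'don't care' class.
    num_classes = 2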

    The model is now being trained on the Jetson TX2, which is a bit slow for this kind of work, but it's all I've got other than messing about on AWS. It'll take at least 2 days, maybe a lot longer .... Time will tell!

  • Gimbal now Working Properly using Python Scripts

    Capt. Flatus O'Flaherty ☠01/28/2019 at 13:45 0 comments

    After abandoning attempts to use C++ as shown in the previous log, I reverted to Python scripts, which are a lot easier to 'compile'. Hackaday's own Lewin Day pointed me towards Les Wilkinson's work with Python, the Neural Compute Stick 2 and Intel OpenVINO, and after forking Les's repository ( https://github.com/paddygoat/RPi3_NCS2 ) I was quickly able to modify his bottle-following robot code to make my gimbal track not only bottles, but also faces and wasps. Having already proven my wasp model to work on OpenVINO, I now added timers to the Python script to track down bottlenecks and found that my model had a fairly large one around the 'inference blob' in the script, which was, to me at least, really interesting!

    Using the same script, I was able to demonstrate the difference in performance between two networks, namely 'MobileNet-SSD' and 'SqueezeNet 1.0 modified SSD', finding that the older SqueezeNet model ran a lot more efficiently on the Raspberry Pi configuration, giving higher blob rates and very much better inference accuracy / precision. If the object of interest is relatively simple, like a face, it might be worth looking more closely at an older and simpler network such as SqueezeNet, which can give better results than more complicated networks such as MobileNet-SSD.
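
    For what it's worth, the timers were nothing fancy; a sketch of the idea is below. The stage names (grab_frame, preprocess, run_inference, drive_gimbal) are placeholders for the corresponding parts of the bottle-follower script, not its real API:

    # Sketch: accumulate per-stage timings so the slow step ('inference blob') stands out.
    import time
    from collections import defaultdict

    timings = defaultdict(float)

    def timed(name, fn, *args):
        t0 = time.perf_counter()
        result = fn(*args)
        timings[name] += time.perf_counter() - t0
        return result

    # Inside the main loop (placeholder function names):
    #   frame = timed("grab_frame", grab_frame)
    #   blob = timed("preprocess", preprocess, frame)
    #   detections = timed("inference_blob", run_inference, blob)
    #   timed("drive_gimbal", drive_gimbal, detections)

    # After N frames, print the totals, largest first:
    #   for name, total in sorted(timings.items(), key=lambda kv: -kv[1]):
    #       print(f"{name:15s} {total:6.2f} s")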

  • Video of Sentry Gun in Demo Mode

    Capt. Flatus O'Flaherty ☠01/22/2019 at 15:45 0 comments

  • Failing to get Cmake CXX to Combine Programs

    Capt. Flatus O'Flaherty ☠01/22/2019 at 12:48 0 comments

    This is the rough directory structure of the two programs I am trying to combine on the Raspberry Pi. There are different kinds of makefiles all over the place, many duplicating each other, many saying 'do not edit'. It's a tricky puzzle and I have very little idea what I am doing!

    Basically it's the Intel OpenVINO and servo motor pca9685 libraries on the Raspberry Pi. Both libraries work fine independently, but they won't compile together due to missing dependencies linked to pca9685. If anybody can help, please get in touch!

  • Making the Gimbal Work

    Capt. Flatus O'Flaherty ☠01/20/2019 at 17:06 0 comments

    To keep things nice and easy, I used the Adafruit 12-bit PWM servo HAT: https://learn.adafruit.com/adafruit-16-channel-pwm-servo-hat-for-raspberry-pi?view=all and a standard library written in C++ for coding. There are Python libraries, but I'm going to need to combine the gimbal code with another C++ program, object_detection_demo_ssd_async/main.cpp, which is used for the inference procedure.

    Getting the gimbal moving at alarmingly crazy and jerky speeds was fairly easy, but to get it running nice and slow and smooth took a little bit more head scratching. I changed the lib-pca9685 https://github.com/paddygoat/rpidmx512/tree/master/lib-pca9685 simple.cpp file to get much better resolution than with the 'proper' servo.cpp one.

    First of all, install the dependencies and libraries:

    wget http://www.airspayce.com/mikem/bcm2835/bcm2835-1.58.tar.gz
    tar zxvf bcm2835-1.58.tar.gz
    cd bcm2835-1.58
    ./configure
    make
    sudo make check
    sudo make install
    
    cd /home/pi/rpidmx512-master/lib-pca9685
    make -f Makefile.Linux "DEF=-DRASPPI"
    
    Compile and build the examples on Linux Raspbian:
    
    cd /home/pi/rpidmx512-master/lib-pca9685/examples
    make

    Next, run the 'servo' or 'simple' example:

    cd /home/pi/rpidmx512-master/lib-pca9685/examples && make && sudo ./servo
    cd /home/pi/rpidmx512-master/lib-pca9685/examples && make && sudo ./simple
    /**
     * @file simple.cpp
     *
     */
    /* Copyright (C) 2017-2018 by Arjan van Vught mailto:info@raspberrypi-dmx.nl
     *
     * Permission is hereby granted, free of charge, to any person obtaining a copy
     * of this software and associated documentation files (the "Software"), to deal
     * in the Software without restriction, including without limitation the rights
     * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
     * copies of the Software, and to permit persons to whom the Software is
     * furnished to do so, subject to the following conditions:
    
     * The above copyright notice and this permission notice shall be included in
     * all copies or substantial portions of the Software.
    
     * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
     * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
     * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
     * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
     * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
     * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
     * THE SOFTWARE.
     */
    
    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>       // needed for nanosleep() and struct timespec used in msleep()
    #include <unistd.h>
    #include <iostream>
    
    #include "pca9685.h"
    using namespace std; 
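    // Millisecond sleep helper: converts ms to a timespec and calls nanosleep(),
    // retrying with the remaining time if the sleep is interrupted by a signal.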
    int msleep(unsigned long milisec)   
    {   
        struct timespec req={0};   
        time_t sec=(int)(milisec/1000);   
        milisec=milisec-(sec*1000);   
        req.tv_sec=sec;   
        req.tv_nsec=milisec*1000000L;   
        while(nanosleep(&req,&req)==-1)   
            continue;   
        return 1;   
    }   
    
    int main(int argc, char **argv) {
        if (getuid() != 0) 
        {
            fprintf(stderr, "Program is not started as \'root\' (sudo)\n");
            return -1;
        }
    
        uint16_t OnValue, OffValue;
        PCA9685 pca9685;
    
        pca9685.Dump();
        pca9685.SetFrequency(100);
    
        //pca9685.SetFullOn(CHANNEL(0), true);                     // Channel 0 Full On
        //pca9685.Write(CHANNEL(1), VALUE(PCA9685_VALUE_MAX/2));    // Channel 1, Duty Cycle = 50 %
        
        int a = 450;                                            // Laser points upwards.
        int b = 900;   // b must be greater than a.
        int c = 5;
        int x = 500;
        
        pca9685.Write(CHANNEL(0), VALUE(a));                    // Channel 0: start at 'a' (laser pointing upwards).
                                                                // Max value = 4096 counts.
        //pca9685.SetFullOff(CHANNEL(3), true);                     // Channel 0 Full Off
        
        for (int d = c; d > 0; d--)
        {
            for (x = b; x > a; x--)
            {
                pca9685.Write(CHANNEL(0), VALUE(x));            // Up and down
                pca9685.Write(CHANNEL(1), VALUE(x));
                printf("Value d: %i", d);
                printf(" Value x: %i", x);
                printf("\n");
                msleep(50);
            }
            for (x = a; x < b; x++)
            {
                pca9685.Write(CHANNEL(0), VALUE(x));
                pca9685.Write(CHANNEL(1), VALUE(x));
                printf("Value d: %i", d);
                printf(" Value x: %i", x);
                printf("\n");
    ...
    Read more »

View all 22 project logs


Discussions

Pat Fear wrote 10/27/2021 at 11:31 point

I came across your project today. Really interesting. Funnily enough, I have built something very similar to see if it could work. I trained my model on house flies as they are easy to come by for testing. I made a short video of the machine here: https://youtu.be/z8EO5KvcxNA. I'm now editing images of the Asian hornets and bees to train a new model. Happy to chat further if anyone is interested.


Richard wrote 01/08/2020 at 02:48 point

Cool project.  Thanks for sharing the challenges and alternatives too.

I’ve had an idea for a similar project, also laser-based, but within the bounds of a walled yard and on a decidedly easier subject - scorpions.  They conveniently fluoresce under UV light, which would make spotting them easier (and they’re frequently stationary). 


excitedbox wrote 07/23/2019 at 02:46 point

If you draw a box around the moving object on screen, you can use the area of the box as an ID instead of image processing to ID them. Since wasps and hornets are bigger and longer than bees, if you have measurements of wasps and bees from all 4 sides it shouldn't be too hard to figure out the difference.

You could mount the laser up high so you have the ground as a backdrop and also to prevent lasing people.

With a stereo camera, or an ultrasonic distance or laser sensor, you should be able to get distance data down to a couple of mm and use a size comparison to detect the wasps/hornets. That way you can use a more expensive, higher-quality camera on a solid mount, reducing the weight of the turret.

The turret can then use the distance sensors to aim and fire. If the camera has a high enough speed and image quality, it shouldn't be a problem to ID wasps in mid air. And instead of servos, use stepper or BLDC motors. At a couple of meters' distance you will only have to take a few steps in whatever direction to follow the wasp's movements.


Keith Olson wrote 07/19/2019 at 07:12 point

If you want a safer method than a laser, salt should be your ammo: https://youtu.be/KoE08lrFhzY


alexschultze wrote 07/17/2019 at 16:30 point

I have not seen (European) hornets or wasps enter a healthy bee hive without getting into trouble. Asian hornets are pretty slow to do so without danger; usually they capture bees while flying.

Further, at least here in Germany, European hornets are under protection by law. You should be very certain you can distinguish the two sorts of hornets.


Rob Ward wrote 07/17/2019 at 11:00 point

Would it be possible to just detect the yellow colour of the hornet? We have European wasps in Australia, and honeybees, but the wasp is a bright yellow and the honey bee more of a golden yellow. Would that make a simpler detection possible? I have considered a similar idea for killing invading ants in our house: set up a sugar trap on their trail and, as they go through a tunnel, zap every second one with a high voltage, but let every second one through to report back to the others where the "food" is.


Nathann wrote 06/04/2019 at 18:20 point

Would it work to use an array of microphones tuned to the frequency of the buzzing of the wings? Kind of a Doppler audio radar to target the flying insect? Maybe it could even sort the different kinds of insect depending on the sound or frequency range? Because I don't like the idea of having a lunatic turret that may shoot a high-power laser into someone's eyes due to some false positive in the neural network :')


Jan wrote 10/31/2018 at 17:23 point

Nice project, though adventive predators like that type of wasp are not the bees' main concern. In general, beekeepers' bees are the last ones which will go extinct.

Many other problems are much more serious. F'kn Roundup, for example. That stuff has recently been banned in Germany. The wild bees, bumble bees and other species are doing a massive job pollinating too.

Seriously, we have to protect and restore our nature first...


Capt. Flatus O'Flaherty ☠ wrote 11/02/2018 at 22:31 point

Absolutely!


Anders Frihagen wrote 11/04/2018 at 15:06 point

Roundup is not banned in Germany, only restricted.

BTW: Germany voted _for_ allowing Roundup in the EU.


Jan wrote 11/04/2018 at 16:09 point

You're right with both statements. You know, people were really upset when the media told us Germany voted for it in the European Parliament... Lobbying everywhere, it's disgusting.


BigVulcanDeal wrote 10/31/2018 at 15:24 point

I found that iNaturalist is a nice source of curated images for bees, wasps, etc. The images are a little harder to download, so I created a JavaScript routine that scrapes the images in a semi-automated way. In any case, I found iNaturalist to be the best way to get a large collection of well-curated images.


Capt. Flatus O'Flaherty ☠ wrote 11/02/2018 at 22:30 point

Thanks - I'll check it out.


Capt. Flatus O'Flaherty ☠ wrote 11/03/2018 at 13:05 point

I had a look at iNaturalist and it looks good. Is it possible to have a copy of your JavaScript? …. Thanks!!!


excitedbox wrote 07/23/2019 at 02:53 point

I use the http://www.newprosoft.com/ web content scraper whenever I have to get content. It currently only costs $49 (I paid $100 x 2 and would do it again). I have used it for about 13 years now. It can copy anything from any site and even make DB scripts for you.

There is a free trial but the length of the scrape job is limited.


