A harmful algal bloom (HAB) occurs when toxin-secreting algae grow exponentially, fueled by ample nutrients carried into a body of water by runoff from fertilizers, industrial effluent, or sewage. Most of the time, the excessive algae growth becomes visually discernible within several weeks and can be green, blue-green, red, or brown, depending on the type of algae. In other words, without a prewarning system, an algal bloom may go unnoticed for weeks until it can be spotted by eye. Even when a prewarning system monitors the surface of a body of water, an algal bloom can still develop beneath the surface due to discharged industrial sediment and dregs.

Since there are no nascent visual signs, a deep algal bloom is harder to detect before it becomes entrenched in the body of water. Until a deep algal bloom is detected, frequent low-level exposure to HAB toxins can have execrable effects on marine life and pose health risks to people. When a deep algal bloom is left unattended, the excessive growth may block sunlight from reaching other organisms and deplete dissolved oxygen. Unfortunately, uncontrolled algae accumulation can lead to mass die-offs caused by eutrophication.

After scrutinizing recent research papers on deep algae bloom, I noticed there are very few devices focusing on detecting deep algal bloom. Therefore, I decided to build a budget-friendly and easy-to-use prewarning system that predicts potential deep algal bloom with object detection, in the hope of preventing its hazardous effects on marine life and the ecosystem.

To detect deep algae bloom, I needed to collect data from the depths of water bodies in order to train my object detection model with notable validity. Therefore, I decided to utilize a borescope camera connected to my Raspberry Pi 4 to capture images of underwater algal bloom. Since Notecard provides cellular connectivity and a secure cloud service (Notehub.io) for storing or redirecting incoming information, I decided to employ Notecard with the Notecarrier Pi Hat so that the device can collect data, run the object detection model outdoors, and inform the user of the detection results.
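As a hedged illustration of this data path, a `note.add` request built with the note-python library might look like the sketch below. The serial port path, product UID, and Notefile name are placeholder assumptions for illustration, not the project's actual values.

```python
# Sketch: queue a data packet for Notehub.io with the note-python library.
# Port path, product UID, and Notefile name below are placeholders.

def build_note(results, water_quality):
    """Assemble a note.add request carrying detection results and sensor data."""
    return {
        "req": "note.add",
        "file": "detections.qo",   # assumed outbound Notefile name
        "sync": True,              # push to Notehub.io immediately
        "body": {"results": results, "water_quality": water_quality},
    }

if __name__ == "__main__":
    import serial
    import notecard

    port = serial.Serial("/dev/ttyACM0", 9600, timeout=5)  # placeholder port
    card = notecard.OpenSerial(port)
    # Associate the Notecard with a Notehub.io project (placeholder product UID).
    card.Transaction({"req": "hub.set", "product": "com.example.user:algae_detector"})
    rsp = card.Transaction(build_note({"algae": 3},
                                      {"ph": 7.1, "tds": 342.5, "temp": 19.4}))
    print(rsp)
```

The pure `build_note` helper keeps the request layout separate from the hardware transaction, so the same body can be reused for debugging without a Notecard attached.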

After completing my data set by taking pictures of existing deep algae bloom, I built my object detection model with Edge Impulse to predict potential algal bloom. I utilized the Edge Impulse FOMO (Faster Objects, More Objects) algorithm to train my model, a novel machine learning algorithm that brings object detection to highly constrained devices. Since Edge Impulse is compatible with nearly all microcontrollers and development boards, I did not encounter any issues while uploading and running my model on Raspberry Pi.

After training and testing my object detection (FOMO) model, I deployed it to Raspberry Pi as a Linux (ARMv7) application (.eim). Therefore, the device can detect deep algal bloom by running the model independently, without any additional procedures or noticeable latency.
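Running a deployed .eim file could look like this minimal sketch with the Edge Impulse Linux Python SDK; the model path, sample image, and confidence threshold are illustrative assumptions, not the project's actual code.

```python
# Sketch: run a deployed .eim model with the Edge Impulse Linux Python SDK.
# "model.eim", "sample.jpg", and the 0.6 threshold are placeholder assumptions.

def count_detections(result, threshold=0.6):
    """Count bounding boxes above a confidence threshold in a classify() result."""
    boxes = result.get("result", {}).get("bounding_boxes", [])
    return sum(1 for box in boxes if box.get("value", 0) >= threshold)

if __name__ == "__main__":
    import cv2
    from edge_impulse_linux.image import ImageImpulseRunner

    with ImageImpulseRunner("model.eim") as runner:
        runner.init()
        frame = cv2.imread("sample.jpg")
        frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # SDK expects RGB input
        features, cropped = runner.get_features_from_image(frame)
        res = runner.classify(features)
        print("objects detected:", count_detections(res))
```

Keeping `count_detections` as a pure function makes it easy to sanity-check the result handling against a hand-written result dictionary before attaching the camera.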

Along with excessive nutrients, warmer water temperatures stimulate hazardous algae bloom. Therefore, I decided to collect water quality data and append it to the model detection results as an early indicator of a potential algal bloom. To obtain pH, TDS (total dissolved solids), and water temperature measurements, I connected DFRobot water quality sensors and a DS18B20 waterproof temperature sensor to Arduino Nano, since the Raspberry Pi pins are occupied by the Notecarrier. Then, I utilized Arduino Nano to transfer the collected water quality data to Raspberry Pi via serial communication. I also connected two control buttons to Arduino Nano to send commands to Raspberry Pi over the same serial connection.
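The Raspberry Pi side of this serial link could be sketched as follows with pySerial; the port path, baud rate, and the `key=value&...` line format are assumptions for illustration, since the actual protocol is defined in the project code.

```python
# Sketch: read water quality lines sent by Arduino Nano over USB serial.
# "/dev/ttyUSB0", 115200 baud, and the "pH=7.10&TDS=342.5&Temp=19.4"
# line format are placeholder assumptions.

def parse_water_quality(line):
    """Parse an assumed 'pH=7.10&TDS=342.5&Temp=19.4' line into floats."""
    data = {}
    for field in line.strip().split("&"):
        key, _, value = field.partition("=")
        if value:
            data[key] = float(value)
    return data

if __name__ == "__main__":
    import serial

    nano = serial.Serial("/dev/ttyUSB0", 115200, timeout=2)
    while True:
        raw = nano.readline().decode("utf-8", errors="ignore")
        if raw.strip():
            print(parse_water_quality(raw))
```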

Since I focused on building a full-fledged AIoT device that detects deep algal bloom and communicates over WhatsApp via cellular connectivity, I decided to develop a webhook from scratch to inform the user of the model detection results with the collected water quality data and to receive commands regarding the captured model detection images via WhatsApp.

This complementary webhook utilizes Twilio's WhatsApp API to forward the incoming information, transferred by Notecard over Notehub.io, to the verified phone and to receive commands from the verified phone regarding the model detection images saved on the server. The webhook also processes the model detection images transferred by Raspberry Pi simultaneously via HTTP POST requests.
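A minimal sketch of such a webhook, assuming Flask and Twilio's Python helper library: the `/twilio_whatsapp_sender/` route and the command strings come from the project description, while the credentials, phone numbers, and query parameter name are placeholders.

```python
# Sketch: webhook relaying Notehub.io data to WhatsApp via Twilio.
# Credentials, phone numbers, and the "results" parameter are placeholders.

COMMANDS = ("Latest Detection", "Oldest Detection", "Show List")

def match_command(message):
    """Map an incoming WhatsApp message to a registered command, if any."""
    text = message.strip()
    if text in COMMANDS:
        return text
    if text.startswith("Display:") and text[8:].isdigit():
        return ("Display", int(text[8:]))
    return None

if __name__ == "__main__":
    from flask import Flask, request
    from twilio.rest import Client

    app = Flask(__name__)
    client = Client("ACCOUNT_SID", "AUTH_TOKEN")  # placeholder credentials

    @app.route("/twilio_whatsapp_sender/", methods=["GET", "POST"])
    def whatsapp_sender():
        # Forward the data packet relayed by Notehub.io to the verified phone.
        body = request.args.get("results", "No detection results.")
        client.messages.create(
            from_="whatsapp:+14155238886",  # Twilio sandbox number
            to="whatsapp:+10000000000",     # placeholder verified phone
            body=body,
        )
        return "OK"

    app.run(host="0.0.0.0", port=8080)
```

The `match_command` helper mirrors the command handling described in Step 9, so it can be unit-tested independently of Flask and Twilio.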

Lastly, to make the device as robust and sturdy as possible while operating outdoors, I designed an ocean-themed case with a sliding front cover and separate supporting mounts for water quality sensors and the borescope camera (3D printable).

So, this is my project in a nutshell 😃

In the following steps, you can find more detailed information on coding, capturing algae images with a borescope camera, transferring data from Notecard to Notehub.io via cellular connectivity, building an object detection (FOMO) model with Edge Impulse, running the model on Raspberry Pi, and developing a full-fledged webhook to communicate with WhatsApp.

🎁🎨 Huge thanks to DFRobot for sponsoring a 7'' HDMI Display with Capacitive Touchscreen.

🎁🎨 Also, huge thanks to Creality for sending me a Creality Sonic Pad, a Creality Sermoon V1 3D Printer, and a Creality CR-200B 3D Printer.

Step 1: Designing and printing an ocean-themed case

Since I focused on building a budget-friendly and accessible device that collects data from water bodies and informs the user of detected deep algae bloom via WhatsApp, I decided to design a robust and compact case allowing the user to utilize water quality sensors and the borescope camera effortlessly. To avoid overexposure to dust and prevent loose wire connections, I added a sliding front cover with a handle to the case. Then, I designed two separate supporting mounts on the top of the case so as to hang water quality sensors and the borescope camera. Also, I decided to emboss algae icons on the sliding front cover to highlight the algal bloom theme.

Since I needed to connect an HDMI screen to Raspberry Pi to observe the running operations, model detection results, and the video stream generated by the borescope camera, I added a slot on the top of the case to attach the HDMI screen seamlessly.

I designed the main case and its sliding front cover in Autodesk Fusion 360. You can download their STL files below.

For the lighthouse figure affixed to the top of the main case, I utilized this model from Thingiverse:

Then, I sliced all 3D models (STL files) in Ultimaker Cura.

Since I wanted to create a solid structure for the case with the sliding cover and apply a stylish ocean theme to the device, I utilized these PLA filaments:

Finally, I printed all parts (models) with my Creality Sermoon V1 3D Printer and Creality CR-200B 3D Printer in combination with the Creality Sonic Pad. You can find more detailed information regarding the Sonic Pad in Step 2.1.

If you are a maker or hobbyist planning to print your 3D models to create more complex and detailed projects, I highly recommend the Sermoon V1. Since the Sermoon V1 is fully enclosed, you can print high-resolution 3D models with PLA and ABS filaments. It also has a smart filament runout sensor and a resume-printing option for power failures.

Furthermore, the Sermoon V1 provides a flexible metal magnetic suction platform on the heated bed, so you can remove your prints without any struggle. You can also feed and remove filaments automatically (one-touch) thanks to its unique sprite extruder (hot end) design supporting dual-gear feeding. Most importantly, you can level the bed automatically with its user-friendly assisted bed leveling function.

#️⃣ Before the first use, remove unnecessary cable ties and apply grease to the rails.

#️⃣ Test the nozzle and hot bed temperatures.

#️⃣ Go to Print Setup ➡ Auto leveling and adjust five predefined points automatically with the assisted leveling function.

#️⃣ Finally, place the filament into the integrated spool holder and feed the extruder with the filament.

#️⃣ Since the Sermoon V1 is not officially supported by Cura, download the latest Creality Slicer version and copy the official printer settings provided by Creality, including Start G-code and End G-code, to a custom printer profile on Cura.

Step 1.1: Improving print quality and speed with the Creality Sonic Pad

Since I wanted to improve my print quality and speed with Klipper, I decided to upgrade my Creality CR-200B 3D Printer with the Creality Sonic Pad.

The Creality Sonic Pad is a beginner-friendly device that can control almost any FDM 3D printer on the market with the Klipper firmware. Since the Sonic Pad uses precision-oriented algorithms, it provides remarkable results with higher printing speeds. The built-in input shaper function mitigates oscillation during high-speed printing and smooths ringing to maintain high model quality. It also supports G-code model preview.

Although the Sonic Pad is pre-configured for some Creality printers, it does not support the CR-200B officially yet. Therefore, I needed to add the CR-200B as a user-defined printer to the Sonic Pad. Since the Sonic Pad needs unsupported printers to be flashed with the self-compiled Klipper firmware before connection, I flashed my CR-200B with the required Klipper firmware settings via FluiddPI by following this YouTube tutorial.

If you do not know how to write a printer configuration file for Klipper, you can download the stock CR-200B configuration file from here.
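For orientation only, a Klipper printer configuration combines `[mcu]`, `[printer]`, and per-axis stepper sections, as in the hedged skeleton below. The pin names and limits shown are placeholders for illustration, not the CR-200B's actual values; use the stock configuration file linked above for a real printer.

```ini
# Illustrative Klipper printer.cfg skeleton -- placeholder pins and limits,
# NOT the stock CR-200B values (download those from the link above).
[mcu]
serial: /dev/serial/by-id/usb-Klipper_stm32-placeholder

[printer]
kinematics: cartesian
max_velocity: 300
max_accel: 3000

[stepper_x]
step_pin: PC2          # placeholder pin
dir_pin: PB9           # placeholder pin
enable_pin: !PC3       # placeholder pin, inverted
microsteps: 16
rotation_distance: 40
endstop_pin: ^PA5      # placeholder pin, pull-up enabled
position_endstop: 0
position_max: 200
homing_speed: 50
```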

#️⃣ After flashing the CR-200B with the Klipper firmware, copy the configuration file (printer.cfg) to a USB drive and connect the drive to the Sonic Pad.

#️⃣ After setting up the Sonic Pad, select Other models. Then, load the printer.cfg file.

#️⃣ After connecting the Sonic Pad to the CR-200B successfully via a USB cable, the Sonic Pad starts the self-testing procedure, which allows the user to test printer functions and level the bed.

#️⃣ After the printer setup is complete, the Sonic Pad lets the user control all functions provided by the Klipper firmware.

#️⃣ In Cura, export the sliced model in the .ufp format. After uploading .ufp files to the Sonic Pad via the USB drive, it converts them to sliced G-code files automatically.

#️⃣ Also, the Sonic Pad can display model preview pictures generated by Cura with the Create Thumbnail script.

Step 1.2: Assembling the case and making connections & adjustments

// Connections
// Arduino Nano :
//                                DFRobot Analog pH Sensor Pro Kit
// A0   --------------------------- Signal
//                                DFRobot Analog TDS Sensor
// A1   --------------------------- Signal
//                                DS18B20 Waterproof Temperature Sensor
// D2   --------------------------- Data
//                                Keyes 10mm RGB LED Module (140C05)
// D3   --------------------------- R
// D5   --------------------------- G
// D6   --------------------------- B
//                                Control Button (R)
// D7   --------------------------- +
//                                Control Button (C)
// D8   --------------------------- +

To collect water quality data from water bodies in the field, I connected the analog pH sensor, the analog TDS sensor, and the DS18B20 waterproof temperature sensor to Arduino Nano.

To send commands to Raspberry Pi via serial communication and indicate the outcomes of operating functions, I added two control buttons (6x6) and a 10mm common anode RGB LED module (Keyes).

#️⃣ To calibrate the analog pH sensor so as to obtain accurate measurements, put the pH electrode into the standard buffer solution with a pH value of 7.00. Then, record the generated pH value printed on the serial monitor. Finally, adjust the offset (pH_offset) variable according to the difference between the generated and actual pH values, for instance, 0.12 (7.00 - 6.88). The discrepancy should not exceed 0.3. For the acidic calibration, you can inspect the product wiki.
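The calibration arithmetic can be sketched in Python as follows; the linear conversion (pH = 3.5 × voltage + offset) follows DFRobot's published sample code for this sensor, and the example offset is the 0.12 from the paragraph above.

```python
# Sketch of the pH calibration arithmetic (DFRobot's linear sample-code
# formula); the offset value is the 0.12 example from the calibration step.

PH_OFFSET = 0.12  # example offset recorded in the pH 7.00 buffer solution

def reading_to_ph(analog_reading, vref=5.0, resolution=1024, offset=PH_OFFSET):
    """Convert a 10-bit ADC reading to a pH value via the sensor voltage."""
    voltage = analog_reading * vref / resolution
    return 3.5 * voltage + offset
```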

#️⃣ Since the analog TDS sensor needs to be calibrated to compensate for water temperature in order to generate reliable measurements, I utilized a DS18B20 waterproof temperature sensor. As shown in the schematic below, before connecting the DS18B20 waterproof temperature sensor to Arduino Nano, I attached a 4.7K resistor as a pull-up between the DATA line and the VCC line of the sensor to obtain accurate temperature measurements.
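The temperature compensation can be sketched as follows; the 2%-per-degree coefficient and the cubic voltage-to-ppm conversion mirror DFRobot's sample code for the TDS sensor, rendered here in Python for illustration.

```python
# Sketch of the TDS temperature compensation (DFRobot sample-code formula,
# rendered in Python for illustration).

def tds_ppm(voltage, temperature_c):
    """Temperature-compensated TDS value (ppm) from the sensor voltage."""
    coeff = 1.0 + 0.02 * (temperature_c - 25.0)  # 2% per degree from 25 degC
    v = voltage / coeff                          # compensated voltage
    return (133.42 * v**3 - 255.86 * v**2 + 857.39 * v) * 0.5
```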

I connected the borescope camera and Notecard mounted on the Notecarrier Pi Hat (w/ Molex cellular antenna) to my Raspberry Pi 4. Since the embedded SIM card cellular service does not cover my country, I inserted an external SIM card into the Notecarrier Pi Hat.

Via a USB cable, I connected Arduino Nano to Raspberry Pi.

To observe running processes, I attached the 7'' HDMI display to Raspberry Pi via a Micro HDMI to HDMI cable.

After printing all parts (models), I fastened all components to their corresponding slots on the main case via a hot glue gun.

Then, I placed the sliding front cover via the dents on the main case.

Finally, I affixed the lighthouse figure to the top of the main case via the hot glue gun.

As mentioned earlier, the supporting mounts can be utilized to hang the borescope camera and water quality sensors while the device is dormant.

Step 2: Creating an account to utilize Twilio's WhatsApp API

https://www.hackster.io/kutluhan-aktar/iot-ai-assisted-deep-algae-bloom-detector-w-blues-wireless-098343#toc-step-2--creating-an-account-to-utilize-twilio-s-whatsapp-api-3

Step 3: Developing a webhook to send notifications and receive commands via WhatsApp

https://www.hackster.io/kutluhan-aktar/iot-ai-assisted-deep-algae-bloom-detector-w-blues-wireless-098343#toc-step-3--developing-a-webhook-to-send-notifications-and-receive-commands-via-whatsapp-4

Step 4: Creating a Notehub.io project to perform web requests via Notecard

https://www.hackster.io/kutluhan-aktar/iot-ai-assisted-deep-algae-bloom-detector-w-blues-wireless-098343#toc-step-4--creating-a-notehub-io-project-to-perform-web-requests-via-notecard-6

Step 5: Collecting water quality data and communicating with Raspberry Pi via serial communication w/ Arduino Nano

https://www.hackster.io/kutluhan-aktar/iot-ai-assisted-deep-algae-bloom-detector-w-blues-wireless-098343#toc-step-5--collecting-water-quality-data-and-communicating-with-raspberry-pi-via-serial-communication-w--arduino-nano-7

Step 6: Capturing deep algae images w/ a borescope camera and saving them as samples

https://www.hackster.io/kutluhan-aktar/iot-ai-assisted-deep-algae-bloom-detector-w-blues-wireless-098343#toc-step-6--capturing-deep-algae-images-w--a-borescope-camera-and-saving-them-as-samples-8

Step 7: Building an object detection (FOMO) model with Edge Impulse

https://www.hackster.io/kutluhan-aktar/iot-ai-assisted-deep-algae-bloom-detector-w-blues-wireless-098343#toc-step-7--building-an-object-detection--fomo--model-with-edge-impulse-9

Step 8: Setting up the Edge Impulse FOMO model and Notecard on Raspberry Pi

https://www.hackster.io/kutluhan-aktar/iot-ai-assisted-deep-algae-bloom-detector-w-blues-wireless-098343#toc-step-8--setting-up-the-edge-impulse-fomo-model-and-notecard-on-raspberry-pi-13

Step 9: Running the FOMO model on Raspberry Pi to detect potential deep algae bloom and informing the user via WhatsApp w/ Notecard

My Edge Impulse object detection (FOMO) model scans a captured image and predicts the probability of each trained label to recognize objects in the given image. The prediction result (score) represents the model's confidence that the detected object belongs to the given label (class), as shown in Step 7:

After executing the main.py file on Raspberry Pi:

🌱🌊📲 When Arduino Nano transfers the collected water quality data via serial communication, the device blinks the RGB LED as magenta and prints the received water quality information on the shell.

🌱🌊📲 If the user presses the control button (R) to send the Run Inference! command, the device blinks the RGB LED as green.

🌱🌊📲 Then, the device gets the latest frame captured by the borescope camera and runs an inference with the Edge Impulse object detection model.

🌱🌊📲 After running an inference successfully, the device counts the detected objects on the given frame. Then, it modifies the frame by adding bounding boxes for each detected object to emphasize potential deep algae bloom.

🌱🌊📲 After obtaining the model detection results, the device transfers the detection results and the water quality data sent by Arduino Nano to Notehub.io via Notecard over cellular connectivity.

🌱🌊📲 As explained in Step 4, when the Notehub.io project receives the detection results with the collected water quality information, it makes an HTTP GET request to transfer the incoming data to the webhook — /twilio_whatsapp_sender/.

🌱🌊📲 As explained in Step 3, when the webhook receives a data packet from the Notehub.io project, it transfers the model detection results with the collected water quality data by adding the current date & time to the verified phone over WhatsApp via Twilio's WhatsApp API.

🌱🌊📲 When the webhook receives a message from the verified phone over WhatsApp, it checks whether the incoming message includes one of the registered commands regarding the model detection images saved in the detections folder on the server:

🌱🌊📲 If the incoming message does not include a registered command, the webhook sends the registered command list to the user as the response.

🌱🌊📲 If the webhook receives the Latest Detection command, it sends the latest model detection image in the detections folder with the total image number on the server to the verified phone.

🌱🌊📲 If there is no image in the detections folder, the webhook informs the user via a notification (text) message.

🌱🌊📲 If the webhook receives the Oldest Detection command, it sends the oldest model detection image in the detections folder with the total image number on the server to the verified phone.

🌱🌊📲 If there is no image in the detections folder, the webhook informs the user via a notification (text) message.

🌱🌊📲 If the webhook receives the Show List command, it sends all image file names in the detections folder on the server as a list to the verified phone.

🌱🌊📲 If there is no image in the detections folder, the webhook informs the user via a notification (text) message.

🌱🌊📲 If the webhook receives the Display:<IMG_NUM> command, it retrieves the selected image if it exists in the given image file name list and sends the retrieved image with its file name to the verified phone.

Display:5

🌱🌊📲 Otherwise, the webhook informs the user via a notification (text) message.

🌱🌊📲 Also, the device prints notifications, sensor measurements, Notecard status, and the server response on the shell for debugging.

As far as my experiments go, the device detects potential deep algal bloom precisely, sends data packets to the webhook via Notehub.io, and informs the user via WhatsApp faultlessly :)

Videos and Conclusion

Further Discussions

By applying object detection models trained on numerous algae bloom images to the detection of potential deep algal bloom, we can:

🌱🌊📲 prevent algal toxins that can harm marine life and the ecosystem from dispersing,

🌱🌊📲 avert a harmful algal bloom (HAB) from entrenching itself in a body of water,

🌱🌊📲 mitigate the execrable effects of the excessive algae growth,

🌱🌊📲 protect endangered species.
