
Autonomous Moth Trap Hardware Revisions

The Autonomous Moth Trap project seeks to build on the work of Kim Bjerge and his colleagues, who developed a simple system controlled by a Raspberry Pi 4 for capturing images of live moths attracted to a UV light, together with a set of tools using OpenCV image processing and machine learning to recognise the moths imaged. My earlier post documented the results of building a version of the trap that more or less exactly replicated the Danish design, apart from rewiring the LED light table so that it could be controlled with the other lights rather than requiring manual intervention.

Standardisation is key to ensuring that biodiversity observations are as comparable as possible across time and space. I have been keen not to modify the Danish system unnecessarily. However, I would like to solve some challenges now rather than leaving them until after I have been running my system for a while.

My most significant concern has been around the use of a 15W fluorescent tube for the main light that attracts the moths. These tubes work well, but the electronics to run them efficiently on DC are complicated, and the tubes tend to blacken at one end over time and lose brightness, which inevitably introduces variation and makes it difficult to compare the resulting data. It therefore seems appropriate to replace the tube with an array of high-power LEDs. I have decided to build a trap with 9 such LEDs. This significantly changes the power requirements for the trap and is probably only appropriate for use with mains power, but I want to experiment with the capabilities of the LEDs before attempting to construct a battery-powered version.

Secondly, I wanted to collect some basic environmental measurements as the images are captured, so I have added a temperature and humidity sensor to be read by the Raspberry Pi.
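
As an indication of how simple this can be, the sketch below logs readings from a DHT22-style sensor. The sensor model, GPIO pin, log path and reading interval are all my assumptions rather than details of the final circuit.

# Sketch of logging temperature and humidity alongside the images; the
# sensor model (DHT22), pin and file path are assumptions
import time
import Adafruit_DHT

SENSOR = Adafruit_DHT.DHT22
PIN = 4  # assumed GPIO pin for the sensor's data line

while True:
    humidity, temperature = Adafruit_DHT.read_retry(SENSOR, PIN)
    if humidity is not None and temperature is not None:
        with open("/home/pi/environment.csv", "a") as log:
            log.write(f"{time.time()},{temperature:.1f},{humidity:.1f}\n")
    time.sleep(60)  # one reading per minute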

Additionally, the first version I built lacked some basic control features:

  • Shutting down the system safely (rather than simply cutting power) required an ssh connection to the Raspberry Pi. I wanted to add at least a tactile switch to trigger a safe shutdown and have adapted this Pi Power Button (see the sketch after this list).
  • The Raspberry Pi fan was wired straight to the 5V pin and continued to run even when the Raspberry Pi was shut down. I wanted it to run only while the system was active (and potentially to be disabled below some temperature) and have used this approach from the Raspberry Pi Forum.
  • The only way to be certain that the system was running was to listen for the fan (or to use ssh or some other access protocol). I wanted to add a multicoloured LED to show operational status. I considered using a full RGB LED (four pins) but needed only a bicolour LED (red/green, two pins).
  • On Twitter, Hernán L. Pereira pointed out the need for a flyback diode across the relay coil, so this has also been added to the circuit.
  • I’ve also added a second method to turn on the lights, a toggle switch connecting 3.3V power to the same transistor normally activated/deactivated by a cron job on the Raspberry Pi. A tactile switch would have been appropriate, but I wanted to avoid confusion between the power on/off switch and the light testing switch.
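
To show how these control features can fit together in software, here is a minimal sketch using the gpiozero library. The pin numbers, the two-second hold time, the fan temperature thresholds and the use of a bicolour LED on two pins are my assumptions, not the final circuit:

# Sketch of the shutdown button, fan control and status LED; all pin
# numbers and thresholds are assumptions
import time
from subprocess import check_call
from gpiozero import Button, LED, OutputDevice

button = Button(26, hold_time=2)  # tactile switch to ground (assumed pin)
fan = OutputDevice(13)            # transistor driving the fan (assumed pin)
green = LED(5)                    # bicolour LED legs (assumed pins)
red = LED(6)

def cpu_temperature():
    # Read the SoC temperature in degrees Celsius
    with open("/sys/class/thermal/thermal_zone0/temp") as f:
        return int(f.read()) / 1000

def shutdown():
    # Show red while halting, then power down cleanly
    green.off()
    red.on()
    check_call(["sudo", "poweroff"])

button.when_held = shutdown
green.on()  # steady green indicates the system is up

# Run the fan only when the processor is warm, with some hysteresis
while True:
    if cpu_temperature() > 55:
        fan.on()
    elif cpu_temperature() < 45:
        fan.off()
    time.sleep(10)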

The diagram at the top of this post was created using the very friendly tools at https://circuit-diagram.org/, where it can be accessed here. It shows the circuit as currently planned. I have organised the use of the Raspberry Pi pins to keep the circuit clear and uncluttered.


Autonomous Moth Trap Project

This project seeks to build an automated moth trap with a machine learning identification model for Canberra moths (and other insects) based on the design published earlier this year by Kim Bjerge and colleagues in the journal Sensors:

Bjerge, K.; Nielsen, J.B.; Sepstrup, M.V.; Helsing-Nielsen, F.; Høye, T.T. An Automated Light Trap to Monitor Moths (Lepidoptera) Using Computer Vision-Based Tracking and Deep Learning. Sensors 2021, 21, 343. https://doi.org/10.3390/s21020343

I want to thank Kim Bjerge and Toke Thomas Høye for their generous assistance answering questions and sharing code.

Background

I started using a moth trap in the mid-nineties in the UK. At that time, I used a Heath trap with a 6W actinic tube and drew pictures of moths with crayons since I did not have a digital camera. Over the years, I’ve used a wide variety of moth traps and moth lights, including 15W actinic tubes and 125W MV lamps in various configurations, mainly from Anglian Lepidopterist Supplies in the UK, and the LepiLED lights from Dr Gunnar Brehm. Since 1999, I’ve used a series of different digital cameras to document the insects I’ve attracted (see Flickr and iNaturalist).

All of these efforts have in some small way contributed to knowledge of biodiversity. Observation records flow into the Global Biodiversity Information Facility and the Atlas of Living Australia. However, every one of these records is a response to random and non-standard circumstances. In one year, I may operate a light for many nights and photograph many insects. In another year, I may do much less.

This lack of standardisation is common throughout natural history and citizen science. In recent years, there has been growing concern at the apparent collapse in insect numbers in many regions of the world, but there are few places from which we have genuinely comparable data over long periods. We therefore cannot easily assess how significant the changes have been. We certainly have no scalable way to monitor insect populations and understand how composition and abundance varies across space and through time.

I have therefore been interested for a long time in ways that we could automate detection and monitoring for insects and other more cryptic groups. One obvious route is to build the tools for DNA-based monitoring. Hence my enthusiasm for Malaise trapping as a path to building the necessary reference libraries.

One other idea I have been keen to explore is relatively simple automation of imaging of insects coming to light. I sketched this (possibly self-explanatory) concept in July 2018, but never found the time to build it. It basically involves bright insect-attracting lights, on a timer so that they can leave the insects in peace before dawn, and (in my sketch) two cameras to image species landing on a vertical surface and on the ground below.

A sketch of a possible automated camera trap for moths, 31 July 2018

The purpose would simply have been unsupervised collection of photographs that I would then upload to iNaturalist or elsewhere with my manual identifications.

Design

Figure 1 from Bjerge et al. 2021: The portable light trap with a light table, a white sheet, and UV light to attract live moths during night hours. The computer vision system consists of a light ring, a camera with a computer and electronics, and a powered junction box with a DC-DC converter.

Bjerge et al.’s design involves the following components:

  • Raspberry Pi 4 to control lights and webcam (with motion detection software)
  • 15W actinic tube operated with 12V DC (to attract moths)
  • A3 LED light table (as contrasting surface on which moths can rest)
  • Logitech BRIO 4K Ultra HD webcam
  • LED ring light (to illuminate light table from camera side)
  • A circuit that allows the Raspberry Pi to turn the moth light and the ring light on and off at scheduled times

The Raspberry Pi software includes Python scripts to turn the lights on and off (through one of the GPIO pins) and to run the Motion software for motion-detection imaging (Motion Project at GitHub).
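
The light-switching scripts can be very small. The sketch below shows the general shape of such a script; the pin number and the details are my assumptions, not the published code:

# lighton.py - a minimal sketch; the pin number is an assumption
import RPi.GPIO as GPIO

LIGHT_PIN = 12  # GPIO pin driving the transistor that closes the relay

GPIO.setmode(GPIO.BCM)
GPIO.setup(LIGHT_PIN, GPIO.OUT)
GPIO.output(LIGHT_PIN, GPIO.HIGH)  # energise the relay and turn the lights on
# No GPIO.cleanup() here: cleanup would reset the pin and release the
# relay; a matching lightoff.py would set the pin LOW instead.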

Images collected by the system can then be processed using the Python Moth Classification and Counting software developed by Kim Bjerge (MCC-trap at GitHub). As described in the Sensors paper, this detects blobs in the images, tracks movement of the (presumed) same blob between images, and uses a Convolutional Neural Network model to classify the blobs based on a training set. In the Danish experiment, the training set focused on a small number of common noctuoid moths (eight classes).
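
As a rough illustration of the first stage of such a pipeline (and emphatically not the MCC-trap implementation itself), detecting dark blobs against the bright light table can be done with thresholding and contour extraction in OpenCV; the thresholds and file names here are arbitrary choices for illustration:

# Simplified blob detection against a bright background
import cv2

image = cv2.imread("frame.jpg")
grey = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Moths appear dark against the illuminated table, so select dark pixels
_, mask = cv2.threshold(grey, 120, 255, cv2.THRESH_BINARY_INV)

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
for contour in contours:
    x, y, w, h = cv2.boundingRect(contour)
    if w * h > 500:  # ignore tiny specks
        cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("frame_blobs.jpg", image)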

Construction

I used the following components to build my version of the trap:

I had not attempted any electronics since childhood projects with my father (other than some atrocious attempts to solder loose wires). The following resources helped me:

  • This YouTube Soldering Crash Course showed me everything I had got wrong with my earlier efforts at soldering – it’s actually all surprisingly easy
  • Charles Platt’s book, Make: Electronics, is a very helpful guide to the basics of using simple components in circuits

Some notes on issues I encountered or decisions I made:

  • The Huion light box, like perhaps all other models on sale, is turned on and off and dimmed or brightened with a touch sensor behind the glass. Whenever the unit is first powered on, the light is initially off. As I understand it, the Danish team turn the light box on manually when the trap is in use. The backing sheet on this Huion unit is attached with a tacky glue and can be peeled back. This makes it possible to desolder the wires that feed the LED strip and attach them directly to where the 12V DC input connects. This bypasses the touch sensor. The touch sensor in this model has negligible resistance when the light box is at its brightest, so no additional resistor is required when bypassing it.
  • The change to the light box allowed me to turn it on and off automatically in conjunction with the other lights. I included the second potentiometer so that I could vary its voltage below 12V. (The other potentiometer was already in the reference circuit to control the light ring.)
  • Bjerge et al. used a 12V battery to power their trap. Since I plan to operate the trap in areas with access to mains electricity, I used the laptop-style power supply listed above. I used the second DC-DC converter to power the Raspberry Pi at 5.1V from the same source. (The first converter was to provide 5V power for the light ring, but that circuit is off until the Raspberry Pi turns it on.)
  • Bjerge et al. included a 75 Ohm resistor to limit the voltage across the transistor and the coil of the relay. Using the same resistor in my circuit with a mains DC supply that is doubtless at a higher voltage than a battery, the Raspberry Pi correctly activated the transistor through its GPIO pin, which caused the relay to close and power the lights. However, the relay remained closed once the GPIO pin was reset to 0V. Replacing the 75 Ohm resistor with a 300 Ohm one allowed the circuit to work as expected.
  • I wired everything inside the enclosure, with holes for the power cable, the cables for the three lights, the lens of the webcam and the ventilation grille (with two layers of the fly screen). The vent was to ensure that the Raspberry Pi does not overheat and seemed the simplest reasonably rain- and insect-proof solution I could find.
  • I printed a 3D model in ABS to hold the various components in place.
  • I mistakenly tried using a shorter USB cable to connect the camera to the Raspberry Pi. I plugged it into a USB 3.0 port but was stupid enough not to know that USB 3.0 cables are different from USB 2.0 ones. The short cable only supported USB 2.0 and limited the camera to HD instead of Ultra HD 4K resolution.
  • It would have been better to have used an external knob and potentiometer to adjust the brightness of the light box (rather than the one I soldered into the circuit).
  • Using UV LEDs (as with the LepiLED) might be a simpler solution in place of the 15W tube. I may experiment with such an alternative.

Operation

Moth trap in operation

The webcam is positioned around 220 mm from the light table. It is configured to focus at this range and to use an exposure of around 130 ms. The light box is kept relatively dim simply to offer some contrast and most light comes from the ring light. I am using a white cotton pillowcase over the light table to provide an attractive landing surface for insects.
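
Those camera settings are applied by the setCamera.sh script referenced in the cron table below. I have not reproduced that script here, but a Python equivalent might shell out to v4l2-ctl along these lines; the device path, control names and values are assumptions and vary between driver versions:

# Sketch of fixing webcam focus and exposure via v4l2-ctl
import subprocess

DEVICE = "/dev/video0"  # assumed device path for the webcam

def set_ctrl(name, value):
    subprocess.check_call(
        ["v4l2-ctl", "-d", DEVICE, f"--set-ctrl={name}={value}"])

set_ctrl("focus_auto", 0)            # disable autofocus
set_ctrl("focus_absolute", 30)       # fixed focus near 220 mm (assumed value)
set_ctrl("exposure_auto", 1)         # manual exposure mode
set_ctrl("exposure_absolute", 1300)  # ~130 ms in units of 100 microseconds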

I’ve set the trap to come on automatically at 18:00 and turn off around 05:00. The cron settings are:

59 17 * * * python /home/pi/lighton.py
00 18 * * * motion
01 18 * * * /home/pi/setCamera.sh
00 05 * * * pkill motion
01 05 * * * python /home/pi/lightoff.py

At present I control the Raspberry Pi entirely through a PuTTY ssh connection and access the images by FTP through FileZilla. I plan to automate transfer of the images to a computer so they can be processed each day.
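
One simple way to automate that transfer would be a nightly rsync job run from the receiving computer after the trap shuts down its lights; the host name and directories below are placeholders of my own, not the actual configuration:

# Sketch of pulling each night's images from the trap over ssh
import subprocess

subprocess.check_call([
    "rsync", "-av", "--remove-source-files",
    "pi@moth-trap.local:/home/pi/motion/",  # assumed image directory
    "/data/moth-images/",                   # assumed local destination
])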

Right now, it’s winter in Canberra, often reaching below freezing for some of the night, so there are relatively few insects. I’ve uploaded some examples (including a number captured only at HD resolution) to iNaturalist.

The trap is now more or less ready for collecting what I hope will be many thousands of images in the warmer months.

Results

The MCC-trap software has not been trained for Australian moths, but this video is a test of how it works with the set of images collected overnight, 3/4 June 2021:

ArabaAMT-20210604

For the coming months, the trap will operate as a wildlife camera trap. I will segment the images, add identifications and build training sets for classifying the commoner species here. I’d like to get to the stage where I could even fully automate some records being submitted to iNaturalist.

As can be seen in the video, the out-of-the-box processing selects a fair number of background areas and misses some insects. It may be useful for me to spend some time training a model first to distinguish between things that interest me (insects and other animals) and things that don’t (shadows, blurs, etc.). This should simplify extraction of the segmented images for training an identification model.
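
One way to approach that filtering step would be transfer learning on a small two-class training set (insect vs background). The sketch below uses PyTorch and torchvision; the directory layout, model choice and hyperparameters are all assumptions of mine, not a tested pipeline:

# Sketch of a two-class (insect vs background) filter fine-tuned from a
# pretrained network; paths and hyperparameters are assumptions
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Expects training_data/insect/*.jpg and training_data/background/*.jpg
dataset = datasets.ImageFolder("training_data", transform=transform)
loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

model = models.resnet18(pretrained=True)
model.fc = nn.Linear(model.fc.in_features, 2)  # insect vs background

criterion = nn.CrossEntropyLoss()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)

for epoch in range(5):
    for images, labels in loader:
        optimiser.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimiser.step()

torch.save(model.state_dict(), "insect_filter.pt")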
