This project seeks to build an automated moth trap with a machine learning identification model for Canberra moths (and other insects) based on the design published earlier this year by Kim Bjerge and colleagues in the journal Sensors:
Bjerge, K.; Nielsen, J.B.; Sepstrup, M.V.; Helsing-Nielsen, F.; Høye, T.T. An Automated Light Trap to Monitor Moths (Lepidoptera) Using Computer Vision-Based Tracking and Deep Learning. Sensors 2021, 21, 343. https://doi.org/10.3390/s21020343
I want to thank Kim Bjerge and Toke Thomas Høye for their generous assistance answering questions and sharing code.
This is the first post in a series:
- Autonomous Moth Trap Project
- Autonomous Moth Trap Hardware Revisions
- Hardware and Software updates to Autonomous Moth Trap
- Alternatives for Autonomous Moth Trap
- Software components for Autonomous Moth Trap
- Time Lapse module for Autonomous Moth Trap
Background
I started using a moth trap in the mid-nineties in the UK. At that time, I used a Heath trap with a 6W actinic tube and drew pictures of moths with crayons since I did not have a digital camera. Over the years, I’ve used a wide variety of moth traps and moth lights, including 15W actinic tubes and 125W MV lamps in various configurations, mainly from Anglian Lepidopterist Supplies in the UK, and the LepiLED lights from Dr Gunnar Brehm. Since 1999, I’ve used a series of different digital cameras to document the insects I’ve attracted (see Flickr and iNaturalist).
All of these efforts have in some small way contributed to knowledge of biodiversity. Observation records flow into the Global Biodiversity Information Facility and the Atlas of Living Australia. However, every one of these records is a response to random and non-standard circumstances. In one year, I may operate a light for many nights and photograph many insects. In another year, I may do much less.
This lack of standardisation is common throughout natural history and citizen science. In recent years, there has been growing concern at the apparent collapse in insect numbers in many regions of the world, but there are few places from which we have genuinely comparable data over long periods. We therefore cannot easily assess how significant the changes have been. We certainly have no scalable way to monitor insect populations and understand how composition and abundance varies across space and through time.
I have therefore been interested for a long time in ways that we could automate detection and monitoring for insects and other more cryptic groups. One obvious route is to build the tools for DNA-based monitoring. Hence my enthusiasm for Malaise trapping as a path to building the necessary reference libraries.
One other idea I have been keen to explore is relatively simple automation of imaging of insects coming to light. I sketched this (possibly self-explanatory) concept in July 2018, but never found the time to build it. It basically involves bright insect-attracting lights, on a timer so that they can leave the insects in peace before dawn, and (in my sketch) two cameras to image species landing on a vertical surface and on the ground below.

The purpose was simply unsupervised collection of photographs that I would then upload to iNaturalist or elsewhere with my manual identifications.
Design
Early in 2021, I came across the paper by Kim Bjerge and his colleagues on an automated trap to detect and recognise a subset of Danish noctuid moth species.

Bjerge et al.’s design involves the following components:
- Raspberry Pi 4 to control lights and webcam (with motion detection software)
- 15W actinic tube operated with 12V DC (to attract moths)
- A3 LED light table (as contrasting surface on which moths can rest)
- Logitech BRIO 4K Ultra HD webcam
- LED ring light (to illuminate light table from camera side)
- A circuit that allows the Raspberry Pi to turn the moth light and the ring light on and off at scheduled times
The Raspberry Pi software includes Python scripts to turn the lights on and off (through one of the GPIO pins) and runs the Motion software (Motion Project at GitHub) for motion-triggered image capture.
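For illustration, a minimal light-control script might look like the sketch below. This is not the original lighton.py: the pin number is an assumption to be matched to your wiring, and the corresponding lightoff script would simply drive the pin low.

```python
# Minimal sketch of a light-control script (not the original lighton.py).
# Assumes the relay transistor is driven from BCM pin 17 - adjust to your wiring.
import RPi.GPIO as GPIO

LIGHT_PIN = 17  # hypothetical BCM pin number

GPIO.setmode(GPIO.BCM)
GPIO.setup(LIGHT_PIN, GPIO.OUT)
GPIO.output(LIGHT_PIN, GPIO.HIGH)  # GPIO.LOW in the matching lightoff script
# Deliberately no GPIO.cleanup() here: cleanup resets the pin to an input,
# which would release the relay and turn the lights back off.
```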
Images collected by the system can then be processed using the Python Moth Classification and Counting software developed by Kim Bjerge (MCC-trap at GitHub). As described in the Sensors paper, this detects blobs in the images, tracks movement of the (presumed) same blob between images, and uses a Convolutional Neural Network model to classify the blobs based on a training set. In the Danish experiment, the training set focused on a small number of common noctuoid moths (eight classes).
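The MCC-trap repository holds the real implementation. Purely as a sketch of the blob-detection step described above, a simple OpenCV version might look like the following (the threshold approach and minimum area are invented for illustration):

```python
# Rough sketch of the blob-detection idea (not the actual MCC-trap code).
# Finds regions that contrast with the evenly lit light-table background.
import cv2

image = cv2.imread("frame.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Otsu thresholding separates darker insects from the bright background.
_, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

# Each sufficiently large contour is treated as a candidate insect blob.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
for c in contours:
    x, y, w, h = cv2.boundingRect(c)
    if w * h > 2000:  # arbitrary minimum area in pixels
        crop = image[y:y + h, x:x + w]  # crop that would go to the CNN classifier
        cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("frame_blobs.jpg", image)
```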
Construction

I used the following components to build my version of the trap:
- LABISTS Raspberry Pi 4 Complete Kit with Pi 4 Model B 8GB RAM Board, 128GB Micro SD
- Anglian Lepidopterists SK04 15W Portable 12 volt Actinic Trap Kit (actually an older version of the ALS 15W control box)
- Huion A3 Thin LED Light Box
- Neewer 6-inch USB Ring Light
- Logitech BRIO Ultra HD Webcam
- 12V 5A Power Adapter AC 100-220V to DC 60W Power Supply
- Digi-Key general-purpose relay SPST 10A 9V
- Digi-Key 100 Ohm 0.5W top thumbwheel potentiometer (x2)
- Digi-Key DC-DC converter 5V 10W
- Digi-Key NPN 45V 100mA TO92-3 transistor
- Digi-Key 0.47 Ohm 50W resistor
- Digi-Key 510 Ohm 5W resistor
- Digi-Key 75 Ohm 0.25W resistor
- Jaycar 5V DC to DC converter module
- Jaycar Sealed ABS Enclosure – 222 x 146 x 55mm
- Jaycar Vero type strip 95mm x 75mm
- Bunnings Haron 100 x 75mm Silver Aluminium Pressed Wall Vent
- Bunnings Permastik Fly Screen Repair Patches
- Assorted screws, wires, glands and heat shrink tubing
Apart from some atrocious attempts to solder loose wires, I had not attempted any electronics since I was a young child working alongside my father. The following resources helped me:
- This YouTube Soldering Crash Course showed me everything I had got wrong with my earlier efforts at soldering – it’s actually all surprisingly easy
- Charles Platt’s book, Make: Electronics, is a very helpful guide to the basics of using simple components in circuits
Some notes on issues I encountered or decisions I made:
- The Huion light box, like perhaps all other models on sale, is turned on and off and dimmed or brightened with a touch sensor behind the glass. Whenever the unit is first powered on, the light is initially off. As I understand it, the Danish team turn the light box on manually when the trap is in use. The backing sheet on this Huion unit is attached with a tacky glue and can be peeled back. This makes it possible to desolder the wires that feed the LED strip and attach them directly to where the 12V DC input connects. This bypasses the touch sensor. The touch sensor in this model has negligible resistance when the light box is at its brightest, so no additional resistor is required when bypassing it.
- The change to the light box allowed me to turn it on and off automatically in conjunction with the other lights. I included the second potentiometer so that I could reduce the voltage supplied to the light box below 12V. (The first potentiometer was already in the reference circuit to control the ring light.)
- Bjerge et al. used a 12V battery to power their trap. Since I plan to operate the trap in areas with access to mains electricity, I used the laptop-style power supply listed above. I used the second DC-DC converter to power the Raspberry Pi at 5.1V from the same source. (The first converter was to provide 5V power for the ring light, but that circuit is off until the Raspberry Pi turns it on.)
- Bjerge et al. included a 75 Ohm resistor to limit the current from the GPIO pin into the transistor that switches the relay. Using the same resistor in my circuit with a mains DC supply, which presumably delivers a somewhat higher voltage than a battery, the Raspberry Pi correctly activated the transistor through its GPIO pin, which caused the relay to close and power the lights. However, the relay remained closed once the GPIO pin was reset to 0V. Replacing the 75 Ohm resistor with 300 Ohm allowed the circuit to work as expected (see the back-of-envelope calculation after this list).
- I wired everything inside the enclosure, with holes for the power cable, the cables for the three lights, the lens of the webcam and the ventilation grille (with two layers of the fly screen). The vent was to ensure that the Raspberry Pi does not overheat and seemed the simplest reasonably rain- and insect-proof solution I could find.
- I printed a 3D model in ABS to hold the various components in place.
- I mistakenly tried using a shorter USB cable to connect the camera to the Raspberry Pi. I plugged it into a USB 3.0 port but did not realise that USB 3.0 cables are different from USB 2.0 ones. The short cable supported only USB 2.0, which limited the camera to HD instead of Ultra HD 4K resolution.
- It would have been better to have used an external knob and potentiometer to adjust the brightness of the light box (rather than the one I soldered into the circuit).
- Using UV LEDs (as with the LepiLED) might be a simpler solution in place of the 15W tube. I may experiment with such an alternative.
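On the resistor change above: a back-of-envelope check, assuming the resistor sits in series with the base of the NPN transistor (the usual arrangement) and a typical base-emitter drop of about 0.7V, is sketched below. This does not fully explain why the relay latched with 75 Ohm, but it does show that 300 Ohm still provides ample base drive while drawing far less current from the GPIO pin.

```python
# Rough base-current estimate for the relay-driver transistor.
# Assumes a 3.3 V GPIO signal and a ~0.7 V base-emitter drop (typical NPN).
V_GPIO = 3.3
V_BE = 0.7

for r_ohms in (75, 300):
    i_base_ma = (V_GPIO - V_BE) / r_ohms * 1000
    print(f"{r_ohms} Ohm -> ~{i_base_ma:.1f} mA base current")

# 75 Ohm  -> ~34.7 mA, well above the ~16 mA a Pi GPIO pin should source
# 300 Ohm -> ~8.7 mA, within limits and still plenty to saturate a
#            small-signal NPN switching a 9 V relay coil
```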
Operation

The webcam is positioned around 220 mm from the light table. It is configured to focus at this range and to use an exposure of around 130 ms. The light box is kept relatively dim simply to offer some contrast and most light comes from the ring light. I am using a white cotton pillowcase over the light table to provide an attractive landing surface for insects.
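I have not reproduced the actual setCamera.sh script (referred to in the cron table below) here, but a hypothetical equivalent is sketched next, assuming the BRIO appears as /dev/video0 and using the v4l2-ctl utility. Control names vary between kernel and driver versions, so check the output of v4l2-ctl -l on your own system.

```python
# Hypothetical camera setup (not the actual setCamera.sh).
# Assumes the webcam is /dev/video0; control names vary by kernel/driver,
# so confirm them with `v4l2-ctl -l` before using.
import subprocess

controls = {
    "focus_automatic_continuous": 0,   # disable autofocus
    "focus_absolute": 30,              # fixed focus; value found by experiment
    "auto_exposure": 1,                # 1 = manual exposure on UVC cameras
    "exposure_time_absolute": 1300,    # units of 100 microseconds, i.e. ~130 ms
}

for name, value in controls.items():
    subprocess.run(
        ["v4l2-ctl", "-d", "/dev/video0", "--set-ctrl", f"{name}={value}"],
        check=True,
    )
```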
I’ve set the trap to come on automatically at 18:00 and turn off around 05:00. The cron settings are:
# 17:59 - turn on the moth light, light box and ring light
59 17 * * * python /home/pi/lighton.py
# 18:00 - start Motion for motion-triggered image capture
00 18 * * * motion
# 18:01 - apply webcam focus and exposure settings after Motion starts
01 18 * * * /home/pi/setCamera.sh
# 05:00 - stop image capture
00 05 * * * pkill motion
# 05:01 - turn the lights off
01 05 * * * python /home/pi/lightoff.py
At present I control the Raspberry Pi entirely through a PuTTY telnet connection and access the images by FTP through FileZilla. I plan to automate transfer of the images to a computer so they can be processed each day.
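One simple way to do this, sketched below with placeholder hostname and paths, would be a nightly rsync pull run by cron on the processing computer:

```python
# Hypothetical daily image transfer, run by cron on the processing computer.
# Hostname, user and paths are placeholders - adjust for your own setup.
import subprocess

subprocess.run(
    [
        "rsync", "-av", "--remove-source-files",
        "pi@raspberrypi.local:/home/pi/motion/",  # Motion output directory (assumed)
        "/data/mothtrap/images/",
    ],
    check=True,
)
```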
Right now, it’s winter in Canberra, often dropping below freezing for part of the night, so there are relatively few insects. I’ve uploaded some examples (including a number captured only at HD resolution) to iNaturalist.
The trap is now more or less ready for collecting what I hope will be many thousands of images in the warmer months.
Results
The MCC-trap software has not been trained for Australian moths, but this video is a test of how it works with the set of images collected overnight, 3/4 June 2021:
For the coming months, the trap will operate as a wildlife camera trap. I will segment the images, add identifications and build training sets for classifying the commoner species here. I’d like to get to the stage where I could even fully automate some records being submitted to iNaturalist.
As can be seen in the video, the out-of-the-box processing selects a fair number of background areas and misses some insects. It may be useful for me to spend some time training a model first to distinguish between things that interest me (insects and other animals) and things that don’t (shadows, blurs, etc.). This should simplify extraction of the segmented images for training an identification model.
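As a starting point for such an insect-versus-background filter, the sketch below shows a minimal two-class transfer-learning setup in PyTorch (recent torchvision assumed; the paths, class folders and hyperparameters are all placeholders, since the real training set is still to be assembled):

```python
# Minimal sketch of a two-class (insect vs background) classifier using
# transfer learning. Paths and hyperparameters are placeholders.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Expects crops sorted into train/insect and train/background folders.
dataset = datasets.ImageFolder("train", transform=transform)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# Start from an ImageNet-pretrained backbone; replace the final layer.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # two classes

optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimiser.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimiser.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")

torch.save(model.state_dict(), "insect_filter.pt")
```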