Counting flies: how a camera inspired by nature is helping waste recycling
In this thought piece, Dr Alexandre Marcireau, a researcher in Neuromorphic Engineering at the International Centre for Neuromorphic Systems at Western Sydney University (WSU), describes the application of the neuromorphic camera, a technology inspired by biology, to accurately observe fast-moving insects such as flies. It is a crucial technology that will enable an NSSN project carried out in collaboration between WSU, Macquarie University and ARC Ento Tech, which uses fly larvae to turn mixed solid waste into useful products such as chicken feed.
To cameras and our eyes alike, absolute speed hardly matters. The Moon speeds above our heads at 3,683 km/h, yet it seems almost static in the sky. By contrast, a cricket ball leaves one's field of view in a fraction of a second, despite its modest speed of 150 km/h. The apparent speed of an object is dictated by how quickly the viewing angle between you and the object changes as it moves past you. This is known as the angular speed, and it depends on both absolute speed and distance.
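As a back-of-the-envelope illustration of that relation, the small-angle approximation gives an angular speed of roughly v/d for an object moving at speed v across the line of sight at distance d. The sketch below works through the Moon and the cricket ball; the viewing distance for the ball is an assumption chosen for illustration, not a figure from the article.

```python
import math

def angular_speed_deg_per_s(speed_m_per_s: float, distance_m: float) -> float:
    """Angular speed (degrees per second) of an object moving across the
    line of sight, using the small-angle relation omega = v / d."""
    return math.degrees(speed_m_per_s / distance_m)

# The Moon: ~3,683 km/h at ~384,400 km (average Earth-Moon distance).
moon = angular_speed_deg_per_s(3683 / 3.6, 384_400_000)

# A cricket ball: ~150 km/h, watched from an assumed 20 m away.
ball = angular_speed_deg_per_s(150 / 3.6, 20)

print(f"Moon: {moon:.5f} deg/s")  # ~0.00015 deg/s: essentially static
print(f"Ball: {ball:.0f} deg/s")  # ~120 deg/s: gone in a fraction of a second
```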
While this fact may seem trivial, it is extremely important when observing insects, whose angular speed is often remarkably high, which makes detecting, tracking, and observing them extremely challenging. Insect monitoring has a wide range of uses, including some crucially important applications such as optimising crop yields by observing the pollination patterns of bees or enhancing disease control by intercepting mosquitoes. More surprisingly, insects can even be used to recycle garbage at low cost. ARC Ento Tech, an AgriTech start-up in Somersby, is pioneering techniques that leverage the larvae of a common fly known as the Black Soldier fly. These larvae convert mixed solid waste into useful materials. The number of larvae they can grow, and hence the throughput of the recycling pipeline, depend on their ability to breed adult flies in large numbers. Optimising breeding requires precise knowledge of the number of flies that move into the breeding room through a hatch, which brings us back to the observation of fast-moving objects.
Counting fast objects is typically done with high-speed cameras, for instance on industrial assembly lines. While such cameras are sensitive and accurate, they generate several gigabytes of data per second. Power-hungry computers, drawing up to a hundred watts, are needed to process these vast amounts of data in real time. This significantly complicates the deployment of such systems at scale or in natural environments with no easy access to electricity.
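To see where those gigabytes come from, consider a rough estimate; the resolution, frame rate, and bit depth below are assumptions chosen for illustration, not specifications from the project.

```python
# Illustrative data-rate estimate for a high-speed frame camera.
width, height = 1280, 1024     # pixels (assumed)
frames_per_second = 1000       # a typical high-speed rate (assumed)
bytes_per_pixel = 1            # 8-bit greyscale (assumed)

bytes_per_second = width * height * frames_per_second * bytes_per_pixel
print(f"{bytes_per_second / 1e9:.1f} GB/s")  # ~1.3 GB/s before any compression
```

Doubling the frame rate or moving to colour pushes the figure to several gigabytes per second, all of which must be moved and processed in real time.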
To fix this issue and to implement low-power high-speed vision systems, we need to fundamentally re-design our cameras. A conventional video camera is a fast photo camera that captures pictures at fixed intervals. For mostly static scenes, such as insects flying in front of a fixed background, the successive frames are nearly identical. In other words, the camera captures a lot of redundant data. This is the reason for the high data rates and power consumption of these systems. Fortunately, nature has evolved an excellent solution to this problem. Biological eyes, from the human eye to the compound eyes of flies and bees, do not capture frames. Instead, they are made up of vast numbers of tiny light sensors known as photoreceptors, which monitor the animal’s field of view and send data back to the brain only when needed, rather than at regular intervals like a conventional camera. This sensing strategy provides biological systems with high-speed, low-latency data that lets them vastly outperform our technological solutions in terms of speed and power consumption.
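The core idea, stripped of all biological detail, is change-based sensing: report a pixel only when its brightness changes. The toy sketch below makes that concrete; it is a frame-to-frame difference detector written for clarity, not the analogue circuit used in real photoreceptors or neuromorphic sensors.

```python
import numpy as np

def change_events(prev_frame: np.ndarray, new_frame: np.ndarray, threshold: float = 10.0):
    """Toy change-based sensing: return only the pixels whose brightness
    changed by more than `threshold`, instead of the whole frame."""
    diff = new_frame.astype(float) - prev_frame.astype(float)
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    # Each 'event' carries a pixel location and the sign of the change.
    return [(int(x), int(y), 1 if diff[y, x] > 0 else -1) for x, y in zip(xs, ys)]

# A static background produces no events at all; a fly crossing the view
# touches only a handful of pixels, however fast it moves.
```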
The field of Neuromorphic Engineering takes inspiration from biology to design sensors and computers that aim to catch up with the impressive efficiency of biological systems. Neuromorphic systems are made of silicon, like their conventional counterparts, but their structure and approach to data processing mimic eyes and brains. The most mature sensors to have come out of Neuromorphic Engineering research are known as event-based cameras. Each pixel of an event-based camera asynchronously detects changes in its field of view and independently sends that information to the computer. Pixels that do not detect any change, for instance pixels that face a static background, do not generate any data. By contrast, pixels that detect a moving object send that information to the computer immediately, with a timing accuracy of about a millionth of a second. The result is a camera with the temporal resolution of a high-speed conventional camera (equivalent to tens to hundreds of thousands of frames per second) that generates so little data that its output can be processed with a smartphone or a small computer.
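To give a feel for how lightweight that processing can be, here is a hypothetical sketch that counts flies passing through a hatch from a stream of events. The (timestamp, x, y, polarity) event format is typical of event-based cameras, but the counting logic is an illustrative assumption, not the method used in the project; a real system would track objects rather than count bursts on a single row of pixels.

```python
from dataclasses import dataclass

@dataclass
class Event:
    t: int         # timestamp in microseconds
    x: int         # pixel column
    y: int         # pixel row
    polarity: int  # +1 for a brightness increase, -1 for a decrease

def count_crossings(events, line_y: int, quiet_us: int = 20_000) -> int:
    """Count bursts of activity on one row of pixels (imagined as a virtual
    trip-wire across the hatch). Events closer together than `quiet_us`
    microseconds are treated as the same fly passing."""
    crossings = 0
    last_event_t = None
    for ev in events:  # events arrive already sorted by timestamp
        if ev.y == line_y:
            if last_event_t is None or ev.t - last_event_t > quiet_us:
                crossings += 1
            last_event_t = ev.t
    return crossings
```

Because only the pixels on the virtual trip-wire ever produce data, this loop handles even very busy scenes with a trivial amount of computation.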
Western Sydney University and Macquarie University have received funding from NSSN and ARC Ento Tech to explore the use of Neuromorphic cameras to count insects and optimise recycling pipelines. Event-based cameras can also be used to record flying patterns and wing beats and may provide insight into the insects’ behaviour and well-being. What is more, the techniques and insights that will come out of this project may open the door to other applications of Neuromorphic technologies in the insect world, such as beehive monitoring.
While the field of Neuromorphic Engineering has roots going all the way back to Caltech in the 1980s, Neuromorphic sensors have only recently become mature enough to meaningfully compete with conventional solutions. We have barely scratched the surface of the problems they can tackle, from sport science to space applications. Smart sensors are the invisible backbone of all modern technology, and Neuromorphic Engineering can dramatically improve them by gleefully stealing clever ideas from nature.