What do birds get up to at night? It was a simple question that puzzled scientists for hundreds of years. There were wild theories that birds spent the winter underwater, or immersed themselves in mud. In the 19th century, a stork was found in Germany with an African spear through its neck, providing evidence that birds in fact migrated. And in 1881, one scientist observed migrating birds at night by pointing a telescope up at the moon.
Moon-watching for birds remained a niche science. It works like the transit method in astronomy, in which exoplanets are detected when their silhouettes pass in front of a star. Ornithologist George Lowery began quantifying the technique in the 1950s, organizing massive campaigns to collect nationwide data from lunar observations. Between twilight and dawn, Lowery’s crews would watch the full moon and mark the number of birds they saw, along with their pathways and flight directions. Because the technology of the time was relatively crude, they treated the moon’s face as a circular clock face and recorded the “time” (meaning the position) at which each bird entered and exited.
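Those clock-face entry and exit positions translate directly into a flight direction across the moon’s disk. A minimal sketch of the geometry in Python (the function names and the 12-at-top, clockwise convention are assumptions for illustration; a real analysis would also have to correct for the moon’s orientation in the sky):

```python
import math

def clock_to_point(hour):
    """Convert a clock-face position (12 at top, running clockwise)
    to an x, y point on a unit circle representing the moon's rim."""
    theta = math.radians(90 - hour * 30)  # 12 -> top, 3 -> right, 6 -> bottom
    return math.cos(theta), math.sin(theta)

def apparent_heading(entry_hour, exit_hour):
    """Compass-style bearing (degrees) of the bird's path across the disk,
    in the image frame: 0 = straight up, 90 = left-to-right."""
    x1, y1 = clock_to_point(entry_hour)
    x2, y2 = clock_to_point(exit_hour)
    bearing = math.degrees(math.atan2(x2 - x1, y2 - y1))
    return round(bearing, 6) % 360  # round away float fuzz near 0/360

# A bird entering at 6 o'clock (bottom) and exiting at 12 (top)
# crosses the disk heading straight "up" in the image frame:
print(apparent_heading(6, 12))  # 0.0
```

The same arithmetic, run over thousands of observer records, is what let Lowery turn moonlit silhouettes into nationwide migration maps.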
“What we’re trying to do is automate that with our little robot,” says Wesley Honeycutt, a research associate at the University of Oklahoma. “Because while Lowery’s technique is useful, it’s painful. I have stared at the moon so much in the past few years.”
Honeycutt is referring to LunAero, which he and his team at the University of Oklahoma created. The hardware consists of a camera to record video, a small computer, a spotting scope, and a motorized mount. The system uses simple computer vision techniques to keep the moon in view and follow it across the sky, and it can pick up birds that a human observer might miss. The mount can accommodate a wide variety of telescopes, so birders and ornithologists can bring their own.
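The tracking loop behind that can be pictured in a few lines. This is only a sketch of the general idea, not LunAero’s actual code: the frame format, threshold, deadband, and function names are all invented for illustration. It finds the bright blob’s centroid, measures its offset from the frame center, and decides which way to nudge the motors.

```python
def moon_offset(frame, threshold=200):
    """Find the bright moon blob's centroid and its offset from frame center.

    `frame` is a 2D list of grayscale pixel values (0-255). Returns (dx, dy),
    how far the centroid sits from center in pixels, or None if no bright
    blob is found (e.g. the moon is behind a cloud).
    """
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, px in enumerate(row):
            if px >= threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    cx = sum(xs) / len(xs)
    cy = sum(ys) / len(ys)
    h, w = len(frame), len(frame[0])
    return cx - w / 2, cy - h / 2

def motor_command(offset, deadband=2.0):
    """Turn an offset into a coarse mount nudge, ignoring tiny jitters."""
    if offset is None:
        return "hold"  # moon lost -- wait rather than hunt blindly
    dx, dy = offset
    horiz = "right" if dx > deadband else "left" if dx < -deadband else ""
    vert = "down" if dy > deadband else "up" if dy < -deadband else ""
    return (horiz + " " + vert).strip() or "hold"

# A bright blob up and to the left of center steers the mount that way:
frame = [[0] * 10 for _ in range(10)]
frame[1][1] = 255
print(motor_command(moon_offset(frame)))  # left up
```

Run once per frame, a loop like this keeps the moon centered all night without a human touching the scope.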
In addition to video recordings, the system generates a log file with the recording time, frame counts, and camera sensor settings. The team is developing software to analyze the videos LunAero collects, but for now, humans have to manually extract the frames that contain birds and annotate their flight paths and patterns (such as whether they fly slowly or quickly). The initial tests ran in April and May of 2018 and 2019, peak months for bird migration.
The researchers have already been able to glean a great deal of information from the data they have so far. “It depends on conditions and how high the birds are. There are certainly some birds that you can identify [their] genus, maybe [their] species if you make some assumptions about where you are and what birds you’re likely to see,” says Eli Bridge, an ornithologist and assistant professor at the University of Oklahoma. “There are some birds with really characteristic flight patterns that you can pick out—we can see nighthawks swoop up and down. In addition to just counting them, you can get really accurate flight directions for them. You can visualize wind drift.”
The goal for the team is to use this technique as a complement to other bird migration tracking tools, such as radar aeroecology.
“Radar aeroecology is cool because you can see the migrants and how they fly out from a city or a roost and how they flow and the height, but you can’t tell what you’re looking at,” Honeycutt says. “You can see that there is a water-balloon [shaped mass of something] in that general area of the sky, but you can’t tell what kind of bird, if it’s 12 birds, three insects in a trench coat—we don’t really know until we have a way to look at it.”
Anyone can build a LunAero unit with materials found around a workshop and motors ordered off Amazon. The parts list and assembly instructions are publicly available. The components, without a telescope, cost about $150; the most expensive is the Raspberry Pi computer that powers the system. “One of the advantages of having these really cheap instruments is that you can deploy a lot of them all at once,” Honeycutt says. “And if you have all of these crummy sensors deployed next to each other, you eventually will hit a critical mass of sensors where you’re starting to produce data that are on par with high quality instrumentation.”
Since publishing the LunAero design in a 2020 paper in the journal HardwareX, Bridge and Honeycutt have continued to make hardware upgrades, and have sent devices out for birders to try.
“Ideally it would be a citizen science tool. I don’t know if we’re there yet. There’s been a few niggling things that have made it difficult,” Bridge says. If the weather turns poor or a cloud covers the moon, the device has to be reconfigured throughout the night. The observations are also tied to the lunar cycle: the amount of data LunAero can collect plummets when the moon is less than half full.
“There’s been the constant little tweaks, the incremental improvements of how do you hold the camera stable on a lot of different scopes,” Honeycutt notes. “And while it’s not a big jump in concept in the hardware, it’s the little things that are growing the potential user base.”
The group is working toward one of its first data papers, on social behavior that LunAero can capture and quantify. “You can tell whether birds are flying by themselves or in a cluster or sometimes a formation, sometimes just a loose group,” Bridge says. “I don’t think there’s any other way to see that at night unless you’ve got a spotlight or an infrared camera, or some other way of directly observing the birds.”
Data analysis is the big barrier to putting a widespread data paper out there. “It takes a bit longer to analyze the data than to collect it because you basically have to cycle through frame by frame,” Bridge says. “We don’t have the means to process lots of videos from lots of people right now.”
And while it might seem like a tool such as machine learning could help out, unfortunately, that might not yet be possible. “If you look at the video, a lot of those birds are one pixel,” Honeycutt says. “Distinguishing one pixel that is an actual bird from 10,000 pixels of not-bird is a non-trivial problem that I don’t think machine learning techniques can handle yet. That’s why we’re doing more of a naive system which is more computationally intensive.”
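That naive approach boils down to brute-force change detection. A sketch of the idea, assuming grayscale frames as 2D pixel lists (the threshold and names are invented for illustration, not the team’s actual pipeline): compare every pixel against the previous frame and flag sudden darkenings, which is exactly the kind of exhaustive per-pixel work that makes the method computationally intensive.

```python
def dark_transients(prev_frame, frame, drop=60):
    """Flag pixels that darken sharply between frames -- candidate silhouettes.

    Both frames are same-size 2D lists of grayscale values (0-255). Even a
    one-pixel bird crossing the bright disk shows up as a sudden brightness
    drop, while sensor noise mostly stays under the threshold.
    """
    hits = []
    for y, (prev_row, row) in enumerate(zip(prev_frame, frame)):
        for x, (p, c) in enumerate(zip(prev_row, row)):
            if p - c >= drop:
                hits.append((x, y))
    return hits

# A uniformly bright disk, then one pixel blotted out by a passing "bird":
prev = [[250] * 5 for _ in range(5)]
cur = [row[:] for row in prev]
cur[2][3] = 100
print(dark_transients(prev, cur))  # [(3, 2)]
```

Checking every pixel of every frame this way scales poorly, which is why the team still leans on human review for now.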
Right now, the group is working on proof-of-concept and early data papers for the instrument. But in five years, they imagine that it could become a unique add-on to the suite of existing migration technology.
“If you have tracking devices on [birds], or if you’re seeing them by radar, you can’t directly observe the birds. Being able to directly observe the birds, even if it’s while they fly across the moon, is a unique set of data,” says Jeff Kelly, a professor of biology at the University of Oklahoma. “There’s always going to be value in integrating those data with tracking data where you’re getting information about where and when the bird flew but you can’t see it directly.”
There are still many mysteries that remain when it comes to understanding why and how birds migrate. Do birds fly together, do they fly separately, are they responding to the same wind conditions, are they all flying at the same heights? “It’s hard for us to concretely understand what these birds are dealing with,” says Kelly. “When we start talking about infrastructure that we build in the air, or bird collisions with buildings and problems with lights at night, this kind of data where people can concretely observe what’s going on will have a big impact on their ability to grasp the problem.”