AI-Enabled Avian-Solar Monitoring

EVS is developing an AI-enabled camera system to collect a large volume of data on avian-solar interactions to understand potential impacts of solar energy development on bird populations.

Accurate understanding of the interactions between birds and solar energy infrastructure is important for the continued deployment of utility-scale solar energy facilities. Current monitoring methods for avian-solar interactions, which rely on periodic surveys for bird carcasses, are costly, infrequent, spatially constrained, and subject to errors related to searcher efficiency and carcass predation. These methods account for collisions but do not capture other avian-solar interactions, such as perching and fly-through, that could provide information on the occurrence and intensity of bird attraction to solar energy facilities.

In collaboration with Argonne’s Strategic Security Sciences and Mathematics and Computer Science divisions, EVS is developing a technology for automated detection of avian-solar interactions (e.g., perching, fly-through, and collisions). The system incorporates a machine/deep learning (ML/DL)-computer vision approach and high-definition edge-computing cameras. The automated avian monitoring technology will improve researchers’ ability to collect a large volume of avian-solar interaction data to better understand potential avian impacts associated with solar energy facilities. The use of an automated method is the most timely and cost-effective option for collecting a large volume of accurate data on avian-solar interactions across large areas.
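
As a rough, hypothetical illustration of how an edge-camera feed might be consumed, the sketch below (in Python, using OpenCV) reads frames from a network video stream and hands each one to a detector; the stream URL and the detect callable are placeholders, not components of the Argonne system.

    import cv2

    # Hypothetical stream address; a deployed edge camera would expose its own URL.
    STREAM_URL = "rtsp://camera.example/stream"

    def run(detect):
        """Read frames from the camera stream and pass each one to a detector."""
        cap = cv2.VideoCapture(STREAM_URL)
        if not cap.isOpened():
            raise RuntimeError(f"could not open stream: {STREAM_URL}")
        try:
            while True:
                ok, frame = cap.read()
                if not ok:            # stream dropped or ended
                    break
                detect(frame)         # e.g., Stage 1 moving-object detection
        finally:
            cap.release()

    if __name__ == "__main__":
        run(lambda frame: None)       # no-op detector, just to show the wiring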

Three-Stage Approach

We are developing ML/DL models by employing three stages of modeling objectives:

  1. Detecting moving objects in video (Stage 1),
  2. Recognizing birds among the moving objects (Stage 2), and
  3. Classifying bird interactions with solar energy infrastructure (Stage 3).
To support identification of collision-induced avian fatalities, we are designing the system to notify solar facility staff of the bird collision location shortly after the occurrence is detected.
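
A simplified sketch of how the three stages and the collision notification could fit together is shown below; the OpenCV background subtractor used for Stage 1 and the classifier and notification callbacks for Stages 2 and 3 are illustrative stand-ins, not the project's actual ML/DL models.

    import cv2

    class AvianSolarPipeline:
        """Toy three-stage pipeline: detect motion, recognize birds, classify interactions."""

        def __init__(self, bird_classifier, interaction_classifier, notify):
            # Stage 1: background subtraction stands in for the moving-object detector.
            self.subtractor = cv2.createBackgroundSubtractorMOG2()
            self.bird_classifier = bird_classifier                  # Stage 2: bird vs. non-bird
            self.interaction_classifier = interaction_classifier    # Stage 3: interaction type
            self.notify = notify                                    # alert callback for facility staff

        def process(self, frame):
            # Stage 1: flag foreground (moving) regions in the frame.
            mask = self.subtractor.apply(frame)
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
            for contour in contours:
                x, y, w, h = cv2.boundingRect(contour)
                if w * h < 100:                   # skip tiny detections (likely noise)
                    continue
                crop = frame[y:y + h, x:x + w]

                # Stage 2: is the moving object a bird?
                if not self.bird_classifier(crop):
                    continue

                # Stage 3: classify the interaction (perching, fly-through, collision, ...).
                interaction = self.interaction_classifier(crop)
                if interaction == "collision":
                    # Report the event location promptly so staff can search for a carcass.
                    self.notify(location=(x, y, w, h))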

Three-stage model development objectives [Source: Argonne National Laboratory]

Iterative Model Development

We accomplish each stage of the modeling objectives by iterating over two phases:

  1. Training and evaluation of the ML/DL model using existing datasets to develop a deployable model and
  2. Deployment of the trained model and evaluation of predictions to validate model performance.
For the training-testing phase, we code, train, test, and tune ML/DL models. To achieve the best prediction accuracy, we explore and select the optimal ML/DL model and perform model-specific optimization. For the deployment-prediction phase, we deploy trained ML/DL models at solar facilities to assess detection and classification accuracy for avian-solar interactions under operating conditions.
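
The schematic below illustrates this two-phase iteration under simplified assumptions: a synthetic dataset and a logistic-regression classifier stand in for the project's ML/DL vision models, and the deployment-prediction phase is simulated with additional held-out data.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)

    def make_labeled_data(n=400):
        """Synthetic stand-in for curated training data (features + bird/other labels)."""
        X = rng.normal(size=(n, 8))
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
        return X, y

    def field_accuracy(model, n=200):
        """Stand-in for the deployment-prediction phase: score predictions on new field data."""
        X_field, y_true = make_labeled_data(n)   # in practice, labels come from expert review
        return accuracy_score(y_true, model.predict(X_field))

    # Phase 1: training and evaluation on existing datasets.
    X, y = make_labeled_data()
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
    model = LogisticRegression().fit(X_train, y_train)
    offline_acc = accuracy_score(y_test, model.predict(X_test))

    # Phase 2: deploy the trained model and validate predictions under operating conditions.
    deployed_acc = field_accuracy(model)
    print(f"offline accuracy {offline_acc:.2f}, field accuracy {deployed_acc:.2f}")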

Iterative model development [Source: Argonne National Laboratory]

We developed the avian monitoring technology in partnership with Boulder AI and various industry partners and with expert guidance from the Cornell Lab of Ornithology; Northwestern University; University of Chicago; University of California, Los Angeles; and regulatory and conservation stakeholders.

Argonne’s flying object detection machine/deep learning (ML/DL) model and training data collection: the ML/DL model tracking a detected moving object in video and a variety of birds recognized in previous training data (top); a bird spotted and tracked at solar panels at the Argonne site, indicated with a red box and line (bottom). [Source: Argonne National Laboratory]

Related Research Areas

See the Research Highlights Index for a complete list of EVS Research Areas.

Yuki Hamada, Biophysical Remote Sensing Scientist
Capabilities: Applications of optical and infrared remote sensing and geospatial modeling approaches for analyzing and monitoring terrestrial ecosystem functions and processes; application of plant spectroscopy to hyperspectral image analysis for terrestrial ecosystem research; development of novel image processing algorithms to extract and characterize land surface and aquatic features and properties; use of geospatial information technologies in development of a framework for data interpolation, extrapolation, and scaling from fine-resolution local scale to coarse-resolution regional scale.