A low-cost, long-term underwater camera trap network coupled with deep residual learning image analysis

PLoS One. 2022 Feb 2;17(2):e0263377. doi: 10.1371/journal.pone.0263377. eCollection 2022.

Abstract

Understanding long-term trends in marine ecosystems requires accurate and repeatable counts of fishes and other aquatic organisms on spatial and temporal scales that are difficult or impossible to achieve with diver-based surveys. Long-term, spatially distributed cameras, like those used in terrestrial camera trapping, have not been successfully applied in marine systems due to limitations of the aquatic environment. Here, we develop methodology for a system of low-cost, long-term camera traps (Dispersed Environment Aquatic Cameras), deployable over large spatial scales in remote marine environments. We use machine learning to classify the large volume of images collected by the cameras. We present a case study in which these combined techniques are used to address fish movement and feeding behavior related to halos, a well-documented benthic pattern in shallow tropical reefscapes. Cameras proved able to function continuously underwater at deployed depths (up to 7 m, with later versions deployed to 40 m) with no maintenance or monitoring for over five months and collected a total of over 100,000 images in time-lapse mode (one image every 15 minutes) during daylight hours. Our ResNet-50-based deep learning model achieved 92.5% overall accuracy in sorting images with and without fishes, and diver surveys revealed that the camera images accurately represented local fish communities. The cameras and machine learning classification represent the first successful method for broad-scale underwater camera trap deployment, and our case study demonstrates the cameras' potential for addressing questions of marine animal behavior, distributions, and large-scale spatial patterns.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Animals
  • Aquatic Organisms / classification*
  • Coral Reefs*
  • Ecosystem*
  • Fishes / classification*
  • Image Processing, Computer-Assisted / methods*
  • Machine Learning*
  • Photography / methods*
  • Population Dynamics
  • Species Specificity

Grants and funding

This work was supported by grant #2018-007 to MRS and SMB from the Wake Forest University Center for Energy, Environment, and Sustainability (https://cees.wfu.edu), a Wake Forest University Richter Scholarship (https://ureca.wfu.edu), Sullivan Scholarship (https://biology.wfu.edu), and Undergraduate Research Fellowship (https://ureca.wfu.edu/) to AWHS, and a Wake Forest University Vecellio Grant (https://biology.wfu.edu) to SMB. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.