Computer Vision and Machine Learning

Applying advanced geoprocessing and artificial intelligence to large volumes of drone data

Dealing with the data!

The MaRRS lab leverages computer vision and deep learning techniques to process the large volumes of imagery produced by drone flights in marine science and conservation projects. These techniques make analysis more efficient, reduce human bias, and provide greater consistency in hypothesis testing. See below for examples of our work in this area.
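To give a sense of what handling these data volumes looks like in practice, the sketch below shows one common first step: tiling a large drone orthomosaic into fixed-size chips that a neural network can process. This is a generic illustration rather than lab code; the file name, chip size, and use of the rasterio package are assumptions.

# Minimal sketch (assumes the rasterio package): tile a large drone
# orthomosaic into fixed-size chips for downstream CNN processing.
# "orthomosaic.tif" and the 512-pixel chip size are illustrative only.
import rasterio
from rasterio.windows import Window

CHIP = 512  # chip edge length in pixels (assumed)

with rasterio.open("orthomosaic.tif") as src:
    for row in range(0, src.height, CHIP):
        for col in range(0, src.width, CHIP):
            win = Window(col, row,
                         min(CHIP, src.width - col),
                         min(CHIP, src.height - row))
            chip = src.read(window=win)  # array of shape (bands, h, w)
            # ... pass `chip` to a detection or segmentation model here ...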

A Semi-Automated Method for Estimating Adélie Penguin Colony Abundance from a Fusion of Multispectral and Thermal Imagery Collected with Unoccupied Aircraft Systems

Bird, C.N.; Dawn, A.H.; Dale, J.; Johnston, D.W. A Semi-Automated Method for Estimating Adélie Penguin Colony Abundance from a Fusion of Multispectral and Thermal Imagery Collected with Unoccupied Aircraft Systems. Remote Sens. 2020, 12, 3692.
Monitoring Adélie penguin (Pygoscelis adeliae) populations on the Western Antarctic Peninsula (WAP) provides information about the health of the species and the WAP marine ecosystem itself. In January 2017, surveys of Adélie penguin colonies at Avian Island and Torgersen Island off the WAP were conducted via unoccupied aircraft systems (UAS) collecting optical Red Green Blue (RGB), thermal, and multispectral imagery. A semi-automated workflow to count individual penguins using a fusion of multispectral and thermal imagery was developed and combined into an ArcGIS workflow. This workflow isolates colonies using multispectral imagery and detects and counts individuals by thermal signatures. Two analysts conducted manual counts from synoptic RGB UAS imagery. The automated system deviated from analyst counts by −3.96% on Avian Island and by 17.83% on Torgersen Island. However, colony-by-colony comparisons revealed that the greatest deviations occurred at larger colonies. Matched pairs analysis revealed no significant differences between automated and manual counts at both locations (p > 0.31), and linear regressions of colony sizes from both methods revealed significant positive relationships approaching unity (p < 0.0002, R² = 0.91). These results indicate that combining UAS surveys with sensor fusion techniques and semi-automated workflows provides efficient and accurate methods for monitoring seabird colonies in remote environments.
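To make the fusion idea concrete, here is a simplified Python analogue of the workflow described above: a multispectral-derived mask isolates colony areas, and warm local maxima in the co-registered thermal band are counted as individual penguins. The published method was implemented as an ArcGIS workflow; the band order, thresholds, and peak spacing below are illustrative assumptions rather than values from the paper.

# Simplified sketch of the multispectral + thermal fusion idea (not the
# published ArcGIS workflow). Band order, the mask threshold, the thermal
# offset, and the peak spacing are all assumed for illustration.
import numpy as np
from scipy.ndimage import maximum_filter

def count_penguins(multispec, thermal, pixel_size_m=0.1):
    # Colony mask from a normalized-difference index of two multispectral bands
    red = multispec[2].astype(float)   # assumed band order
    nir = multispec[4].astype(float)
    index = (nir - red) / (nir + red + 1e-9)
    colony_mask = index < 0.2          # assumed threshold for colony ground cover

    # Individuals appear as warm local maxima inside colony areas
    background = np.median(thermal[~colony_mask])
    warm = colony_mask & (thermal > background + 3.0)     # assumed 3 K offset

    spacing_px = max(1, int(round(0.5 / pixel_size_m)))   # assumed 0.5 m spacing
    peaks = (thermal == maximum_filter(thermal, size=spacing_px)) & warm
    return int(peaks.sum())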
Deep learning for coastal resource conservation: automating detection of shellfish reefs

Ridge, J.T., Gray, P.C., Windle, A.E. and Johnston, D.W. (2020), Deep learning for coastal resource conservation: automating detection of shellfish reefs. Remote Sens. Ecol. Conserv., 6: 431-440. https://doi.org/10.1002/rse2.134
It is increasingly important to understand the extent and health of coastal natural resources in the face of anthropogenic and climate-driven changes. Coastal ecosystems are difficult to efficiently monitor due to the inability of existing remotely sensed data to capture complex spatial habitat patterns. To help managers and researchers avoid inefficient traditional mapping efforts, we developed a deep learning tool (OysterNet) that uses unoccupied aircraft systems (UAS) imagery to automatically detect and delineate oyster reefs, an ecosystem that has proven problematic to monitor remotely. OysterNet is a convolutional neural network (CNN) that assesses intertidal oyster reef extent, yielding a difference in total area between manual and automated delineations of just 8%, attributable in part to OysterNet's ability to detect oysters overlooked during manual demarcation. Further training of OysterNet could enable assessments of oyster reef heights and densities, and incorporation of more coastal habitat types. Future iterations will be applied to high-resolution satellite data for effective management at larger scales.
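As a rough illustration of how a segmentation CNN like OysterNet can be turned into a reef-extent estimate, the sketch below runs a model over image chips, thresholds its per-pixel predictions, and converts the reef pixel count to area. The actual OysterNet architecture and weights are not reproduced here; the PyTorch interface, the 0.5 probability threshold, and the 3 cm pixel size are assumptions.

# Sketch only: estimating reef area from a per-pixel segmentation model.
# `model` stands in for a trained network such as OysterNet; its real
# architecture and weights are not shown here.
import numpy as np
import torch

def reef_area_m2(model, chips, pixel_size_m=0.03, threshold=0.5):
    """Sum predicted reef pixels over all chips and convert to square meters."""
    model.eval()
    reef_pixels = 0
    with torch.no_grad():
        for chip in chips:                               # chip: (3, H, W) float32 array
            x = torch.from_numpy(chip).unsqueeze(0)      # add batch dimension
            prob = torch.sigmoid(model(x))               # per-pixel reef probability
            reef_pixels += int((prob > threshold).sum())
    return reef_pixels * pixel_size_m ** 2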