An image is worth a thousand species: combining neural networks, citizen science, and remote sensing to map biodiversity
2022 · bioRxiv · DOI: 10.1101/2022.08.16.504150
Anthropogenic habitat destruction and climate change are altering the geographic distributions of plant communities. Although mapping vegetation change is now possible at high resolution using remote sensing data and deep convolutional neural networks, these approaches have not yet been applied to model the distributions of thousands of plant species and thereby understand spatial changes in biodiversity. To address the lack of scalable, automatic tools for mapping plant species distributions at a fine-grained scale, we created a dataset of over half a million citizen science observations of 2,221 plant species across California, paired with satellite images at 1 meter resolution drawn solely from free and public sources. With this dataset we trained a deep convolutional neural network, deepbiosphere, that predicts the presence of plant species within 256 × 256 meter satellite image patches and outperforms common low-resolution species distribution models. We showcase the novelty and potential applications of this framework by visualizing high-resolution predictions of keystone species such as coastal redwoods, identifying spatio-temporal ecosystem changes from wildfires and restoration management, and detecting urban biodiversity hotspots. Deep neural networks continuously trained on public remote sensing imagery and citizen science observations could enable cheap, automatic, and scalable monitoring of biodiversity and detection of rapid anthropogenic impacts on ecosystems.
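Predicting the presence of thousands of species within each satellite patch corresponds to a multi-label classification setup: one independent sigmoid output per species trained with binary cross-entropy, rather than a single softmax over classes. The sketch below illustrates only this output head; the shapes, names (`predict_presences`, `FEATURES`), and the random embedding standing in for the CNN backbone are illustrative assumptions, not deepbiosphere's actual architecture.

```python
import numpy as np

N_SPECIES = 2221   # plant species in the California dataset
FEATURES = 128     # hypothetical size of the CNN patch embedding

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict_presences(embedding, weights, bias, threshold=0.5):
    """Multi-label head: an independent sigmoid probability per species,
    thresholded into a binary presence call for the patch."""
    probs = sigmoid(embedding @ weights + bias)
    return probs, probs >= threshold

def bce_loss(probs, labels, eps=1e-7):
    """Binary cross-entropy summed over species (one binary task each)."""
    p = np.clip(probs, eps, 1 - eps)
    return -np.sum(labels * np.log(p) + (1 - labels) * np.log(1 - p))

# Toy stand-ins: in the real model the embedding comes from a deep CNN
# applied to a 256 x 256 m satellite image patch.
embedding = rng.normal(size=FEATURES)
weights = rng.normal(scale=0.1, size=(FEATURES, N_SPECIES))
bias = np.zeros(N_SPECIES)

probs, presences = predict_presences(embedding, weights, bias)
labels = rng.integers(0, 2, size=N_SPECIES).astype(float)  # toy presence labels
loss = bce_loss(probs, labels)
```

The key design point is that sigmoid outputs are independent, so any number of the 2,221 species can be predicted present in the same patch, which a softmax (one species per image) could not express.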