Project Abstract
We developed a pipeline for geomorphological analysis that uses structure from motion (SfM) and deep learning on close-range aerial imagery to estimate spatial distributions of rock traits (diameter, size, and orientation) along a tectonic fault scarp. Our pipeline leverages UAS-based imagery to help scientists gain a better understanding of tectonic surface processes. We first applied SfM to the aerial imagery to produce georeferenced orthomosaics and digital elevation models (DEMs); a human expert then annotated rocks on a set of image tiles sampled from the orthomosaics. These annotations were used to train a deep neural network to detect and segment individual rocks across the whole site. The pipeline automatically extracted semantic information (rock boundaries) from large volumes of unlabeled, high-resolution aerial imagery, enabling a subsequent structural analysis that estimated rock diameter, size, and orientation. Two experiments were conducted along a fault scarp in the Volcanic Tablelands near Bishop, California. In the first experiment, we flew a hexrotor with a multispectral camera to produce a DEM and five spectral orthomosaics in red, green, blue, red edge (RE), and near-infrared (NIR) bands. We trained deep neural networks with different input channel combinations to determine which combination was most effective for inference. In the second experiment, we deployed a DJI Phantom 4 Pro equipped with an RGB camera and focused on the spatial distribution of rock-trait histograms over a larger area.
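To make the trait-extraction step of the pipeline concrete, the sketch below shows one plausible way to derive per-rock diameter, size, and orientation from a labeled segmentation mask. It is a minimal illustration, not the authors' implementation: the mask and ground sampling distance inputs (`label_mask`, `gsd_m`) are hypothetical, and the equivalent-circle diameter and major-axis orientation are assumed trait definitions.

```python
# Minimal sketch of rock-trait extraction from a segmentation mask.
# Assumes `label_mask` is an integer-labeled image (one label per rock)
# produced by the trained network, and `gsd_m` is the orthomosaic's
# ground sampling distance in meters per pixel (both hypothetical names).
import numpy as np
from skimage import measure

def rock_traits(label_mask: np.ndarray, gsd_m: float):
    """Return (diameter_m, area_m2, orientation_deg) for each labeled rock."""
    traits = []
    for region in measure.regionprops(label_mask):
        area_m2 = region.area * gsd_m ** 2              # size: planimetric area
        diameter_m = 2.0 * np.sqrt(area_m2 / np.pi)     # equivalent-circle diameter
        orientation_deg = np.degrees(region.orientation)  # major-axis angle
        traits.append((diameter_m, area_m2, orientation_deg))
    return traits
```

Aggregating these per-rock values over spatial bins would then yield the rock-trait histograms discussed in the second experiment.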