Article

Using Convolutional Neural Networks for Detection and Morphometric Analysis of Carolina Bays from Publicly Available Digital Elevation Models

by Mark A. Lundine * and Arthur C. Trembanis

School of Marine Science and Policy, University of Delaware, Lewes, DE 19958, USA

* Author to whom correspondence should be addressed.
Remote Sens. 2021, 13(18), 3770; https://doi.org/10.3390/rs13183770
Submission received: 29 August 2021 / Revised: 16 September 2021 / Accepted: 16 September 2021 / Published: 20 September 2021

Abstract

Carolina Bays are oriented and sandy-rimmed depressions that are ubiquitous throughout the Atlantic Coastal Plain (ACP). Their origin has been a highly debated topic since the 1800s and remains unsolved. Past population estimates of Carolina Bays have varied vastly, ranging from as few as 10,000 to as many as 500,000. With such a large uncertainty around the actual population size, mapping these enigmatic features is a problem that requires an automated detection scheme. Using publicly available LiDAR-derived digital elevation models (DEMs) of the ACP as training images, various types of convolutional neural networks (CNNs) were trained to detect Carolina Bays. The detection results were assessed for accuracy and scalability, and then analyzed for various morphologic, land-use and land cover, and hydrologic characteristics. Overall, the detector found over 23,000 Carolina Bays from southern New Jersey to northern Florida, with the highest densities along interfluves. Carolina Bays in Delmarva were found to be smaller and shallower than Bays in the southeastern ACP. At least a third of Carolina Bays have been converted to agricultural lands and almost half of all Carolina Bays are forested. Few Carolina Bays are classified as open water basins, yet almost all of the detected Bays were within 2 km of a water body. In addition, field investigations based upon detection results were performed to describe the sedimentology of Carolina Bays. Sedimentological investigations showed that Bays typically have 1.5 m to 2.5 m thick sand rims that show a gradient in texture, with coarser sand at the bottom and finer sand and silt towards the top. Their basin deposits were found to be 0.5 m to 2 m thick and showed a mix of clayey, silty, and sandy sediments. Last, the results compiled during this study were compared to similar depressional features (i.e., playa-lunette systems) to pinpoint any similarities in origin processes. Altogether, this study shows that CNNs are valuable tools for automated geomorphic feature detection and can lead to new insights when coupled with various forms of remotely sensed and field-based datasets.

1. Introduction

1.1. Past Studies on Carolina Bays

Carolina Bays are enigmatic and widespread geomorphic features with an unknown origin. They are round, shallow, and sandy-rimmed depressions with muddy interiors, with their major axes typically oriented northwest to southeast (see Figure 1). They span the Atlantic Coastal Plain (ACP) from southern New Jersey to southern Georgia. The Bays commonly house lakes, ponds, or ephemeral wetlands, while a significant number of Bays have been naturally or artificially drained, the latter mainly for agricultural use. Their origin is highly debated, with many unconfirmed hypotheses. Many [1,2,3,4,5,6] have argued, without substantial evidence, that they are craters formed by meteorite, comet, or glacial ice impacts, while others [7,8,9,10,11,12,13,14,15,16,17] have tried to explain their origin using slower geological processes tied to wind, waves, burning peat, and groundwater processes. Sediments collected from Carolina Bays have returned ages ranging from over 100,000 years old to within the last 5000 years, indicating that these features have undergone numerous stages of genesis and modification [12,13,15,17,18,19,20,21,22,23,24,25]. The exact details of their dynamics, however, remain unclear.

1.2. Past Population Estimates of Carolina Bays

Twentieth century estimates of the total number of Bays ranged from 10,000 [26] to 500,000 [5], while recent work has estimated the number of Bays from Florida to Virginia to be approximately 50,000, suggesting that the often-cited estimate of 500,000 is an order of magnitude too high [27]. In Delmarva, Carolina Bay population estimates exceed 14,000 [28]. The estimate of 50,000 for the southeastern ACP [27] was obtained through analysis of USGS topographic maps in a single South Carolina county. The counts and size proportions of Carolina Bays were then extrapolated to all the other counties in South Carolina, and then extrapolated further to an area covering most of the southeastern ACP. There are two main problems with this method: it assumes that the USGS topographic maps reflect the complete topography of the ACP, capturing every possible Carolina Bay, and it also assumes that Carolina Bay size and geographic distribution are the same in all regions where Carolina Bays are found. In the study that determined the count estimate for Bays in Delmarva [28], Bays were annotated with a point in bare-earth LiDAR digital elevation models (DEMs). This method is far more accurate in counting Carolina Bays (LiDAR-derived DEMs typically have resolutions of 10 m, sometimes as fine as 1 m), but it is incredibly time-consuming. Point annotations also eliminate the opportunity to analyze morphometric characteristics like maximum relief, area, perimeter, major and minor axis length and orientation, eccentricity, and other parameters.

1.3. Traditional Computer Vision, Pixel-Based Classification, and Object Detection

With such a large population size spanning the entire ACP, and with such uncertainty around the actual population size, mapping the Carolina Bays is a problem that requires an automated detection scheme. When developing a detector, two important initial questions must be answered: what kind of detector should it be, and what data will the detector run on?
For identification of Carolina Bays, high-resolution (at least 10 m) bare-earth digital elevation models are the best data source. These datasets show the complete topography, and Bays are readily visible as round topographic lows with sharp highs along their rims. Identification of Carolina Bays from aerial imagery is much more difficult, particularly for heavily forested Bays or Bays that have been converted to agricultural land. In a forested region, an aerial image may show only the greenish hue of the canopy, with at best some sense of topographic variability, while the DEM shows every undulation in the topography, revealing many Carolina Bays (see Figure 2). High-resolution digital elevation datasets are widespread and easily accessible thanks to the USGS National Map program, so obtaining data on which to run a Carolina Bay detector is feasible.
For the selection of the right detection method for Carolina Bays, one might first try some traditional image processing techniques, including Hough transforms, blob detectors like the Laplacian of Gaussian or difference of Gaussians, or feature detectors like the scale-invariant feature transform (SIFT). Or one might experiment with pixel-based machine learning algorithms, including unsupervised methods like k-means, or supervised methods like Random Forest, Decision Tree, or Quadratic Discriminant. Pixel-based methods separate an image into discrete categories based on pixel values in the image. For example, an aerial image of the near-shore, beach, and salt marsh could be separated into three classes, and the algorithm would be trained to segment the image into these three classes. Pixel-based methods have been particularly useful for land-use and land-cover classification on satellite and aerial imagery [29,30,31]. The last option would be to utilize recent advancements in deep-learning object detection algorithms like convolutional neural networks (CNNs).
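To make the traditional options concrete, the following is a minimal sketch of one such technique, blob detection with scikit-image's Laplacian of Gaussian detector, applied to a DEM tile. The file name and parameter values are illustrative placeholders, not those used in this study.

```python
# A minimal sketch of Laplacian of Gaussian blob detection on a DEM tile.
# File name and parameters are illustrative, not from this study.
import rasterio
from skimage import exposure
from skimage.feature import blob_log

with rasterio.open("sample_dem.tif") as src:  # hypothetical 10 m DEM tile
    dem = src.read(1).astype("float32")

# Invert and rescale so depressions (topographic lows) become bright blobs.
inverted = exposure.rescale_intensity(-dem, out_range=(0.0, 1.0))

# blob_log returns (row, col, sigma); blob radius is roughly sigma * sqrt(2).
blobs = blob_log(inverted, min_sigma=5, max_sigma=50, num_sigma=10, threshold=0.1)
print(f"Found {len(blobs)} blob candidates")
# Every roughly round low (streams, ponds, borrow pits) is returned,
# not just Carolina Bays, illustrating the generality problem discussed below.
```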
CNNs have been proven as valuable object detectors for a variety of applications, including detecting sea scallops from benthic imagery [32]; detecting object signatures from ground penetrating radar [33]; detecting archaeological sites from LiDAR DEMs [34]; detecting ice-wedge polygons from aerial imagery [35]; detecting rocks from aerial imagery [36]; detecting mining-related valley fill faces from LiDAR DEMs [37]; and detecting airplanes, tennis courts, basketball courts, baseball diamonds and vehicles from aerial imagery [38]. CNNs are also fast, with models like Yolo and Faster R-CNN that can run through and detect objects in several images per second, up to 65 frames per second for Yolov4 on a Tesla V100 graphical processing unit (GPU) [39].
For Carolina Bay detection, traditional image processing techniques are ineffective because they are too general. For example, using a Hough ellipse or circle detector on a DEM would look for every circular or elliptical object in the DEM, not just Carolina Bays. Feature detectors like SIFT are also too general, as they look for all interesting features in an image, like edges, large color gradients, blobs, and corners. This creates additional post-processing steps to reach the level of object detection, like defining an object by the collection of features (bag-of-words) SIFT finds within the object [40]. Some examples of poor results using traditional computer vision methods for Carolina Bay detection are shown in Figure 3.
For pixel-based machine learning algorithms, both supervised and unsupervised methods are ineffective in Carolina Bay detection. Several types of pixel-based classifiers were tested on a sample DEM containing Carolina Bays (see Figure 4). The classifiers were designed to classify every pixel in the image as containing Carolina Bays or not containing Carolina Bays, based upon a manually annotated training mask. Each classifier failed to distinguish the Carolina Bays from the section of a stream in the bottom left corner of the DEM. In addition, several of the supervised classifiers resulted in a speckled classification, with pixels within Carolina Bays incorrectly tagged as not Carolina Bay pixels. This speckled effect is a typical defect in pixel-based classification when using high-resolution imagery [41,42,43,44,45]. The time each classification took ranged from 5 s for the Decision Tree classifier to 640 s for the multilayer perceptron (MLP) classifier, much greater than the time a Faster R-CNN or Yolo model would take per image.
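For illustration, the following is a hedged sketch of the kind of pixel-based supervised classification tested here: a Random Forest labeling each DEM pixel from a manually drawn binary mask. File names are placeholders, and this is not the study's exact pipeline.

```python
# A sketch of per-pixel supervised classification on a DEM.
# File names are illustrative; the feature is simply elevation per pixel.
import rasterio
from sklearn.ensemble import RandomForestClassifier

with rasterio.open("training_dem.tif") as src:   # hypothetical training DEM
    dem = src.read(1).astype("float32")
with rasterio.open("training_mask.tif") as src:  # hypothetical 0/1 Bay mask
    mask = src.read(1).astype("uint8")

X = dem.reshape(-1, 1)   # one feature per pixel: elevation
y = mask.reshape(-1)     # 1 = Carolina Bay pixel, 0 = background

clf = RandomForestClassifier(n_estimators=50, n_jobs=-1).fit(X, y)

# Predict on a new tile; because each pixel is classified independently,
# the output exhibits the "speckle" defect described above.
with rasterio.open("test_dem.tif") as src:
    test = src.read(1).astype("float32")
predicted = clf.predict(test.reshape(-1, 1)).reshape(test.shape)
```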
Due to the limitations of traditional techniques and pixel-based techniques, this study aimed to use CNN architectures for Carolina Bay detection. Three different CNNs were trained for Carolina Bay detection: Faster R-CNN, Mask R-CNN, and Yolov5. Specifics on the mechanics and architecture of each respective CNN can be found in each algorithm’s original paper [46,47,48].

2. Materials and Methods

2.1. Data Sources

Thanks to previous Carolina Bay studies that revealed the broad extent of Carolina Bays [2,27,28], this study gathered digital elevation models gridded at 10 m resolution from the region shown in Figure 5. The datasets were obtained from a variety of public sources, including a 2014 LiDAR survey of the entire state of Delaware [49], Virginia’s online LiDAR portal run by the Virginia Information Technologies Agency [50], the Maryland Mapping and GIS Data Portal [51], and the USGS National Map [52]. These datasets, gridded at 10 m, amounted to 44 GB, consisting of tens of thousands of individual GeoTIFFs and ERDAS Imagine files. The individual DEMs were then merged to produce a continuous mosaic of elevation for the entire ACP (see Figure 5b). This mosaic is easily navigable in GIS software like QGIS or ArcGIS Pro. This dataset, as well as code for merging multiple DEMs, is available at https://github.com/mlundine/rasterguru (accessed on 17 September 2021).
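The study's own merging code lives in the rasterguru repository linked above; as a hedged, minimal sketch of the same operation, many GeoTIFF tiles can be mosaicked with rasterio (paths below are placeholders):

```python
# A minimal sketch of merging many GeoTIFF DEM tiles into one mosaic.
# Glob pattern and output path are placeholders.
import glob
import rasterio
from rasterio.merge import merge

paths = glob.glob("dem_tiles/*.tif")
sources = [rasterio.open(p) for p in paths]

mosaic, transform = merge(sources)  # (bands, rows, cols) array + affine transform

meta = sources[0].meta.copy()
meta.update(height=mosaic.shape[1], width=mosaic.shape[2], transform=transform)
with rasterio.open("acp_mosaic.tif", "w", **meta) as dst:
    dst.write(mosaic)
for src in sources:
    src.close()
```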

2.2. Building the Annotation Datasets

For the Faster R-CNN and Yolov5 models, DEM images (north always pointing up, footprints between 3 km² and 9 km², resolutions between 1 m and 10 m) were selected from various regions of the ACP for annotating. This amounted to 1921 images, with 80% randomly selected for training and 20% randomly selected for testing. Each image was annotated in LabelImg [53] for Carolina Bays by a single annotator, which took approximately two weeks to complete. Each annotation consisted of a bounding box, defined by its four corner coordinates, the image dimensions, the image file path, and the annotation label (‘bay’). The total number of Carolina Bays annotated in this annotation dataset amounted to 5913.
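LabelImg saves each annotation as a Pascal VOC XML file. As an illustrative sketch (the file name is a placeholder), such a file can be parsed into label and corner-coordinate tuples with the Python standard library:

```python
# Parse a LabelImg (Pascal VOC) XML annotation file into box tuples.
import xml.etree.ElementTree as ET

def read_voc_boxes(xml_path):
    root = ET.parse(xml_path).getroot()
    boxes = []
    for obj in root.iter("object"):
        label = obj.findtext("name")  # 'bay' in this dataset
        bb = obj.find("bndbox")
        boxes.append((label,
                      int(float(bb.findtext("xmin"))), int(float(bb.findtext("ymin"))),
                      int(float(bb.findtext("xmax"))), int(float(bb.findtext("ymax")))))
    return boxes

print(read_voc_boxes("dem_tile_0001.xml"))  # hypothetical annotation file
```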
For the Mask R-CNN model, the annotation dataset included DEM images, bounding box annotations, and binary mask annotations. To construct the binary mask annotations, existing shapefiles containing Carolina Bay polygons were obtained from the Delaware Geological Survey [54]. These polygon datasets had been constructed by geologists at DGS from examining LiDAR data. The DEM data for Delaware was then gridded into 366 DEMs with 3 km² footprints and 10 m resolution. The Carolina Bay polygons were then used to create images that matched the footprints of the DEMs, but instead of elevation, the pixel values were either zero or one, with the pixels marked zero not containing Carolina Bays, and the pixels marked one containing Carolina Bays. Again, these images were randomly assigned for training (80%) and testing (20%), and bounding box annotations were again created in LabelImg by a single annotator. The total number of Carolina Bays annotated in the Mask R-CNN dataset amounted to 2339.
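As a hedged sketch of the mask construction step described above, polygons can be burned into a binary raster matching a DEM tile with rasterio; file names below are placeholders:

```python
# Rasterize Carolina Bay polygons into a 0/1 mask matching one DEM tile.
import geopandas as gpd
import rasterio
from rasterio.features import rasterize

bays = gpd.read_file("dgs_carolina_bays.shp")    # hypothetical DGS polygon layer

with rasterio.open("delaware_tile.tif") as src:  # one 3 km^2, 10 m tile
    meta = src.meta.copy()
    mask = rasterize(
        ((geom, 1) for geom in bays.geometry),   # burn value 1 inside Bays
        out_shape=(src.height, src.width),
        transform=src.transform,
        fill=0,                                  # 0 everywhere else
        dtype="uint8",
    )

meta.update(count=1, dtype="uint8")
with rasterio.open("delaware_tile_mask.tif", "w", **meta) as dst:
    dst.write(mask, 1)
```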

2.3. Training the CNNs and Assessing Accuracy

Training for Faster R-CNN and Mask R-CNN was performed on an Intel Core i7-4790k CPU at 4.00 GHz. This took approximately one day for the Faster R-CNN model, but almost a week for the Mask R-CNN model. The Yolov5 model was trained on an NVIDIA GeForce GTX 1650, which took several hours. Training speed is greatly enhanced with a GPU, and we recommend avoiding training on a CPU. Several metrics were used to determine the accuracy of each detector; each metric is described in the results section. Training settings are described in Table 1.
Using the test images from the annotation datasets, precision and recall were calculated for each CNN. Precision and recall are defined in Equations (1) and (2), where TP represents the number of true positives, FP represents the number of false positives, and FN represents the number of false negatives.
P = TP / (TP + FP)    (1)
R = TP / (TP + FN)    (2)
Precision and recall were calculated for all possible confidence threshold values between 0.00 and 1.00.
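As an illustration of this threshold sweep, the following minimal sketch assumes an array of per-detection confidence scores and a boolean flag marking whether each detection matched an annotation (the matching step itself, typically an intersection-over-union test, is omitted here):

```python
# Sweep confidence thresholds to build a precision-recall curve.
# `scores` and `matched` are illustrative per-detection arrays;
# `n_annotations` is the total ground-truth count.
import numpy as np

def pr_curve(scores, matched, n_annotations):
    precisions, recalls = [], []
    for t in np.arange(0.0, 1.01, 0.01):   # thresholds 0.00 .. 1.00
        keep = scores >= t
        tp = np.sum(matched & keep)        # correct detections kept
        fp = np.sum(~matched & keep)       # false alarms kept
        fn = n_annotations - tp            # annotations missed
        precisions.append(tp / (tp + fp) if tp + fp else 1.0)
        recalls.append(tp / (tp + fn) if tp + fn else 0.0)
    return np.array(precisions), np.array(recalls)
```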
To investigate spatial and size discrepancies between the detections, annotations, and existing DGS Carolina Bay polygons, distributions of area, perimeter, x centroid, and y centroid were compared. For this analysis, only detections and annotations from Delaware were compared.
The Faster R-CNN model was chosen as the most efficient model, due to its speed and accuracy. To obtain a near-complete catalogue of the Carolina Bays across the ACP, it was run on the entire ACP DEM. To do so, the ACP DEM was tiled into various footprints (1.5 km × 1.5 km, 3 km × 3 km, and 6 km × 6 km), with each tile size at three different overlap amounts (0%, 25%, and 50%). Every tile was oriented with north pointing up. The detector was then run on each dataset to determine the best tiling and overlap scheme for increasing true positives, decreasing false negatives, and decreasing false positives. These results were compared to annotations from Delaware.
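As a hedged sketch of the tiling step, fixed-size windows with a given fractional overlap can be generated with rasterio windows; paths and sizes below are illustrative (a 3 km × 3 km tile at 10 m resolution is 300 × 300 pixels):

```python
# Tile a large DEM into fixed-size, partially overlapping windows.
import rasterio
from rasterio.windows import Window

def tile_windows(width, height, tile=300, overlap=0.25):
    step = int(tile * (1 - overlap))  # 25% overlap -> 225 px step at tile=300
    # Note: this simple loop drops the ragged right/bottom edge.
    for row in range(0, height - tile + 1, step):
        for col in range(0, width - tile + 1, step):
            yield Window(col, row, tile, tile)

with rasterio.open("acp_mosaic.tif") as src:  # hypothetical mosaic path
    for i, win in enumerate(tile_windows(src.width, src.height)):
        tile = src.read(1, window=win)        # north-up sub-array
        # ...write the tile to disk or feed it to the detector...
```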
Each detector produced false positives, so additional GIS-based analysis was needed to eliminate them. This consisted of simple spatial queries based on geology and land use: for example, any detections falling outside of the coastal plain in the Piedmont were eliminated, as were any detections marking artificial, manmade ponds.
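A minimal sketch of this kind of spatial query follows: keep only detections that intersect the coastal plain and do not intersect mapped artificial ponds. The layer names are placeholders, and all layers are assumed to share one projected coordinate reference system.

```python
# Filter false positives with simple spatial queries (illustrative layers).
import geopandas as gpd
from shapely.ops import unary_union

detections = gpd.read_file("detections.shp")       # detector bounding boxes
coastal_plain = gpd.read_file("acp_boundary.shp")  # hypothetical ACP outline
ponds = gpd.read_file("artificial_ponds.shp")      # hypothetical pond layer

acp = unary_union(coastal_plain.geometry)
pond_area = unary_union(ponds.geometry)

keep = detections.geometry.intersects(acp) & ~detections.geometry.intersects(pond_area)
filtered = detections[keep]
print(f"kept {len(filtered)} of {len(detections)} detections")
```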

2.4. Extracting Morphologic, LULC, and Hydrologic Parameters from Detections

The filtered detection results across the ACP were analyzed for the following morphometric characteristics: area, maximum relief, and bounding box length-to-width ratio. Bounding boxes with a length-to-width ratio of one are square, while boxes with ratios greater than one are rectangular. Bays detected with a more square box have a more circular planform shape, while Bays detected with a more rectangular box have a more elliptical shape. The diagonals of the more rectangular bounding boxes were then analyzed for their azimuthal orientation, as sketched below. In addition to the morphometric characteristics, the filtered detections were analyzed for land-use and land-cover (LULC) type. LULC data were collected from the USGS GAP/LANDFIRE 2011 dataset [55].
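For illustration, these box metrics reduce to a few lines of arithmetic; the helper below is hypothetical, assumes projected coordinates in meters, and reports the azimuth of the NW-to-SE diagonal (an axis-aligned box has two diagonals, mirror images about north):

```python
# Bounding-box morphometrics: length-to-width ratio and diagonal azimuth.
import math

def box_metrics(xmin, ymin, xmax, ymax):
    dx, dy = xmax - xmin, ymax - ymin
    ratio = max(dx, dy) / min(dx, dy)  # 1 = square, >1 = rectangular
    # Vector from the NW corner (xmin, ymax) to the SE corner (xmax, ymin)
    # is (dx, -dy); azimuth is measured clockwise from north.
    azimuth = math.degrees(math.atan2(dx, -dy)) % 180
    return ratio, azimuth, dx * dy

ratio, azimuth, area = box_metrics(0, 0, 400, 700)  # illustrative box
print(f"L/W = {ratio:.2f}, diagonal azimuth = {azimuth:.1f} deg, area = {area} m^2")
```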
Since Faster R-CNN only outputs bounding boxes for detections, we selected a representative subset of the detections from various geographic locations and of various sizes and then annotated these detections with a polygon for Carolina Bays. This allowed us to estimate the size discrepancy between a bounding box detection and a more-representative polygon annotation.
The hydrology of Carolina Bay detections was investigated using the USGS National Hydrography Dataset [56]. The NHD contains georeferenced polygons for various surface hydrologic units, including lakes, ponds, swamps, marshes, rivers, streams, and estuaries. The hydrologic units found within Carolina Bay detections were analyzed for how much of the Carolina Bay was covered by a water body. In addition, the distance to the nearest water body for each Carolina Bay detection was recorded.
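As a minimal sketch of the nearest-water-body measurement, geopandas can join each detection to its closest NHD polygon, assuming both layers share a projected CRS (file names are placeholders):

```python
# Distance from each Carolina Bay detection to the nearest NHD water body.
import geopandas as gpd

bays = gpd.read_file("bay_detections.shp")     # hypothetical detection layer
water = gpd.read_file("nhd_waterbodies.shp")   # hypothetical NHD subset

# sjoin_nearest attaches the closest water-body attributes and the distance
# (in CRS units, here meters) as a new column.
nearest = gpd.sjoin_nearest(bays, water, distance_col="dist_m")
print(nearest["dist_m"].describe())
```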

2.5. Field Investigations Based on Detection Results

Guided by the Carolina Bay detection results, over 50 sites within Carolina Bays throughout Delaware and the eastern shore of Virginia were sampled with a hand auger to collect sedimentological data (see Figure 6). This produced over 150 samples that were later processed for grain size distributions using a sieve shaker for sand and gravel sizes and pipette analysis for silt and clay sizes.

2.6. Principal Component Analysis of Topographic Metrics

Principal component analysis (PCA) was performed on multi-band images from Delaware containing the following bands: elevation, hill shade, valley depth, aspect, profile curvature, planform curvature, and slope. This was done to determine the amount of variance described by each of these topographic metrics within Carolina Bays. In this study, elevation was the only band used for training the detectors; however, we aimed to investigate whether including these metrics could add information and enhance the detectors.
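The sketch below illustrates this kind of analysis with scikit-learn: per-pixel metrics are stacked as features, and each principal component's explained variance is reported along with its dominant loading. The band order and file name are assumptions, and nodata handling is omitted.

```python
# PCA over a multi-band stack of topographic metrics (illustrative).
import numpy as np
import rasterio
from sklearn.decomposition import PCA

with rasterio.open("delaware_metrics.tif") as src:  # hypothetical 7-band stack
    stack = src.read().astype("float32")            # (bands, rows, cols)

bands = stack.shape[0]
X = stack.reshape(bands, -1).T                      # one row per pixel

# Note: bands are left unscaled, so metrics with larger numeric ranges
# (elevation in meters) dominate the variance, as reported in this study.
pca = PCA().fit(X)
names = ["elevation", "hillshade", "valley depth", "aspect",
         "profile curvature", "planform curvature", "slope"]
for i, ratio in enumerate(pca.explained_variance_ratio_):
    dominant = names[int(np.argmax(np.abs(pca.components_[i])))]
    print(f"PC{i + 1}: {100 * ratio:.1f}% of variance (dominant loading: {dominant})")
```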

2.7. Multi-Scale Detection, Aggregation, and Smoothing

Due to the variable planform size of Carolina Bays as well as the common occurrence of overlapping Bays, this study aimed to experiment with aggregating detections from various input image footprints. First, the input ACP elevation dataset was tiled at the following footprints: 1500 m × 1500 m, 3000 m × 3000 m, 6000 m × 6000 m, 12,000 m × 12,000 m, and 20,000 m × 20,000 m. Then the trained Faster R-CNN detector was run on each of these image sets, and the detections were then filtered to the appropriate confidence threshold. False positives were filtered from these detections, and then overlapping detections were aggregated into a single polygon. To capture a more representative Carolina Bay shape, the aggregated polygons, which were either isolated boxes or aggregations of overlapping boxes, were smoothed with a polynomial approximation with exponential kernel (PAEK) with various smoothing tolerances (100 m, 500 m, 1000 m, 1500 m, and 2000 m).
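As a minimal sketch of the aggregation step, overlapping detection boxes from the multiple tile scales can be fused into single polygons with shapely; the PAEK smoothing step was performed in GIS software and is not reproduced here, and file names are placeholders.

```python
# Aggregate overlapping multi-scale detection boxes into single polygons.
import geopandas as gpd
from shapely.ops import unary_union

boxes = gpd.read_file("multiscale_detections.shp")  # all scales combined

merged = unary_union(boxes.geometry)                # overlapping boxes fuse
polygons = list(merged.geoms) if merged.geom_type == "MultiPolygon" else [merged]

aggregated = gpd.GeoDataFrame(geometry=polygons, crs=boxes.crs)
aggregated.to_file("aggregated_bays.shp")
print(f"{len(boxes)} raw boxes -> {len(aggregated)} aggregated polygons")
```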
A flowchart depicting the workflow for processing the ACP DEM into tiles, running the tiles through Faster R-CNN, and then running the Carolina Bay detections through the various analyses described in this paper is shown in Figure 7.

3. Results

3.1. Assessment of Detection Results

The precision and recall curve is shown in Figure 8. This allowed us to determine the optimum threshold of 0.60 for Faster R-CNN and 0.30 for Mask R-CNN. At these threshold values, we found the best balance between precision and recall, which allowed us to maximize true positives marked by the detectors while minimizing false negatives and false positives. Average precision (AP) is defined as the area under the precision-recall curve; the closer it is to one, the more effective the detector. In terms of AP, Mask R-CNN performed better; however, its annotation set was much smaller due to the more laborious nature of constructing mask annotations. In addition, the Mask R-CNN detector was only trained on data from Delaware and was not reliable when applied to data outside of Delaware. It is imperative that the training data encompasses all the subtle regional differences in DEMs, which is why the Faster R-CNN model was more suitable for implementation on the entire ACP dataset. The Yolov5 model is used only for demonstration and educational purposes due to its high inference speed.
Figure 9 shows distributions of area, perimeter, and maximum relief for subsets of the training annotation dataset, the test annotation dataset, the Faster R-CNN detection dataset at a threshold of 0.60, the Mask R-CNN detection dataset at a threshold of 0.30, and the DGS polygon dataset. Table 2 lays out the middle quartiles for area, perimeter, and maximum relief from the bounding box annotations, the mask annotations, the Faster R-CNN detections, and the Mask R-CNN detections. Figure 10 shows area, perimeter, x centroid, and y centroid from the test annotation dataset and test detection dataset, with a linear fit as well as a one-to-one fit. If the detector retained the exact locations and sizes of the annotations, then every point would fall on the one-to-one lines. Overall, there was agreement for each parameter (area, perimeter, x and y centroid) between each dataset, indicating that each detector was performing well in terms of retaining the size and location of the annotated Carolina Bays. One finer discrepancy is that the detector often slightly underestimated the size of each Carolina Bay compared to the annotations. The results for the Delaware detections here differ from the results reported later in this paper for Delaware. This is because all detection boxes were included in this analysis: overlapping boxes were not aggregated into a single box, so some of the boxes here represent the same Bay in adjacent images. Boxes were not aggregated here because the annotation image dataset sometimes consisted of partial Bays.
The effects of tile size and overlap amount are shown in Figure 11. Smaller tiles resulted in much larger Carolina Bay counts, mainly due to larger Bays being segmented into multiple images, resulting in multiple counts of the same Bay. Larger tiles tended to obscure finer details in the DEMs, which resulted in lower Carolina Bay counts. Increasing the overlap also increased the Carolina Bay counts, due to adjacent images containing the same Carolina Bays. In summary, 3 km × 3 km tiles with 25% overlap came the closest to the test annotation count, indicating that this scheme was the optimum tiling scheme for breaking up the ACP mosaic DEM for Carolina Bay detection.
With a median Carolina Bay area of 0.0863 km², the tile results indicate that using tiles roughly 100 times the area of the median feature (the 3 km × 3 km tile covers 9 km², about 104 times the median Bay area), with 25% overlap between tiles, provides the best scheme for maximizing detections while minimizing overcounting.

3.2. Spatial Distribution of Carolina Bay Detections

Faster R-CNN detection results at a 60% confidence threshold, with false positives filtered out, are shown in Figure 12. The total count amounted to 17,107, with 9728 in Delmarva, and the remaining 7379 in North Carolina, South Carolina, Virginia, Georgia, and New Jersey. Starting in New Jersey, Carolina Bay presence begins south of the Laurentide Ice Sheet extent and continues southward to the southern New Jersey coast on the Delaware Bay. In Delaware, Carolina Bay presence begins south of the Fall Line, with its highest density in central Delaware. Carolina Bays are in great abundance on the eastern shore of Maryland but are absent west of the Chesapeake Bay. In Virginia, Carolina Bays are in great abundance on its eastern shore, forming a north-south line of Carolina Bays along the center of this peninsula. In the Tidewater region of Virginia, west of the Chesapeake Bay, Carolina Bays are virtually absent, apart from some isolated occurrences. In North Carolina, South Carolina, and Georgia, Carolina Bays are in great abundance on interfluves within the coastal plain, extending west to the Fall Line.

3.3. Morphology of Carolina Bay Detections

Figure 13 shows scatterplots comparing a subset of the detection results with manually annotated polygons. Also shown in Figure 13 are ordinary least squares (OLS) fits, with the detection box parameters as independent variables and the polygon annotation parameters as dependent variables. On the area and perimeter plots, one-to-one fit lines are shown. The x and y centroid OLS fits were essentially one-to-one fits, so these two plots do not contain the one-to-one lines. The detection boxes estimate the x and y centroid of Carolina Bays with great accuracy while overestimating the area and perimeter. The overestimation of the area and perimeter of the detection boxes, however, was linear across the size spectrum of Carolina Bays. The slope of the line for each respective OLS fit provides a method of computing actual area and perimeter from the detection box area and perimeter.
Looking at Carolina Bay morphology across the ACP (see Figure 14 and Table 3), this study found that most Carolina Bays (25th to 75th percentile) have a maximum relief of 1.5 m to 3.97 m and an area between 0.0379 km² and 0.221 km². The larger Bays are few in number, as are the deeper Bays. The elliptical Bays tend to show a northwest-southeast orientation (see Figure 15). Moving latitudinally across the ACP, there remains spread in Carolina Bay area; however, Bays in New Jersey, Delaware, and Maryland tend to skew toward smaller sizes than Bays further south. Bays further south also tend to be deeper than the Bays further north.

3.4. LULC of Carolina Bay Detections

In Figure 16, the general land-use type within Carolina Bays is shown. Forest and woodland accounted for 41.4% of the total area of the detected Carolina Bays, agriculture accounted for 32.1%, and open water accounted for 16.8%. Shrub and herb wetland accounted for 6.7% and developed land accounted for 1.35%. The other two categories accounted for very little of the total area of all the Carolina Bay detections. As for the fraction of total count of the detected Bays, agriculture accounts for the highest fraction, followed by forested land, with these two categories amounting to over 90% of the total count. These results indicate that many Carolina Bays have been converted from wetlands to agricultural lands and that open water Bays are few in number but typically fall on the larger end of the Carolina Bay size spectrum. One caveat is that the USGS GAP/LANDFIRE dataset is gridded at 30 m resolution, while the Carolina Bay detections were obtained from 10 m resolution LiDAR. This means that there could be many smaller open-water Bays that are not reflected in the GAP/LANDFIRE dataset.

3.5. Hydrology of Carolina Bay Detections

In Figure 17, distributions for the fraction of Carolina Bay surface water cover are shown by water body type according to the USGS NHD. The only water body types found within Carolina Bays were classified as swamp/marsh or lake/pond. The fraction of water cover in Carolina Bays is typically less than 0.10 for lakes and ponds. For the Carolina Bays containing swamps and marshes, most are less than 10% covered by water; however, the distribution extends to higher water cover fractions than that of the Bays with lakes and ponds. Overall, 21.28% of the Carolina Bay detections were found to contain surface water, with 12.76% containing swamps or marshes and 8.52% containing lakes or ponds. Bays with surface water were found across the entire spatial extent of the Carolina Bay detections, showing a similar spatial distribution as the detections.
In Figure 18, a bar chart is shown illustrating the closest bodies of water to all the Carolina Bay detections by fraction of the total count. Overall, over half of the detected Carolina Bays are closest to a lake/pond and approximately 40% are closest to a swamp/marsh. All the other water body types make up less than 10% of the detected Bays. Since many of these Bays overlapped or completely contained the water bodies, we filtered these Bays out to investigate how far the Bays with no surface water are from the nearest body of water. The empirical cumulative distribution function (ECDF) of the distance from each Carolina Bay detection to a water body is shown in Figure 19. Nearly all the Carolina Bays that do not contain surface water are within 2 km of a water body, half of these Carolina Bays are within 1 km of a water body, and a quarter of these Carolina Bays are within 200 m of a water body.

3.6. Sedimentology of Carolina Bays

Cumulative distributions for the grain sizes of samples taken from Carolina Bay rims and basins are shown in Figure 20. Figure 21 shows a profile of hand augers taken across a Carolina Bay within Delaware. The sedimentological patterns found in Carolina Bays of different sizes and distant geographic regions are similar, but within a single Bay, there are differences depending on where in the Bay the sample is collected. Carolina Bay interiors are often filled with a mix of fine to medium sand, silt, and clay, with the highest clay content (up to 45%) occurring near the centers of the basins. The interior sands and silts typically show a light brown to light gray color, with some occurrences of an oxidized orangish-brown color. The interior clays typically show gray coloring with varves, indicating seasonal dynamics in sediment deposition. The thickness of the interior deposits ranges between 0.5 m and 2 m. The sand rims typically show an oxidized, brownish-orange color and grade from silt, fine, and medium sand at the top, medium and coarse sand further down, and then coarse and very coarse sand at the bottom. However, at times, silty and clayey deposits were found at the bottom of a sand rim. The sand rims range between 1.5 m and 2.5 m thick and are always thicker than the deposits in the basin. Along the slopes of the rims, there is often mixing of the basin mud and the rim sand; however, the sand along the slopes shows less oxidation than in the rims. The sediment underlying Carolina Bays is typically a coarse to very coarse sand with significant gravel content (up to 44%), and it typically acts as an aquifer.
The sand found in rim samples is composed mostly of frosted quartz grains, and the clays found in the Bay basins are composed mostly of kaolinite, a product of the chemical weathering of granitic rocks. The coarser sediments underlying the Bays are typically composed of mostly quartz with some feldspars, mica, and heavy minerals, common for sediments that are products of the physical weathering of granitic rocks. Meteoric material was not found in any of the Bays sampled, and none of the sediments analyzed showed any signs of high-pressure/low-temperature modification, a typical characteristic of material found in impact craters.

3.7. Principal Component Analysis of Topographic Metrics

The PCA results are shown in Figure 22. Overall, elevation provided the most information out of each topographic metric, accounting for nearly 85% of the variance. Hill shade was the next most valuable metric, accounting for most of the remaining variance. Valley depth, aspect, profile curvature, planform curvature, and slope were all redundant and accounted for nearly zero percent of the variance.

3.8. Multi-Scale Detection, Aggregation, and Smoothing

Visual comparisons between unaggregated detections, aggregated detections, and smoothed detections of Carolina Bays from North Carolina are shown in Figure 23. Smooth, aggregated detections from Delmarva, the Carolinas, and Georgia are shown in Figure 24. This multi-scale aggregation approach resulted in a near complete catalogue of all Carolina Bays visible within the ACP elevation dataset (131,665 unaggregated detections, 23,458 aggregated polygons). Larger Bays get detected in the larger footprint images, while smaller Bays get detected in smaller footprint images. Bays that are counted multiple times or overlap one another are aggregated into a single polygon. Smoothing with PAEK and a smoothing tolerance of 1000 m helped refine the detection boxes to a more representative shape for Carolina Bays.

4. Discussion

Fast and accurate CNN-based object detection on LiDAR DEMs is a highly useful tool for geomorphic feature detection. This study succeeded in developing a Carolina Bay detector on data encompassing the entire extent of Carolina Bays across the ACP. The detection results (over 23,000 Carolina Bays) were analyzed to show that Carolina Bays are typically 0.0379 km² to 0.221 km² in area and 1.5 m to 3.97 m deep from the rim apex to the basin floor.
Carolina Bays are widespread throughout the ACP from New Jersey to Georgia, particularly along interfluves, and are absent outside of the unconsolidated sediments of the ACP. Carolina Bays north of 37° latitude have a smaller area distribution, are more circular, and are shallower than the more elliptical Carolina Bays south of 37° latitude. The data sources in regions of Georgia and South Carolina (often DEMs produced by interpolation rather than derived from LiDAR) have lower precision and accuracy compared to the data sources through the rest of the ACP, which limits the detector’s utility in those areas. As more LiDAR is collected and made publicly available in these regions, the detection results will be updated to reflect the higher quality DEMs.
Mask R-CNN and Faster R-CNN produce reliable results when compared to manually annotated features, with Mask R-CNN retaining the planform shape particularly well. Faster R-CNN is useful for larger datasets due to its higher inference speed and less-intensive annotation requirements. Faster R-CNN detection bounding boxes overestimate the size of Carolina Bays, but this overestimation is linear across the Carolina Bay size distribution. Yolov5 is a useful tool for quick development of feature detectors for educational and demonstration purposes.
At least a third of Carolina Bays have been converted to agricultural lands and almost half of all Carolina Bays are forested. Very few open water Carolina Bays remain today, yet they comprise the largest of the Bays. Many Bays house smaller ponds that have seasonal fluctuations in water levels. Almost all the Carolina Bays are within 2 km of a water body. The Carolina Bays are a plentiful yet diminishing source of habitat for various fauna and a variety of ecosystem services. Due to their proximity to many different types of water bodies, Carolina Bays have the potential to attenuate flash floods, to filter nutrient runoff, and to limit sedimentation into larger bodies of water by absorbing snowmelt and rain.
As “isolated” wetlands, Carolina Bays were once federally protected by the Clean Water Rule of 2015 [57]; however, this legislation was repealed and replaced by the Navigable Waters Protection Rule [58], which greatly narrowed the protections outlined in the Clean Water Rule. When Carolina Bays are left in their natural state, they can provide habitat for a variety of flora and fauna [59,60,61,62]; they can provide an outlet for stormwater during floods [63]; they can sequester atmospheric carbon [64]; and, as natural features, they can offer aesthetic, recreational, and even therapeutic value to society [65]. It is dismal that at least a third of all Carolina Bays have been converted to agricultural land, often clear-cut and drained. In the future, we hope to see new legislation that expands protections on wetlands and aims to restore the once plentiful wetlands in the US.
LiDAR-derived elevation in the form of a DEM was a reliable data source for developing a Carolina Bay detector. Including hillshade as an extra band in the training images could possibly enhance the detector, but other topographic metrics like slope and valley depth would be redundant and likely have no effect on the detector’s utility. Since elevation is the source dataset for each of these topographic metrics, with every other metric being an operation on elevation, it is not surprising that the other metrics were redundant and added little new information according to the PCA.
Aggregating detections from multiple image sizes was an effective method for obtaining the most complete catalogue of the Carolina Bays. Smoothing the detection boxes with PAEK resulted in polygons that were more representative of the planform shape of Carolina Bays. A drawback is that after aggregation, individual Bays cannot be isolated for morphometric analysis: if Bays overlap, they are treated as one polygon. However, for comprehensive mapping of all Carolina Bays of various planform sizes, this method is highly effective.
Contrary to the more extreme origin stories (meteorites, comets, glacial ice impact), the results from this study imply a more gradual origin involving aeolian and lacustrine dynamics. First, Carolina Bays are clearly restricted to a particular substrate: the unconsolidated sediments of the ACP. Second, the depth and planform size of Carolina Bays show no resemblance to impact craters. Using the detection results, an estimate of the diameter of each Carolina Bay can be calculated from the length of the diagonal of the bounding box. In addition, the maximum relief calculation provides an estimate of depth. The depth-to-diameter ratio (d/D) is a typical computation for known impact craters. In a study investigating 930 lunar impact craters with diameters between 40 m and 10 km, the d/D parameter was found to range between 0.11 and 0.17 [66]. In another study investigating smaller diameter lunar impact craters, using a sample size of 849 craters with diameters less than 1 km, the mean d/D was found to be 0.13, with a standard error of 0.03 [67]. The estimate of d/D for the Carolina Bay detections in this study had a median of 0.00567, a 25th percentile of 0.00312, a 75th percentile of 0.00938, a minimum of 3.639 × 10−5, and a maximum of 0.0725. The median d/D ratio of Carolina Bays is almost two orders of magnitude less than the d/D of known lunar impact craters. Removing the Carolina Bay deposits (thickness estimates from this study: 0.5 m to 2 m for basin deposits, 1.5 m to 2.5 m for rim deposits) would not increase the d/D ratio enough to match the d/D ratio found in known impact craters.
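A short worked check of this comparison, using the median values reported in this study and assuming a circular planform (so that diameter = 2·sqrt(area/π); the helper below is purely illustrative):

```python
# Worked check of the d/D comparison, using values from the text.
import math

median_area_m2 = 0.0863e6  # median Bay area, 0.0863 km^2
diameter = 2 * math.sqrt(median_area_m2 / math.pi)  # ~331 m across

print(f"typical Bay d/D: {2.0 / diameter:.4f}")  # ~0.006 for a 2 m deep Bay
print(f"crater depth at d/D = 0.13: {0.13 * diameter:.0f} m")  # ~43 m deep
```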
In other words, the lower relief of Carolina Bays when compared to known impact craters is not due to sediment deposition after an impact event. A Carolina Bay with a diameter of 1 km and a d/D that matches smaller diameter lunar impact craters (0.13) would be 130 m deep, far from what is observed. Figure 25 compares a typical Carolina Bay profile (d/D = 0.008) with a shallower impact crater profile (d/D = 0.08), as well as a curve indicating a constant basin deposit thickness of 2.5 m.
Looking at the sedimentology and stratigraphic results, Carolina Bays show signs of aeolian and lacustrine processes, not signs of an impact origin. Their sand rims resemble, in texture and composition, relict parabolic dune deposits found throughout Delmarva and the southeastern ACP [19,68,69]. The presence of clayey deposits at the bottom of some sand rims indicates regression, or a reduction in water levels over time. In addition, many Bays possess multiple concentric sand rims, with the inner rims typically returning younger OSL ages than the outer rims [12]. Their basins indicate low-energy lacustrine processes, with silt, clay, and finer sand deposits, often showing varves. The basin sediments were likely deposited after the sand rims formed, supported by ages returned from OSL dating of sand rims and radiocarbon dating of basin sediments [12,13,15,17,18,19,20,21,22,23,24,25]. However, the occurrence of rim sediments mixed with basin sediments along the slopes of Bays could indicate coeval deposition of the basin and rim sediments in some cases, shoreline processes, or possibly flood events that transported rim deposits inward into the basins. All this information together indicates the Bays were once active geomorphic features controlled by slow morphodynamic processes, not immediate results of a widespread impact event.
Carolina Bays have also been compared to thermokarst or thaw lakes in the Arctic [11,70], due to their similar appearance (oriented, elliptical depressions). The maximum relief of thaw lakes (rim apex to bottom of basin) often exceeds 10 m, much greater than the maximum relief results reported for Carolina Bays. Again, if the Carolina Bay basin deposits are removed, their maximum reliefs increase by only a few meters, still much less than that of the typical thaw lake. The presence of permafrost, a prerequisite for thermokarst formation, as far south as the coastal plain of Georgia seems unlikely even during prior glacial maxima, nor is it supported by proxies or by model predictions of past permafrost extents [71,72].
Another similar feature, found in great number in the high plains of Kansas as well as in the West Siberian Plain, are playa-lunette systems [73,74,75,76,77,78]. These are depressions that form in arid climates, where unconsolidated sediment (typically loess in Kansas, and alluvial deposits in the West Siberian Plain) gets scoured out by wind; if the climate remains dry and unconducive to vegetative growth, sediment can be deposited on one end of the depression and form a dune-like mound known as a lunette [77]. Otherwise, in more humid climates, the depressions tend to fill with water, which can scour out a deeper depression while depositing clays within the playa [77]. There are over 20,000 of these playas in Kansas, with a median area of 0.0064 km², a mean area of 0.0165 km², a maximum area of 1.872 km², and a minimum area of 0.0003 km² [74], all of which are close to, though slightly smaller than, the corresponding area parameters for Carolina Bays. The median height of the lunettes was reported as 3 m [78], similar to the heights of the sand rims found in Carolina Bays. The similarities between the Kansan playa-lunette systems and Carolina Bays are of great interest and deserve further attention in future research.

5. Conclusions

In this paper, we demonstrated the development of a Carolina Bay detector using convolutional neural networks and publicly available LiDAR data encompassing the entire Atlantic Coastal Plain (see Video S1 for a demonstration video). The detector proved to be accurate and efficient, managing to map over 23,000 Carolina Bays from southern New Jersey to northern Florida. In addition, the detection results allowed for the quantification of various morphometric, land use and land cover, and surface hydrologic characteristics. Last, detection results were used to guide field investigations into Carolina Bay sedimentology and stratigraphy. Together, the results from this study indicate that Carolina Bays are perhaps the dominant geomorphic feature of the Atlantic Coastal Plain and are likely the result of slow lacustrine and aeolian processes rather than of a catastrophic impact event. In conclusion, the descriptive results presented in this paper will guide future field and modelling investigations on the formation and dynamics of Carolina Bays, as well as serve as an example of the information that CNNs can extract from abundant geomorphic features.
Future studies in geomorphic feature detection could greatly benefit from the use of CNNs, and scientists interested in this methodology are encouraged to visit https://github.com/mlundine/tensorflow_app (accessed on 17 September 2021) for guidance on constructing a custom detector with georeferenced imagery.

Supplementary Materials

The following video is available online at https://www.mdpi.com/article/10.3390/rs13183770/s1, Video S1: Carolina Bay Detection Demo.

Author Contributions

Project conceptualization, A.C.T. and M.A.L.; Software development, M.A.L.; Morphometric analysis, M.A.L.; Sediment analysis, M.A.L.; Geophysical data processing, M.A.L.; Resources, A.C.T.; Writing—original draft preparation, M.A.L.; Writing—review and editing, A.C.T.; Data visualization, M.A.L.; Supervision, A.C.T.; Funding acquisition, A.C.T. All authors have read and agreed to the published version of the manuscript.

Funding

Funding for this research was provided by a Unidel Foundation Graduate Fellowship as well as a Geological Society of America Graduate Student Research Grant.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Datasets used and constructed during this research are publicly available at github.com/mlundine/CarolinaBayDetection (accessed on 17 September 2021).

Acknowledgments

Special thanks to the Coastal Sediments, Hydrodynamics, and Engineering Laboratory; Andrew Ashton of the Woods Hole Oceanographic Institution; Christopher Hein of the Virginia Institute of Marine Science; and Kelvin Ramsey, Jaime Tomlinson, Daniel Werner, and John Callahan of the Delaware Geological Survey for scientific insight and assistance during this research.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Cooke, C.W.; Melton, F.A. Discussion of the Origin of the Supposed Meteorite Scars of South Carolina. J. Geol. 1934, 42, 88–104.
  2. Eyton, J.R.; Parkhurst, J.I. A Re-Evaluation of the Extraterrestrial Origin of the Carolina Bays; Geography Graduate Student Association, University of Illinois: Urbana-Champaign, IL, USA, 1975.
  3. Firestone, R.B.; West, A.; Kennett, J.P.; Becker, L.; Bunch, T.E.; Revay, Z.S.; Schultz, P.H.; Belgya, T.; Kennett, D.J.; Erlandson, J.M.; et al. Evidence for an extraterrestrial impact 12,900 years ago that contributed to the megafaunal extinctions and the Younger Dryas cooling. Proc. Natl. Acad. Sci. USA 2007, 104, 16016–16021.
  4. Melton, F.A.; Schriver, W. The Carolina “Bays”: Are They Meteorite Scars? J. Geol. 1933, 41, 52–66.
  5. Prouty, W.F. Carolina Bays and Their Origin. GSA Bull. 1952, 63, 167–224.
  6. Zamora, A. A model for the geomorphology of the Carolina Bays. Geomorphology 2017, 282, 209–216.
  7. Glenn, L.C. Some Notes on Darlington (S. C.), “Bays”. Science 1895, 2, 472–475.
  8. Johnson, D.W. Role of Artesian Waters in Forming the Carolina Bays. Science 1937, 86, 255–258.
  9. Johnson, D.W. The Origin of the Carolina Bays; Columbia University Press: New York, NY, USA, 1942.
  10. Kaczorowski, R.T. The Carolina Bays and Their Relationship to Modern Oriented Lakes. Unpublished Dissertation, University of South Carolina, Sumter, SC, USA, 1977.
  11. Kaczorowski, R.T. Origin of the Carolina Bays, Technical Report-Coastal Research Division; University of South Carolina: Sumter, SC, USA, 1976.
  12. Moore, C.R.; Brooks, M.J.; Mallinson, D.J.; Parham, P.R.; Ivester, A.H.; Feathers, J.K. The Quaternary evolution of Herndon Bay, a Carolina Bay on the Coastal Plain of North Carolina (USA): Implications for paleoclimate and oriented lake genesis. Southeast. Geol. 2016, 51, 145–171.
  13. Rodriguez, A.B.; Waters, M.N.; Piehler, M.F. Burning peat and reworking loess contribute to the formation and evolution of a large Carolina-bay basin. Quat. Res. 2012, 77, 171–181.
  14. Smith, L.L. Solution depressions in sandy sediments of the Coastal Plain in South Carolina. J. Geol. 1931, 39, 641–652.
  15. Thom, B.G. Carolina Bays in Horry and Marion Counties, South Carolina. GSA Bull. 1970, 81, 783–813.
  16. Tuomey, M. Report on the Geology of South Carolina: Geological Survey of South Carolina; A.S. Johnston: Columbia, SC, USA, 1848.
  17. Tomlinson, J.L.; Ramsey, K.W. Stratigraphic, hydrologic, and climatic influences on the formation and spatial distribution of Carolina bays in central Delaware. In Proceedings of the 49th Annual Meeting of the Northeastern Section of the Geological Society of America, Lancaster, PA, USA, 23–25 March 2014.
  18. Brooks, M.J.; Taylor, B.E.; Grant, J.A. Carolina Bay geoarchaeology and Holocene landscape evolution on the Upper Coastal Plain of South Carolina. Geoarchaeology 1996, 11, 481–504.
  19. Brooks, M.J.; Taylor, B.E.; Ivester, A.H. Carolina bays: Time capsules of culture and climate change. Southeast. Archaeol. 2010, 29, 146–163.
  20. Brooks, M.J.; Taylor, B.E.; Stone, P.A.; Gardner, L.R. Pleistocene encroachment of the Wateree River sand sheet into Big Bay on the middle Coastal Plain of South Carolina. Southeast. Geol. 2001, 40, 241–257.
  21. Hussey, T.C. A 20,000 Year History of Vegetation and Climate at Clear Pond, Northeastern South Carolina. Unpublished Master’s Thesis, University of Maine, Orono, ME, USA, 1993.
  22. Ivester, A.H.; Brooks, M.J.; Taylor, B.E. Sedimentology and ages of Carolina bay sand rims. Geol. Soc. Am. Abstr. Programs 2007, 39, 5.
  23. Ivester, A.H.; Poplin, E.C.; Brooks, M.J.; Brook, G.A. Life on the edge: The formation of Mathis Lake and its human occupation. South Carol. Antiq. 2009, 41, 1–16.
  24. Ramsey, K.W.; Baxter, S.J. Radiocarbon Dates from Delaware: A Compilation; Delaware Geological Survey, University of Delaware: Newark, DE, USA, 1996; pp. 1–18.
  25. Watts, W.A. Late-Quaternary Vegetation History at White Pond on the Inner Coastal Plain of South Carolina. Quat. Res. 1980, 13, 187–199.
  26. Richardson, C.J.; Gibbons, J.W. Pocosins, Carolina Bays and Mountain Bogs. In Biodiversity of the Southeastern United States—Lowland Terrestrial Communities; Martin, W.H., Boyce, S.G., Echternacht, A.C., Eds.; Wiley: New York, NY, USA, 1993; pp. 257–310.
  27. Piovan, S.; Hodgson, M.E. How many Carolina bays? An analysis of Carolina bays from USGS topographic maps at different scales. Cartogr. Geogr. Inf. Sci. 2016, 44, 310–326.
  28. Fenstermacher, D.E.; Rabenhorst, M.C.; Lang, M.W.; McCarty, G.W.; Needelman, B.A. Distribution, Morphometry, and Land Use of Delmarva Bays. Wetlands 2014, 34, 1219–1228.
  29. Gevana, D.; Camacho, L.; Carandang, A.; Camacho, S.; Im, S. Land use characterization and change detection of a small mangrove area in Banacon Island, Bohol, Philippines using a maximum likelihood classification method. For. Sci. Technol. 2015, 11, 197–205.
  30. Otukei, J.R.; Blaschke, T. Land cover change assessment using decision trees, support vector machines and maximum likelihood classification algorithms. Int. J. Appl. Earth Obs. Geoinf. 2010, 12 (Suppl. 1), 27–31.
  31. Sekovski, I.; Stecchi, F.; Mancini, F.; Del Rio, L. Image classification methods applied to shoreline extraction on very high-resolution multispectral imagery. Int. J. Remote Sens. 2014, 35, 3556–3578.
  32. Rasmussen, C.; Zhao, J.; Ferraro, D.; Trembanis, A. Deep Census: AUV-Based Scallop Population Monitoring. In Proceedings of the IEEE International Conference on Computer Vision Workshops (ICCVW), Venice, Italy, 22–29 October 2017; pp. 2865–2873.
  33. Hou, F.; Lei, W.; Li, S.; Xi, J.; Xu, M.; Luo, J. Improved Mask R-CNN with distance guided intersection over union for GPR signature detection and segmentation. Autom. Constr. 2021, 121, 103414.
  34. Bonhage, A.; Eltaher, M.; Raab, T.; Breuß, M.; Raab, A.; Schneider, A. A modified Mask region-based convolutional neural network approach for the automated detection of archaeological sites on high-resolution light detection and ranging-derived digital elevation models in the North German Lowland. Archaeol. Prospect. 2021, 28, 177–186.
  35. Zhang, W.; Witharana, C.; Liljedahl, A.K.; Kanevskiy, M. Deep convolutional neural networks for automated characterization of Arctic ice-wedge polygons in very high spatial resolution aerial imagery. Remote Sens. 2018, 10, 1487.
  36. Chen, Z.; Scott, T.R.; Bearman, S.; Anand, H.; Keating, D.; Scott, C.; Arrowsmith, J.R.; Das, J. Geomorphological analysis using unpiloted aircraft systems, structure from motion, and deep learning. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020–24 January 2021; pp. 1276–1283.
  37. Maxwell, A.E.; Pourmohammadi, P.; Poyner, J.D. Mapping the Topographic Features of Mining-Related Valley Fills Using Mask R-CNN Deep Learning and Digital Elevation Data. Remote Sens. 2020, 12, 547.
  38. Guo, W.; Yang, W.; Zhang, H.; Hua, G. Geospatial Object Detection in High Resolution Satellite Images Based on Multi-Scale Convolutional Neural Network. Remote Sens. 2018, 10, 131.
  39. Bochkovskiy, A.; Wang, C.; Liao, H.M. YOLOv4: Optimal speed and accuracy of object detection. arXiv 2020, arXiv:2004.10934.
  40. O’Mahony, N.; Campbell, S.; Carvalho, A.; Harapanahalli, S.; Hernandez, G.V.; Krpalkova, L.; Riordan, D.; Walsh, J. Deep Learning vs. Traditional Computer Vision. In Advances in Computer Vision; Springer International Publishing: Cham, Switzerland, 2019; pp. 128–144.
  41. Campagnolo, M.L.; Cerdeira, J.O. Contextual classification of remotely sensed images with integer linear programming. In CompIMAGE. Computational Modeling of Objects Represented in Images: Fundamentals, Methods, and Applications; Taylor and Francis: London, UK, 2007; pp. 123–128.
  42. de Jong, S.M.; Hornstra, T.; Maas, H.-G. An integrated spatial and spectral approach to the classification of Mediterranean land cover types: The SSC method. Int. J. Appl. Earth Obs. Geoinf. 2001, 3, 176–183.
  43. Gao, Y.; Mas, J.F. A comparison of the performance of pixel-based and object-based classifications over images with various spatial resolutions. Online J. Earth Sci. 2008, 2, 27–35.
  44. Van de Voorde, T.; De Genst, W.; Canters, F.; Stephenne, N.; Wolff, E.; Binard, M. Extraction of land use/land cover related information from very high resolution data in urban and suburban areas. In Remote Sensing in Transition, Proceedings of the 23rd Symposium of the European Association of Remote Sensing Laboratories, Rotterdam, The Netherlands, 2–5 June 2003; pp. 237–244.
  45. Weih, R.; Riggan, N. Object-based classification vs. pixel-based classification: Comparative importance of multi-resolution imagery. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. ISPRS Arch. 2010, 38, C7.
  46. Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. arXiv 2015, arXiv:1506.01497.
  47. He, K.; Gkioxari, G.; Dollár, P.; Girshick, R. Mask R-CNN. arXiv 2018, arXiv:1703.06870.
  48. Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788.
  49. OCM Partners. 2014 USGS CMGP Lidar: Sandy Restoration (Delaware and Maryland). Available online: https://inport.nmfs.noaa.gov/inport/item/49662 (accessed on 1 August 2019).
  50. Virginia Department of Mines, Minerals and Energy (DMME). Virginia GIS Clearinghouse. Available online: https://vgin.maps.arcgis.com/home/index.html (accessed on 1 January 2020).
  51. Maryland iMap. Maryland’s Mapping and GIS Portal Lidar Download. Available online: https://imap.maryland.gov/Pages/lidar-download.aspx (accessed on 1 January 2020).
  52. U.S. Geological Survey. The National Map. Available online: https://www.usgs.gov/core-science-systems/national-geospatial-program/national-map (accessed on 1 January 2020).
  53. Tzutalin. LabelImg. Git Code 2015. Available online: https://github.com/tzutalin/labelImg (accessed on 1 August 2019).
  54. Delaware Geological Survey (DGS). DGS Digital Datasets. Available online: https://www.dgs.udel.edu/data (accessed on 1 August 2019).
  55. U.S. Geological Survey Gap Analysis Program. GAP/LANDFIRE National Terrestrial Ecosystems 2011. U.S. Geological Survey. 2016. Available online: https://www.sciencebase.gov/catalog/item/573cc51be4b0dae0d5e4b0c5 (accessed on 1 January 2020).
  56. U.S. Geological Survey. National Hydrography Dataset. Available online: https://www.usgs.gov/core-science-systems/ngp/national-hydrography (accessed on 1 February 2021).
  57. U.S. Environmental Protection Agency and U.S. Army Corps of Engineers. Clean Water Rule: Definition of “Waters of the United States”. 2015. Available online: https://www.epa.gov/sites/default/files/2015-05/documents/technical_support_document_for_the_clean_water_rule_1.pdf (accessed on 1 August 2021).
  58. U.S. Environmental Protection Agency and U.S. Army Corps of Engineers. The Navigable Waters Protection Rule: Definition of “Waters of the United States”. 2020. Available online: https://www.federalregister.gov/documents/2020/04/21/2020-02500/the-navigable-waters-protection-rule-definition-of-waters-of-the-united-states (accessed on 1 August 2021).
  59. Howell, N.; Krings, A.; Braham, R.R. Guide to the littoral zone vascular flora of Carolina bay lakes (U.S.A.). Biodivers. Data J. 2016, 4, e7964.
  60. Semlitsch, R. Size does matter: The value of small isolated wetlands. Natl. Wetl. Newsl. 2000, 22, 5–6, 13–14.
  61. Spadafora, E.; Leslie, A.W.; Culler, L.E.; Smith, R.F.; Staver, K.W.; Lamp, W.O. Macroinvertebrate community convergence between natural, rehabilitated, and created wetlands. Restor. Ecol. 2016, 24, 463–470.
  62. Van De Genachte, E.; Cammack, S. Carolina Bays of Georgia, Their Distribution, Condition, and Conservation; Georgia Natural Heritage Program, Wildlife Resources Division: Social Circle, GA, USA, 2002.
  63. Kauffman, G.J. Socioeconomic Value of Delaware Wetlands; University of Delaware, Water Resources Center: Newark, DE, USA, 2018.
  64. Fenstermacher, D.E. Carbon Storage and Potential Carbon Sequestration in Depressional Wetlands of the Mid-Atlantic Region. Unpublished Master’s Thesis, University of Maryland, College Park, MD, USA, 2012.
  65. White, M.P.; Alcock, I.; Grellier, J.; Wheeler, B.; Hartig, T.; Warber, S.L.; Bone, A.; Depledge, M.H.; Fleming, L.E. Spending at least 120 minutes a week in nature is associated with good health and wellbeing. Sci. Rep. 2019, 9, 1–11.
  66. Stopar, J.D.; Robinson, M.S.; Barnouin, O.; McEwen, A.S.; Speyerer, E.J.; Henriksen, M.R.; Sutton, S.S. Relative depths of simple craters and the nature of the lunar regolith. Icarus 2017, 298, 34–48. [Google Scholar] [CrossRef]
  67. Sun, S.; Yue, Z.; Di, K. Investigation of the depth and diameter relationship of subkilometer-diameter lunar craters. Icarus 2018, 309, 61–68. [Google Scholar] [CrossRef]
  68. Carver, R.E.; Brook, G.E. Late Pleistocene paleowind directions. Paleogeogr. Paleoclimatol. Paleoecol. 1989, 74, 205–216. [Google Scholar] [CrossRef]
  69. Markewich, H.W.; Litwin, R.J.; Wysocki, D.A.; Pavich, M.J. Synthesis on Quaternary aeolian research in the unglaciated eastern United States. Aeolian Res. 2015, 17, 139–191. [Google Scholar] [CrossRef]
  70. Swezey, C.S. Quaternary Eolian Dunes and Sand Sheets in Inland Locations of the Atlantic Coastal Plain Province, USA. In Inland Dunes of North America. Dunes of the World; Lancaster, N., Hesp, P., Eds.; Springer: Cham, Switzerland, 2020. [Google Scholar]
  71. French, H.M.; Millar, S.W.S. Permafrost at the time of the Last Glacial Maximum (LGM) in North America. Boreas 2013, 43, 667–677. [Google Scholar] [CrossRef]
  72. Lindgren, A.; Hugelius, G.; Kuhry, P.; Christensen, T.R.; Vandenberghe, J. GIS-based Maps and Area Estimates of Northern Hemisphere Permafrost Extent during the Last Glacial Maximum. Permafr. Periglac. Process. 2016, 27, 6–16. [Google Scholar] [CrossRef]
  73. Bowen, M.W.; Johnson, W.C. Late Quaternary environmental reconstructions of playa-lunette system evolution on the central High Plains of Kansas, United States. GSA Bull. 2011, 124, 146–161. [Google Scholar] [CrossRef]
  74. Bowen, M.W.; Johnson, W.C.; Egbert, S.L.; Klopfenstein, S.T. A GIS-based Approach to Identify and Map Playa Wetlands on the High Plains, Kansas, USA. Wetlands 2010, 30, 675–684. [Google Scholar] [CrossRef]
  75. Goudie, A.; Kent, P.; Viles, H. Pan morphology, distribution and formation in Kazakhstan and neighbouring areas of the Russian federation. Desert 2016, 21, 1–13. [Google Scholar]
  76. Quillin, J.P.; Zartman, R.E.; Fish, E.B. Spatial distribution of playa basins on the Texas High Plains. Tex. J. Agric. Nat. Resour. 2005, 18, 1–14. [Google Scholar]
  77. Bowen, M.W. Spatial Distribution and Geomorphic Evolution of Playa-Lunette Systems of the Central High Plains of Kansas. Ph.D. Thesis, University of Kansas, Lawrence, KS, USA, 2011. [Google Scholar]
  78. Bowen, M.W.; Johnson, W.C.; King, D.A. Spatial distribution and geomorphology of lunette dunes on the High Plains of Western Kansas: Implications for geoarchaeological and paleoenvironmental research. Phys. Geogr. 2017, 39, 21–37. [Google Scholar] [CrossRef]
Figure 1. (a) Jones Lake, a Carolina Bay in North Carolina. (b) A flooded Carolina Bay in Delaware. (c) A Carolina Bay in Delaware with bald cypress trees. (d) LiDAR DEM of Carolina Bays on the eastern shore of Virginia. (e) LiDAR DEM of Carolina Bays in North Carolina.
Figure 2. Carolina Bays in central Delaware. (a) LiDAR DEM gridded at 10 m. (b) Aerial imagery from ESRI World Imagery basemap.
Figure 3. Traditional computer vision algorithms tested for Carolina Bay detection. (a) Input DEM. (b) Local minima detector (each white point is a local minimum). (c) Laplacian of Gaussians blob detector (detected blobs and their radii are shown in yellow). (d) Scale-invariant feature transform (SIFT; keypoints and their radii are shown in blue).
Figure 4. (a) Input DEM. (b) Mask annotation. (c) K-means unsupervised two-class classifier. (d) GaussianNB supervised classifier. (e) Decision Tree supervised classifier. (f) Random Forest supervised classifier. (g) Quadratic Discriminant supervised classifier. (h) MLP supervised classifier. (i) AdaBoost supervised classifier.
Figure 5. (a) National Map Elevation 10-m DEM availability by production method. (b) Boundary for mosaic of 10-m elevation data used for this study.
Figure 6. Sediment sample locations across Delmarva.
Figure 7. The general workflow from the ACP DEM to Carolina Bay detections, with information on spatial distribution, morphology, land-use and land-cover, and surface hydrology.
Figure 8. Precision and recall curves for Faster R-CNN and Mask R-CNN Carolina Bay detectors.
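Precision and recall at each confidence threshold, as plotted in Figure 8, can be recovered from scored detections once each one has been matched against the annotations. A minimal sketch follows; the IoU matching rule and variable names are assumptions, not the authors' code:

```python
import numpy as np

def precision_recall(scores, matched, n_annotations):
    """Precision and recall swept over descending confidence thresholds.

    scores: confidence score of each detection.
    matched: 1 if the detection overlapped an annotated Bay (e.g., IoU >= 0.5), else 0.
    n_annotations: total number of annotated Bays (ground truth).
    """
    order = np.argsort(scores)[::-1]        # most confident detections first
    hits = np.asarray(matched, dtype=float)[order]
    tp = np.cumsum(hits)                    # true positives accumulated so far
    fp = np.cumsum(1.0 - hits)              # false positives accumulated so far
    return tp / (tp + fp), tp / n_annotations
```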
Figure 9. Kernel density estimation for (a) area, (b) perimeter, and (c) maximum relief for the Delaware training and test annotations, Delaware Faster R-CNN detections at 60%, Delaware Mask R-CNN detections at 30%, and existing data from DGS.
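Kernel density estimates like those in Figure 9 can be reproduced with SciPy's Gaussian KDE; the input file below is a hypothetical placeholder, since the paper does not specify its KDE implementation:

```python
import numpy as np
from scipy.stats import gaussian_kde

areas = np.loadtxt("delaware_bay_areas_km2.txt")  # hypothetical input: one area per Bay
kde = gaussian_kde(areas)                         # Gaussian kernel, automatic bandwidth
grid = np.linspace(areas.min(), areas.max(), 500)
density = kde(grid)                               # smoothed estimate of the area distribution
```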
Figure 10. Comparing annotated bounding boxes with detection bounding boxes. (a) Bounding box area. (b) Bounding box perimeter. (c) Bounding box easting centroid. (d) Bounding box northing centroid.
Figure 11. Delaware’s Carolina Bay detection counts at 60% threshold for various DEM tile sizes and overlap amounts compared to the annotation count.
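The tile-size and overlap sweep in Figure 11 reduces to generating tile origins across the DEM grid. One possible sketch, not the authors' code:

```python
def tile_origins(width, height, tile, overlap):
    """Upper-left pixel coordinates for square tiles of side `tile` pixels,
    with fractional `overlap` (e.g., 0.25) between neighboring tiles."""
    step = max(1, int(tile * (1.0 - overlap)))
    xs = range(0, max(width - tile, 0) + 1, step)
    ys = range(0, max(height - tile, 0) + 1, step)
    return [(x, y) for y in ys for x in xs]

# e.g., 1024-pixel tiles with 25% overlap across a 10,000 x 8000 pixel DEM mosaic
origins = tile_origins(10000, 8000, 1024, 0.25)
```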
Figure 12. (a) Carolina Bay detections across the ACP. (b) Heat map of Carolina Bay detections showing higher- vs. lower-density areas.
Figure 13. Comparing detection results with manually annotated polygons. Each plot shows an OLS fit with the x-axis quantity as the independent variable and the y-axis quantity as the dependent variable. One-to-one fits are plotted on the area and perimeter plots. (a) Area. (b) Perimeter. (c) Centroid longitude. (d) Centroid latitude.
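Each panel's ordinary least squares fit can be reproduced with scipy.stats.linregress; the matched pairs below are illustrative placeholders, not measured values:

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical matched pairs: area (km^2) of each annotated Bay and its detection
annotated = np.array([0.014, 0.028, 0.059, 0.120, 0.310])
detected = np.array([0.015, 0.029, 0.058, 0.115, 0.298])

fit = linregress(annotated, detected)  # detected area as a function of annotated area
print(f"slope={fit.slope:.3f} intercept={fit.intercept:.4f} r^2={fit.rvalue ** 2:.3f}")
# A slope near 1 and an intercept near 0 indicate detections reproduce annotated sizes.
```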
Figure 14. (a) Distribution of maximum relief from Carolina Bay detections across the ACP. (b) Distribution of area from Carolina Bay detections across the ACP.
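Maximum relief for each detection is the elevation range of the DEM cells inside its polygon. A sketch using rasterio, an assumed tool since the paper does not name its raster library:

```python
import rasterio
from rasterio.mask import mask as clip_to_polygon

def maximum_relief(dem_path, polygon):
    """Elevation range (max minus min) of DEM cells inside one detection.

    polygon: a GeoJSON-like mapping in the DEM's coordinate reference system.
    """
    with rasterio.open(dem_path) as src:
        data, _ = clip_to_polygon(src, [polygon], crop=True, filled=False)
    cells = data.compressed()  # masked array -> valid (in-polygon) cells only
    return float(cells.max() - cells.min())
```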
Figure 15. Polar plot of kernel density estimation of Carolina Bay detection major axis orientations.
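A detection's major-axis orientation can be estimated from the leading eigenvector of the covariance of its outline coordinates; orientations are axial data, so they fold into [0°, 180°). A sketch under those assumptions:

```python
import numpy as np

def major_axis_azimuth(x, y):
    """Azimuth (degrees clockwise from north) of the major axis of a
    polygon outline given by vertex coordinate arrays x (east) and y (north)."""
    coords = np.vstack([np.asarray(x, float) - np.mean(x),
                        np.asarray(y, float) - np.mean(y)])
    eigvals, eigvecs = np.linalg.eigh(np.cov(coords))
    vx, vy = eigvecs[:, np.argmax(eigvals)]         # direction of greatest spread
    return np.degrees(np.arctan2(vx, vy)) % 180.0   # axial data: fold to [0, 180)
```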
Figure 16. General land-use and land-cover type of Carolina Bay detections by fraction of total area and fraction of total count.
Figure 17. KDEs for fraction of Bay covered by lake/pond water and swamp/marsh water.
Figure 18. Closest bodies of water to Carolina Bay detections by fraction of total detection count.
Figure 19. ECDF of distance to the nearest water body for Carolina Bay detections that did not intersect any NHD water bodies.
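An empirical CDF like Figure 19's is simply the sorted distances plotted against their cumulative ranks:

```python
import numpy as np

def ecdf(values):
    """Empirical cumulative distribution function of a 1-D sample."""
    x = np.sort(np.asarray(values, dtype=float))
    y = np.arange(1, x.size + 1) / x.size
    return x, y

# e.g., the fraction of these Bays lying within 2 km of a water body:
# frac_within_2km = np.mean(np.asarray(distances_m) <= 2000)
```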
Figure 20. Grain size cumulative mass fractions with standard errors by sample type. Top: Grain size cumulative mass fractions for samples with more than 10% silt/clay. Bottom: Grain size cumulative mass fractions for samples with less than 10% silt/clay.
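Cumulative mass fractions from sieve stacks are running sums of retained mass normalized by total sample mass. A minimal sketch with hypothetical sieve masses:

```python
import numpy as np

def cumulative_mass_fraction(retained_mass):
    """Cumulative fraction of total sample mass, coarsest sieve first."""
    m = np.asarray(retained_mass, dtype=float)
    return np.cumsum(m) / m.sum()

# e.g., mass (g) retained on sieves ordered coarse -> fine, pan last
print(cumulative_mass_fraction([2.1, 15.4, 40.2, 22.8, 6.5]))
```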
Figure 21. Profile of hand auger samples with descriptions taken at a Carolina Bay in Delaware. (a) Broader geographic location. (b) Location of samples plotted on a DEM. (c) Cross-section with sediment descriptions.
Figure 22. Principal component analysis of various topographic metrics within Carolina Bay detections.
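A PCA like Figure 22's is typically run on standardized metrics, since area, relief, and shape ratios carry different units. A sketch with scikit-learn; the input file and column choices are placeholders:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical matrix: one row per detected Bay; columns are topographic
# metrics (e.g., area, perimeter, maximum relief, length:width ratio)
X = np.loadtxt("bay_metrics.csv", delimiter=",")

Z = StandardScaler().fit_transform(X)  # zero mean, unit variance per metric
pca = PCA(n_components=2).fit(Z)
scores = pca.transform(Z)              # each Bay's coordinates in PC space
print(pca.explained_variance_ratio_)   # variance captured by PC1 and PC2
```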
Figure 23. (a) Carolina Bay detections at various image footprint scales. (b) Aggregation of overlapping detections. (c) PAEK-smoothed polygons.
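PAEK (polynomial approximation with exponential kernel) is an ArcGIS smoothing routine. An open-source approximation of Figure 23's aggregate-then-smooth step can be sketched with Shapely, using a buffer-out/buffer-in pass as a rough stand-in for PAEK:

```python
from shapely.ops import unary_union

def aggregate_and_smooth(polygons, smooth_dist=20.0):
    """Dissolve overlapping multi-scale detections into single footprints,
    then round off jagged edges (a crude stand-in for PAEK smoothing)."""
    merged = unary_union(polygons)  # overlapping detections -> one geometry
    smoothed = merged.buffer(smooth_dist).buffer(-smooth_dist)
    return list(getattr(smoothed, "geoms", [smoothed]))  # MultiPolygon or Polygon
```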
Figure 24. Smoothed, aggregated, multi-scale detections in (a) Central Delaware, (b) Southern Delaware, (c) Virginia, (d) North Carolina, (e) South Carolina, and (f) Georgia.
Figure 25. Comparing a Carolina Bay elevation profile with a hypothetical impact crater elevation profile. The sediment deposit thickness curve is plotted to show how much basin sediment fill has occurred in the Carolina Bay since its formation.
Table 1. Training Settings for different CNNs.

Model | Pre-Trained Model | Batch Size; Iterations; Epochs | Maximum Image Size
Faster R-CNN | Faster_rcnn_inception_v2_pets | 1; 40,193; 27 | 1024 × 1024
Mask R-CNN | mask_rcnn_resnet101_atrous_coco | 1; 40,479; 27 | 1024 × 1024
Yolov5 | Yolov5s | 16; 36,018; 300 | 640 × 640
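The pre-trained models in Table 1 come from the TensorFlow Object Detection API model zoo (Faster R-CNN, Mask R-CNN) and the Ultralytics YOLOv5 repository. A hedged sketch of how such runs are typically launched; the config and dataset file names are placeholders, not the authors' files:

```python
import subprocess

# Faster R-CNN / Mask R-CNN via the TensorFlow Object Detection API:
# batch size 1 and the 1024 x 1024 image cap are set inside the pipeline config
subprocess.run([
    "python", "model_main.py",
    "--pipeline_config_path=configs/faster_rcnn_inception_v2_bays.config",
    "--model_dir=training/faster_rcnn_bays",
], check=True)

# YOLOv5 via the Ultralytics train.py entry point: batch 16, 300 epochs, 640 x 640
subprocess.run([
    "python", "train.py",
    "--img", "640", "--batch", "16", "--epochs", "300",
    "--data", "carolina_bays.yaml", "--weights", "yolov5s.pt",
], check=True)
```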
Table 2. Area, perimeter, and maximum relief middle quartiles for annotations and detections from Delaware.

Dataset (sample size) | Area (km2), 25%/50%/75% | Perimeter (m), 25%/50%/75% | Maximum Relief (m), 25%/50%/75%
Delaware bounding box annotations (n = 3921) | 0.0138 / 0.0281 / 0.0591 | 474.0 / 678.0 / 984.0 | 1.731 / 2.348 / 3.128
Delaware Faster R-CNN detections at 0.60 (n = 4557) | 0.0150 / 0.0286 / 0.0579 | 492.4 / 684.5 / 974.2 | 1.568 / 2.230 / 3.043
DGS dataset, mask annotations (n = 1085) | 0.0109 / 0.0262 / 0.0600 | 393.5 / 662.7 / 1132.9 | 1.515 / 2.171 / 2.990
Delaware Mask R-CNN detections at 0.60 (n = 3328) | 0.0912 / 0.0245 / 0.0613 | 378.9 / 628.1 / 1096.5 | 1.376 / 2.132 / 3.046
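The middle quartiles in Tables 2 and 3 are ordinary 25th/50th/75th percentiles:

```python
import numpy as np

def middle_quartiles(values):
    """25th, 50th, and 75th percentiles, as reported in Tables 2 and 3."""
    return np.percentile(np.asarray(values, dtype=float), [25, 50, 75])

# e.g., middle_quartiles(detection_areas_km2) -> array([q25, q50, q75])
```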
Table 3. Area, maximum relief, and length-to-width ratio middle quartiles for subsets of the Carolina Bay detections.

State/Region (sample size) | Area (km2), 25%/50%/75% | Maximum Relief (m), 25%/50%/75% | Length:Width, 25%/50%/75%
New Jersey (n = 1116) | 0.022 / 0.039 / 0.067 | 1.828 / 2.509 / 3.507 | 1.05 / 1.11 / 1.21
Delaware (n = 2408) | 0.030 / 0.052 / 0.091 | 1.761 / 2.332 / 3.045 | 1.05 / 1.11 / 1.21
Maryland (n = 4871) | 0.028 / 0.049 / 0.091 | 0.800 / 1.345 / 2.412 | 1.05 / 1.11 / 1.21
Virginia (n = 312) | 0.097 / 0.267 / 0.620 | 2.184 / 3.064 / 4.511 | 1.05 / 1.11 / 1.22
North Carolina (n = 1017) | 0.169 / 0.373 / 0.910 | 2.789 / 4.050 / 5.675 | 1.04 / 1.09 / 1.17
South Carolina (n = 2383) | 0.143 / 0.268 / 0.602 | 2.828 / 4.151 / 6.016 | 1.08 / 1.15 / 1.26
Georgia (n = 1129) | 0.125 / 0.200 / 0.406 | 3.497 / 4.848 / 6.875 | 1.05 / 1.13 / 1.24
North of 37° (n = 8699) | 0.028 / 0.050 / 0.092 | 1.100 / 1.963 / 2.839 | 1.05 / 1.11 / 1.21
South of 37° (n = 4552) | 0.142 / 0.264 / 0.608 | 2.976 / 4.316 / 6.190 | 1.06 / 1.13 / 1.24
Entire ACP (n = 13,251) | 0.038 / 0.086 / 0.221 | 1.500 / 2.533 / 3.970 | 1.05 / 1.12 / 1.22
