Automated Mapping of Typical Cropland Strips in the North China Plain Using Small Unmanned Aircraft Systems (sUAS) Photogrammetry

dc.contributor.author: Zhang, Jianyong
dc.contributor.author: Zhao, Yanling
dc.contributor.author: Abbott, A. Lynn
dc.contributor.author: Wynne, Randolph H.
dc.contributor.author: Hu, Zhenqi
dc.contributor.author: Zou, Yuzhu
dc.contributor.author: Tian, Shuaishuai
dc.contributor.department: Electrical and Computer Engineering
dc.contributor.department: Forest Resources and Environmental Conservation
dc.coverage.country: China
dc.date.accessioned: 2019-10-14T12:21:31Z
dc.date.available: 2019-10-14T12:21:31Z
dc.date.issued: 2019-10-10
dc.date.updated: 2019-10-11T15:54:27Z
dc.description.abstract: Accurate mapping of agricultural fields is needed for many purposes, including irrigation decisions and cadastral management. This paper is concerned with the automated mapping of cropland strips that are common in the North China Plain. These strips are commonly 3–8 m in width and 50–300 m in length, and are separated by small ridges that assist with irrigation. Conventional surveying methods are labor-intensive and time-consuming for this application, and only limited performance is possible with very high resolution satellite images. Small Unmanned Aircraft System (sUAS) images could provide an alternative approach to ridge detection and strip mapping. This paper presents a novel method for detecting cropland strips, utilizing centimeter spatial resolution imagery captured by sUAS flying at low altitude (60 m). Using digital surface models (DSM) and ortho-rectified imagery from sUAS data, this method extracts candidate ridge locations by surface roughness segmentation in combination with geometric constraints. This method then exploits vegetation removal and morphological operations to refine candidate ridge elements, leading to polyline-based representations of cropland strip boundaries. This procedure has been tested using sUAS data from four typical cropland plots located approximately 60 km west of Jinan, China. The plots contained early winter wheat. The results indicated an ability to detect ridges with comparatively high recall and precision (96.8% and 95.4%, respectively). Cropland strips were extracted with over 98.9% agreement relative to ground truth, with kappa coefficients over 97.4%. To our knowledge, this method is the first to attempt cropland strip mapping using centimeter spatial resolution sUAS images. These results have demonstrated that sUAS mapping is a viable approach for data collection to assist in agricultural land management in the North China Plain.
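
The abstract describes a pipeline of surface-roughness segmentation on the DSM, vegetation removal using the orthomosaic, and morphological refinement of candidate ridge pixels. The following is a minimal illustrative sketch of that kind of pipeline; the window size, thresholds, excess-green vegetation index, and 3x3 structuring elements are assumptions chosen for illustration, not the parameters or exact operators reported in the paper.

    import numpy as np
    from scipy import ndimage

    def detect_ridge_candidates(dsm, ortho_rgb, win=5,
                                roughness_thresh=0.03, exg_thresh=0.05):
        """Illustrative sketch: flag candidate ridge pixels from a DSM and an
        RGB orthomosaic. All parameter values here are placeholders, not the
        settings used in the published method.

        dsm       : 2-D array of elevations (meters)
        ortho_rgb : H x W x 3 array of band values scaled to [0, 1]
        """
        dsm = dsm.astype(float)

        # Local surface roughness: standard deviation of elevation in a
        # sliding window (one common roughness measure; the paper may define
        # roughness differently).
        local_mean = ndimage.uniform_filter(dsm, size=win)
        local_mean_sq = ndimage.uniform_filter(dsm ** 2, size=win)
        roughness = np.sqrt(np.clip(local_mean_sq - local_mean ** 2, 0.0, None))
        candidates = roughness > roughness_thresh

        # Vegetation removal: suppress green vegetation with the excess-green
        # index (a stand-in for whatever vegetation masking the authors used).
        r, g, b = ortho_rgb[..., 0], ortho_rgb[..., 1], ortho_rgb[..., 2]
        exg = 2.0 * g - r - b
        candidates &= exg < exg_thresh

        # Morphological opening then closing to drop isolated pixels and
        # bridge small gaps along ridge lines.
        se = np.ones((3, 3), dtype=bool)
        candidates = ndimage.binary_opening(candidates, structure=se)
        candidates = ndimage.binary_closing(candidates, structure=se)
        return candidates

The refined binary mask would still need to be vectorized (e.g., thinned and traced) to obtain the polyline strip boundaries mentioned in the abstract; that step is omitted here.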
dc.description.version: Published version
dc.format.mimetype: application/pdf
dc.identifier.citation: Zhang, J.; Zhao, Y.; Abbott, A.L.; Wynne, R.H.; Hu, Z.; Zou, Y.; Tian, S. Automated Mapping of Typical Cropland Strips in the North China Plain Using Small Unmanned Aircraft Systems (sUAS) Photogrammetry. Remote Sens. 2019, 11, 2343.
dc.identifier.doi: https://doi.org/10.3390/rs11202343
dc.identifier.uri: http://hdl.handle.net/10919/94570
dc.language.iso: en
dc.publisher: MDPI
dc.rights: Creative Commons Attribution 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: automated extraction
dc.subject: ridge detection
dc.subject: strip mapping
dc.subject: small unmanned aircraft systems (sUAS)
dc.subject: surface roughness
dc.subject: North China Plain
dc.title: Automated Mapping of Typical Cropland Strips in the North China Plain Using Small Unmanned Aircraft Systems (sUAS) Photogrammetry
dc.title.serial: Remote Sensing
dc.type: Article - Refereed
dc.type.dcmitype: Text

Files

Original bundle
Name: remotesensing-11-02343.pdf
Size: 44.07 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 0 B
Format: Item-specific license agreed to upon submission