Optical Satellite/Component Tracking and Classification via Synthetic CNN Image Processing for Hardware-in-the-Loop Testing and Validation of Space Applications Using Free-Flying Drone Platforms

dc.contributor.author: Peterson, Marco Anthony
dc.contributor.committeechair: Black, Jonathan T.
dc.contributor.committeemember: Artis, Harry Pat
dc.contributor.committeemember: Williams, Ryan K.
dc.contributor.committeemember: England, Scott L.
dc.contributor.department: Aerospace and Ocean Engineering
dc.date.accessioned: 2022-04-22T08:00:22Z
dc.date.available: 2022-04-22T08:00:22Z
dc.date.issued: 2022-04-21
dc.description.abstract: The proliferation of reusable space vehicles has fundamentally changed how we inject assets into orbit and beyond, increasing the reliability and frequency of launches and leading to the rapid development and adoption of new technologies in the aerospace sector, such as computer vision (CV), machine learning (ML), and distributed networking. All of these technologies are necessary to enable genuinely autonomous decision-making for space-borne platforms as our spacecraft travel farther into the solar system and our mission sets become more ambitious, requiring true "human out of the loop" solutions for a wide range of engineering and operational problems. Systems proficient at classifying, tracking, capturing, and ultimately manipulating orbital assets and components for maintenance and assembly, in the persistently dynamic environment of space and on the surfaces of other celestial bodies, perform tasks commonly referred to as On-Orbit Servicing and In-Space Assembly and hold unique automation potential. Given the inherent dangers of the crewed spaceflight and extravehicular activity (EVA) methods currently used for spacecraft construction and maintenance, coupled with the current limitations on long-duration human flight outside of low Earth orbit, space robotics equipped with generalized sensing and control machine learning architectures are a tremendous enabling technology. However, the large amounts of sensor data required to adequately train neural networks for these space-domain tasks are either limited or nonexistent, requiring alternate means of data collection or generation. Additionally, the wide-scale tools and methodologies required for hardware-in-the-loop simulation, testing, and validation of these new technologies outside of multimillion-dollar facilities are largely in their developmental stages. This dissertation proposes a novel approach for simulating space-based computer vision sensing and robotic control using both physical and virtual reality testing environments. The methodology is designed to be both affordable and expandable, enabling hardware-in-the-loop simulation and validation of space systems at large scale across multiple institutions. While the specific computer vision models in this work are narrowly focused on solving imagery problems found on orbit, the approach can be extended to any problem set that requires robust onboard computer vision, robotic manipulation, and free-flight capabilities.
dc.description.abstractgeneral: Real-world imagery of space assets and planetary surfaces, needed to train neural networks to autonomously identify, classify, and make decisions in these environments, is limited, nonexistent, or prohibitively expensive to obtain. This dissertation leverages the Unreal Engine, motion capture, and theatre projection technologies, combined with robotics, computer vision, and machine learning, to recreate these environments for optical machine learning testing and validation for space and other celestial applications. It also incorporates domain randomization methods to increase neural network performance for the above-mentioned applications.
dc.description.degree: Doctor of Philosophy
dc.format.medium: ETD
dc.identifier.other: vt_gsexam:34089
dc.identifier.uri: http://hdl.handle.net/10919/109721
dc.language.iso: en
dc.publisher: Virginia Tech
dc.rights: In Copyright
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: Computer Vision
dc.subject: Machine Learning
dc.subject: Neural networking
dc.subject: drones
dc.subject: mobile manipulation
dc.subject: optical simulation
dc.subject: synthetic data
dc.subject: virtual reality
dc.subject: Object tracking
dc.title: Optical Satellite/Component Tracking and Classification via Synthetic CNN Image Processing for Hardware-in-the-Loop Testing and Validation of Space Applications Using Free-Flying Drone Platforms
dc.type: Dissertation
thesis.degree.discipline: Aerospace Engineering
thesis.degree.grantor: Virginia Polytechnic Institute and State University
thesis.degree.level: doctoral
thesis.degree.name: Doctor of Philosophy
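
The general abstract above mentions domain randomization as a way to improve neural network performance when training on synthetic imagery. The sketch below is a minimal, illustrative Python example of that idea only, not the dissertation's actual pipeline; the file path "satellite_render.png" and the specific perturbations (rotation for pose, brightness/contrast for lighting, a noise background for clutter) are assumptions chosen for the example.

    # Minimal domain-randomization sketch (illustrative only): perturb pose,
    # lighting, and background of a rendered object so a downstream CNN does
    # not overfit to a single rendering setup.
    import random
    import numpy as np
    from PIL import Image, ImageEnhance

    def randomize(render):
        """Apply random rotation, brightness/contrast, and background to one render."""
        # Random in-plane rotation stands in for randomized camera/object pose.
        img = render.rotate(random.uniform(0, 360), expand=True)
        # Random brightness and contrast stand in for randomized sun angle and albedo.
        img = ImageEnhance.Brightness(img).enhance(random.uniform(0.4, 1.6))
        img = ImageEnhance.Contrast(img).enhance(random.uniform(0.6, 1.4))
        # Paste onto a random noise background as a stand-in for varied scenes.
        bg = Image.fromarray(
            (np.random.rand(img.height, img.width, 3) * 255).astype(np.uint8)
        )
        bg.paste(img, (0, 0), img.convert("RGBA"))
        return bg

    if __name__ == "__main__":
        # "satellite_render.png" is a placeholder path for one synthetic render.
        base = Image.open("satellite_render.png").convert("RGBA")
        samples = [randomize(base) for _ in range(8)]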

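The abstract also describes training convolutional neural networks (CNNs) on synthetic imagery to classify satellites and their components. The following is a minimal, hedged PyTorch sketch of such a training loop; the directory name "synthetic_renders/" (class-named subfolders of renders), the network size, and the hyperparameters are illustrative assumptions, not the models developed in the dissertation.

    # Minimal CNN classification sketch (illustrative only): train a small
    # convolutional network on synthetic renders arranged as
    # synthetic_renders/<class_name>/<image>.png
    import torch
    import torch.nn as nn
    from torchvision import datasets, transforms

    tfms = transforms.Compose([
        transforms.Resize((128, 128)),
        transforms.ToTensor(),
    ])
    # "synthetic_renders/" is a placeholder directory of synthetic images.
    train_set = datasets.ImageFolder("synthetic_renders/", transform=tfms)
    loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

    model = nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Linear(32 * 32 * 32, len(train_set.classes)),
    )
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    for images, labels in loader:  # one pass over the synthetic data
        opt.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        opt.step()

In practice, renders produced with an engine such as the Unreal Engine would populate the class folders, and domain-randomized variants like those sketched above would expand the training set.
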
Files

Original bundle
Name: Peterson_MA_D_2022.pdf
Size: 57.81 MB
Format: Adobe Portable Document Format