
Enhancing Trust in Autonomous Systems without Verifying Software

dc.contributor.author: Stamenkovich, Joseph Allan
dc.contributor.committeechair: Patterson, Cameron D.
dc.contributor.committeemember: Saad, Walid
dc.contributor.committeemember: Huang, Bert
dc.contributor.department: Electrical and Computer Engineering
dc.date.accessioned: 2019-06-13T08:00:38Z
dc.date.available: 2019-06-13T08:00:38Z
dc.date.issued: 2019-06-12
dc.description.abstract: The complexity of the software behind autonomous systems is rapidly growing, as is the range of their applications. It is not unusual for the code to reach millions of lines, which compounds the verification challenge. The machine learning algorithms involved are often "black boxes" whose precise workings are unknown even to the developers applying them, and whose behavior is undefined when an untrained scenario is encountered. With so much code, the possibility of bugs or malicious code is considerable. An approach is developed to monitor, and if necessary override, the behavior of autonomous systems independently of the software controlling them. Application-isolated safety monitors are implemented in configurable hardware to ensure that the behavior of an autonomous system is limited to what is intended. The sensor inputs may be shared with the software, but the output from the monitors is engaged only when the system violates its prescribed behavior. For each specific rule the system is expected to follow, a dedicated monitor processes the relevant sensor information. The behavior is defined in linear temporal logic (LTL), and the associated monitors are implemented in a field programmable gate array (FPGA). An off-the-shelf drone is used to demonstrate the effectiveness of the monitors without any physical modifications to the drone. Upon detection of a violation, appropriate corrective actions are persistently enforced on the autonomous system.
dc.description.abstractgeneral: Autonomous systems are surprisingly vulnerable, not just to malicious hackers, but to design errors and oversights. The code required can quickly climb into millions of lines, and the artificial decision algorithms can be inscrutable and fully dependent upon the information they are trained on. These factors make verification of the core software running our autonomous cars, drones, and everything else prohibitively difficult by traditional means. Independent safety monitors are implemented to provide oversight for these autonomous systems. A semi-automatic design process efficiently creates error-free monitors from the safety rules drones need to follow. These monitors remain separate and isolated from the software typically controlling the system, but use the same sensor information. They are embedded in the circuitry and act as their own small, task-specific processors, each watching to make sure a particular rule is not violated; otherwise, they take control of the system and force corrective behavior. The monitors are added to a consumer off-the-shelf (COTS) drone to demonstrate their effectiveness. For every rule monitored, an override is triggered when that rule is violated. As with any electronic component, their effectiveness depends on reliable sensor information, as well as on the completeness of the rules from which the monitors are built.
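The abstracts describe monitors compiled from LTL safety rules that latch on violation and persistently enforce an override. A minimal sketch of that latching behavior, in Python rather than the thesis's HLS/FPGA toolchain, for an assumed rule G(altitude <= MAX_ALT) with a hypothetical ceiling value:

```python
MAX_ALT = 120.0  # hypothetical altitude ceiling in metres (not from the thesis)

class SafetyMonitor:
    """One rule, one latched violation flag, mirroring a hardware register."""

    def __init__(self, limit=MAX_ALT):
        self.limit = limit
        self.violated = False  # latches once set, so the override persists

    def step(self, altitude):
        # Evaluate the safety predicate on each new sensor sample.
        if altitude > self.limit:
            self.violated = True
        return self.violated  # True -> engage the override path

m = SafetyMonitor()
print(m.step(50.0))   # False: within limit
print(m.step(130.0))  # True: violation detected
print(m.step(60.0))   # True: override persists even after recovery
```

The latch reflects the "persistently enforced" corrective action described in the abstract: once a rule is broken, the monitor does not hand control back simply because the sensor reading returns to range.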
dc.description.degree: Master of Science
dc.format.medium: ETD
dc.identifier.other: vt_gsexam:20441
dc.identifier.uri: http://hdl.handle.net/10919/89950
dc.publisher: Virginia Tech
dc.rights: In Copyright
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: Autonomy
dc.subject: Runtime Verification
dc.subject: Field Programmable Gate Arrays
dc.subject: Monitor
dc.subject: Formal Methods
dc.subject: UAS
dc.subject: Drone aircraft
dc.subject: Security
dc.subject: Linear Temporal Logic
dc.subject: LTL
dc.subject: High-Level Synthesis
dc.subject: HLS
dc.subject: model checking
dc.subject: drone
dc.subject: malware
dc.subject: assurance
dc.subject: robotics
dc.subject: firmware
dc.subject: hardware
dc.title: Enhancing Trust in Autonomous Systems without Verifying Software
dc.type: Thesis
thesis.degree.discipline: Computer Engineering
thesis.degree.grantor: Virginia Polytechnic Institute and State University
thesis.degree.level: masters
thesis.degree.name: Master of Science

Files

Original bundle
Name: Stamenkovich_JA_T_2019.pdf
Size: 5.09 MB
Format: Adobe Portable Document Format