Environment Mapping in Larger Spaces

dc.contributor.author: Ciambrone, Andrew James
dc.contributor.committeechair: Gracanin, Denis
dc.contributor.committeemember: North, Christopher L.
dc.contributor.committeemember: Ogle, J. Todd
dc.contributor.department: Computer Science
dc.date.accessioned: 2017-02-09T18:28:34Z
dc.date.available: 2017-02-09T18:28:34Z
dc.date.issued: 2017-02-09
dc.description.abstract: Spatial mapping, or environment mapping, is the process of exploring a real-world environment and creating its digital representation. To create convincing mixed reality programs, an environment mapping device must be able to detect a user's position and map the user's environment. Currently available commercial spatial mapping devices mostly use infrared cameras to obtain a depth map, which is effective only at short to medium distances (3-4 meters). This work describes an extension to existing environment mapping devices and techniques that enables mapping of larger architectural environments using a combination of a camera, an Inertial Measurement Unit (IMU), and Light Detection and Ranging (LIDAR) devices, supported by sensor fusion and computer vision techniques. The proposed system has three main parts: data collection and fusion using embedded hardware, data processing (segmentation), and creation of a geometry mesh of the environment. The developed system was evaluated on its ability to determine the dimensions of a room and of objects within the room. This low-cost system can significantly expand the mapping range of existing mixed reality devices such as the Microsoft HoloLens.
dc.description.abstractgeneral: Mixed reality is the mixing of computer-generated graphics and real-world objects to create an augmented view of a space. Environment mapping, the process of creating a digital representation of an environment, is used in mixed reality applications so that virtual objects can interact logically with the physical environment. Most current approaches to this problem work only at short to medium distances. This work describes an extension to existing devices and techniques that enables mapping of larger architectural spaces. The developed system was evaluated on its ability to determine the dimensions of a room and of objects within the room. Under adequate conditions, the system determined room dimensions with an error of less than twenty percent and object dimensions with an error of less than five percent. This low-cost system can significantly expand the mapping range of existing mixed reality devices such as the Microsoft HoloLens, allowing more diverse mixed reality applications to be developed and used.
dc.description.degree: Master of Science
dc.format.medium: ETD
dc.identifier.other: vt_gsexam:9480
dc.identifier.uri: http://hdl.handle.net/10919/74984
dc.publisher: Virginia Tech
dc.rights: In Copyright
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: Sensor Fusion
dc.subject: Environment Mapping
dc.subject: Image Processing
dc.subject: Computer Vision
dc.title: Environment Mapping in Larger Spaces
dc.type: Thesis
thesis.degree.discipline: Computer Science and Applications
thesis.degree.grantor: Virginia Polytechnic Institute and State University
thesis.degree.level: masters
thesis.degree.name: Master of Science
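
The abstract above outlines a pipeline that fuses camera, IMU, and LIDAR data into a point cloud and then derives room and object dimensions from it. As a rough illustration of the sensor-fusion step only, the sketch below is not taken from the thesis: the function names, the single-beam LIDAR assumption, and the bounding-box dimension estimate are all hypothetical simplifications. It converts IMU orientation angles plus LIDAR ranges into 3D points and reads room extents off the resulting cloud.

```python
# Illustrative sketch only: fuse IMU orientation with single-beam LIDAR ranges
# into a sparse 3D point cloud, then estimate room extents from its
# axis-aligned bounding box. Names and structure are hypothetical and are not
# taken from the thesis, which additionally performs segmentation and mesh
# generation.
import numpy as np

def spherical_to_cartesian(r, yaw, pitch):
    """Convert a LIDAR range plus IMU yaw/pitch (radians) into an (x, y, z) point."""
    x = r * np.cos(pitch) * np.cos(yaw)
    y = r * np.cos(pitch) * np.sin(yaw)
    z = r * np.sin(pitch)
    return np.array([x, y, z])

def build_point_cloud(samples):
    """samples: iterable of (range_m, yaw_rad, pitch_rad) tuples from the sensor rig."""
    return np.array([spherical_to_cartesian(r, yaw, pitch) for r, yaw, pitch in samples])

def estimate_room_dimensions(points):
    """Rough width/depth/height from the bounding box of the fused point cloud."""
    extents = points.max(axis=0) - points.min(axis=0)
    return {"width": extents[0], "depth": extents[1], "height": extents[2]}

# Example: four hypothetical readings taken while sweeping the sensor head.
readings = [(4.2, 0.0, 0.0), (3.8, np.pi / 2, 0.0), (4.1, np.pi, 0.0), (2.6, 0.0, np.pi / 2)]
cloud = build_point_cloud(readings)
print(estimate_room_dimensions(cloud))
```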

Files

Original bundle
Name: Ciambrone_AJ_T_2017.pdf
Size: 4.94 MB
Format: Adobe Portable Document Format
