Author: Whang, JooYoung
Date accessioned: 2020-07-30
Date available: 2020-07-30
Date issued: 2020-07-29
Identifier: vt_gsexam:26973
URI: http://hdl.handle.net/10919/99454
Abstract: With advances in High-Performance Computing, modern scientific simulations are scaling to millions and even billions of grid points. As we enter the exascale era, new strategies are required for visualization and analysis. While Image-Based Rendering (IBR) has emerged as a viable answer to the asymmetry between data size and the available storage and rendering power, it is limited by its 2D image portrayal of 3D spatial objects. This work describes a novel technique to capture, represent, and render depth information in the context of 3D IBR. Using this technique, we evaluated the effect of displacement via displacement maps, shading via normals, and the image angle interval. We ran an online user study with 60 participants to assess the value of adding depth information back to Image-Based Rendering and found significant benefits.
Degree type: ETD
Rights: In Copyright
Keywords: Virtual Reality; Displacement map; Image-Based Rendering; Depth Cue
Title: Improving the Perception of Depth of Image-Based Objects in a Virtual Environment
Type: Thesis