Global Event and Trend Archive Research (GETAR) is supported by NSF (IIS-1619028 and 1619371) through 2019. It will devise interactive, integrated, digital library/archive systems coupled with linked and expert-curated webpage/tweet collections. This project supplements GETAR by providing a Virtual Reality (VR) interface to visualize geospatial data and extract meaningful information from it. It focuses primarily on visualizing tweets and images obtained from the GETAR data archive on a globe in a VR world. This is accomplished using tools such as Unity, HTC Vive, C#, and Git. To ensure that the product meets the end user's specifications, this project uses an iterative workflow with a very short feedback loop: feedback from Dr. Fox and our team members informs each subsequent prototype and the final product. Our project is intended to be used as a demo by school children interested in data analytics and data science, and it can also be extended with additional features. Our final product can display images and tweets on a globe in a VR world, provided that they have location information. As part of our final deliverable, we delivered a report, a presentation, a video demo, and a GitHub repository containing the source code for our project. During this project, our team learned that building a 3D application is very different from building a 2D desktop application. We also learned that it is crucial to meticulously document all parts of product development to assist future development.
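The central geospatial step described above — placing a geolocated tweet or image on a globe — amounts to converting latitude and longitude to a point on a sphere. The following C# fragment is an illustrative sketch only, not the project's actual code; the class and method names are hypothetical, and it assumes Unity's left-handed, Y-up coordinate system:

```csharp
using UnityEngine;

// Hypothetical helper: maps a latitude/longitude pair to a point on the
// surface of a globe of the given radius, centered at the origin.
public static class GeoMapping
{
    public static Vector3 LatLonToSphere(float latDeg, float lonDeg, float radius)
    {
        float lat = latDeg * Mathf.Deg2Rad;
        float lon = lonDeg * Mathf.Deg2Rad;
        // Standard spherical-to-Cartesian conversion, with Y as the polar axis.
        float x = radius * Mathf.Cos(lat) * Mathf.Cos(lon);
        float y = radius * Mathf.Sin(lat);
        float z = radius * Mathf.Cos(lat) * Mathf.Sin(lon);
        return new Vector3(x, y, z);
    }
}
```

A marker prefab for each tweet or image could then be instantiated at the returned position and parented to the globe's transform so that it rotates with the globe.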

VR, GETAR, Data exploration, Tweet, Video, Geolocation, Unity, HTC Vive