Visual data mining in immersive virtual reality: an approachable workflow for medical research and big data presentation
Abstract
Background: Visual data mining is a powerful but difficult-to-utilise methodology for the analysis of big datasets in medical research. Dimensionality reduction methods are commonly used for visual representation; however, displaying data in 2D is not always conducive to the identification of nuanced trends or anomalies. Virtual reality (VR) technologies have recently become more accessible and offer unparalleled immersion for displaying big data.
Aims: We sought to develop an approachable and robust methodology for interrogating medical science datasets in VR. The solution was to be applicable to a wide variety of dataset types and to depict more dimensions, components, or individual variables than a conventional monitor display allows.
Methods: We utilised Houdini (SideFX) to produce 3D data models (based on dimensionality reduction transformations) and Unreal Engine to curate the generated datasets. We implemented stereoscopic sound, haptic feedback, individual datapoint interactivity, and database lookup capability. The simulation was tested on an Oculus Quest headset using clustered and networked datasets.
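The first step of such a workflow, reducing a tabular dataset to 3D coordinates for import into Houdini, could look like the minimal sketch below. This is illustrative rather than the authors' implementation: the choice of PCA, the column names, and the CSV exchange format are assumptions, as the abstract does not specify which transformation or file format was used.

```python
# Minimal sketch (not the authors' code): project a tabular dataset to three
# components and export a point cloud that Houdini can ingest, e.g. as CSV.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def dataset_to_point_cloud(csv_in: str, csv_out: str, id_column: str) -> None:
    df = pd.read_csv(csv_in)
    features = df.drop(columns=[id_column]).select_dtypes("number")

    # Standardise the features, then reduce to three components for 3D display.
    coords = PCA(n_components=3).fit_transform(
        StandardScaler().fit_transform(features)
    )

    cloud = pd.DataFrame(coords, columns=["P.x", "P.y", "P.z"])
    cloud[id_column] = df[id_column]   # keep the identifier for later database lookup
    cloud.to_csv(csv_out, index=False)

if __name__ == "__main__":
    # "patients.csv" and "record_id" are placeholder names for illustration.
    dataset_to_point_cloud("patients.csv", "point_cloud.csv", id_column="record_id")
```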
Results: Datasets are easily generated and presented as a galaxy-style virtual environment. Visual, aural, and kinetic cues are utilised to accelerate interpretation of spatial relationships and interrogation of the dataset. Datapoints can be automatically linked to external databases to yield additional information that informs the mining process; this feature is particularly useful for rapid interpretation of networked data and the generation of new hypotheses.
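The datapoint-to-database link could be realised as a simple identifier lookup against an external service, as in the sketch below. The endpoint URL, query parameter, and response shape are placeholders; the abstract does not name the databases used or how the engine-side lookup is wired up.

```python
# Minimal sketch (illustrative only): resolve a selected datapoint's identifier
# against an external database over HTTP and return the extra fields to show in VR.
import json
import urllib.parse
import urllib.request

def lookup_datapoint(record_id: str,
                     base_url: str = "https://example.org/api/records") -> dict:
    # Placeholder endpoint; substitute the database API actually used.
    url = f"{base_url}?{urllib.parse.urlencode({'id': record_id})}"
    with urllib.request.urlopen(url, timeout=5) as response:
        return json.load(response)

if __name__ == "__main__":
    print(lookup_datapoint("record_0042"))
```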
Conclusions: The VR approach is highly novel, yet the methodology remains uncomplicated and low cost. We will work toward evaluation of the tool in the post-pandemic period.