
A Modular and Open-Source Framework for Virtual Reality Visualisation and Interaction in Bioimaging

Abstract

Life science today involves computational analysis of large amounts and varieties of data, such as volumetric data acquired with state-of-the-art microscopes, or mesh data derived from the analysis of such data or from simulations. The advent of new imaging technologies, such as lightsheet microscopy, confronts users with an ever-growing amount of data, with terabytes of imaging data created within a single day. With gentler and higher-performance imaging now possible, the spatiotemporal complexity of the model systems and processes of interest is increasing as well. Visualisation is often the first step in making sense of this data, and a crucial part of building and debugging analysis pipelines. It is therefore important that visualisations can be prototyped quickly, as well as developed into or embedded in full applications. To better judge spatiotemporal relationships, immersive hardware, such as Virtual and Augmented Reality (VR/AR) headsets and their associated controllers, is becoming an invaluable tool.
In this work we present scenery, a modular and extensible visualisation framework for the Java VM that can handle both mesh data and large volumetric data with multiple views, timepoints, and colour channels. scenery is free and open-source software, runs on all major platforms, and uses the Vulkan or OpenGL rendering APIs. We introduce scenery's main features and discuss its use with VR/AR hardware and in distributed rendering.
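To give an impression of the kind of API scenery exposes, the following Kotlin sketch sets up a minimal scene in the spirit of scenery's published examples. Exact class and property names have changed between scenery versions, so the details here should be read as assumptions rather than a fixed API.

    import org.joml.Vector3f
    import graphics.scenery.Box
    import graphics.scenery.DetachedHeadCamera
    import graphics.scenery.PointLight
    import graphics.scenery.SceneryBase
    import graphics.scenery.SceneryElement
    import graphics.scenery.backends.Renderer

    /** Minimal scenery application: a lit, coloured box viewed by a perspective camera. */
    class MinimalExample : SceneryBase("MinimalExample", windowWidth = 1280, windowHeight = 720) {
        override fun init() {
            // Create a renderer; scenery picks Vulkan or OpenGL depending on platform support.
            renderer = hub.add(SceneryElement.Renderer,
                Renderer.createRenderer(hub, applicationName, scene, windowWidth, windowHeight))

            // A simple mesh node added to the scene graph.
            val box = Box(Vector3f(1.0f, 1.0f, 1.0f))
            box.material.diffuse = Vector3f(0.8f, 0.2f, 0.2f)
            scene.addChild(box)

            // Lights and cameras are scene-graph nodes as well.
            val light = PointLight(radius = 15.0f)
            light.position = Vector3f(0.0f, 0.0f, 2.0f)
            light.intensity = 5.0f
            scene.addChild(light)

            val cam = DetachedHeadCamera()
            cam.position = Vector3f(0.0f, 0.0f, 5.0f)
            cam.perspectiveCamera(50.0f, windowWidth, windowHeight)
            scene.addChild(cam)
        }
    }

    fun main() = MinimalExample().main()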
In addition to the visualisation framework, we present a series of case studies in which scenery provides tangible benefit for developmental and systems biology: With Bionic Tracking, we demonstrate a new technique for tracking cells in 4D volumetric datasets by following the user's eye gaze in a virtual reality headset, with the potential to speed up manual tracking tasks by an order of magnitude. We further introduce ideas for moving towards virtual reality-based laser ablation, and perform a user study to gain insight into performance, acceptance, and issues when performing ablation tasks with virtual reality hardware in fast-developing specimens. To tame the amount of data originating from state-of-the-art volumetric microscopes, we present ideas on how to render the highly efficient Adaptive Particle Representation. Finally, we present sciview, an ImageJ2/Fiji plugin that makes the features of scenery available to a wider audience.
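The core idea behind Bionic Tracking can be sketched as follows: the gaze ray from the headset is marched through the volume, and the tracked cell is localised near the point of maximal intensity along that ray. The Kotlin sketch below is a simplified, hypothetical illustration of this idea; the `Volume` class, `sampleAt`, and `localiseAlongGaze` are inventions for this example, and the actual implementation deals with large, out-of-core volumes and more robust localisation.

    import org.joml.Vector3f

    // Hypothetical in-memory volume, for illustration only.
    class Volume(private val data: FloatArray,
                 private val sizeX: Int, private val sizeY: Int, private val sizeZ: Int) {
        fun sampleAt(p: Vector3f): Float {
            val x = p.x.toInt(); val y = p.y.toInt(); val z = p.z.toInt()
            if (x !in 0 until sizeX || y !in 0 until sizeY || z !in 0 until sizeZ) return 0.0f
            return data[(z * sizeY + y) * sizeX + x]
        }
    }

    // March along the gaze ray in fixed steps and return the sample point with
    // maximal intensity -- the simplest variant of gaze-based cell localisation.
    fun localiseAlongGaze(volume: Volume, origin: Vector3f, direction: Vector3f,
                          steps: Int = 512, stepSize: Float = 1.0f): Vector3f {
        val dir = Vector3f(direction).normalize()
        var best = Vector3f(origin)
        var bestIntensity = Float.NEGATIVE_INFINITY
        for (i in 0 until steps) {
            val p = Vector3f(dir).mul(i * stepSize).add(origin)
            val intensity = volume.sampleAt(p)
            if (intensity > bestIntensity) {
                bestIntensity = intensity
                best = p
            }
        }
        return best
    }

Calling such a localisation once per timepoint while the user keeps their gaze on a cell yields a candidate track; the full pipeline described in the thesis adds temporal linking and refinement on top.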
Foreword and Acknowledgements
Overview and Contributions
Part I - Introduction
1 Fluorescence Microscopy
2 Introduction to Visual Processing
3 A Short Introduction to Cross Reality
4 Eye Tracking and Gaze-based Interaction
Part II - VR and AR for Systems Biology
5 scenery — VR/AR for Systems Biology
6 Rendering
7 Input Handling and Integration of External Hardware
8 Distributed Rendering
9 Miscellaneous Subsystems
10 Future Development Directions
Part III - Case Studies
11 Bionic Tracking: Using Eye Tracking for Cell Tracking
12 Towards Interactive Virtual Reality Laser Ablation
13 Rendering the Adaptive Particle Representation
14 sciview — Integrating scenery into ImageJ2 & Fiji
Part IV - Conclusion
15 Conclusions and Outlook
Backmatter & Appendices
A Questionnaire for VR Ablation User Study
B Full Correlations in VR Ablation Questionnaire
C Questionnaire for Bionic Tracking User Study
List of Tables
List of Figures
Bibliography
Selbstständigkeitserklärung (Declaration of Authorship)

Identifier: oai:union.ndltd.org:DRESDEN/oai:qucosa:de:qucosa:72900
Date: 27 November 2020
Creators: Günther, Ulrik
Contributors: Sbalzarini, Ivo F., Sharpe, James, Technische Universität Dresden
Source Sets: Hochschulschriftenserver (HSSS) der SLUB Dresden
Language: English
Detected Language: English
Type: info:eu-repo/semantics/publishedVersion, doc-type:doctoralThesis, info:eu-repo/semantics/doctoralThesis, doc-type:Text
Rights: info:eu-repo/semantics/openAccess
