
Interactive Fusion and Tracking For Multi‐Modal Spatial Data Visualization

Scientific data acquired through sensors that monitor natural phenomena, as well as simulation data that imitate time‐identified events, have fueled the need for interactive techniques to successfully analyze and understand trends and patterns across space and time. We present a novel interactive visualization technique that fuses ground truth measurements with simulation results in real time to support the continuous tracking and analysis of spatiotemporal patterns. We start by constructing a reference model that densely represents the expected temporal behavior, and then use GPU parallelism to advect measurements on the model and track their location at any given point in time. Our results show that users can interactively fill the spatiotemporal gaps in real‐world observations and generate animations that accurately describe physical phenomena.
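The core tracking step described above can be illustrated with a small sketch. This is a hypothetical CPU stand‐in, not the paper's implementation: the dense reference model is replaced here by an analytic velocity field, and `velocity`, `advect`, and the explicit Euler stepping are all assumptions made for illustration. The key property it demonstrates is that each measurement's trajectory is independent, which is what makes the advection embarrassingly parallel on a GPU.

```python
import numpy as np

def velocity(pos, t):
    """Stand-in reference model: a rotational flow about the origin.

    In the actual technique this would be sampled from the dense
    simulation-derived model rather than computed analytically.
    """
    x, y = pos[:, 0], pos[:, 1]
    return np.stack([-y, x], axis=1)

def advect(measurements, t0, t1, dt=0.01):
    """Advect measurement positions from time t0 to t1 with Euler steps.

    Each point is updated independently of the others, so on a GPU
    this loop body maps naturally onto one thread per measurement.
    """
    pos = np.asarray(measurements, dtype=float).copy()
    t = t0
    while t < t1:
        step = min(dt, t1 - t)         # shorten the final step to land on t1
        pos += step * velocity(pos, t)  # move each point along the field
        t += step
    return pos

# Two sensor measurements taken at t = 0, tracked to t = pi
# (half a rotation about the origin in this stand-in field).
points = np.array([[1.0, 0.0], [0.0, 2.0]])
moved = advect(points, t0=0.0, t1=np.pi)
```

With the rotational field, a point starting at (1, 0) ends near (-1, 0) after t = π, up to the small radial drift introduced by explicit Euler integration; a semi‐Lagrangian or higher‐order integrator would reduce that drift.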

Identifier oai:union.ndltd.org:BRADFORD/oai:bradscholars.brad.ac.uk:10454/17884
Date 06 1900
Creators Elshehaly, Mai, Gračanin, D., Gad, M., Elmongui, H.G., Matković, K.
Source Sets Bradford Scholars
Language English
Detected Language English
Type Article, Accepted manuscript
Rights © 2015 Wiley. This is the peer-reviewed version of the following article: Elshehaly M, Gračanin D, Gad M et al (2015) Interactive Fusion and Tracking For Multi‐Modal Spatial Data Visualization. Computer Graphics Forum. 34(3): 251-260, which has been published in final form at https://doi.org/10.1111/cgf.12637. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving.
