Contextual interaction for geospatial visual analytics on mobile devices
28 January 2009
Proceedings Volume 7256, Multimedia on Mobile Devices 2009; 72560H (2009)
Event: IS&T/SPIE Electronic Imaging, 2009, San Jose, California, United States
The limited display area of mobile devices creates unique challenges for information presentation and user exploration of data. Traditional scrolling, panning, and zooming interfaces place a significant cognitive burden on the user, who must reassimilate the new context after each interaction. To overcome these limitations, we examine the use of "focus + context" techniques, specifically for performing visual analytic tasks with geospatial data on mobile devices. In particular, we adapt the translucency-based "focus + context" technique called the "blending lens" to mobile devices. The adaptation enhances the lens with features that change dynamically based on the user's navigation intentions during mobile interaction. We extend this method's concept of "spatial context" to include relevant semantic content that aids spatial navigation and analytical tasks such as finding related data. With these adaptations, the lens can be used to view spatially clustered results of a search query, related data based on various proximity functions (such as distance, category, and time), and other correlative information for immediate in-field analysis, all without losing the current geospatial context.
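The core of a translucency-based "blending lens" is alpha compositing: inside the lens region, a focus layer (e.g., highlighted query results) is blended over the context layer (the base map) so both remain visible. The sketch below illustrates that idea on grayscale images represented as lists of pixel rows; the function name, the linear alpha fall-off toward the lens rim, and all parameters are assumptions for illustration, not details taken from the paper.

```python
# Hypothetical sketch of a translucency-based "focus + context" lens:
# inside a circular lens, the focus layer is alpha-blended over the
# context layer, with the blend strongest at the lens centre and
# fading to fully transparent at the rim (a linear fall-off is assumed).

def blend_lens(context, focus, cx, cy, radius, max_alpha=0.8):
    """Composite `focus` over `context` inside the lens.

    Both images are lists of rows of 0-255 grayscale ints with the
    same dimensions; a new image is returned, the inputs are untouched.
    """
    h, w = len(context), len(context[0])
    out = [row[:] for row in context]  # copy: pixels outside the lens keep context
    for y in range(h):
        for x in range(w):
            d2 = (x - cx) ** 2 + (y - cy) ** 2
            if d2 <= radius ** 2:
                # Linear fall-off: alpha = max_alpha at the centre, 0 at the rim.
                alpha = max_alpha * (1 - (d2 ** 0.5) / radius)
                out[y][x] = round((1 - alpha) * context[y][x]
                                  + alpha * focus[y][x])
    return out

# Tiny demo: a mid-grey 4x4 "map" with a bright focus layer and a
# lens centred at (1, 1) of radius 2.
context = [[128] * 4 for _ in range(4)]
focus = [[255] * 4 for _ in range(4)]
result = blend_lens(context, focus, cx=1, cy=1, radius=2)
```

In a real implementation the focus layer would itself be generated from a proximity function (distance, category, or time) applied to the data, and the lens centre and radius would track the user's touch input.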
© (2009) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
Avin Pattath, David S. Ebert, William Pike, and Richard A. May "Contextual interaction for geospatial visual analytics on mobile devices", Proc. SPIE 7256, Multimedia on Mobile Devices 2009, 72560H (28 January 2009);
