PARVIS (Personal Augmented Reality for Information Visualization on Large Interactive Displays) combines large interactive displays and personal head-mounted Augmented Reality (AR) for information visualization to support data exploration and analysis.
In my Master’s thesis “Exploring Selection Management and Brushing and Linking for Mobile Cross-Device Interaction”, I developed a concept and prototype for combining multiple mobile devices to explore information visualizations. I focused on how to create, manage, and interact with data selections in mobile multi-device environments.
The goal of this project was to develop a set of multi-touch interaction techniques for star plot visualizations and, ideally, to incorporate additional data into a single star plot.
The goal of my Bachelor’s thesis was to enable more varied communication with humanoid robots. To that end, I developed a method for dynamically expressing emotions and intentions through non-verbal gestures.
Using mobile Augmented Reality, this transparent and foldable device supports travellers and tourists in their everyday tasks. It helps them learn about a place on the go, become familiar with it, and even get to know the language.