Browsing by Author "Sharlin, E."
Now showing 1 - 6 of 6
Designing the Car iWindow: Exploring Interaction through Vehicle Side Windows (ACM, 2013)
Li, J.; Sharlin, E.; Greenberg, S.; Rounding, M.

Interactive vehicle windows can enrich the commuting experience by being informative and engaging, strengthening the connection between passengers and the outside world. We propose a preliminary interaction paradigm that allows a rich, non-distracting interaction experience on vehicle side windows. Following this paradigm, we present a prototype, the Car iWindow, and discuss our preliminary design critique of the interaction, based on installing the iWindow in a car and interacting with it while commuting around our campus.

Interacting with Microseismic Visualizations (ACM, 2013)
Mostafa, A.; Greenberg, S.; Brazil, E.; Sharlin, E.; Sousa, M.

Microseismic visualization systems present complex 3D data of small seismic events within oil reservoirs, allowing experts to explore and interact with that data. Yet existing systems suffer several problems: 3D spatial navigation and orientation are difficult, and selecting 3D data is challenging due to occlusion and a lack of depth perception. Our work mitigates these problems by applying both proxemic interactions and a spatial input device to simplify how experts navigate through the visualization, and a painting metaphor to simplify how they select that information.

Interacting with Microseismic Visualizations: The Video (2013)
Mostafa, A.; Greenberg, S.; Brazil, E.; Sharlin, E.; Sousa, M.

Microseismic visualization systems present complex 3D data of small seismic events within oil reservoirs, allowing experts to explore and interact with that data. Yet existing systems suffer several problems: 3D spatial navigation and orientation are difficult, and selecting 3D data is challenging due to occlusion and a lack of depth perception. Our work mitigates these problems by applying both proxemic interactions and a spatial input device to simplify how experts navigate through the visualization, and a painting metaphor to simplify how they select that information.

Interactive Two-Sided Transparent Displays: Designing for Collaboration (ACM, 2014)
Li, J.; Greenberg, S.; Sharlin, E.; Jorge, J.

Transparent displays can serve as an important collaborative medium supporting face-to-face interaction over a shared visual work surface. Such displays enhance workspace awareness: when a person is working on one side of a transparent display, the person on the other side can see the other's body, hand gestures, gaze, and what he or she is actually manipulating on the shared screen. Even so, we argue that the design of such transparent displays must go beyond current offerings if it is to support collaboration. First, both sides of the display must accept interactive input, preferably by at least touch and/or pen, as that lets either person directly manipulate workspace items. Second, and more controversially, both sides of the display must be able to present different content, albeit selectively. Third (and related to the second point), because screen contents and lighting can partially obscure what can be seen through the surface, the display should visually enhance the actions of the person on the other side to better support workspace awareness. We describe our prototype FACINGBOARD-2 system, concentrating on how its design supports these three collaborative requirements.

PhoneEar: Interactions for Mobile Devices that Hear High-Frequency Sound-Encoded Data (ACM, 2015)
Nittala, A. S.; Yang, X. D.; Bateman, S.; Sharlin, E.; Greenberg, S.

We present PhoneEar, a new approach that enables mobile devices to understand the broadcast audio and sounds we hear every day using existing infrastructure. PhoneEar audio streams are embedded with sound-encoded data using nearly inaudible high frequencies. Mobile devices then listen for messages in the sounds around us, taking actions to ensure we don't miss any important information. In this paper, we detail our implementation of PhoneEar, describe a study demonstrating that mobile devices can effectively receive sound-based data, and report the results of a user study showing that embedding data in sounds is not detrimental to sound quality. We also illustrate the space of new interactions through four PhoneEar-enabled applications. Finally, we discuss the challenges of deploying apps that can hear and react to data in the sounds around us.

A Shape-Shifting Wall Display that Supports Individual and Group Activities (University of Calgary, 2015)
Takashima, K.; Greenberg, S.; Sharlin, E.; Kitamura, Y.
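The PhoneEar entry above describes embedding data in audio using nearly inaudible high frequencies. As an illustrative aside, the general idea can be sketched as simple frequency-shift keying near the top of the audible range. This is a minimal sketch, not the authors' actual implementation: the carrier frequencies, bit rate, and lack of framing/error correction here are all assumptions for demonstration only.

```python
import numpy as np

SR = 44100          # sample rate (Hz)
BIT_DUR = 0.05      # seconds per bit (assumed; real systems tune this)
F0, F1 = 18500.0, 19500.0  # near-inaudible carriers for 0/1 (assumed values)


def encode(data: bytes) -> np.ndarray:
    """Encode bytes as a sequence of high-frequency tones (FSK), LSB first."""
    bits = [(byte >> i) & 1 for byte in data for i in range(8)]
    t = np.arange(int(SR * BIT_DUR)) / SR
    tones = [np.sin(2 * np.pi * (F1 if b else F0) * t) for b in bits]
    return np.concatenate(tones)


def decode(signal: np.ndarray) -> bytes:
    """Recover bytes by comparing spectral energy at the two carriers."""
    n = int(SR * BIT_DUR)
    freqs = np.fft.rfftfreq(n, 1 / SR)
    i0 = np.argmin(np.abs(freqs - F0))  # FFT bin nearest carrier F0
    i1 = np.argmin(np.abs(freqs - F1))  # FFT bin nearest carrier F1
    bits = []
    for start in range(0, len(signal) - n + 1, n):
        spectrum = np.abs(np.fft.rfft(signal[start:start + n]))
        bits.append(1 if spectrum[i1] > spectrum[i0] else 0)
    out = bytearray()
    for i in range(0, len(bits) - 7, 8):
        out.append(sum(bits[i + j] << j for j in range(8)))
    return bytes(out)


message = b"hi"
recovered = decode(encode(message))
```

A real deployment would additionally need framing, synchronization, and error correction to survive noisy playback over actual speakers and microphones.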