Browsing by Author "Marquardt, N."
Now showing 1 - 12 of 12
1. The Continuous Interaction Space: Interaction Techniques Unifying Touch and Gesture On and Above a Digital Surface (Springer, 2013)
   Marquardt, N.; Jota, R.; Greenberg, S.; Jorge, J. [Metadata only]
   The rising popularity of digital table surfaces has spawned considerable interest in new interaction techniques. Most interactions fall into one of two modalities: (1) direct touch and multi-touch (by hand and by tangibles) directly on the surface, and (2) hand gestures above the surface. The limitation is that these two modalities ignore the rich interaction space between them. To move beyond this limitation, we first contribute a unification of these discrete interaction modalities, called the continuous interaction space. Many interaction techniques can be developed that go beyond these two modalities by leveraging the space between them. That is, we believe the underlying system should treat the space on and above the surface as a continuum, where a person can use touch, gestures, and tangibles anywhere in the space and naturally move between them. Our second contribution illustrates this: we introduce a variety of interaction categories that exploit the space between these modalities. For example, with our Extended Continuous Gestures category, a person can start an interaction with a direct touch and drag, then naturally lift off the surface and continue the drag with a hand gesture over the surface. For each interaction category, we implement an example (or draw on prior work) that illustrates how that technique can be applied. In summary, our primary contribution is to broaden the design space of interaction techniques for digital surfaces: we populate the continuous interaction space with both concepts and examples that emerge from considering this space as a continuum.

2. Cross-Device Interaction via Micro-mobility and F-formations (ACM, 2012)
   Marquardt, N.; Hinckley, K.; Greenberg, S. [Metadata only]
   GroupTogether is a system that explores cross-device interaction using two sociological constructs. First, F-formations concern the distance and relative body orientation among multiple users, which indicate when and how people position themselves as a group. Second, micro-mobility describes how people orient and tilt devices towards one another to promote fine-grained sharing during co-present collaboration. We sense these constructs using: (a) a pair of overhead Kinect depth cameras to sense small groups of people, (b) low-power 8 GHz band radio modules to establish the identity, presence, and coarse-grained relative locations of devices, and (c) accelerometers to detect tilting of slate devices. The resulting system supports fluid, minimally disruptive techniques for co-located collaboration by leveraging the proxemics of people as well as the proxemics of devices.

3. The Dark Patterns of Proxemic Sensing (IEEE, 2014)
   Boring, S.; Greenberg, S.; Vermeulen, J.; Dostal, J.; Marquardt, N. [Metadata only]
   To be accepted and trusted by the public, proxemic sensing systems must respect people's conception of physical space, make it easy to opt in or out, and benefit users as well as advertisers and other vendors.

4. Designing User-, Hand-, and Handpart-Aware Tabletop Interactions with the TOUCHID Toolkit (ACM, 2011)
   Marquardt, N.; Diaz-Marino, R.; Boring, S.; Greenberg, S. [Metadata only]
   Recent work in multi-touch tabletop interaction has introduced many novel techniques that let people manipulate digital content through touch. Yet most detect only touch blobs. This ignores the richer interactions that would be possible if we could identify (1) which part of the hand, (2) which side of the hand, and (3) which person is actually touching the surface. Fiduciary-tagged gloves were previously introduced as a simple but reliable technique for providing this information. The problem is that their low-level programming model hinders developers from rapidly exploring new kinds of user- and handpart-aware interactions. We contribute the TouchID toolkit to solve this problem. It allows rapid prototyping of expressive multi-touch interactions that exploit the aforementioned characteristics of touch input. TouchID provides an easy-to-use event-driven API as well as higher-level tools that facilitate development: a glove configurator to rapidly associate particular glove parts with handparts, and a posture configurator and gesture configurator for registering new hand postures and gestures for the toolkit to recognize. We illustrate TouchID's expressiveness by showing how we developed a suite of techniques that exploit knowledge of which handpart is touching the surface.

5. The Fat Thumb: Using the Thumb's Contact Size for Single-Handed Mobile Interaction (ACM, 2012)
   Boring, S.; Ledo, D.; Chen, X.; Marquardt, N.; Tang, A.; Greenberg, S. [Metadata only]
   Modern mobile devices allow a rich set of multi-finger interactions that combine modes into a single fluid act, for example, one finger for panning blending into a two-finger pinch gesture for zooming. Such gestures require the use of both hands: one holding the device while the other interacts with it. While on the go, however, only one hand may be available to both hold the device and interact with it. This mostly limits interaction to a single touch (i.e., the thumb), forcing users to switch between input modes explicitly. In this paper, we contribute the Fat Thumb interaction technique, which uses the thumb's contact size as a form of simulated pressure. This adds a degree of freedom that can be used, for example, to integrate panning and zooming into a single interaction. Contact size determines the mode (i.e., panning with a small size, zooming with a large one), while thumb movement performs the selected mode. We discuss nuances of the Fat Thumb based on the thumb's limited operational range and motor skills when the same hand holds the device. We compared the Fat Thumb to three alternative techniques, where people had to precisely pan and zoom to a predefined region on a map, and found that the Fat Thumb technique compared well to existing techniques.

6. From Focus to Context and Back: Combining Mobile Projectors and Stationary Displays (University of Calgary, 2012)
   Weigel, M.; Tang, A.; Boring, S.; Marquardt, N.; Greenberg, S. [Metadata only]

7. Gradual Engagement between Digital Devices as a Function of Proximity: From Awareness to Progressive Reveal to Information Transfer (ACM, 2012)
   Marquardt, N.; Ballendat, T.; Boring, S.; Greenberg, S.; Hinckley, K. [Metadata only]
   The increasing number of digital devices in our environment enriches how we interact with digital content. Yet cross-device information transfer, which should be a common operation, is surprisingly difficult. One has to know which devices can communicate, what information they contain, and how information can be exchanged. To mitigate this problem, we formulate the gradual engagement design pattern, which generalizes prior work in proxemic interactions and informs future system designs. The pattern describes how we can design device interfaces to gradually engage the user by disclosing connectivity and information exchange capabilities as a function of inter-device proximity. These capabilities flow across three stages: (1) awareness of device presence/connectivity, (2) reveal of exchangeable content, and (3) interaction methods for transferring content between devices, tuned to particular distances and device capabilities. We illustrate how this pattern can be applied to design, and show how existing and novel interaction techniques for cross-device transfers can be integrated to flow across its various stages. We explore how techniques differ between personal and semi-public devices, and how the pattern supports interaction among multiple users.

8. The HapticTouch Toolkit: Enabling Exploration of Haptic Interactions (ACM, 2012)
   Ledo, D.; Nacenta, M.; Marquardt, N.; Boring, S.; Greenberg, S. [Metadata only]
   In the real world, touch-based interaction relies on haptic feedback (e.g., grasping objects, feeling textures). Unfortunately, such feedback is absent in current tabletop systems. The previously developed Haptic Tabletop Puck (HTP) aims at supporting experimentation with and development of inexpensive tabletop haptic interfaces in a do-it-yourself fashion. The problem is that programming the HTP (and haptics in general) is difficult. To address this problem, we contribute the HapticTouch toolkit, which enables developers to rapidly prototype haptic tabletop applications. Our toolkit is structured in three layers that enable programmers to: (1) directly control the device, (2) create customized, combinable haptic behaviors (e.g., softness, oscillation), and (3) use visuals (e.g., shapes, images, buttons) to quickly make use of these behaviors. In our preliminary exploration we found that programmers could use our toolkit to create haptic tabletop applications in a short amount of time.

9. Informing the Design of Proxemic Interactions (IEEE, 2012)
   Marquardt, N.; Greenberg, S. [Metadata only]
   Proxemic interactions can help address six key challenges of ubicomp interaction design; devices can sense or capture proxemic information via five dimensions: distance, orientation, movement, identity, and location.

10. Informing the Design of Proxemic Interactions (2011)
    Marquardt, N.; Greenberg, S. [Metadata only]

11. ProjectorKit: Easing Rapid Prototyping of Interactive Applications for Mobile Projectors (ACM, 2013)
    Weigel, M.; Boring, S.; Steimle, J.; Marquardt, N.; Greenberg, S.; Tang, A. [Metadata only]
    Researchers have developed interaction concepts based on mobile projectors. Yet pursuing work in this area, particularly building projector-based interaction techniques within an application, is cumbersome and time-consuming. To mitigate this problem, we contribute ProjectorKit, a flexible open-source toolkit that eases rapid prototyping of mobile projector interaction techniques.

12. Proxemic Interaction: The Video (2010)
    Ballendat, T.; Marquardt, N.; Greenberg, S. [Metadata only]