Browsing by Author "Marquardt, Nicolai"
Now showing 1 - 20 of 34
Item Metadata only
Application Programming Interface (API) for the Haptic Tabletop Puck (5th Annual Students’ Union Undergraduate Research Symposium, 2010)
Ledo, David; Marquardt, Nicolai; Nacenta, Miguel A.; Greenberg, Saul

Item Open Access
The Continuous Interaction Space: Integrating Gestures Above a Surface with Direct Touch (2009-04-29)
Marquardt, Nicolai; Jota, Ricardo; Greenberg, Saul; Jorge, Joaquim A.

The advent of touch-sensitive and camera-based digital surfaces has spawned considerable development in two types of hand-based interaction techniques. In particular, people can interact: 1) directly on the surface via direct touch, or 2) above the surface via hand motions. While both types have value on their own, we believe much more potent interactions are achievable by unifying interaction techniques across this space. That is, the underlying system should treat this space as a continuum, where a person can naturally move from gestures over the surface to touches directly on it and back again. We illustrate by example, where we unify actions such as selecting, grabbing, moving, reaching, and lifting across this continuum of space.

Item Open Access
The Continuous Interaction Space: Interaction Techniques Unifying Touch and Gesture On and Above a Digital Surface (2011-01-28)
Jota, Ricardo; Marquardt, Nicolai; Greenberg, Saul; Jorge, Joaquim

The rising popularity of digital table surfaces has spawned considerable interest in new interaction techniques. Most interactions fall into one of two modalities: 1) direct touch and multi-touch (by hand and by tangibles) directly on the surface, and 2) hand gestures above the surface. The limitation is that these two modalities ignore the rich interaction space between them. To move beyond this limitation, we first contribute a unification of these discrete interaction modalities called the continuous interaction space.
The idea is that many interaction techniques can be developed that go beyond these two modalities by leveraging the space between them. That is, we believe that the underlying system should treat the space on and above the surface as a continuum, where a person can use touch, gestures, and tangibles anywhere in the space and naturally move between them. Our second contribution illustrates this, where we introduce a variety of interaction categories that exploit the space between these modalities. For example, with our Extended Continuous Gestures category, a person can start an interaction with a direct touch and drag, and then naturally lift off the surface and continue their drag with a hand gesture over the surface. For each interaction category, we implement an example (or use prior work) that illustrates how that technique can be applied. In summary, our primary contribution is to broaden the design space of interaction techniques for digital surfaces, where we populate the continuous interaction space both with concepts and examples that emerge from considering this space as a continuum.

Item Open Access
Designing User-, Hand-, and Handpart-Aware Tabletop Interactions with the TOUCHID Toolkit (2011-07-12)
Marquardt, Nicolai; Kiemer, Johannes; Ledo, David; Boring, Sebastian; Greenberg, Saul

Recent work in multi-touch tabletop interaction introduced many novel techniques that let people manipulate digital content through touch. Yet most only detect touch blobs. This ignores richer interactions that would be possible if we could identify (1) which hand, (2) which part of the hand, (3) which side of the hand, and (4) which person is actually touching the surface. Fiduciary-tagged gloves were previously introduced as a simple but reliable technique for providing this information. The problem is that their low-level programming model hinders the way developers could rapidly explore new kinds of user- and handpart-aware interactions.
We contribute the TOUCHID toolkit to solve this problem. It allows rapid prototyping of expressive multi-touch interactions that exploit the aforementioned characteristics of touch input. TOUCHID provides an easy-to-use event-driven API. It also provides higher-level tools that facilitate development: a glove configurator to rapidly associate particular glove parts to handparts, and a posture configurator and gesture configurator for registering new hand postures and gestures for the toolkit to recognize. We illustrate TOUCHID’s expressiveness by showing how we developed a suite of techniques (which we consider a secondary contribution) that exploits knowledge of which handpart is touching the surface.

Item Open Access
Evaluation Strategies for HCI Toolkit Research (2017-09-27)
Ledo, David; Houben, Steven; Vermeulen, Jo; Marquardt, Nicolai; Oehlberg, Lora; Greenberg, Saul

Toolkit research plays an important role in the field of HCI, as it can heavily influence both the design and implementation of interactive systems. For publication, the HCI community typically expects that research to include an evaluation component. The problem is that toolkit evaluation is challenging, as it is often unclear what ‘evaluating’ a toolkit means and what methods are appropriate. To address this problem, we analyzed 68 published toolkit papers. From that analysis, we provide an overview of, reflection on, and discussion of evaluation methods for toolkit contributions. We identify and discuss the value of four toolkit evaluation strategies, including the associated techniques each employs.
We offer a categorization of evaluation strategies for toolkit researchers, along with a discussion of the value, potential biases, and trade-offs associated with each strategy.

Item Open Access
The Fat Thumb: Using the Thumb's Contact Size for Single-Handed Mobile Interaction (2011-12-02)
Boring, Sebastian; Ledo, David; Chen, Xiang (Anthony); Marquardt, Nicolai; Tang, Anthony; Greenberg, Saul

Modern mobile devices allow a rich set of multi-finger interactions that combine modes into a single fluid act, for example, one finger for panning blending into a two-finger pinch gesture for zooming. Such gestures require the use of both hands: one holding the device while the other is interacting. While on the go, however, only one hand may be available to both hold the device and interact with it. This mostly limits interaction to a single-touch (i.e., the thumb), forcing users to switch between input modes explicitly. In this paper, we contribute the Fat Thumb interaction technique, which uses the thumb’s contact size as a form of simulated pressure. This adds a degree of freedom, which can be used, for example, to integrate panning and zooming into a single interaction. Contact size determines the mode (i.e., panning with a small size, zooming with a large one), while thumb movement performs the selected mode. We discuss nuances of the Fat Thumb based on the thumb’s limited operational range and motor skills when that hand holds the device. We compared Fat Thumb to three alternative techniques, where people had to pan and zoom to a predefined region on a map.
Participants performed fastest, with the fewest strokes, using Fat Thumb.

Item Open Access
From Focus to Context and Back: Combining Mobile Projectors and Stationary Displays (2012-10-12)
Weigel, Martin; Boring, Sebastian; Marquardt, Nicolai; Steimle, Jurgen; Greenberg, Saul; Tang, Anthony

Focus plus context displays combine high-resolution detail and lower-resolution overview using displays of different pixel densities. Historically, they employed two fixed-size displays of different resolutions, one embedded within the other. In this paper, we explore focus plus context displays using one or more mobile projectors in combination with a stationary display. The portability of mobile projectors as applied to focus plus context displays contributes in three ways. First, the projector’s projection on the stationary display can transition dynamically from being the focus of one’s interest (i.e. providing a high resolution view when close to the display) to providing context around it (i.e. providing a low resolution view beyond the display’s borders when further away from it). Second, users can dynamically reposition and resize a focal area that matches their interest rather than repositioning all content into a fixed high-resolution area. Third, multiple users can manipulate multiple foci or context areas without interfering with one another. A proof-of-concept implementation illustrates these contributions.

Item Open Access
Gradual Engagement between Digital Devices as a Function of Proximity: From Awareness to Progressive Reveal to Information Transfer (2012-04-20)
Marquardt, Nicolai; Ballendat, Till; Boring, Sebastian; Greenberg, Saul; Hinckley, Ken

Connecting to, and transferring information between, the increasing number of personal and shared digital devices in our environment – phones, tablets, and large surfaces – is tedious. One has to know which devices can communicate, what information they contain, and how information can be exchanged.
Inspired by Proxemic Interactions, we introduce novel interaction techniques that allow people to naturally connect devices and perform cross-device operations. Our techniques are based on the notion of gradual engagement between a person’s handheld device and the other devices surrounding them as a function of fine-grained measures of proximity. They all provide awareness of device presence and connectivity, progressive reveal of available digital content, and interaction methods for transferring digital content between devices from a distance and from close proximity. They also illustrate how gradual engagement may differ when the other device is personal (such as a handheld) vs. semi-public (such as a large display). We illustrate our techniques within two applications that enable gradual engagement leading up to information exchange between digital devices.

Item Open Access
The Haptic Tabletop Puck: Tactile Feedback for Interactive Tabletops (2009-08-27)
Marquardt, Nicolai; Nacenta, Miguel; Young, James; Carpendale, Sheelagh; Greenberg, Saul; Sharlin, Ehud

In everyday life, our interactions with objects on real tables include how our fingertips feel those objects. In comparison, current digital interactive tables present a uniform touch surface that feels the same, regardless of what it presents visually. In this paper, we explore how tactile interaction can be used with digital tabletop surfaces. We present a simple and inexpensive device – the Haptic Tabletop Puck – that incorporates dynamic, interactive haptics into tabletop interaction. We created several applications that explore tactile feedback in the areas of haptic information visualization, haptic graphical interfaces, and computer supported collaboration.
In particular, we focus on how a person may interact with the friction, height, texture and malleability of digital objects.

Item Metadata only
The Haptic Tabletop Puck: Tactile Feedback for Interactive Tabletops (ACM, 2009)
Marquardt, Nicolai; Nacenta, Miguel A.; Young, Jim; Carpendale, Sheelagh; Greenberg, Saul; Sharlin, Ehud

In everyday life, our interactions with objects on real tables include how our fingertips feel those objects. In comparison, current digital interactive tables present a uniform touch surface that feels the same, regardless of what it presents visually. In this paper, we explore how tactile interaction can be used with digital tabletop surfaces. We present a simple and inexpensive device – the Haptic Tabletop Puck – that incorporates dynamic, interactive haptics into tabletop interaction. We created several applications that explore tactile feedback in the areas of haptic information visualization, haptic graphical interfaces, and computer supported collaboration. In particular, we focus on how a person may interact with the friction, height, texture and malleability of digital objects.

Item Metadata only
The Haptic Tabletop Puck: Tactile Feedback for Interactive Tabletops (ACM, 2009)
Marquardt, Nicolai; Nacenta, Miguel A.; Young, Jim; Carpendale, Sheelagh; Greenberg, Saul; Sharlin, Ehud

Item Metadata only
The Haptic Tabletop Puck: The Video (ACM, 2009)
Marquardt, Nicolai; Nacenta, Miguel A.; Young, Jim; Carpendale, Sheelagh; Greenberg, Saul; Sharlin, Ehud

In everyday life, our interactions with objects on real tables include how our fingertips feel those objects. In comparison, current digital interactive tables present a uniform touch surface that feels the same, regardless of what it presents visually. In this video, we demonstrate how tactile interaction can be used with digital tabletop surfaces. We present a simple and inexpensive device – the Haptic Tabletop Puck – that incorporates dynamic, interactive haptics into tabletop interaction. We created several applications that explore tactile feedback in the areas of haptic information visualization, haptic graphical interfaces, and computer supported collaboration. In particular, we focus on how a person may interact with the friction, height, texture and malleability of digital objects.

Item Open Access
The HAPTIC TOUCH Toolkit: Enabling Exploration of Haptic Interactions (2011-09-26)
Ledo, David; Nacenta, Miguel A.; Marquardt, Nicolai; Boring, Sebastian; Greenberg, Saul

In the real world, touch-based interaction relies on haptic feedback (e.g., grasping objects, feeling textures). Unfortunately, such feedback is absent in current tabletop systems. The previously developed Haptic Tabletop Puck (HTP) aims at supporting experimentation with and development of inexpensive tabletop haptic interfaces. The problem is that programming the HTP is difficult due to the interactions between its multiple hardware components when coding.
To address this problem, we contribute the HAPTICTOUCH toolkit, which allows developers to rapidly prototype haptic tabletop applications. Our toolkit is structured in three layers that enable programmers to: (1) directly control the device, (2) create customized combinable haptic behaviors (e.g. softness, oscillation), and (3) use visuals (e.g., shapes, images, buttons) to quickly make use of the aforementioned behaviors. Our preliminary study found that programmers could use the HAPTICTOUCH toolkit to create haptic tabletop applications in a short amount of time.

Item Metadata only
Informing the Design of Proxemic Interactions (University of Calgary, 2011)
Marquardt, Nicolai; Greenberg, Saul

Proxemic Interactions envision interactive computer systems that exploit people’s and devices’ spatial relationships (proxemics) to provide more natural and seamless interactions with ubicomp technology. This vision builds upon fundamental proxemic theories about people’s understanding and use of the personal space around them. In this paper, we focus on how nuances of the proxemic theories and concepts of Proxemic Interaction can be applied to address six key challenges of ubicomp interaction design, where we consider how we can leverage information on fine-grained proxemic relationships. We also discuss how previous proxemic-aware systems addressed these challenges.

Item Open Access
Informing the Design of Proxemic Interactions (2011-07-13)
Marquardt, Nicolai; Greenberg, Saul

Proxemic Interactions envision interactive computer systems that exploit people’s and devices’ spatial relationships (proxemics) to provide more natural and seamless interactions with ubicomp technology. This vision builds upon fundamental proxemic theories about people’s understanding and use of the personal space around them.
In this paper, we focus on how nuances of the proxemic theories and concepts of Proxemic Interaction can be applied to address six key challenges of ubicomp interaction design, where we consider how we can leverage information on fine-grained proxemic relationships. We also discuss how previous proxemic-aware systems addressed these challenges.

Item Open Access
ProjectorKit: Easing the Development of Interactive Applications for Mobile Projectors (2013-02-19)
Weigel, Martin; Boring, Sebastian; Steimle, Jurgen; Marquardt, Nicolai; Greenberg, Saul; Tang, Anthony

Researchers have developed interaction concepts based on mobile projectors. Yet pursuing work in this area – particularly in applying projector-based techniques within an application – is cumbersome and time-consuming. To mitigate this problem, we generalize existing interaction techniques using mobile projectors. First, we identified five interaction primitives that serve as building blocks for a large set of applications. Second, these primitives were used to derive a set of principles that inform the design of a toolkit that eases and supports software development for mobile projectors. Finally, we implemented these principles in a toolkit, called ProjectorKit, which we contribute to the community as a flexible open-source platform.

Item Open Access
Proxemic Interaction: The Video (2010-07-07)
Ballendat, Till; Marquardt, Nicolai; Greenberg, Saul

This video illustrates how we used proxemic information to regulate interactions of people, digital devices, and non-digital artefacts with an interactive vertical display surface. It shows how proxemics can regulate implicit and explicit interaction techniques, as well as how proxemic interactions can be triggered by continuous movement, or by movement in and out of discrete proxemic regions.
Our example application is an interactive media player that implicitly reacts to the approach and orientation of people and their personal devices, and that tailors explicit interaction methods to fit.

Item Metadata only
Proxemic Interaction: Designing for a Proximity and Orientation-Aware Environment (2010)
Ballendat, Till; Marquardt, Nicolai; Greenberg, Saul

Item Metadata only
Proxemic Interactions in Ubiquitous Computing Ecologies (ACM, 2011)
Marquardt, Nicolai

An important challenge in ubiquitous computing (ubicomp) is to create techniques that allow people to seamlessly and naturally connect to and interact with the increasing number of digital devices. I propose to leverage the knowledge of people’s and devices’ spatial relationships – called proxemics – in ubicomp interaction design. I introduce my work on proxemic interactions, which consider fine-grained information about proxemics to mediate people’s interactions with digital devices, such as large digital surfaces or portable personal devices. This research includes the design of development tools for programmers creating proxemic-aware systems, and the design and evaluation of such interactive ubicomp systems.

Item Open Access
Proxemic Interactions in Ubiquitous Computing Ecologies (2013-08-07)
Marquardt, Nicolai; Greenberg, Saul

In this dissertation, I explore how the knowledge of people’s and devices’ spatial relationships – called proxemics – can be applied to the design of ubiquitous computing (ubicomp) interactions. Edward Hall’s proxemics theory describes how people use spatial relationships – such as varying their distance or orientation – to mediate their interactions with other people around them. But in spite of the opportunities presented by people’s natural understanding of proxemics, only a relatively small number of ubicomp installations incorporate proxemic information within interaction design.
Therefore, my goal in this dissertation research is to inform the design of future proxemic-aware devices that – similar to people’s natural expectations and use of proxemics – allow increasing connectivity and interaction possibilities when in proximity to people, other devices, or objects. Towards this goal, I explore how fine-grained knowledge of proxemic relationships between the entities in small-space ubicomp ecologies can be exploited in interaction design. In particular, I provide the following three major contributions:

First, I operationalize proxemics for ubicomp interaction with the Proxemic Interactions framework, which serves to guide the design of ubicomp applications. The framework describes how designers can consider fine-grained proxemic information to mediate people’s interactions with digital devices, such as large digital surfaces or portable personal devices. I identify five key dimensions of proxemic measures (distance, orientation, movement, identity, and location) to consider when designing proxemic-aware ubicomp systems. I also identify the gradual engagement design pattern as one particular strategy for designing system interactions that move from awareness, to reveal, to interaction.

Second, I design the Proximity Toolkit, which allows ubicomp developers to rapidly prototype proxemic-aware ubicomp systems. The toolkit simplifies the development process by supplying higher-level information about proxemic relationships between the entities in ubicomp ecologies through an event-driven API and visual inspection tools.

Third, I explore the design of three case studies of proxemic-aware systems that react continuously to people’s and devices’ proxemic relationships. The case studies explore the application of proxemics in small-space ubicomp ecologies by considering first person-to-device, then device-to-device, and finally person-to-person & device-to-device proxemic relationships.
Together, they validate the toolkit’s versatility and the application of the Proxemic Interactions framework.
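The gradual engagement design pattern that recurs in the abstracts above, where a system moves from awareness, to progressive reveal, to interaction as proximity increases, can be sketched in a few lines. The following Python example is purely illustrative: it is not the Proximity Toolkit (a separate system with its own event-driven API), and the stage names, distance thresholds, and class and function names are all assumptions made for this sketch.

```python
# Hypothetical sketch of the gradual engagement pattern: a display's
# engagement stage changes as a tracked person's distance shrinks.
# Stage thresholds (in metres) are illustrative assumptions.

from dataclasses import dataclass, field
from typing import Callable, List

AWARENESS_RANGE = 3.0    # device presence and connectivity shown
REVEAL_RANGE = 1.5       # available content progressively revealed
INTERACTION_RANGE = 0.5  # direct interaction and transfer enabled

def engagement_stage(distance_m: float) -> str:
    """Map a proxemic distance to a gradual-engagement stage."""
    if distance_m <= INTERACTION_RANGE:
        return "interaction"
    if distance_m <= REVEAL_RANGE:
        return "reveal"
    if distance_m <= AWARENESS_RANGE:
        return "awareness"
    return "idle"

@dataclass
class ProxemicDisplay:
    """Minimal event-driven model: callbacks fire only on stage changes."""
    stage: str = "idle"
    listeners: List[Callable[[str, str], None]] = field(default_factory=list)

    def on_stage_change(self, callback: Callable[[str, str], None]) -> None:
        self.listeners.append(callback)

    def update_distance(self, distance_m: float) -> None:
        new_stage = engagement_stage(distance_m)
        if new_stage != self.stage:
            old_stage, self.stage = self.stage, new_stage
            for callback in self.listeners:
                callback(old_stage, new_stage)

display = ProxemicDisplay()
log = []
display.on_stage_change(lambda old, new: log.append((old, new)))
for d in [4.0, 2.5, 1.0, 0.3]:  # a person approaching the display
    display.update_distance(d)
# log now records the idle -> awareness -> reveal -> interaction transitions
```

Mapping continuous distance to a small number of discrete stages, and firing callbacks only on stage transitions, mirrors the event-driven style the abstracts describe: applications react to meaningful changes in proxemic relationships rather than polling raw sensor coordinates.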