Browsing by Author "Young, James"
Now showing 1 - 11 of 11
Item Open Access
Extracting Emotion From Movement: Representing Interactions as Glyphs (2009-11-03T17:24:07Z)
Van Dale, Daniel; Young, James; Sharlin, Ehud
We present a preliminary exploration of information visualization techniques for extracting social and emotive aspects of movement. Our glyph-based technique visualizes particular characteristics of a motion path or interaction sequence between two characters. In this paper we detail our efforts to create glyphs that extract and expose underlying emotive and social aspects of the collocated physical interplay between a human and a robot.

Item Open Access
The Haptic Tabletop Puck: Tactile Feedback for Interactive Tabletops (2009-08-27T16:06:23Z)
Marquardt, Nicolai; Nacenta, Miguel; Young, James; Carpendale, Sheelagh; Greenberg, Saul; Sharlin, Ehud
In everyday life, our interactions with objects on real tables include how our fingertips feel those objects. In comparison, current digital interactive tables present a uniform touch surface that feels the same regardless of what it presents visually. In this paper, we explore how tactile interaction can be used with digital tabletop surfaces. We present a simple and inexpensive device – the Haptic Tabletop Puck – that incorporates dynamic, interactive haptics into tabletop interaction. We created several applications that explore tactile feedback in the areas of haptic information visualization, haptic graphical interfaces, and computer-supported collaboration. In particular, we focus on how a person may interact with the friction, height, texture, and malleability of digital objects.

Item Open Access
A Mixed Reality Approach to Human-Robot Interaction (2006-02-15)
Young, James; Sharlin, Ehud
This paper offers a mixed reality approach to human-robot interaction (HRI) which exploits the fact that robots are both digital and physical entities. We use mixed reality (MR) to integrate digital interaction into the physical environment, allowing users to interact with robots' ideas and thoughts directly within the shared physical interaction space. We also present a taxonomy which we use to organise and classify the various interaction techniques that this environment offers. We demonstrate this environment and taxonomy by detailing two interaction techniques, thought crumbs and bubblegrams, and, to evaluate these techniques, we offer the design of an implementation prototype.

Item Open Access
Puppet Master: Designing Reactive Character Behavior by Demonstration (2008-06-09T20:23:52Z)
Young, James; Igarashi, Takeo; Sharlin, Ehud
We present Puppet Master, a system that enables designers to rapidly create interactive and autonomous character behaviors (e.g., of a virtual character or a robot) that react to a main character controlled by an end user. The behavior is designed by demonstration, allowing non-technical artists to intuitively design the style, personality, and emotion of the character – traits which are very difficult to design using conventional programming approaches. During training, designers demonstrate paired behavior between the main and reacting characters. At run time, the end user controls the main character and the system synthesizes the motion of the reacting character using the given training data. The algorithm is an extension of image analogies [Hertzmann et al. 2001], modified to synthesize dynamic character behavior instead of an image. We introduce non-trivial extensions to the algorithm, such as our selection of features, dynamic balancing between similarity metrics, and separate treatment of path trajectory and high-frequency motion texture. We implemented a prototype system using physical pucks tracked by a motion-capture system and conducted a user study demonstrating that novice users can easily and successfully design character personality and emotion using our system, and that the resulting behaviors are meaningful and engaging.

Item Open Access
Snakey: A Tangible User Interface for Well Path Planning in the Context of Reservoir Engineering (2011-05-26T21:35:45Z)
Harris, John; Young, James; Sultanum, Nicole; Lapides, Paul; Sharlin, Ehud; Costa Sousa, Mario
We present Snakey, a tangible user interface (TUI) designed for the field of reservoir engineering. The Snakey interface focuses on intuitive manipulation of, and interaction with, the 3D curves common to underground well path planning. Our paper discusses design goals and prototyping solutions relating to the physical materials, sensing technology, input/output mapping, and multi-modal information feedback of the Snakey TUI. The paper also discusses a design critique of the latest prototype interface performed by domain experts (experienced reservoir engineers) and concludes by outlining our findings regarding the next steps required to improve the current Snakey interface prototype.

Item Open Access
Style by Demonstration: Using Broomsticks and Tangibles to Show Robots How to Follow People (2010-10-13T16:50:27Z)
Young, James; Ishii, Kentaro; Igarashi, Takeo; Sharlin, Ehud
The style in which a robot moves, including its gait or locomotion style, can project strong messages: for example, it is easy to distinguish a happy dog from an aggressive one simply by how it moves, and one can often tell that a colleague is stressed simply by the way they are walking. Defining the real-time, interactive, stylistic aspects of robotic movements via programming can be difficult and time consuming. Instead, we propose to enable people to use their existing teaching skills to directly demonstrate to robots the desired style of robot movements. In this paper we present an initial style-by-demonstration (SBD) proof of concept that focuses on teaching a robot specific, interactive locomotion styles. We present a novel broomstick-robot interface for directly demonstrating locomotion style to a robot, and a design critique by experienced programmers that compares the design of interactive, stylistic robotic locomotion using our SBD approach with traditional programming methods.

Item Open Access
Style-by-Demonstration: Using Broomsticks and Tangibles to Show Robots How to Follow People (2009-11-03T16:41:30Z)
Young, James; Ishii, Kentaro; Igarashi, Takeo; Sharlin, Ehud
Robots are poised to enter our everyday environments, such as our homes and offices. These contexts present unique human demands, including questions of the style and personality of the robot's actions. Style-oriented characteristics are difficult to define programmatically and, as such, are often out of reach of the designers involved in creating robotic technologies. This problem is particularly prominent for a robot's interactive behaviors – those that must react accordingly to dynamic environments and the actions of people. In this paper, we present the concept of programming robotic style by demonstration through the use of broomsticks and tangibles, such that non-technical designers can directly create the style of actions using their existing skill sets. We developed a working system as a proof of concept, and present two novel interfaces for directly demonstrating the style of motions to robots. Our current focus is on the style of a robot following a person, but we envision that simple physical interfaces like ours can be used by non-technical people to design the style of a wide range of robotic behaviors.

Item Open Access
Three Perspectives for Evaluating Human-Robot Interaction (2010-03-19T19:34:32Z)
Young, James; Sung, JaYoung; Voida, Amy; Sharlin, Ehud; Igarashi, Takeo; Christensen, Henrik; Grinter, Rebecca
The experience of interacting with a robot has been shown to be very different from people's experience interacting with other technologies and artifacts, and often has a strong social or emotional component – a fact that raises concerns related to evaluation. In this paper we outline how this difference is due in part to the general complexity of robots' overall context of interaction, related to their dynamic presence in the real world and their tendency to invoke a sense of agency. A growing body of work in Human-Robot Interaction (HRI) focuses on exploring this overall context and tries to unpack what exactly is unique about interaction with robots, often by leveraging evaluation methods and frameworks designed for more traditional HCI. We raise the concern that, due to these differences, HCI evaluation methods should be applied to HRI with care, and we present a survey of HCI evaluation techniques from the perspective of the unique challenges of robots. Further, we have developed a new set of tools to aid evaluators in targeting and unpacking the holistic human-robot interaction experience. Our technique centers on the development of a map of interaction-experience possibilities; as part of this, we present a set of three perspectives for targeting specific components of the interaction experience, and demonstrate how these tools can be used practically in evaluation.

Item Open Access
Toward Acceptable Domestic Robots: Lessons Learned from Social Psychology (2008-07-16T14:59:46Z)
Young, James; Hawkins, Richard; Sharlin, Ehud; Igarashi, Takeo
Social psychology offers a perspective on the acceptance and adoption of technology that is not often considered in technical circles. In this paper, we discuss several adoption-of-technology models with respect to the acceptance of domestic robots. We raise several key points that we feel will be pivotal to how domestic users respond to robots, and provide a set of heuristics that roboticists and designers of robotic interfaces can use to consider and analyze their designs. Ultimately, understanding both how users respond to robots and the reasons behind their responses will enable designers to create domestic robots that are accepted into homes.

Item Open Access
Using Touch and Toys for Multiple Robots Control: video report (2009-04-13T20:38:29Z)
Guo, Cheng; Young, James; Sharlin, Ehud
Interaction with a remote team of robots in real time is a difficult human-robot interaction (HRI) problem, exacerbated by the complications of unpredictable real-world environments, with solutions often resorting to a larger-than-desirable ratio of operators to robots. We present two innovative interfaces that allow a single operator to interact with a group of remote robots. Using a tabletop computer, the user can configure and manipulate groups of robots directly, either by using their fingers (touch) or by manipulating a set of physical toys (tangible user interfaces).

Item Open Access
What is Mixed Reality, Anyway? Considering the Boundaries of Mixed Reality in the Context of Robots (2010-04-08T16:21:38Z)
Young, James; Sharlin, Ehud; Igarashi, Takeo
Mixed reality, as an approach in human-computer interaction, is often implicitly tied to particular implementation techniques (e.g., see-through devices) and modalities (e.g., visual, graphical displays). In this paper we attempt to clarify the definition of mixed reality as a more abstract concept of combining the real and virtual worlds – that is, mixed reality is not a given technology but a concept that considers how the virtual and real worlds can be combined. Further, we use this discussion to posit robots as mixed-reality devices, and present a set of implications and questions about what this implies for mixed-reality interaction with robots.