Browsing by Author "Young, James E."
Now showing 1 - 8 of 8
Item (Open Access): Ergonomic Music: Where Does it Fit In? (2007-02-02)
van Gerven, Dustin; Young, James E.
In this paper we present the idea of ergonomic instruments: musical devices that minimise the learning curve for musicians by mapping inherent musical actions to the creation of music. We situate ergonomic instruments in the context of existing relevant work, and then discuss a new methodology for the development of musical instruments, including the exploration of a musical goal we call direct expression. Next, we offer a prototype of an ergonomic instrument called the Ergonomic Electronic Percussive Padboard (EEP Padboard) and include an exploratory evaluation with ideas for further development.

Item (Open Access): Exploring social interaction between robots and people (2010)
Young, James E.; Sharlin, Ehud

Item (Open Access): Inspector Baxter: The Social Aspects of Integrating a Robot as a Quality Inspector in an Assembly Line (2015-06-24)
Banh, Amy; Rea, Daniel J.; Young, James E.; Sharlin, Ehud
We are interested in the social implications of working alongside robots. In this paper we look at a humanoid robot quality inspector acting alongside workers in an assembly line. This setting is viable in small-scale assembly lines where human assembly workers provide flexible, rapid assembly. A robotic quality inspector could enhance the quality assurance process, but it also places the robot in a position of relative seniority to the assembly workers. We present the results of an initial in-lab pilot study designed with our industry collaborators. In our pilot, a humanoid robot visually inspected participants' assembled products in a shared workspace and provided critiques that followed simple models of robotic social feedback. Our findings suggest that people's opinions of the robot (trust, impression of intelligence, etc.) changed based on the robot's social behaviors while it judged their work. Additionally, people rated the robot more negatively if they disagreed with the robot's opinions of their work, regardless of the robot's social behavior and the value of its critique.

Item (Open Access): Robot Expressionism Through Cartooning (2006-10-31)
Young, James E.; Xin, Min; Sharlin, Ehud
We present a new technique for human-robot interaction called robot expressionism through cartooning. We suggest that robots utilise cartoon-art techniques such as simplified and exaggerated facial expressions, stylised text, and icons for intuitive social interaction with humans. We discuss practical mixed reality solutions that allow robots to augment themselves or their surroundings with cartoon art content. Our effort is part of what we call robot expressionism, a conceptual approach to the design and analysis of robotic interfaces that focuses on providing intuitive insight into a robot's state as well as artistic quality of interaction. Our paper discusses a variety of ways that allow robots to express cartoon art, and details a test bed design, implementation, and preliminary evaluation. We describe our test bed, Jeeves, which uses a Roomba (an iRobot vacuum-cleaning robot) and a mixed-reality system as a platform for rapid prototyping of cartoon-art interfaces.
Finally, we present a set of interaction content scenarios that use the Jeeves prototype: trash roomba, the recycle police, and clean tracks, as well as an initial user evaluation of our approach.

Item (Open Access): Shared Presence and Collaboration Using a Co-Located Humanoid Robot (2015-06-24)
Wentzel, Johann; Rea, Daniel J.; Young, James E.; Sharlin, Ehud
This work proposes the concept of shared presence, where we enable a user to "become" a co-located humanoid robot while still being able to use their real body to complete tasks. The user controls the robot and sees with its vision and sensors, while still maintaining awareness and use of their real body for tasks other than controlling the robot. This shared presence can be used to accomplish tasks that are difficult for one person alone: for example, a robot manipulating a circuit board for easier soldering by the user, lifting and manipulating heavy or unwieldy objects together, or generally having the robot conduct and complete secondary tasks while the user focuses on the primary tasks. If people are able to overcome the cognitive difficulty of maintaining presence for both themselves and a nearby remote entity, tasks that typically require two people could instead require one person assisted by a humanoid robot that they control. In this work, we explore some of the challenges of creating such a system, propose research questions for shared presence, and present our initial implementation that can enable shared presence. We believe shared presence opens up a new research direction that can be applied to many fields, including manufacturing, home-assistant robotics, and education.

Item (Open Access): Sharing Spaces with Robots: An Integrated Environment for Human-Robot Interaction (2006-02-15)
Young, James E.; Sharlin, Ehud
In this paper we offer an intelligent integrated environment for human-robot interaction. This environment takes advantage of the fact that robots are both digital and physical entities, thus improving human-robot interaction and communication. Using mixed reality, our approach brings digital information directly into the physical environment, allowing users to interact with robots' ideas and thoughts directly within the shared physical interaction space. We also present a taxonomy which we use to organise and classify the various interaction techniques that this environment offers. Using this taxonomy, we demonstrate the approach by detailing three interaction techniques: thought crumbs, decorations, and bubblegrams. To evaluate these techniques, we offer the design of a realisable prototype.

Item (Open Access): Touch and Toys - new techniques for interaction with a remote group of robots (2008-09-26)
Guo, Cheng; Young, James E.; Sharlin, Ehud
Interaction with a remote team of robots in real time is a difficult human-robot interaction (HRI) problem, exacerbated by the complications of unpredictable real-world environments, with solutions often resorting to a larger-than-desirable ratio of operators to robots. We present two innovative interfaces that allow a single operator to interact with a group of remote robots. Using a tabletop computer, the user can configure and manipulate groups of robots directly, either with their fingers (touch) or by manipulating a set of physical toys (tangible user interfaces).
We recruited participants for an extensive user study that required them to interact with a small group of remote robots in simple tasks, and we present our findings as a set of design considerations.

Item (Open Access): The Use of Haar-like Features in Bubblegrams: a Mixed Reality Human-Robot Interaction Technique (2006-02-15)
Young, James E.; Sharlin, Ehud; Boyd, Jeffrey E.
We present the application of a vision algorithm based on Haar-like features in Bubblegrams, a new mixed reality-based human-robot interaction (HRI) technique. Bubblegrams allows humans and robots working on collocated synchronous tasks to interact directly by visually augmenting their shared physical environment. Bubblegrams uses comics-like interactive graphic balloons, or bubbles, that appear above the robot's body and allow intuitive interaction with the robot. Users wear lightweight mixed reality goggles that integrate displays and a camera, allowing the user to view and interact with the physical environment as well as with the virtual Bubblegrams interface linked to the robot's body. In order to efficiently link Bubblegrams in real time to the physical robot, we implemented a vision algorithm based on Haar-like features, which is the main topic of this paper. This paper briefly details the design of the Bubblegrams interface, the hardware and software we use for the current prototype, and the full details of the vision algorithm.
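The full detector details are in the report itself; as a rough, hypothetical sketch of the general approach only, the snippet below uses OpenCV's Haar-cascade detector to locate a robot-mounted marker in each camera frame and compute an anchor point where a bubble overlay could be drawn. The cascade file name, camera index, and anchoring offsets are placeholder assumptions, not the authors' implementation.

# Hypothetical sketch: anchoring a bubble overlay above a robot detected
# with Haar-like features (OpenCV). The cascade file is a placeholder; a
# real system would train a cascade on images of the robot or a fiducial.
import cv2

cascade = cv2.CascadeClassifier("robot_marker_cascade.xml")  # assumed trained cascade
capture = cv2.VideoCapture(0)  # assumed camera index (e.g., goggles-mounted camera)

while capture.isOpened() and not cascade.empty():
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Haar-like features are evaluated over a sliding window at multiple scales.
    detections = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5,
                                          minSize=(40, 40))
    for (x, y, w, h) in detections:
        # Place the bubble anchor just above the detected robot body.
        anchor = (x + w // 2, max(0, y - 20))
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.circle(frame, anchor, 5, (255, 0, 0), -1)
    cv2.imshow("bubblegram anchor", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

capture.release()
cv2.destroyAllWindows()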