Browsing by Author "Cockburn, A."
Now showing 1 - 5 of 5
Item (Metadata only)
Air pointing: Design and evaluation of spatial target acquisition with and without visual feedback (Elsevier, 2011)
Cockburn, A.; Quinn, P.; Gutwin, C.; Ramos, G.; Looser, J.

Sensing technologies such as inertial tracking and computer vision enable spatial interactions where users make selections by ‘air pointing’: moving a limb, finger, or device to a specific spatial region. In addition to expanding the vocabulary of possible interactions, air pointing brings the potential benefit of enabling ‘eyes-free’ interaction, where users rely on proprioception and kinaesthesia rather than vision. This paper explores the design space for air pointing interactions and presents tangible results in the form of a framework that helps designers understand input dimensions and the resulting interaction qualities. The framework provides a set of fundamental concepts that aid in thinking about the air pointing domain, in characterizing and comparing existing solutions, and in evaluating novel techniques. We carry out an initial investigation to demonstrate the concepts of the framework by designing and comparing three air pointing techniques: one based on small angular ‘raycasting’ movements, one on large movements across a 2D plane, and one on movements in a 3D volume. Results show that large movements on the 2D plane are both rapid (selection times under 1 s) and accurate, even without visual feedback. Raycasting is rapid but inaccurate, and the 3D volume is expressive but slow, inaccurate, and effortful. Many other findings emerge, such as selection point ‘drift’ in the absence of feedback. These results and the organising framework provide a foundation for innovation in, and understanding of, air pointing interaction.

Item (Open Access)
THE DESIGN AND EVOLUTION OF TURBO TURTLE, A COLLABORATIVE MICROWORLD FOR EXPLORING NEWTONIAN PHYSICS (1995-03-01)
Cockburn, A.; Greenberg, S.

This paper describes the evolution and on-going development of TurboTurtle, a dynamic multi-user microworld for the exploration of Newtonian physics. With TurboTurtle, students can alter the attributes of the simulation environment, such as gravity, friction, and the presence or absence of walls. They can also manipulate the "turtle" (a movable ball) directly: students can adjust its position, velocity and mass; change its kinetic and potential energy; and apply a force to it by strapping a rocket to its back. Students explore the microworld by manipulating these parameters, and learn concepts by studying the behaviours and interactions that occur. TurboTurtle has gone through three major evolutions. It began as a rudimentary command-line extension to Logo, and became a dynamic simulation environment driven by a graphical user interface. Most recently, TurboTurtle has become "group-aware", where several students, each on their own computer, can simultaneously control the microworld and gesture around the shared display. In this final version, teachers can add structure to the group's activities by setting the simulation environment to an interesting state, which includes a set of problems and questions. The rationale behind the major design decisions at each step is presented. We also discuss the technical aspects of making TurboTurtle group-aware with a groupware toolkit called GroupKit.
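To make the microworld parameters above concrete, the following is a minimal sketch of the kind of per-frame Newtonian update such a simulation performs, with gravity, friction, optional walls, and a student-applied rocket force. It is not TurboTurtle's actual Logo/GroupKit implementation; friction is modelled here as simple viscous drag, and all names and defaults are illustrative placeholders.

    # Minimal, illustrative Newtonian update step (not TurboTurtle's actual code).
    # The "turtle" is a movable ball; students can toggle gravity, friction and
    # walls, and strap a rocket (a constant applied force) to its back.
    from dataclasses import dataclass

    @dataclass
    class Turtle:
        x: float
        y: float
        vx: float
        vy: float
        mass: float = 1.0

    def step(t, dt, gravity=9.8, friction=0.0, rocket=(0.0, 0.0), walls=None):
        """Advance the turtle by one time step using explicit Euler integration."""
        # Net force: the rocket's thrust, minus viscous friction, minus gravity.
        fx = rocket[0] - friction * t.vx
        fy = rocket[1] - friction * t.vy - gravity * t.mass
        # Acceleration -> velocity -> position.
        t.vx += (fx / t.mass) * dt
        t.vy += (fy / t.mass) * dt
        t.x += t.vx * dt
        t.y += t.vy * dt
        # If walls are present, bounce elastically off the boundaries.
        if walls is not None:
            xmin, ymin, xmax, ymax = walls
            if not xmin <= t.x <= xmax:
                t.vx = -t.vx
                t.x = min(max(t.x, xmin), xmax)
            if not ymin <= t.y <= ymax:
                t.vy = -t.vy
                t.y = min(max(t.y, ymin), ymax)

    # Example: a ball launched sideways under gravity, inside a bounded room.
    ball = Turtle(x=0.0, y=10.0, vx=2.0, vy=0.0)
    for _ in range(500):
        step(ball, dt=0.02, gravity=9.8, friction=0.1, walls=(0.0, 0.0, 100.0, 50.0))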
Item (Open Access)
FROM AWARENESS TO TEAMROOMS, GROUPWEB AND TURBOTURTLE: EIGHT SNAPSHOTS OF RECENT WORK IN THE GROUPLAB PROJECT (1995-12-01)
Greenberg, S.; Gutwin, C.; Cockburn, A.; Roseman, M.

This report contains eight short papers that serve as snapshots of recent work by members and collaborators of the GroupLab team. All papers are concerned with groupware, and all but one of the systems described were implemented using GroupKit, our groupware toolkit. The first five papers are a suite of articles that consider how awareness of others can be supported in groupware systems. The papers cover theoretical considerations of awareness (#2), practical efforts in building systems and widgets to support awareness (#1, 3, 4), and evaluation of widgets to determine their effectiveness and usability (#5).

Suite Overview: Supporting Awareness of Others in Groupware
1. Peepholes: Low Cost Awareness of One's Community
2. Workspace Awareness for Groupware
3. Workspace Awareness Support With Radar Views
4. A Fisheye Text Editor for Relaxed-WYSIWIS Groupware
5. A Usability Study of Workspace Awareness Widgets

The next three papers cover individual projects. TeamRooms (#6) is a groupware equivalent of a physical team room: group members can stock the room with applications, and can enter the room at any time to continue their work individually or collectively. GroupWeb (#7) is a group-aware World Wide Web browser: people can share their views of pages in real time, gesture around them with telepointers, and add group annotations to a page with a groupware editor. TurboTurtle (#8) is a microworld for Newtonian physics designed for children; children were observed using TurboTurtle, and their collaboration styles are analysed.

6. TeamRooms: Groupware for Shared Electronic Spaces
7. GroupWeb: A WWW Browser as Real Time Groupware
8. Children's Collaboration Styles in a Newtonian MicroWorld

Item (Metadata only)
Understanding performance in touch selections: Tap, drag and radial pointing drag with finger, stylus and mouse (Elsevier, 2012)
Cockburn, A.; Ahlstrom, D.; Gutwin, C.

Touch-based interaction with computing devices is becoming increasingly common. In order to design for this setting, it is critical to understand the basic human factors of touch interactions such as tapping and dragging; however, there is relatively little empirical research in this area, particularly for touch-based dragging. To provide foundational knowledge in this area, and to help designers understand the human factors of touch-based interactions, we conducted an experiment using three input devices (the finger, a stylus, and a mouse as a performance baseline) and three pointing activities: bidirectional tapping, one-dimensional dragging, and radial dragging (pointing to items arranged in a circle around the cursor). Tapping represents the elemental target selection method and is analysed as a performance baseline. Dragging is also a basic interaction method, and understanding its performance is important for touch-based interfaces because it involves relatively high contact friction. Radial dragging matters for touch-based systems as well: the technique is claimed to be well suited to direct input, yet radial selections involve a dragging action whose interaction mechanics have received little study.

Performance models of tapping, dragging, and radial dragging are analysed. For tapping tasks, we confirm prior results showing finger pointing to be faster than the stylus or mouse but inaccurate, particularly with small targets. In dragging tasks, we also confirm that finger input is slower than the mouse and stylus, probably due to the relatively high surface friction; dragging errors were low in all conditions. As expected, performance conformed to Fitts' Law. Our results for radial dragging are new, showing that errors, task time and movement distance are all linearly correlated with the number of items available. We demonstrate that this performance is modelled by the Steering Law (where the tunnel width increases with movement distance) rather than Fitts' Law. Other radial dragging results showed that the stylus is fastest, followed by the mouse and finger, but that the stylus has the highest error rate of the three devices. Finger selections in the north-west direction were particularly slow and error prone, possibly due to a tendency for the finger to stick–slip when dragging in that direction.
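For orientation, the Fitts' Law and Steering Law models referred to above have the standard forms sketched below. The constants a and b are fitted empirically; the paper's own fitted models are not reproduced here, and the wedge geometry in the radial case is only an assumed illustration of why selection time can scale with the number of items rather than with amplitude.

    % Fitts' law: movement time over amplitude A to a target of width W
    MT = a + b \log_2\left(\frac{A}{W} + 1\right)

    % Steering law: movement time through a tunnel whose width W(s) varies along path C
    MT = a + b \int_C \frac{ds}{W(s)}

    % Illustrative radial case: the wedge-shaped tunnel for one of n items has width
    % W(s) \approx 2\pi s / n at distance s from the centre, so for a movement from a
    % small start radius s_0 out to amplitude A,
    \int_{s_0}^{A} \frac{n}{2\pi s}\, ds \;=\; \frac{n}{2\pi}\,\ln\frac{A}{s_0}
    % which grows linearly with the number of items n.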
Item (Open Access)
USING DISTORTION-ORIENTED DISPLAYS TO SUPPORT WORKSPACE AWARENESS (1996-01-01)
Greenberg, S.; Cockburn, A.; Gutwin, C.

Desktop conferencing systems are now moving away from strict view-sharing and towards relaxed "what-you-see-is-what-I-see" (relaxed-WYSIWIS) interfaces, where distributed participants in a real-time session can view different parts of a shared visual workspace. As with strict view-sharing, people using relaxed-WYSIWIS require a sense of workspace awareness: up-to-the-minute knowledge about another person's interactions with the shared workspace. The problem is deciding how to provide a user with an appropriate level of awareness of what other participants are doing when they are working in different areas of the workspace. In this paper, we propose distortion-oriented displays as a novel way of providing this awareness. These displays, which employ magnification lenses and fisheye view techniques, show global context and local detail within a single window, providing both peripheral and detailed awareness of other participants' actions. Three prototypes are presented as examples of groupware distortion-oriented displays: the fisheye text viewer, the offset lens, and the head-up lens.
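As background for the fisheye techniques listed above, a common one-dimensional graphical fisheye transform (in the style of Sarkar and Brown) is sketched below. The report's prototypes use their own formulations, so the function name, parameters, and defaults here are purely illustrative.

    # Illustrative 1-D graphical fisheye transform (Sarkar & Brown style); not the
    # exact formulation used in the fisheye text viewer, offset lens or head-up lens.
    def fisheye(x: float, focus: float, edge: float, d: float = 3.0) -> float:
        """Map position x toward the focus, magnifying detail near the focus.

        x, focus -- positions along one display axis
        edge     -- the display boundary on x's side of the focus
        d        -- distortion factor (d = 0 leaves positions unchanged)
        """
        if edge == focus:
            return x
        # Normalised distance from the focus: 0 at the focus, 1 at the edge.
        t = (x - focus) / (edge - focus)
        # g(t) = (d + 1) * t / (d * t + 1) expands the region around the focus
        # and compresses the periphery, keeping everything on screen.
        g = (d + 1) * t / (d * t + 1)
        return focus + g * (edge - focus)

    # Example: laying out 100 text lines with the focus on line 40; lines near the
    # focus get more screen space, distant lines are compressed but remain visible.
    rows = [fisheye(i, focus=40.0, edge=(99.0 if i >= 40 else 0.0)) for i in range(100)]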