Browsing by Author "Takashima, Kazuki"
Item Open Access
Applications of Interactive Topographic Maps: Tangibility with Improved Spatial Awareness and Readability (2019-07-02)
Li, Hao; Sharlin, Ehud; Costa Sousa, Mario; Takashima, Kazuki; Chen, Zhangxing; Figueroa, Pablo; Willett, Wesley J.

Traditional flat topographic maps are difficult to understand because the three-dimensional (3D) spatial representation is distorted and compromised when it is flattened into lower-dimensional media (e.g., 2D paper). In the process, the x-y coordinates of a location can be preserved, but its physical elevation must be encoded using visualization techniques, demanding noticeable cognitive effort to comprehend the geometric and geographic properties of the original terrain. In this manuscript-based dissertation, I present a collection of my past publications that aim to increase the readability of topographic maps by restoring the original spatiality of the terrain, including its elevations, with a physical map representation and then superimposing additional data visualizations on top of it. In this way, the entire terrain topology is kept in a scaled physical representation, allowing users to view it with natural human perception. Additionally, user gestures can be tracked in real time as sketch-based input, enabling novel dynamic interaction with the map interface and manipulation of the spatial information. Through the chapters, I present the aforementioned concept, named the interactive topographic interface, along with several applications of it in different academic and industrial environments. I also report the design and results of a user study that compares the interface with traditional flat topographic maps.
In the long term, I hope the research presented in this dissertation inspires future interactive physical cartography that not only improves map comprehension but also fosters better spatial and situational awareness over the map interface, resulting in more useful maps.

Item Open Access
Designing NeuroSimVR: A Stereoscopic Virtual Reality Spine Surgery Simulator (2017-11-01)
Mostafa, Ahmed E.; Ryu, Won Hyung A.; Chan, Sonny; Takashima, Kazuki; Kopp, Gail; Costa Sousa, Mario; Sharlin, Ehud

This paper contributes NeuroSimVR, a stereoscopic virtual reality spine surgery simulator that allows novice surgeons to learn and practice the spinal pedicle screw insertion (PSI) procedure using simplified interaction capabilities and 3D haptic user interfaces. By collaborating with medical experts and following an iterative approach, we characterize the PSI task and derive requirements for applying this procedure in a 3D immersive interactive simulation system. We describe how these requirements were realized in our NeuroSimVR prototype and outline the educational benefits of our 3D interactive system for training the PSI procedure. We conclude the paper with the results of a preliminary evaluation of NeuroSimVR and reflect on our interface's benefits and limitations.

Item Open Access
Mediating Experiential Learning in Interactive Immersive Environments (2018-01-22)
Mostafa, Ahmed; Sharlin, Ehud; Costa Sousa, Mário; Chan, Sonny; Takashima, Kazuki; Boulanger, Pierre; El-Sheimy, Naser

Simulation and immersive environments are gaining popularity in various contexts. Arguably, such interactive systems have the potential to benefit many users in a variety of education and training scenarios. However, some of these systems, especially in the absence of skilled instructors, still face challenges of operational complexity, the incorporation of different technologies and features, and the limited availability of performance measures and feedback.
Therefore, the design of these systems would benefit from integrating experiential aspects and essential educational aids. For example, users of such learning systems, especially novices, can be better supported by a smoother learning curve, detailed guidance features, the availability of feedback and performance reporting, and the integration of engaging and reflective capabilities. In essence, we recognize a need to re-explore learning aids and how they affect design, usage, and the overall learning experience in interactive immersive environments. The goal of this dissertation is to mediate experiential learning in interactive immersive environments. This includes exploring existing and novel learning aids that facilitate learning with improved engagement and immersion, enrich learners with insightful reflections, better support novice users' learning and training needs, and ultimately enhance the overall experience. To achieve this goal, we utilized existing learning models and simulation-based training approaches and proposed a framework of learning aids to mediate learning in interactive immersive environments. Working closely with domain-expert collaborators, we designed, implemented, and evaluated four new interactive immersive prototypes to validate the practicality of our aids. The first prototype, NeuroSimVR, is a stereoscopic visualization augmented with educational aids to support how medical users learn a common back surgery procedure. The second prototype, ReflectiveSpineVR, is an immersive virtual reality surgical simulation with novel interaction-history capabilities that aim to strengthen users' memories and enable deliberate repetitive practice as needed. The third prototype, JackVR, is an interactive immersive training system that utilizes novel gamification elements and aims to support oil-and-gas experts in the process of landing oil rigs.
Our fourth prototype, RoboTeacher, employs a humanoid robot instructor to teach people industrial assembly tasks. Across these prototypes, we presented novel learning aids, visualizations, and interaction techniques that are new to many current immersive learning tools. We conclude this dissertation with lessons learned and guidelines for designing with learning aids in future research that targets interactive experiential environments.

Item Open Access
PLANWELL: Spatial Interface For Collaborative Petroleum-Well Planning: The Video (2016-01-19)
Shekhar Nittala, Aditya; Li, Nico; Cartwright, Stephen; Takashima, Kazuki; Sharlin, Ehud; Costa Sousa, Mario

We present our prototype of PlanWell, a spatial augmented reality interface that facilitates collaborative field operations. PlanWell allows a central overseer (in a command-and-control center) and a remote explorer (an outdoor user in the field) to explore and collaborate within a geographical area. PlanWell provides the overseer with a tangible user interface (TUI) based on a 3D printout of the surface geography, which acts as a physical representation of the region to be explored. Augmented reality is used to dynamically overlay properties of the region, as well as the presence and actions of the remote explorer, onto the 3D representation of the terrain. The overseer can perform actions directly on the TUI, and these actions are then presented as dynamic AR visualizations superimposed on the explorer's view in the field. Although our interface could be applied to many domains, the PlanWell prototype was developed to facilitate petroleum engineering tasks such as well planning and the coordination of drilling operations.
This video illustrates the design and demonstrates the interaction techniques of our PlanWell prototype.

Item Open Access
A Shape-Shifting Wall Display that Supports Individual and Group Activities (2015-04-22)
Takashima, Kazuki; Greenberg, Saul; Sharlin, Ehud; Kitamura, Yoshifumi

We contribute a shape-shifting wall display that dynamically changes its physical shape to support particular individual and group activities. Our current prototype comprises three vertical slim screens, each mounted on a mobile robot. Different shapes are created by controlling the position and angle of each robot. Examples include: (1) a flat wall display for collaboratively sharing visual content; (2) three separated screens for individual work; (3) a concave screen that provides a more immersive and private experience; (4) a convex screen that supports a mix of individual and group work; and (5) tilted screens, among others. Shape-changing is controlled explicitly or implicitly. Explicit control occurs when a collaborator indicates a desired shape via a hand gesture. Implicit control occurs when the system infers a change in the group's context, e.g., as determined from a change in the screen's contents or by monitoring the spatial relations between participants around the display (via proxemics and F-formations). Several interaction scenarios illustrate how a shape-changing display is used in practice.
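The implicit control described in the shape-shifting wall abstract (inferring a target configuration from group context) can be illustrated with a minimal sketch. The function name, the context cues, and the decision rules below are illustrative assumptions, not the paper's actual classifier; the real system monitors proxemics and F-formations rather than these two simple inputs.

```python
# Hypothetical sketch of an implicit shape-control policy: map simple
# group-context cues to one of the wall configurations named in the abstract.
# The rules here are assumptions for illustration only.

def infer_shape(num_people: int, shared_content: bool) -> str:
    """Pick a wall configuration from coarse context cues."""
    if num_people <= 1:
        return "concave"    # immersive, private single-user experience
    if shared_content:
        return "flat"       # flat wall for collaboratively shared content
    return "separated"      # independent screens for individual work

print(infer_shape(1, False))  # concave
print(infer_shape(3, True))   # flat
```

A real controller would replace these cues with tracked participant positions and screen-content analysis, and would drive the mobile robots toward the chosen configuration.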