Browsing by Author "Shekhar Nittala, Aditya"
Now showing 1 - 4 of 4
Item Open Access
DanceShala - A Visual Feedback Interface for Dance Learning (2022-08-31)
Mukherjee, Suvojit; Alim, Usman; Kenny, Sarah; Katz, Larry; Shekhar Nittala, Aditya

Dance is a beautiful art form that can be enjoyed by people of any age. One can learn dance from a teacher in a dance studio, and the visual feedback received from an instructor in an in-person class is one of the best ways to improve dance learning. However, time and location constraints sometimes make it impossible to attend classes in person. The alternatives are to attend dance classes online or to self-learn with the help of dance games (video games in which a player attempts to follow a pattern of dance steps shown on screen in time to music). Organized remote visual feedback can assist a learner in such scenarios, but online dance classes and dance games may not be sufficient for a new learner because the feedback they provide is not always adequate. To make online dance learning more comprehensive for new learners, we created a visual feedback interface named ‘DanceShala’ that delivers comparative visual feedback to students by comparing teacher and student movements. In this study, the dance movements of the teacher and the student are recorded. After processing the recorded movement data, feedback is generated on the correctness of the student’s movements relative to the teacher’s. The visual feedback is displayed through an interface that helps the student identify the errors made, using the teacher’s video as the reference. In the last stage of the study, a survey is administered to understand user perception of the interface.
This research is an interdisciplinary study combining Computer Science, Kinesiology, and Dance.

Item Open Access
Flying Frustum: A Spatial Interface for Enhancing Human-UAV Awareness (2015-06-09)
Li, Nico; Cartwright, Stephen; Shekhar Nittala, Aditya; Sharlin, Ehud; Costa Sousa, Mario

We present Flying Frustum, a 3D spatial interface that enables control of semi-autonomous UAVs (Unmanned Aerial Vehicles) using pen interaction on a physical model of the terrain, and that spatially situates the information streaming from the UAVs onto the physical model. Our interface is based on a 3D printout of the terrain, which allows the operator to enter goals and paths for the UAV by drawing them directly on the physical model. In turn, the UAV’s streaming reconnaissance information is superimposed on the 3D printout as a view frustum, situated according to the UAV’s position and orientation on the actual terrain. We argue that Flying Frustum’s 3D spatially situated interaction can potentially help improve human-UAV awareness, allow a better operator-to-UAV ratio, and enhance overall situational awareness. We motivate our design approach for Flying Frustum, discuss previous related work in CSCW and HRI, present our current prototype using both handheld and headset augmented reality interfaces, reflect on Flying Frustum’s strengths and weaknesses, and discuss our plans for future evaluation and prototype improvements.

Item Open Access
FLYING FRUSTUM: A Spatial Interface for Enhancing Human-UAV Awareness: The Video (2016-01-18)
Li, Nico; Cartwright, Stephen; Shekhar Nittala, Aditya; Sharlin, Ehud; Costa Sousa, Mario

We present Flying Frustum, a 3D spatial interface that enables control of semi-autonomous UAVs (Unmanned Aerial Vehicles) using pen interaction on a physical model of the terrain, and that spatially situates the information streaming from the UAVs onto the physical model.
Our interface is based on a 3D printout of the terrain, which allows the operator to enter goals and paths for the UAV by drawing them directly on the physical model. In turn, the UAV’s streaming reconnaissance information is superimposed on the 3D printout as a view frustum, situated on the physical model according to the UAV’s location on the actual terrain. We argue that Flying Frustum’s 3D spatially situated interaction can help improve human-UAV awareness, allow a better operator-to-UAV ratio, and enhance overall situational awareness. In this video we illustrate the design and demonstrate the proof-of-concept system of Flying Frustum.

Item Open Access
PLANWELL: Spatial Interface for Collaborative Petroleum-Well Planning: The Video (2016-01-19)
Shekhar Nittala, Aditya; Li, Nico; Cartwright, Stephen; Takashima, Kazuki; Sharlin, Ehud; Costa Sousa, Mario

We present our prototype of PlanWell, a spatial augmented reality interface that facilitates collaborative field operations. PlanWell allows a central overseer (in a command and control center) and a remote explorer (an outdoor user in the field) to explore and collaborate within a geographical area. PlanWell provides the overseer with a tangible user interface (TUI) based on a 3D printout of surface geography, which acts as a physical representation of the region to be explored. Augmented reality is used to dynamically overlay properties of the region, as well as the presence and actions of the remote explorer, onto the 3D representation of the terrain. The overseer can perform actions directly on the TUI, and these actions are then presented as dynamic AR visualizations superimposed on the explorer’s view in the field. Although our interface could be applied to many domains, the PlanWell prototype was developed to facilitate petroleum engineering tasks such as well planning and the coordination of drilling operations.
This video illustrates the design and demonstrates the interaction techniques of our PlanWell prototype.