Investigating rhythmic accuracy using 3D spatial interaction for digital musical input: MoveMIDI
Many human-computer interfaces allow users to interact with music software to create and perform music, and some let users do so through movements of their body. This thesis describes a form of movement interaction called 3D spatial interaction and evaluates its application to human control of music software using the MoveMIDI prototype and its conceptual framework. MoveMIDI interprets a user's positional body movement relative to a virtual 3D environment to control music software. In a user study, the usability of the initial MoveMIDI prototype as a rhythmic input device was evaluated by measuring participants' rhythmic accuracy with the prototype, a stick-based drum interface, and a finger-drumming interface. The study revealed initial spatial unsureness among participants using MoveMIDI, attributable to visualization issues and a lack of haptic feedback. These findings prompted the creation of a follow-up prototype incorporating head-mounted-display 3D visualization and haptic feedback.