While completing my M.S. in computer science at Baylor University, I researched human-computer interaction with Dr. Poor. In our research we conducted a user study evaluating users' rhythmic accuracy with different input devices, including a digital drum pad set played with sticks, a finger-drumming pad controller, and MoveMIDI, my prototype enabling rhythmic interaction through spatial in-air arm gestures. In response to the study's findings, I created a follow-up MoveMIDI prototype using the Oculus Quest 1 VR headset. I completed my thesis on 3D spatial interfaces for musical input.
As I am currently trying to publish a conference paper based on the work in my thesis, I am not yet including a digital download or deeper content on this project page. I plan to update this page after publication, but I have included my thesis title and abstract below.
Investigating Rhythmic Accuracy Using 3D Spatial Interaction for Digital Musical Input: MoveMIDI
Many human-computer interfaces exist that allow users to interact with music software to create and perform music. Some of these interfaces allow users to create music through movements of their body. This thesis describes a form of movement interaction called 3D spatial interaction and evaluates its application to human control of music software using the MoveMIDI prototype and conceptual framework. MoveMIDI interprets a user's positional body movement relative to its virtual 3D environment to control music software. In a user study, the usability of the initial MoveMIDI prototype as a rhythmic input device was evaluated by measuring the rhythmic accuracy of participants using the prototype, a drum interface with sticks, and a finger-drumming interface. The study revealed initial spatial unsureness among participants using MoveMIDI due to visualization issues and a lack of haptic feedback. These issues prompted the creation of a follow-up prototype with head-mounted-display 3D visualization and haptic feedback.
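To illustrate the general idea of positional triggering described above, here is a minimal, hypothetical sketch (not the actual MoveMIDI implementation, whose details are in the thesis): a tracked hand position is tested against a virtual 3D "pad" volume, and a hit is emitted only on the frame the hand first enters the volume, so holding the hand inside does not retrigger the drum. The pad names, coordinates, and class names are all illustrative assumptions.

```python
# Hypothetical sketch of 3D spatial drum triggering, not the real MoveMIDI code.
from dataclasses import dataclass

@dataclass
class Pad:
    name: str
    center: tuple        # (x, y, z) position in the virtual environment
    half_extent: float   # half the side length of the cubic trigger volume

    def contains(self, pos):
        # Axis-aligned containment test against the cubic volume.
        return all(abs(p - c) <= self.half_extent
                   for p, c in zip(pos, self.center))

class HitDetector:
    """Edge-triggered hit detection: a pad fires only when the hand
    transitions from outside to inside its trigger volume."""
    def __init__(self, pads):
        self.pads = pads
        self.inside = {pad.name: False for pad in pads}

    def update(self, hand_pos):
        hits = []
        for pad in self.pads:
            now_inside = pad.contains(hand_pos)
            if now_inside and not self.inside[pad.name]:
                hits.append(pad.name)  # new entry: trigger a drum sound / MIDI note
            self.inside[pad.name] = now_inside
        return hits

snare = Pad("snare", center=(0.0, 1.0, 0.5), half_extent=0.1)
detector = HitDetector([snare])
print(detector.update((0.0, 1.05, 0.5)))  # hand enters pad -> ['snare']
print(detector.update((0.0, 1.05, 0.5)))  # still inside -> [] (no retrigger)
print(detector.update((0.0, 1.50, 0.5)))  # hand has left -> []
```

In a real system each hit would be mapped to a MIDI note-on message sent to the music software; the sketch keeps only the spatial-interaction logic.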