A Perceptive Mobile Program to Analyze Finger Posture on the Piano using Machine Learning and Object Detection

Authors

Yuhang Zeng1 and Jonathan Sahagun2, 1USA, 2California State Polytechnic
University, USA

Abstract

I wanted to solve this problem after noticing that many new pianists struggle with their hand posture. My project addresses it with an app that uses a phone's camera to track finger movements [1]. Because smartphones are extremely commonplace, this solution is available to most people. A Firebase server processes images from the app and returns numbers that the app then displays [2]. The app also tracks practice sessions and shows data about the hands in every session. I had to optimize many parts of the program, including the frequency at which frames are sent to the server for analysis. I also experimented with different phone angles and measured the latency of the server. Overall, this is an innovative app that improves piano posture for new pianists by encouraging good practicing habits.
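The abstract describes a client-server loop in which the phone sends every N-th camera frame to a server for analysis and displays the numbers it gets back. Below is a minimal Python sketch of that loop for illustration only: the endpoint URL, the JSON response shape, and the SEND_EVERY_N_FRAMES constant are assumptions, and the paper's actual app runs on a phone with a Firebase backend rather than OpenCV and HTTP as shown here.

import time
import cv2        # OpenCV for camera capture (stand-in for the phone camera)
import requests   # simple HTTP client (stand-in for the app's server calls)

# Hypothetical analysis endpoint; the paper's real Firebase setup is not specified here.
ANALYZE_URL = "https://example-project.cloudfunctions.net/analyzeFrame"

SEND_EVERY_N_FRAMES = 15   # tunable: how often a frame is sent for analysis

def practice_session(camera_index: int = 0) -> None:
    """Capture frames, periodically send one to the server, and print the result."""
    cap = cv2.VideoCapture(camera_index)
    frame_count = 0
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            frame_count += 1
            if frame_count % SEND_EVERY_N_FRAMES == 0:
                # Encode the frame as JPEG and upload it for posture analysis.
                encoded, jpeg = cv2.imencode(".jpg", frame)
                if not encoded:
                    continue
                start = time.time()
                resp = requests.post(
                    ANALYZE_URL,
                    files={"frame": ("frame.jpg", jpeg.tobytes(), "image/jpeg")},
                    timeout=5,
                )
                latency = time.time() - start
                # The server is assumed to return per-hand posture scores as JSON.
                print(f"latency={latency:.2f}s scores={resp.json()}")
    finally:
        cap.release()

if __name__ == "__main__":
    practice_session()

Lowering SEND_EVERY_N_FRAMES gives the user faster feedback at the cost of more server requests and latency sensitivity, which is the trade-off the abstract says was tuned experimentally.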

Keywords

Piano, Tracking, Computer Vision, Object Detection.

Volume 14, Number 4