# | Title | Team Members | TA | Documents | Sponsor |
---|---|---|---|---|---|
17 | IMUsic (Runner-Up) | John Born, Kuang Wang, Miguel Torres | Kristina Miller | design_document1.pdf, design_document2.pdf, final_paper2.pdf, final_paper3.docx, photo1.jpeg, proposal1.pdf, video | |
# Project Title: IMUsic

### Members: Miguel Torres (miguelt2), John Born (jmborn2), Kuang Wang (kwang69)

## Problem

Music and movement are both major contributors to emotional expression in a performance. Traditionally, the music is dictated by the musician, whose interaction with musical instruments requires trained dexterity and coordination (e.g., the flute requires memorizing note fingerings). This forces the composition and the choreography of movement to be completed separately. With advances in computing technology, creating music has become more accessible to non-musicians through electronic instruments and computers. However, interacting with these newer instruments still requires pushing buttons, twisting knobs, and moving sliders during a performance. Unfortunately, this continues to exclude dancers from being active participants in the compositions in which they perform.

## Solution Overview

To provide a more liberating way to interface with electronic instruments, this project aims to build a set of wrist- and ankle-worn devices whose orientation (pitch, yaw, and roll) is mapped to parameters that control the sonic characteristics of a software synthesizer. The data will be transmitted wirelessly over Wi-Fi to a laptop using the Open Sound Control (OSC) protocol, which is typically carried over UDP. The mapping will be implemented in the SuperCollider IDE and will control characteristics such as pitch, volume, modulator frequencies of FM synthesis textures, and filter cutoff frequencies. This will allow dancers to directly control and contribute to their music.

## Solution Components

- Physical Devices:
  - Wristbands and ankle bands
  - 3D-printed enclosures
- Data Collection Module:
  - Collects data from an accelerometer, gyroscope, and magnetometer to estimate orientation.
  - Planned Inertial Measurement Unit (IMU): the MPU-9250 9-axis gyroscope/accelerometer/magnetometer chip.
  - Performs 9-DoF sensor fusion to produce an orientation output (yaw, pitch, and roll) that is later used to control the musical characteristics.
- Wireless Transmission Module:
  - A Wi-Fi-capable microcontroller will transmit the sensor data wirelessly. We are looking at the ESP8266 modules, which can read the IMU over I2C.
- Musical Characteristics Control Module:
  - The Open Sound Control (OSC) protocol will carry data between the devices and the virtual synth.
  - OSC is a URL-style communication protocol that transmits 32-bit, time-stamped data in real time.
  - SuperCollider, a real-time programming environment for audio synthesis, will host the synth.
- Power System:
  - Rechargeable 3.7 V LiPo batteries with buck-boost converters to step the voltage to the 3.3 V and 5 V required by the microcontrollers and PCB.

## Criterion for Success

Creating music should feel intuitive and reliable. The synth should change sounds smoothly, without unexpected jumps in pitch, filter settings, etc. To that end, we will investigate solutions for smoothing the incoming signal in either hardware or software. Since similar projects use the IMUs we plan to implement, we will take their approaches, such as the Kalman filter, into advisement. Ideally, if we can find a hardware solution, we will explore it further.

Additionally, body movements should alter the synth controls as if the performer were playing an instrument. Using the flexibility offered by the SuperCollider IDE, we will aim to control the following five parameters of our custom synthesizer: pitch, volume, modulation, filter cutoffs, and panning.

Furthermore, we want to ensure that the music synthesis is real-time and reliable. Using the OSC protocol, we expect that latency will not be an issue, since the data transfer occurs in real time. That said, we will aim to keep any filtering and smoothing that occurs before transmission under 15 ms.
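The Kalman filter mentioned above is one option for smoothing; a lighter-weight alternative commonly used with IMUs is a complementary filter. The Python sketch below is purely illustrative (the function names, the 0.98 weight, and the axis conventions are our assumptions, not part of the design): it fuses an integrated gyroscope rate with an accelerometer tilt estimate for a single axis.

```python
import math

def accel_pitch(ax, ay, az):
    """Tilt (pitch, in degrees) estimated from accelerometer axes, in g."""
    return math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))

def complementary_filter(prev_pitch, gyro_rate_dps, ax, ay, az, dt, alpha=0.98):
    """Blend integrated gyro rate (fast but drifts) with the accel tilt
    (noisy but drift-free).

    alpha near 1 trusts the gyro short-term; the small (1 - alpha) term
    slowly corrects toward the accelerometer estimate, suppressing the
    sudden jumps we want to avoid in the synth controls.
    """
    gyro_estimate = prev_pitch + gyro_rate_dps * dt
    return alpha * gyro_estimate + (1 - alpha) * accel_pitch(ax, ay, az)
```

A Kalman filter models the sensor noise explicitly; the complementary filter trades that rigor for a few arithmetic operations per sample, which matters on an ESP8266-class microcontroller with a 15 ms smoothing budget.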
We will also take into consideration data losses and how they affect the accuracy of the synth controls. We will monitor the data at the sensors and compare it to the data received over Wi-Fi to minimize how much data is lost.

## Competitors

This project draws inspiration from the Mi.Mu Gloves, a wireless gesture-based interface for music production software. While the Mi.Mu Gloves use machine-learning algorithms to map specific gestures into the Ableton music production software, our wrist and ankle devices will transmit real-time data to control the parameters of our custom software synthesizer. In addition, our devices will focus on whole-body movement, so that dance performances can be incorporated more easily than with hand-only controls.
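To make the "32-bit, time-stamped" OSC framing described in the Solution Components concrete, the sketch below hand-encodes a minimal OSC message carrying one float parameter (Python, standard library only). The `/imu/pitch` address is a hypothetical example, and real firmware would likely use an existing OSC library rather than hand-packing bytes.

```python
import struct

def osc_message(address, value):
    """Encode a minimal OSC message with one 32-bit float argument.

    Per OSC 1.0 encoding: the address and the type-tag string (",f") are
    null-terminated and padded to 4-byte boundaries, followed by the
    argument as a big-endian 32-bit float. The result fits in a single
    UDP datagram.
    """
    def pad(s):
        b = s.encode("ascii") + b"\x00"     # required null terminator
        return b + b"\x00" * (-len(b) % 4)  # pad up to a 4-byte boundary
    return pad(address) + pad(",f") + struct.pack(">f", value)

# Usage sketch (57120 is sclang's default listening port):
#   sock.sendto(osc_message("/imu/pitch", 12.5), (laptop_ip, 57120))
```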