Project 75: Mobile Gesture-Based Music Composition

Team Members: Adam Marcott, Tyler Oneill
TA: John Capozzo
Documents: design_document0.pdf, final_paper0.pdf, proposal0.pdf
The aim of this project is to build a device that captures motion and tactile data from a musician and generates MIDI (Musical Instrument Digital Interface) signals and sounds on a mobile platform. The goal is a composition workflow that is more intuitive than a traditional graphical user interface (GUI) and does not require specialized hardware.
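
To make the MIDI side of this concrete, the minimal sketch below shows how a recognized gesture could be encoded as standard three-byte MIDI note-on/note-off messages. The three-byte status/data layout comes from the MIDI specification; the gesture names, pitch mapping, and velocity scaling are illustrative assumptions, not the project's actual gesture set.

```python
# Minimal sketch: encode a recognized gesture as raw MIDI note-on/note-off bytes.
# Only the three-byte MIDI message layout (status, data1, data2) is standard;
# the gesture-to-pitch mapping and velocity scaling are illustrative assumptions.

NOTE_ON = 0x90   # status nibble for note-on, OR'd with the channel (0-15)
NOTE_OFF = 0x80  # status nibble for note-off

# Hypothetical mapping from finger gestures to MIDI note numbers (middle C = 60).
GESTURE_TO_NOTE = {
    "index_tap": 60,   # C4
    "middle_tap": 62,  # D4
    "ring_tap": 64,    # E4
    "pinky_tap": 65,   # F4
}

def gesture_to_midi(gesture, intensity, channel=0):
    """Return a note-on message for a gesture; intensity in [0, 1] maps to velocity."""
    note = GESTURE_TO_NOTE[gesture]
    velocity = max(1, min(127, int(round(intensity * 127))))
    return bytes([NOTE_ON | channel, note, velocity])

def gesture_note_off(gesture, channel=0):
    """Return the matching note-off message when the gesture is released."""
    return bytes([NOTE_OFF | channel, GESTURE_TO_NOTE[gesture], 0])

if __name__ == "__main__":
    msg = gesture_to_midi("index_tap", intensity=0.8)
    print(msg.hex(" "))  # prints "90 3c 66"
```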

Unlike many other types of artists, musicians have no convenient way to record ideas for a new piece of music when they are away from their recording equipment. To prevent this creative loss, we propose a mobile platform that converts the motion of a musician's hands and fingers into samples that can be used to compose music on the go. Some solutions already exist, such as Air Beats, a glove that lets musicians compose music digitally without interacting with a screen. Simpler alternatives such as the Xkey are MIDI keyboards, which require a stable surface and ample space for movement to be effective composition tools; our device needs neither, since its gestures are compact and require only as much space as the hands themselves occupy. No gesture-based product has yet succeeded commercially, because existing approaches either rely on expensive pre-existing technology of questionable reliability (e.g., digital paper techniques) or sacrifice throughput by using an inexpressive language for pattern recognition. Our solution addresses both problems by using a minimal set of viable hardware together with an expressive language that quickly translates motion into synthesizable sound.

Benefits:

Compose music on the move, or when other recording equipment is not available.
Ideal for a quick sketch that captures a creative idea before it is lost
Quickly demo the idea in your head to other people

Features:

User Interface
An easy-to-learn and expressive gesture language that uses hand and finger motion to navigate menus, turn knobs, and play notes (see the sketch after this list)
Personalizable shortcuts and gestures can extend the language.
Interface app runs on the user's smartphone to communicate with the wearable hardware.
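
One possible way to represent the gesture language is as a base vocabulary mapping gestures to interface actions, with user-defined shortcuts layered on top. The sketch below illustrates the idea; all gesture and action names are assumptions for illustration, not the project's final vocabulary.

```python
# Minimal sketch of a gesture vocabulary: a base mapping from gestures to UI
# actions, plus a user-defined overlay for personalized shortcuts.
# All gesture and action names here are illustrative assumptions.

BASE_VOCABULARY = {
    "swipe_left": "menu_back",
    "swipe_right": "menu_forward",
    "twist_cw": "knob_increase",
    "twist_ccw": "knob_decrease",
    "finger_tap": "play_note",
}

def resolve_action(gesture, user_shortcuts):
    """Personal shortcuts take precedence over the base vocabulary."""
    return user_shortcuts.get(gesture, BASE_VOCABULARY.get(gesture))

# Example: a user rebinds a fist clench to toggle recording.
shortcuts = {"fist_clench": "toggle_record"}
print(resolve_action("twist_cw", shortcuts))     # knob_increase
print(resolve_action("fist_clench", shortcuts))  # toggle_record
```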

Music Production Features:

Wearable hardware that captures hand and finger movement and generates MIDI data.
Instant playback during composition
Ability to draw from a sound sample library and attach samples to MIDI notes (see the sketch after this list)
Two or more devices can be connected and work together.
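
As a sketch of how library samples might be attached to MIDI notes for instant playback, the snippet below keeps a note-to-sample table and looks up the sample to trigger whenever a note-on message arrives. The file names and the `trigger_playback` hook are hypothetical placeholders for the app's audio engine.

```python
# Minimal sketch: attach sound samples to MIDI note numbers and trigger playback
# on note-on. File names and the playback hook are hypothetical placeholders.

NOTE_TO_SAMPLE = {
    60: "samples/kick.wav",
    62: "samples/snare.wav",
    64: "samples/hihat.wav",
}

def trigger_playback(path, velocity):
    # Stand-in for the app's audio engine; here we just log the event.
    print(f"play {path} at velocity {velocity}")

def handle_midi(message):
    """Dispatch note-on messages (status 0x9n with velocity > 0) to sample playback."""
    status, note, velocity = message[0], message[1], message[2]
    if (status & 0xF0) == 0x90 and velocity > 0:
        sample = NOTE_TO_SAMPLE.get(note)
        if sample is not None:
            trigger_playback(sample, velocity)

handle_midi(bytes([0x90, 60, 100]))  # -> play samples/kick.wav at velocity 100
```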

Development Goals:

Demonstrate what makes a "language" expressive, easy to learn, and powerful to use. Possible examples: vim, stenotyping, Cornell note taking, Dvorak keyboard layout, American Sign Language, etc.
Evaluate the kinesthetics of the hand so that users can sustain a rapid pace without harming hand health (e.g., carpal tunnel syndrome)
Propose alternatives to the hand for impaired individuals

