Project #43: Real-Time Sound Visualization

Team Members: Lin Le, Qian Chen, Xinyue Yu
TA: Dongwei Shi
Documents: design_review, design_review, final_paper, other, other, other, presentation, video
We plan to design a sound visualization system that uses a pitch detector to detect pitch in real time and displays the result as musical notation on a screen. Furthermore, we will store the melody and mimic instrument sounds, such as piano, on chips.

1. Detect Pitch.

We plan to build a hardware pitch detector that samples sound in real time at a 10 kHz sampling rate, with an LED to indicate whether the detector is on or off. Autocorrelation analysis, center clipping, and infinite peak clipping will be used to build the detector.
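
A minimal sketch of this pipeline is below, not the team's final code: the 10 kHz rate comes from the proposal, while the 512-sample frame, 30% clip level, and lag range are illustrative assumptions. Center clipping plus infinite peak clipping reduces each sample to -1, 0, or +1 before autocorrelation, a standard simplification for autocorrelation pitch detectors.

```cpp
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <cmath>

constexpr float kSampleRate = 10000.0f; // 10 kHz sampling rate
constexpr std::size_t kFrameSize = 512; // assumed analysis frame

// Center clipping followed by infinite peak clipping: samples inside
// +/- clipLevel become 0; everything above maps to +1, below to -1.
int8_t clip3(float x, float clipLevel) {
    if (x > clipLevel)  return 1;
    if (x < -clipLevel) return -1;
    return 0;
}

// Return the detected pitch in Hz, or 0 when no periodicity is found.
// n must be <= kFrameSize.
float detectPitch(const float* frame, std::size_t n) {
    // Clip at 30% of the frame's peak amplitude (a common heuristic).
    float peak = 0.0f;
    for (std::size_t i = 0; i < n; ++i)
        peak = std::fmax(peak, std::fabs(frame[i]));
    int8_t clipped[kFrameSize];
    for (std::size_t i = 0; i < n; ++i)
        clipped[i] = clip3(frame[i], 0.3f * peak);

    // Autocorrelation over lags covering roughly 50-1000 Hz.
    const std::size_t minLag = kSampleRate / 1000.0f;
    const std::size_t maxLag = kSampleRate / 50.0f;
    long best = 0;
    std::size_t bestLag = 0;
    for (std::size_t lag = minLag; lag <= maxLag && lag < n; ++lag) {
        long sum = 0;
        for (std::size_t i = 0; i + lag < n; ++i)
            sum += clipped[i] * clipped[i + lag];
        if (sum > best) { best = sum; bestLag = lag; }
    }
    return bestLag ? kSampleRate / bestLag : 0.0f;
}

int main() {
    // Feed the detector a synthetic 200 Hz sine as a smoke test.
    float frame[kFrameSize];
    for (std::size_t i = 0; i < kFrameSize; ++i)
        frame[i] = std::sin(2.0f * 3.14159265f * 200.0f * i / kSampleRate);
    std::printf("detected: %.1f Hz\n", detectPitch(frame, kFrameSize));
    return 0;
}
```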

2. Output music notation in real time.

Once a note has been detected, it will appear on the screen at the correct position on the staff, and the previous notes will shift to the right, so the notation looks like flowing music. The screen will be connected to the board that holds the detector, so the sound can be displayed in real time.
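
Placing a note on the staff requires quantizing the detected frequency to a note. A small sketch, assuming A4 = 440 Hz equal temperament; the note-name table and rounding rule are our assumptions, not the team's mapping:

```cpp
#include <cmath>
#include <cstdio>

const char* kNoteNames[12] = {"C", "C#", "D", "D#", "E", "F",
                              "F#", "G", "G#", "A", "A#", "B"};

// Nearest MIDI note number (A4 = 69); each semitone is a factor of 2^(1/12).
int freqToMidi(float hz) {
    return (int)std::lround(69.0 + 12.0 * std::log2(hz / 440.0));
}

int main() {
    float hz = 262.0f;  // e.g., a detected pitch near middle C
    int midi = freqToMidi(hz);
    std::printf("%s%d\n", kNoteNames[midi % 12], midi / 12 - 1); // prints C4
    return 0;
}
```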

3. Store the melody.

We will store the detected melody, as a sequence of pitches, in registers for future replay.
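
One possible storage layout is sketched below, assuming the melody is kept as (pitch, duration) pairs in a fixed-size buffer; the 256-entry capacity and field widths are assumptions:

```cpp
#include <cstddef>
#include <cstdint>

struct NoteEvent {
    uint8_t  midiNote;    // detected pitch as a MIDI note number
    uint16_t durationMs;  // how long the note was held
};

class MelodyBuffer {
public:
    bool record(NoteEvent e) {  // append one note; returns false when full
        if (count_ >= kCapacity) return false;
        events_[count_++] = e;
        return true;
    }
    std::size_t size() const { return count_; }
    const NoteEvent& at(std::size_t i) const { return events_[i]; }
private:
    static constexpr std::size_t kCapacity = 256;
    NoteEvent events_[kCapacity];
    std::size_t count_ = 0;
};
```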

4. Mimic instrument sounds.

We will use an instrument sound package, with sounds such as guitar, piano, and violin, to replay the melody on an Arduino. The mimicry will not run in real time and is used only in replay mode.
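
As a minimal illustration of replay, the Arduino sketch below plays a stored melody with the core tone() function, which emits a square wave rather than a sampled instrument timbre; true guitar/piano/violin mimicry would need wavetable or sample playback. The pin number and melody are placeholders.

```cpp
const int kSpeakerPin = 8;

struct NoteEvent { unsigned int hz; unsigned long ms; };

// A short stored melody: {frequency in Hz, duration in ms}.
NoteEvent melody[] = {{262, 400}, {294, 400}, {330, 400}, {349, 800}};

void setup() {
    for (const NoteEvent& e : melody) {
        tone(kSpeakerPin, e.hz, e.ms);  // start the square-wave note
        delay(e.ms + 50);               // hold the note, plus a short gap
    }
    noTone(kSpeakerPin);
}

void loop() {}  // replay once; nothing to do afterward
```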

Featured Project: Low Cost Myoelectric Prosthetic Hand

Michael Fatina, Jonathan Pan-Doh, Edward Wu

According to the WHO, 80% of amputees are in developing nations, and less than 3% of that 80% have access to rehabilitative care. In a study by Heidi Witteveen, “the lack of sensory feedback was indicated as one of the major factors of prosthesis abandonment.” A low-cost myoelectric prosthetic hand interfaced with a sensory substitution system restores functionality, increases availability to amputees, and provides users with sensory feedback.

We will work with Aadeel Akhtar to develop a new iteration of his open source, low cost, myoelectric prosthetic hand. The current revision uses eight EMG channels, with sensors placed on the residual limb. A microcontroller communicates with an ADC, runs a classifier to determine the user’s type of grip, and controls motors in the hand, achieving desired grips at predetermined velocities.
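
A hedged sketch of that control flow, with every hardware interaction stubbed out; the real system reads the ADC over a serial bus and runs a trained grip classifier, so the energy threshold below is purely a placeholder:

```cpp
#include <array>
#include <cstdint>

constexpr int kNumEmgChannels = 8;  // eight EMG channels, per the text

enum class Grip { Rest, Power, Pinch };

// Stub: in hardware, read all eight channels from the external ADC.
std::array<uint16_t, kNumEmgChannels> readEmg() { return {}; }

// Placeholder classifier standing in for the trained one.
Grip classifyGrip(const std::array<uint16_t, kNumEmgChannels>& emg) {
    uint32_t energy = 0;
    for (uint16_t ch : emg) energy += ch;
    return energy > 4000 ? Grip::Power : Grip::Rest;  // illustrative rule
}

// Stub: in hardware, drive the hand's motors toward the grip.
void commandMotors(Grip) {}

int main() {
    for (;;) {
        auto emg = readEmg();              // sample the residual-limb sensors
        commandMotors(classifyGrip(emg));  // achieve the classified grip
    }
}
```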

As requested by Aadeel, the socket and hand will operate independently using separate microcontrollers and interface with each other, providing modularity and customizability. The microcontroller in the socket will interface with the ADC and run the grip classifier, which will be expanded so finger velocities correspond to the amplitude of the user’s muscle activity. The hand microcontroller controls the motors and receives grip and velocity commands. Contact reflexes will be added via pressure sensors in the fingertips, adjusting grip strength and velocity. The hand microcontroller will interface with existing sensory substitution systems using the pressure sensors. A PCB with a custom motor controller will fit inside the palm of the hand and interface with the hand microcontroller.
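
One way the two microcontrollers might exchange data is sketched below; the message layout, field sizes, and grip set are our assumptions, not the project’s actual protocol:

```cpp
#include <cstdint>

constexpr int kNumFingers = 5;

enum class Grip : uint8_t { Rest, Power, Pinch, Tripod };

// Socket MCU -> hand MCU: which grip to form and how fast, with finger
// velocities scaled from the amplitude of the user's muscle activity.
struct GripCommand {
    Grip    grip;                    // classifier output
    uint8_t velocity[kNumFingers];   // 0-255, proportional to EMG amplitude
};

// Hand MCU -> socket / sensory substitution system: fingertip pressures
// driving the contact reflexes and the user's sensory feedback.
struct PressureReport {
    uint16_t fingertip[kNumFingers];
};

int main() {
    GripCommand cmd{Grip::Power, {200, 200, 180, 160, 140}};
    (void)cmd;  // in hardware, this would be serialized over UART/I2C/SPI
    return 0;
}
```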
