Project 35: Acoustic Motion Tracking

Team Members: Hojin Chun, Sean Nachnani
TA: Yuchen He
Documents: design_review, other, proposal

Group Members:
Sean Nachnani (nachnan2)
Kevin Chun (hchun8)

General Description:
The goal of this project is to use sound, rather than video, as a means of motion recognition. Current smart devices rely solely on natural language processing to interpret a user's needs. We want to expand on this and allow devices to carry out commands issued through simple gestures.
The current plan is to build a 4-input microphone array with an ADC that supports at least a 48 kHz sample rate, paired with a speaker that can reproduce sounds up to at least 24 kHz. We will start by sending pseudo-random pulses across a large bandwidth and correlating the transmitted signal with the input received at the microphones. Time permitting, we will switch to FMCW (frequency-modulated continuous-wave) ranging as the basis for this approach. This will allow us to achieve accurate distance and velocity measurements, and potentially to transmit in the inaudible range.
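As a rough illustration of the FMCW idea (the chirp band, duration, and target range below are illustrative assumptions, not final design values): mixing the transmitted chirp with its echo produces a beat tone whose frequency is proportional to range.

```python
import numpy as np

fs = 48_000              # sample rate (Hz)
c = 343.0                # speed of sound (m/s)
T = 0.05                 # chirp duration (s)
f0, f1 = 15_000, 23_000  # swept band (Hz); illustrative values
k = (f1 - f0) / T        # sweep rate (Hz/s)

t = np.arange(int(fs * T)) / fs
tx = np.cos(2 * np.pi * (f0 * t + 0.5 * k * t**2))

# Simulated echo from a target 1 m away (round-trip delay tau).
tau = 2 * 1.0 / c
d = int(round(tau * fs))
rx = np.r_[np.zeros(d), tx[:-d]]

# De-chirping (mixing tx with rx) yields a beat tone at f_b = k * tau,
# so range can be read off an FFT peak: R = f_b * c / (2 * k).
beat = tx * rx
spec = np.abs(np.fft.rfft(beat * np.hanning(len(beat))))
f_b = np.fft.rfftfreq(len(beat), 1 / fs)[np.argmax(spec)]
range_est = f_b * c / (2 * k)   # close to the simulated 1 m
```

Because the beat frequency is a narrow tone while everything else is spread across the band, the range estimate is robust even with a coarse FFT.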
I have spent the last month prototyping this device using a Raspberry Pi and a speaker array. I've gotten the pseudo-random pulse approach to work, coding all the signal processing in Python, mainly with the PyAudio and SciPy libraries. The prototype currently samples at 44.1 kHz and uses a speaker that can play up to 20 kHz. I was able to achieve accurate measurements within the range of a normal living room (about the size of a smaller classroom in ECEB).
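The pulse-correlation step described above can be sketched as follows. This is a minimal NumPy/SciPy example with a simulated echo, not the prototype's actual code; the MLS probe length, delay, and noise level are illustrative assumptions.

```python
import numpy as np
from scipy.signal import correlate, max_len_seq

fs = 48_000                     # sample rate (Hz), per the planned ADC
c = 343.0                       # speed of sound (m/s)

# Pseudo-random probe: a maximum-length sequence has a sharp, spike-like
# autocorrelation, so the echo delay shows up as a clean correlation peak.
probe, _ = max_len_seq(10)      # 1023-chip MLS of 0s and 1s
probe = 2.0 * probe - 1.0       # map to +/-1

# Simulated received signal: the probe delayed by 120 samples, plus noise.
true_delay = 120
rng = np.random.default_rng(0)
rx = np.zeros(4096)
rx[true_delay:true_delay + len(probe)] += probe
rx += 0.1 * rng.standard_normal(len(rx))

# Cross-correlate and take the peak lag: that is the round-trip delay.
corr = correlate(rx, probe, mode="valid")
est_delay = int(np.argmax(corr))
distance = c * est_delay / fs / 2   # halve the round trip -> one-way distance (m)
```

The same correlate-and-peak-pick logic runs per microphone channel; only the input buffers change.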
We plan on building the microphone array from 4 MEMS microphones with appropriate ADCs to sample at up to 48 kHz. This lets us play sounds up to 24 kHz, which gives us enough bandwidth for accurate measurements. We'll also use a microcontroller (most likely a Raspberry Pi) to sample the microphones and perform the necessary DSP. The system will be designed to be plugged into a regular power outlet.
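With multiple microphones, the per-channel delays from the correlation step also give direction, which is what makes motion tracking (rather than just ranging) possible. A minimal sketch, assuming a hypothetical 5 cm spacing between one microphone pair (not a final design value):

```python
import numpy as np

fs = 48_000   # sample rate (Hz)
c = 343.0     # speed of sound (m/s)
d_mic = 0.05  # hypothetical spacing between a microphone pair (m)

def angle_of_arrival(delay_a, delay_b):
    """Bearing (degrees) of a reflector from one mic pair, given the
    per-channel echo delays in samples from the correlation step."""
    dt = (delay_a - delay_b) / fs                 # time-difference of arrival (s)
    s = np.clip(c * dt / d_mic, -1.0, 1.0)        # sin(theta), clamped
    return float(np.degrees(np.arcsin(s)))
```

Combining the bearings from two orthogonal mic pairs with the range estimate localizes the hand in 2D or 3D.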

Related Research Papers:
CAT: High-Precision Acoustic Motion Tracking http://www.cs.utexas.edu/~wmao/resources/papers/cat.pdf
FingerIO: Using Active Sonar for Fine-Grained Finger Tracking https://fingerio.cs.washington.edu/fingerio.pdf

Low Cost Myoelectric Prosthetic Hand

Michael Fatina, Jonathan Pan-Doh, Edward Wu

Featured Project

According to the WHO, 80% of amputees are in developing nations, and less than 3% of that 80% have access to rehabilitative care. In a study by Heidi Witteveen, "the lack of sensory feedback was indicated as one of the major factors of prosthesis abandonment." A low cost myoelectric prosthetic hand interfaced with a sensory substitution system restores functionality, increases availability to amputees, and provides users with sensory feedback.

We will work with Aadeel Akhtar to develop a new iteration of his open source, low cost, myoelectric prosthetic hand. The current revision uses eight EMG channels, with sensors placed on the residual limb. A microcontroller communicates with an ADC, runs a classifier to determine the user's type of grip, and controls the motors in the hand, achieving desired grips at predetermined velocities.

As requested by Aadeel, the socket and hand will operate independently using separate microcontrollers and interface with each other, providing modularity and customizability. The microcontroller in the socket will interface with the ADC and run the grip classifier, which will be expanded so that finger velocities correspond to the amplitude of the user's muscle activity. The hand microcontroller controls the motors and receives grip and velocity commands. Contact reflexes will be added via pressure sensors in the fingertips, adjusting grip strength and velocity. The hand microcontroller will interface with existing sensory substitution systems using the pressure sensors. A PCB with a custom motor controller will fit inside the palm of the hand and interface with the hand microcontroller.
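One way the contact-reflex behavior could work, as a hedged sketch: the function name, thresholds, and the simple proportional law below are illustrative assumptions, not the project's actual firmware.

```python
# All names and constants below are illustrative assumptions.
CONTACT_THRESHOLD = 0.2  # normalized fingertip pressure that declares contact
FORCE_SETPOINT = 0.5     # normalized grip-force target after contact
KP = 0.8                 # proportional gain on the force error

def finger_velocity_command(classifier_velocity, pressure):
    """Velocity command for one finger motor: follow the classifier in free
    motion, then servo grip force toward the setpoint once contact is made."""
    if pressure < CONTACT_THRESHOLD:
        return classifier_velocity
    return KP * (FORCE_SETPOINT - pressure)
```

Running a loop like this on the hand microcontroller keeps the reflex local, so the socket-side classifier never needs the pressure data.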
