Project 12: Hands-free DJ (Honorable Mention)

Team Members: Jie Du, Ningkai Wu, Yifei Teng

TA: Jacob Bryan

Documents: design_document0.pdf, final_paper0.pdf, other0.pdf, presentation0.pdf, proposal0.pdf
At a party, the DJ is the person responsible for supplying everyone with an endless stream of entertaining music. But we all know the DJ deeply wants to join the party! So we'll build a remote, gesture-controlled DJ console that any DJ can take into the action.

Formal description: Our system comprises two parts:
1. A compact device that straps to one's hand and collects gesture information. Gestures can be used to navigate a playlist, change various effects, manipulate voice recorded from a microphone, etc.
2. A phone app that implements the various signal processing functions and outputs the music. The app is driven by gesture data from the embedded device.

To send gesture data from the device to the app, we use the Bluetooth Low Energy protocol. The embedded device will contain a battery, an accelerometer, a gyroscope, a magnetometer, a barometer, a microphone, and a few buttons for testing. It fuses sensor data to estimate the pose of the hand, in the form of orientation and change in height. The microphone can detect one-shot events such as the snap of a finger. We will define a custom protocol to stream these events, along with the continuously changing gesture data, to the phone, which will use these data to perform signal processing tasks. In addition, the phone will record the user's voice through a microphone and mix it into the final audio. The microcontroller on the embedded device will need to be reasonably powerful to perform sensor fusion while simultaneously monitoring the microphone for the characteristic sound of a finger snap.
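The custom protocol has not been specified yet; as a minimal sketch, one possible encoding for a pose-update message could pack a type byte, a timestamp, a fixed-point orientation quaternion, a height delta, and a button bitmask into a single small BLE notification payload. Every field name and the exact layout here are our assumptions, not a finalized design:

```python
import struct

# Hypothetical packet layout (little-endian), sized to fit in one
# 20-byte BLE 4.x notification payload:
#   uint8   packet type (0 = pose update, 1 = one-shot event)
#   uint16  millisecond timestamp (wraps around)
#   4x int16 fixed-point quaternion components, scaled by 32767
#   int16   change in height, millimeters
#   uint8   button/event bitmask
POSE_FMT = "<BHhhhhhB"  # 1 + 2 + 4*2 + 2 + 1 = 14 bytes

def pack_pose(t_ms, quat, dh_mm, buttons=0):
    """Encode a pose update; quaternion components are clamped and
    scaled from [-1, 1] floats to int16 fixed point."""
    q = [max(-32767, min(32767, int(c * 32767))) for c in quat]
    return struct.pack(POSE_FMT, 0, t_ms & 0xFFFF, *q, dh_mm, buttons)

def unpack_pose(payload):
    """Decode a pose update back into Python values."""
    ptype, t_ms, qw, qx, qy, qz, dh_mm, buttons = struct.unpack(POSE_FMT, payload)
    quat = tuple(c / 32767 for c in (qw, qx, qy, qz))
    return ptype, t_ms, quat, dh_mm, buttons
```

A 14-byte message fits comfortably under the default BLE ATT payload limit, so each pose update can ride in a single notification without fragmentation.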

Here is a non-exhaustive list of the functionalities we propose:
1. Vocoder to change the texture of one’s voice.
2. Pitch shifting
3. Looping at the snap of a finger (background beatboxing?)
4. Reverb/Chorus
5. Switching background music, or advancing to the next item in a playlist
6. Pause/Resume
7. Wah-wah effect
8. One-shot sound effect (laugh track etc.)
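To give a flavor of what these effects involve on the phone side, here is a pure-Python sketch of one of them, a simple feedback delay line of the kind that underlies reverb and echo (item 4). The function name and parameters are ours for illustration; a real implementation would run on buffered audio frames, not Python lists:

```python
def echo(samples, delay, feedback=0.5, mix=0.5):
    """Feedback delay line: each output sample mixes in a decaying
    copy of the signal from `delay` samples earlier.

    samples:  input audio as a list of floats
    delay:    delay length in samples
    feedback: how much of the delayed signal is fed back (decay rate)
    mix:      wet/dry balance of the delayed signal in the output
    """
    buf = [0.0] * delay              # circular delay buffer
    out = []
    for i, x in enumerate(samples):
        d = buf[i % delay]           # delayed (wet) sample
        out.append(x + mix * d)
        buf[i % delay] = x + feedback * d  # write back with feedback
    return out
```

Feeding an impulse through this filter produces the classic train of echoes decaying by `feedback * mix` relationships each period, which is exactly the behavior a reverb/chorus chain builds on.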

For the sensors we plan to use the MPU-9250, and for the microcontroller we plan to use the LPC1768 Cortex-M3 chip. There will of course also be the relevant ADC and regulator circuits for the microphone and battery.
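The sensor-fusion step mentioned above would run in C on the LPC1768, likely as a full quaternion AHRS filter. As an illustrative sketch of the core idea, here is a single-axis complementary filter in Python: the gyro is integrated for responsive short-term tracking, and the accelerometer's gravity-derived angle is blended in to cancel the gyro's long-term drift. The function names and the blend constant are our assumptions:

```python
import math

def accel_to_pitch(ax, ay, az):
    """Pitch angle (radians) implied by the measured gravity vector."""
    return math.atan2(-ax, math.hypot(ay, az))

def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """One update step of a single-axis complementary filter.

    pitch:       previous pitch estimate (radians)
    gyro_rate:   angular rate from the gyroscope (rad/s)
    accel_pitch: angle computed from the accelerometer (radians)
    dt:          time since the last update (seconds)
    alpha:       trust in the gyro; (1 - alpha) corrects drift
    """
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch
```

Because the accelerometer term enters with weight `1 - alpha` on every step, a stationary estimate converges geometrically to the accelerometer angle, which is what keeps the integrated gyro from drifting away.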

Filtered Back-Projection Optical Demonstration

Tori Fujinami, Xingchen Hong, Jacob Ramsey

Featured Project

Project Description

Computed tomography, often referred to as CT or CAT scanning, is a modern technology used for medical imaging. While many people know of this technology, not many understand how it works. The concepts behind CT scans are abstract and often hard to visualize. Professor Carney has indicated that a small-scale device built for demonstration purposes would help students gain a more concrete understanding of the technical components behind CT. Using visible light rather than x-rays, we will design and build a simplified CT device for use as an educational tool.

Design Methodology

We will build a device with three components: a light source, a screen, and a stand to hold the object. After the user places an object on the stand and starts the scan, the device will record three projections by rotating either the camera and screen or the object. Using the three projections in tandem with an algorithm developed with a graduate student, our device will create a 3D reconstruction of the object.
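The actual reconstruction algorithm is still being developed with the graduate student. To illustrate the underlying principle, here is a deliberately simplified pure-Python sketch of unfiltered back-projection in 2D, using just two orthogonal projections (row sums and column sums) rather than the three angles the device will capture. Smearing each projection back across the grid and summing makes the intensities reinforce where the original object sat:

```python
def project(grid, axis):
    """Parallel projection of a square 2D grid of intensities:
    axis 0 sums down each column, axis 1 sums across each row."""
    n = len(grid)
    if axis == 0:
        return [sum(grid[r][c] for r in range(n)) for c in range(n)]
    return [sum(grid[r][c] for c in range(n)) for r in range(n)]

def back_project(col_proj, row_proj):
    """Unfiltered back-projection: smear each 1D projection back
    across the grid and add them. Cells where the smeared rays
    intersect accumulate the largest values."""
    n = len(col_proj)
    return [[col_proj[c] + row_proj[r] for c in range(n)] for r in range(n)]
```

Real filtered back-projection additionally applies a ramp filter to each projection before smearing, which removes the blur that plain summation introduces; this sketch shows only the geometric half of the idea.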

Hardware

• Motors to rotate camera and screen or object

• Grid of photo sensors built into screen

• Light source

• Power source for each of these components

• Control system to coordinate the timing of stage movement, switching the light on, and sensor readings
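The sequencing described in the last bullet could be sketched as a simple control loop: rotate, let the mechanics settle, flash the light while the photosensor grid is sampled, repeat for each projection. This is a hypothetical outline, not firmware; the callback names (`rotate`, `set_light`, `read_sensors`) are placeholders for whatever driver interface the final control system exposes:

```python
import time

def run_scan(rotate, set_light, read_sensors, n_projections=3, settle_s=0.1):
    """Hypothetical scan sequence for the demonstration device.

    rotate(angle_deg): move the camera/screen (or object) to an angle
    set_light(on):     switch the light source on or off
    read_sensors():    sample the photosensor grid, returning one projection
    """
    projections = []
    step = 360.0 / n_projections
    for i in range(n_projections):
        rotate(i * step)        # position for the next projection
        time.sleep(settle_s)    # let mechanical vibration settle
        set_light(True)         # illuminate only while sampling
        projections.append(read_sensors())
        set_light(False)
    return projections
```

Keeping the light off except during sampling both saves power and prevents stray illumination from contaminating readings taken while the stage is still moving.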