# Water Aliasing (Project 57)

Team Members: Atreyee Roy (aroy10), Siddharth Sharma (srsharm2)
TA: Luke Wendt
Documents: design_document0.pdf

During the pitching session, our TA Luke Wendt pitched the Water Aliasing project, which we immediately took an interest in. The basic idea is to create an optical illusion by matching the frequency of a stream of water droplets to the frequency of a strobe light. Although this experiment has been performed many times before, we plan to add features that make the design more interactive for the user, so it does more than simply make the droplets levitate or move up and down.

Hardware Description:

We will build the strobe-light circuit for panels mounted on the sides of our system so that it can illuminate multiple columns of water. We will design a panel of LEDs that turns on only for short bursts so that individual droplets appear sharp. We plan to build this strobe circuit from comparators, high-current transistors (>10 A), and regulators, along with other generic components as the design develops; these are needed mainly to generate the short bursts of light (very narrow drive pulses). We plan to generate and control the strobe frequency with a signal generator, so it is easy to set or hold the frequency needed for any particular mode (up, down, or still).
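As a rough sketch of the pulse-timing requirement (the 2% duty cycle here is an assumed value, not from the design), the LED on-time per flash follows from the strobe frequency and duty cycle:

```python
def strobe_pulse_width_us(strobe_hz: float, duty_cycle: float = 0.02) -> float:
    """Return the LED on-time in microseconds for one strobe period.

    A small duty cycle (an assumed 2% here) keeps each flash short, so a
    falling droplet moves only a tiny distance while lit and appears sharp.
    """
    period_us = 1e6 / strobe_hz     # one strobe period in microseconds
    return period_us * duty_cycle   # on-time per flash

# Example: an 80 Hz strobe at a 2% duty cycle flashes for 250 us per period.
```

The shorter the flash, the sharper the frozen droplets, at the cost of brightness, which is why the high-current transistors matter.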

For the water, we plan to create four columns, each fed by a pipe from a water pump so the water is recirculated, with each column driven at an independent frequency by a small speaker. We looked into piezoelectric benders (coin-type speakers), which are tiny, allow their frequency and start/stop times to be controlled with high precision, and are cheaper than the alternatives.
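The illusion itself comes from the small difference between the droplet frequency and the strobe frequency. A minimal sketch of that relationship (frequency values in the comments are illustrative only):

```python
def apparent_drift_hz(drop_hz: float, strobe_hz: float) -> float:
    """Apparent rate at which droplets drift from one flash to the next.

    If the strobe exactly matches the droplet frequency, every flash catches
    a droplet in the same position and the stream appears frozen.  A strobe
    slightly slower than the droplets makes them appear to fall slowly; a
    strobe slightly faster makes them appear to rise (negative drift here).
    """
    return drop_hz - strobe_hz

# 80 Hz droplets, 80 Hz strobe   -> 0 Hz drift (stream appears still)
# 80 Hz droplets, 79.5 Hz strobe -> +0.5 Hz (slow apparent downward motion)
```

Because all columns share one strobe, each column's perceived motion is set entirely by its own speaker frequency relative to that strobe.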

Software Description:

We plan to integrate a user interface into this project: a simple app with three buttons, arranged one below the other, in four columns, one for each column of water. Through the app, the user can control the up, down, or still motion of each column independently, so each column can move differently at the same time. The app transmits commands over Bluetooth to our controller, which adjusts the speaker frequencies as required; this behavior will be implemented in our controller firmware.
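A minimal sketch of the controller-side command handling (the mode names, strobe frequency, and offsets below are hypothetical placeholders, not values from the design): each received command selects a mode for one column, and the controller derives that column's speaker frequency relative to the shared strobe.

```python
STROBE_HZ = 80.0  # assumed shared strobe frequency (placeholder value)

# Hypothetical per-mode speaker-frequency offsets relative to the strobe.
MODE_OFFSET_HZ = {
    "still": 0.0,   # match the strobe: droplets appear frozen
    "down":  0.5,   # slightly faster than the strobe: slow apparent fall
    "up":   -0.5,   # slightly slower than the strobe: slow apparent rise
}

def speaker_frequency(column: int, mode: str) -> float:
    """Return the drive frequency for one column's piezo speaker."""
    if column not in range(4):
        raise ValueError("columns are numbered 0-3")
    if mode not in MODE_OFFSET_HZ:
        raise ValueError(f"unknown mode: {mode}")
    return STROBE_HZ + MODE_OFFSET_HZ[mode]
```

With a dispatch table like this, adding a new motion mode is just a new entry in the table rather than a firmware rewrite, which is the scaling property described below.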

We expect that once we can transmit even one of these commands to the controller, supporting the rest will be a matter of scaling, and the user will be able to play with the app controls to make the water move as desired.

Important concerns regarding safety:

Since this is a water-based project, we plan to build it at a small scale so it is easy to work on and eventually portable. We may encase the entire system in a glass box with the strobe lights on the sides. We think this will add aesthetic value while keeping the system contained and safe, minimising spillage.


Our team consists of an electrical engineer and a computer engineer, with backgrounds in microelectronic circuit design, control systems, artificial intelligence, and machine learning, along with strong coding experience.
