52: Coil Gun Control System and UI

Area Award: Teamwork and Collaboration

Team Members: Adwaita Dani (aadani2), Felipe Fregoso (fregoso2), Bryan Mbanefo (mbanefo2)
TA: Luke Wendt
Documents: design_document0.pdf

Project Description:
In previous semesters, groups working with Prof. Reinhard and Chris Barth designed and implemented the SCRs and related power circuits for the coil gun. A control system that synchronizes the firing of the SCRs to accelerate the projectile through the barrel still needs to be implemented. There is also a need to design a user interface to operate the gun effectively.

Overview of proposed solution:
VCSELs (semiconductor lasers) coupled with photodiodes have been installed along the barrel of the gun to accurately measure the velocity of the moving projectile. The data from these sensors will be used as inputs to a microcontroller/FPGA, where an algorithm will generate trigger outputs to energize the magnetic coils on the barrel at the right moments. We could also use a time-of-flight sensor to obtain an additional set of position data for the projectile.
The UI will be implemented using a separate controller with an attached screen. It will display details of each shot, such as power consumed, projectile barrel velocity, etc.
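The timing logic described above can be sketched as follows. This is a minimal illustration, not the actual firmware: the sensor spacing, coil offset, and function names are all assumed values for the example.

```python
# Sketch: estimating projectile velocity from two beam-break timestamps
# and scheduling the next coil trigger. The sensor spacing, coil offset,
# and all names below are illustrative assumptions, not design values.

SENSOR_SPACING_M = 0.02   # assumed distance between adjacent VCSEL/photodiode pairs
COIL_OFFSET_M = 0.05      # assumed distance from the second sensor to the next coil

def estimate_velocity(t_first: float, t_second: float,
                      spacing: float = SENSOR_SPACING_M) -> float:
    """Velocity (m/s) from the time the projectile takes to cross two beams."""
    dt = t_second - t_first
    if dt <= 0:
        raise ValueError("second beam must break after the first")
    return spacing / dt

def trigger_delay(velocity: float, coil_offset: float = COIL_OFFSET_M) -> float:
    """Seconds to wait after the second beam break before firing the coil,
    assuming roughly constant velocity over the short coil offset."""
    return coil_offset / velocity

v = estimate_velocity(0.000, 0.001)   # beams broken 1 ms apart
print(v)                  # 20.0 m/s
print(trigger_delay(v))   # 0.0025 s
```

In the real system this computation would run on the microcontroller/FPGA between beam-break interrupts, with the delay loaded into a hardware timer that fires the SCR gate.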

VoxBox Robo-Drummer

Craig Bost, Nicholas Dulin, Drake Proffitt

Featured Project

Our group proposes to create a robot drummer that responds to human "beatboxing" input from a conventional dynamic microphone and translates it into the corresponding drum-hit performance. For example, if the human user issues a bass-kick voice sound, the robot will recognize it and strike the bass drum; likewise for the hi-hat/snare and clap. Our design will minimally cover three different drum hit types (bass hit, snare hit, clap hit) and respond with minimal latency.

This would involve amplifying the analog signal (as dynamic mics produce fairly low-level signals), which would be sampled by a dsPIC33F DSP/MCU (or comparable chipset) and processed for trigger-event recognition. This entails applying short-time Fourier transform (STFT) analysis to provide spectral-content data to our event-detection algorithm (i.e., recognizing the "control" signal from the human user). The MCU functionality of the dsPIC33F would be used for relaying the trigger commands to the actuator circuits controlling the robot.
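The STFT-based detection step can be sketched as below, written in NumPy for clarity rather than dsPIC33F C. The frame size, sample rate, energy threshold, and the band split used to separate "bass" from "snare" hits are all illustrative assumptions.

```python
# Sketch of STFT-based drum-hit classification: compare low-band vs
# high-band spectral energy per frame. All constants are assumptions,
# not values from the actual design.
import numpy as np

FRAME = 256          # samples per STFT frame (assumed)
FS = 8000            # sample rate in Hz (assumed)
THRESHOLD = 1.0      # energy threshold for declaring a hit (assumed)

def detect_events(signal: np.ndarray):
    """Classify each frame as 'bass', 'snare', or None (no hit)."""
    events = []
    window = np.hanning(FRAME)
    for start in range(0, len(signal) - FRAME + 1, FRAME):
        frame = signal[start:start + FRAME] * window
        spectrum = np.abs(np.fft.rfft(frame)) ** 2
        low = spectrum[: FRAME // 8].sum()    # roughly 0-1 kHz at FS=8000
        high = spectrum[FRAME // 8 :].sum()   # everything above
        if low + high < THRESHOLD:
            events.append(None)               # too quiet: no hit
        elif low > high:
            events.append("bass")
        else:
            events.append("snare")
    return events

# A 100 Hz burst should read as a bass hit; a 3 kHz burst as a snare hit.
t = np.arange(FRAME) / FS
bass_burst = np.sin(2 * np.pi * 100 * t)
snare_burst = np.sin(2 * np.pi * 3000 * t)
print(detect_events(np.concatenate([bass_burst, snare_burst])))
# → ['bass', 'snare']
```

On the dsPIC33F the same idea would run fixed-point, frame by frame off the ADC, with the classification result driving the solenoid actuator outputs.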

The robot in question would be small, about the size of a ventriloquist dummy. The "drum set" would be scaled accordingly (think pots and pans, like a child would play with). Actuators would likely be based on solenoids, as opposed to motors.

Beyond these minimal capabilities, we would add analog prefiltering of the input audio signal and amplification of the drum hits as bonus features, if the development and implementation process goes better than expected.
