Project 75: Camera Gimbal System

Team Members: Girish Chinnadurai Manivel, Harrison Liao
TA: Ugur Akcal
Documents: design_document1.pdf, final_paper1.pdf, other1.pdf, presentation1.pdf, proposal1.pdf, video1.mp4
# Camera Gimbal System

Team Members:
Girish Manivel (ggc2),
Harrison Liao (hzliao2)

# Problem

A major problem in video recording is shaky footage. Take the forward direction as +y, the right direction as +x, and the up direction as +z. Shaky footage results from the camera rotating in small increments about the +y and +x axes. For example, hold out your hand with the palm facing forward and pretend it is a camera. Tilting the hand left and right, as if waving hello, rotates it about the y axis; this is roll. Bending the hand up and down at the wrist rotates it about the x axis; this is pitch.

# Solution

Camera stabilization, countering shifts in pitch and roll, is the key to solving this issue. To do this, we will build a camera gimbal that keeps the camera steady relative to an initial starting orientation. When a button is pressed, the gimbal reads an initial orientation from a gyroscope sensor; this reading is passed through an encoder to the microcontroller. Two servo motors, driven by the microcontroller, then maintain the initial orientation by counter-rotating against any shift in pitch and roll, keeping the camera stable.
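The counter-rotation idea above can be sketched as a single function: the servo is commanded to its saved center angle minus the measured deviation, so the platform's net rotation stays near zero. This is a minimal illustration, not firmware from our design documents; the function name and the 0–180 degree clamp (the HS-311's travel) are assumptions.

```cpp
#include <algorithm>

// Illustrative counter-rotation rule (assumed, not from the design docs).
// servo_center_deg: servo angle (degrees) at the saved initial orientation.
// deviation_deg: gyroscope-measured drift away from that orientation.
// Returns the angle to command so the camera rotates back toward center,
// clamped to the servo's 0-180 degree range.
int servo_command(int servo_center_deg, int deviation_deg) {
    return std::clamp(servo_center_deg - deviation_deg, 0, 180);
}
```

The same rule is applied independently on the pitch and roll axes, one servo each.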

# Solution Components

## Power Subsystem

The purpose of this subsystem is to supply power to all other subsystems and to turn the device on and off.

Components:
2x AA battery EN91,
Dual AA battery holder 12BH322B-GR,
+5 Volt Regulator LM7805ACT-ND


## Control Subsystem

The purpose of this subsystem is to actuate our motors in order to mimic a gyroscopic gimbal. A microcontroller interprets data from the gyroscope sensor and sets control inputs for the motors.

Components:
Arduino Nano Microcontroller B003YVL34O,
2x Servo Motor HS-311,
Push Button MPB-43
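Hobby servos like the HS-311 are positioned by the width of a control pulse. As a rough sketch of what the microcontroller must produce, the mapping below assumes a typical range of about 600 µs at 0° to 2400 µs at 180°; the exact endpoints for the HS-311 would need to be confirmed against its datasheet.

```cpp
// Assumed linear angle-to-pulse-width mapping for a standard hobby servo.
// Endpoints (600 us and 2400 us) are typical values, not HS-311 specs.
int angle_to_pulse_us(int angle_deg) {
    return 600 + (angle_deg * (2400 - 600)) / 180;
}
```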

## Sensor Subsystem

To understand the purpose of this subsystem, it helps to first understand the mechanics of the device. The user holds a handle; the control end of the handle houses the motor for pitch, and directly above it a second motor controls roll. The gyroscope is attached at the base of these motors, and the far end of the handle serves as the modular camera platform.

Components:
Gyroscope Sparkfun SEN-11977
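A gyroscope reports angular rate, not angle, so the firmware must integrate the rate over time to track how far the platform has drifted from the saved orientation. The sketch below shows that integration step under assumed names and units; note that pure gyro integration accumulates drift, which is a limitation we would need to characterize.

```cpp
// Illustrative gyro integration step (names and units are assumptions):
// angle accumulates as rate (deg/s) times the sample period (s).
float integrate_angle(float angle_deg, float rate_deg_per_s, float dt_s) {
    return angle_deg + rate_deg_per_s * dt_s;
}
```

Running this once per sensor sample at, say, 100 Hz (dt_s = 0.01) yields the deviation angle fed to the servo control loop.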

# Criterion For Success

## Camera is stabilized from rotating around the +x axis (PITCH)

This can be tested by isolating the servo motor responsible for pitch and observing how it counter-rotates as the gyroscope is moved.

## Camera is stabilized from rotating around the +y axis (ROLL)

This can be tested by isolating the servo motor responsible for roll and observing how it counter-rotates as the gyroscope is moved.

## User Interface (buttons) work (one button)

First button press: power on and begin reading the gyroscope sensor.
Second button press: save the current gyroscope reading as the accepted orientation and enter gimbal mode.
Third button press: power off.
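The three-press cycle above amounts to a small state machine. The enum and function names below are illustrative, not taken from our design documents:

```cpp
// One-button state cycle: Off -> ReadingSensor -> Stabilizing -> Off.
// Names are illustrative; each press advances to the next state.
enum class GimbalState { Off, ReadingSensor, Stabilizing };

GimbalState next_state(GimbalState s) {
    switch (s) {
        case GimbalState::Off:           return GimbalState::ReadingSensor; // 1st press: on, read gyro
        case GimbalState::ReadingSensor: return GimbalState::Stabilizing;   // 2nd press: save reading, gimbal mode
        case GimbalState::Stabilizing:   return GimbalState::Off;           // 3rd press: power off
    }
    return GimbalState::Off; // unreachable; satisfies the compiler
}
```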
