| # | Title | Team Members | TA | Documents | Sponsor |
|---|-------|--------------|----|-----------|---------|
| 16 | Handheld Rocket Tracker | Ben Olaivar, Manas Tiwari, Max Kramer | Sanjana Pingali | final_paper1.pdf, other1.pdf, proposal3.pdf, video | |
# Handheld Rocket Tracker

Team Members:
- Ben Olaivar (olaivar3)
- Max Kramer (mdk5)
- Manas Tiwari (manast2)

# Problem

Locating a rocket after a launch can be difficult. When the rocket reaches apogee (peak altitude), it deploys parachutes and glides back to the ground, often landing several miles from the launch site (check out this video from the Illinois Space Society). Some tracking solutions exist, such as altimeters and radio beacons, but they all suffer from similar issues: they are clunky, unintuitive, or expensive. Radio beacons do not transmit their exact location; they are tracked by following the strength of their signal, which only gives the general direction of the beacon. Altimeters do transmit their exact location, but they are costly ($380+) and often require a laptop to receive their position, which is inconvenient to carry during a search. A few handheld trackers exist, but they are costly ($475+), difficult to reconfigure, and unintuitive. Additionally, all of these solutions are limited to tracking a single device.

# Solution

We want to make a two-part tracking system: a tracking beacon (referred to as a “puck” or “beacon”) and a handheld tracking device (referred to as the “tracker”). The beacon will be placed inside the rocket and will continuously transmit its coordinates. On the receiving end, the tracker will compare its own GPS location with the coordinates received from the beacon. To make this intuitive, the tracker will display the direction to the beacon (as an arrow on the screen) as well as the distance to it.
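
As a sketch of the tracker’s core computation, the distance and bearing to the beacon can be derived from the two GPS fixes using the standard haversine and initial-bearing formulas, and the on-screen arrow angle follows by subtracting the user’s heading. The function names below are illustrative only, not the project’s actual firmware:

```c
#include <math.h>

#define EARTH_RADIUS_M 6371000.0
#define DEG2RAD(d) ((d) * M_PI / 180.0)
#define RAD2DEG(r) ((r) * 180.0 / M_PI)

/* Great-circle (haversine) distance between two lat/lon points, in meters. */
double distance_m(double lat1, double lon1, double lat2, double lon2) {
    double dlat = DEG2RAD(lat2 - lat1);
    double dlon = DEG2RAD(lon2 - lon1);
    double a = sin(dlat / 2) * sin(dlat / 2) +
               cos(DEG2RAD(lat1)) * cos(DEG2RAD(lat2)) *
               sin(dlon / 2) * sin(dlon / 2);
    return EARTH_RADIUS_M * 2.0 * atan2(sqrt(a), sqrt(1.0 - a));
}

/* Initial bearing from the tracker to the beacon, degrees clockwise from north. */
double bearing_deg(double lat1, double lon1, double lat2, double lon2) {
    double dlon = DEG2RAD(lon2 - lon1);
    double y = sin(dlon) * cos(DEG2RAD(lat2));
    double x = cos(DEG2RAD(lat1)) * sin(DEG2RAD(lat2)) -
               sin(DEG2RAD(lat1)) * cos(DEG2RAD(lat2)) * cos(dlon);
    return fmod(RAD2DEG(atan2(y, x)) + 360.0, 360.0);
}

/* Angle the on-screen arrow should point, relative to where the user is facing. */
double arrow_angle_deg(double bearing, double user_heading) {
    return fmod(bearing - user_heading + 360.0, 360.0);
}
```

Subtracting the magnetometer heading from the bearing means the arrow points toward the beacon regardless of which way the tracker is held.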

# Solution Components

## Subsystem 1: Microcontroller Processor (both beacon and tracker)
This subsystem houses the project's codebase. On the tracker, its main tasks are driving the display and handling button inputs from the user.
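
The description above implies a simple main loop on the tracker: read the sensors, compute the arrow angle and distance, refresh the screen, and poll the buttons. The sketch below assumes the helpers from the Solution section plus placeholder driver functions; none of these names come from the actual codebase.

```c
#include <stdbool.h>

/* Placeholder types and drivers standing in for the real firmware API. */
typedef struct { double lat, lon; } gps_fix_t;

extern gps_fix_t gps_read(void);                 /* tracker's own GPS fix            */
extern gps_fix_t radio_latest_fix(void);         /* last coordinates from the beacon */
extern double    magnetometer_heading_deg(void); /* user's heading, in degrees       */
extern void      display_update(double arrow_deg, double dist_m);
extern bool      button_pressed(int button_id);
extern void      radio_set_frequency(double mhz);
extern double    next_frequency(void);

/* Helpers from the Solution sketch above. */
extern double distance_m(double lat1, double lon1, double lat2, double lon2);
extern double bearing_deg(double lat1, double lon1, double lat2, double lon2);
extern double arrow_angle_deg(double bearing, double user_heading);

void tracker_loop(void) {
    for (;;) {
        gps_fix_t self   = gps_read();
        gps_fix_t beacon = radio_latest_fix();

        double dist  = distance_m(self.lat, self.lon, beacon.lat, beacon.lon);
        double arrow = arrow_angle_deg(
            bearing_deg(self.lat, self.lon, beacon.lat, beacon.lon),
            magnetometer_heading_deg());

        display_update(arrow, dist);  /* draw the arrow and distance on the screen */

        if (button_pressed(0)) {      /* example: cycle the radio frequency */
            radio_set_frequency(next_frequency());
        }
    }
}
```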

## Subsystem 2: TRACKING SENSORS
This subsystem consists of all sensors/peripherals required for determining the location of, and direction to, the beacon from the tracker.
- **GPS Module (both):** Provides longitude and latitude for both the beacon and the tracker.
- **GPS Antenna (both):** For connecting to GPS satellites.
- **Magnetometer (tracker):** For measuring the user's heading.
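
For the magnetometer, a minimal heading computation could look like the following. It assumes the tracker is held roughly level (no tilt compensation) and ignores local magnetic declination, both of which a real build would likely need to address:

```c
#include <math.h>

/* Compass heading in degrees clockwise from magnetic north, from the horizontal
 * magnetometer components. Assumes the sensor's X axis points forward and its
 * Y axis points to the user's right; the exact axis convention depends on the
 * chosen part and how it is mounted. */
double heading_deg(double mag_x, double mag_y) {
    double heading = atan2(-mag_y, mag_x) * 180.0 / M_PI;
    if (heading < 0.0) {
        heading += 360.0;  /* normalize to 0-360 degrees */
    }
    return heading;
}
```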

## Subsystem 3: COMMUNICATION SYSTEM
The entire project depends on reliable communication between the beacon(s) and the tracker. We therefore need the following components so that the tracker can listen on a chosen frequency and the beacon(s) can transmit on that same frequency (a hypothetical packet layout is sketched after this list).
- **Transceiver (both):** Required for generating and receiving the signal between the beacon and the tracker.
- **Antenna (both):** Mid-range antenna capable of transmitting/receiving over a 3-5 mile range. Can be replaced with better antennas in the future.
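
One possible over-the-air format for the beacon's coordinate broadcasts is sketched below. The field layout, fixed-point scaling, and checksum are assumptions for illustration; the real packet structure will depend on the transceiver and protocol we choose:

```c
#include <stdint.h>

/* Hypothetical packet the beacon could transmit a few times per second.
 * Latitude/longitude are sent as fixed-point integers (degrees * 1e7), which
 * keeps the packet small while preserving typical GPS precision. */
typedef struct __attribute__((packed)) {
    uint8_t  beacon_id;  /* distinguishes beacons if more than one is flown */
    int32_t  lat_1e7;    /* latitude in degrees * 1e7  */
    int32_t  lon_1e7;    /* longitude in degrees * 1e7 */
    uint16_t crc;        /* integrity check over the fields above */
} beacon_packet_t;
```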

## Subsystem 4: BATTERY AND POWER SUPPLY
Create a battery management system that supplies a consistent 3.3 V to the necessary sensors and MCU.
- **LiPo Batteries (tracker):** 3.7 V. Compact, long-lasting, and readily available.
- **Voltage Regulator (tracker):** Regulates the battery voltage down to the 3.3 V required by the sensors/MCU.
- **Battery Holder (tracker):** Holds the batteries.

## Subsystem 5: DATA DISPLAY
This is the screen used to display all the information the user needs to track their beacons with the tracker.
- **E-Ink Display:** Displays the compass arrow, frequency, and distance data.

# Criterion For Success

- Primary Criterion: Demonstrate that the “beacon” (or “puck”) can be found by an end user guided by the tracker’s on-screen information.

- Additional Criterion: Demonstrate the ability to change the frequency at which the beacon and tracker communicate.

# GitHub Link

https://github.com/ben-olaivar/ECE445_software
