# **Smart Autochasing Lamp (Project 49)**

**TA:** Luoyan Li

**Documents:** design_document1.pdf, design_document2.pdf, final_paper1.pdf, other1.pptx, proposal1.pdf, proposal2.pdf, video

# **Team Members**

Feiyang Liu (feiyang5)

Yiyan Zhang (yiyanz3)

Jincheng Yu (jy54)


# **Problem**

When performing precise tasks at a desk, such as soldering or assembling LEGO, the position of the lamp is often a source of frustration. Shadows cast by your hands can obscure the parts you are looking for, and small components held in your hands may not be sufficiently illuminated, leading to discomfort and inefficiency. Furthermore, my ceiling light broke last week, and I have had to rely solely on a desk lamp for illumination; in such a dark environment, the full brightness of the desk lamp is overwhelming and strains my eyes. There is a need for a desk lamp that can adjust its brightness and color temperature according to ambient light conditions. Additionally, traditional ways of controlling desk lamps are inconvenient, often forcing us to interrupt our workflow to make adjustments.

# **Solution**

We propose a smart desk lamp whose head is carried by a mechanical arm built from several servo motors and equipped with a camera. The lamp captures images and communicates with a computer for image processing: it identifies human hands and moves the lamp head closer to, and at an angle to, the hands as they move, minimizing large shadows on the desk. Through a photoresistor, it responds to changes in ambient light. The camera also detects specific hand gestures, such as spreading the thumb and forefinger apart to increase brightness or pinching them together to decrease it. Gestures can likewise control the computer, for example to play music, which we believe is simpler than voice input.
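
As an illustration of the gesture side, below is a minimal computer-side sketch of the pinch-distance idea, assuming MediaPipe Hands and OpenCV for capture; the landmark indices are MediaPipe's thumb and index fingertips, while the distance-to-brightness mapping constants are placeholders, not values from our design.

```python
# Sketch: map thumb-forefinger distance to a brightness level (0-100).
# Assumes MediaPipe Hands and OpenCV; mapping constants are illustrative.
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        lm = results.multi_hand_landmarks[0].landmark
        thumb, index = lm[4], lm[8]  # thumb tip and index fingertip landmarks
        dist = ((thumb.x - index.x) ** 2 + (thumb.y - index.y) ** 2) ** 0.5
        # Map a normalized pinch distance (~0.02 closed, ~0.25 open) to 0-100.
        brightness = max(0, min(100, int((dist - 0.02) / 0.23 * 100)))
        print("brightness:", brightness)  # would be forwarded to the ESP32
```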

# **Subsystems**

## Mechanical Arm Subsystem:

Three servo motors, each paired with a linear potentiometer for position feedback, drive the movement of the mechanical arm, together with the supporting driver circuitry. To avoid interference from the lamp's own light source, the light-sensitive element sits behind a small aperture on the arm. Its readings are communicated to the central control subsystem.
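
A minimal sketch of the low-level actuation, written in MicroPython for the ESP32; the GPIO pins and the 0.5–2.5 ms servo pulse range are assumptions for typical hobby servos, not values from the design.

```python
# Sketch (MicroPython on ESP32): drive one arm servo, read the photoresistor.
# Pin numbers and the 0.5-2.5 ms pulse range are assumptions.
from machine import Pin, PWM, ADC

servo = PWM(Pin(13), freq=50)   # 50 Hz hobby-servo PWM on an assumed pin
ldr = ADC(Pin(34))              # photoresistor voltage divider, assumed pin
ldr.atten(ADC.ATTN_11DB)        # full 0-3.3 V input range

def set_angle(deg):
    """Map 0-180 degrees to a 0.5-2.5 ms pulse (duty out of 1023 at 50 Hz)."""
    pulse_ms = 0.5 + (deg / 180) * 2.0
    servo.duty(int(pulse_ms / 20 * 1023))

set_angle(90)           # center the joint
ambient = ldr.read()    # raw 0-4095 ambient-light reading
```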

## Lighting and Camera Subsystem:

The bulb, adjustable in both color temperature and brightness, receives its instructions from the central control subsystem. A camera is positioned near the bulb for better target tracking, and the captured images are sent to the central controller.
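
One common way to get adjustable color temperature is to PWM-mix a warm-white and a cool-white LED channel; the sketch below assumes that arrangement (the pin numbers are placeholders), with the mixing ratio setting warmth and the total duty setting brightness.

```python
# Sketch (MicroPython on ESP32): set brightness and color temperature by
# mixing warm-white and cool-white LED channels. Pins are assumptions.
from machine import Pin, PWM

warm = PWM(Pin(25), freq=1000)   # warm-white LED driver input
cool = PWM(Pin(26), freq=1000)   # cool-white LED driver input

def set_light(brightness, warmth):
    """brightness 0.0-1.0; warmth 0.0 (coolest) to 1.0 (warmest)."""
    total = int(brightness * 1023)
    warm.duty(int(total * warmth))
    cool.duty(int(total * (1 - warmth)))

set_light(0.6, 0.7)   # e.g. dim, warm light for a dark room
```
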
## Central Control Subsystem:

This subsystem integrates the ESP32 module and the necessary I/O modules. It processes the images captured by the camera, determines how far each motor in the mechanical arm should move so that the bulb tracks the hand, and recognizes specific gestures for adjusting the bulb's parameters. It can also communicate with a computer to control specific programs remotely.
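
The ESP32-to-computer link could be as simple as a line-oriented serial protocol; here is a computer-side sketch using pyserial, where the port name and the command format are assumptions, not a finalized protocol.

```python
# Sketch (computer side): forward gesture commands to the ESP32 over serial.
# The port name and the one-line text protocol are assumptions.
import serial

link = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)

def send(command, value):
    link.write(f"{command} {value}\n".encode())   # e.g. "BRIGHT 60"

send("BRIGHT", 60)   # set brightness to 60%
send("TRACK", 1)     # enable hand-tracking mode
```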

# **Standard of Success**

- When tracking mode is activated, the bulb follows the hand's movement to an appropriate position.
- As the ambient light changes, the bulb adjusts to an appropriate brightness and color temperature.
- The lamp can be switched on and off, and its brightness adjusted, through gestures.
- Specific programs (such as Spotify) can be opened on the computer through hand gestures (see the sketch below).
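
For the last criterion, mapping a recognized gesture to a desktop action can be a small dispatch table on the computer side; this sketch is illustrative only, and the gesture names and the `spotify` launcher command on the PATH are assumptions.

```python
# Sketch: launch a desktop program when a named gesture is recognized.
# Gesture names and launcher commands are hypothetical placeholders.
import subprocess

GESTURE_ACTIONS = {
    "open_palm": ["spotify"],               # open the music player
    "two_finger_swipe": ["xdg-open", "."],  # open a file browser
}

def on_gesture(name):
    cmd = GESTURE_ACTIONS.get(name)
    if cmd:
        subprocess.Popen(cmd)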
