Project 39: Hand Gesture Controlled Audio Effects System

Team Members:
Sarthak Singh (singh94)
Zachary Baum (zbaum2)
Sergio Bernal (sergiob2)

TA: Zicheng Ma

Documents: design_document1.pdf, design_document2.pdf, final_paper1.pdf, photo1.png, photo2.png, presentation1.pptx, proposal1.pdf

Problem
Audio production, in both amateur and professional settings, lacks intuitive, hands-free control over audio effects. This restricts users' creativity and efficiency, particularly in live performance or other situations where physical interaction with equipment is difficult.

Solution Overview
Our project aims to develop a gesture-controlled audio effects processor. This device will allow users to manipulate audio effects through hand gestures, providing a more dynamic and expressive means of audio control. The device will use motion sensors to detect gestures, which will then adjust various audio effect parameters in real-time.

Solution Components:

Gesture Detection Subsystem:
The Gesture Detection Subsystem uses a camera to track hand movements and orientations. The camera connects to a Raspberry Pi, which then sends signals to our custom PCB. The system processes sensor data in real time, minimizing latency and filtering out inaccuracies. Users can customize gesture-to-effect mappings, allowing for personalized control schemes. This subsystem is integrated with the audio processing unit, ensuring that gestures are seamlessly translated into the desired audio effect alterations.
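The customizable gesture-to-effect mapping described above could be sketched as a simple lookup table. The gesture names, effect names, and parameter names below are illustrative assumptions for this sketch, not the final design:

```python
# Illustrative sketch of a customizable gesture-to-effect mapping.
# Gesture, effect, and parameter names are assumptions for this example.
GESTURE_MAP = {
    "swipe_up":   ("reverb", "wet_mix"),
    "swipe_down": ("delay",  "feedback"),
    "fist_close": ("reverb", "decay"),
}

def gesture_to_update(gesture, intensity):
    """Translate a detected gesture plus a 0..1 intensity into a
    parameter-update message for the audio processing subsystem."""
    if gesture not in GESTURE_MAP:
        return None  # unmapped gestures are ignored
    effect, param = GESTURE_MAP[gesture]
    # Clamp intensity so noisy tracking cannot push a parameter out of range.
    level = max(0.0, min(1.0, intensity))
    return {"effect": effect, "param": param, "value": level}

print(gesture_to_update("swipe_up", 0.8))
```

Editing the table is all a user would need to do to personalize the control scheme, which keeps the mapping logic separate from both the camera pipeline and the DSP.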


Audio Processing Subsystem:

The Audio Processing Subsystem uses DSP algorithms to modify audio signals in real time. It includes audio effects such as reverb and delay, whose parameters change based on the hand gestures detected by the Gesture Detection Subsystem, letting users shape these effects easily through gestures alone. Specifically, we are using an STM32 microcontroller on a custom PCB to handle this subsystem.
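A feedback delay, one of the effects named above, can be sketched in a few lines. This is a minimal reference model, not the STM32 firmware: a real implementation would run sample-by-sample in C on the microcontroller, and the delay length and feedback gain here are arbitrary illustrative values.

```python
def delay_effect(samples, delay, feedback):
    """Minimal feedback delay: y[n] = x[n] + feedback * y[n - delay].
    `samples` is a list of floats; `delay` is a length in samples."""
    out = []
    for n, x in enumerate(samples):
        y = x
        if n >= delay:
            y += feedback * out[n - delay]  # feed a past output back in
        out.append(y)
    return out

# A single impulse produces a decaying train of echoes.
print(delay_effect([1.0, 0.0, 0.0, 0.0, 0.0], delay=2, feedback=0.5))
```

In the full system, a gesture would adjust `delay` or `feedback` between processing blocks, which is what makes the effect feel expressive rather than static.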

Control Interface Subsystem:
The Control Interface Subsystem in our audio effects processor provides a user-friendly interface for displaying current audio effect settings and other relevant information. This subsystem includes a compact screen that shows the active audio effects, their parameters, and the intensity levels set by the gesture controls. It is designed for clarity and ease of use, ensuring that users can quickly glance at the interface to get the necessary information during live performances or studio sessions.

Power Subsystem:

The Power Subsystem for our audio effects processor is simple and direct. It plugs into a standard AC power outlet and includes a power supply unit that converts AC to the DC voltages needed for the processor, sensors, and control interface. This design ensures steady and reliable power, suitable for long use periods, without the need for batteries.
Criteria for Success:
Our solution will enable users to intuitively control multiple audio effects in real time through gestures. The device will be responsive, accurate, and capable of differentiating between a wide range of gestures. It will be compatible with a variety of audio equipment and settings, from studio to live performance.

Alternatives:

Existing solutions are predominantly foot-pedal or knob-based controllers. These are limiting in terms of the range of expression and require physical contact. Our gesture-based solution offers a more versatile and engaging approach, allowing for a broader range of expression and interaction with audio effects.

S.I.P. (Smart Irrigation Project)

Featured Project

Jackson Lenz, James McMahon

Our project is to be a reliable, robust, and intelligent irrigation controller for use in areas where reliable weather prediction, water supply, and power supply are not found.

Upon completion of the project, our device will be able to determine the moisture level of the soil, the water level in a water tank, and the temperature, humidity, insolation, and barometric pressure of the environment. It will perform some processing on the observed environmental factors to determine whether rain can be expected soon. Comparing this prediction with the dampness of the soil and the amount of water in reserve, it will either trigger a command to begin irrigation or maintain a command not to irrigate the fields. This device will allow farmers to make much more efficient use of precious water while also avoiding letting crops dehydrate.
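The irrigation decision described above can be sketched as a small rule: irrigate only when the soil is dry, water remains in reserve, and no rain is expected. The threshold values below are illustrative assumptions, not the project's calibrated settings.

```python
def should_irrigate(soil_moisture, tank_level, rain_expected,
                    moisture_threshold=0.30, min_tank_level=0.10):
    """Decide whether to begin irrigation. All levels are fractions in
    0..1. The thresholds (30% soil moisture, 10% tank reserve) are
    illustrative assumptions for this sketch."""
    if rain_expected:
        return False   # let the forecast rain water the field
    if tank_level < min_tank_level:
        return False   # preserve the last of the water reserve
    return soil_moisture < moisture_threshold

# Dry soil, ample reserve, no rain forecast: irrigate.
print(should_irrigate(0.15, 0.80, rain_expected=False))
```

Keeping the rule in one pure function like this would make it easy to test against logged sensor data before deploying to the field.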

In developing nations, power is also of concern because it is not as readily available as power here in the United States. For that reason, our device will incorporate several amp-hours of energy storage in the form of rechargeable, maintenance-free, lead acid batteries. These batteries will charge while power is available from the grid and discharge when power is no longer available. This will allow for uninterrupted control of irrigation. When power is available from the grid, our device will be powered by the grid. At other times, the batteries will supply the required power.

The project is titled S.I.P. because it will reduce water wasted and will be very power efficient (by extremely conservative estimates, able to run for 70 hours without input from the grid), thus sipping on both power and water.
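A 70-hour runtime figure of this kind follows from dividing stored energy by average load. The battery capacity and load figures below are assumptions chosen for illustration, not the team's measured values:

```python
# Back-of-envelope runtime estimate. The 12 V / 7 Ah battery and the
# ~1.2 W average load are illustrative assumptions, not measured values.
battery_voltage_v = 12.0
battery_capacity_ah = 7.0
average_load_w = 1.2

energy_wh = battery_voltage_v * battery_capacity_ah  # energy stored, in Wh
runtime_h = energy_wh / average_load_w               # hours off-grid
print(round(runtime_h, 1))
```

With these assumed numbers the estimate comes out to 70 hours; a conservative estimate like the one quoted would derate the usable capacity further to account for lead-acid depth-of-discharge limits.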

We welcome all questions and comments regarding our project in its current form.

Thank you all very much for your time and consideration!