Project 39: Hand Gesture Controlled Audio Effects System

TA: Zicheng Ma

Documents: design_document1.pdf, design_document2.pdf, final_paper1.pdf, photo1.png, photo2.png, presentation1.pptx, proposal1.pdf

Team Members:
Sarthak Singh (singh94)
Zachary Baum (zbaum2)
Sergio Bernal (sergiob2)

Problem
Audio production, in both amateur and professional settings, lacks intuitive, hands-free control over audio effects. This limitation restricts users' creativity and efficiency, particularly in live performance or in situations where physically interacting with equipment is difficult.

Solution Overview
Our project aims to develop a gesture-controlled audio effects processor. The device will allow users to manipulate audio effects through hand gestures, providing a more dynamic and expressive means of audio control. A camera-based gesture detection system will track hand movements, which will then adjust various audio effect parameters in real time.

Solution Components:

Gesture Detection Subsystem:
The Gesture Detection Subsystem uses a camera to track hand movements and orientations. The camera connects to a Raspberry Pi, which processes the video and sends control signals to our custom PCB. The subsystem processes the camera data in real time, minimizing latency and filtering out spurious detections. Users can customize gesture-to-effect mappings, allowing for personalized control schemes. This subsystem is integrated with the Audio Processing Subsystem, ensuring that gestures are seamlessly translated into the desired audio effect changes.
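
As a rough illustration of the data flow on the Raspberry Pi side, the sketch below tracks one hand with the camera and streams a single gesture parameter to the PCB over a serial link. The use of MediaPipe and OpenCV, the serial port name, and the three-byte message format are working assumptions for this sketch, not finalized design choices.

```python
import cv2
import mediapipe as mp
import serial

# UART link to the STM32 PCB (port name and baud rate are assumptions)
ser = serial.Serial("/dev/ttyS0", 115200)
hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        continue
    # MediaPipe expects RGB input; OpenCV captures BGR
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        lm = results.multi_hand_landmarks[0].landmark
        # Crude "hand openness" metric: wrist (landmark 0) to index fingertip (landmark 8)
        openness = ((lm[8].x - lm[0].x) ** 2 + (lm[8].y - lm[0].y) ** 2) ** 0.5
        value = max(0, min(255, int(openness * 512)))  # scale to one byte
        ser.write(bytes([0xAA, 0x01, value]))          # header, effect id, parameter value
```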


Audio Processing Subsystem:

The Audio Processing Subsystem uses DSP algorithms to modify the audio signal in real time. It implements effects such as reverb and delay, whose parameters change based on the hand gestures detected by the Gesture Detection Subsystem, so users can adjust the effects through gestures alone. Specifically, we are using an STM32 microcontroller on a custom PCB to run this subsystem.
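
To illustrate the kind of real-time processing involved, the sketch below implements a basic feedback delay in Python/NumPy, with the delay time and feedback amount standing in for gesture-controlled parameters. The production version would be written in C for the STM32; the parameter names and ranges here are placeholders.

```python
import numpy as np

def delay_effect(x, sample_rate, delay_ms=300.0, feedback=0.4, mix=0.5):
    """Feedback delay line; delay_ms and feedback would be driven by gesture values."""
    d = int(sample_rate * delay_ms / 1000.0)  # delay length in samples
    buf = np.zeros(d)                         # circular delay buffer
    y = np.zeros(len(x))
    idx = 0
    for n in range(len(x)):
        delayed = buf[idx]                    # sample written d steps ago
        y[n] = x[n] + mix * delayed           # mix dry and delayed signal
        buf[idx] = x[n] + feedback * delayed  # feed part of the output back in
        idx = (idx + 1) % d
    return y
```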

Control Interface Subsystem:
The Control Interface Subsystem in our audio effects processor provides a user-friendly interface for displaying current audio effect settings and other relevant information. This subsystem includes a compact screen that shows the active audio effects, their parameters, and the intensity levels set by the gesture controls. It is designed for clarity and ease of use, ensuring that users can quickly glance at the interface to get the necessary information during live performances or studio sessions.
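
A minimal sketch of how the interface state could be modeled and formatted for a compact screen is shown below; the data structure, field names, and text layout are illustrative only, since the display hardware and driver are not fixed here.

```python
from dataclasses import dataclass

@dataclass
class EffectState:
    name: str        # e.g. "Reverb", "Delay"
    active: bool     # toggled by a gesture
    intensity: int   # 0-100, set by the gesture controls

def render_lines(effects):
    """Format the current effect settings as short text lines for the screen."""
    lines = []
    for fx in effects:
        marker = "*" if fx.active else " "
        lines.append(f"{marker}{fx.name:<8}{fx.intensity:>3}%")
    return lines

# e.g. render_lines([EffectState("Reverb", True, 62), EffectState("Delay", False, 30)])
```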

Power Subsystem:

The Power Subsystem for our audio effects processor is simple and direct. It plugs into a standard AC power outlet and includes a power supply unit that converts AC to the DC voltages needed for the processor, sensors, and control interface. This design ensures steady and reliable power, suitable for long use periods, without the need for batteries.
Criteria for Success:
Our solution will enable users to intuitively control multiple audio effects in real time through gestures. The device will be responsive, accurate, and capable of reliably distinguishing a wide range of gestures. It will be compatible with a variety of audio equipment and settings, from the studio to live performance.

Alternatives:

Existing solutions are predominantly foot-pedal or knob-based controllers. These limit the range of expression and require physical contact with the hardware. Our gesture-based solution offers a more versatile and engaging approach, allowing a broader range of expression and interaction with audio effects.
