Project #41: Robot Controller through Gestures
Team Members: Eric Zhou, Guang Yin, Haoduo Yan
TA: Hanyin Shao
Documents: design_document1.pdf
# Robot Controller through Gestures

Team Members:

- Eric Zhou (yirui2)
- Haoduo Yan (haoduoy2)
- Guang Yin (guangy2)

## Problem

Traditionally, robots from different companies are controlled through their own specialized controllers, and it takes time to learn to use each one smoothly and naturally.

## Solution

We plan to design a robot controller that recognizes human gestures and sends corresponding commands to robots. The Human Positioning System reads the sensors placed on the user's body, calculates the positions of the user's body parts, and broadcasts the results over Bluetooth. The Gesture Controlling System translates the user's gestures and movements into actual robot control actions. The Robot Feedback System converts warnings from the robot into signals that humans can sense (e.g., vibration and light).

## Solution Components

### Human Positioning System

Multiple IMUs are deployed on the user's joints to detect relative positions; each reports exact orientation data (in degrees or radians). An embedded processor worn by the user computes the positions of body parts (e.g., legs, arms, fingers) from these readings and broadcasts the results over a wireless connection (Bluetooth). Although the output is mainly used to control the motion of a pre-built robot, the software interface should be general enough for other systems to use the data easily.
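As a minimal sketch of the position calculation, assuming a hypothetical two-segment arm whose IMUs report absolute orientation angles in a single plane (the real system would use full 3D orientations and more joints), the embedded processor could chain segment vectors with simple forward kinematics:

```python
import math

# Assumed segment lengths in meters (hypothetical values for illustration)
UPPER_ARM_LEN = 0.30
FOREARM_LEN = 0.25

def arm_positions(shoulder_angle, elbow_angle):
    """Forward kinematics in the sagittal plane.

    shoulder_angle, elbow_angle: absolute orientations (radians) reported
    by the shoulder and forearm IMUs. Returns (elbow_xy, wrist_xy)
    relative to the shoulder.
    """
    elbow = (UPPER_ARM_LEN * math.cos(shoulder_angle),
             UPPER_ARM_LEN * math.sin(shoulder_angle))
    wrist = (elbow[0] + FOREARM_LEN * math.cos(elbow_angle),
             elbow[1] + FOREARM_LEN * math.sin(elbow_angle))
    return elbow, wrist

# Upper arm held straight out, forearm pointing up
elbow, wrist = arm_positions(0.0, math.pi / 2)
```

The computed positions, rather than raw orientations, would then be serialized and broadcast over Bluetooth.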

### Gesture Controlling System

These data are sent to a more powerful device (e.g., a PC) over a wireless connection, where the movements of the human body are reconstructed and used to recognize different gestures.

### Robot Feedback System

When a robot enters a special mode or encounters an unexpected event, it can send feedback to our controller. These messages can be presented via an LED, vibration, sound, or even a small screen.
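The mapping from robot messages to feedback outputs could be as simple as a dispatch table; the status names and output channels below are assumptions for illustration, not a finalized protocol:

```python
# Hypothetical mapping from robot status messages to controller feedback.
# Each entry lists the output channel(s) the wearable would drive.
FEEDBACK = {
    "LOW_BATTERY": [("led", "yellow")],
    "OBSTACLE":    [("vibrate", 200), ("led", "red")],   # 200 ms vibration pulse
    "ESTOP":       [("vibrate", 1000), ("led", "red"), ("buzzer", "alarm")],
}

def handle_status(message):
    """Return the list of feedback actions for a robot status message."""
    return FEEDBACK.get(message, [("led", "green")])  # default: all clear
```

Keeping this mapping in a table makes it easy to add new robot warnings or swap output channels (e.g., route everything to a small screen) later.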

## Criterion for Success

Apart from the optional objective, the device should:

- be lightweight enough to wear without fatiguing the user quickly;
- produce meaningful output that reflects the real-world motion of the user;
- feed its readings to an intermediate system (e.g., a PC) that converts gestures into corresponding, user-defined motion commands and sends them to the robot;
- provide smooth readings so that the controlled robot does not pose a threat to any human nearby;
- be equipped with kill switches, both user-controlled and automatically triggered, so that the user retains complete control over the whole system.
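One way to meet the smooth-readings requirement, sketched here with an assumed moving-average window (the real filter choice and window size would be tuned during testing), is to average the last few sensor readings before they drive the robot:

```python
from collections import deque

class Smoother:
    """Moving-average filter: damps IMU jitter so robot motion stays smooth."""

    def __init__(self, window=5):
        self.buf = deque(maxlen=window)  # drops oldest reading automatically

    def update(self, reading):
        self.buf.append(reading)
        return sum(self.buf) / len(self.buf)

s = Smoother(window=3)
outputs = [s.update(r) for r in (1.0, 1.0, 4.0)]
# A single spike to 4.0 is damped to 2.0 instead of being passed through.
```

A smoothing stage like this trades a little latency for stability; the kill switches remain the hard safety backstop regardless of filtering.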

To demonstrate that our controller works on at least one type of robot, we will borrow robots from the iRM robotics club.
