|# Robot Controller through Gestures
- Student 1 (yirui2)
- Student 2 (haoduoy2)
- Student 3 (guangy2)
Traditionally, different robots are controlled by specialized controllers from different companies, and it takes time to learn to use each of them smoothly and naturally.
We plan to design a robot controller that recognizes human gestures and sends corresponding commands to robots. The Human Positioning System reads sensors placed on the user's body, calculates the positions of the user's body parts, and broadcasts the results over Bluetooth. The Gesture Controlling System translates the user's gestures and movements into actual robot control actions. The Robot Feedback System converts warnings from the robot into signals humans can sense (e.g. vibration and light).
## Solution Components
### Human Positioning System
Multiple IMUs are deployed on the user's joints to measure relative positions; each reports exact orientation data (in degrees or radians). An embedded processor worn by the user computes the positions of body parts (e.g. legs, arms, fingers) from these readings, and the board then broadcasts the results over a wireless connection (Bluetooth). Although the results are mainly used to control the motion of a pre-built robot, the software interface should be general enough for other systems to utilize the data easily.
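As a rough illustration of the position calculation, the sketch below chains per-joint orientation readings into 2D joint positions for one arm. The segment lengths and the two-joint chain are illustrative assumptions, not the actual sensor layout.

```python
import math

# Hypothetical segment lengths for a two-link arm (upper arm, forearm), in metres.
SEGMENT_LENGTHS = [0.30, 0.25]

def arm_positions(joint_angles, lengths=SEGMENT_LENGTHS):
    """Accumulate per-joint orientations (radians) along the kinematic chain,
    with the shoulder at the origin, and return each joint's (x, y) position."""
    x, y, total_angle = 0.0, 0.0, 0.0
    positions = [(x, y)]
    for angle, length in zip(joint_angles, lengths):
        total_angle += angle  # each IMU reports a rotation relative to its parent joint
        x += length * math.cos(total_angle)
        y += length * math.sin(total_angle)
        positions.append((round(x, 4), round(y, 4)))
    return positions

# Arm held straight out: all joints lie along the x-axis.
print(arm_positions([0.0, 0.0]))
```

A full implementation would work in 3D with quaternions, but the same idea of accumulating orientations along the chain applies.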
### Gesture Controlling System
These data are sent to a more powerful device (e.g. a PC) via a wireless connection, where the movements of the human body are reconstructed and used to recognize different gestures.
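A minimal sketch of the recognition step might map reconstructed joint angles to user-defined gesture labels with simple thresholds; a real system would likely use a trained classifier. The gesture names, thresholds, and command meanings below are illustrative assumptions.

```python
import math

def classify_gesture(shoulder_angle, elbow_angle):
    """Map reconstructed joint angles (radians) to a user-defined gesture label."""
    if shoulder_angle > math.radians(70) and abs(elbow_angle) < math.radians(20):
        return "ARM_RAISED"   # e.g. could be mapped to "robot: move forward"
    if abs(shoulder_angle) < math.radians(20) and elbow_angle > math.radians(70):
        return "ELBOW_BENT"   # e.g. could be mapped to "robot: stop"
    return "NEUTRAL"          # no command sent

print(classify_gesture(math.radians(90), 0.0))  # ARM_RAISED
```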
### Robot Feedback System
When a robot enters a special mode or encounters an unexpected event, it can send feedback to our controller. These messages can be conveyed via LEDs, vibration, sound, or even a small screen.
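One way to organize this is a table mapping robot status codes to the outputs the wearer can sense. The status codes and output names here are illustrative assumptions:

```python
# Hypothetical mapping from robot status codes to wearable feedback actions.
FEEDBACK_MAP = {
    "LOW_BATTERY": {"led": "yellow", "vibration": "short_pulse"},
    "OBSTACLE":    {"led": "red",    "vibration": "long_pulse", "sound": "beep"},
    "MODE_CHANGE": {"led": "blue",   "screen": "show_mode_name"},
}

def feedback_for(status_code):
    """Return the feedback actions for a status, with a generic alert as fallback."""
    return FEEDBACK_MAP.get(status_code, {"led": "white", "vibration": "short_pulse"})

print(feedback_for("OBSTACLE"))
```

Keeping the mapping in one table makes it easy for users to redefine which warning drives which output.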
## Criterion for Success
Apart from the optional objectives, the device should be lightweight enough to be worn without quickly fatiguing the user. The controller should produce meaningful output that reflects the user's real-world motion. The readings should be gathered by an intermediate system (e.g. a PC) that converts gestures into corresponding, user-defined motion data and sends them to the robot. The system should also provide smooth readings so that the controlled robot does not pose a threat to any human nearby. Finally, it should be equipped with kill switches, both user-controlled and automatically triggered, so that the user retains complete control over the whole system.
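The smoothing and kill-switch behaviours above can be sketched together: an exponential moving average smooths noisy readings, and the channel trips either manually or automatically when a reading jumps implausibly. The smoothing factor and jump threshold are illustrative assumptions.

```python
class SafeChannel:
    """One sensor channel with smoothing and a dual-mode kill switch."""

    def __init__(self, alpha=0.3, max_jump=1.0):
        self.alpha = alpha        # EMA smoothing factor (assumed value)
        self.max_jump = max_jump  # auto-kill threshold, in the reading's units
        self.value = None
        self.killed = False

    def kill(self):
        """User-controlled kill switch."""
        self.killed = True

    def update(self, reading):
        """Return a smoothed reading, or None once the channel is killed."""
        if self.killed:
            return None
        if self.value is not None and abs(reading - self.value) > self.max_jump:
            self.killed = True    # automatic trigger on an implausible jump
            return None
        if self.value is None:
            self.value = reading
        else:
            self.value = self.alpha * reading + (1 - self.alpha) * self.value
        return self.value

ch = SafeChannel()
print(ch.update(0.0), ch.update(0.1))  # smooth normal readings
print(ch.update(5.0))                  # implausible jump trips the kill switch
```

Returning `None` after a trip forces the downstream controller to stop sending commands until the user explicitly resets the system.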
To demonstrate that our controller works with at least one type of robot, we will borrow robots from the iRM robotics club.