# Project

# | Title | Team Members | TA | Documents | Sponsor |
---|---|---|---|---|---|
25 | Infinity Control Gauntlet | Ashish Pabba, Chris Schodde, Ramakrishna Kanungo | Yifan Chen | design_document1.pdf, design_document2.pdf, final_paper1.pdf, final_paper2.pdf, final_paper3.pdf, other1.pdf, presentation1.pdf, proposal1.pdf | |
# Team

Ashish Pabba – apabba2, Chris Schodde – schodde3, Ramakrishna Kanungo – kanungo3

# Problem

For certain applications, conventional input devices such as the mouse and keyboard, or the conventional TV remote, pose numerous problems. In 3D modeling and CAD applications, rotation, zooming, and translation with a mouse are inconvenient and unintuitive, while typing on a TV using the remote is often aggravating. Alternatives exist, such as joysticks for the former and Bluetooth keyboards for the latter, but both options still lack intuitiveness.

# Solution Overview

Our proposed solution to this problem is a glove that allows the user to translate hand motion and finger gestures into commands and actions specific to the application. The three key aspects of our project are gesture/motion recognition, sensor data collection and command computation, and transmission. Considering the scale, timeline, and expectations of the 445 design project, we intend to implement the gesture/motion recognition on the glove, process it on our control subsystem, and then decide whether our full functionality or reduced functionality is feasible (as detailed in the Criteria for Success section).

# Criteria for Success

High-Level Goals:

- Recognition of directional motion and gestures involving finger movements.
- Mapping each of the actions to a command.
- Transmitting commands to a Bluetooth dongle.
- Writing a driver to receive commands and perform the desired action.
- Minimizing package size for aesthetic and functional value.
- Possible haptic feedback.

# Reduced Contingency Functionality

Our contingency plan, in the case of a return to fully online instruction and a consequent suspension of lab access, is to target TV remote control with hand gestures and motions.
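The first two goals (gesture recognition and mapping gestures to commands) could be sketched as a simple lookup: determine which fingers are bent from normalized flex-sensor readings, then match that set against a gesture table. The sensor names, threshold, and command names below are illustrative assumptions, not the team's actual design.

```python
# Hypothetical sketch: threshold-based finger-gesture classification.
# FLEX_THRESHOLD, finger names, and command names are illustrative assumptions.

FLEX_THRESHOLD = 0.6  # normalized flex reading above which a finger counts as "bent"

# Each gesture is the set of fingers that must be bent, mapped to a command string.
GESTURE_TABLE = {
    frozenset({"index"}): "ZOOM_IN",
    frozenset({"index", "middle"}): "ZOOM_OUT",
    frozenset({"thumb", "index", "middle", "ring", "pinky"}): "ROTATE_MODE",
}

def classify(readings):
    """Map normalized flex-sensor readings (0.0 = straight, 1.0 = fully bent)
    to a command string, or None if no gesture matches."""
    bent = frozenset(f for f, v in readings.items() if v > FLEX_THRESHOLD)
    return GESTURE_TABLE.get(bent)

print(classify({"thumb": 0.1, "index": 0.9, "middle": 0.2, "ring": 0.1, "pinky": 0.0}))
# prints "ZOOM_IN"
```

A table keyed on the set of bent fingers keeps the mapping declarative, so adding a gesture is one new entry rather than new branching logic.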
In this case, our high-level goals are:

- Decoding remote IR signals to understand the mapping of button presses to IR signal transmission.
- Recognition of directional motion and gestures involving finger movements.
- Transmitting commands in the form of an infrared signal akin to a remote signal.

Note: We might choose to reduce our project scope to the contingency functionality in case the driver programming proves too difficult or time-consuming. We assume that the driver is not essential to the electrical-engineering-based scope of the design project.

# Solution Components

Subsystem #1: (Glove) Ensuring the glove has the right number of sensors, laid out efficiently, to detect a range of motions that can each be translated into an operation. Further, wires from the various sensors should not hinder motion and need to be bundled into a harness that can be routed to the Control Unit.

Subsystem #2: (Control Unit) This subsystem consists of a microcontroller (probably a Raspberry Pi) that detects gestures based on the analog signals from the various sensors on the glove. Using a control loop/FSM, the gestures are detected, and the data is sent to the Host Device via Bluetooth.

Subsystem #3: (Host Device) The host device, which could be either a TV or a PC running CAD software, receives data from the Control Unit and performs the corresponding task on that platform.
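The Control Unit's control loop/FSM (Subsystem #2) could work roughly as follows: a classified gesture must be observed for several consecutive samples before its command is emitted, which debounces transient sensor states as the hand moves between gestures. This is a minimal sketch under assumed names and sample counts, not the team's actual implementation; the Bluetooth send is represented by the returned command string.

```python
# Hypothetical sketch of the Control Unit's gesture FSM. HOLD_SAMPLES and the
# gesture/command names are illustrative assumptions.

HOLD_SAMPLES = 3  # consecutive samples a gesture must persist before it is sent

class GestureFSM:
    def __init__(self):
        self.candidate = None   # gesture currently being observed
        self.count = 0          # consecutive samples of that gesture
        self.emitted = None     # last gesture already sent, to avoid repeats

    def step(self, gesture):
        """Feed one classified gesture (or None); return a command to send, or None."""
        if gesture != self.candidate:
            # Gesture changed: restart the hold counter for the new candidate.
            self.candidate, self.count = gesture, 1
            return None
        self.count += 1
        if gesture is not None and self.count == HOLD_SAMPLES and gesture != self.emitted:
            self.emitted = gesture
            return gesture  # stable gesture: send this command over Bluetooth
        return None

fsm = GestureFSM()
for g in [None, "ZOOM_IN", "ZOOM_IN", "ZOOM_IN", "ZOOM_IN"]:
    cmd = fsm.step(g)
    if cmd:
        print("send:", cmd)
# prints "send: ZOOM_IN" exactly once
```

Tracking the last emitted command keeps a held gesture from re-triggering every sample; the same loop body would run at a fixed rate on the microcontroller after reading and classifying the ADC channels.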