# 36: EyeCU - Assistive Eyewear

Team Members: Irfan Suzali (isuzali2), Abishek Venkit (avenkit2), Nikhil Mehta (nikhilm3)

TA: Shuai Tang

Documents: design_document1.pdf

Title: EyeCU

Problem: Living a normal life can be difficult for the visually impaired. Although people with visual impairments can be largely independent in most scenarios, there are still cases where assistance is required in day-to-day activities. An example of this is reading the ingredients on a package of food, or identifying a mysterious object. We would like to provide this market with another layer of connection to their surroundings.

Solution Overview: Our solution is assistive eyewear for people with vision impairments. The eyewear will include a camera to capture the user’s field of view, a Bluetooth module, and a battery to power the device. Captured images will be sent over Bluetooth to the user’s smartphone, which will compute the necessary contextual information about the scene and output it as audio, either through the phone’s speaker or through a pair of connected headphones. Additionally, we could add a microphone to the eyewear, allowing the user to issue voice commands specifying what type of information they want about their surroundings. This feature is not essential to the function of our product, but could be added given the time and need.

Solution Components:
Hardware Subsystem:
- Integrated eyewear including camera, Bluetooth module, and battery
- Dedicated circuit to store and transmit images via Bluetooth
- Low-power circuitry for 24+ hours of use

Software Subsystem:
- Bluetooth system to receive images
- AI/computer vision system on the smartphone to analyze images and generate contextual information (may be connected to a cloud classification service such as Azure)
- Text-to-speech to output audio information to the user
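
As a rough illustration of the software subsystem's flow, the sketch below receives an image, classifies it, and speaks the result. The `classify` and `speak` functions are hypothetical stand-ins for the eventual CV backend (e.g. an Azure classifier) and the phone's text-to-speech engine; none of these names come from a committed design.

```python
# Hypothetical sketch of the phone-side pipeline: frame in, spoken feedback out.
# classify() and speak() are placeholders for the real CV and TTS backends.

def classify(image_bytes: bytes):
    """Stand-in classifier; a real backend would return a label and confidence."""
    return "bear", 0.20  # placeholder result

def speak(text: str) -> None:
    """Stand-in for the platform text-to-speech engine."""
    print(text)

def handle_frame(image_bytes: bytes) -> str:
    """Process one frame received over Bluetooth and voice the result."""
    label, confidence = classify(image_bytes)
    message = f"There is a {label} in front of you with {confidence:.0%} confidence"
    speak(message)
    return message
```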

Secondary Subsystem (optional):
- Microphone built into the eyewear, also connected over Bluetooth
- Natural language processing to understand commands from the user
- Used to specify what type of feedback the user wants on their field of vision
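
A minimal sketch of how spoken commands could map to usage modes, assuming a simple keyword match rather than full natural language processing; the keyword table and mode names here are illustrative only.

```python
# Illustrative keyword-to-mode mapping for the optional voice-command feature.
# A production system would use real NLP rather than exact keyword matching.
COMMAND_KEYWORDS = {
    "read": "text-to-speech",
    "text": "text-to-speech",
    "identify": "object classification",
    "object": "object classification",
}

def parse_command(utterance: str):
    """Return the usage mode a spoken command asks for, or None if unrecognized."""
    for word in utterance.lower().split():
        if word in COMMAND_KEYWORDS:
            return COMMAND_KEYWORDS[word]
    return None
```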

Criterion for Success:
- A successful solution will improve the quality of life of the visually impaired; its success will be judged on the speed, accuracy, and usefulness of the glasses.
- The glasses must take clear, detailed photos of the surroundings to transmit to the phone.
- The processing of the images and response must be quick (under 2 seconds).
- The device allows the user to choose between different usage modes: text-to-speech, object classification, and possibly other modes.
- The feedback given to the user must be helpful (give them information otherwise unavailable to them). For text, the device must recite the text back to the user with at least 95% accuracy. For object classification, the device must recite the object back to the user along with its confidence (e.g., “There is a bear in front of you with 20% confidence”).
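
One way the 95% text-accuracy criterion could be checked during testing is a word-level comparison between the recited text and a reference transcript. The metric below (matching word subsequences via Python's `difflib`) is one possible choice, not a committed verification method.

```python
import difflib

def word_accuracy(recited: str, reference: str) -> float:
    """Fraction of reference words correctly recited, in order."""
    ref_words = reference.lower().split()
    hyp_words = recited.lower().split()
    if not ref_words:
        return 1.0
    # Count words shared between recitation and reference, respecting order.
    matcher = difflib.SequenceMatcher(None, ref_words, hyp_words)
    matched = sum(block.size for block in matcher.get_matching_blocks())
    return matched / len(ref_words)
```

A test run would pass the criterion when `word_accuracy(recited, reference) >= 0.95`.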

Idea Post Link:

# Control System and User Interface for Hydraulic Bike

Iain Brearton

Featured Project

Parker-Hannifin, a fluid power systems company, hosts an annual competition for the design of a chainless bicycle. A MechSE senior design team of mechanical engineers has created a hydraulic circuit with electromechanical valves, but needs a control system, user interface, and electrical power for their system. The user would be able to choose between several operating modes (fluid paths), listed at the end.

My solution to this problem is a custom-designed control system and user interface. Based on sensor feedback and user inputs, the system would change operating modes (fluid paths). Additionally, the system could be improved to suggest the best operating mode by implementing a PI or PID controller. For safety, the system would not change modes without user interaction: previous years' bicycles have gone faster than 20 mph.
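
The PI/PID suggestion idea could start from a textbook discrete PID loop like the sketch below. The gains and the speed setpoint are untuned placeholders, and mapping the controller output to a suggested fluid path would still be design work.

```python
class PID:
    """Textbook discrete PID controller; gains here are untuned placeholders."""

    def __init__(self, kp: float, ki: float, kd: float, setpoint: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measurement: float, dt: float) -> float:
        """One control step: return the correction for the current sample."""
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

A positive output (speed below setpoint) might prompt suggesting a power-adding mode, and a negative output a braking mode; that mapping is left open here.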

Previous approaches to this problem have usually not included an electrical engineer. As a result, several teams have historically used commercially-available systems such as Parker's IQAN system (link below) or discrete logic due to a lack of technical knowledge (link below). Apart from these two examples, very little public documentation exists on the electrical control systems used by previous competitors, but I believe that designing a control system and user interface from scratch will be a unique and new approach to controlling the hydraulic system.

I am aiming for a 1-person team as there are 6 MechSE counterparts. I emailed Professor Carney on 10/3/14 and he thought the general concept was acceptable.

Operating modes, simplified:

- Direct drive (rider's pedaling power goes directly to hydraulic motor)
- Coasting (no power input; motor input and output "shorted")
- Charge accumulators (store energy in expanding rubber balloons)
- Discharge accumulators (use stored energy to supply power to motor)
- Regenerative braking (use motor energy to charge accumulators)
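
The safety rule above (no mode change without user interaction) suggests a controller that can only recommend a mode, with the actual switch gated on a user input. The sketch below is an assumed structure, not the team's design; the mode names follow the list above, and the class and method names are hypothetical.

```python
# Sketch of a mode controller enforcing the "user must confirm" safety rule.
MODES = (
    "Direct drive",
    "Coasting",
    "Charge accumulators",
    "Discharge accumulators",
    "Regenerative braking",
)

class ModeController:
    """Tracks the active fluid path; only a user action can change it."""

    def __init__(self, initial: str = "Coasting"):
        self.mode = initial
        self.suggested = None

    def suggest(self, mode: str) -> None:
        """Record a recommendation (e.g. from a PID loop) without applying it."""
        if mode in MODES:
            self.suggested = mode

    def user_select(self, mode: str) -> str:
        """Apply a mode change only in response to an explicit user input."""
        if mode in MODES:
            self.mode = mode
        return self.mode
```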

Download Competition Specs:

Team using IQAN system (top right corner):

Team using discrete logic (page 19):