# 30: Sensing Instrument for Generating Haptic Touch (SIGHT)

Team Members: Dip Patel, Jamiel Abed, John Lee

TA: Dushyant Singh Udawat

Documents: design_document1.pdf, final_paper1.pdf, photo1.jpg, photo2.jpg, presentation1.pptx, proposal1.pdf, video1.MOV
# SIGHT

Team Members: Jamiel Abed (jabed2), Dip Patel (dippp2), Seung Lee (seungpl2)

## Problem

An estimated 39 million people are visually impaired and may face hardships sensing objects near them. Currently, the most common mobility aids are a walking cane, a guide dog, or a human guide. A walking cane requires the person to sweep thoroughly and constantly for obstacles and has a limited range, while guide dogs and human guides are not accessible to everyone.

## Solution overview

We propose an alternative tool for the visually impaired. SIGHT would warn the user of nearby obstacles that pose a tripping or collision hazard.

Using an array of ultrasonic sensors, we create a detection zone for these obstacles; detections are routed to a mesh of haptic pads worn by the user. SIGHT gives directional haptic feedback that tells the user which direction the potential hazard lies in. Using a Doppler-based filter, we can also limit haptic feedback to objects that are approaching the user.

SIGHT will be better than the current alternatives because it is more reliable and requires less physical effort from the user.

The user's point of view is mapped onto the XY plane of the haptic mesh (e.g., an object in the top left of the user's view is represented by the top-left region of the mesh). The Z dimension of the user's view (the depth of an object relative to the user) is encoded in the strength of the haptic touch.
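This mapping could be sketched as follows. This is a minimal illustration, assuming a 3x3 ultrasonic array and a 4x4 haptic mesh as described below; the maximum range, the linear intensity scale, and all function names are hypothetical and would be tuned experimentally:

```python
# Sketch: map a 3x3 grid of ultrasonic distance readings (meters)
# onto a 4x4 haptic mesh, where closer objects drive stronger vibration.
# MAX_RANGE and the nearest-neighbor upsampling are assumptions.

MAX_RANGE = 4.0  # meters; readings at or beyond this produce no feedback

def distance_to_intensity(d):
    """Closer obstacle -> stronger haptic output, on a 0.0-1.0 scale."""
    if d is None or d >= MAX_RANGE:
        return 0.0
    return 1.0 - d / MAX_RANGE

def map_sensors_to_pads(distances):
    """Upsample a 3x3 distance grid to 4x4 pad intensities (nearest neighbor)."""
    pads = [[0.0] * 4 for _ in range(4)]
    for r in range(4):
        for c in range(4):
            sr = min(r * 3 // 4, 2)  # nearest sensor row
            sc = min(c * 3 // 4, 2)  # nearest sensor column
            pads[r][c] = distance_to_intensity(distances[sr][sc])
    return pads

# Example: an obstacle 1 m away in the top-left of the field of view
readings = [[1.0, 4.0, 4.0],
            [4.0, 4.0, 4.0],
            [4.0, 4.0, 4.0]]
pads = map_sensors_to_pads(readings)
```

Here the top-left pads vibrate at 75% strength while the rest of the mesh stays off, matching the XY mapping described above.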

Case Examples:

Approaching a wall: The haptic pads will activate lightly if a wall is far away but approaching. As the user gets closer, the pads will vibrate more strongly, indicating that the wall is nearer.

Standing in front of a wall (not moving): The haptic pads would not activate in this case, since the wall is not moving relative to the user.
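Both wall cases reduce to a simple gating rule: feedback only when the object is closing on the user, with strength growing as distance shrinks. A minimal sketch, assuming a hypothetical maximum range and a linear intensity scale:

```python
MAX_RANGE = 4.0  # meters; hypothetical maximum detection range

def haptic_level(distance, approach_speed):
    """Return a pad intensity in [0.0, 1.0].

    approach_speed > 0 (from the Doppler module) means the object is
    closing on the user; stationary or receding objects are filtered out.
    """
    if approach_speed <= 0.0:          # e.g. standing still in front of a wall
        return 0.0
    if distance >= MAX_RANGE:          # out of range: nothing to report
        return 0.0
    return 1.0 - distance / MAX_RANGE  # closer -> stronger touch

far = haptic_level(3.5, 1.0)    # far but approaching wall: light activation
near = haptic_level(1.0, 1.0)   # close approaching wall: strong activation
still = haptic_level(1.0, 0.0)  # wall not moving relative to user: off
```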

## Solution Components

Ultrasonic Array: A square array of ultrasonic sensors (likely 3x3).

Haptic Mesh: A square array of haptic motor pads (likely 4x4).

Doppler Module: A Doppler module that can measure the relative velocity of objects, used for filtering.

## Criteria for Success
Criterion 1: The ultrasonic sensors must determine the general direction and depth of a hazard with reasonable accuracy.

Criterion 2: The haptic mesh must work in conjunction with the filtered sensor data and accurately activate the appropriate haptic pads.

Criterion 3: The processing must use the Doppler effect to filter out objects that are stationary relative to the user, as well as objects moving away from the user.

# Interactive Proximity Donor Wall Illumination

Team Members: Anita Jung (anitaj2), Sungmin Jang (sjang27), Zheng Liu (zliu93)

Link to the idea: https://courses.engr.illinois.edu/ece445/pace/view-topic.asp?id=27710

Problem:

The Donor Wall on the southwest side of the first floor of ECEB celebrates and thanks everyone who helped and donated to ECEB.

However, because of poor lighting and low color contrast between the copper names and the wall behind them, donor names do not stand out as much as they should, especially after sunset.

Solution Overview:

Here is the image of the Donor Wall:

http://buildingcampaign.ece.illinois.edu/files/2014/10/touched-up-Donor-wall-by-kurt-bielema.jpg

We are going to design and implement a dynamic, interactive illumination system for the Donor Wall by installing LEDs in the background. LEDs placed behind the names will softly illuminate each name; LEDs can also fill the transparent gaps in the "circuit board" pattern to allow for interaction and dynamic animation.

Our system would have two basic modes:

Default mode: When there is nobody near the Donor Wall, the names are softly illuminated from the back of each name block.

Moving mode: When the sensors detect a stimulus such as a person walking nearby, the LEDs animate "current" or "pulses" flowing through the "circuit board" into the name boards.

Depending on the progress of our project, we have some additional modes:

Pressing mode: When someone physically presses on a name block (detected by pressure sensors), the LEDs animate outgoing scattered light, as if a wave of light is emitted from that name block.

Solution Components:

Sensor Subsystem:

IR sensors (PIR modules, or IR LEDs with phototransistors) or ultrasonic sensors to detect the presence and proximity of people in front of the Donor Wall.

Pressure sensors to detect if someone is pressing on a block.

Lighting Subsystem:

Many LEDs will need to be installed on the PCBs to form the lighting subsystem. These should be hidden as much as possible so that people focus on the names rather than the LEDs.

Controlling Subsystem:

The controlling unit is the core of the system. We plan to use a microprocessor to process the signals from the sensors and drive the LEDs. Because the system has multiple modes, switching between them correctly is also an important part of the project.
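The mode switching above could be sketched as a small state machine. This is an illustrative sketch only: the boolean inputs stand in for digested sensor readings, and the priority ordering (pressing over moving over default) is an assumption:

```python
# Sketch: select the Donor Wall lighting mode from sensor inputs.
# `person_nearby` and `block_pressed` are hypothetical flags derived
# from the proximity and pressure sensors, respectively.

DEFAULT, MOVING, PRESSING = "default", "moving", "pressing"

def select_mode(person_nearby, block_pressed):
    """Pressing takes priority over moving; otherwise soft default lighting."""
    if block_pressed:
        return PRESSING  # animate light scattering from the pressed name
    if person_nearby:
        return MOVING    # animate "current" flowing through the board
    return DEFAULT       # soft back-lighting of each name
```

Keeping the mode decision in one pure function like this makes the switching logic easy to test independently of the LED-driving code.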

Power Subsystem:

An AC (wall outlet; 120 V, 60 Hz) to DC power adapter, or possibly an AC-DC converter circuit, supplying the DC voltage and current required by our circuit design.

Criteria for Success:

The whole system should work correctly in each mode and switch between modes correctly. The names should be highlighted in a comfortable, aesthetically pleasing way. This project is appropriate for senior design because it contains both hardware and software components dealing with signal processing, power, control, and circuit design with sensors.
