# Project 73: Indoor Navigation for the Visually Impaired

Team Members: Akhil Alapaty, Kushagra Tiwary, Saleh Ahmad

TA: David Hanley

Documents: final_paper1.pdf, presentation1.pdf, proposal1.pdf
Problem: Blind people need to navigate within indoor and confined spaces, such as apartments and apartment buildings, to get from location A to location B. These spaces are typically very tight, and GPS is not accurate enough to provide room-to-room navigation. Blind people can usually navigate locally with canes and guide dogs, but stairs, walls, and other obstacles in their path make moving between locations in an unfamiliar indoor environment harder, because it requires some form of direction. We aim to provide directional navigation to the visually impaired.

Solution: The blind person will input where he/she wants to go in the apartment (Living Room, Bathroom, etc.), and the device will provide voice feedback with rough directions to the destination.

We will create a “walkability map” of an apartment (our initial test environment) that includes all the areas a person can walk through (rooms, corridors, entrances, etc.). Through a system of Bluetooth beacons placed at particular locations inside each room and the appropriate corridors, we will be able to localize the blind person via a receiver worn around their neck or carried in a pocket. With the positions of the Bluetooth beacons and the receiver, we can pinpoint their location on the map and then provide directions with reasonable certainty.

Subsystems:

*see image for a clearer idea*

Bluetooth beacons and receivers:
The placement of these beacons around the apartment will be key to the accuracy of our model. Each beacon emits a signal, and by comparing the received signal strength (RSSI) against each beacon's known coordinates, we can form a rough distance estimate to each beacon. We can then solve the trilateration problem to get the coordinates of the receiver. We need to receive signals from at least three beacons at any given point in time to estimate the person's position to within 1-2 meters.
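The two steps above can be sketched in code. This is a minimal illustration, not the project's actual implementation: the RSSI-to-distance conversion assumes a standard log-distance path-loss model with made-up calibration values (`tx_power`, exponent `n`), and the trilateration linearizes the three range circles into a 2x2 system.

```python
import math

def rssi_to_distance(rssi, tx_power=-59.0, n=2.0):
    # Log-distance path-loss model: tx_power is the assumed RSSI at 1 m,
    # n is the path-loss exponent (~2 in free space, higher indoors).
    return 10 ** ((tx_power - rssi) / (10 * n))

def trilaterate(beacons, distances):
    """Solve for (x, y) given three beacon positions and range estimates.

    Subtracting the first circle equation from the other two cancels the
    quadratic terms, leaving a 2x2 linear system solved by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = beacons
    d1, d2, d3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-9:
        raise ValueError("beacons are collinear; position is ambiguous")
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Example: beacons at three corners of a room, receiver actually at (2, 1).
beacons = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]
dists = [math.dist((2.0, 1.0), b) for b in beacons]
print(trilaterate(beacons, dists))  # → approximately (2.0, 1.0)
```

With noisy real RSSI readings the three circles rarely intersect at a point, which is one source of the 1-2 meter uncertainty discussed below; averaging several RSSI samples per beacon before converting to distance helps.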

A 1-2 meter error, as the TA pointed out, might be a lot considering the tight spaces, and the errors will compound over time. To mitigate the inaccuracies, we will add more beacons and provide rough estimates of where, for example, a possible exit to another room will be, rather than exact distances. Some of the fine detail within the 1-2 meter range will be left to the blind person (we still expect them to use vision supplements like canes); however, we think this will be a huge upgrade over “the door is to the left,” since there will be greater context.

Microcontroller Module:
This module will be responsible for receiving the signals from each of the beacons and then transferring them to the software module, where the computation to solve the trilateration problem will happen.

Software Module:
This module solves the trilateration problem in 2D space. Since the space is small, this will not be too computationally expensive. Once we compute the proposed coordinates, they will be translated onto the map. We can use a simple pathfinding algorithm (BFS/DFS) or a more complex one like A* or RRT* (we want to keep the prototype simple), depending on performance. The user will then be given feedback in the form of speech, through speakers or earphones, to turn right or left after a set number of meters or steps.
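As a sketch of the simple option, breadth-first search over the walkability map finds the shortest room-to-room route by hop count. The map here is a hypothetical example apartment, not the real floor plan:

```python
from collections import deque

# Hypothetical walkability map: rooms/corridors as nodes, edges where a
# person can walk directly from one area to the other.
WALK_MAP = {
    "bedroom": ["hallway"],
    "bathroom": ["hallway"],
    "hallway": ["bedroom", "bathroom", "living_room"],
    "living_room": ["hallway", "kitchen"],
    "kitchen": ["living_room"],
}

def find_route(graph, start, goal):
    """Breadth-first search: returns the shortest route by hop count."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # no walkable route exists

print(find_route(WALK_MAP, "bedroom", "kitchen"))
# → ['bedroom', 'hallway', 'living_room', 'kitchen']
```

Each hop in the returned route would then be converted into a spoken instruction (e.g., “turn left in three steps”) by comparing the user's current heading against the direction of the next map edge.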

Communication Module:
This module will be responsible for sending voice feedback to the user. It will receive its input from the software module.

Power Module:
This module will route power from a source that is either a battery, an outlet connection, or a USB connection.


Criterion for Success:
A criterion for success would be for the system to work seamlessly, with one person able to navigate from one room to another using the rough voice directions provided by the device.

Reach Goals:
A reach goal would be to try this out in larger public indoor spaces, where the compounding of inaccuracies becomes more of an issue. We could use more powerful beacons for localization or explore different sensors to aid localization.

Link to original post:
https://courses.engr.illinois.edu/ece445/pace/view-topic.asp?id=32242

Interactive Proximity Donor Wall Illumination

Interactive Proximity Donor Wall Illumination

Featured Project

Team Members:

Anita Jung (anitaj2)

Sungmin Jang (sjang27)

Zheng Liu (zliu93)

Link to the idea: https://courses.engr.illinois.edu/ece445/pace/view-topic.asp?id=27710

Problem:

The Donor Wall on the southwest side of the first floor of ECEB celebrates and appreciates everyone who helped and donated to ECEB.

However, because of poor lighting and the low color contrast between the copper names and the wall behind them, donor names are not noticed as much as they should be, especially after sunset.

Solution Overview:

Here is the image of the Donor Wall:

http://buildingcampaign.ece.illinois.edu/files/2014/10/touched-up-Donor-wall-by-kurt-bielema.jpg

We are going to design and implement a dynamic and interactive illuminating system for the Donor Wall by installing LEDs on the background. LEDs can be placed behind the names to softly illuminate each name. LEDs can also fill in the transparent gaps in the “circuit board” to allow for interaction and dynamic animation.

Our project's system will contain two basic modes:

Default mode: When there is nobody near the Donor Wall, the names are softly illuminated from the back of each name block.

Moving mode: When sensors detect any stimulation such as a person walking nearby, the LEDs are controlled to animate “current” or “pulses” flowing through the “circuit board” into name boards.

Depending on the progress of our project, we may add additional modes:

Pressing mode: When someone physically presses on a name block, detected by pressure sensors, the LEDs animate scattering outgoing light, as if a wave of light were emitted from that name block.

Solution Components:

Sensor Subsystem:

IR sensors (PIR modules or IR LEDs with phototransistor) or ultrasonic sensors to detect presence and proximity of people in front of the Donor Wall.

Pressure sensors to detect if someone is pressing on a block.
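If ultrasonic sensors are used, presence detection reduces to converting the echo round-trip time into a distance and comparing it to a threshold. The sketch below assumes a generic trigger/echo-style ranger and a made-up 1.5 m trigger distance:

```python
def echo_to_distance_m(echo_seconds, speed_of_sound=343.0):
    # The ultrasonic pulse travels out and back, so halve the round-trip
    # time. 343 m/s is the speed of sound in air at roughly 20 °C.
    return speed_of_sound * echo_seconds / 2.0

def person_nearby(echo_seconds, threshold_m=1.5):
    # Hypothetical trigger condition: someone within 1.5 m of the wall.
    return echo_to_distance_m(echo_seconds) < threshold_m

print(echo_to_distance_m(0.01))  # 10 ms echo → 1.715 m
print(person_nearby(0.005))      # ~0.86 m → True
```

A PIR module would be simpler still (a single digital motion output), at the cost of losing the proximity information that a ranging sensor provides.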

Lighting Subsystem:

Many LEDs will need to be installed on the PCBs to form our lighting subsystem. These will be hidden as much as possible so that people focus on the names instead of the LEDs.

Controlling Subsystem:

The main part of the system is the controlling unit. We plan to use a microprocessor to process the signals from the sensors and send control signals to the LEDs. Because the system has different modes, switching between them correctly is also important for the project.
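The mode-switching logic can be sketched as a small state machine. This is an illustration only; the mode names come from the proposal, but the hold time (how long Moving mode persists after the last detection) and the priority of pressure events over proximity are assumed design choices:

```python
class DonorWallController:
    """Sketch of mode switching: DEFAULT (soft back-light), MOVING
    (animated pulses on proximity), PRESSING (ripple from a pressed block).
    """

    def __init__(self, hold_seconds=5.0):
        self.mode = "DEFAULT"
        self.hold = hold_seconds          # assumed: MOVING lingers 5 s
        self.last_motion = -float("inf")  # time of last proximity event

    def update(self, now, person_nearby, pressed_block=None):
        if pressed_block is not None:
            self.mode = "PRESSING"        # pressure overrides other modes
        elif person_nearby:
            self.mode = "MOVING"
            self.last_motion = now
        elif now - self.last_motion > self.hold:
            self.mode = "DEFAULT"
        else:
            self.mode = "MOVING"          # keep animating briefly
        return self.mode

ctrl = DonorWallController()
print(ctrl.update(0.0, person_nearby=True))    # → MOVING
print(ctrl.update(2.0, person_nearby=False))   # → MOVING (inside hold window)
print(ctrl.update(10.0, person_nearby=False))  # → DEFAULT
```

On the real microprocessor, `update` would run in the main loop with sensor readings as inputs, and the returned mode would select which LED animation routine to drive.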

Power Subsystem:

An AC (wall outlet; 120 V, 60 Hz) to DC power adapter, or possibly an AC-DC converter circuit, supplying a DC voltage and current appropriate for our circuit design.

Criterion for success:

The whole system should work correctly in each mode and switch between modes correctly. The names should be highlighted in a comfortable and aesthetically pleasing way. Our project is suitable for senior design because it contains both hardware and software parts dealing with signal processing, power, control, and circuit design with sensors.
