Wearable communication device for deaf/mute

Team members: Andrew Ko (hyunjun5), Minho Lee (minhol2), Yihan Ruan (yihanr2)

TA: Stasiu Chyczewski

Problem
- Socializing with hearing people is often an unpleasant experience for deaf or mute individuals, because most people are not familiar with sign language. To help with this, we propose a wearable, belt-like device that the user can simply wear over any clothing.

Solution Overview
- We will attach a stenographic keyboard and a speaker to the belt. We chose a stenographic keyboard because, once learned, it is fast enough to support real-time conversation. The attached speaker converts the typed text to speech for the listener.
- The user "listens" by reading. An extendable display module, which normally rests and charges in a wired dock on the side of the belt, performs speech-to-text with a built-in mic and displays the listener's words on screen for the user. The display module can also be handed to the person speaking.

Solution Components

[Stenographic keyboard]: Once accustomed to the stenographic keyboard, the user can type messages on it, which are then converted into speech.
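As a rough sketch of how the chord decoding could work: a stenographic keyboard emits simultaneous key chords rather than single letters, so each chord can be looked up in a dictionary that maps it to a word or phrase. The key names and the tiny dictionary below are illustrative assumptions, not a real steno theory.

```python
# Minimal sketch of stenographic chord decoding.
# The chords and entries here are made up for illustration.
STENO_DICT = {
    frozenset({"H", "E", "L", "O"}): "hello",
    frozenset({"T", "H", "K"}): "thank you",
}

def decode_chord(pressed_keys):
    """Map one simultaneous key chord to a word, or None if unknown."""
    return STENO_DICT.get(frozenset(pressed_keys))
```

Because the lookup is order-independent (a `frozenset`), it does not matter in what sequence the keys within one chord were registered.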

[Text-to-speech speaker]: It takes the text produced by the keystrokes and outputs the words through the speaker.
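One way this component could be structured (a sketch under our assumptions, not a final design): buffer the decoded words and flush complete phrases to a text-to-speech engine. The engine is injected as a plain callable so the buffering logic is independent of whichever TTS library we ultimately pick.

```python
class SpeechOutput:
    """Buffer decoded words and flush complete phrases to a TTS engine.

    `speak` is any callable taking a string (e.g. a wrapper around an
    offline TTS library); injecting it keeps this logic testable.
    """
    def __init__(self, speak):
        self.speak = speak
        self.buffer = []

    def add_word(self, word):
        """Queue one decoded word."""
        self.buffer.append(word)

    def flush(self):
        """Speak the buffered phrase, then clear the buffer."""
        if self.buffer:
            self.speak(" ".join(self.buffer))
            self.buffer = []
```

Flushing on a dedicated "end of phrase" chord, rather than per word, would keep the spoken output from sounding choppy.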

[Display module with speech recognition]: The built-in mic captures the listener's words and converts them to text, displaying them on the screen for the user to read. The screen interface is split in half: one side for the user and one for the listener.
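A sketch of the data model behind the split-screen interface (the class and field names are our assumptions): keeping the user's typed lines and the listener's transcribed lines in separate lists lets each half of the screen be rendered independently.

```python
from dataclasses import dataclass, field

@dataclass
class ConversationLog:
    """Two-party transcript backing the split-screen display:
    one list per half of the screen."""
    user_lines: list = field(default_factory=list)
    listener_lines: list = field(default_factory=list)

    def add(self, source, text):
        """Append a line to the correct half of the display."""
        if source == "user":
            self.user_lines.append(text)
        elif source == "listener":
            self.listener_lines.append(text)
        else:
            raise ValueError(f"unknown source: {source}")
```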

[Supply]: A battery supply providing power for the display module.

Criterion for Success

- The stenographic keyboard must allow the user to type as fast as a person speaks, and the text-to-speech pipeline must keep up so the words are output through the speaker with no noticeable delay.
- The display module must be equally fast at converting speech to text and displaying the words on screen with minimal delay.
- The display module's user interface must make it clear who said what and who typed what.
- The battery must last at least about 6 hours so the user can carry the device around without charging it too often. The user must also be able to power the device on and off to save battery life.
- The device must be light enough that wearing it is not a burden. If it gets too heavy, we will attach suspenders to the belt.
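The 6-hour target can be turned into a rough battery-sizing estimate. The average current draw and the 20% headroom below are assumptions for illustration, not measured figures.

```python
def required_capacity_mah(avg_current_ma, hours, margin=1.2):
    """Rough battery sizing: capacity = current x time, plus headroom
    for cell aging and conversion losses (the 20% margin is an
    assumption, not a measured figure)."""
    return avg_current_ma * hours * margin

# e.g. if the display, mic, and speaker average ~400 mA (assumed):
# required_capacity_mah(400, 6) -> 2880.0 mAh
```

So under these assumptions, a battery of roughly 3000 mAh would meet the 6-hour criterion with some margin.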

Interactive Proximity Donor Wall Illumination

Featured Project

Team Members:

Anita Jung (anitaj2)

Sungmin Jang (sjang27)

Zheng Liu (zliu93)

Link to the idea: https://courses.engr.illinois.edu/ece445/pace/view-topic.asp?id=27710

Problem:

The Donor Wall on the southwest side of the first floor of ECEB celebrates and acknowledges everyone who helped and donated to ECEB.

However, because of poor lighting and low color contrast between the copper name blocks and the wall behind them, the donor names are not as noticeable as they should be, especially after sunset.

Solution Overview:

Here is the image of the Donor Wall:

http://buildingcampaign.ece.illinois.edu/files/2014/10/touched-up-Donor-wall-by-kurt-bielema.jpg

We are going to design and implement a dynamic, interactive illumination system for the Donor Wall by installing LEDs in the background. LEDs placed behind the names will softly illuminate each name, and LEDs filling the transparent gaps in the "circuit board" pattern will allow for interaction and dynamic animation.

Our system has two basic modes:

Default mode: When nobody is near the Donor Wall, each name block is softly illuminated from behind.

Moving mode: When the sensors detect a stimulus, such as a person walking nearby, the LEDs animate "current" or "pulses" flowing through the "circuit board" into the name blocks.

Depending on the progress of our project, we have some additional modes:

Pressing mode: When someone physically presses on a name block, detected by pressure sensors, the LEDs animate outgoing scattered light, as if a wave of light were emitted from that name block.

Solution Components:

Sensor Subsystem:

IR sensors (PIR modules, or IR LEDs with phototransistors) or ultrasonic sensors will detect the presence and proximity of people in front of the Donor Wall.

Pressure sensors will detect when someone is pressing on a name block.
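A sketch of how raw proximity readings could be turned into a stable presence signal (the 150 cm threshold and the three-sample debounce are assumptions): requiring several consecutive in-range readings keeps a single noisy ultrasonic sample from triggering the moving mode.

```python
class PresenceDetector:
    """Debounce raw distance readings: declare presence only after
    `required` consecutive in-range samples. The threshold and sample
    count here are assumed values, to be tuned on the real wall."""
    def __init__(self, threshold_cm=150, required=3):
        self.threshold_cm = threshold_cm
        self.required = required
        self.count = 0

    def update(self, distance_cm):
        """Feed one reading; return True once presence is confirmed."""
        if 0 < distance_cm < self.threshold_cm:
            self.count += 1
        else:
            self.count = 0  # any out-of-range sample resets the debounce
        return self.count >= self.required
```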

Lighting Subsystem:

Many LEDs will be installed on PCBs to form the lighting subsystem. They are hidden as much as possible so that people focus on the names rather than on the LEDs themselves.

Controlling Subsystem:

The main part of the system is the controlling unit. We plan to use a microprocessor to process the signals from the sensors and send control signals to the LEDs. Because the system has multiple modes, switching between them correctly is also an important part of the project.
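The mode switching described above can be sketched as a small priority rule (the ordering is our assumption): a physical press outranks mere proximity, which outranks the idle default.

```python
def select_mode(proximity, pressed):
    """Pick the LED mode from sensor flags. Pressing takes priority
    over motion, which takes priority over the idle default
    (this ordering is our design assumption)."""
    if pressed:
        return "PRESSING"
    if proximity:
        return "MOVING"
    return "DEFAULT"
```

On the microprocessor, this function would be called once per control-loop tick with the latest debounced sensor flags, and the LED animation routine for the returned mode would run until the next tick.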

Power Subsystem:

An AC (wall outlet; 120 V, 60 Hz) to DC power adapter, or possibly a custom AC-DC converter circuit, supplying a DC voltage and current suitable for our circuit design.

Criterion for success:

- The whole system works correctly in each mode and switches between modes correctly.
- The names are highlighted in a comfortable and aesthetically pleasing way.
- The project is appropriate for senior design because it contains both hardware and software, covering signal processing, power, control, and circuit design with sensors.
