Project

# Title Team Members TA Documents Sponsor
15 Driver Sleep Detection and Alarming System
Area Award: Computer Vision
Chenyang Xu
Xiangyu Chen
Yixiao Nie
design_document0.pdf
final_paper0.pdf
presentation0.ppt
proposal0.pdf
While driving alone on the highway or over a long period of time, it is easy for the driver to fall asleep and cause an accident. Therefore, we came up with the idea of developing a driver anti-sleep alarm system that could effectively solve this problem.
Our system will use a camera (Kinect) to track the driver's eyes, send information back to a microcontroller, and sound an alarm when danger signals appear. The system therefore has four main parts: camera, microcontroller, power supply, and sound warning system.
In the following paragraphs we'll further discuss the image processing and face recognition algorithms and the hardware (power supply system and sound warning system).

Algorithms:
This project will use the Kinect camera, which has three modes: RGB mode, depth mode, and IR mode. The RGB mode is used for daytime detection, while the IR mode is used for night detection. The depth mode will not be used.
Basically, the emphasis of the algorithm is detecting the eye motion of the driver. When the driver feels sleepy, their eyelids droop closer together. The camera should be able to detect this change and then decide whether the driver is sleepy. During the daytime, the camera's RGB mode works well for detection. At night, however, RGB mode may perform poorly because of the low light. In that case we may use the IR mode to detect the driver's eyes. Another way to handle this is histogram equalization, an algorithm that expands the image's contrast. In addition, in order to run the algorithm, we may need to use an ARM board with a GPU that can handle image processing at high speed.
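To make the two ideas above concrete, here is a minimal NumPy sketch. Assumptions: the histogram-equalization routine operates on 8-bit grayscale frames, the per-frame closed-eye flags come from whatever eye detector runs on the Kinect stream, and the 70% closure threshold is illustrative rather than tuned.

```python
import numpy as np

def equalize_histogram(gray):
    """Expand the contrast of an 8-bit grayscale frame via its cumulative histogram."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]
    if cdf_min == gray.size:   # constant image: nothing to stretch
        return gray.copy()
    lut = np.round((cdf - cdf_min) / (gray.size - cdf_min) * 255).astype(np.uint8)
    return lut[gray]           # remap every pixel through the lookup table

def is_drowsy(closed_flags, threshold=0.7):
    """PERCLOS-style rule: flag drowsiness if the eyes were closed in more than
    `threshold` of the recent frames (flags produced by the per-frame eye detector)."""
    return sum(closed_flags) / len(closed_flags) > threshold
```

The equalization step would be applied to night-time frames before the eye detector runs; the closure-ratio rule then decides when to trigger the alarm.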

Hardware: The hardware system is composed of three parts. First, we will use a BeagleBoard-xM as a small computer to interact with the Kinect and control the LEDs and sound system. We will install a Linux distribution and the Kinect SDK on the BeagleBoard and implement our algorithms for face recognition and eyelid-closure detection.
The second part is the power supply unit for the BeagleBoard, the Kinect camera, and the PCB's microcontroller. Since the Kinect camera is connected to the BeagleBoard, it draws power directly from the board. The BeagleBoard in turn draws its power from the power supply unit we design, which must deliver approximately 10 W (with camera) according to the technical data on the BeagleBoard website. The power unit includes a voltage filter, a DC-DC converter, and the corresponding microcontroller. It has one USB input port and one USB output port: the output powers the camera and BeagleBoard, and the input receives power from a pre-purchased inverter connected to the car's power supply.

The third part is the feedback alarming system, including sound and light. The light warning is made up of 5 regular red LEDs, powered by the designed PCB, whose blink frequency can be adjusted depending on the driver's condition. The sound warning is implemented with a speaker or a buzzer, powered by another DC-DC converter at a different voltage on the PCB. The car's power supply can provide sufficient power, above 90 W. The PCB receives control signals from the BeagleBoard, draws power from the inverter, and supplies power to the BeagleBoard and camera. The planned board has a microcontroller for controlling the DC-DC converter and communicating with the alarming system.
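As a quick sanity check on the power budget above, the current the converter must draw from the car's nominal 12 V rail can be estimated as follows; the 85% converter efficiency is an assumed figure, not a measurement.

```python
# Sketch: input current drawn from the car's nominal 12 V rail to deliver
# the ~10 W BeagleBoard + Kinect load. The 85% DC-DC converter efficiency
# is an assumption for illustration, not a datasheet value.
def input_current_a(p_load_w, v_in=12.0, efficiency=0.85):
    return p_load_w / (efficiency * v_in)

print(round(input_current_a(10.0), 2))  # roughly 0.98 A at 12 V
```

Even with margin for the LEDs and buzzer, this stays far below the 90 W the car supply can provide.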

Possible challenges include the efficiency of the face and eyelid detection algorithms during both daytime and nighttime. The signal communication and interaction between the BeagleBoard, the camera, and the warning system will be another major obstacle. Moreover, stabilizing and filtering the voltage from the car's power supply is also an intricate experimental task.

Drum Tutor Lite

Zhen Qin, Yuanheng Yan, Xun Yu

Featured Project


Vision: Rhythm games such as Guitar Hero are much easier than playing actual drums. We want to make a drum tutor that makes playing drums as easy as Guitar Hero. The player is not required to read sheet music.

Description: We will build a drum add-on that will tutor people in how to play the drums. We will make a panel that gives a visual cue of the drum and beats, in a form similar to the Guitar Hero game. The panel can be an N×10 (N varying with the drum kit) LED bar array. Each horizontal row represents a beat, and each row above the bottom row represents an upcoming beat.
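As a sketch of how the scrolling beat display could work, assuming the song is encoded as a list of beat-rows with a 0/1 entry per drum pad (our assumption, not a finalized design):

```python
# Sketch: select which rows of the beat chart light up on the N x 10 panel.
# Row 0 of the returned window is the current beat (bottom LED row);
# later rows are the upcoming beats scrolling down toward the player.
def visible_window(chart, t, rows=10):
    window = list(chart[t:t + rows])
    n_pads = len(chart[0])
    # pad with empty (all-off) rows once the song runs out
    window += [[0] * n_pads for _ in range(rows - len(window))]
    return window
```

Advancing `t` by one on every beat makes the pattern scroll toward the bottom row, Guitar Hero style.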

There will be sensors on each drum that fire when the drum head is hit. The drums will be affixed with a ring of lights that indicates the player's timing and accuracy according to the sensors.
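The timing-and-accuracy feedback could be scored with a simple tolerance-window rule like the one below; the window widths are placeholder values to be tuned during testing, not measured figures.

```python
# Sketch: classify a drum hit by its timing error relative to the scheduled beat.
# Tolerance windows (in milliseconds) are illustrative placeholders.
def judge_hit(hit_time_ms, beat_time_ms, perfect_ms=50, good_ms=120):
    error = abs(hit_time_ms - beat_time_ms)
    if error <= perfect_ms:
        return "perfect"
    if error <= good_ms:
        return "good"
    return "miss"
```

The ring of lights on each drum would then show a different color per judgment.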

Of course, with the flip of a switch, the kit can become a simple light-up drum: when the player hits a drum, that particular drum lights up, giving cool effects.

The system will run on a microprocessor. For more versatile use, it could also be connected to a computer, and an app will be written for the tutor.

Project Videos