Project 15

# Driver Sleep Detection and Alarming System

Area Award: Computer Vision

Team Members: Chenyang Xu, Xiangyu Chen, Yixiao Nie

Documents: design_document0.pdf, final_paper0.pdf, presentation0.ppt, proposal0.pdf
While driving alone on a highway or for a long period of time, it is easy for the driver to fall asleep and cause an accident. We therefore propose a driver anti-sleep alarm system to address this problem.
Our system will use a camera (Kinect) to track the driver's eyes, send the information to a microcontroller, and sound an alarm when danger signs appear. The system thus has four main parts: camera, microcontroller, power supply, and sound warning system.
In the following paragraphs we'll discuss the image-processing and face-recognition algorithms, then the hardware (power supply system and sound warning system).

Algorithms:
This project will use the Kinect camera, which has three modes: RGB mode, depth mode, and IR mode. The RGB mode is used for daytime detection, while the IR mode is used for night detection. The depth mode will not be used.
The emphasis of the algorithm is detecting the eye motion of the driver. When the driver feels sleepy, the eyelids drift closed. The camera should be able to detect this change and then decide whether the driver is drowsy. During the daytime, the RGB mode of the camera works well for detection. At night, however, RGB mode may perform poorly because of the low light. In that case we may use the IR mode to detect the driver's eyes. Another way to handle low light is histogram equalization, an algorithm that spreads out the intensity distribution to increase contrast. In addition, to run the algorithm in real time, we may need an ARM board with a GPU that can handle image processing at high speed.
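The histogram-equalization step mentioned above is standard and library-independent; a minimal pure-Python sketch (operating on a grayscale image given as a list of rows of 0-255 intensities, a simplification of the real Kinect frames) looks like this:

```python
def equalize_histogram(image, levels=256):
    """Spread a dark/low-contrast grayscale image over the full
    intensity range using its cumulative histogram (CDF)."""
    flat = [p for row in image for p in row]
    n = len(flat)

    # Count how often each intensity occurs.
    hist = [0] * levels
    for p in flat:
        hist[p] += 1

    # Cumulative distribution of intensities.
    cdf, total = [], 0
    for count in hist:
        total += count
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)

    def remap(p):
        if n == cdf_min:          # flat image: nothing to equalize
            return p
        return round((cdf[p] - cdf_min) / (n - cdf_min) * (levels - 1))

    return [[remap(p) for p in row] for row in image]


# A very dark 2x2 frame gets stretched across the full 0-255 range.
dark = [[50, 50], [51, 52]]
print(equalize_histogram(dark))  # -> [[0, 0], [128, 255]]
```

In practice this would be applied per-frame before eye detection; an image library's built-in equalization on the board would serve the same role.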

Hardware: The hardware system is composed of three parts. First, we will use a BeagleBoard-xM as a small computer to interact with the Kinect and to control the LEDs and the sound system. We will install a Linux distribution and the Kinect SDK on the BeagleBoard and implement our face-recognition and eyelid-closure detection algorithms there.
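The per-frame eyelid-closure results then have to be turned into an alarm decision on the board. One common approach (not specified in the proposal, so an assumption here) is a PERCLOS-style rule: trigger when the fraction of recent frames with closed eyes exceeds a threshold. A minimal sketch, with the window length and threshold as assumed parameters:

```python
from collections import deque

class DrowsinessMonitor:
    """Sliding-window alarm decision: returns True when the fraction
    of closed-eye frames in the recent window reaches the threshold."""

    def __init__(self, window=30, threshold=0.7):
        self.frames = deque(maxlen=window)  # 1 = eyes closed, 0 = open
        self.threshold = threshold

    def update(self, eyes_closed):
        self.frames.append(1 if eyes_closed else 0)
        perclos = sum(self.frames) / len(self.frames)
        return perclos >= self.threshold


monitor = DrowsinessMonitor(window=10, threshold=0.7)
for frame_eyes_closed in [False, False, False, True, True]:
    alarm = monitor.update(frame_eyes_closed)
```

The windowed fraction filters out ordinary blinks, which close the eyes for only a few frames, while sustained closure pushes the ratio past the threshold.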
The second part is the power supply unit for the BeagleBoard, the Kinect camera, and the PCB's microcontroller. Since the Kinect camera is connected to the BeagleBoard, it draws power directly from the BeagleBoard. The BeagleBoard in turn draws its power from the supply unit we design, rated at approximately 10 W (with camera) according to the technical data on the BeagleBoard website. The power unit includes a voltage filter, a DC-DC converter, and a corresponding microcontroller. It has one USB input port and one USB output port: the output powers the camera and the BeagleBoard, while the input receives power from a pre-purchased inverter plugged into the car's power outlet.

The third part is the feedback alarm system, comprising sound and light. The light warning consists of 5 regular red LEDs powered by the designed PCB; their blink frequency can be adjusted according to the driver's condition. The sound warning is implemented with a speaker or a buzzer, powered by another DC-DC converter at a different voltage on the PCB. The car's power supply can provide more than 90 W, which is sufficient. The PCB receives control signals from the BeagleBoard, draws power from the inverter, and supplies power to the BeagleBoard and camera. The planned board has a microcontroller for controlling the DC-DC converters and communicating with the alarm system.
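As a sanity check, the power numbers above can be worked through in a short script. The 12 V car rail, the 5 V USB output, and the converter efficiency are assumed illustrative values, not figures from the proposal:

```python
# Rough power-budget check for the chain:
# car outlet (>= 90 W available) -> DC-DC converter -> BeagleBoard + Kinect (~10 W).

V_IN = 12.0        # assumed nominal car battery rail, volts
V_OUT = 5.0        # USB output voltage for BeagleBoard/Kinect, volts
LOAD_W = 10.0      # BeagleBoard + Kinect draw from the proposal, watts
EFFICIENCY = 0.85  # assumed DC-DC converter efficiency

def buck_duty_cycle(v_in, v_out):
    """Ideal (lossless, continuous-mode) buck converter duty cycle D = Vout / Vin."""
    return v_out / v_in

def input_power(load_w, efficiency):
    """Power drawn from the car rail to deliver load_w at the converter output."""
    return load_w / efficiency

duty = buck_duty_cycle(V_IN, V_OUT)          # ~0.42
p_in = input_power(LOAD_W, EFFICIENCY)       # ~11.8 W, far below the 90 W available
i_out = LOAD_W / V_OUT                       # ~2.0 A of 5 V output current
print(f"duty ~ {duty:.2f}, input power ~ {p_in:.1f} W, output current ~ {i_out:.1f} A")
```

Even with generous converter losses, the computed draw stays an order of magnitude under the 90 W the car outlet can supply, which supports the claim that the budget is sufficient.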

Possible challenges include the efficiency of the face and eyelid detection algorithms in both daytime and nighttime conditions. The communication and interaction among the BeagleBoard, the camera, and the warning system will be another major obstacle. Finally, stabilizing and filtering the voltage drawn from the car's power supply is also expected to be an intricate experimental task.

Featured Project: Amphibious Spherical Explorer

Kaiwen Chen, Junhao Su, Zhong Tan

The amphibious spherical explorer (ASE) is a spherical robot for home monitoring, outdoor adventure, or hazardous-environment surveillance. Thanks to its unique shape, the ASE can travel across land, desert, swamp, or even water by itself, or be cast into the mission area by another device (e.g. a slingshot). The ASE has a motion-sensing system based on an Inertial Measurement Unit (IMU) and a rotary magnetic encoder, which allows the internal controller to adjust its speed and attitude properly. The well-designed control system keeps the robot free of visible wobble during actions such as acceleration, deceleration, turning, and rest. The ASE is also a platform for research on control system design: the parameters of the internal controller can be assigned from an external control panel, a MATLAB Graphical User Interface (GUI) running on a computer that communicates with the robot over a WiFi network the robot itself generates. The robot's response can be recorded and sent back to the control panel for further analysis. This project is completely open source; people interested in the robot can extend it with more features, such as adding a camera for real-time surveillance or controller design based on machine learning.
