# Haptic Headset

Team Members:
- Tasho Madondo (madondo2)
- Isabella Huang (xhuang93)
- Danny Pellikan (djp8)

# Problem

Hearing is one of our most essential senses. It is the only sensory system that lets us know what is going on everywhere in our environment at once, a property that offers great advantages for survival: most alerts can be heard before they are ever seen. Deaf and hard-of-hearing individuals have lost those advantages and, as a result, lack the awareness of their environment that sound provides. We aim to mitigate some of the struggles of those with hearing loss.

# Solution

As a solution, rather than relying on the sense of sound, users can rely on the sense of touch to get the information they need from their immediate surroundings through directional haptic feedback. Haptic feedback is the use of vibration to convey information to the user (as in, for example, PlayStation controllers or phone notifications). The idea is to place individual vibration motors along the outer ring on each side of a pair of over-ear headphones or ear mufflers. When a sufficiently loud sound arrives from any direction, the individual motors vibrate in a pattern that gives the user a sense of the sound's direction. The goal of this device is to give the user a heads-up on where to look for the source of a sound, regardless of how little they can hear of their surroundings.

# Solution Components

## Subsystem 1: Audio Sensing/Directionality/Sound Detection

The device will use microphones to pick up sound from the surrounding environment. We currently have one idea for audio/directionality detection.

**Method 1, Multiple Unidirectional Microphones:** This method uses multiple small unidirectional microphones, pointing in each direction on each ear, to pick up audio from the surrounding environment. Each sensor corresponds to a direction, so that when a sensor is triggered, the appropriate vibration motors fire for that direction. The sound sensors would be positioned as follows: each earpiece (left and right) will have 9 sound sensors, covering the 8 directions around the ear (Front, Up, Down, Back, Front-Up, Front-Down, Back-Up, Back-Down) as well as the direction pointing directly away from the ear (directly to the left or directly to the right). A minimal detection sketch follows the diagram link below.

Diagram of Outer Piece with Unidirectional Microphones - [https://mediaspace.illinois.edu/media/t/1_khyavyq1](https://mediaspace.illinois.edu/media/t/1_khyavyq1)
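As a rough illustration of this method, the sketch below (in C, since the processing will run on a microcontroller) picks the loudest microphone on one earpiece once any channel crosses a loudness threshold. The struct layout, the ring indexing order, and the threshold value are our own assumptions, not settled design choices:

```c
/*
 * Minimal sketch of the direction-detection idea (Subsystem 1).
 * Assumes each microphone channel has already been sampled and
 * reduced to a peak amplitude; values here are illustrative only.
 */
#include <stdint.h>

#define MICS_PER_EAR 9
#define NUM_EARS     2
#define LOUDNESS_THRESHOLD 2000 /* in ADC counts; would be tuned experimentally */

/* Hypothetical ring indexing, 0..7 going around the ear: Front,
   Front-Up, Up, Back-Up, Back, Back-Down, Down, Front-Down.
   Index 8 is the microphone pointing directly away from the ear. */
typedef struct {
    uint16_t amplitude[MICS_PER_EAR];
} ear_reading_t;

/* Returns the index of the loudest microphone on one earpiece, or -1
   if no microphone exceeds the trigger threshold. */
int detect_direction(const ear_reading_t *ear)
{
    int best = -1;
    uint16_t best_amp = LOUDNESS_THRESHOLD;
    for (int i = 0; i < MICS_PER_EAR; i++) {
        if (ear->amplitude[i] >= best_amp) {
            best_amp = ear->amplitude[i];
            best = i;
        }
    }
    return best;
}
```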

## Subsystem 2: Haptic Feedback

The information about a sound and its direction is relayed through haptic feedback from the vibration motors around the ear. Vibration motors will be placed along the ring of each earpiece on both sides of the headphones. Each earpiece (left and right) will have 8 vibration motors around the ear (Front, Up, Down, Back, Front-Up, Front-Down, Back-Up, Back-Down). Based on the sensors' readings, the corresponding vibration motors will trigger to give the user an impression of the sound's direction. For example:

- Sound coming from directly to the left will trigger the vibration motors on the left earpiece.
- Sound coming from above and behind will trigger the Back-Up, Up, and Back vibration motors on both the left and right earpieces.
- Sound coming from above and in front, but to the right, will trigger the right earpiece's Front-Up, Front, and Up vibration motors.

A sketch of this mapping follows the diagram link below.

Diagram of Inner Piece with Vibration Motors - [https://mediaspace.illinois.edu/media/t/1_k664rq6s](https://mediaspace.illinois.edu/media/t/1_k664rq6s)
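Continuing the sketch above, one way to encode the "primary motor plus its two neighbors" pattern from these examples is a bitmask over the 8-motor ring, using the same hypothetical ring indexing as the detection sketch:

```c
/*
 * Sketch of the direction-to-motor mapping (Subsystem 2). Following
 * the examples above, the motor matching the detected direction fires
 * along with its two neighbors on the ring. For instance, Back-Up
 * (index 3) also fires Up (index 2) and Back (index 4).
 */
#include <stdint.h>

#define MOTORS_PER_EAR 8

/* Returns a bitmask of motors to drive on one earpiece, given the
   index (0..7) of the triggered direction on the 8-motor ring. */
uint8_t motor_mask(int direction)
{
    int prev = (direction + MOTORS_PER_EAR - 1) % MOTORS_PER_EAR;
    int next = (direction + 1) % MOTORS_PER_EAR;
    return (uint8_t)((1u << direction) | (1u << prev) | (1u << next));
}
```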

## Subsystem 3: Analog to Digital Microcontroller

This subsystem takes the analog input from the unidirectional microphones and converts it into drive signals for the vibration motors. Considering the number of sensors being used, we will most likely need an amplifier for each microphone and an analog-to-digital converter feeding the microcontroller. A sketch of the resulting control loop is shown below.
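Tying the two earlier sketches together, the loop below shows how the firmware might poll each ADC channel and update the motors. `read_mic_adc()` and `set_motor_pins()` are placeholders for whatever HAL the chosen microcontroller provides, not real APIs, and treating an away-from-ear trigger as "whole ring on" is only one plausible choice:

```c
/*
 * Sketch of the firmware's main loop (Subsystem 3), reusing the
 * definitions from the two sketches above.
 */
#include <stdint.h>

extern uint16_t read_mic_adc(int ear, int channel); /* placeholder: sample one amplified mic */
extern void set_motor_pins(int ear, uint8_t mask);  /* placeholder: drive one earpiece's motors */

void control_loop(void)
{
    ear_reading_t ears[NUM_EARS];

    for (;;) {
        for (int e = 0; e < NUM_EARS; e++) {
            /* Sample all 9 microphone channels on this earpiece. */
            for (int m = 0; m < MICS_PER_EAR; m++)
                ears[e].amplitude[m] = read_mic_adc(e, m);

            int dir = detect_direction(&ears[e]);
            if (dir < 0)
                set_motor_pins(e, 0);    /* no loud sound: motors off */
            else if (dir == 8)
                set_motor_pins(e, 0xFF); /* directly to the side: whole ring */
            else
                set_motor_pins(e, motor_mask(dir));
        }
    }
}
```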

# Criterion For Success

1. Audio Sensing: Sound sensors are able to pick up loud sounds from the surrounding environment, and the direction of a sound can be determined from which sensors are triggered.

2. Haptic Feedback: When given a direction, the appropriate vibration motors will trigger to inform the user of the direction.

3. Comfortable Fitting: The device fits well and comfortably on the user.

4. User Efficiency: The user can effectively tell where an external sound is coming from through the haptic feedback.

# More Diagrams of Device

Diagram of Device Position on Human Head - [https://mediaspace.illinois.edu/media/t/1_byyz2p7u](https://mediaspace.illinois.edu/media/t/1_byyz2p7u)

Diagram of Device Attachment on Over-Ear Headphones - [https://mediaspace.illinois.edu/media/t/1_bua29b7m](https://mediaspace.illinois.edu/media/t/1_bua29b7m)

# Decentralized Systems for Ground & Aerial Vehicles (DSGAV)

Mingda Ma, Alvin Sun, Jialiang Zhang


# Team Members

* Yixiao Sun (yixiaos3)

* Mingda Ma (mingdam2)

* Jialiang Zhang (jz23)

# Problem Statement

Autonomous delivery over drone networks has become a new trend that can save a tremendous amount of labor. However, it is very difficult to scale up due to the inefficiency of multi-rotor collaboration, especially when the drones are carrying payload. To actually deploy such a system in big cities, we could take advantage of the large ground-vehicle networks that already exist at rideshare companies like Uber and Lyft. The roof of an automobile has plenty of space to hold regular-sized packages with magnets, and the drone network can then optimize for flight time and efficiency while factoring in ground-vehicle plans. While dramatically increasing delivery coverage and efficiency, such a strategy raises the challenging problem of docking a drone onto a moving ground vehicle.

# Solution

Given the scope and time limitations, we aim to tackle one particular component of this problem. We will implement a decentralized multi-agent control system that synchronizes a ground vehicle and a drone when they are in close proximity. Assumptions such as knowledge of vehicle states will be made, since this project aims to be a proof of concept of a core challenge; as we progress, we aim to lift as many of those assumptions as possible. The infrastructure of the lab, the drone, and the ground vehicle will be provided by our kind sponsor, Professor Naira Hovakimyan. When the drone approaches the target and gains visual contact with the ground vehicle, it will automatically send a docking request through an RF module. The RF receiver on the vehicle will then automatically turn on assistant devices, such as specific LED light patterns, that aid motion synchronization between the ground and aerial vehicles. The ground vehicle will also periodically send its locally planned path to the drone so the drone can predict the ground vehicle's trajectory a couple of seconds into the future. This prediction helps the drone stay within close proximity of the ground vehicle by optimizing against a reference trajectory; a minimal sketch of the prediction step follows.
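As an illustration of that prediction step, the C sketch below interpolates a received plan's timestamped waypoints at a short look-ahead time. The `waypoint_t` layout and the 2D flat-ground simplification are our assumptions, not a fixed interface:

```c
/*
 * Sketch of predicting the ground vehicle's position from its locally
 * planned path, modeled as timestamped 2D waypoints received over RF.
 */
#include <stddef.h>

typedef struct { double t, x, y; } waypoint_t;

/* Linearly interpolate the plan at t_query (e.g., now + 2 s); clamps
   to the ends of the plan if the query falls outside it. Assumes
   n >= 1 and strictly increasing timestamps. */
waypoint_t predict_position(const waypoint_t *plan, size_t n, double t_query)
{
    if (t_query <= plan[0].t)
        return plan[0];
    for (size_t i = 1; i < n; i++) {
        if (t_query <= plan[i].t) {
            double a = (t_query - plan[i - 1].t) / (plan[i].t - plan[i - 1].t);
            waypoint_t p = {
                t_query,
                plan[i - 1].x + a * (plan[i].x - plan[i - 1].x),
                plan[i - 1].y + a * (plan[i].y - plan[i - 1].y),
            };
            return p;
        }
    }
    return plan[n - 1]; /* horizon past the plan: hold the last waypoint */
}
```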

### The hardware components include:

Provided by Research Platforms

* A drone

* A ground vehicle

* A camera

Developed by our team

* An LED based docking indicator

* RF communication modules (XBee)

* Onboard compute and communication microprocessor (STM32F4)

* Standalone power source for RF module and processor

# Required Circuit Design

We will integrate the power source, the RF communication module, and the LED tracking assistant together with our microcontroller on our PCB. The circuit will also automatically trigger the tracking assistant to facilitate the docking maneuver. This circuit is designed specifically to demonstrate the drone's ability to precisely track and dock onto the ground vehicle.

# Criterion for Success -- Stages

1. When the ground vehicle is moving slowly in a straight line, the drone can autonomously take off from an arbitrary location and end up following it within close proximity.

2. The drone remains in close proximity while the ground vehicle is slowly turning (or navigating arbitrarily at low speed).

3. The drone can dock autonomously onto a ground vehicle that is moving slowly in a straight line.

4. The drone can dock autonomously onto a ground vehicle that is slowly turning.

5. Increase the speed of the ground vehicle and successfully perform tracking and/or docking.

6. The drone can pick up packages while flying synchronously with the ground vehicle.

We consider the project complete at stage 3. The stages after that are advanced features, depending on actual progress.
