# Project 58: Ultrasonic Spatial Awareness Device for the Visually Impaired

Team Members: Adam Auten, Robert Kummerer, Yuan Chih Wu, Zipeng Wang

Documents: appendix0.gz, appendix0.zip, design_document0.pdf, final_paper0.pdf, presentation0.pptx, proposal0.pdf
Problem:

The World Health Organization estimates that 285 million people worldwide are visually impaired, 40 million of whom are totally blind. The two major mobility aids currently available to the visually impaired are white canes and guide dogs. White canes only detect objects in the user's direct path and therefore provide very limited feedback. Guide dogs interact more with the user and the environment, making them more useful in certain settings; however, most blind people still use canes at least some of the time, and many rely on canes exclusively for reasons of cost, upkeep, and, for some people, allergies. There is, then, a market niche for a device that gives the visually impaired user more useful feedback about their environment at a lower cost.

Solution:

For our project, we propose a wearable, hands-free device that uses a haptic feedback belt and ultrasonic rangefinders to give the wearer a sense of obstructions in their immediate surroundings. A belt of eight narrow-beam ultrasonic rangefinders worn about the midsection creates a coarse 360-degree map of the surroundings, which is then communicated to the wearer through the haptic feedback belt.
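As a rough illustration of this sensing-to-feedback loop, the minimal C sketch below maps each sensor's distance reading onto a vibration intensity for the matching belt motor. The thresholds, the linear falloff, and the `set_motor_pwm` helper are our own illustrative assumptions, not settled design decisions.

```c
#include <stdint.h>

#define NUM_SENSORS  8
#define MAX_RANGE_CM 400   /* HC-SR04 rated maximum range */
#define MIN_RANGE_CM 30    /* closer than this: full-strength vibration */

/* Map a distance reading to a motor intensity (0-255): closer
 * obstacles vibrate harder, out-of-range readings stay silent. */
uint8_t distance_to_intensity(uint16_t distance_cm)
{
    if (distance_cm <= MIN_RANGE_CM) return 255;
    if (distance_cm >= MAX_RANGE_CM) return 0;
    return (uint8_t)(255UL * (MAX_RANGE_CM - distance_cm)
                           / (MAX_RANGE_CM - MIN_RANGE_CM));
}

/* Refresh all eight motors from the latest distance readings.
 * set_motor_pwm is an assumed board-support function. */
void update_belt(const uint16_t distance_cm[NUM_SENSORS],
                 void (*set_motor_pwm)(int motor, uint8_t duty))
{
    for (int i = 0; i < NUM_SENSORS; i++)
        set_motor_pwm(i, distance_to_intensity(distance_cm[i]));
}
```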

To further enhance the wearer's spatial awareness, we address a common problem: many people cannot walk in a straight line without gradually veering to one side. Our design will include a magnetometer to sense geomagnetic north and relay that direction through the haptic belt. This helps the wearer walk in a straight line by signaling when they begin to turn one way or the other, which is useful when navigating sidewalks along busy roads in grid-like metropolitan areas. The feature will be a user-activated mode, so the wearer controls when it is active.
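The sketch below shows one way this heading-hold mode might work, assuming the belt is worn level: compare the current magnetometer heading to a reference captured when the mode is activated, and pulse a left- or right-side motor once the drift exceeds a dead-band. The threshold, motor indices, and `pulse_motor` helper are illustrative assumptions.

```c
#include <math.h>

#define PI_F 3.14159265f
#define DRIFT_THRESHOLD_DEG 15.0f   /* illustrative dead-band */

/* Wrap an angle difference into (-180, 180] degrees. */
static float wrap_deg(float a)
{
    while (a > 180.0f)   a -= 360.0f;
    while (a <= -180.0f) a += 360.0f;
    return a;
}

/* Heading from the horizontal magnetometer components; assumes the
 * belt is level, so tilt compensation is omitted here. */
float heading_deg(float mag_x, float mag_y)
{
    return atan2f(mag_y, mag_x) * 180.0f / PI_F;
}

/* Cue the wearer when they drift off the captured reference heading.
 * Which motor maps to which side is an assumption of this sketch. */
void heading_hold(float mag_x, float mag_y, float reference_deg,
                  void (*pulse_motor)(int motor))
{
    float drift = wrap_deg(heading_deg(mag_x, mag_y) - reference_deg);

    if (drift > DRIFT_THRESHOLD_DEG)
        pulse_motor(0);   /* veering one way: cue a left-side motor */
    else if (drift < -DRIFT_THRESHOLD_DEG)
        pulse_motor(7);   /* veering the other way: a right-side motor */
}
```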

As conspicuity is a weakness of both white canes and guide dogs, we aim to keep the device as discreet as possible; the haptic belt can be worn under clothing.

Power:

The sensory device will necessarily be battery powered and will need a long battery life for maximum usability. We target at least 4 hours of operation on a single charge.

Uniqueness:

Although this problem has been tackled before (see Blind Eye from Spring 2016), our project differentiates itself through the following points:
• Haptic feedback instead of audio feedback (a better design choice, we believe, since audio feedback occupies another vital sense for a visually impaired person)
• Suspending sensor feedback until the environment changes (e.g., preventing constant haptic feedback while the user sits with their back against a chair; see the sketch after this list)
• Feedback about cardinal direction (unexplored by previous projects in this area)
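To illustrate the second point, the sketch below suppresses feedback on a channel whose reading has been stable and re-enables it once the distance moves by more than a dead-band. The threshold and function names are illustrative assumptions.

```c
#include <stdint.h>
#include <stdlib.h>

#define NUM_SENSORS          8
#define CHANGE_THRESHOLD_CM 10   /* illustrative dead-band */

static uint16_t last_reported_cm[NUM_SENSORS];

/* Return 1 only when a sensor's reading has moved enough from its
 * last reported value; otherwise keep that motor quiet. */
int reading_is_new(int sensor, uint16_t distance_cm)
{
    int delta = (int)distance_cm - (int)last_reported_cm[sensor];
    if (abs(delta) > CHANGE_THRESHOLD_CM) {
        last_reported_cm[sensor] = distance_cm;  /* new baseline */
        return 1;   /* environment changed: feedback allowed */
    }
    return 0;       /* stable scene (e.g. back against a chair) */
}
```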

Components:

There are ultrasonic rangefinders available on DigiKey at low cost (the HC-SR04). We intend to mount the microcontroller on a PCB that will act as the central processing unit, reading the eight ultrasonic sensors placed across the front of the belt and driving the haptic feedback belt.
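For reference, one HC-SR04 measurement follows the datasheet's trigger/echo protocol: hold TRIG high for at least 10 µs, then the ECHO pulse width in microseconds divided by 58 gives the distance in centimeters. The sketch below assumes simple board-support helpers (`gpio_write`, `gpio_read`, `delay_us`, `micros`) rather than any particular microcontroller HAL.

```c
#include <stdint.h>

extern void     gpio_write(int pin, int level);
extern int      gpio_read(int pin);
extern void     delay_us(uint32_t us);
extern uint32_t micros(void);

#define ECHO_TIMEOUT_US 30000   /* ~5 m round trip: treat as no echo */

/* One blocking HC-SR04 reading; returns distance in cm, or -1 on timeout. */
int32_t hcsr04_read_cm(int trig_pin, int echo_pin)
{
    gpio_write(trig_pin, 1);
    delay_us(10);                      /* >=10 us trigger pulse */
    gpio_write(trig_pin, 0);

    uint32_t start = micros();
    while (!gpio_read(echo_pin))       /* wait for echo to go high */
        if (micros() - start > ECHO_TIMEOUT_US) return -1;

    uint32_t echo_start = micros();
    while (gpio_read(echo_pin))        /* time the echo pulse */
        if (micros() - echo_start > ECHO_TIMEOUT_US) return -1;

    return (int32_t)((micros() - echo_start) / 58);  /* us -> cm */
}
```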

The haptic feedback belt will consist of an array of 8 eccentric rotating mass actuators (ERMs), chosen for their low cost and availability. These would be controlled by a haptic motor controller IC for precise control of the stimulus and easy integration with the application processor.
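No specific controller is chosen yet; as one possibility, a TI DRV2605-class driver exposes a real-time playback (RTP) mode in which a single register sets vibration strength. The sketch below assumes that part's datasheet register map (7-bit address 0x5A, Mode at 0x01, RTP input at 0x02) and an `i2c_write_reg` helper.

```c
#include <stdint.h>

extern void i2c_write_reg(uint8_t addr, uint8_t reg, uint8_t val);

#define DRV2605_ADDR     0x5A   /* 7-bit I2C address */
#define DRV2605_REG_MODE 0x01
#define DRV2605_REG_RTP  0x02
#define DRV2605_MODE_RTP 0x05   /* real-time playback mode */

/* Put the driver in RTP mode so strength can be set directly. */
void haptic_init(void)
{
    i2c_write_reg(DRV2605_ADDR, DRV2605_REG_MODE, DRV2605_MODE_RTP);
}

/* 0 = off, 255 = maximum vibration strength. */
void haptic_set_strength(uint8_t strength)
{
    i2c_write_reg(DRV2605_ADDR, DRV2605_REG_RTP, strength);
}
```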

Team Members:

Adam Auten (auten2)
Robert Kummerer (rkummer2)
Yuan Chih Wu (ywu77)

Decentralized Systems for Ground & Aerial Vehicles (DSGAV)

Mingda Ma, Alvin Sun, Jialiang Zhang

Featured Project

# Team Members

* Yixiao Sun (yixiaos3)

* Mingda Ma (mingdam2)

* Jialiang Zhang (jz23)

# Problem Statement

Autonomous delivery over drone networks has become a new trend that can save a tremendous amount of labor. However, it is very difficult to scale up due to the inefficiency of multi-rotor collaboration, especially when the drones are carrying payload. To actually deploy it in big cities, we could take advantage of the large ground vehicle networks that already exist at rideshare companies like Uber and Lyft. The roof of an automobile has plenty of space to hold regular-sized packages with magnets, and the drone network can then optimize flight time and efficiency while factoring in ground vehicle plans. While dramatically increasing delivery coverage and efficiency, such a strategy raises the challenging problem of docking drones onto moving ground vehicles.

# Solution

Given the scope and time limitations, we aim to tackle one particular component of this project: a decentralized multi-agent control system that synchronizes a ground vehicle and a drone when they are in close proximity. Assumptions such as knowledge of the vehicle states will be made, since this project aims at a proof of concept of a core challenge; as we progress, we will lift as many of those assumptions as possible. The lab infrastructure, drone, and ground vehicle will be provided by our kind sponsor, Professor Naira Hovakimyan. When the drone approaches the target and has visual contact with the ground vehicle, it will automatically send a docking request through an RF module. The RF receiver on the vehicle will then automatically turn on assistant devices, such as specific LED light patterns, that aid motion synchronization between the ground and aerial vehicles. The ground vehicle will also periodically send its locally planned path to the drone so the drone can predict the vehicle's trajectory a few seconds into the future. This prediction helps the drone stay within close proximity of the ground vehicle by optimizing against a reference trajectory.
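As a concrete illustration of the path-sharing step, the C sketch below shows one possible ground-to-drone path message and a constant-velocity extrapolation on the drone side. The struct layout, field names, waypoint spacing, and coordinate frame are our illustrative assumptions, not the team's actual protocol.

```c
#include <stdint.h>

/* Hypothetical path message: a short horizon of planned positions in
 * a shared frame, sent periodically over the XBee link. */
typedef struct {
    uint32_t timestamp_ms;   /* time of the first waypoint */
    uint8_t  num_points;     /* valid waypoints, 100 ms apart */
    float    x_m[10];
    float    y_m[10];
} planned_path_t;

/* Predict the vehicle position dt seconds past the last waypoint by
 * extrapolating the velocity of the final path segment. */
void predict_position(const planned_path_t *p, float dt,
                      float *x_out, float *y_out)
{
    uint8_t n = p->num_points;
    if (n == 0) { *x_out = 0.0f; *y_out = 0.0f; return; }      /* no data */
    if (n == 1) { *x_out = p->x_m[0]; *y_out = p->y_m[0]; return; }

    const float seg_dt = 0.1f;   /* assumed 100 ms waypoint spacing */
    float vx = (p->x_m[n - 1] - p->x_m[n - 2]) / seg_dt;
    float vy = (p->y_m[n - 1] - p->y_m[n - 2]) / seg_dt;

    *x_out = p->x_m[n - 1] + vx * dt;
    *y_out = p->y_m[n - 1] + vy * dt;
}
```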

### The hardware components include:

Provided by Research Platforms

* A drone

* A ground vehicle

* A camera

Developed by our team

* An LED based docking indicator

* RF communication modules (xbee)

* Onboard compute and communication microprocessor (STM32F4)

* Standalone power source for RF module and processor

# Required Circuit Design

We will integrate the power source, the RF communication module, and the LED tracking assistant with our microcontroller on our PCB. The circuit will also automatically trigger the tracking assistant when a docking request is received. This circuit is designed specifically to demonstrate the drone's ability to precisely track and dock onto the ground vehicle.
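A minimal firmware sketch of that trigger path follows, assuming a byte-oriented XBee serial link; the message byte values and the `xbee_read_byte` / `led_pattern_enable` helpers are illustrative, not a defined protocol.

```c
#include <stdint.h>

#define MSG_DOCK_REQUEST 0xD0   /* illustrative message codes */
#define MSG_DOCK_RELEASE 0xD1

extern int  xbee_read_byte(uint8_t *out);   /* nonzero if a byte arrived */
extern void led_pattern_enable(int on);     /* drive the docking LEDs */

/* Poll the RF link; start the LED docking pattern on request and
 * stop it once the drone docks or aborts. */
void vehicle_comm_poll(void)
{
    uint8_t b;
    while (xbee_read_byte(&b)) {
        if (b == MSG_DOCK_REQUEST)
            led_pattern_enable(1);
        else if (b == MSG_DOCK_RELEASE)
            led_pattern_enable(0);
    }
}
```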

# Criterion for Success -- Stages

1. When the ground vehicle is moving slowly in a straight line, the drone can autonomously take off from an arbitrary location and end up following it within close proximity.

2. The drone remains in close proximity while the ground vehicle turns slowly (or navigates arbitrarily at slow speed).

3. The drone can dock autonomously onto the ground vehicle while it moves slowly in a straight line.

4. The drone can dock autonomously onto the ground vehicle while it turns slowly.

5. Increase the speed of the ground vehicle and successfully perform tracking and/or docking.

6. The drone can pick up packages while flying synchronously with the ground vehicle.

We consider the project complete at stage 3; the stages after that are advanced features, depending on actual progress.
