# Sensory Awareness Device for Bars and Restaurants

Team Members:
- Megan Heinhold (meganjh3)
- Evan Lindquist (evanl3)
- Carl Wolff (cwolff2)

# Problem

Many people suffer from conditions such as ADD, epilepsy, and sensory processing disorder that affect their ability to function in certain environments. Those affected often look to avoid specific triggers such as loud noises, flashing lights, and crowded areas. Beyond common-sense guesses, there is no reliable way to gauge how many of these triggers will be present at a given bar or restaurant.

# Solution

Our solution is a small device that a bar or restaurant can purchase to gauge its sensory attributes. The device will require a one-time setup and can then be left to passively acquire real-time data on light and sound levels within the establishment. It will regularly upload this information to an app so that any individual can check the likely triggers and general ambiance of a location before going inside. This will allow individuals to find environments that match their preferences for ambient noise and light levels.

The only comparable service we could find is Google’s “Popular Times” feature, which shows predicted crowd levels for establishments throughout the day. Our device improves on this by providing live data broken into distinct categories in addition to the aforementioned crowdedness level.

# Solution Components

## Subsystem 1 - Power

Our device will receive its power from a wall outlet. Due to the safety concerns of working with high voltage, we would like additional guidance from a mentor TA on how to properly go about this. Our current plan is to use a commercial adapter that steps the voltage down from 120 V to roughly 5 V for use with our other hardware subsystems.

## Subsystem 2 - Sensor Block

This subsystem will contain all of the sensors used to acquire data about the environment. It will consist of photoresistors that will be able to report data on the ambient light level as well as the presence of any lighting effects (such as strobe lights); microphones to detect the overall sound level (combination of human noise, music, and assorted background noises); and temperature sensors to report the ambient temperature.

## Subsystem 3 - Microcontroller

This subsystem will regularly poll the sensors within the sensor block to capture data about the environment. It will handle safety concerns (strobe lights and excessive noise) in pseudo-real time by analyzing the sensor-block data to detect flashing lights and/or noise above a certain decibel threshold. If it detects a safety hazard, it will immediately output a hazard signal. The device will also track running averages of the sensor readings that are not safety related and output these averages at a reduced frequency (for example, once every ten minutes). The MCU will send the averages and hazard signals to the Wi-Fi module so that they can be forwarded to the app.
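
The sketch below is a rough illustration of this loop, not a fixed design: the MCU, sensor drivers, thresholds, and message format have not been chosen yet, so `read_light()`, `read_sound_db()`, `read_temp_c()`, and `send_to_wifi()` are hypothetical stubs standing in for the real hardware interfaces.

```c
/*
 * Rough sketch of the polling / hazard-detection / averaging loop.
 * All stub functions and thresholds below are placeholders.
 */
#include <stdio.h>

#define POLL_PERIOD_MS   100                  /* fast poll for hazard checks */
#define REPORT_PERIOD_MS (10 * 60 * 1000)     /* averages every ten minutes  */
#define NOISE_HAZARD_DB  85.0                 /* illustrative noise limit    */
#define FLASH_DELTA      300                  /* illustrative strobe jump    */

/* --- hypothetical hardware hooks (replace with real drivers) ------------ */
static int    read_light(void)    { return 512; }   /* ADC counts   */
static double read_sound_db(void) { return 60.0; }  /* dB estimate  */
static double read_temp_c(void)   { return 22.5; }  /* deg Celsius  */
static void   send_to_wifi(const char *msg) { printf("WIFI> %s\n", msg); }

int main(void) {
    long   elapsed_ms = 0, samples = 0;
    double light_sum = 0, sound_sum = 0, temp_sum = 0;
    int    prev_light = read_light();

    /* Infinite loop on the real device; bounded here (~10 simulated minutes). */
    for (long i = 0; i < 6000; ++i) {
        int    light = read_light();
        double sound = read_sound_db();
        double temp  = read_temp_c();

        /* Safety checks run on every sample (pseudo-real time). */
        if (light - prev_light > FLASH_DELTA || prev_light - light > FLASH_DELTA)
            send_to_wifi("HAZARD,strobe");
        if (sound > NOISE_HAZARD_DB)
            send_to_wifi("HAZARD,noise");
        prev_light = light;

        /* Everything else only feeds the running averages. */
        light_sum += light;
        sound_sum += sound;
        temp_sum  += temp;
        samples++;

        elapsed_ms += POLL_PERIOD_MS;
        if (elapsed_ms >= REPORT_PERIOD_MS) {
            char msg[96];
            snprintf(msg, sizeof msg, "AVG,%.1f,%.1f,%.1f",
                     light_sum / samples, sound_sum / samples, temp_sum / samples);
            send_to_wifi(msg);
            elapsed_ms = 0;
            samples = 0;
            light_sum = sound_sum = temp_sum = 0;
        }
    }
    return 0;
}
```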

## Subsystem 4 - Wi-Fi Module

This subsystem will allow our device to transmit the data it collects to our app via Wi-Fi.
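
We have not settled on a specific module. As one possibility, an ESP8266-style module driven over UART with AT commands could forward each report to the backend; in the sketch below, `uart_write_line()` is a hypothetical helper and the host name and port merely stand in for the real backend address.

```c
/* Illustrative only: pushing one report through an AT-command Wi-Fi module. */
#include <stdio.h>
#include <string.h>

/* Placeholder for the MCU's real UART driver. */
static void uart_write_line(const char *line) { printf("UART> %s\r\n", line); }

static void wifi_send_report(const char *report)   /* e.g. "AVG,512.0,60.0,22.5" */
{
    char cmd[64];

    /* Open a TCP connection to the backend (host and port are placeholders). */
    uart_write_line("AT+CIPSTART=\"TCP\",\"backend.example.com\",8080");

    /* Announce the payload length, then send the payload itself. */
    snprintf(cmd, sizeof cmd, "AT+CIPSEND=%u", (unsigned)strlen(report));
    uart_write_line(cmd);
    uart_write_line(report);

    uart_write_line("AT+CIPCLOSE");                 /* close the connection */
}

int main(void) {
    wifi_send_report("AVG,512.0,60.0,22.5");        /* one ten-minute report */
    return 0;
}
```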

## Subsystem 5 - App

The app will allow users to view the current ambient levels of a specific bar, as well as any safety alerts that have been recorded in the past few days. For the purposes of this course, we will focus on a web application.

### App Backend

The app backend will need to communicate directly with our devices, receiving the transmitted data and storing any data we wish to keep for historical purposes.
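
As a minimal illustration of this receive-and-store flow, the sketch below accepts connections from devices and appends each received report to a flat file. The real backend will more likely be a web service backed by a proper database; the port number and file name here are placeholders.

```c
/* Minimal receive-and-store sketch; error handling omitted for brevity. */
#include <stdio.h>
#include <unistd.h>
#include <sys/types.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>

int main(void) {
    int srv = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in addr = {0};
    addr.sin_family      = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port        = htons(8080);               /* placeholder port */

    bind(srv, (struct sockaddr *)&addr, sizeof addr);
    listen(srv, 4);

    for (;;) {
        int     client = accept(srv, NULL, NULL);
        char    buf[256];
        ssize_t n = read(client, buf, sizeof buf - 1);

        if (n > 0) {
            buf[n] = '\0';
            FILE *log = fopen("readings.csv", "a");   /* historical storage */
            if (log) {
                fprintf(log, "%s\n", buf);
                fclose(log);
            }
        }
        close(client);
    }
}
```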

### App Frontend

The app frontend will serve two personas: the bar/restaurant and the business patron. Establishment users will be able to log in and update their establishment’s general information, while patrons will be able to view real-time data for the locations of their choice.

# Criterion For Success

- The device should require only an initial setup and then be almost entirely maintenance-free.
- Identify medical safety hazards, specifically strobing lights and sound levels over a safe threshold, and report these hazards immediately.
- Average the compiled sensor data to determine ambient light, sound, and temperature levels over 10-minute intervals.
- Transmit this data to the app and display it in an easily accessible format.

Decentralized Systems for Ground & Aerial Vehicles (DSGAV)

Mingda Ma, Alvin Sun, Jialiang Zhang

Featured Project

# Team Members

* Yixiao Sun (yixiaos3)

* Mingda Ma (mingdam2)

* Jialiang Zhang (jz23)

# Problem Statement

Autonomous delivery over drone networks has become one of the new trends that can save a tremendous amount of labor. However, it is very difficult to scale up due to the inefficiency of multi-rotor collaboration, especially when the drones are carrying payload. In order to actually deploy such a system in big cities, we could take advantage of the large ground-vehicle networks that already exist with rideshare companies like Uber and Lyft. The roof of an automobile has plenty of space to hold regular-size packages with magnets, and the drone network can then optimize for flight time and efficiency while factoring in ground-vehicle plans. While dramatically increasing delivery coverage and efficiency, such a strategy raises the challenging problem of docking drones onto moving ground vehicles.

# Solution

Given the scope and time limitations, we aim to tackle one particular component of this larger problem. We will implement a decentralized multi-agent control system that synchronizes a ground vehicle and a drone when they are in close proximity. Assumptions such as knowledge of the vehicle states will be made, as this project aims to be a proof of concept for a core challenge of the overall system; however, as we progress, we aim to lift as many of those assumptions as possible. The lab infrastructure, drone, and ground vehicle will be provided by our kind sponsor, Professor Naira Hovakimyan. When the drone approaches the target and gains visual contact with the ground vehicle, it will automatically send a docking request through an RF module. The RF receiver on the vehicle will then automatically turn on its assistant devices, such as specific LED light patterns that aid motion synchronization between the ground and aerial vehicles. The ground vehicle will also periodically send its locally planned path to the drone so that the drone can predict the ground vehicle’s trajectory a couple of seconds into the future. This prediction helps the drone stay in close proximity to the ground vehicle by optimizing against a reference trajectory.
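
The sketch below illustrates the ground-vehicle side of this handshake: on receiving a docking request over the RF link, it enables the LED docking indicator and starts streaming locally planned waypoints back to the drone. The driver functions (`xbee_poll()`, `xbee_send()`, `led_set_pattern()`) and the message formats are our own placeholders rather than the actual XBee or STM32 interfaces.

```c
/* Illustrative ground-vehicle handshake sketch; all drivers are stubs. */
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

typedef struct { float x, y, heading; } Waypoint;

/* --- hypothetical hardware hooks (replace with real XBee/LED drivers) --- */
static bool xbee_poll(char *msg, int len)  { snprintf(msg, len, "DOCK_REQ"); return true; }
static void xbee_send(const char *msg)     { printf("RF> %s\n", msg); }
static void led_set_pattern(const char *p) { printf("LED pattern: %s\n", p); }
static void get_planned_path(Waypoint *wp, int n)
{
    for (int i = 0; i < n; ++i) { wp[i].x = i * 0.5f; wp[i].y = 0; wp[i].heading = 0; }
}

int main(void) {
    char msg[32];
    bool docking = false;

    for (int tick = 0; tick < 3; ++tick) {        /* periodic loop on the MCU */
        if (!docking && xbee_poll(msg, sizeof msg) && strcmp(msg, "DOCK_REQ") == 0) {
            led_set_pattern("blink-2Hz");         /* visual aid for the drone */
            docking = true;
        }
        if (docking) {
            /* Stream a few planned waypoints so the drone can predict the
             * vehicle's trajectory a couple of seconds ahead. */
            Waypoint path[5];
            get_planned_path(path, 5);
            char out[96];
            snprintf(out, sizeof out, "PATH,%.1f,%.1f;%.1f,%.1f",
                     path[0].x, path[0].y, path[1].x, path[1].y);
            xbee_send(out);
        }
    }
    return 0;
}
```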

### The hardware components include:

Provided by Research Platforms

* A drone

* A ground vehicle

* A camera

Developed by our team

* An LED based docking indicator

* RF communication modules (XBee)

* Onboard compute and communication microcontroller (STM32F4)

* Standalone power source for RF module and processor

# Required Circuit Design

We will integrate the power source, the RF communication module, and the LED tracking assistant together with our microcontroller on our PCB. The circuit will also automatically trigger the tracking assistant to facilitate the docking procedure. This circuit is designed specifically to demonstrate the drone’s ability to precisely track and dock onto the ground vehicle.

# Criterion for Success -- Stages

1. When the ground vehicle is moving slowly in a straight line, the drone can autonomously take off from an arbitrary location and end up following it within close proximity.

2. The drone remains in close proximity while the ground vehicle is slowly turning (or navigating arbitrarily at slow speed).

3. The drone can dock autonomously onto the ground vehicle while it is moving slowly in a straight line.

4. The drone can dock autonomously onto the ground vehicle while it is slowly turning.

5. Increase the speed of the ground vehicle and successfully perform tracking and/or docking.

6. The drone can pick up packages while flying synchronously with the ground vehicle.

We consider the project complete at stage 3. The stages after that are advanced features that depend on actual progress.
