
| # | Title | Team Members | TA | Documents | Sponsor |
|---|-------|--------------|----|-----------|---------|
| 22 | Smart Drone Delivery Improvement | Rahul Joshi, Raymond Hoagland, Sachin Weerasooriya | Luke Wendt | appendix0.docx, design_document0.pdf, final_paper0.pdf, other0.pdf, presentation0.pptx, proposal0.pdf | |
When you order a package that is shipped via ground, it is either left at your door or you must be present to sign for it. Services like Amazon Prime Air aim to speed up delivery with drone shipping, claiming to reach your door within 30 minutes. In Video 2 at this link, the shipment is loaded, the drone takes off, converts to a plane, flies to the vicinity of the landing location, converts back to a drone, and lands on a marker put out by the recipient. The drone then deposits the package and returns to the warehouse for its next delivery. Herein lies a major flaw: when something valuable is being shipped, it would be desirable to have confirmation that someone is available to pick up the package being delivered.



This is where we step in. We will assume that the drone has already arrived in the vicinity of the delivery spot, and we will use image processing to identify the user-specified landing spot. The key difference is that instead of landing, the drone will notify the owner that the package is ready for pickup and hover above the landing spot for a fixed amount of time, waiting for confirmation from the user that it is safe to drop off the package. If this confirmation is not received within the set time, the drone will return to the warehouse with the package and attempt the delivery again later. While the drone waits, it will continuously scan its surroundings for unidentified threats approaching it. If a threat gets too close, the drone will climb and hover at a higher elevation to protect the package contents, staying there until either the recipient gives the okay for delivery or the time limit expires and the drone returns to the warehouse.
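The hover-and-confirm behavior above amounts to a small state machine. Below is a minimal Python sketch of that flow; the timeouts, altitudes, and helper calls (`notify`, `confirmed`, `threat_nearby`, `return_to_warehouse`, and so on) are hypothetical placeholders, not an existing flight API.

```python
import time

# Placeholder parameters; real values would be tuned in testing.
CONFIRMATION_TIMEOUT = 300   # seconds to wait for the recipient
SAFE_ALTITUDE = 10.0         # hover height above the landing marker (m)
EVADE_ALTITUDE = 25.0        # higher hover used when a threat approaches

def deliver(drone, recipient):
    """Hover over the marker, wait for confirmation, evade threats,
    and return to the warehouse if the recipient never responds."""
    drone.hover_at(SAFE_ALTITUDE)
    recipient.notify("Your package has arrived and is ready for drop-off.")

    deadline = time.time() + CONFIRMATION_TIMEOUT
    while time.time() < deadline:
        if recipient.confirmed():            # app sent the "okay to drop" message
            drone.hover_at(SAFE_ALTITUDE)
            drone.indicate_drop()            # LED stands in for the drop mechanism
            return "delivered"
        if drone.threat_nearby():            # camera/DSP flags an approaching object
            drone.hover_at(EVADE_ALTITUDE)   # climb to protect the package
        else:
            drone.hover_at(SAFE_ALTITUDE)
        time.sleep(0.5)

    # No confirmation in time: keep the package and retry later.
    drone.return_to_warehouse()
    return "returned"
```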

Hardware needed:
- A drone
- A small camera
- Digital Signal Processor (DSP) Chip
- Proportional-Integral-Derivative (PID) Controller
- Raspberry Pi
- USB WiFi dongle
- Smartphone

We will use image processing to identify our landing spot and the PID controller to send the necessary feedback to the motors to descend the drone and watch for potential hazards. For our prototype, we will then use the Raspberry Pi and WiFi dongle to connect to the resident's WiFi and send a message to a smartphone app indicating that the package has arrived. The drone will then wait for confirmation from the app. The mechanical mechanism to physically lower the package is out of our scope; instead, we will use an LED or a similar indicator to show that the package has been dropped.
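As a rough illustration of the descent step, the sketch below runs a textbook PID loop that commands vertical velocity toward a target hover altitude. The gains, loop rate, and the `read_altitude`/`set_vertical_velocity` callbacks are assumptions for illustration only; the real controller would be tuned on the actual drone.

```python
import time

class PID:
    """Textbook PID controller; the gains used below are placeholders."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def descend_to(target_altitude, read_altitude, set_vertical_velocity):
    """Drive the drone toward target_altitude using PID feedback.
    read_altitude() and set_vertical_velocity() stand in for the
    flight-controller interface."""
    pid = PID(kp=0.8, ki=0.05, kd=0.2)      # illustrative gains only
    dt = 0.05                               # assumed 20 Hz control loop
    while True:
        error = target_altitude - read_altitude()
        if abs(error) < 0.1:                # within 10 cm of the hover height
            set_vertical_velocity(0.0)
            break
        set_vertical_velocity(pid.update(error, dt))
        time.sleep(dt)
```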

Decentralized Systems for Ground & Aerial Vehicles (DSGAV)

Mingda Ma, Alvin Sun, Jialiang Zhang

Featured Project

# Team Members

* Yixiao Sun (yixiaos3)

* Mingda Ma (mingdam2)

* Jialiang Zhang (jz23)

# Problem Statement

Autonomous delivery over drone networks has become one of the new trends that can save a tremendous amount of labor. However, it is very difficult to scale up due to the inefficiency of multi-rotor collaboration, especially when the drones are carrying payloads. To actually deploy it in big cities, we could take advantage of the large ground vehicle networks that already exist with rideshare companies like Uber and Lyft. The roof of an automobile has plenty of space to hold regular-sized packages with magnets, and the drone network can then optimize for flight time and efficiency while factoring in ground vehicle plans. While dramatically increasing delivery coverage and efficiency, such a strategy raises the challenging problem of docking drones onto moving ground vehicles.

# Solution

Given the scope and time limitations, we aim to tackle one particular component of this project: a decentralized multi-agent control system that synchronizes a ground vehicle and a drone when they are in close proximity. Assumptions such as knowledge of the vehicle states will be made, since this project targets a proof of concept for a core challenge of the larger system; as we progress, we aim to lift as many of those assumptions as possible. The lab infrastructure, drone, and ground vehicle will be provided by our kind sponsor, Professor Naira Hovakimyan. When the drone approaches the target and gains visual contact with the ground vehicle, it will automatically send a docking request through an RF module. The RF receiver on the vehicle will then automatically turn on assistant devices such as specific LED light patterns, which aid motion synchronization between the ground and aerial vehicles. The ground vehicle will also periodically send its locally planned path to the drone so the drone can predict the ground vehicle's trajectory a couple of seconds into the future. This prediction helps the drone stay within close proximity of the ground vehicle by optimizing against a reference trajectory.
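As one way to picture the prediction step, the sketch below fits a low-order polynomial to the most recent waypoints broadcast by the ground vehicle and extrapolates a couple of seconds ahead to produce a reference trajectory for the drone. The `(t, x, y)` message format and the two-second horizon are assumptions made only for this sketch, not part of the actual planner.

```python
import numpy as np

def predict_reference(waypoints, horizon=2.0, step=0.1):
    """Fit the ground vehicle's recent planned waypoints and extrapolate.

    waypoints: list of (t, x, y) tuples from the vehicle's local planner
               (message format assumed for this sketch).
    Returns a list of (t, x, y) samples covering `horizon` seconds beyond
    the last waypoint, used as the drone's reference trajectory.
    """
    t = np.array([w[0] for w in waypoints])
    x = np.array([w[1] for w in waypoints])
    y = np.array([w[2] for w in waypoints])

    # Second-order fits capture slow turns; higher orders overfit noisy paths.
    px = np.polyfit(t, x, 2)
    py = np.polyfit(t, y, 2)

    future_t = np.arange(t[-1], t[-1] + horizon, step)
    return [(ti, np.polyval(px, ti), np.polyval(py, ti)) for ti in future_t]
```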

### The hardware components include:

Provided by Research Platforms

* A drone

* A ground vehicle

* A camera

Developed by our team

* An LED based docking indicator

* RF communication modules (XBee)

* Onboard compute and communication microprocessor (STM32F4)

* Standalone power source for RF module and processor

# Required Circuit Design

We will integrate the power source, RF communication module, and LED tracking assistant together with our microcontroller on our PCB. The circuit will also automatically trigger the tracking assistant to facilitate further operations. This circuit is designed specifically to demonstrate the drone's ability to precisely track and dock onto the ground vehicle.
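To make the trigger behavior concrete, here is a minimal MicroPython-style sketch of the microcontroller turning on the LED docking indicator when a docking request arrives over the XBee link. The UART number, pin name, and request byte are assumed for illustration; the actual STM32F4 firmware may well be written in C against the vendor HAL instead.

```python
from machine import Pin, UART   # MicroPython-style API (assumed available)
import time

DOCK_REQUEST = b'D'             # placeholder byte sent by the drone's XBee

xbee = UART(2, baudrate=9600)   # UART wired to the XBee module (port assumed)
led = Pin('A5', Pin.OUT)        # drives the LED docking indicator (pin assumed)

def blink_pattern():
    """Simple on/off pattern; the real indicator would use a distinct,
    camera-friendly pattern for the drone to lock onto."""
    for _ in range(10):
        led.on()
        time.sleep_ms(100)
        led.off()
        time.sleep_ms(100)

while True:
    if xbee.any():
        msg = xbee.read()
        if msg and DOCK_REQUEST in msg:
            blink_pattern()     # tracking assistant switched on automatically
    time.sleep_ms(20)
```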

# Criterion for Success -- Stages

1. When the ground vehicle is moving slowly in a straight line, the drone can autonomously take off from an arbitrary location and follow it within close proximity.

2. The drone remains in close proximity while the ground vehicle is turning slowly (or navigating arbitrarily at low speed).

3. The drone can dock autonomously onto the ground vehicle while it is moving slowly in a straight line.

4. The drone can dock autonomously onto the ground vehicle while it is turning slowly.

5. Increase the speed of the ground vehicle and successfully perform tracking and/or docking.

6. The drone can pick up packages while flying synchronously with the ground vehicle.

We consider the project complete at stage 3. The later stages are advanced features that depend on actual progress.
