# Project 21: Player Tracking Camera

| Team Members | TA | Documents | Sponsor |
| --- | --- | --- | --- |
| Aksh Gupta, Oreoluwa Sunmola, Shivang Charan | Feiyu Zhang | design_document1.pdf, design_document2.pdf, final_paper2.pdf, final_paper3.pdf, final_paper4.pdf, proposal1.pdf | |

- **Team:** Shivang Charan (scharan2), Aksh Gupta (ag26), Oreoluwa Sunmola (asunmo2)
- **Player Tracking Camera**
- **Problem** - Watching and rewatching amazing highlights by athletes has become quite popular across the internet. These viral highlights have primarily been reserved for organized sports teams with dedicated film teams - as regular people generally do not have a cameraman to film the pickup games they play.
- **Solution Overview** - To solve this problem, we intend to create a moving stand that will keep the intended subject in the camera's view. The stand will be adjustable to fit most existing cameras. By setting up small beacons around the playing area, we can calculate the position of a person running, which is then used to rotate the camera and capture the subject's movements in real time. Functionally, this eliminates the need for a cameraman to follow the game's movements and automates the filming process.
- **Solution Components**
- **[Camera Module]** - The user will place the camera module on the edge of their playing area. The camera module will consist of a camera and a mount for the camera. The mount will be connected to a servo such that the camera rotates in the direction of the subject.
- **[Bluetooth Transceivers]** - There will need to be two Bluetooth transceivers. The first will be worn by the player the camera is tracking; it will interface with the BLE beacons around the playing field to relay a position estimate to the second transceiver, located on the camera module.
- **[Microcontroller]** - The microcontroller will take in the positional data from the Bluetooth transceivers, calculate the angle the camera has to rotate, and drive the servo accordingly (a rough sketch of this computation follows the component list).
- **[BLE Beacons]** - The beacons serve two purposes. The first is to define the boundaries within which the camera will be active (e.g., the four corners of a basketball court where the game will be played). The second is to serve as reference points for the position of the subject we are tracking, so that we can compute the subject's coordinates from the beacons and rotate the camera module accordingly.
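As a rough illustration of the position-and-angle step mentioned above, the following C sketch estimates the subject's position from range estimates to three beacons and converts it into a pan angle for the servo. The beacon layout, distances, camera location, and function names (`trilaterate`, `pan_angle_deg`) are made-up placeholders for illustration, not values or code from the design.

```c
/* Illustrative sketch only: estimate the subject's 2D position from range
 * estimates to three BLE beacons at known coordinates, then compute the pan
 * angle for the camera servo.  All numbers below are example values. */
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

typedef struct { double x, y; } Point;

/* Linearized trilateration: subtracting the first circle equation
 * (x - xi)^2 + (y - yi)^2 = di^2 from the other two gives a 2x2 linear
 * system, solved with Cramer's rule (beacons must not be collinear). */
static Point trilaterate(Point b1, double d1, Point b2, double d2,
                         Point b3, double d3) {
    double a11 = 2.0 * (b2.x - b1.x), a12 = 2.0 * (b2.y - b1.y);
    double a21 = 2.0 * (b3.x - b1.x), a22 = 2.0 * (b3.y - b1.y);
    double r1 = d1 * d1 - d2 * d2 - (b1.x * b1.x - b2.x * b2.x)
                                  - (b1.y * b1.y - b2.y * b2.y);
    double r2 = d1 * d1 - d3 * d3 - (b1.x * b1.x - b3.x * b3.x)
                                  - (b1.y * b1.y - b3.y * b3.y);
    double det = a11 * a22 - a12 * a21;
    Point p = { (r1 * a22 - a12 * r2) / det, (a11 * r2 - a21 * r1) / det };
    return p;
}

/* Pan angle (degrees) from the camera module toward the estimated subject. */
static double pan_angle_deg(Point cam, Point subject) {
    return atan2(subject.y - cam.y, subject.x - cam.x) * 180.0 / M_PI;
}

int main(void) {
    Point b1 = {0.0, 0.0}, b2 = {15.0, 0.0}, b3 = {0.0, 28.0}; /* court corners */
    double d1 = 10.3, d2 = 7.8, d3 = 24.7;   /* example BLE range estimates, m  */
    Point cam = {7.5, -1.0};                 /* camera module on the sideline   */

    Point subject = trilaterate(b1, d1, b2, d2, b3, d3);
    printf("subject at (%.2f, %.2f); pan servo to %.1f deg\n",
           subject.x, subject.y, pan_angle_deg(cam, subject));
    return 0;
}
```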
- **Criterion for Success** - For this project to be successful,
- The BLE beacons should be able to communicate the relative distance of the subject wearing the Bluetooth transceiver.
- We should be able to pinpoint the position of the subject using the beacons as reference points.
- The camera module should be able to accurately follow a fast-moving subject and keep them near the center of the frame while the subject is within the boundary set up by the beacons.

Decentralized Systems for Ground & Aerial Vehicles (DSGAV)

Mingda Ma, Alvin Sun, Jialiang Zhang

Featured Project

# Team Members

* Yixiao Sun (yixiaos3)

* Mingda Ma (mingdam2)

* Jialiang Zhang (jz23)

# Problem Statement

Autonomous delivery over drone networks has become a new trend that can save a tremendous amount of labor. However, it is very difficult to scale up because multi-rotor collaboration is inefficient, especially when the drones are carrying payload. To actually deploy such a system in big cities, we could take advantage of the large ground vehicle networks that already exist with rideshare companies like Uber and Lyft. The roof of an automobile has plenty of space to hold regular-size packages with magnets, and the drone network can then optimize for flight time and efficiency while factoring in ground vehicle plans. While dramatically increasing delivery coverage and efficiency, such a strategy raises a challenging problem: docking a drone onto a moving ground vehicle.

# Solution

Given the scope and time limitation, we aim to tackle one particular component of this project. We will implement a decentralized multi-agent control system that synchronizes a ground vehicle and a drone when they are in close proximity. Assumptions such as knowledge of vehicle states will be made, since this project aims at a proof of concept of a core challenge; as we progress, we aim to lift as many of those assumptions as possible. The infrastructure of the lab, drone, and ground vehicle will be provided by our kind sponsor Professor Naira Hovakimyan. When the drone approaches the target and acquires visual contact with the ground vehicle, it will automatically send a docking request through an RF module. The RF receiver on the vehicle will then automatically turn on assistant devices, such as specific LED light patterns, that aid motion synchronization between the ground and aerial vehicles. The ground vehicle will also periodically send its locally planned path to the drone so that the drone can predict the ground vehicle's trajectory a couple of seconds into the future. This prediction helps the drone stay within close proximity to the ground vehicle by optimizing against a reference trajectory.
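To make the path-sharing idea concrete, here is a minimal sketch, not the team's actual implementation, of how the drone side might turn a received list of timestamped waypoints into a predicted ground vehicle position a fixed look-ahead into the future using linear interpolation. The `Waypoint` layout, message contents, and look-ahead value are assumptions for illustration.

```c
/* Sketch: predict where the ground vehicle will be `lookahead` seconds from
 * `now`, given the locally planned path it broadcast as timestamped waypoints.
 * Linear interpolation between bracketing waypoints; struct layout and numbers
 * are illustrative assumptions, not the real protocol. */
#include <stdio.h>

typedef struct {
    double t;     /* time the vehicle plans to reach this point, seconds */
    double x, y;  /* planned position, metres in a shared ground frame   */
} Waypoint;

static Waypoint predict(const Waypoint *path, int n, double now, double lookahead) {
    double t = now + lookahead;
    if (t <= path[0].t)     return path[0];
    if (t >= path[n - 1].t) return path[n - 1];        /* clamp to path ends */
    for (int i = 0; i + 1 < n; ++i) {
        if (t <= path[i + 1].t) {                      /* bracketing segment */
            double a = (t - path[i].t) / (path[i + 1].t - path[i].t);
            Waypoint p = { t,
                           path[i].x + a * (path[i + 1].x - path[i].x),
                           path[i].y + a * (path[i + 1].y - path[i].y) };
            return p;
        }
    }
    return path[n - 1];
}

int main(void) {
    /* Example planned path received over RF (hypothetical values). */
    Waypoint path[] = { {0.0, 0.0, 0.0}, {2.0, 4.0, 0.5}, {4.0, 8.0, 2.0} };
    Waypoint p = predict(path, 3, 1.0, 2.0);           /* 2 s look-ahead at t = 1 s */
    printf("reference for drone: (%.2f, %.2f) at t = %.1f s\n", p.x, p.y, p.t);
    return 0;
}
```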

### The hardware components include:

Provided by Research Platforms

* A drone

* A ground vehicle

* A camera

Developed by our team

* An LED based docking indicator

* RF communication modules (XBee)

* Onboard compute and communication microprocessor (STM32F4)

* Standalone power source for RF module and processor

# Required Circuit Design

We will integrate the power source, the RF communication module, and the LED tracking assistant together with our microcontroller on our PCB. The circuit will also automatically trigger the tracking assistant to facilitate the docking operation. This circuit is designed specifically to demonstrate the drone's ability to precisely track and dock onto the ground vehicle.
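One possible sketch of the trigger logic described above is shown below, assuming a one-byte docking-request message arriving over the XBee UART link and a GPIO-driven LED pattern. The function names are hypothetical stand-ins, not the real STM32F4 drivers or vendor API.

```c
/* Sketch of the on-vehicle trigger logic: when a docking request arrives on
 * the RF (XBee) link, turn on the LED docking indicator pattern. */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define MSG_DOCK_REQUEST 0xD0   /* assumed one-byte message code */

/* Hypothetical hardware-abstraction stubs; the real versions would talk to the
 * XBee UART and LED driver on the STM32F4. Here they just simulate one request. */
static bool rf_uart_read_byte(uint8_t *out) { *out = MSG_DOCK_REQUEST; return true; }
static void led_indicator_start_pattern(void) { puts("LED docking pattern ON"); }

/* Trigger logic: when a docking request arrives over RF, light the indicator. */
static void poll_once(bool *docking_active) {
    uint8_t b;
    if (rf_uart_read_byte(&b) && b == MSG_DOCK_REQUEST && !*docking_active) {
        led_indicator_start_pattern();   /* assists the drone's visual tracking */
        *docking_active = true;
    }
}

int main(void) {
    bool docking_active = false;
    poll_once(&docking_active);          /* in firmware this would run in a loop */
    return 0;
}
```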

# Criterion for Success -- Stages

1. When the ground vehicle is moving slowly in a straight line, the drone can autonomously take off from an arbitrary location and end up following it within close proximity.

2. Drone remains in close proximity while the ground vehicle is slowly turning (or navigating arbitrarily at slow speed)

3. Drone can dock autonomously onto the ground vehicle that is moving slowly in a straight line

4. Drone can dock autonomously onto the ground vehicle that is slowly turning

5. Increase the speed of the ground vehicle and successfully perform tracking and/or docking

6. Drone can pick up packages while flying synchronously to the ground vehicle

We consider the project complete at stage 3. The stages after that are advanced features that we will pursue depending on actual progress.
