# Project 21: Campus Tour Guide by AI-Powered Autonomous System

**Team Members:** Bob Jin, Hao Ren, Weiang Wang, Yuntong Gu

**TA:** Simon Hu

**Documents:** design_document1.pdf, design_document2.pdf, design_document3.pdf, final_paper2.pdf, final_paper3.pdf, proposal1.pdf
This [link](https://accurate-ringer-067.notion.site/Campus-Tour-Guide-by-AI-Powered-Autonomous-System-f4d17e16378740e2948f5bef4afd7315?pvs=4) contains the HTML version of our project description.



# Team Members


* Hao Ren 3200110807 haor2
* Xuanbo Jin 3200110464 xuanboj2
* Weiang Wang 3200111302 weiangw2
* Yuntong Gu 3200110187 yuntong7



> 💡 Note: this doc provides an overview of the project “Campus Tour Guide by AI-Powered Autonomous System”. We start by restating the problem, then present our proposal and solution, and finally draft an initial plan to help build the `v0` solution.

# 👀 Problem

Entering a place for the first time, such as a university campus, can be quite challenging. Knowing where you are, how to reach your destination, how to optimize your route, and which factors will affect that route is complicated. A real-time interactive system that guides people through this process is needed. Such systems have been demonstrated, but they have not scaled: they are not open-sourced, their hardware is neither standardized nor affordable, and their interaction is not versatile enough to adapt to ever-changing applications. A cheap and versatile solution is needed.

------

# 💭 Proposal

## Solution Overview

Our solution uses an autonomous UAV to guide clients. A sensor module senses the client and the environment, including obstacles and the drone’s own location, while a control unit orchestrates the resulting series of tasks. The solution is cheap, open-sourced, and versatile, meeting the need for a generalized and sustainable long-term solution for our campus and many other applications.

## Solution Components

Our solution contains the following parts: a sensor subsystem, a control subsystem, a mobility subsystem, and an interconnect module.
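
As a rough illustration only, the sketch below shows one way the four subsystems could be wired together in a single guidance loop. All class and method names here (`SensorSubsystem`, `ControlSubsystem`, and so on) are hypothetical placeholders, not the final interfaces.

```python
# Hypothetical sketch of the overall guidance loop; names are placeholders,
# not a committed interface.
from dataclasses import dataclass

@dataclass
class SensorReading:
    drone_gps: tuple        # (lat, lon) of the drone
    client_offset_m: float  # estimated distance to the guided person
    obstacles: list         # coarse obstacle positions relative to the drone

class SensorSubsystem:
    def read(self) -> SensorReading:
        """Poll GPS, camera, and rangefinders (stubbed here)."""
        raise NotImplementedError

class ControlSubsystem:
    def plan(self, reading: SensorReading, goal: tuple) -> tuple:
        """Return the next waypoint toward `goal`, avoiding obstacles."""
        raise NotImplementedError

class MobilitySubsystem:
    def fly_to(self, waypoint: tuple) -> None:
        """Forward the waypoint to the drone's flight controller."""
        raise NotImplementedError

def guidance_loop(sensors, control, drone, goal):
    """One cycle: sense -> plan -> act; the interconnect module carries the data."""
    reading = sensors.read()
    waypoint = control.plan(reading, goal)
    drone.fly_to(waypoint)
```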

### Sensor Subsystem

- Identify obstacles
- Identify the person being guided and exclude other people (a rough tracking sketch follows this list)
- Report the drone's GPS location
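
One simple, hypothetical way to keep following the same person is to track the detection closest to the last known position of the guided client. The `Detection` structure and the jump threshold below are assumptions for illustration, not a chosen vision pipeline.

```python
# Hypothetical person-tracking heuristic: among all detected people, keep
# the one closest to where the guided client was last seen.
from dataclasses import dataclass
import math

@dataclass
class Detection:
    x: float  # position in the drone's local frame, meters
    y: float

def select_guided_person(detections, last_known, max_jump_m=2.0):
    """Return the detection most likely to be the guided client, or None
    if nobody is close enough to the last known position."""
    if not detections:
        return None
    closest = min(detections,
                  key=lambda d: math.hypot(d.x - last_known.x, d.y - last_known.y))
    jump = math.hypot(closest.x - last_known.x, closest.y - last_known.y)
    return closest if jump <= max_jump_m else None
```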

### Control Subsystem

- Plan and deploy routes (see the planning sketch below)
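
The route-planning step could start from something as simple as a breadth-first search on a coarse occupancy grid; the grid representation below is an assumption for this draft, not the final planner.

```python
# Hypothetical route planner: breadth-first search on a coarse occupancy grid.
# grid[r][c] == 1 marks an obstacle cell; start and goal are (row, col) tuples.
from collections import deque

def plan_route(grid, start, goal):
    """Return a list of grid cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    parents = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in parents:
                parents[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```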

### Mobility Subsystem

- A drone

### Interconnect Module

- Provide inter-communication among the control unit, the peripheral sensors, and the drone (a frame-format sketch follows this list)
- Supply power to the sensor module and the control unit
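
As an illustration of what the inter-communication could look like, the sketch below packs a waypoint command into a small fixed-size binary frame. The field layout and magic byte are assumptions for this draft, not a decided protocol.

```python
# Hypothetical command frame for the interconnect link: a fixed-size binary
# message carrying a target latitude/longitude and a speed limit.
import struct

FRAME_FORMAT = "<Bddf"   # marker byte, latitude, longitude, max speed (m/s)
FRAME_MAGIC = 0xA5       # placeholder marker for the start of a frame

def encode_waypoint(lat, lon, max_speed):
    """Serialize a waypoint command into bytes for the serial link."""
    return struct.pack(FRAME_FORMAT, FRAME_MAGIC, lat, lon, max_speed)

def decode_waypoint(frame):
    """Parse a received frame; raises ValueError on a bad marker byte."""
    magic, lat, lon, max_speed = struct.unpack(FRAME_FORMAT, frame)
    if magic != FRAME_MAGIC:
        raise ValueError("unexpected frame marker")
    return lat, lon, max_speed
```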

## Criteria for Success

### Milestone 1

- Drone can be controlled and moved independently
- GPS module can report the drone's location
- Sensors can be powered

### Milestone 2

- Drone can be controlled by the control subsystem
- Control subsystem can receive signals from the GPS module and sensors
- Routes can be output (not necessarily by moving the drone)

### Milestone 3

- Without obstacles, the system can follow the human
- Without obstacles, the system can fly from A to B and slow down or stop when the human falls too far behind (a distance-gating sketch follows this list)
- System can identify obstacles and plan a route to avoid them
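
The slow-down/stop behavior could be a simple speed scaling based on the distance to the guided person; the distance thresholds and cruise speed below are placeholder values chosen for illustration.

```python
# Hypothetical distance gating: scale the drone's forward speed by how far
# behind the guided person is, stopping entirely if they fall too far back.
def speed_limit(distance_to_person_m,
                comfortable_m=3.0, too_far_m=8.0, cruise_speed=2.0):
    """Return the allowed forward speed (m/s) for the current follow distance."""
    if distance_to_person_m >= too_far_m:
        return 0.0                      # person too far behind: hover and wait
    if distance_to_person_m <= comfortable_m:
        return cruise_speed             # person is keeping up: full speed
    # Linearly taper the speed between the comfortable and too-far distances.
    span = too_far_m - comfortable_m
    return cruise_speed * (too_far_m - distance_to_person_m) / span
```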

### Milestone 4

- With obstacles present, the system can fly from A to B and slow down or stop when the human falls too far behind
- Start/destination pairs can be selected, e.g. 5 pairs of (A, B) are available (see the waypoint-table sketch below)
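
A minimal way to expose a fixed set of routes is a lookup table of named start/destination pairs; the route names and coordinates below are invented placeholders, not surveyed campus points.

```python
# Hypothetical route table: selectable (start, destination) pairs,
# each given as (latitude, longitude). Coordinates are placeholders only.
ROUTES = {
    "gate_to_library":   ((31.2700, 121.0000), (31.2712, 121.0021)),
    "library_to_dining": ((31.2712, 121.0021), (31.2705, 121.0035)),
}

def lookup_route(name):
    """Return the (start, destination) pair for a named route."""
    if name not in ROUTES:
        raise KeyError(f"unknown route: {name}")
    return ROUTES[name]
```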

### Milestone 5 [optional]

- A simple web app that sends requests to the system (a minimal endpoint sketch follows this list)
- System can receive spoken instructions, determine a destination, and lead the client there
- Support an interactive chat mode to help the client understand the surroundings
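
For the optional web app, a minimal HTTP endpoint could accept a route request and hand it to the control subsystem. The sketch below assumes Flask and a hypothetical `start_tour` hook on our side; neither is a committed choice.

```python
# Hypothetical web entry point (assumes Flask): accepts a named route and
# forwards it to the control subsystem via a placeholder start_tour() hook.
from flask import Flask, jsonify, request

app = Flask(__name__)

def start_tour(route_name):
    """Placeholder: hand the request to the control subsystem."""
    print(f"requested tour: {route_name}")

@app.route("/tour", methods=["POST"])
def request_tour():
    data = request.get_json(silent=True) or {}
    route_name = data.get("route")
    if not route_name:
        return jsonify(error="missing 'route' field"), 400
    start_tour(route_name)
    return jsonify(status="accepted", route=route_name), 202

if __name__ == "__main__":
    app.run(port=8000)  # development server only
```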

## Alternatives

*SKYCALL* currently provides a similar guided-tour experience at MIT. However, that project is not open-sourced, and its hardware is neither cheap nor easy to maintain. Our solution differs in that we provide:

- A cheap solution
- An open-sourced solution (software + hardware), with each component documented
- Unnecessary functionality giving way to generality
- A design versatile enough to support our campus (which differs drastically from MIT's)

------

# 🛫 Division of Work

- Xuanbo Jin: Xuanbo excels at software work. He will handle the algorithm part of the design and also take part in firmware integration.
- Yuntong Gu: Yuntong's strong background in electrical engineering makes him a great candidate to test the validity of different hardware and connect it to the system. He will also help with the communication between components.
- Weiang Wang: With his strong background in electrical engineering, Weiang will actively work on the communication and interfaces between components.
- Hao Ren: Hao can take on assorted tasks. He will actively work on the software and firmware, explore the validity of possible directions, and iterate project versions appropriately. He will also organize the roadmap and update it frequently, examining the priority of each part through experimentation and analysis.
