# Oxygen Delivery Robot

Team Members:
- Rutvik Sayankar (rutviks2)
- Aidan Dunican (dunican2)
- Nazar Kalyniouk (nazark2)

# Problem

Children's interstitial and diffuse lung disease (ChILD) is a collection of disorders that cause thickening of the interstitium (the tissue that extends throughout the lungs) due to scarring, inflammation, or fluid buildup. This thickening eventually impairs a patient's ability to breathe and deliver enough oxygen to the blood.

Many children live with these conditions and require supplemental oxygen for their daily activities. This hampers the mobility and freedom of young children, diminishing their growth and confidence. It also places an added burden on parents, who must not only care for their child but also directly manage the oxygen tank as their child moves around.


# Solution

Given the absence of relevant solutions in the current market, our project aims to ease the challenges faced by parents and give young children the freedom to explore their surroundings. As a proof of concept for an affordable solution, we propose a three-wheeled omnidirectional mobile robot capable of supporting filled oxygen tanks in the size range of M-2 to M-9, which weigh 1 - 6 kg (2.2 - 13.2 lbs) when full. Due to time constraints in the class and the objective of demonstrating the feasibility of a low-cost device, we plan to construct the robot at roughly 50% of the proposed scale. Consequently, our robot will handle simulated tanks weighing 0.5 - 3 kg (1.1 - 6.6 lbs).

The robot will be built around three subsystems, with two localization methods providing redundancy to enhance child safety. The first subsystem covers the drivetrain and chassis of the robot; the second uses ultra-wideband (UWB) transceivers to triangulate the child's location relative to the robot in indoor environments; and the final subsystem uses a camera connected to a Raspberry Pi, leveraging OpenCV to improve directional accuracy in tracking the child.

As part of the design, we intend to create a PCB in the form of a Raspberry Pi hat, facilitating convenient access to information generated by our computer vision system. The PCB will incorporate essential components for motor control, with an STM microcontroller serving as the project's central processing unit. This microcontroller will manage the drivetrain, analyze UWB localization data, and execute corresponding actions based on the information obtained.

# Solution Components

## Subsystem 1: Drivetrain and Chassis

This subsystem encompasses the drivetrain for the three-omni-wheel robot. It features three H-bridge channels (L298N drivers; each IC contains two H-bridges, so we plan to lay out the hardware such that we can switch to a four-omni-wheel drivetrain if need be) and three AndyMark 245 RPM 12 V gearmotors equipped with two-channel encoders. The microcontroller will control the H-bridges. The three-omni-wheel drive system facilitates zero-radius turning and simplifies the robot's design, reducing costs by minimizing the number of wheels. An omni-wheel has outer rollers that spin freely about axes in the plane of the wheel, enabling sideways sliding while the wheel propels forward or backward without slip; the resulting drive kinematics are sketched below. Alongside the drivetrain, the chassis will incorporate three HC-SR04 ultrasonic sensors (or three bumper-style limit switches, like a Roomba), providing a redundant system to detect obstacles in the robot's path.
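As a sanity check on the drive geometry, here is a minimal inverse-kinematics sketch for a three-omni-wheel drivetrain in Python. The mounting angles, chassis radius, and wheel radius are placeholder values, not our final dimensions.

```python
import numpy as np

# Minimal inverse-kinematics sketch for a three-omni-wheel ("kiwi") drive.
# Dimensions below are placeholders, not measured values for our chassis.
WHEEL_ANGLES = np.deg2rad([0.0, 120.0, 240.0])  # wheel mount angles (assumed)
CHASSIS_RADIUS = 0.15   # m, center to wheel contact point (assumed)
WHEEL_RADIUS = 0.024    # m, e.g. a 48 mm omni-wheel (assumed)

def wheel_speeds(vx: float, vy: float, omega: float) -> np.ndarray:
    """Map a body velocity command (m/s, m/s, rad/s) to the three wheel
    angular velocities (rad/s)."""
    # Each wheel drives tangentially to the chassis circle, so its linear
    # speed is the body velocity projected onto that tangent direction
    # plus the rotation contribution CHASSIS_RADIUS * omega.
    tangential = (-np.sin(WHEEL_ANGLES) * vx
                  + np.cos(WHEEL_ANGLES) * vy
                  + CHASSIS_RADIUS * omega)
    return tangential / WHEEL_RADIUS

# Example: slide sideways at 0.3 m/s with no rotation.
print(wheel_speeds(0.3, 0.0, 0.0))
```

On the robot, the same mapping would run on the STM microcontroller, with the outputs fed to encoder-based speed loops for each motor.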

## Subsystem 2: UWB Localization
This subsystem implements a module based on the DW1000 Ultra-Wideband (UWB) transceiver IC, similar to the technology found in Apple AirTags. We opt for UWB over Bluetooth due to its significantly superior accuracy: UWB measures distance directly using time-of-flight (ToF), rather than inferring it from mere signal strength as Bluetooth does.

This project will require three transceiver ICs, with two acting as "anchors" fixed on the robot. The distance to the third transceiver (the "tag") will always be calculated relative to the anchors. With the transceivers we are currently considering, the anchors must be at least 18 inches apart at full transmit power to report a range, and at least 10 inches apart at minimum power. For the tag, we plan to create a compact PCB containing the transceiver, a small coin-cell battery, and the other components needed for proper transceiver operation. This device can be attached to a child's shirt using Velcro.
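To illustrate the two-anchor geometry, here is a minimal Python sketch that recovers the tag's 2D position in the robot frame from the two measured ranges, assuming the anchors sit symmetrically at (-b, 0) and (+b, 0); the baseline is a placeholder based on the ~18-inch spacing above. The two range circles intersect at two mirror-image points, so we keep the solution in front of the robot.

```python
import math

# Two-anchor range intersection sketch. Anchor placement and baseline
# are assumptions for illustration, not our final layout.
BASELINE = 0.46  # m (~18 in), anchor-to-anchor distance (assumed)

def tag_position(d1: float, d2: float):
    """Return (x, y) of the tag in the robot frame given its ranges
    d1, d2 (meters) to the anchors at (-b, 0) and (+b, 0)."""
    b = BASELINE / 2.0
    # Subtracting the two circle equations eliminates y and gives x.
    x = (d1 ** 2 - d2 ** 2) / (4.0 * b)
    y_sq = d1 ** 2 - (x + b) ** 2
    if y_sq < 0:
        return None  # inconsistent ranges (noise); caller should filter
    return x, math.sqrt(y_sq)  # keep the in-front (y >= 0) solution

print(tag_position(2.0, 2.1))  # tag ahead of the robot, left of center
```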

## Subsystem 3: Computer Vision
This subsystem involves using the OpenCV library on a Raspberry Pi equipped with a camera. By employing pre-trained models, we aim to enhance the reliability and directional accuracy of tracking a young child. The plan is to perform all camera-related processing on the Raspberry Pi and subsequently translate the information into a directional command for the robot if necessary. Given that most common STM chips feature I2C buses, we plan to communicate between the Raspberry Pi and our microcontroller through this bus.
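As an illustration of the Pi-side pipeline, here is a hedged Python sketch using OpenCV's built-in HOG pedestrian detector as a stand-in for whichever pre-trained model we settle on, with the smbus2 library for the I2C link. The I2C address and the one-byte command encoding are assumptions for illustration only.

```python
import cv2
from smbus2 import SMBus

STM32_ADDR = 0x42              # hypothetical I2C address of our STM chip
LEFT, CENTER, RIGHT = 0, 1, 2  # assumed one-byte direction commands

# Stock OpenCV HOG person detector; a stand-in for the final model.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture(0)      # Pi camera exposed as /dev/video0
with SMBus(1) as bus:          # I2C bus 1 on the Raspberry Pi header
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
        if len(boxes) == 0:
            continue           # no detection; UWB keeps driving the robot
        x, _, w, _ = boxes[0]
        center = x + w / 2.0   # horizontal center of the detected person
        third = frame.shape[1] / 3.0
        cmd = LEFT if center < third else RIGHT if center > 2 * third else CENTER
        bus.write_byte(STM32_ADDR, cmd)  # nudge the robot's heading over I2C
```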

## Division of Work:
We already have a three-omni-wheel robot; it is slightly smaller than our 50% scale, but it allows us to begin work on UWB localization and computer vision immediately until a new iteration can be made. Simultaneously, we will reconfigure the drivetrain to ensure compatibility with the additional systems we plan to implement and the ability to move the desired weight. To streamline the process, we will allocate specific tasks to individual group members: one focusing on UWB, another on computer vision, and the third on the drivetrain. This division of work will allow parallel progress on the different aspects of the project.

# Criterion For Success

- Omni-wheel drivetrain that can drive in a specified direction.
- Close-range object detection system working (can detect objects inside the path of travel).
- UWB localization down to an accuracy of < 1 m.

## Current considerations

We are currently in discussion with Greg at the machine shop about switching to a four-wheeled omni-wheel drivetrain due to the increased weight capacity and integrity of the chassis. To address the safety concerns of this particular project, we are planning to implement the following safety measures:
- Limit the robot's maximum speed to < 5 mph.
- Use empty tanks/simulated weights; at no point will we work with compressed oxygen. Our goal is only to prove that we can build a robot that can follow a small child.
- Design the base of the robot to be bottom-heavy and wide to prevent tipping.

Decentralized Systems for Ground & Aerial Vehicles (DSGAV)

Mingda Ma, Alvin Sun, Jialiang Zhang

Featured Project

# Team Members

* Yixiao Sun (yixiaos3)

* Mingda Ma (mingdam2)

* Jialiang Zhang (jz23)

# Problem Statement

Autonomous delivery over drone networks has become one of the new trends that can save a tremendous amount of labor. However, it is very difficult to scale up due to the inefficiency of multi-rotor collaboration, especially when the drones are carrying payload. To actually deploy it in big cities, we could take advantage of the large ground-vehicle networks that already exist with rideshare companies like Uber and Lyft. The roof of an automobile has plenty of space to hold regular-size packages with magnets, and the drone network can then optimize for flight time and efficiency while factoring in ground-vehicle plans. While dramatically increasing delivery coverage and efficiency, such a strategy raises the challenging problem of docking drones onto moving ground vehicles.

# Solution

We aim to tackle a particular component of this project given the scope and time limitations. We will implement a decentralized multi-agent control system that synchronizes a ground vehicle and a drone when in close proximity. Assumptions such as knowledge of vehicle states will be made, as this project aims at a proof of concept of a core challenge. However, as we progress, we aim to lift as many of those assumptions as possible. The lab infrastructure, drone, and ground vehicle will be provided by our kind sponsor, Professor Naira Hovakimyan. When the drone approaches the target and gains visual contact with the ground vehicle, it will automatically send a docking request through an RF module. The RF receiver on the vehicle will then automatically turn on its assistant devices, such as specific LED light patterns, which aid motion synchronization between the ground and aerial vehicles. The ground vehicle will also periodically send out locally planned paths so the drone can predict the ground vehicle's trajectory a couple of seconds into the future. This prediction helps the drone stay within close proximity to the ground vehicle by optimizing against a reference trajectory.
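As an illustration of how the drone could consume the vehicle's broadcast plan, below is a minimal Python sketch that interpolates the planned waypoints and linearly extrapolates past the planning horizon. The message format and the constant-velocity extrapolation are assumptions, not the team's stated method.

```python
import numpy as np

def predict_position(times, points, t_query):
    """Predict the ground vehicle's (x, y) at time t_query from its
    broadcast plan: times is (N,) seconds, points is (N, 2) meters."""
    times, points = np.asarray(times, float), np.asarray(points, float)
    if t_query <= times[-1]:
        # Inside the planned horizon: interpolate each coordinate.
        return np.array([np.interp(t_query, times, points[:, i]) for i in (0, 1)])
    # Past the horizon: extrapolate with the last segment's velocity.
    v = (points[-1] - points[-2]) / (times[-1] - times[-2])
    return points[-1] + v * (t_query - times[-1])

# Vehicle plan: 1 m/s along x; predict 2 s past the last waypoint.
print(predict_position([0.0, 1.0, 2.0], [[0, 0], [1, 0], [2, 0]], 4.0))
```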

### The hardware components include:

Provided by Research Platforms

* A drone

* A ground vehicle

* A camera

Developed by our team

* An LED-based docking indicator

* RF communication modules (XBee)

* Onboard compute and communication microprocessor (STM32F4)

* Standalone power source for RF module and processor

# Required Circuit Design

We will integrate the power source, the RF communication module, and the LED tracking assistant together with our microcontroller on our PCB. The circuit will also automatically trigger the tracking assistant to facilitate its further operations. This circuit is designed specifically to demonstrate the drone's ability to precisely track and dock onto the ground vehicle.

# Criterion for Success -- Stages

1. When the ground vehicle is moving slowly in a straight line, the drone can autonomously take off from an arbitrary location and end up following it within close proximity.

2. Drone remains in close proximity when the ground vehicle is slowly turning (or navigating arbitrarily at slow speed)

3. Drone can dock autonomously onto the ground vehicle while it is moving slowly in a straight line

4. Drone can dock autonomously onto the ground vehicle while it is slowly turning

5. Increase the speed of the ground vehicle and successfully perform tracking and/or docking

6. Drone can pick up packages while flying synchronously with the ground vehicle

We consider the project complete at stage 3. The stages after that are advanced features, depending on actual progress.
