Project #45
TA: Vishal Dayalan
# Title
RF-based Long-range Motion Recognition and Communication System

Team Members:
- Joe Luo (luo42)
- Zekai Zhang (zekaiz2)
- James Tian (zeyut2)

# Problem
As society accelerates into the digital age, demand rises for richer, more immersive forms of communication, especially as the world recovers from the pandemic. We have witnessed the emergence of many novel products, albeit with mixed reception, that embrace this idea, such as VR games, the Metaverse, and holographic projection. It is apparent that we now crave information that goes beyond text, video, and sound: something mobile, three-dimensional, and interactive, such as transferring and reproducing motion across a long distance. Today it is impossible to, say, shake hands with a friend located a mile away.

Beyond peer-to-peer communication, a long-range motion communication system can be useful in a variety of scenarios. In a classroom, when a physics teacher wants to dig into a relatively abstract concept, like lattice structure or electron concentration in materials, chalk and board are often the only reliable tools. It would be far more engaging for both lecturer and learner to see a 3D presentation of the topic that moves and changes on command. Likewise, controlling an extended robot arm with a handheld controller can be ineffective, since not everyone is acquainted with controller maneuvers. Control becomes much more intuitive if one can move the arm in real time using reference points placed on limb joints that match those on the machine. Other applications include, but are not limited to, workplace security, drone navigation, and smart homes, all of which can be made simpler with a motion recognition and communication system.

# Solution
We propose a two-terminal system in which one terminal reads motion data and sends the encoded information over an RF link to the other terminal, which decodes the data and reproduces the motion in real time through 3D software simulation or mechanical actuation such as a motor. Building upon a previous project that mapped MEMS sensor measurements to 3D animation, we will again work with discrete accelerometer and gyroscope measurements at an appropriate sampling rate to ensure seamless recreation even in a wireless setting.

Two PCBs are needed, one for each terminal. While most components will be mounted directly on the PCBs, the IMUs (motion sensors) will preferably connect to the rest of the system via STEMMA QT cables or long wires for flexibility of placement. Freed from the confines of the circuit board, IMUs in free space can adapt to more situations and capture a wider variety of motions, much like the handheld controllers that accompany most VR headsets.

# Solution Components
## Power Supply:
Both the RF components and the MEMS sensors require 3.3 V. We will use AA batteries for the power supply, with the onboard LDO regulator of the selected microcontroller regulating the output voltage. Both terminals use the same power supply design.

## Motion-capturing Subsystem
This subsystem consists of two or more IMUs in free space. The LSM6DSO32 6-DoF accelerometer and gyroscope IC will fulfill this need. Since we aim to use STEMMA QT connectors, the I2C communication protocol is favored. This subsystem will likely reuse some of the code and concepts developed in a previous project, available here: https://wiki.illinois.edu/wiki/pages/viewpage.action?pageId=785286420.
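As a sketch of the sensor data path, each LSM6DSO32 axis is read as a pair of two's-complement register bytes that must be merged and scaled. The full-scale settings assumed below (±4 g accelerometer, ±250 dps gyroscope) and their sensitivity constants are illustrative choices, not final design parameters:

```cpp
#include <cstdint>

// Combine the low/high register bytes of one axis into a signed 16-bit sample.
int16_t combine_bytes(uint8_t lo, uint8_t hi) {
    return static_cast<int16_t>((static_cast<uint16_t>(hi) << 8) | lo);
}

// Convert a raw accelerometer sample to g, assuming the +/-4 g full-scale
// setting (0.122 mg/LSB sensitivity).
float accel_to_g(int16_t raw) {
    return raw * 0.122f / 1000.0f;
}

// Convert a raw gyroscope sample to degrees/s, assuming the +/-250 dps
// full-scale setting (8.75 mdps/LSB sensitivity).
float gyro_to_dps(int16_t raw) {
    return raw * 8.75f / 1000.0f;
}
```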

## RF Transmitter Subsystem
Measurements from the LSM6DSO32 registers will be sent to an Arduino or another SPI- and I2C-enabled microcontroller, which processes the readings and packs them into a 16- to 32-bit code with at least 4 bits for position and 4 bits for rotation on each of the three axes. The code will then be sent over SPI to an RFM69HCW transceiver with an external antenna connector and transmitted wirelessly.
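One possible packing scheme is shown below: each axis gets 4 bits of position followed by 4 bits of rotation, filling 24 bits of a 32-bit word and leaving the upper byte free for a header or sequence number. The exact field layout is an assumption; the proposal only fixes the 4-bit-per-field budget.

```cpp
#include <cstdint>

// Pack quantized motion data into one 32-bit word. For each of the three
// axes, bits [axis*8, axis*8+3] hold position and bits [axis*8+4, axis*8+7]
// hold rotation; the top 8 bits are left clear.
uint32_t pack_motion(const uint8_t pos[3], const uint8_t rot[3]) {
    uint32_t word = 0;
    for (int axis = 0; axis < 3; ++axis) {
        word |= static_cast<uint32_t>(pos[axis] & 0xF) << (axis * 8);
        word |= static_cast<uint32_t>(rot[axis] & 0xF) << (axis * 8 + 4);
    }
    return word;
}
```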

## RF Receiver Subsystem
Data sent by the transmitter will be recovered by another RFM69HCW module connected to a second SPI-enabled microcontroller. This microcontroller will unpack the code, extract the motion information, and prepare the data for motion recreation.
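The receiver's unpacking step simply mirrors the transmitter's bit layout. This sketch assumes the same hypothetical layout as above: 4 bits of position then 4 bits of rotation per axis, starting at the least-significant byte.

```cpp
#include <cstdint>

// Recover the per-axis 4-bit position and rotation fields from a received
// 32-bit word, inverting the assumed transmit-side packing.
void unpack_motion(uint32_t word, uint8_t pos[3], uint8_t rot[3]) {
    for (int axis = 0; axis < 3; ++axis) {
        pos[axis] = (word >> (axis * 8)) & 0xF;
        rot[axis] = (word >> (axis * 8 + 4)) & 0xF;
    }
}
```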

## Motion Reproduction Subsystem
The motion reproduction subsystem ideally consists of two major parts: software and hardware.
The software component receives data through the serial port from the microcontroller at the receiver's end. The Unity 3D engine decodes the information and animates a 3D model in a fashion similar to the previous project by Joe Luo mentioned above (https://wiki.illinois.edu/wiki/pages/viewpage.action?pageId=785286420).
The hardware component is a mechanical integration able to recreate simple directional movements, such as 3D-printed structures or pulse-controlled continuous-rotation servo motors (FS90R) that rotate in a 2D plane in proportion to the rotation measured by the MEMS sensors.
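For the servo path, the FS90R is a continuous-rotation servo: a pulse width near 1500 µs holds it still, and pulses toward 1000 µs or 2000 µs drive it in opposite directions at increasing speed. The following sketch maps a sensed rotation rate to a pulse width; the ±250 dps input range used for scaling is an assumption:

```cpp
#include <algorithm>

// Map a sensed rotation rate (degrees/s) to an FS90R control pulse width in
// microseconds: 1500 us stops the servo, 1000/2000 us are full speed in each
// direction. Rates beyond the assumed +/-250 dps range are clamped.
int rate_to_pulse_us(float dps) {
    const float max_dps = 250.0f;
    float clamped = std::max(-max_dps, std::min(max_dps, dps));
    return static_cast<int>(1500.0f + clamped * (500.0f / max_dps));
}
```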

# Criterion For Success

- Positional and rotational motion is captured by the MEMS sensors and converted to human-readable data.
- The RF system functions properly and transmits the motion data between devices at least 0.8 to 1 mile apart.
- For the minimum success criterion, the reproduction system at the receiving end faithfully repeats, in software, the motions performed at the transmitting end. If this is met, a 3D-printed and/or motor-controlled hardware system can be built to further explore the project's potential.

S.I.P. (Smart Irrigation Project)

Jackson Lenz, James McMahon

Featured Project

Our project is to be a reliable, robust, and intelligent irrigation controller for use in areas where reliable weather prediction, water supply, and power supply are not found.

Upon completion of the project, our device will be able to determine the moisture level of the soil, the water level in a water tank, and the temperature, humidity, insolation, and barometric pressure of the environment. It will process these observations to determine whether rain can be expected soon. Comparing this prediction with the dampness of the soil and the amount of water in reserve, it will either trigger a command to begin irrigation or maintain a command not to irrigate. This device will let farmers make much more efficient use of precious water while avoiding dehydrating crops to death.
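The decision logic described above can be sketched as a simple threshold rule. The threshold values and the falling-pressure rain heuristic here are illustrative assumptions, not measured parameters:

```cpp
// Decide whether to irrigate: water only when the soil is dry, the tank
// holds a usable reserve, and rain is not expected. All constants are
// placeholder assumptions for illustration.
bool should_irrigate(float soil_moisture_pct, float tank_level_pct,
                     float pressure_drop_hpa) {
    const float dry_threshold = 30.0f;  // below this, the soil needs water
    const float min_reserve   = 10.0f;  // never drain the tank completely
    const bool rain_expected  = pressure_drop_hpa > 3.0f;  // falling pressure
    return soil_moisture_pct < dry_threshold &&
           tank_level_pct > min_reserve &&
           !rain_expected;
}
```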

In developing nations, power is also a concern because it is not as readily available as it is in the United States. For that reason, our device will incorporate several amp-hours of energy storage in the form of rechargeable, maintenance-free, lead-acid batteries. These batteries will charge while grid power is available and discharge when it is not, allowing uninterrupted control of irrigation. When grid power is available, it will power the device; at other times, the batteries will supply the required power.

The project is titled S.I.P. because it will reduce wasted water and will be very power efficient (by extremely conservative estimates, able to run for 70 hours without input from the grid), thus sipping on both power and water.

We welcome all questions and comments regarding our project in its current form.

Thank you all very much for your time and consideration!