# First Person Virtual Reality Interface with RC Car

**Project Members:** Deniz Yıldırım (dy2), Erik Jacobson (erikj2), Sang Baek Han (shan67), Weihang Liang

**Problem:**

Remote-controlled cars are fun, but they often lack immersion and can be hard to control because the driver may lose line of sight. Toy RC vehicles in general have seen little progress in user experience: most use similar controllers, and the few with camera displays show only what is directly in front of the car. Additionally, in the VR community there is comparatively little interest in or research on using VR as an I/O device for robots, which makes this project a good opportunity to explore new ways the technology can be used. While our project is about RC cars, the idea challenges the usual approach to virtual reality and aims to widen its use cases. We want to explore the idea that while VR can place us in immersive virtual environments (which is why it is named "virtual reality"), it can also place us in real but unreachable environments, such as inside an RC car. The same concept could give people a more immersive presence in parts of the world that robots can reach far more easily than humans.

**Solution Overview:**

We will combine a VR headset with a camera module that captures a 180-degree view to give players the perception that they are a small person inside the RC car, making the experience more immersive and fun. Drivers would never lose their sense of where the car is, because they would feel as if they were inside it and could survey their surroundings simply by looking around. A steering wheel and hands could be rendered on top of the video feed so that players control the car by holding the virtual wheel with their VR controllers. While some camera-equipped RC cars are on the market, they fall short on immersion because they retain traditional remote controllers and often do not let users see the car's surroundings.

**Solution Components:**

Remote Control Subsystem
- We will design [H-bridge and PWM circuits](https://www.acmesystems.it/pcb_pwm) to control the speed of the RC car's DC motors. The H-bridge and PWM circuits are connected to the [ATmega328](http://ww1.microchip.com/downloads/en/DeviceDoc/ATmega48A-PA-88A-PA-168A-PA-328-P-DS-DS40002061A.pdf) microprocessor, which receives control input from the controller over Bluetooth via an [HC-05](https://www.amazon.com/dp/B00INWZRNC) module connected to the microprocessor.
- A smartphone app will be used to control the RC car over a Bluetooth connection. This will serve as a prototyping tool until we integrate the VR gear; in the final stage, the VR controller will take its place.
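The control logic that would run on the ATmega328 reduces to mapping a signed throttle command onto two H-bridge direction pins and an 8-bit PWM duty cycle. The sketch below (in Python as a stand-in for the C firmware; the function name, pin ordering, and [-1, 1] throttle range are our assumptions, not part of the design) shows one possible mapping:

```python
def throttle_to_hbridge(throttle):
    """Map a signed throttle command in [-1.0, 1.0] to H-bridge inputs.

    Returns (in1, in2, duty): in1/in2 drive the H-bridge direction
    pins, and duty is an 8-bit PWM compare value (0-255), as the
    ATmega328's 8-bit timers expect.
    """
    throttle = max(-1.0, min(1.0, throttle))  # clamp out-of-range input
    duty = int(abs(throttle) * 255)           # magnitude -> PWM duty
    if throttle > 0:
        return (1, 0, duty)                   # forward
    elif throttle < 0:
        return (0, 1, duty)                   # reverse
    return (0, 0, 0)                          # both low: coast
```

Setting both direction inputs low coasts the motor; a real implementation might instead drive both high to brake, depending on the H-bridge chosen.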

Video Transmission using Wi-Fi Subsystem
- We will use a [Raspberry Pi 3 Model A+](https://www.raspberrypi.org/products/raspberry-pi-3-model-a-plus/) to receive image data from the camera and send it to the VR gear over Wi-Fi. This Raspberry Pi has a Broadcom VideoCore IV MP2 400 MHz GPU and dual-band 2.4 GHz / 5 GHz IEEE 802.11ac Wi-Fi, which helps reduce the delay of wireless video transmission.
- We will use a [USB camera with a 180-degree fisheye lens](https://www.amazon.com/dp/B00LQ854AG/) to provide the 180-degree view. Since we are using a fisheye lens, the raw image data is curvilinear; it will be converted to a rectilinear image using [OpenCV's fisheye camera model](https://docs.opencv.org/master/db/d58/group__calib3d__fisheye.html).
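To illustrate why the conversion is needed: OpenCV's fisheye model is built on the ideal equidistant projection, where the image radius grows linearly with the angle off the optical axis (r = f·θ), while a rectilinear pinhole image has r = f·tan(θ). The sketch below maps a rectilinear image point to its fisheye counterpart under that ideal model only (no distortion coefficients); in practice we would calibrate and undistort with the `cv2.fisheye` functions rather than hand-roll this.

```python
import math

def rectilinear_to_fisheye(x, y, f):
    """Map a point on a rectilinear (pinhole) image plane to the ideal
    equidistant fisheye image, both centered on the principal point.

    (x, y): pinhole-plane pixel coordinates relative to the center.
    f: focal length in pixels. Returns fisheye-plane coordinates.
    """
    r = math.hypot(x, y)
    if r == 0:
        return (0.0, 0.0)                 # optical axis maps to itself
    theta = math.atan(r / f)              # angle off the optical axis
    r_fish = f * theta                    # equidistant projection r = f*theta
    scale = r_fish / r
    return (x * scale, y * scale)
```

Inverting this mapping per output pixel (look up each rectilinear pixel's source position in the fisheye frame) is exactly what an undistortion remap table does.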

VR Headgear Facing Direction Subsystem
- We will use an [Oculus](https://www.oculus.com/?locale=en_US) VR headset for this project. We will most likely borrow one from CS 498 or the [UGL](https://www.library.illinois.edu/mc/lt/emergingtech/); if not, we may buy a used unit. We will program against the Oculus [SDK](https://developer.oculus.com/).
- The VR gear will receive image data from the Raspberry Pi over Wi-Fi and display the video to the user in real time. It will continuously receive the entire 180-degree view, and the direction the user is facing will determine which section of that view is displayed.
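Selecting the displayed section amounts to sliding a fixed-width window across the 180-degree panorama based on head yaw. A minimal sketch of that selection, where the panorama width, 90-degree field of view, and pixel counts are hypothetical values of our choosing:

```python
def view_window(yaw_deg, fov_deg=90, image_width=3840):
    """Pick the horizontal pixel range of a 180-degree panorama to show
    for a given head yaw.

    yaw_deg: yaw relative to straight ahead, positive to the right.
    The window is clamped so it never leaves the [-90, +90] degree
    panorama. Returns (left_px, right_px) column indices.
    """
    half = fov_deg / 2
    yaw = max(-90 + half, min(90 - half, yaw_deg))  # keep window in frame
    px_per_deg = image_width / 180.0
    center = (yaw + 90) * px_per_deg                # gaze column in panorama
    left = int(center - half * px_per_deg)
    right = int(center + half * px_per_deg)
    return (left, right)
```

The yaw itself would come from the headset's orientation tracking via the Oculus SDK; only the windowing arithmetic is shown here.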

Power Subsystem
- We will use a 6 V Li battery as the sole power source on the car. The H-bridge, PWM circuits, and DC motors will run at 6 V; a 5 V voltage regulator will supply the remaining parts.
- The VR headgear will be powered through its connection to the PC.

**Criterion for Success:**
- Display high-resolution camera input in real time, with less than 1 s of delay.
- The real-time camera image can be displayed through the VR headgear.
- The camera feed covers a 180-degree view that can be navigated by user input.
- The direction the VR headgear is facing serves as the user input that rotates the displayed view within the 180-degree range.

# S.I.P. (Smart Irrigation Project)

Jackson Lenz, James McMahon

Our project will be a reliable, robust, and intelligent irrigation controller for use in areas where reliable weather prediction, water supply, and power supply are not found.

Upon completion of the project, our device will be able to determine the moisture level of the soil, the water level in a water tank, and the temperature, humidity, insolation, and barometric pressure of the environment. It will perform some processing on the observed environmental factors to determine whether rain can be expected soon. Comparing this prediction with the dampness of the soil and the amount of water in reserve, it will either issue a command to begin irrigation or maintain a command not to irrigate the fields. This device will let farmers make much more efficient use of precious water while avoiding dehydrating crops to death.
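The decision step described above can be sketched as a simple guard chain; the thresholds and normalized sensor ranges here are hypothetical tuning parameters, not values from the project:

```python
def should_irrigate(soil_moisture, tank_level, rain_expected,
                    moisture_threshold=0.30, min_tank_level=0.10):
    """Decide whether to start irrigation.

    soil_moisture and tank_level are normalized to [0, 1];
    rain_expected is the boolean output of the weather prediction.
    Irrigate only if the soil is dry, there is water in reserve,
    and rain is not expected soon.
    """
    if soil_moisture >= moisture_threshold:
        return False   # soil is damp enough already
    if tank_level <= min_tank_level:
        return False   # protect the water reserve
    if rain_expected:
        return False   # let the expected rain do the work
    return True
```

A real controller would also add hysteresis (separate start/stop thresholds) so the pump does not rapidly toggle around a single threshold.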

In developing nations, power is also a concern because it is not as readily available as it is here in the United States. For that reason, our device will incorporate several amp-hours of energy storage in the form of rechargeable, maintenance-free lead-acid batteries. These batteries will charge while grid power is available and discharge when it is not, allowing uninterrupted control of irrigation. When grid power is available, it will power our device; at other times, the batteries will supply the required power.

The project is titled S.I.P. because it will reduce wasted water and will be very power-efficient (by extremely conservative estimates, able to run for 70 hours without input from the grid), thus sipping on both power and water.
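A runtime estimate of this kind follows from simple amp-hour arithmetic. The sketch below shows the calculation; the example numbers are one hypothetical combination that happens to yield a 70-hour figure, not the project's actual battery capacity or measured current draw:

```python
def runtime_hours(capacity_ah, usable_fraction, avg_current_a):
    """Estimate controller runtime on battery alone.

    capacity_ah: rated battery capacity in amp-hours.
    usable_fraction: fraction of capacity drawn before recharge
        (lead-acid batteries should not be deeply discharged).
    avg_current_a: average current draw of the controller, in amps.
    """
    return capacity_ah * usable_fraction / avg_current_a

# Illustration only: a 7 Ah battery at 50% depth of discharge,
# powering a 50 mA average load.
hours = runtime_hours(7, 0.5, 0.05)
```

Limiting depth of discharge is what makes the estimate conservative: the same battery quoted at full capacity would appear to last twice as long.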

We welcome all questions and comments regarding our project in its current form.

Thank you all very much for your time and consideration!