# 53: Camera Triggering System

**TA**: Qingyu Li

**Documents**: design_document2.pdf, final_paper1.pdf, photo1.PNG, photo2.PNG, presentation1.pptx, proposal1.pdf
# **Team Members**:
- Daniel Salazar (dsalaz8)
- Matt Pattermann (mpatte3)
- Edwin Ortega (eorteg27)

# **Problem**

Climate change continues to affect the development of crops such as corn. To study this, the Institute for Genomic Biology at UIUC collects data on the plants, such as height, color, and size. This is crop phenotyping, and it is currently done by hand. The team led by John Hart has already developed an image-processing tool based on computer vision to analyze pictures of these crops; the remaining problem is automating the image capture. The GoPro cameras currently in use take one image per hour, capture images while the crops are moving in the wind, and shoot under inconsistent lighting. The result is poorly captured images that the computer-vision tool struggles to analyze.

# **Solution**


The proposed solution is a camera triggering system mounted on the rain intercept facility. The camera will be a GoPro in an enclosure that protects it from the weather. To minimize the number of pictures, the system will provide variable time-lapse control. Based on the lighting conditions present, it will adjust camera parameters (ISO, shutter speed, aperture, etc.) to maximize image quality. This is where the sensors come in: they will detect wind speed, sunlight, and rain so the system can respond accordingly. High wind, for example, causes blurry images, so imaging will be delayed until conditions calm. If possible, the system will also interact with the rain facility itself, closing the facility when dangerous wind levels are detected. Captured images are then uploaded via Wi-Fi to be analyzed by the Institute for Genomic Biology and the Center for Digital Agriculture.


# **Solution Components**

_Wind Sensor_

The device will require a wind sensor, or anemometer, able to detect and measure the speed of the wind in its surrounding environment. Specifically, we are looking towards implementing a hot-wire anemometer. Essentially, this involves heating an element to a constant temperature and measuring the power necessary to maintain that temperature; the wind velocity can then be inferred from this power via a calibration curve. Hot-wire anemometers are analog devices, so the wind sensor will need to be connected to an A/D converter so that the controller can read and calibrate its output.
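As an illustration of this conversion, the following host-side Python sketch maps a 12-bit ADC reading to heater power and inverts King's law (P = a + b·vⁿ, the standard hot-wire model) to estimate wind speed. All constants here (a, b, n, the reference voltage, the sense resistor) are placeholder values that would come from calibration against a reference anemometer:

```python
def power_from_adc(adc_counts, vref=3.3, adc_max=4095, sense_ohms=10.0):
    """Estimate heater power (mW) from a 12-bit ADC reading across a
    current-sense resistor. All component values are illustrative."""
    v = adc_counts / adc_max * vref
    return (v * v / sense_ohms) * 1000.0  # P = V^2 / R, in mW

def wind_speed_from_power(power_mw, a=1.2, b=0.8, n=0.5):
    """Invert King's law, P = a + b * v**n, to estimate wind velocity (m/s).

    a, b, n are placeholder calibration constants, not measured values.
    """
    if power_mw <= a:
        return 0.0  # at or below the zero-flow heating power
    return ((power_mw - a) / b) ** (1.0 / n)
```

In practice the controller would average several ADC samples before converting, since hot-wire readings are noisy.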

_Sunlight Sensor_

The device will require a sunlight sensor to detect and measure the intensity of light in its surrounding environment. This data will then be output to the main controller. For this subsystem, we currently plan to implement a light intensity meter using a light-dependent resistor (LDR), since many resources are available for such devices. However, we have not ruled out photodiode- or phototransistor-based circuits; more research is needed to fairly compare the various methods of light intensity measurement.
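A minimal sketch of how the controller might interpret an LDR in a voltage divider. The fixed resistor value and the R10/gamma model constants below are assumptions standing in for datasheet values, not measurements:

```python
def ldr_resistance(adc_counts, adc_max=4095, r_fixed=10_000.0):
    """LDR in the upper leg of a divider with a fixed 10 kOhm resistor:
    Vout = Vcc * R_fixed / (R_ldr + R_fixed), read by a 12-bit ADC."""
    ratio = adc_counts / adc_max  # Vout / Vcc
    if ratio <= 0:
        return float("inf")       # total darkness: divider pulled to ground
    return r_fixed * (1.0 / ratio - 1.0)

def estimate_lux(r_ldr, r10=8_000.0, gamma=0.7):
    """Common two-parameter LDR model: R = R10 * (lux / 10) ** (-gamma).
    R10 (resistance at 10 lux) and gamma are placeholder datasheet values."""
    return 10.0 * (r10 / r_ldr) ** (1.0 / gamma)
```

This is one reason the LDR approach is attractive: the math is simple enough to run on any microcontroller, at the cost of the LDR's slow response and wide part-to-part variation.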

_Rain Sensor_

The device will require a rainwater sensor that is able to detect the presence and severity of rainfall, and that outputs its data to the main controller. We are currently looking at a water sensor that uses rainwater to complete an electronic circuit, so that its resistance drops as more water collects on it. However, this method keeps reporting rain well after rainfall has stopped, so we are investigating other rainfall detection methods as well.
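A sketch of how the controller might map the sensor's resistance to a rainfall severity level; the thresholds below are illustrative assumptions that would be tuned empirically:

```python
# Placeholder thresholds: lower resistance means more water bridging the traces.
RAIN_LEVELS = [
    (5_000, "heavy"),      # resistance <= 5 kOhm
    (20_000, "moderate"),  # resistance <= 20 kOhm
    (100_000, "light"),    # resistance <= 100 kOhm
]

def classify_rain(resistance_ohms):
    """Map the rain sensor's resistance to a severity label.
    Anything above the largest threshold is treated as dry."""
    for limit, label in RAIN_LEVELS:
        if resistance_ohms <= limit:
            return label
    return "dry"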

_Wi-Fi Interface_

The device will require the ability to communicate with the user remotely. This comes in three forms:

- uploading the images captured by the GoPro to a predetermined destination, such as a server;
- sending the readings and camera settings collected by the controller to the user, to be displayed on the remote user interface;
- receiving user inputs from the user interface and relaying them back to the main controller.

We plan to carry all of these remote connections between the system and the user over Wi-Fi.
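As a sketch of the telemetry half of this interface, assuming a simple JSON message format (the field names and device ID below are our own placeholders, not a defined protocol):

```python
import json
import time

def telemetry_payload(readings, camera_settings, device_id="cam-trigger-01"):
    """Bundle current sensor readings and camera settings into the JSON
    message the system would push over Wi-Fi to the user interface.
    device_id and all field names are assumptions for illustration."""
    return json.dumps({
        "device": device_id,
        "timestamp": int(time.time()),  # seconds since the epoch
        "readings": readings,           # e.g. {"wind_mps": 2.1, "lux": 8000}
        "camera": camera_settings,      # e.g. {"iso": 200, "shutter_s": 0.004}
    })
```

Keeping the message a flat JSON document makes it easy to log server-side and render in the user interface without a custom parser.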

_Controller_

The device will be managed by a controller that uses the wind, sunlight, and rain readings to adjust camera parameters such as ISO, shutter speed, and aperture to maximize image quality. The controller will also provide the time-lapse functionality, taking images on a schedule configured through the remote interface. If the user wishes, they can manually adjust the camera settings on the remote interface; the controller will receive these instructions via the Wi-Fi interface and adjust the camera accordingly. In high wind, the controller will skip imaging to prevent blurred images (and, if implemented, will tell the rain facility to close the retractable roof). The same will be done in response to significant rainfall.
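The trigger and exposure logic above can be sketched as follows; the wind limit, light bins, and ISO/shutter pairs are illustrative assumptions, not calibrated settings:

```python
def should_capture(wind_mps, rain_level, wind_limit=5.0):
    """Skip imaging while wind or significant rain would blur the frame.
    wind_limit (m/s) and the accepted rain levels are placeholders."""
    return wind_mps < wind_limit and rain_level in ("dry", "light")

def exposure_for_light(lux):
    """Pick a rough ISO / shutter-speed pair for the ambient light level.
    The bins and values are illustrative, not tuned GoPro settings."""
    if lux > 10_000:  # bright sun
        return {"iso": 100, "shutter_s": 1 / 1000}
    if lux > 1_000:   # overcast
        return {"iso": 200, "shutter_s": 1 / 250}
    return {"iso": 800, "shutter_s": 1 / 60}  # dawn / dusk
```

On each scheduled trigger, the controller would evaluate `should_capture` first, and only then apply `exposure_for_light` and fire the camera.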

_User Interface_

The user interface will be how the user monitors the environment within the facility/greenhouse. It will display all the data collected by the sensors, as well as the settings of the camera. If the user desires, the camera settings can be manually adjusted from the user interface. This communication of data will be facilitated by the Wi-Fi interface.

_Power Supply_

The power supply depends on the rain facility and what is within its range. If a wall outlet is available nearby and can be used within the rain facility without interfering with the images, it will be the primary supply. Otherwise, a 12 V battery will be used instead, which requires no wired connection. This will be resolved through further communication with John Hart regarding the rain facility.


# **Criterion For Success**

- Provide variable time-lapse control and trigger the camera multiple times per day

- Create a wind sensor that accurately measures wind speed and delays imaging until conditions are calm

- Automatically adjust camera parameters in response to environmental conditions such as sunlight and rain

- Create a waterproof enclosure that protects and mounts the system

- Create a user interface that allows remote control of the system

- Interface with the local network to offload images via Wi-Fi

- Interact with the rain facility by closing the retractable roof when wind persists (not mandatory, but can be implemented)
