People

TA Office Hours

Held weekly in the senior design lab (ECEB 2070/2072).
Name Time
Zipeng "Bird" Wang Monday 11am-12pm
Kexin Hui Monday 2pm-3pm
Eric Clark Monday 3pm-4pm
Vignesh Sridhar Tuesday 11am-12pm
Jackson Lenz Tuesday 2pm-3pm
Michael Fatina Tuesday 3pm-4pm
Dongwei Shi Wednesday 10am-11am
Yuchen He Wednesday 11am-12pm
Jacob Bryan Wednesday 2pm-3pm
Luke Wendt Wednesday 3pm-4pm
Jose Sanchez Vicarte Thursday 11am-12pm
John Capozzo Thursday 2pm-3pm
Daniel Frei Thursday 3:15pm-4:15pm
James Norton Friday 10am-11am
Sam Sagan Friday 11am-12pm
Daniel Gardner Friday 2pm-3pm

Spring 2017 Instructors

Name Office Email Area
Prof. Scott Carney (Instructor)
4061 Beckman
carney@illinois.edu
physics, optics, human potential, leveraging synergies of our core competencies, visioning new verbs
Prof. Arne Fliflet (Instructor)
3056
afliflet@illinois.edu
microwave generation and applications
Prof. Seth Hutchinson (Instructor)

seth@illinois.edu
Prof. Michael Oelze (Instructor)
ECEB 2056
oelze@illinois.edu
Biomedical Imaging, Acoustics, Nondestructive Testing
Prof. Karl Reinhard (Instructor)

reinhrd2@illinois.edu
Jacob Bryan (TA)
3038 ECEB
jdbryan2@illinois.edu
Speech signal processing, DSP, control systems, machine learning, optimization
John Capozzo (TA)
1538 Beckman
capozzo2@illinois.edu
I specialize in Information Communication systems and pipelines. Proficient in: Time Series Analysis, HPC, Machine Learning, Exploratory Data Analysis, Data Reliability & Reproducibility, Statistics, Modeling Biological Data
Eric Clark (TA)

ejclark2@illinois.edu
Computer Architecture, IC Design, FPGA, Embedded Systems, Distributed Systems and Networks
Michael Fatina (TA)

fatina2@illinois.edu
embedded systems, microcontrollers, robotics, PCB design, EMG
Daniel Frei (TA)

dfrei2@illinois.edu
Security, Networks
Daniel Gardner (TA)

dgardne2@illinois.edu
RF, wireless communications, circuits, PCB design, embedded systems
Yuchen He (TA)
ECEB 3038
he33@illinois.edu
Robotics, Machine Learning, Computer Vision, Natural Language Processing, Control Systems
Kexin Hui (TA)

khui3@illinois.edu
Digital/Analog IC Design
Jackson Lenz (TA)

jdlenz2@illinois.edu
Electric Machines/Drives, Power Electronics, Batteries
James Norton (TA)
139 Coordinated Science Laboratory
jnorton4@illinois.edu
Physiological Monitoring (EEG, EMG, etc.), Digital Signal Processing, Cognitive Neuroscience
Sam Sagan (TA)
425 CSL
ssagan2@illinois.edu
Electrostatic Discharge, RF, Machine Learning, Audio, Embedded Systems, Electromechanical Systems
Jose Sanchez Vicarte (TA)
ECEB 3038
josers2@illinois.edu
Computer Architecture, Machine Learning, Embedded Systems, Linux
Dongwei Shi (TA)

dshi9@illinois.edu
Computer Vision, Parallel Computing, Robotics
Vignesh Sridhar (TA)

vsridha2@illinois.edu
Sensor Fusion, Virtual Reality, Economic Forecasting
Zipeng Wang (TA)
206H Talbot
zwang87@illinois.edu
Atmospheric Remote Sensing, CubeSat, Analog Computing, Evolving Hardware, Satellite Attitude Determination and Control.
Luke Wendt (TA)

wendt1@illinois.edu
Adaptive Control, Robotics, Computer Vision, Machine Learning, Optimization (http://luke-a-wendt.info)

Other Important People

Name Office Email Area
Scott McDonald 1049 ECE Building samcdona@illinois.edu Machine Shop
Mark Smart 1041 ECE Building mwsmart@illinois.edu Electronic Services Shop
Casey Smith 3064 ECE Building cjsmith0@illinois.edu Instructional Lab Coordinator
Waltham Smith 1041 ECE Building wlsmith@illinois.edu Electronic Services Shop
Skot Wiedmann 1041 ECE Building swiedma2@illinois.edu Electronic Services Shop
Prof. Scott Carney 4061 Beckman Institute carney@illinois.edu Course Director

Cloud-controlled quadcopter

Anuraag Vankayala, Amrutha Vasili

Featured Project

Idea:

To build a GPS-assisted, cloud-controlled quadcopter, for consumer-friendly aerial photography.

Design/Build:

We will be building a quad from the frame up. Each of the four motors will have an electronic speed controller to balance the craft and handle the control inputs, received from an 8-bit microcontroller (the autopilot, or AP), that are required for flight. The firmware will be tweaked slightly to allow the flight modes our project specifically requires. A companion computer such as the Erle Brain will be connected to the AP and to the cloud (EC2). We will build a codebase for the flight controller to navigate the quad. This involves exchanging messages, per the MAVLink spec for sUAS, between the companion computer and the AP to poll sensor data, voltage information, etc. The companion computer will also talk to the cloud via a UDP port to receive requests and process them with our code. Users request media capture through a phone app that talks to the cloud over an internet connection.
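
A minimal sketch of the companion-computer loop described above, assuming pymavlink for the MAVLink link, a serial connection to the AP on /dev/ttyAMA0, and a cloud service that sends plain-text requests to a UDP port; the port number, the 2 Hz stream rate, and the handle_request placeholder are illustrative assumptions, not fixed parts of the design.

import socket
from pymavlink import mavutil

AP_LINK = '/dev/ttyAMA0'  # serial link to the autopilot (assumed)
CLOUD_PORT = 9000         # UDP port the cloud (EC2) sends requests to (assumed)

# Connect to the AP and wait for its heartbeat.
ap = mavutil.mavlink_connection(AP_LINK, baud=57600)
ap.wait_heartbeat()

# Ask the AP to stream telemetry (sensor data, voltage, etc.) at 2 Hz.
ap.mav.request_data_stream_send(
    ap.target_system, ap.target_component,
    mavutil.mavlink.MAV_DATA_STREAM_ALL, 2, 1)

# Non-blocking UDP socket for requests arriving from the cloud.
cloud = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
cloud.bind(('0.0.0.0', CLOUD_PORT))
cloud.setblocking(False)

def handle_request(command):
    # Placeholder: dispatch a cloud request, e.g. trigger a media capture.
    print('cloud request:', command)

while True:
    # Poll voltage information from the AP (SYS_STATUS reports battery in mV).
    status = ap.recv_match(type='SYS_STATUS', blocking=False)
    if status is not None:
        print('battery: %.2f V' % (status.voltage_battery / 1000.0))

    # Service any pending request from the cloud over UDP.
    try:
        data, addr = cloud.recvfrom(1024)
        handle_request(data)
    except BlockingIOError:
        pass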

Why is it worth doing:

There is currently no consumer-friendly solution that lets anyone capture aerial photographs of themselves, their family, or a nearby event with a simple tap on a phone. Present-day off-the-shelf alternatives are relatively expensive and require owning and carrying bulky equipment such as the quad and its remote. Our idea allows for safe and responsible use of drones: our proposed solution is autonomous, has several safety features, is context-aware (terrain information, no-fly zones, NOTAMs, etc.), and integrates seamlessly with the federal airspace.

End Product:

Quads that are ready for the connected world: from the user's standpoint they fly autonomously, perform maneuvers safely, and present a very simple UI to the common user. Specifically, quads that are deployed on a user's demand, without the hassle of ownership.

Similar products and comparison:

Current solutions include RTF (ready-to-fly) quads such as the DJI Phantom and the Kickstarter project Lily, both of which are heavily user-dependent or user-centric. The Phantom requires you to carry a bulky remote with multiple antennas, and its flight radius can be reduced by interference from nearby conditions. Lily requires the user to carry a tracking device and cannot shoot a subject that is not the user; its maximum altitude of 15 m above the user keeps it below the tree line and prone to crashes.

Our solution differs in several ways. It intends to be location- and/or event-centric. We propose that users need not own quads: a user can capture a moment with just a phone. As long as the user is in the service area and the weather conditions are permissible, safety and the knowledge needed to control the quad are abstracted away. The only question left to the user is what should be in the picture at a given time.

Project Videos