# Project 13: Automated Closet

Team Members (NetID): Nikhil Parmar (nparmar2) | Shania Arora (shaniaa2) | William Doherty (wjd2)
TA: Jack Li
Documents: design_document3.pdf, final_paper1.pdf, other1.zip, presentation1.pptx, proposal1.pdf


Problem

Choosing what to wear is an unavoidable part of everyone’s routine, and today there is no real automation of this process. Many people spend considerable time each morning deciding what to wear. In Illinois especially, the weather varies drastically from day to day, so people constantly have to consult weather apps to judge what they should wear. This is a hassle and takes away precious minutes of sleep in the morning.


Solution Overview

We propose an automated closet system that selects an outfit for the user based on clothing type, weather, color, and other metrics. The user will enter data about their clothes when the closet system is first set up. For every article of clothing, a database will store details such as color, size, type of clothing, sleeve length, and fit.
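The per-garment record described above could be held in a small database. Below is a minimal sketch using SQLite; the table name and columns are illustrative assumptions based on the attributes listed in the proposal, not a finalized schema:

```python
import sqlite3

# Hypothetical schema for the clothing database; column names are
# illustrative assumptions drawn from the attributes the proposal lists.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE clothing (
        id            INTEGER PRIMARY KEY,
        slot          INTEGER NOT NULL,   -- position (tab) on the conveyor
        type          TEXT NOT NULL,      -- 'shirt', 'pants', 'dress', ...
        color         TEXT NOT NULL,
        size          TEXT,
        sleeve_length TEXT,               -- 'short', 'long', NULL for bottoms
        fit           TEXT                -- 'slim', 'regular', 'loose'
    )
""")
conn.execute(
    "INSERT INTO clothing (slot, type, color, size, sleeve_length, fit) "
    "VALUES (?, ?, ?, ?, ?, ?)",
    (3, "shirt", "blue", "M", "long", "regular"),
)
shirts = conn.execute(
    "SELECT color, slot FROM clothing WHERE type = 'shirt'"
).fetchall()
print(shirts)  # → [('blue', 3)]
```

Storing the conveyor slot alongside the garment attributes lets the decision software translate a chosen outfit directly into motor positions.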
Although motorized closet systems exist, they lack decision-making capabilities and rely on the user to decide as the system moves. Our solution adds this functionality in an easy-to-use package.


Solution Components (Hardware - HW, Software - SW)

[HW] Clothes Rotation System
Our system to rotate clothes will consist of an overhead conveyor to which we will attach modular “tabs”, each of which will carry a single hanger. A motor attached to the frame of the system will supply the power used to move clothes around the system.

[HW] Motor Driver
We will require a driver that can convert an analog control signal from the control board to a −12 V to +12 V output range, given a supplied reference voltage.
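The intended transfer function can be sketched as a simple linear mapping. The sketch below assumes a 0–3.3 V control signal centered on the mid-rail reference; the actual signal range and gain depend on the driver chosen:

```python
def driver_output(v_ctrl: float, v_ref: float = 3.3, v_rail: float = 12.0) -> float:
    """Map a 0..v_ref analog control voltage linearly onto -v_rail..+v_rail.

    Assumed linear transfer with v_ref/2 as the zero-output midpoint;
    the real driver's gain and reference are a design decision.
    """
    return (v_ctrl / v_ref - 0.5) * 2 * v_rail

print(driver_output(0.0))    # → -12.0 (full reverse)
print(driver_output(1.65))   # → 0.0  (motor stopped)
print(driver_output(3.3))    # → 12.0 (full forward)
```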

[HW] Power
We will use a power distribution panel to convert AC input from the wall to the 12 V supply required to drive the motor of the clothes rotation system and a 5 V supply for the microcontroller and other logic elements.

[SW] Decision Making
Using limit switches, we can determine where each article of clothing is located in the closet system. Based on details about each article such as color and fit, we will compare each candidate outfit against the last several chosen outfits to make sure we don’t choose the same or a similar outfit again. The chosen outfit is then presented to the user via the UI.
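The repeat-avoidance step above can be sketched as a check against a short history of recent outfits. The similarity rule below (matching top and bottom colors counts as "too similar") is an illustrative assumption, not the proposal's actual algorithm:

```python
from collections import deque

HISTORY_LEN = 5  # "last several chosen outfits"

def too_similar(candidate: dict, previous: dict) -> bool:
    # Assumed similarity rule: same top color AND same bottom color.
    return (candidate["top_color"] == previous["top_color"]
            and candidate["bottom_color"] == previous["bottom_color"])

def choose_outfit(candidates, history: deque):
    """Return the first candidate not too similar to any recent outfit."""
    for outfit in candidates:
        if not any(too_similar(outfit, past) for past in history):
            history.append(outfit)
            return outfit
    return None  # nothing sufficiently different left to suggest

history = deque(maxlen=HISTORY_LEN)
history.append({"top_color": "blue", "bottom_color": "black"})
pick = choose_outfit(
    [{"top_color": "blue", "bottom_color": "black"},   # repeat: skipped
     {"top_color": "red", "bottom_color": "khaki"}],
    history,
)
print(pick)  # → {'top_color': 'red', 'bottom_color': 'khaki'}
```

A bounded `deque` keeps the history at a fixed length automatically, which matches the "last several outfits" behavior.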

[SW] UI
Buttons will be the primary interface for delivering the chosen outfit. There will most likely be three: ‘choose outfit’, ‘clothing accepted’, and ‘outfit rejected’. The ‘choose outfit’ button triggers the system to start assembling an outfit. We will present the shirt/dress first, then the bottoms, and then any accessories. If the user likes the presented piece, they take it off the hanger and hit ‘clothing accepted’ to indicate it was picked up, which advances the system to the next article in the outfit. If they do not like the article, they hit ‘outfit rejected’ and return any clothes they have taken, and the decision algorithm chooses different clothes until a match is made. We may add an LCD screen to display any warnings or error messages that occur during operation.
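The three-button flow above amounts to a small state machine. A minimal sketch follows; the stage ordering (top, then bottoms, then accessories) comes from the text, but the exact transitions are our assumption:

```python
from enum import Enum, auto

class Stage(Enum):
    IDLE = auto()       # waiting for 'choose outfit'
    TOP = auto()        # shirt/dress presented
    BOTTOM = auto()
    ACCESSORY = auto()

NEXT = {Stage.TOP: Stage.BOTTOM, Stage.BOTTOM: Stage.ACCESSORY,
        Stage.ACCESSORY: Stage.IDLE}

def press(stage: Stage, button: str) -> Stage:
    """Advance the UI state machine on a button press (assumed transitions)."""
    if button == "choose_outfit" and stage is Stage.IDLE:
        return Stage.TOP                  # start presenting an outfit
    if button == "clothing_accepted" and stage is not Stage.IDLE:
        return NEXT[stage]                # advance to the next garment
    if button == "outfit_rejected" and stage is not Stage.IDLE:
        return Stage.TOP                  # re-run selection from the top
    return stage                          # ignore invalid presses

s = Stage.IDLE
for b in ["choose_outfit", "clothing_accepted",
          "clothing_accepted", "clothing_accepted"]:
    s = press(s, b)
print(s)  # → Stage.IDLE (full outfit accepted)
```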


Criteria For Success

The automated closet rotates with all the clothes loaded and appropriately suggests outfits until there are no more clothes left to choose from or until the user is happy with the outfit presented.

VoxBox Robo-Drummer

Craig Bost, Nicholas Dulin, Drake Proffitt


Featured Project

Our group proposes to create a robot drummer that responds to human voice "beatboxing" input, via a conventional dynamic microphone, and translates the input into the corresponding drum-hit performance. For example, if the human user issues a bass-kick voice sound, the robot will recognize it and strike the bass drum, and likewise for the hi-hat/snare and clap. Our design will cover at least 3 different drum hit types (bass hit, snare hit, clap hit) and respond with minimal latency.

This would involve amplifying the analog signal (as dynamic mics produce fairly low-level signals), which would be sampled by a dsPIC33F DSP/MCU (or comparable chipset) and processed for trigger-event recognition. This entails applying Short-Time Fourier Transform (STFT) analysis to provide spectral content data to our event-detection algorithm (i.e., recognizing the "control" signal from the human user). The MCU functionality of the dsPIC33F would be used to relay the trigger commands to the actuator circuits controlling the robot.

The robot in question would be small, about the size of a ventriloquist dummy. The "drum set" would be scaled accordingly (think pots and pans, like a child would play with). Actuators would likely be solenoid-based rather than motor-based.

Beyond these minimal capabilities, we would add analog prefiltering of the input audio signal, and amplification of the drum hits, as bonus features if the development and implementation process goes better than expected.
