Project 33: Chess Playing Robot with Computer Vision

Honorable Mention

Team Members: Jose Flores, Joshua Hur, Zack Alonzo
TA: Zicheng Ma
# Chess Playing Robot with Computer Vision
Team: jhur22, joseaf3, zalonzo2

## Problem
Our project addresses the need for a tangible, interactive chess-playing device that lets users play against a chess AI in the physical world rather than on a digital platform. Designed for both beginners and advanced players, the chess-playing robot provides an engaging alternative to mobile apps, supporting skill development and strategic thinking in a hands-on manner.

## Solution
We plan to develop an autonomous chess-playing robot that eliminates the need for a human opponent by incorporating our own chess algorithm with selectable difficulty levels. A magnet-and-motor system beneath the board will move the computer opponent's pieces autonomously, while the human player simply picks up and places their own pieces. After each human move (submitted with a button press), the robot will capture an image of the board with an overhead camera and identify every piece by its color, which maps to a specific piece type and side. From this updated board state, our algorithm will determine the optimal move for the chosen difficulty level, command the magnet-and-motor system beneath the board to move the intended piece, and then wait for the human player's next move.
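The control flow above can be sketched as a simple turn loop. Everything here is a placeholder: the function names (`capture_board`, `detect_pieces`, `choose_move`, `drive_magnet`, `wait_for_button`) are hypothetical stand-ins for the three subsystems, not real APIs from the project.

```python
# Hypothetical top-level turn loop tying the subsystems together.
# The callables are placeholders for: camera capture (Subsystem 2),
# color-based piece detection (Subsystem 2), the chess engine
# (Subsystem 3), and the magnet/motor driver (Subsystem 1).

def play(engine_difficulty, capture_board, detect_pieces,
         choose_move, drive_magnet, wait_for_button):
    """Run the robot's side of the game until no move remains."""
    while True:
        wait_for_button()                 # human presses "submit"
        image = capture_board()           # top-down camera frame
        board = detect_pieces(image)      # color-based board state
        move = choose_move(board, engine_difficulty)
        if move is None:                  # game over (win/loss/stalemate)
            break
        drive_magnet(move)                # slide the piece via the magnet
```

The loop deliberately re-captures the board every turn rather than trusting an internal model alone, which is what makes cheat detection (see Criteria for Success) possible.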

## Solution Components
The project contains three major subsystems to accomplish its task.
- Magnetic Chess Board
- Computer Vision-based Chess Board Visualizer
- AI Chess Algorithm

## Subsystem 1: Magnetic Chess Board
A version of this board already exists in the machine shop from a previous student project, so this part is mostly complete. However, our goal will still be to improve upon the design of the board, as the current board has some issues with the main magnet and its consistency in grabbing the chess pieces.
The chess board itself uses 3 stepper motors: 2 for one axis (AXIS1) and 1 for the other axis (AXIS2). Two motors drive AXIS1 to keep AXIS2 from tilting and becoming offset. Attached to AXIS2 is a magnet responsible for moving pieces on the computer's side of the board. When the computer executes a ply, code running on the microcontroller moves the magnet to the piece's starting square, drives the magnet's control line high to grab the piece, slides it across the board to the destination square, and then drives the line low to release the piece and finish the ply.

Because the pieces slide flush against the board, the pieces or the board must be modified to avoid collisions (in chess, knights can move over other pieces). Our plan is to center every piece in its tile and guide the moving piece along the lines and borders between tiles. To give the pieces enough clearance, we are considering two methods:
- Method 1: Enlarge the board to grant the pieces more clearance when moving around the board.
- Method 2: Reduce the size of the pieces to give them more space when moving around.

The method we choose will depend on where we can store the board: we want it large, but not so big that we cannot easily move it somewhere, such as between the machine shop and the lab room.
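The center-the-pieces, route-along-borders idea can be sketched in a few lines. This is an illustrative assumption, not the project's actual motion code: squares and borders are modeled on a half-tile grid where odd coordinates are tile centers and even coordinates are border lines, so any waypoint with an even coordinate cannot collide with a parked piece.

```python
# Sketch of the border-routing idea: idle pieces sit on tile centers,
# and the moving piece travels along tile borders. Coordinates are in
# half-tile units: odd values are tile centers, even values are border
# lines, so intermediate waypoints never pass through a parked piece.

def square_to_center(col, row):
    """Center of square (col, row) in half-tile units (both odd)."""
    return (2 * col + 1, 2 * row + 1)

def border_path(start, end):
    """Axis-aligned waypoints from one square's center to another's."""
    sx, sy = square_to_center(*start)
    ex, ey = square_to_center(*end)
    if (sx, sy) == (ex, ey):
        return [(sx, sy)]
    by = sy + (1 if ey >= sy else -1)  # border row next to start center
    bx = ex + (1 if sx >= ex else -1)  # border column next to the target
    cy = ey + (1 if by >= ey else -1)  # border row next to the target
    return [
        (sx, sy),  # start center
        (sx, by),  # half-step onto a border row (still inside start tile)
        (bx, by),  # run along the border row
        (bx, cy),  # run along a border column
        (ex, cy),  # half-step into the target column
        (ex, ey),  # drop into the target center
    ]
```

The stepper-motor driver would then translate each waypoint-to-waypoint leg into step counts for AXIS1 and AXIS2, toggling the magnet's control line at the first and last waypoint.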

Parts:
- Motor: Mercury Motor SM-42BYG011-25 2 Phase 1.8° 32/20 (x3) (already have)
- Large Magnet (already have)
- Chess Board with Plastic Sheet Covering
- ESP32 S3 Microcontroller (we can get this from the ECE supplies instead of using our budget)

## Subsystem 2: Chess Board Visualizer with Computer Vision
This will be the main challenge of the project. First, we require a camera module mounted above the chess board, giving us a top-down view of all the chess pieces. The camera will use a MIPI interface so it can connect to the CSI port of a Raspberry Pi, which will run all of our computer-vision code (the Raspberry Pi will mount on our PCB to create a Pi HAT). Next, each of the 32 magnetic chess pieces will be color coded. With 6 piece types, we will use the 3 primary colors (red, blue, and yellow) along with the 3 secondary colors between them (purple, green, and orange). To differentiate the two sides, the human player's pieces will use darker shades of these colors and the robot's pieces will use lighter shades.
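A minimal sketch of the color-based classification, assuming averaged RGB samples per square and a hypothetical 12-color palette (the real hue values would be calibrated against the actual camera and pigments): convert each sample to HLS, match the nearest reference hue to get the piece type, and use lightness to separate the robot's light shades from the human's dark shades.

```python
import colorsys

# Hypothetical reference palette: one hue (in degrees) per piece type.
# These assignments and values are illustrative assumptions, not the
# project's calibrated colors.
PIECE_HUES = {
    "pawn": 0,      # red
    "knight": 30,   # orange
    "bishop": 60,   # yellow
    "rook": 120,    # green
    "queen": 240,   # blue
    "king": 280,    # purple
}

def classify_pixel(r, g, b):
    """Return (piece_type, side) for an averaged RGB sample (0-255 each)."""
    h, l, s = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
    hue_deg = h * 360
    # Nearest reference hue, accounting for hue wrap-around at 360.
    piece = min(PIECE_HUES, key=lambda p: min(
        abs(hue_deg - PIECE_HUES[p]),
        360 - abs(hue_deg - PIECE_HUES[p])))
    side = "robot" if l > 0.5 else "human"  # light vs. dark shade
    return piece, side
```

Classifying on hue rather than raw RGB makes the decision more robust to overall brightness changes from room lighting, which only shift lightness.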

Parts:
- Colored Chess Pieces with Magnetic Bottoms (x32): (will 3D print our own)
- Neodymium Magnets (x32): https://www.amazon.com/dp/B0BVYFSDNS/ref=twister_B0C6X3LNB9?_encoding=UTF8&psc=1 [$13]
- Raspberry Pi (DigiKey part SC0685) [$60]
- MIPI Camera (DigiKey part SC0194(9)) [$55]

## Subsystem 3: AI Chess Algorithm
The artificial intelligence agent will need to calculate moves of varying proficiency based on the board state from Subsystem 2's computer vision. The agent's logic will be built on the python-chess library, which can generate effective moves, check their legality, and judge a game's outcome (a win, loss, or stalemate). To set up the state of the chess board (e.g., piece positions), python-chess parses a string describing the board. The string's syntax is Forsyth-Edwards Notation (FEN), which encodes the following:
- Piece placement
- Active color (side to move)
- Castling availability
- En passant target square
- Halfmove clock
- Fullmove number

An example piece-placement field is "rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR" (the starting position); a complete FEN string appends the remaining fields, e.g. "rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1".
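For illustration, expanding the piece-placement field into an 8x8 grid, which python-chess handles internally when a FEN string is passed to it, looks roughly like this:

```python
def parse_fen_placement(placement):
    """Expand the piece-placement field of a FEN string into an 8x8 grid.

    Returns 8 rows (rank 8 first, the order FEN is written in), where
    each cell is a piece letter ('K', 'q', ...) or '.' for empty.
    Uppercase letters are White's pieces, lowercase are Black's.
    """
    rows = []
    for rank in placement.split("/"):
        row = []
        for ch in rank:
            if ch.isdigit():
                row.extend("." * int(ch))  # a digit means that many empties
            else:
                row.append(ch)
        assert len(row) == 8, "each rank must describe exactly 8 squares"
        rows.append(row)
    return rows
```

In the project itself this direction is less important than the reverse: the vision subsystem must serialize the detected board into a FEN string to hand to the chess library.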

More details for parsing and other information can be found here: https://python-chess.readthedocs.io/en/latest/core.html

## Criteria for Success (5 things)
- Computer vision algorithm correctly identifies piece positions on the board with high accuracy
- Successfully update internal representation of board
- Magnet correctly grabs the intended piece and does not cause it to bump into other pieces while moving
- Robot will successfully detect if the human player cheats/performs an illegal move
- Chess board moves the pieces to the intended positions with high accuracy
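A cheap first pass at the cheat-detection criterion is to diff consecutive board snapshots from the vision subsystem: a simple move vacates exactly one square and newly occupies (or replaces the piece on) exactly one square. This sketch is an assumption about how we might implement it; it ignores castling, en passant, and promotion, whose full legality checks would be delegated to the chess library.

```python
def infer_move(before, after):
    """Infer the human's move by diffing two 8x8 board snapshots.

    `before` and `after` are 8x8 grids of piece labels ('.' = empty),
    e.g. as produced by the vision subsystem. Returns
    ((from_row, from_col), (to_row, to_col)), or raises ValueError if
    the diff does not look like a single simple move (a first-pass
    cheat check; special moves and legality go to the chess library).
    """
    vacated, occupied = [], []
    for r in range(8):
        for c in range(8):
            a, b = before[r][c], after[r][c]
            if a == b:
                continue
            if b == ".":
                vacated.append((r, c))    # a piece left this square
            else:
                occupied.append((r, c))   # a piece arrived (or captured)
    if len(vacated) != 1 or len(occupied) != 1:
        raise ValueError("board diff is not a single simple move")
    return vacated[0], occupied[0]
```

The inferred move would then be checked against the chess library's legal-move list for the current position; any mismatch flags a cheat or a vision error.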

## Proposal for Expansion
A really fun expansion we want to pursue is making this a universal game-playing robot rather than just a chess-playing robot by adding games like Checkers, Go, and Sorry!. Once the base chess game works with the magnetic arm underneath the board and the computer vision system, all we would have to do is 3D print more pieces, make a new sheet to lay over the board, and interface rule libraries for the other games with the magnetic-arm motion control.
