The final project is our own implementation of the popular virtual reality (VR) game Beat Saber, displayed on the PiTFT. In the original game, the player slashes through incoming blocks, viewed in a headset, by moving a controller through the air. Instead of a headset and VR system, we use a joystick, an IMU, and the PiTFT display screen. The blocks are marked with arrows indicating the direction in which they should be slashed, and each block appears in time with the rhythm of the music. This project used both hardware and software components. The hardware includes user controls as well as speakers for playing the music. In particular, we connected an inertial measurement unit (IMU) to calculate tilt and move the saber across the bottom of the screen, and we attached a joystick that helps determine whether there was a slashing motion and in which direction it occurred. For the display, we used the PiTFT from the previous labs to render the game and collect touchscreen inputs for buttons such as starting the game, pausing, etc. We connected speakers to the auxiliary output of the Raspberry Pi to play music out loud. On the software side, we developed our game using Pygame, which has advanced graphics capabilities. We use the object-oriented features of Python to create block and cursor classes, making it easier to generate blocks for the player to slash and to move the player sprite around. We also use mplayer for playing music. Our project is an embedded system since it receives live inputs and makes real-time changes in response to them.
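One way the IMU tilt could drive the saber's horizontal position is a simple linear mapping from roll angle to screen x-coordinate. This is a minimal sketch under assumed values (a 320-pixel-wide PiTFT and a ±45° usable tilt range); the project's actual mapping and function names may differ.

```python
def tilt_to_x(roll_deg, screen_width=320, max_tilt=45.0):
    """Map an IMU roll angle (degrees) to a horizontal cursor position.

    -max_tilt maps to the left edge, +max_tilt to the right edge;
    angles beyond the range are clamped. All constants are assumptions.
    """
    roll = max(-max_tilt, min(max_tilt, roll_deg))
    # Normalize to [0, 1], then scale to pixel coordinates.
    frac = (roll + max_tilt) / (2 * max_tilt)
    return int(frac * (screen_width - 1))
```

Clamping keeps the saber on screen even if the player over-rotates the controller.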
This system combines multiple elements into a robotic tea steeper. It joins mechanical and electronic parts through Python programming on a Linux operating system running on a Raspberry Pi 4 (RPi) to create one large embedded system. The system uses multiple servo motors to dispense loose-leaf tea, as chosen by the user, into a diffuser. It then pulls the diffuser through a self-closing ramp and dips it into a mug of hot water. While a countdown for the brew runs on the PiTFT screen, the diffuser is steeped by being slightly lowered and lifted in the hot water. Once the steeping time is complete, the diffuser is withdrawn from the mug and the tea is brewed.
Sandwiches have been around since the 1700s. They are one of the most common foods in the world, and they are quite simple to make. Yet, with all of humanity's technological advances, the easiest way to attain a sandwich is still to make one by hand. We were surprised to find very little prior development in the field of "automated sandwich-making robots." We sought to change that by building a mechanical system that crafts sandwiches to order, assembling the ingredients for you at the push of a few buttons and eliminating the need to tiresomely make yourself a sandwich.
In this project, we designed a dance game in which the user must fit their body into a box shown on the PiTFT screen. We used a Pi camera to capture body movements, displayed a sequence of boxes, each lasting several seconds, and provided different difficulty levels for users to play. A song plays in the background as the game starts, and the score is displayed.
In this project, we decided to make a robot that plays your Spotify playlist via the Spotify API while tracking you and using your posture to send remote commands to the playlist. The first step is user identification: the robot looks for a face that matches one of the team members' reference images while excluding other detected faces. Once it identifies a match, it plays that person's playlist on the linked Spotify account using the API. It then enters the tracking phase, where it follows the target's pants to track their position and maintain a near-constant distance from them. While doing this, it also watches for leg postures outside the default position. When one is found, the robot sends a remote command through the API to the Spotify account to do things like pause or resume the playlist.
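The posture-to-command step could be as simple as a lookup table keyed on the detected pose. This is an illustrative sketch; the posture names and command strings are assumptions, not the project's actual vocabulary.

```python
# Hypothetical mapping from detected leg postures to playlist commands.
# In a real system the command strings would correspond to Spotify API
# calls (e.g. pausing or skipping playback).
POSTURE_COMMANDS = {
    "left_leg_out":  "previous_track",
    "right_leg_out": "next_track",
    "legs_crossed":  "pause_playback",
    "default":       None,  # resting pose: take no action
}

def command_for_posture(posture):
    """Return the playlist command for a posture, or None if unrecognized."""
    return POSTURE_COMMANDS.get(posture)
```

Keeping the mapping in one table makes it easy to add or retune gestures without touching the tracking code.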
In this project, we created a relatively cheap fitness tracker using a Raspberry Pi 4 and GPS module. By connecting to GPS, the device determines and displays the current date and time. The device allows users to log data on activities they do. Specifically, the device tracks distance traveled, elapsed time, current speed, current elevation, and total elevation gain. When finished with an activity, users can save it to the device’s memory and view its statistics later. Furthermore, the device summarizes weekly data across all stored activities and presents the summarized data in convenient ways. This includes listing various totals for each week, along with weekly graphs displaying data for the various activities recorded during that week.
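Distance traveled from a stream of GPS fixes is typically accumulated by summing great-circle distances between consecutive points. A minimal sketch using the haversine formula (the function names are illustrative, not the project's):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def total_distance_m(fixes):
    """Sum segment distances over a list of (lat, lon) fixes."""
    return sum(haversine_m(*a, *b) for a, b in zip(fixes, fixes[1:]))
```

Speed and elevation gain can be derived the same way, by differencing consecutive fixes' timestamps and altitudes.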
We set out to create a remoteless gaming console that would map different poses and movements of the players to video game controls allowing you to play a video game without any sort of physical remote. The movements and poses of the players are detected through computer vision. In order to test the system, we mapped various movements to a popular 2 player Linux racing game called Super Tux Kart.
In this day and age, the average university student is accompanied by their phone, which is often connected to campus Internet. Of course, they may also be carrying their laptop, their tablet, and other electronic equipment. When these devices connect to the Internet through wireless access points, the resulting traffic data, drilled down to the physical location level, can serve as a measurement of congestion across campus. This project utilizes the public Multi Router Traffic Grapher (MRTG) website provided by Cornell's IT department. By leveraging the network traffic data of the Local Area Network (LAN) switch ports, we are able to look under the hood at the network that services wireless access on Cornell's campus. Using this, our project takes the incoming and outgoing traffic data from multiple Access Points (APs), consolidates it into total traffic for a given room (such as the Duffield Atrium or CIS Lounge), and creates a graphical representation, merged with a map of the Engineering Quad, showing red, yellow, and green levels of campus congestion. The project also produces a list of recommended study spaces in the Engineering buildings, ranked from least to greatest congestion, and can recommend "routes" through campus that take into account the congestion levels of the intervening areas.
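The consolidation step amounts to summing each AP's inbound and outbound traffic into its room's total and bucketing the total into a color level. This sketch uses made-up AP names and threshold values to show the shape of the computation; the project's actual mapping and thresholds are not given in the source.

```python
# Illustrative AP-to-room mapping; real names come from Cornell's MRTG data.
AP_TO_ROOM = {
    "ap-duffield-1": "Duffield Atrium",
    "ap-duffield-2": "Duffield Atrium",
    "ap-cis-1":      "CIS Lounge",
}

def room_totals(ap_traffic):
    """Sum incoming + outgoing traffic (bits/s) for each room."""
    totals = {}
    for ap, (inbound, outbound) in ap_traffic.items():
        room = AP_TO_ROOM.get(ap)
        if room is not None:
            totals[room] = totals.get(room, 0) + inbound + outbound
    return totals

def congestion_level(total, green_max=1e6, yellow_max=5e6):
    """Bucket a room's total traffic into a color level (thresholds assumed)."""
    if total <= green_max:
        return "green"
    if total <= yellow_max:
        return "yellow"
    return "red"
```

The same room totals feed both the map coloring and the ranked study-space list.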
The Laundry Sorter is a one-of-a-kind product that helps people sort their laundry. Rather than getting lazy and throwing everything into one basket, The Laundry Sorter ensures that one's laundry is sorted appropriately. The user can select whether lights, darks, colors, reds, and/or whites go in different baskets. When using The Laundry Sorter, the user drops their clothing into a box just above three baskets. A camera then detects the color of the clothing, and, based on the user's selection, a turntable mechanism moves the piece of clothing into the correct basket. You can put as many clothes as you want through The Laundry Sorter, and at any point you can change the selections for each basket using the multi-select screen for each basket on the touchscreen.
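The color-detection step can be reduced to classifying an average RGB sample from the camera into one of the bins. This is a simplified sketch with assumed thresholds; the project's actual camera pipeline and cutoffs may differ.

```python
def classify_garment(r, g, b):
    """Classify an average RGB color (0-255) into a laundry bin.

    Threshold values are illustrative assumptions, not measured ones.
    """
    brightness = (r + g + b) / 3
    # Bright and nearly colorless -> whites
    if brightness > 220 and max(r, g, b) - min(r, g, b) < 20:
        return "whites"
    # Very dark overall -> darks
    if brightness < 60:
        return "darks"
    # Strongly red-dominant -> reds
    if r > 150 and r > 1.6 * g and r > 1.6 * b:
        return "reds"
    # Bright but tinted -> lights; everything else -> colors
    if brightness > 170:
        return "lights"
    return "colors"
```

The returned bin name would then index into the user's basket selections to pick a turntable position.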
A lab is usually the place where students store their devoted work, and we are no exception. A lab, however, is usually shared by multiple groups of students, and keeping track of who entered it and when helps the professor better manage and protect lab property. To meet this objective, the system is divided into two main sections: hardware and software. The hardware includes locks with relays that convert small electrical stimuli into larger currents and control the opening and closing of the circuit; the locks open when the user is authenticated with a verified account and the correct passcode. Other hardware includes multiple pyroelectric sensors for motion detection, an HDMI display for the monitor screen, a keypad for passcode input, and a dedicated capacitive touch lock button to arm the security system. Once the lock button is pressed, the security system arms after 30 seconds. The software of this embedded system includes the app and the control code for the security system. Users are authenticated through our app after successfully signing up and logging in; the app then provides the user with a unique, time-sensitive, 6-digit code to enter the room.
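A time-sensitive 6-digit code is commonly derived in the style of RFC 6238 TOTP: hash a shared secret together with the current time window and truncate to six digits. This is a sketch of that standard technique, not the project's confirmed implementation; the secret and 30-second window are assumptions.

```python
import hashlib
import hmac
import struct
import time

def six_digit_code(secret: bytes, t=None, step=30):
    """Derive a 6-digit code from a shared secret and the current time window.

    Follows the HMAC-SHA1 dynamic-truncation scheme of RFC 4226/6238.
    """
    if t is None:
        t = time.time()
    counter = struct.pack(">Q", int(t) // step)  # 30-second time window
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return "{:06d}".format(value % 1_000_000)
```

Because both the app and the lock derive the code from the same secret and clock, no code ever needs to be transmitted to the lock.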
For the final project of ECE 5725, the team built a Raspberry Pi 4 based embedded music synthesizer for music production. The device provides users with various input and instrument sources, including a virtual drum set, a virtual piano, a physical MIDI piano keyboard, and a USB microphone. Both the virtual drum set and the virtual piano run on the PiTFT touchscreen, so users can play them by touch. The MIDI keyboard and the USB microphone connect to the Raspberry Pi 4 through its USB ports; with a few Linux commands, the MIDI keyboard can be played aloud and vocals can be recorded from the USB microphone and stored in WAV files. The team also built a GUI on the PiTFT offering functions such as concatenation, stacking, tone processing, reverb, and echo for post-production. All the software was written in Python: the elements shown on the PiTFT were developed using the Pygame module, and the remaining functions used other Python modules such as NumPy, PyAudio, and scipy.io.wavfile.
Throughout the semester we learned how to use Pygame, a powerful library for creating games and other graphical applications. We implemented all the base features, including single-note rhythms and legato rhythms (connected note sliders), running on the Raspberry Pi with the PiTFT. Our game also features scoring and automatic beatmap generation.
For this final project, we recreated the Red Light, Green Light doll from the famous Netflix series Squid Game. All players start at the start line; whenever they hear the "Green Light" music, they can move forward, and whenever the music stops, the game is in "Red Light" mode and the players must freeze. If a player is caught moving during the red light phase, they are out. The goal is to reach the doll first and press a button on the Pi, signaling that the player has crossed the finish line. Our game has single-player and 2-player modes. The Raspberry Pi runs OpenCV to capture the video stream from an attached PiCamera, and TensorFlow Lite (TFLite) processes each frame to identify 17 keypoints on a person and determine their pose. The main Pi (henceforth referred to as RPiMain) controls an RGB LED strip that lights up red during the Red phase, green during the Green phase, blinks red when a player is eliminated, and cycles through rainbow colors once the game has a winner. When a player is eliminated, we use the keypoints from our pose detection to rotate a servo-mounted laser pointer toward the losing player. We use a second Raspberry Pi (RPi2) to play sound over a Bluetooth connection to a speaker during the Green Light phase of the game.
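The red-light movement check can be framed as comparing two frames of pose keypoints and flagging a player whose joints moved more than a pixel threshold. The 17-keypoint layout follows TFLite pose models such as MoveNet; the threshold value here is an assumption for illustration.

```python
def moved_during_red(prev_kps, curr_kps, threshold=15.0):
    """Return True if any keypoint moved more than `threshold` pixels.

    prev_kps and curr_kps are lists of (x, y) pixel coordinates for the
    same person's keypoints in two consecutive frames.
    """
    for (x0, y0), (x1, y1) in zip(prev_kps, curr_kps):
        if ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 > threshold:
            return True
    return False
```

A small nonzero threshold tolerates detection jitter so that a genuinely frozen player is not eliminated by model noise.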
For this project, our team created a miniature arcade with three different games. We implemented Tetris, Brick Breaker, and Pong completely from scratch using Python and the Pygame library. To create a cohesive gaming experience, we also created a game launcher home screen for these three games. The PiTFT serves as the display for the arcade, and the on-board buttons are used for game control. Pong and Brick Breaker incorporate an I2C joystick, which is used for paddle movement, while Tetris uses the PiTFT buttons to control the pieces. The Raspberry Pi, PiTFT, and joystick are all fitted into a 3D-printed case for the optimal gaming experience.
The goal of this project was to use the high-definition camera module and lens to emulate an industry DSLR camera with auto-focus capabilities. Features include image capture and storage, video monitoring via TFT display, and automated focus—with touchscreen subject identification.
This is the Distance Keeping Robot Car. The objective is for the car to be able to dodge all incoming objects that may run into it. This could be a person, another object, another car, or anything that the Robot senses around it. The Robot uses a network of four distance sensors to “see” what is going on around it. The Robot also uses omni-directional wheels to enable the traditional forward and backward motion as well as side to side movement.
Almost everyone has heard of or played the game of chess. Chess is a classic game where two opposing players face each other with equal pieces and aim to capture the opponent's king. The issue with chess is that it requires two players. Online chess simulators remedy this by either matching players over the Internet or pitting the player against a bot. This solution, unfortunately, doesn't let a solo user play on a physical chess board. To fix this, we created a chess board that you can play against. The board uses a Raspberry Pi and an array of sensors to determine the current state of the game and uses the award-winning Stockfish chess engine as the user's opponent.
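One way a sensor array can report the game state is as an 8x8 occupancy grid; diffing two grids then reveals which square was vacated and which was newly occupied. A minimal sketch under that assumption (capture moves, where the destination was already occupied, would need extra logic):

```python
FILES = "abcdefgh"

def infer_move(before, after):
    """Return (from_square, to_square) implied by an occupancy change.

    before/after are 8x8 grids of booleans (True = piece present),
    indexed [rank][file] with rank 0 = rank 1.
    """
    src = dst = None
    for rank in range(8):
        for file in range(8):
            was, now = before[rank][file], after[rank][file]
            if was and not now:
                src = FILES[file] + str(rank + 1)  # square just vacated
            elif now and not was:
                dst = FILES[file] + str(rank + 1)  # square just occupied
    return src, dst
```

The inferred move can be fed to Stockfish in algebraic form, and the engine's reply displayed for the user to execute on the board.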
We found that most shufflers on the market can only perform a one-time shuffle. We set out to achieve continuous shuffling, where the deck can be shuffled multiple times. We also planned to combine the shuffler with a dealer, so that after the shuffles, the machine can deal the cards according to pre-defined games or a game the user defines. We achieved continuous shuffling with a scissor lift and an arm: after each shuffle, the scissor lift lifts the deck, and the arm cuts the deck in half and swings the halves to each side so they are ready for the next shuffle. We use a Raspberry Pi and the PiTFT for the control and GUI of the program. We provide Texas Hold'em, Fight the Landlord, and Showhand as pre-defined games. In user-defined mode, one can customize the number of players, the number of public cards, and the cards per person.
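The dealing configuration boils down to two numbers per game: cards per player and public cards. This sketch shows one possible shape for that configuration and a simplified deal (real table dealing alternates one card at a time around the players; the config layout here is an assumption):

```python
# Assumed configuration format; Texas Hold'em values are the standard ones.
GAMES = {
    "texas_holdem": {"per_player": 2, "public": 5},
}

def deal(deck, players, per_player, public):
    """Split an already-shuffled deck into player hands and public cards."""
    hands = [deck[i * per_player:(i + 1) * per_player] for i in range(players)]
    community = deck[players * per_player:players * per_player + public]
    return hands, community
```

User-defined mode simply substitutes custom values for `players`, `per_player`, and `public`.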
Anya is an avid coffee drinker, and for her birthday she received a temperature-controlled Ember Travel Mug. The mug is pretty unique in that it lets the user set an exact drinking temperature for whatever fluid they store inside. This inspired us to think bigger and create our very own temperature-regulated coffee maker with "latte" abilities. This project really pushed us outside our comfort zone and forced us to use an ample amount of the electrical engineering and prototyping skills we picked up over the course of our time here at Cornell.
We created a wireless arcade stick complete with a joystick and six arcade buttons using the Raspberry Pi 4. This stick utilizes TCP to communicate between a server and a client. The server is hosted on a laptop that will be displaying/running the game, and the Raspberry Pi acts as the client.
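The TCP link can be demonstrated in miniature with a loopback connection: a server thread stands in for the laptop, and a client socket stands in for the Raspberry Pi sending a button event. The one-byte event encoding is an assumption for illustration, not the project's actual protocol.

```python
import socket
import threading

def run_server(sock, results):
    """Accept one connection and record what the client sends."""
    conn, _ = sock.accept()
    with conn:
        results.append(conn.recv(16))

# "Laptop" side: listen on a loopback port chosen by the OS.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

received = []
t = threading.Thread(target=run_server, args=(server, received))
t.start()

# "Raspberry Pi" side: connect and send a button-press event.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))
client.sendall(b"A")  # hypothetical encoding: b"A" = button A pressed
client.close()
t.join()
server.close()
```

In the real system, the server would translate each received event into a virtual key press for the game.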
In this project we created a program that uses machine learning, specifically a Convolutional Neural Network, to detect whether people are wearing face masks correctly or not. We used the Raspberry Pi to display a GUI that allows users to take photos of people using the Raspberry Pi Camera. An OpenCV program then detects the person's face, an image of the face is run through the neural network, and the result is displayed back on the GUI.
Are you tired of the stranglehold stocks have over your life? Tired of leaving 50 charts open wondering if you should sell? Leave that to Pi Stock! This project provides a simple interface to check on your portfolio, along with some information that may suggest how prices will change. Like the large LED boards in Times Square or on a trading floor, the screen scrolls stock names and prices, and the display switches between green and red depending on whether the stock price is going up or down. The advantage of this system is that it uses TensorFlow models to make predictions about stock prices and displays those as well. The project uses an API, TensorFlow, and network sockets to update and generate the values that scroll across the screen. All it requires is a Raspberry Pi, a screen (in this case, a PiTFT), and a computer to run TensorFlow (if the Raspberry Pi is 32-bit).
For our final project we built an autonomous plant watering system that can take care of your plants while you are away. In addition to simply watering the plants, a suite of sensors measures the plant's surroundings so that you can track the environment over time and ensure it is suitable for growing a plant. The main idea behind the autonomous system is to let you take care of your plants even while you are away, and while away you want to be sure the system is still operating as expected. With our system, you can check in on your plant remotely through any web browser and watch a live feed to make sure it is still safe. You can also watch a timelapse of the plant's growth by checking in remotely.
Our Smart Lock uses two-factor authentication to grant access to users recognized by the system; once access is granted, a user can lock or unlock the system. The first factor of authentication uses one of the PiTFT buttons: when a button is pressed, the system verifies that a user is associated with it. If so, the system uses the PiCamera to scan the user's face and applies facial recognition, granting access only if the face matches the individual associated with that button; this is our second factor of authentication. Our Smart Lock has two modes: superuser and regular user. A superuser can add new users, remove existing users, view the recent access history associated with their tag, and lock and unlock the lock. A regular user can only lock or unlock the lock and view the recent access history associated with their tag. The Smart Lock was developed on a Raspberry Pi 4B as the final project of the ECE 5725 Design with Embedded OS class at Cornell University.
Robotics is the future, and robot-human interaction can be applied widely. This project is a simple exploration of that interaction. The whole project is divided into two parts: image processing and servo control. The image processing part is based on OpenCV and Python: the camera detects the contours of objects in the frame, then the position and size of the contours are calculated and different commands are sent to the GPIO pins. The control side is handled by a microcontroller; based on the low and high output levels, the robot takes actions accordingly.
Controlled by the Raspberry Pi, our security robot attempts to center itself on a green target while continuously scanning the area in front of it for authorized persons or intruders. To center on the target, the robot moves until it finds the target and then centers itself once the target is somewhere in front of it; this image processing is accomplished using OpenCV. For facial recognition, we utilized the dlib Python API to quickly implement a facial recognition system within our security robot. Finally, motor control and gimbal movement are driven by software and hardware PWM, respectively, with the driving logic controlled in software.
For this final project, we wanted to create a physical gaming system with embedded controls and a GUI on the PiTFT screen. We considered several alternatives, but we finally landed on an old-school pinball machine, which we called the “PiNBall”. Old pinball machines are a childhood classic for most students our age, and the team considered that it was the perfect choice for our project since it implements various elements studied throughout the class. Our design consists of a physical build, which is made out of laser-cut wooden pieces held together by screws, hardware elements connected to the GPIO pins on the RPi, and a Python script that controls the hardware and the graphical user interface.
For our project, we opted to build an IoT food management system that keeps track of the quality and freshness of stored foods based on the food itself and the ambient conditions of the storage environment. Based on these parameters, our system indicates, with LEDs, the status of our environment and food, provides an estimated shelf-life timeline, and scrapes the web for recipe ideas for the stored foods. The goal of this system is to help users better manage their food inventory and plan their meals according to how fresh their groceries are, thereby reducing food waste.
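A shelf-life timeline can be estimated by starting from a base lifetime per food and shortening it as ambient temperature rises above refrigeration. This is an illustrative sketch only: the base values and the halving rule are assumptions, not the project's measured model.

```python
# Assumed base lifetimes (days) at refrigerator temperature (~4 C).
BASE_SHELF_LIFE_DAYS = {"milk": 7, "spinach": 5, "apples": 30}

def estimated_days_left(food, days_stored, temp_c):
    """Estimate remaining shelf life given storage time and ambient temp.

    Assumes each 10 C above 4 C halves the food's total lifetime.
    """
    base = BASE_SHELF_LIFE_DAYS.get(food, 7)
    factor = 0.5 ** max(0.0, (temp_c - 4.0) / 10.0)
    return max(0.0, base * factor - days_stored)
```

The estimate would drive both the LED status (e.g. red when near zero) and which stored foods the recipe scraper prioritizes.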