Pi Notes
The objective of this project was to develop a note-typing application on the Raspberry Pi as an alternative to smartphone notes apps, which can quickly become a source of distraction with social media applications such as TikTok and Instagram in easy reach. Using the Raspberry Pi platform and the PiTFT touch screen, we created a simple, easy-to-use environment for typing notes that does not sacrifice the benefits of smartphone apps, such as automatic saving and the ability to send notes to other people. Additionally, the project aimed to explore the possibilities of the Raspberry Pi platform as an embedded system and showcase its potential for developing practical solutions that benefit users.
Plant System
This project focuses on building a system that waters your plants for you with little to no effort. Just set it up, refill the water tank when you're told, and you will have beautiful, happy plants!
Rootin’ Tootin’ Robot
We present a shooting robot game that we designed and built using two Raspberry Pis, a PiTFT, motors, servos, and Bluetooth communication. Our objective was to create an engaging and fun game and to demonstrate what we have learned throughout the semester. The game is playable by one player. Here we will describe the design, testing, and results of our project.
Pi Sweeper
This project is a physical rendition of the classic Minesweeper video game, using a hardware game board where squares can be uncovered by pressing them directly. There is an 8×8 grid, with each grid square illuminated from within by an RGB LED to display the nearby mine count, undiscovered status, flag, or the location of a mine when the game is lost. The player pushes a grid square down to explore that space, and adjacent open squares are automatically revealed, mirroring the auto-exploration of the traditional Minesweeper game. The Raspberry Pi and all hardware are enclosed in a custom 3D-printed chassis, and the PiTFT displays a leaderboard with previous best times, filtered by difficulty.
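The auto-exploration behavior can be expressed as a flood fill from the pressed square. Below is a minimal sketch of that idea; the grid representation and helper names are ours for illustration, not the project's actual code:

```python
# Minimal Minesweeper auto-reveal (flood fill) on an 8x8 grid.
# 'mines' and 'revealed' are sets of (row, col) tuples.
from collections import deque

SIZE = 8

def neighbors(r, c):
    """Yield the up-to-eight squares adjacent to (r, c)."""
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            nr, nc = r + dr, c + dc
            if (dr or dc) and 0 <= nr < SIZE and 0 <= nc < SIZE:
                yield nr, nc

def reveal(r, c, mines, revealed):
    """Reveal (r, c); if its nearby mine count is zero, flood-fill outward."""
    queue = deque([(r, c)])
    while queue:
        r, c = queue.popleft()
        if (r, c) in revealed or (r, c) in mines:
            continue
        revealed.add((r, c))
        count = sum((nr, nc) in mines for nr, nc in neighbors(r, c))
        if count == 0:  # empty square: keep expanding, like the classic game
            queue.extend(neighbors(r, c))
```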
Rat in the Hat
Our project uses embedded operating systems to recreate the classic Ratatouille scene where Remy the rat pulls on Linguini the chef’s hair in order to control his arm movements and cook the perfect dish. Instead of being controlled by the rat, our project will allow the rat’s silhouette in the hat to mimic the motion of the wearer using servo motors, IMU sensors, and Bluetooth. The goal is to make it seem as if there is a real rat inside the hat controlling the chef’s every move!
RoboGPT
This project is a small AI-powered mobile robot platform built around cloud AI services and a Raspberry Pi 4. We are using four continuous-rotation servos mounted on a laser-cut acrylic sheet. For our sensors, we are using an accelerometer and gyroscope, two ultrasonic sensors on the front and back of the rover to avoid running into obstacles, a microphone to collect audio input, and a Bluetooth speaker to play sounds back from the robot. The robot software uses Azure Speech Services for speech-to-text to transcribe user requests and text-to-speech for the robot to talk to the user. It also uses the OpenAI ChatGPT API with a custom prompt that processes user requests and returns a response that includes what the robot should say or do. ChatGPT's role here is to take a high-level command, like "draw a square with sides of 1 meter", and produce the lower-level movement commands the robot should run, giving the robot a large degree of functionality and responsiveness to most commands. The Raspberry Pi collects data locally, sends data to the cloud AI services, uses the sensor input to avoid hitting obstacles while moving, and processes the ChatGPT response to move the robot and talk to the user.
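The command-translation step might look like the sketch below, written against the current openai Python client; the prompt wording, movement primitives, and model choice are our assumptions, not the project's exact code:

```python
# Illustrative sketch: a custom system prompt asks the model to return
# low-level movement primitives for a high-level user command.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SYSTEM_PROMPT = (
    "You control a small rover. Reply ONLY with lines of the form "
    "FORWARD <meters>, TURN <degrees>, or SAY <text>."
)

def plan(command: str) -> list[str]:
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "system", "content": SYSTEM_PROMPT},
                  {"role": "user", "content": command}],
    )
    return resp.choices[0].message.content.splitlines()

# e.g. plan("draw a square with sides of 1 meter") might return
# ["FORWARD 1", "TURN 90", "FORWARD 1", "TURN 90", ...]
```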
Desperation Machine 2.0
When you use a vending machine, the interface is usually fairly straightforward: you enter what you'd like, and the machine dispenses it. While this is relatively simple for the user, it doesn't leave much room for adaptability on the producer's end. If a product suddenly becomes popular, the owner has no way to capitalize on the increasing demand. Likewise, if a product becomes unpopular, the owner has no way to incentivize consumers to purchase it by lowering the price. While owners could manually modify prices in person, this doesn't scale well for large systems of machines.
To address this issue, we propose the Desperation Machine 2.0 (named due to an inside joke resulting in the vending machines in the basement of Upson Hall being christened the “Desperation Machine”). In addition to serving as a fully functional vending machine, the Desperation Machine 2.0 introduces a web interface that allows an owner to monitor their machines from a distance. This not only allows the owner to manually change prices, but to define rules as to how the machines can automatically change their prices based on their current stock of an item, as well as how often the item is purchased.
Robotic Arm
Through this project, we worked on every aspect of controlling a robotic manipulator, from mechanically and electrically integrating the arm to deriving the kinematics formulation and implementing user control. First, we laser cut parts based on an open-source robotic arm design, then assembled the arm. Next, we wired the arm such that each servo could receive PWM from the RPi and tested the functionality of the servos on and off the arm. After deriving kinematic equations for the arm, we measured the effect of servo speeds on key angles in the formulation and mapped combinations of servo values to real-world coordinates. Finally, we used our knowledge of the arm’s inverse kinematics to repeatably pick up and drop off a Sharpie at various locations.
Pac Man
For our ECE 5725 final project, we created a replica of the classic arcade game Pac-Man! The game runs on a Raspberry Pi 4, using a piTFT 2.8” LCD touchscreen for graphics and a four-directional joystick for input. We use the Pygame library to generate all of the animations. The gameplay is identical to that of the original version released in arcades by Namco in 1980. In our version, the player uses the joystick to navigate the main menu and select different configurations of the game, including game modes and easter-egg features not available in the original. During regular gameplay, the player uses the joystick to steer Pac-Man through a maze, collecting points and power-ups while avoiding the four ghosts. When the game is over, the user is brought back to the main menu, where the game can be reconfigured and restarted.
Pi Augmented Reality
For this project, we really wanted to make something portable and embedded-system-like. To that end, using just a TFT screen, a camera, and a six-axis sensor, we made a display that takes in a camera feed and overlays a rendered object on top of it using the ImageMagick command-line toolset. As the system moves, the program takes in the current accelerometer and gyroscope readings and updates the position of the object relative to its original position. The display is then updated with the new render, giving us a sort of small-scale augmented reality. In order to show the camera feed underneath the objects, we modified preexisting renderer code to support background transparency. Camera images were taken using raspistill, and the most recent version of the now-deprecated WiringPi was used to get raw data from the MPU6050 module.
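For reference, the equivalent raw read in Python (the project used WiringPi in C) would wake the MPU6050 over I2C and pull one accelerometer sample. Register addresses below are from the MPU6050 datasheet; the smbus2 usage is our illustration:

```python
# Wake the MPU6050 and read one accelerometer sample over I2C.
from smbus2 import SMBus

ADDR, PWR_MGMT_1, ACCEL_XOUT_H = 0x68, 0x6B, 0x3B

def read_accel(bus):
    """Return (x, y, z) acceleration in g at the default +/-2g range."""
    raw = bus.read_i2c_block_data(ADDR, ACCEL_XOUT_H, 6)
    def to_int16(hi, lo):
        v = (hi << 8) | lo
        return v - 65536 if v > 32767 else v
    # +/-2g full scale: 16384 LSB per g
    return tuple(to_int16(raw[i], raw[i + 1]) / 16384.0 for i in (0, 2, 4))

with SMBus(1) as bus:
    bus.write_byte_data(ADDR, PWR_MGMT_1, 0)  # clear sleep bit
    print(read_accel(bus))
```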
Billiard Assistant
The Pi Billiard Assistant is an embedded system consisting of a Raspberry Pi 4, a Raspberry Pi camera, and an external laptop that provides dynamic trajectory estimation based on the player's cue direction. The system calculates the trajectory, including collisions between the cue ball and the cushions. Players can better adjust their shots by referring to the predicted trajectory on the laptop display and explore more potential shots. The system is mounted on an external frame for video capture, leaving space for future stretch designs and implementations.
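A single cushion bounce reduces to a mirror reflection of the aim line. The sketch below illustrates that geometry under assumed table coordinates and variable names (not the project's actual code):

```python
# One-cushion trajectory: extend the aim line to the first cushion it hits,
# then reflect the direction component normal to that cushion.

def first_cushion_bounce(x, y, dx, dy, width, height):
    """Given cue-ball position (x, y) and a nonzero direction (dx, dy),
    return the cushion hit point and the reflected direction."""
    # Time-to-wall for each cushion the ball is moving toward.
    tx = (width - x) / dx if dx > 0 else (x / -dx if dx < 0 else float("inf"))
    ty = (height - y) / dy if dy > 0 else (y / -dy if dy < 0 else float("inf"))
    t = min(tx, ty)
    hx, hy = x + dx * t, y + dy * t
    # Reflect: flip the component normal to the cushion that was hit.
    if tx <= ty:
        return (hx, hy), (-dx, dy)
    return (hx, hy), (dx, -dy)
```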
Automatic Plant Watering
This final project was created by Maria Boza (mib57) & Ashley Heckman (agh93) for Cornell's ECE 5725 Spring 2023 class. We implemented a planter with many capabilities. The brain of our project is the Raspberry Pi 4 (RPi), which we used all semester for our labs. We used four sensor readings: soil moisture, humidity, temperature, and light, received from a plant monitor and a photoresistor sensor. We also implemented a camera to send a live feed of the plant to a website. We used Python to code all of the sensor readings, the camera, and the GUI (which was displayed on the piTFT). This project is geared towards new plant owners who do not know much about plants and would like help, as well as towards people who cannot water their own plants.
Network Square Shootout
Welcome to an incredible world where technology and gaming collide! In this thrilling adventure, we delve into the realm of TCP internet connection sockets, Pygame, and the mesmerizing TFT display. Prepare to witness a revolution in gaming as we unlock the secrets of creating a captivating field of vision, complete with bullet-shooting action and seamless movement. We take things a step further by designing meticulously crafted data structures that will be serialized through the very sockets that connect our players. The organization and efficiency of the Model-View-Controller (MVC) pattern takes center stage, with one Raspberry Pi proudly assuming the role of the server, while both Pi devices seamlessly transform into powerful clients. Additionally, we embrace cross-platform compatibility, ensuring that this epic gaming experience transcends limitations.
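Concretely, serializing game state over TCP usually comes down to framing each message with a length prefix. Here is a minimal sketch assuming pickled Python dictionaries; the message format and field names are our illustration, not the project's exact protocol:

```python
# Length-prefixed framing for game-state messages over a TCP socket.
import pickle
import socket
import struct

def send_msg(sock: socket.socket, obj) -> None:
    data = pickle.dumps(obj)
    sock.sendall(struct.pack("!I", len(data)) + data)  # 4-byte length prefix

def recv_msg(sock: socket.socket):
    header = sock.recv(4)
    (length,) = struct.unpack("!I", header)
    data = b""
    while len(data) < length:  # TCP may deliver the payload in pieces
        data += sock.recv(length - len(data))
    return pickle.loads(data)

# A client might send {"pos": (x, y), "fired": True} each frame, and the
# server (the MVC model) would broadcast the merged state back to both Pis.
```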
RPi DJ
Music is one of the most common forms of media consumed by college students, whether we are listening while studying or while hanging out with friends. For our final project, we wanted to build a customizable music system that functions like a DJ. This customizable DJ system allows the user to increase and decrease the volume, bass, mid, and treble of a song while it is playing. The adjustments are instantaneous and can be heard clearly in the audio playback. The user is given eight song options to choose from at the beginning of the program, and can choose a different song by simply hitting the back button on the screen. Through our graphical user interface, the user can be their own DJ on the Raspberry Pi.
Robo Buddy
Robo Buddy is a friendly robot assistant who is always ready to help answer questions and share his love of potatoes. Robo Buddy runs on NixOS and uses various APIs, including Whisper, OpenWakeWord, and Mimic, to listen to, understand, and respond to users. He went through multiple prototypes before we settled on a laser-cut acrylic body. He can dance and wave with the help of servo and DC motors.
Dynamic Directions
We are Elizabeth Garner and Joey Horwitz, and for our Embedded OS final project we built a direction-finding compass. The project has two parts: an embedded Raspberry Pi product and an Android app. The user inputs a destination into the app, and the Raspberry Pi controls a physical arrow that points to the target location. This is accomplished using the Google Places API, the phone's GPS module, a magnetometer to measure the Pi's orientation, and a stepper motor to control the arrow. At the end of the day we have a product that lets you know where you're going while still allowing you to wander. And, once you input the destination, all navigation can be accomplished without staring at a screen. It's ideal for freeform exploration of new places.
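The arrow-pointing math boils down to the initial great-circle bearing from the GPS fix to the destination, minus the magnetometer heading. A sketch of that calculation follows; the function names are ours, not the project's:

```python
# Initial great-circle bearing, then the relative angle for the stepper.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from (lat1, lon1) to (lat2, lon2), degrees from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def arrow_angle(bearing, heading):
    """Angle the arrow should point, relative to the device's current heading."""
    return (bearing - heading) % 360
```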
Self-Playing Guitar
This project was created to test the limits of a Raspberry Pi based guitar robot. We wanted to explore how close to normal playing we could get with what we had available in the lab. The hardware of the self-playing guitar includes an array of 12 servos that are individually positioned in their own holders up the guitar neck. Because the servos are in their own holders that then get attached to a wooden base, the guitar itself is able to remain unmodified and the machine could be placed on any guitar. The servos all line up above the B string on the guitar. Each servo wing has a rubber eraser on the end, mimicking a finger, which increases the effective surface area and allows tolerances to be looser. As each servo gets activated in the pattern of a song, the eraser frets the guitar string at the corresponding note. The 13th servo is placed inside a cavity for a pickup in the guitar body. This servo has a piece of adhesive on the wing. When the servo is activated to “pluck” the string, it essentially hits the string with the adhesive with enough force to slightly pull the string.
Capacitive Touch Keyboard
For our final project for ECE 5725, we designed a Capacitive Touch Keyboard. This was implemented by connecting capacitive touch sensors to the GPIO ports of the Raspberry Pi 4. When a key is pressed, a speaker connected to the RPi outputs the corresponding note. To add further controls to the keyboard, we designed a user interface on the piTFT screen that allows the user to change the keyboard octave, play a metronome in the background, and switch to “Drum Mode,” where all the keys are mapped to drum sound effects rather than piano keys.
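A minimal version of the key-scan loop might look like the sketch below, assuming RPi.GPIO inputs from the touch sensors and pygame.mixer for playback; the pin numbers and sample files are placeholders:

```python
# Poll capacitive touch pins; play the matching note on a new touch.
import time
import RPi.GPIO as GPIO
import pygame

KEY_PINS = {17: "sounds/C4.wav", 27: "sounds/D4.wav", 22: "sounds/E4.wav"}

pygame.mixer.init()
sounds = {pin: pygame.mixer.Sound(path) for pin, path in KEY_PINS.items()}

GPIO.setmode(GPIO.BCM)
for pin in KEY_PINS:
    GPIO.setup(pin, GPIO.IN)

try:
    held = set()
    while True:
        for pin in KEY_PINS:
            if GPIO.input(pin) and pin not in held:  # new touch: play its note
                sounds[pin].play()
                held.add(pin)
            elif not GPIO.input(pin):
                held.discard(pin)
        time.sleep(0.01)  # light polling; edge interrupts would also work
finally:
    GPIO.cleanup()
```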
Force Glove
Force Glove is a wearable product that controls a tiny wireless omni-directional robot. The glove controls the movement of the robot in two separate ways: (1) rotating the robot, and (2) "throwing" the robot forward, backward, left, or right. To make this possible, the glove consists of two flex sensors that determine when the fist is "grabbed", an IMU to get the direction of the rotation or throw, and a Raspberry Pi Zero to send the data to the robot (controlled by a Pi 4) via Bluetooth. The robot itself consists of the Pi 4, two motor controllers, two power supplies, and four omni-directional wheels. These wheels make it possible to throw the robot in any direction without rotating it beforehand.
PiEye Smart Glasses
The PiEye Smart Glasses system is an augmented reality assistant similar to Google Glass. It takes user commands spoken through a microphone and responds by displaying the response on a semi-transparent heads-up display (HUD). Asking it about the time will cause it to display a digital clock with the date, and asking it about the weather will cause it to display the current temperature, humidity, and a short weather description for Ithaca, NY, using the OpenWeatherMap API. If the user's command isn't about the time or weather, the question is forwarded to ChatGPT using the OpenAI API.
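The weather lookup against OpenWeatherMap's current-weather endpoint might look like this sketch; the key handling and response formatting are our illustration, not the project's code:

```python
# Fetch current weather for Ithaca, NY from OpenWeatherMap.
import requests

def ithaca_weather(api_key: str) -> str:
    url = "https://api.openweathermap.org/data/2.5/weather"
    params = {"q": "Ithaca,NY,US", "appid": api_key, "units": "imperial"}
    data = requests.get(url, params=params, timeout=5).json()
    return (f"{data['main']['temp']:.0f}F, "
            f"{data['main']['humidity']}% humidity, "
            f"{data['weather'][0]['description']}")
```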
Apollo Guidance Computer
The Apollo Guidance Computer was the primary computer used by astronauts of the Apollo Program. It performed complex calculations and provided control over critical mission operations. Astronauts used the computer through an interface known as the DSKY (for DiSplay and KeYboard). For this project, we have recreated the DSKY and some of its software using Python and a Raspberry Pi 4B. Furthermore, this DSKY can run user-defined Python programs.
Space Finder
Our team, leveraging the power of embedded systems and computer vision, is developing a Raspberry Pi-based system designed to scan, analyze, and report on the availability of study spaces in real-time. This project aims to enhance students’ experiences on campus, saving valuable time and optimizing the use of communal spaces. With the integration of technologies like OpenCV and Python, our system will not only recognize the presence of students and chairs but will also provide real-time updates about available spaces via a user-friendly website or application. Join us as we revolutionize the way students navigate their study environments, making finding a study spot as easy as checking a website.
American Sign Language
For our final project, we developed a real-time embedded system that can detect and classify basic American Sign Language (ASL), specifically the letters of the alphabet, excluding J and Z, and numbers 0 through 9. Through the video feed provided by the Raspberry Pi HQ camera, the program detects and captures images of a person’s hand using computer vision and classifies the resulting image through our custom machine learning models. Two custom models were trained and tuned to classify letters and numbers based on an open-source ASL image database. We created a user interface (UI) in order to gamify our system to be used as an educational tool to teach ASL.
Portable Tracking Camera
The Robot Camera System is an innovative project that combines robotics, computer vision, and web technology to deliver a dynamic and user-friendly experience. Utilizing a Raspberry Pi 4 board, Pi camera, BNO055 orientation sensor, and a mobile robot car, this system offers two modes of operation: handheld and remote control. In the handheld mode, the user navigates via a PiTFT screen and physical buttons, choosing from manual control, object tracking, and stabilization modes. The remote mode, accessible via a Flask-built web interface, offers the same operational modes plus the ability to control the robot car’s movement. Despite the operational mode chosen, the system ensures a consistent user experience through seamless data and command sharing between the PiTFT screen and the web interface. This project demonstrates a unique blend of modern technologies, offering an intelligent camera system that can be operated directly or remotely.
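As one illustration of the web-interface half, a shared mode endpoint in Flask could look like the sketch below; the route name and state handling are our assumptions, not the project's actual code:

```python
# A tiny Flask endpoint that reads or updates the operating mode shared
# with the camera/motor control loop.
from flask import Flask, jsonify, request

app = Flask(__name__)
state = {"mode": "manual"}  # also reflected on the PiTFT screen

@app.route("/mode", methods=["GET", "POST"])
def mode():
    if request.method == "POST":
        state["mode"] = request.get_json().get("mode", state["mode"])
    return jsonify(state)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```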
PiSketcher
Two-dimensional pen plotters are commonly used for making diagrams and machine drawings. Our project, PiSketcher, is a portable robotic arm capable of sketching various shapes with a drawing instrument on any sketching surface. Built with two degrees of freedom, PiSketcher's reach spans all points in the two-dimensional plane. Controlled movements of the arm across the 2D plane are computed using inverse kinematics. Motorized control also allows the writing instrument to be lifted when pressure is not required. PiSketcher has two built-in drawing modes: one that draws fixed, standard-size shapes, and one where the user controls the dimensions of the shapes using push buttons. These simple shapes can be extended to draw more complex shapes and drawings.
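For a two-link arm, the inverse kinematics has a standard closed form. Here is a minimal sketch assuming link lengths L1 and L2 and a target (x, y) in the arm's plane, showing the elbow-down solution; the names are illustrative, not PiSketcher's actual code:

```python
# Closed-form inverse kinematics for a planar 2-DOF arm.
import math

def ik_2dof(x, y, L1, L2):
    """Return (shoulder, elbow) angles in radians reaching point (x, y)."""
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    cos_elbow = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if not -1 <= cos_elbow <= 1:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    # Shoulder angle: direction to target minus the offset from the bent elbow.
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return shoulder, elbow
```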
Color Tube Solver
The Color-Tube game is a captivating puzzle that challenges players to sort the tubes using strategy and logic. At the beginning of the game, there are multiple tubes containing colored blocks and two empty tubes. Each tube can hold up to four blocks, and there are four blocks of each color in the game. The objective is to strategically use the empty tubes to rearrange the blocks of different colors that start out mixed in various tubes, placing blocks of the same color into the same tube. As the game progresses, the number of blocks and tubes increases, and some levels have tricky obstacles that make it difficult to find a solution. To keep the fun going, we created an automatic solving device that takes photos, analyzes the positions and colors of the blocks in the image, and uses an efficient algorithm to generate an optimal solving strategy. The device uses an iPencil to execute this strategy in the game and complete the level.
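One way to frame the solver is as a breadth-first search over tube states. The sketch below represents each state as a tuple of tubes (each tube a tuple of colors, bottom to top) and pours one block per move for simplicity (the real game pours a whole run); the representation is our assumption, not necessarily the project's:

```python
# BFS over tube states: find a sequence of pours that sorts the colors.
from collections import deque

CAP = 4  # blocks per tube

def moves(state):
    """Yield ((src, dst), new_state) for every legal single-block pour."""
    for i, src in enumerate(state):
        if not src:
            continue
        for j, dst in enumerate(state):
            if i != j and len(dst) < CAP and (not dst or dst[-1] == src[-1]):
                new = list(state)
                new[i] = src[:-1]
                new[j] = dst + (src[-1],)
                yield (i, j), tuple(new)

def solved(state):
    return all(not t or (len(t) == CAP and len(set(t)) == 1) for t in state)

def solve(start):
    """Return a list of (src, dst) pours solving the puzzle, or None."""
    queue, seen = deque([(start, [])]), {start}
    while queue:
        state, path = queue.popleft()
        if solved(state):
            return path
        for move, nxt in moves(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [move]))

# e.g. solve(((0, 0, 1, 1), (1, 1, 0, 0), (), ())) returns a pour sequence.
```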
Party Bot
Music plays an essential role in everyone’s life. Whether at home, a concert, or a party, music is constantly played regardless of the setting. Therefore, for our final project, we created a party robot that embodies all aspects of a quintessential DJ and party. The robot plays the music of the user’s choice through JBL speakers connected to the Raspberry Pi and dances to the song using omnidirectional wheels.
Tidy Plotter
A 2-D plotter is far from a novel idea. Nevertheless, we chose to pursue this project to explore how cheap and accessible technology can be utilized to design and build a device capable of producing high-quality and accurate drawings. Unfortunately, many plotter designs are both large and expensive. Therefore, we aimed to create something robust, functional, and reliable in our development. Simultaneously, we also wanted to construct it using materials that could be cheaply bought or easily built by a large community of makers.
Magical Slide Whistle
The main goal of this project is for the RPi to control a slide whistle as a musical instrument and play songs. A stepper motor drives a linear gear rail that controls the slide of the whistle. The system is programmed to automatically move the slide to the correct location to produce the notes of a song. It also reminds the human "musician" when they should play our autonomous instrument, resulting in a Magical Slide Whistle that "plays itself."
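The note-to-position mapping follows from treating the whistle as a closed-end pipe, where the resonant frequency is f = c / 4L. Below is an illustrative sketch; the calibration constants are placeholders that would be measured on the real instrument:

```python
# Convert a target pitch to a stepper position using closed-pipe acoustics.
SPEED_OF_SOUND = 343.0  # m/s at room temperature
STEPS_PER_METER = 5000  # hypothetical stepper/rail calibration

def slide_steps(freq_hz: float, end_offset_m: float = 0.0) -> int:
    """Stepper position (in steps) that produces the requested pitch."""
    length_m = SPEED_OF_SOUND / (4.0 * freq_hz)  # closed-pipe resonance
    return round((length_m - end_offset_m) * STEPS_PER_METER)

# e.g. A4 = 440 Hz wants an effective tube length of about 19.5 cm
```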
Auto Magic Etch-a-Sketch
We set out to build a system that could take any given image and draw it on an Etch-A-Sketch. We made the design user-friendly, so that anyone can upload an image and have it appear on the Etch-A-Sketch screen. We developed a user interface that guides the user through these steps: add an image to a specified folder on the Pi, select the desired image from the folder, generate a trace (path file) of the image, check the simulation, and draw the image.
Butter Bot
After learning so much about the Raspberry Pi in class, we knew that with it, we would be capable of accomplishing anything we wanted. Of course, the first thing that came to our minds was the Butter Bot from Rick and Morty, as it would be both a complex and fun project to work on. Our version of the Butter Bot is not nearly at the same level as the one in the show, but is still able to accomplish its sole purpose – passing butter. The robot uses a Raspberry Pi camera to detect people in dire need of butter (the first person it finds), then uses a pair of tank treads to navigate towards them. From there, it uses a linear actuator with a razor at the end to cut a piece of butter from a stick loaded inside of it, then launches it at the person using a motorized catapult system. The robot then reloads another piece of butter by winding the catapult back and using another linear actuator to push the rest of the stick inside. It then seeks out its next victim – er, patron – and repeats the process.
2048
For our final project, we decided to develop our own spin on the classic game 2048. In 2048, tiles labeled with powers of two are laid out on a 4-by-4 grid. You can shift all tiles up, down, left, or right until they collide with another tile or the boundary of the grid. If two tiles with the same label collide, they merge into a single tile labeled with their sum. You lose when the board fills completely with tiles and no move can merge any of them. We aimed to implement 2048 so it was playable on the PiTFT and could be controlled using either the four tactile buttons or WASD on an external keyboard. Further, we wanted the player's high score to persist across sessions, and we wanted each player to be able to view the high scores of other players playing on different machines. We implemented both the game logic and the GUI in Python (using the pygame library), and we implemented persistent high scores by recording them in Cloud Firestore.
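The heart of the game logic is sliding one row and merging equal neighbors once per move; the full game applies this to every row and column in the chosen direction. A small sketch under assumed names (not the project's actual code):

```python
# Slide a 4-element row left, merging equal tiles; 0 means an empty square.

def slide_left(row):
    tiles = [v for v in row if v]          # drop empty squares
    merged = []
    i = 0
    while i < len(tiles):
        if i + 1 < len(tiles) and tiles[i] == tiles[i + 1]:
            merged.append(tiles[i] * 2)    # equal tiles combine into their sum
            i += 2                         # each tile merges at most once
        else:
            merged.append(tiles[i])
            i += 1
    return merged + [0] * (len(row) - len(merged))

# e.g. slide_left([2, 2, 4, 0]) -> [4, 4, 0, 0]
```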