ECE5725 Spring 2024 Projects

Self-Driving Straight-Line Car

Self-driving cars (autonomous vehicles, AVs) have long been a dream of the future, with more and more companies and individuals looking to revolutionize the space with novel algorithms and hardware. These systems are usually complex, with dozens of cameras, LiDAR, ultrasound, radar, and GPS guidance sensors complementing a powerful multi-core CPU and GPU. This allows modern AVs to run complex machine-learning models that detect traffic and road conditions and then make decisions based on that data. We wondered whether we could simplify a complex AV system to a basic level yet still achieve the same goal of autonomous driving. In this project, we build a self-driving car with a Raspberry Pi 4, a single-board computer with a quad-core ARM Cortex-A72 CPU. We use a Raspberry Pi camera to capture the road ahead and process the image with OpenCV to detect the lanes. We then calculate a heading angle from this image and use the angle to steer the vehicle. The car is driven by two DC motors, and a PID controller adjusts the motor speed and direction to keep the vehicle centered in the lane. However, real AVs require much more than simple vision, since even the most advanced vision systems can be fooled by shadows, reflections, and other environmental factors. We therefore supplement our vision system with an inertial measurement unit (IMU) to detect the vehicle’s orientation and photointerrupter encoders to measure its speed. We fuse the data from these sensors to adjust the AV’s speed and direction, supplementing the vision system and ensuring reliable performance even under suboptimal lighting conditions. This project demonstrates how an embedded system can manage multiple sensors and motors to achieve a complex task like autonomous driving, and showcases a small-scale implementation of real-time, decision-making robotics.
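
As a rough illustration of the steering loop described above, the sketch below shows a minimal PID controller mapping a lane heading error to differential motor speeds. The gains, speed range, and function names are hypothetical placeholders, not the project's actual values.

    import time

    class PID:
        """Textbook PID controller; the gains here are illustrative, not tuned."""
        def __init__(self, kp, ki, kd):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_error = 0.0
            self.prev_time = time.monotonic()

        def update(self, error):
            now = time.monotonic()
            dt = (now - self.prev_time) or 1e-3
            self.integral += error * dt
            derivative = (error - self.prev_error) / dt
            self.prev_error, self.prev_time = error, now
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    pid = PID(kp=0.8, ki=0.05, kd=0.1)

    def motor_speeds(heading_error_deg, base_speed=50):
        # A positive error (car pointed right of lane center) speeds up the
        # left wheel and slows the right one, turning the car back to center.
        correction = pid.update(heading_error_deg)
        left = max(0, min(100, base_speed + correction))
        right = max(0, min(100, base_speed - correction))
        return left, right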

Embedded Mahjong Player

Mahjong is a Chinese tile game that requires four players. People often find it hard to gather four players; frequently only three are available, leaving the game one player short. To solve this problem, our team built an embedded Mahjong player that can fill in as the extra player and let the game start. Our project first uses the Pi Camera to take photos of Mahjong tiles, uses those photos to train a computer vision model, and then uses the trained model to identify the tiles. Next, the Mahjong algorithm determines which tile to play each round and sends instructions to the PiTFT and a servo to indicate the played tile. Displays and buttons on the PiTFT also serve as round controls.

Tetris

We set out to create an arcade-style embedded system that lets a user play Tetris, powered by a Raspberry Pi 4. Rather than display the game on a typical screen, we opted to integrate a 32×64 LED matrix to show the game to the player. The player interacts with the game through arcade buttons housed in a 3D-printed enclosure that holds all of the electronics. The game is implemented from scratch, allowing a player to quick-drop, save pieces, and shoot for a high score!
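
For reference, driving such a panel from a Pi is commonly done with the rpi-rgb-led-matrix Python bindings; the sketch below assumes that library, since the write-up does not name the one actually used.

    from rgbmatrix import RGBMatrix, RGBMatrixOptions

    options = RGBMatrixOptions()
    options.rows, options.cols = 32, 64   # panel geometry from the write-up
    matrix = RGBMatrix(options=options)

    def draw_cell(x, y, color=(255, 0, 0)):
        # Each Tetris cell maps to one matrix pixel (or a small square of them)
        matrix.SetPixel(x, y, *color)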

Hey Mungo!

Studying at Cornell can be very time-consuming and lonely, and making friends is tough! However, with our assistant Mungo, you don’t have to feel that way anymore. You can ask Mungo to tell you the weather before you head out for a big day, or ask it to tell you a joke to cheer you up after a long day in the lab. Even if you only get a breather once every other Tuesday, you can still use Mungo to have some fun and play chess!

Ultrasonic Levitation Flappy Bird

Since the release of Flappy Bird in 2013, its simple and addictive gameplay has attracted millions of players worldwide. However, most modern games still rely heavily on touch screens or traditional controllers. Why not make something actually fly in front of us? By combining a Raspberry Pi with ultrasonic sensors, this project explores new ways for players to interact with games, merging modern technology with the physical world. This novel method of interaction not only enhances the gaming experience but may also stimulate the interest of developers and creators in using the Raspberry Pi for physical computing and game design.

Barrier Avoidance Robot

We demonstrate the use of three ultrasonic sensors in our project, enabling our robot to turn at the correct angle upon detecting obstacles. We also incorporated an MPU-6050 gyroscope to keep the robot driving as straight as possible. A PiCamera streams the robot’s vision, serving as a surveillance camera.
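
The write-up does not name the sensor model, but a common choice is the HC-SR04; a minimal distance-reading sketch under that assumption, with placeholder pin numbers rather than the project's actual wiring, looks like this:

    import time
    import RPi.GPIO as GPIO

    TRIG, ECHO = 23, 24          # placeholder BCM pins, not the project's wiring
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(TRIG, GPIO.OUT)
    GPIO.setup(ECHO, GPIO.IN)

    def distance_cm():
        # A 10 microsecond pulse on TRIG starts one measurement
        GPIO.output(TRIG, True)
        time.sleep(10e-6)
        GPIO.output(TRIG, False)
        start = end = time.monotonic()
        # ECHO stays high for a time proportional to the round-trip distance
        while GPIO.input(ECHO) == 0:
            start = time.monotonic()
        while GPIO.input(ECHO) == 1:
            end = time.monotonic()
        return (end - start) * 34300 / 2   # speed of sound, halved for round trip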

Pi-Home Voice

For this project, we designed a touch and voice user interface on a Raspberry Pi 4 with a PiTFT and a USB microphone that sends commands to control smart connected home devices. We could control three devices: the living room light, the kitchen light, and the door lock. The goal of this project was to speak commands and to control and monitor the home devices wirelessly, using PyGame and PiGame for the touch user interface and SpeechRecognition for voice recognition. We made a model home with two LEDs for the kitchen and living room lights and a servo motor for a raising and lowering door latch. These devices were wired to a Raspberry Pi Zero W server in the model home. The Raspberry Pi 4 user interface sent commands to the Pi Zero W server over Wi-Fi using the Python socket library, configuring the devices after a voice command was given. The Raspberry Pi 4 is secured through facial ID, using a Raspberry Pi Camera and OpenCV to recognize the owner of the device. We developed our own ML Random Forest classifier to map voice commands to actionable commands for the Pi Zero W to carry out in the model home.
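
A minimal sketch of the Pi 4 to Pi Zero W command path over Python sockets might look like the following; the host address, port, and message format are illustrative assumptions, not the project's actual protocol.

    import socket

    def send_command(command, host="192.168.1.50", port=5005):
        """Send one text command to the model-home server and return its reply."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.connect((host, port))
            s.sendall(command.encode())     # e.g. "KITCHEN_LIGHT ON"
            return s.recv(1024).decode()    # acknowledgment from the Pi Zero W

    # send_command("DOOR UNLOCK")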

Pi Tamagotchi

Tamagotchi is a handheld virtual pet simulation game released in the 1990s, in which the user must take care of a pet by feeding and training it. Our project, Pi-Tamagotchi, is inspired by this nostalgic system. We keep the objective of taking care of virtual pets, keeping them happy, healthy, and fed, but also incorporate a facial recognition component to foster a greater sense of camaraderie and connection. We used a Raspberry Pi 4, a PiTFT, and a Pi camera to create a device reminiscent of the Tamagotchi, and OpenCV and Pygame for the system software.

Ping Pong Ball Balance

The objective of our project was to explore the possibility of balancing a ping pong ball on a flat surface whose orientation is controlled by two positional servos. We aimed to create a robust system that could reject large disturbances to the ping pong ball as well as catch the ball when it is thrown on. This project combines several techniques carefully and efficiently: computer vision to track the ping pong ball, PID control to update the desired servo positions, accompanying hardware pulse-width modulation to execute the servo updates, a piTFT display for intuitive user control, multiprocessing to ensure the necessarily fast response, and a strong yet lightweight build to facilitate quick yet precise reactions.
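
The ball-tracking step could be sketched as the color-threshold pipeline below; the HSV range for an orange ball is a generic guess, not the project's tuned values.

    import cv2

    # Generic HSV range for an orange ball; the project's tuned values differ.
    LOWER, UPPER = (5, 120, 120), (20, 255, 255)

    def find_ball(frame_bgr):
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER, UPPER)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        (x, y), radius = cv2.minEnclosingCircle(max(contours, key=cv2.contourArea))
        # The (x, y) offset from the plate center feeds the two PID loops
        return (int(x), int(y)) if radius > 5 else None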

Recipe Generator

This project introduces an innovative application of the Raspberry Pi 4, paired with a PiCamera, designed to enhance the cooking experience by seamlessly integrating technology into the kitchen. The core functionality of this system lies in its ability to scan barcodes of various kitchen ingredients using the PiCamera. Once scanned, these ingredients are cataloged into a digital list. This list is then cross-referenced against a comprehensive database of recipes. The system smartly identifies and displays a step-by-step guide for the recipe that utilizes the highest number of available ingredients, ensuring an efficient and creative cooking process. This project not only simplifies meal preparation but also encourages the use of on-hand ingredients, reducing waste and promoting culinary creativity.
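
The project summary does not name a decoding library, but barcode scanning on a Pi is often done with OpenCV plus pyzbar; a minimal sketch under that assumption is shown below. Each decoded payload (e.g. an EAN-13 number) can then be looked up to add the ingredient to the digital list.

    import cv2
    from pyzbar.pyzbar import decode

    def scan_barcodes(frame_bgr):
        """Return the decoded payload of every barcode visible in one frame."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        return [b.data.decode("utf-8") for b in decode(gray)]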

Autotune

Pi Autotune is an embedded system for tuning voice and instrumentals using a Raspberry Pi 4 and a USB-connected microphone. The graphical interface on the PiTFT reveals the extent to which the input is flat, sharp, or spot-on, providing a sliding scale that displays the frequency difference between the input and the desired frequency. The user can cycle between this default mode, submitting a recording to be autotuned, and singing along to a song, karaoke-style, which produces a version of the song where the user is singing at the proper tempo. Pi Autotune is based on a mathematical formulation using the Fast Fourier Transform (FFT), sound frequency relations, and domain transformation to properly autotune any input.
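
The core measurement, finding how far the input is from a target pitch, can be sketched with NumPy's FFT as below; the sample rate and function names are illustrative assumptions.

    import numpy as np

    def dominant_frequency(samples, sample_rate=44100):
        """Estimate the strongest pitch in one block of audio samples."""
        windowed = samples * np.hanning(len(samples))   # reduce spectral leakage
        spectrum = np.abs(np.fft.rfft(windowed))
        freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
        return freqs[np.argmax(spectrum)]

    def cents_off(freq, target):
        # 100 cents per semitone; positive means sharp, negative means flat
        return 1200 * np.log2(freq / target)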

Face Recognition Door Lock

We developed a facial recognition access system prototype using a Raspberry Pi. The project involved creating Python scripts for facial recognition and user interface control, constructing a 3D-printed door with a solenoid lock, and adding additional small doors to demonstrate room access. The system verifies captured facial images against a database and displays access permissions on a piTFT screen, ensuring only authorized users can enter.

Volumetric Display

Flat LED displays provide convenient ways to show images in 2D, but 2D displays can feel flat and lifeless. Our project is a volumetric display that aims to show images in 3D. The goal of our project was to use the persistence of vision effect to display 3D images and animations using a spinning 2D LED matrix. By spinning the matrix and blinking its LEDs in rapid succession, we generate images and animations that appear to occupy a 3D volume, like a hologram; in essence, we create 3D from 2D. More broadly, one of the motivations behind our project is as a method for creating mesmerizing, customizable 3D light displays in cities like New York for holidays and special events.
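
The heart of such a display is timing: one revolution is divided into angular slices, and the matrix shows the image column for whichever slice it is currently sweeping through. The sketch below illustrates the idea; the slice count and the get_column/show_column callbacks are hypothetical stand-ins for the project's internals.

    import time

    SLICES = 120                  # angular resolution per revolution (assumed)

    def spin_loop(rotation_period_s, get_column, show_column):
        slice_time = rotation_period_s / SLICES
        while True:
            start = time.monotonic()
            for i in range(SLICES):
                show_column(get_column(i))      # light this angular slice
                # busy-wait until the next slice boundary for precise timing
                while time.monotonic() - start < (i + 1) * slice_time:
                    pass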

Security Camera

The primary objective of this project centered on developing a DIY multi-camera CCTV system built atop Raspberry Pi hardware while integrating computer vision algorithms and capabilities. More specifically, IOT Sauron aims to give cameras the ability to track and reposition, actively adapting to the location of users in frame. In addition, this project focused on making the camera streams web-broadcastable, enabling consumers to monitor the camera feed from work or other remote locations. Altogether, this provides a working prototype security system capable of being assembled and used within a home environment.

Automatic Selfie Camera

Inspired by the concept of gesture-controlled technology, our project aims to design an automatic selfie camera that takes photos in response to a handclap, pointing the camera in the direction of the clap. The system makes use of the PiCamera, a piTFT, and two microphones for audio input and signal processing. Our design comes in two parts: the piTFT user display, which comprises the UI and all its menu items, and the controls setup, which consists of two USB microphones, the PiCamera, and the servos that steer the camera.
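
One common way to localize a clap with two microphones is the time difference of arrival (TDOA) between the channels; the sketch below illustrates that approach, with the mic spacing and sample rate as assumptions rather than the project's actual parameters.

    import numpy as np

    def clap_delay_samples(left, right):
        # Cross-correlate the two channels; the peak offset is the lag at which
        # the signals best align, i.e. the inter-microphone delay.
        corr = np.correlate(left, right, mode="full")
        return np.argmax(corr) - (len(right) - 1)

    def clap_angle_deg(left, right, mic_spacing_m=0.2, sample_rate=44100):
        delay_s = clap_delay_samples(left, right) / sample_rate
        # Far-field approximation: sin(theta) = c * delay / spacing
        s = np.clip(343.0 * delay_s / mic_spacing_m, -1.0, 1.0)
        return np.degrees(np.arcsin(s))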

Pi Beat

PiBeat is a reaction time game played on a Raspberry Pi with a piTFT connected to an accelerometer and a 32×16 LED matrix. On boot, the game automatically starts at the home screen using cron. The user uses the piTFT to select which team they want to join, which contributes to the leaderboard score, and presses the corresponding piTFT buttons to start, stop, or resume gameplay. The game is displayed on the LED matrix, where instructions flow down the board, indicating the desired orientation of the accelerometer. The player times their actions to orient the accelerometer as directed when the instructions reach the bottom of the screen, and they are awarded points based on how accurately their actions are timed. The game ends when the player reaches a threshold number of missed instructions. The system is driven by a main game loop, which periodically polls the accelerometer and produces output signals for the LED matrix. The game loop also generates random instructions and instruction positions, which adds an unpredictable component to the game.
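
A skeletal version of that loop might look like the following; the instruction names, spawn rate, miss threshold, and the read_accel/draw callbacks are all hypothetical stand-ins for the project's internals.

    import random
    import time

    INSTRUCTIONS = ["tilt_left", "tilt_right", "face_up", "face_down"]
    BOTTOM_ROW = 15                         # last row of the 32x16 matrix

    def game_loop(read_accel, draw, fps=20):
        falling, score, missed = [], 0, 0
        while missed < 5:                   # threshold of missed instructions
            if random.random() < 0.1:       # randomly spawn a new instruction
                falling.append([random.choice(INSTRUCTIONS), 0])
            for item in falling:
                item[1] += 1                # scroll each instruction down a row
            for instr, row in [i for i in falling if i[1] >= BOTTOM_ROW]:
                if read_accel() == instr:   # poll the accelerometer orientation
                    score += 1
                else:
                    missed += 1
            falling = [i for i in falling if i[1] < BOTTOM_ROW]
            draw(falling, score)            # refresh the LED matrix
            time.sleep(1.0 / fps)
        return score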

PiDeck

At any club, you’re likely to find a DJ using a high-tech controller to mix music. The technology behind mixing music has advanced significantly, allowing DJs to manipulate tracks in increasingly entertaining and innovative ways. However, commercial DJ controllers are often expensive and inaccessible, unnecessarily complicating entry into DJing as both a hobby and a profession. PiDeck is our lower-cost, Raspberry Pi-based alternative: our hardware setup consisted of 4 large push buttons, 4 small push buttons, 2 rotary encoders, and a piTFT display.

Las Vegas Sphere

If you want to experience Las Vegas right in your own home, look no further, because we have the perfect solution for you. This project creates a mini Las Vegas Sphere featuring some of the cool images you might see on the real sphere, such as its famous emoji and our home, the Earth. This light display will take you traveling back to the city of lights. Using a persistence of vision illusion, the project spins a ring of pulsing LEDs with a motor at a rate such that our eyes perceive an image from the movement.

Raspberry Pi Tag Rover

This project develops a Raspberry Pi-powered autonomous robot that navigates and operates independently, using a Pi camera to recognize AprilTags. This allows the robot to execute tasks based on visual cues, ensuring precise and intelligent navigation without human intervention. Upon recognizing a tag using OpenCV, the robot automatically adjusts its behavior, calibrates itself, and displays the tag’s ID information. The integration of the Raspberry Pi with a Pi camera, micro-servo system, and OpenCV significantly enhances the vehicle’s capability for interactive and automatic control, making it a sophisticated tool for navigating complex environments.
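
Tag detection could be sketched as below, assuming the apriltag Python package; the summary names OpenCV and AprilTags but not the exact detector library used.

    import cv2
    import apriltag

    detector = apriltag.Detector()

    def detect_tags(frame_bgr):
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        # Each detection carries a tag ID and a pixel-space center, which the
        # rover can map to a behavior and a steering correction.
        return [(d.tag_id, d.center) for d in detector.detect(gray)]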

Friend or Foe

Since lab sessions run until 7:30, there is no time for dinner! We are always snacking in the lab, but with a limited supply, we can only share snacks with our friends. Therefore, we created the Friend or Foe snack dispenser to select who gets snacks. Furthermore, this project let us explore the computational ability of the Raspberry Pi to handle complex algorithms such as the Haar Cascade facial recognition algorithm.
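
The detection step of a Haar Cascade pipeline looks roughly like the OpenCV sketch below; the scale and neighbor parameters are typical defaults rather than the project's tuned values, and the friend-or-foe decision logic is not shown.

    import cv2

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def detect_faces(frame_bgr):
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        # Returns one (x, y, w, h) rectangle per detected face
        return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)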

PlayPi

The objective of our project was to build a motor system that spins a strip of LED lights and syncs the light effects to music played by the Raspberry Pi through Spotify. We then made the project more ambitious by creating a persistence of vision display whose image syncs to the music. Initially, we were going to display a heart, but due to time constraints and technical limitations, we opted for a flower instead. The idea is that the flower’s petals sync to the beat of the music, growing or shrinking in size, with additional functionality to change the color of the petals.

Trumpet Harmonizer

This is the Trumpet Harmonizer, a device that a trumpet player can use to harmonize with themselves in real time. Similar devices exist, such as guitar pedals, but they are usually exclusive to electric instruments and alter the signal before it is amplified and played through speakers. This task is more difficult for instruments like the trumpet or the human voice, which have no wires or electronic components to tap into. We were inspired by devices that achieve this idea in some form, such as the vocoder and Jacob Collier’s custom harmonizer. Effects pedals have been used on trumpet sound before, but they typically change the timbre or add reverberation, and we did not find any instances of harmonizing effects aside from an octave pedal. Our project achieves harmonization for standalone acoustic instruments like the trumpet, albeit with many opportunities for future improvement.
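
One simple way to realize such an effect, offered here only as a hedged illustration since the write-up does not describe the actual signal chain, is to pitch-shift the recorded microphone signal by a musical interval and mix it with the dry sound, for example with librosa:

    import librosa
    import soundfile as sf

    def harmonize(in_path, out_path, interval_semitones=4):
        y, sr = librosa.load(in_path, sr=None, mono=True)
        harmony = librosa.effects.pitch_shift(y, sr=sr, n_steps=interval_semitones)
        sf.write(out_path, 0.5 * y + 0.5 * harmony, sr)   # blend dry + harmony

    # harmonize("trumpet.wav", "harmonized.wav")  # adds a major third above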

Nerf Turret

The Nerf turret was built out of Legos and a Nerf blaster and was controlled using the Pi, a motor controller, Lego motors, and relays for blaster control. Using a Pi Camera and computer vision with OpenCV, the turret scans the environment for a target, for which we used a red circle. Two motors, one for horizontal motion and one for vertical, aim the blaster so that the target sits in the middle of the turret’s view. Two limit switches provide height feedback, preventing the turret from continuing to drive up or down once it hits the range limit. Additionally, we displayed live video of the turret’s view with targeting indicators to show when a target was detected. Once a target was detected and properly aimed at for 3 seconds, the blaster would be powered from the Pi’s 5V pin through a relay and fire at the target. We also allowed for easy on/off control of the turret by touch on the PiTFT screen.
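
The red-target detection step could be sketched as below; the HSV thresholds are generic values for red (whose hue wraps around 0, hence the two ranges), not the project's calibrated ones.

    import cv2

    def find_red_target(frame_bgr):
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255)) | \
               cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
        M = cv2.moments(mask)
        if M["m00"] == 0:
            return None
        # Centroid of the red region; aiming drives this toward frame center
        return int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"])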

Smart Surveillance System

Buildings are among the most crucial pieces of infrastructure in daily life and work, and this project aims to enhance the occupant experience in terms of both security and interior atmosphere. Video and CCTV systems have been used for security applications for decades; they provide remote surveillance for security operators and keep a video record of the monitored spaces. Our system employs facial recognition technology to control access to the building: authorized individuals are recognized immediately, and access is seamlessly granted. Upon entry, the system can display relevant indoor air quality (IAQ) and ambient data at the gate so that users can adjust the office environment from their mobile phones before heading to their office rooms, preparing a cozy workspace in advance. Moreover, an admin system is designed for building administrators to manage access grants; for example, the admin can add or remove a person’s authorization through the admin system. Security staff can also control the orientation of the surveillance camera to cover a wider range.

Felicia: An Animatronic Face

For our final embedded operating systems project, we made a cardboard animatronic face that makes different facial expressions, moves its mouth to music, and vomits confetti. Our crafty project combined mechanical and electrical concepts as we integrated the cardboard facial components with positional servo motors and controlled them with the Raspberry Pi. The animatronic eyes are an interesting mechanical feature of our project as we used gimbals, ping pong balls, and wire to enable horizontal and vertical rotational motion to mimic real eye movements.

BoPit

For our final project embedded device, we were interested in developing a fun and interactive game. Bop-It encapsulated that idea while requiring us to learn how to work with sensors, electronics, and the pygame modules. We first defined flick-it, shake-it, spin-it, pull-it, and bop-it as the five actions of the game. For each action, we matched a sensor or device that the player could interact with to register the action. After wiring a push button, limit switch, rotary encoder, MPU, and joystick, we began to develop the game logic and screens for the piTFT. A feature of our Bop-It game logic allows the player to input their initials and save their score to a leaderboard screen. Finally, to wrap the project together, we integrated the RPi, piTFT, and the Bop-It circuit into a cardboard box that housed all the sensors for a user to play the game.

CompanionBot

In today’s fast-paced, technology-driven world, people of all ages and backgrounds are increasingly reliant on their smartphones for interaction and entertainment. While this trend brings many conveniences, it also contributes to social isolation and reduces face-to-face interaction. To address these challenges, we designed CompanionBot, an innovative companion robot that provides meaningful interactions and offers a respite from screen time. CompanionBot is designed to cater to various groups, offering them valuable opportunities to engage with a responsive and interactive companion. Whether for elderly individuals seeking companionship, children needing a playful friend, or anyone looking to reduce their smartphone dependency, our robot serves as a versatile and enjoyable solution. The robot has three core modes that enhance the user experience. First, Chat Mode: the robot can participate in dialog, providing users with a sense of companionship and the fun of interactive conversation. Second, Play Mode: the robot listens to commands and executes user interactions, performing various actions on request. Third, Music Mode: the robot is preloaded with 8 selected songs to provide a pleasant music experience, and the user can stop, fast-forward, rewind, and exit music mode.

RasPi Beats

RasPi Beats is an interactive drum beat maker. The embedded system uses an LED panel, a display, and a button matrix to make drum beats. The system lets you interact with eight tracks simultaneously, each consisting of 16 beats. Playback loops through the beats continuously for consistent playback. The system can play up to 13 sounds, and each track can be toggled between the different sounds.
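
The core of such a beat maker is a step-sequencer loop over an 8×16 grid; the sketch below is a minimal illustration assuming pygame.mixer for playback, with placeholder sample files and tempo.

    import time
    import pygame

    pygame.mixer.init()
    sounds = [pygame.mixer.Sound(f"drum{i}.wav") for i in range(8)]  # hypothetical files
    grid = [[False] * 16 for _ in range(8)]    # grid[track][step], toggled by buttons

    def run_sequencer(bpm=120):
        step_time = 60.0 / bpm / 4             # 16th-note steps
        step = 0
        while True:
            for track in range(8):
                if grid[track][step]:
                    sounds[track].play()
            step = (step + 1) % 16             # loop back for continuous playback
            time.sleep(step_time)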

Stepper Symphony

Our goal in this project was to create music using stepper motors. By controlling the speed and duration of a stepper motor’s rotation, we can control the pitch it makes and play musical notes. Building on this principle, we play different notes on the stepper motors using a keyboard, simulating a piano.
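
The pitch trick is that stepping a motor at a note's frequency makes it hum that note; a hedged sketch of the idea, with an illustrative GPIO pin and a software-timed step loop rather than the project's actual driver code, is shown below.

    import time
    import RPi.GPIO as GPIO

    STEP_PIN = 18                 # placeholder pin for the driver's STEP input
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(STEP_PIN, GPIO.OUT)

    def play_note(freq_hz, duration_s):
        # One step pulse per cycle of the target frequency
        half_period = 1.0 / freq_hz / 2
        end = time.monotonic() + duration_s
        while time.monotonic() < end:
            GPIO.output(STEP_PIN, True)
            time.sleep(half_period)
            GPIO.output(STEP_PIN, False)
            time.sleep(half_period)

    # play_note(440.0, 0.5)   # A4 for half a second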

Electronic Battleship

Electronic Battleship is a project for playing games of Battleship on a 10×10 grid where each square has RGB functionality. Specifically, the game is played by pressing buttons on a 10×10 button board sitting on top of a 32×32 LED screen background. The system lets the user choose between a one-player version of Battleship, in which the player battles a computer-controlled opponent, or a more traditional version in which two players battle each other, taking turns on the board. The 32×32 LED screen displays the 10×10 grid by using 3×3 groups of pixels as grid spaces, leaving a single-LED-thick border. Each grid space has an associated button mounted over its center. During normal game operation, on a player’s turn, they press the button of the grid space they wish to guess. If one player sinks all of the other player’s (or the AI’s) ships, the board displays a ‘winner’ message before returning to the starting screen to re-select the game mode.
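
That layout implies a simple grid-to-pixel mapping, sketched below as an illustration of the geometry described above (the function name is ours, not the project's):

    def grid_to_pixels(row, col):
        """Return the 3x3 block of (x, y) pixels for grid space (row, col)."""
        # Skip the one-LED border, then step 3 pixels per grid space; the last
        # cell ends at pixel 30, leaving pixel 31 as the far border.
        x0, y0 = 1 + 3 * col, 1 + 3 * row
        return [(x0 + dx, y0 + dy) for dy in range(3) for dx in range(3)]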

Vision Matrix

In this project, we developed Vision Matrix, an interactive art installation designed to engage users through dynamic visual representations. We utilized a combination of two Raspberry Pis, a PiTFT display, an LED matrix, and a camera to bring this concept to life. The installation captures the intersection of technology and art, transforming real-time video data into interactive displays on an LED matrix. The Vision Matrix operates through a dual-system setup. The primary Raspberry Pi processes video inputs to create real-time art manifestations such as silhouettes and gesture drawings, which are then displayed on the LED matrix. The secondary Raspberry Pi serves as an interactive control panel via the PiTFT display, allowing users to select among three modes: mirror mode, art mode, and gallery mode.

MidiKey Conductor

Want to get started with recording music, but don’t know how to use the complicated settings in music production software? This project covers the basics of recording a MIDI file and changing its dynamics using intuitive motion controls. With our system, the user can set a metronome to their desired beats per minute (BPM) and play the keyboard as if it were a piano. They can then edit the dynamics of this audio by conducting it with motion gestures. The dynamics for the modified audio are written into the MIDI file along with the BPM and other track information, and the exported MIDI file can be opened in any major music software, allowing you to integrate and continue your production elsewhere. The objective is to create a Graphical User Interface (GUI) and Raspberry Pi (RPi) system that can turn a keyboard input and video recording into a MIDI file with appropriate dynamics: the keyboard input plays notes like a piano and is converted to a MIDI file, the camera then records the user’s hand gestures while the audio plays to modify the file’s dynamics, and the combined audio with dynamics is exported as a MIDI file.
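
Writing such a file might be sketched with the mido library as below; the note-list format, velocities, and tempo are illustrative placeholders, since the write-up does not describe its exact MIDI-writing code.

    from mido import Message, MidiFile, MidiTrack, MetaMessage, bpm2tempo

    def write_midi(notes, bpm=120, path="take.mid"):
        """notes: list of (midi_note, velocity, duration_ticks) tuples."""
        mid = MidiFile()
        track = MidiTrack()
        mid.tracks.append(track)
        track.append(MetaMessage("set_tempo", tempo=bpm2tempo(bpm)))
        for note, velocity, ticks in notes:   # velocity carries the dynamics
            track.append(Message("note_on", note=note, velocity=velocity, time=0))
            track.append(Message("note_off", note=note, velocity=0, time=ticks))
        mid.save(path)

    # write_midi([(60, 64, 480), (64, 90, 480)])   # C4 soft, then E4 louder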
