ECE5725 Fall 2022 Projects

Visual FFT

Our Visual FFT project involved performing a live FFT on music picked up by a small microphone. The project involved three parts: sampling, computation, and display. Data was sampled using an analog-to-digital converter (ADC) and sent to our Raspberry Pi, where we computed an FFT binned into 32 frequency bands; the result was then forwarded to an LED screen and displayed as 32 bars with heights ranging from zero to 15.
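
As a rough illustration of the binning step, here is a minimal sketch (not the team's actual code) that converts one frame of samples into 32 bar heights; the sample rate, frame size, and scaling are assumptions:

```python
import numpy as np

SAMPLE_RATE = 8000   # assumed ADC sampling rate
N_SAMPLES = 512      # samples per FFT frame
N_BARS = 32          # bars on the LED screen
MAX_HEIGHT = 15      # tallest bar

def spectrum_to_bars(samples):
    """Convert one frame of audio samples to 32 bar heights (0-15)."""
    windowed = samples * np.hanning(len(samples))   # reduce spectral leakage
    mags = np.abs(np.fft.rfft(windowed))            # magnitude spectrum
    # Split the positive-frequency bins into 32 contiguous groups.
    groups = np.array_split(mags[1:], N_BARS)       # drop the DC bin
    energies = np.array([g.mean() for g in groups])
    # Log-compress and scale so the loudest band maps to MAX_HEIGHT.
    levels = np.log10(energies + 1e-6)
    levels -= levels.min()
    if levels.max() > 0:
        levels = levels / levels.max() * MAX_HEIGHT
    return levels.astype(int)

# Example: one frame of a 440 Hz tone lights up one band of bars.
t = np.arange(N_SAMPLES) / SAMPLE_RATE
print(spectrum_to_bars(np.sin(2 * np.pi * 440 * t)))
```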

Smart Lock

To create the smart lock system, we connected multiple peripherals to various Raspberry Pi 4 GPIO and communication interfaces, in combination with a software-controlled solenoid lock that opens a safe door. After thoroughly testing these components and weaving them together into a coherent software routine that handled user enrollment, authentication, and other modifications, we created a physical housing from laser-cut wood to hold our components for user interaction. This included a piTFT touch-screen interface for users to select different functions and to print output confirming specific authorization procedures. The final safe was able to enroll new users into its database, prompting them for a user ID, fingerprint, RFID tag, and password. Users with elevated permissions were able to delete others from the database.
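
As an illustration of the enrollment and permission logic, here is a minimal sketch of the user database; the fields and permission model are assumptions, not the project's actual schema:

```python
from dataclasses import dataclass

@dataclass
class User:
    user_id: str
    fingerprint_slot: int   # template slot on the fingerprint sensor
    rfid_uid: int
    password_hash: str
    admin: bool = False     # elevated permissions

users = {}

def enroll(user):
    users[user.user_id] = user

def delete_user(requester_id, target_id):
    """Only users with elevated permissions may remove others."""
    requester = users.get(requester_id)
    if requester and requester.admin and target_id in users:
        del users[target_id]
        return True
    return False

enroll(User("alice", 1, 0xDEADBEEF, "hash-of-alice-pw", admin=True))
enroll(User("bob", 2, 0x00C0FFEE, "hash-of-bob-pw"))
print(delete_user("alice", "bob"))   # True: an admin removed a user
```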

Baby Monitor

Taking care of a baby has always been a difficult task for many families, regardless of their baby-sitting experience. Our project comes to the rescue: it provides autonomous tracking functions to make sure that your babies stay safely in their rooms. With this application, parents can monitor what's going on in the baby's room whenever they feel it necessary, with just a click in a browser. When you're busy working, the system can also autonomously focus the camera on the baby and record pictures whenever there is a sound impulse or crying. With the additional protection of PIR motion sensors and our smart text-alert system, you will know immediately on your phone when your baby tries to leave the room. Our program also includes a user-friendly graphical interface on the piTFT that anyone can learn to use in a minute!
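
As an illustration of the motion-alert path, here is a minimal sketch of PIR handling with RPi.GPIO; the GPIO pin and the alert function are hypothetical, and the real project would send an actual text message rather than print:

```python
import time
import RPi.GPIO as GPIO

PIR_PIN = 17  # hypothetical GPIO pin wired to the PIR sensor

def send_text_alert():
    # Placeholder: the real system would send an SMS here, e.g. through
    # an email-to-SMS gateway or a messaging API.
    print("ALERT: motion detected near the door!")

def on_motion(channel):
    send_text_alert()

GPIO.setmode(GPIO.BCM)
GPIO.setup(PIR_PIN, GPIO.IN)
# Fire a callback on the PIR's rising edge; debounce repeated triggers.
GPIO.add_event_detect(PIR_PIN, GPIO.RISING, callback=on_motion, bouncetime=5000)

try:
    while True:
        time.sleep(1)   # main loop stays free to serve the camera stream
finally:
    GPIO.cleanup()
```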

Pi Breaker

The objective of this project was to create a brick breaker game that can run on the Raspberry Pi, called PiBreaker. With the attached TFT screen and buttons on the Raspberry Pi, we figured that this would be the perfect form factor for a handheld video game. In PiBreaker, a player can progress through ten increasingly challenging levels to beat the game. After implementing the basic game, we added several features that made the game more exciting and competitive, such as a high-score tracker and power-ups.

Stage Light Controller

This project was an effort to simplify and streamline the control of a stage light using the DMX512 protocol. DMX512 was created by USITT in 1986 and is now maintained by the Entertainment Services and Technology Association (ESTA). The typical setup for producers putting on a light show involves a DMX controller, a board full of sliders and buttons larger than a typical laptop. These bulky controllers are cumbersome (and, in our opinion, unnecessary), so our goal was to eliminate the need for them entirely by implementing the DMX controller by bit-banging GPIO signals. Ultimately, this means we can replace the typical system with a Raspberry Pi connected to an XLR cable using two GPIO pins. While coming up with a solution that met the timing constraints of the protocol proved challenging, we ultimately created an API capable of running light shows from a Raspberry Pi, and a robust system that can reliably send DMX packets to fixtures.
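
To give a sense of the timing problem, the sketch below builds one DMX packet as a pigpio waveform, whose DMA pacing is one way to hold the 4 µs bit period from user space; the team's actual bit-banging approach may differ, and the TX pin and the exact break/mark durations here are assumptions:

```python
import pigpio

TX_PIN = 18          # hypothetical GPIO pin driving the XLR data line
BIT_US = 4           # 250 kbaud: 4 microseconds per bit

def byte_pulses(value):
    """One DMX slot: start bit (low), 8 data bits LSB-first, 2 stop bits (high)."""
    pulses = [pigpio.pulse(0, 1 << TX_PIN, BIT_US)]          # start bit
    for i in range(8):
        if (value >> i) & 1:
            pulses.append(pigpio.pulse(1 << TX_PIN, 0, BIT_US))
        else:
            pulses.append(pigpio.pulse(0, 1 << TX_PIN, BIT_US))
    pulses.append(pigpio.pulse(1 << TX_PIN, 0, 2 * BIT_US))  # two stop bits
    return pulses

def send_dmx_frame(pi, channels):
    """Send one DMX packet: BREAK, mark-after-break, start code, channel data."""
    pulses = [pigpio.pulse(0, 1 << TX_PIN, 100),   # BREAK (>= 88 us low)
              pigpio.pulse(1 << TX_PIN, 0, 12)]    # mark after break (>= 8 us)
    for value in [0] + list(channels):             # slot 0 is the start code
        pulses.extend(byte_pulses(value))
    pi.wave_clear()
    pi.wave_add_generic(pulses)
    wid = pi.wave_create()
    pi.wave_send_once(wid)                         # DMA-timed, so timing holds

pi = pigpio.pi()
pi.set_mode(TX_PIN, pigpio.OUTPUT)
send_dmx_frame(pi, [255, 0, 128])                  # e.g. set three channels
```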

Digital Otamatone

In this project, we made a prototype of the Digital Otamatone, a digital version of the Japanese electronic music synthesizer. The classical Otamatone is difficult to play simply because of its analog nature: a press on the ribbon can easily go off tone. We therefore empowered it with a Raspberry Pi 4 for control and an Arduino Nano 33 BLE Sense to aid with calibration, which makes playing the Otamatone as easy as playing a keyboard. We also built a touchscreen user interface to allow full functionality without manually running scripts or commands.

Dynamic Fire Egress

In an emergency, navigating to the nearest safe exit can prove difficult when you are in an unfamiliar location or don't know which nearby exit avoids the emergency event. The Emergency Egress System provides topological guidance to individuals in a building, directing them to the nearest safe emergency exit through LED arrows attached to exit signs. Through an expanded exit-sign design, with arrows placed to the left and right of the sign, individuals receive real-time guidance about which exits are on the shortest safe path out and which are not.
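
The routing behind the arrows amounts to a shortest-path search that avoids nodes affected by the event. A minimal Dijkstra sketch over a hypothetical floor-plan graph (not the project's actual code) might look like this:

```python
import heapq

def nearest_exit(graph, exits, start, blocked=frozenset()):
    """Dijkstra over the corridor graph, skipping nodes near the fire.

    graph: {node: [(neighbor, distance), ...]}, a hypothetical floor plan.
    exits: set of exit nodes; blocked: nodes made unsafe by the event.
    Returns (distance, path) to the closest safe exit, or None.
    """
    pq = [(0, start, [start])]
    seen = set()
    while pq:
        dist, node, path = heapq.heappop(pq)
        if node in seen or node in blocked:
            continue
        seen.add(node)
        if node in exits:
            return dist, path
        for nbr, w in graph.get(node, []):
            if nbr not in seen:
                heapq.heappush(pq, (dist + w, nbr, path + [nbr]))
    return None

floor = {"lobby": [("hall", 10)],
         "hall": [("lobby", 10), ("east_exit", 5), ("west_exit", 8)],
         "east_exit": [], "west_exit": []}
# East exit blocked by the fire: arrows would point along lobby -> hall -> west.
print(nearest_exit(floor, {"east_exit", "west_exit"}, "lobby", blocked={"east_exit"}))
```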

Exploration Robot

People are very vulnerable to danger and accidents when exploring uncharted areas, be it abandoned tunnels or unknown caves. The purpose of our robot is therefore to scout the way for humans in unknown or dangerous environments. Our project uses two cameras. One is mounted on the robot to record the view and stream it back to a monitor in real time, with a Raspberry Pi 3 controlling the robot's movement. The other camera is used to analyze hand gestures and issue commands to the robot (forward, left, right, stop). The hand-gesture recognition is based on computer vision and runs on a second board, a Raspberry Pi 4.

Piano Tutor

This project is an embedded system that uses a Raspberry Pi (RPi) as the central component for data and signal processing. The system uses a USB microphone as the input device and a piTFT screen as the output device. The user loads a sample piano score into the system before starting; the system then "listens" through the microphone as the user plays and compares each input note with the corresponding note in the score, one at a time. If the input differs from the score (e.g., the note played is higher or lower than written), the screen displays the mistake in real time.
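
One plausible way to implement the note comparison (a sketch under assumed parameters, not necessarily the project's method) is to take the FFT peak of each microphone frame and map it to the nearest equal-tempered note:

```python
import numpy as np

RATE = 44100        # assumed microphone sample rate
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def dominant_freq(samples):
    """Strongest frequency in one microphone frame via the FFT peak."""
    mags = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    return np.argmax(mags) * RATE / len(samples)

def freq_to_note(freq):
    """Map a frequency to the nearest equal-tempered note (A4 = 440 Hz)."""
    midi = int(round(69 + 12 * np.log2(freq / 440.0)))
    return NOTE_NAMES[midi % 12] + str(midi // 12 - 1)

def check_note(samples, expected):
    played = freq_to_note(dominant_freq(samples))
    return played == expected, played    # (correct?, what was heard)

# Example: a synthesized middle C (261.63 Hz) checked against the score.
t = np.arange(8192) / RATE
ok, heard = check_note(np.sin(2 * np.pi * 261.63 * t), "C4")
print(ok, heard)   # True C4
```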

Vibration Analysis

We built a software and hardware package that users can use to measure the vibrations of a system; with this information, they can account for those vibrations when setting their acceleration values. To demonstrate its functionality, we used a single-axis system containing a stepper motor that rotates a threaded rod to move a plate back and forth. We 3D-printed a case to hold both the Raspberry Pi and the accelerometer, which we attached to the plate using binder clips. The Raspberry Pi collected acceleration data from the accelerometer while communicating with a Teensy microcontroller to slowly increase the frequency at which the plate moved. This information was then stored in a file and processed on the Raspberry Pi to compute the frequency response.
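
As an illustration of the processing step, here is a minimal sketch of extracting an amplitude-versus-frequency curve from logged accelerometer segments; the sample rate and data layout are assumptions:

```python
import numpy as np

RATE = 1000   # assumed accelerometer sampling rate (Hz)

def response_amplitude(accel, drive_freq):
    """Vibration amplitude at the drive frequency for one test segment."""
    accel = accel - accel.mean()                       # remove gravity/DC offset
    mags = np.abs(np.fft.rfft(accel)) / len(accel) * 2
    bin_idx = int(round(drive_freq * len(accel) / RATE))
    return mags[bin_idx]

def frequency_response(segments):
    """segments: list of (drive_freq, accel_samples) from the logged file."""
    return [(f, response_amplitude(a, f)) for f, a in segments]

# Example with synthetic data: a resonance near 40 Hz shows the largest amplitude.
t = np.arange(RATE) / RATE
segs = [(f, (3.0 if f == 40 else 1.0) * np.sin(2 * np.pi * f * t))
        for f in (20, 40, 60)]
for f, amp in frequency_response(segs):
    print(f, round(amp, 2))   # 20 1.0 / 40 3.0 / 60 1.0
```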

Heartbeat Monitor

Our project is a heartbeat-detection system with face recognition. Users are identified by face recognition, and each user can then view their own past heartbeat data, so the facial-recognition step provides safe access to that data. Users register by entering their name and capturing one hundred face images. The heartbeat is detected by placing a finger over the camera: the detector processes the images, extracts the heartbeat pulse, and displays the heartbeat trace and rate on the piTFT. The program records heart-rate history, and a user can also view real-time or past electrocardiography (ECG) traces in their account. We use Pygame to build the interactions with the user on the piTFT screen, where the past heartbeat data is shown.
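
A common way to extract a pulse from finger-over-camera video (a sketch, not necessarily the project's exact pipeline) is to average the green channel of each frame and find the dominant frequency in the plausible heart-rate band:

```python
import numpy as np

FPS = 30   # assumed camera frame rate

def pulse_signal(frames):
    """Mean green-channel brightness per frame; blood flow modulates it."""
    return np.array([f[:, :, 1].mean() for f in frames])

def heart_rate(signal, fps=FPS):
    """Estimate BPM from the strongest frequency in the plausible heart band."""
    signal = signal - signal.mean()
    mags = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs > 0.7) & (freqs < 3.5)       # roughly 42 to 210 BPM
    return freqs[band][np.argmax(mags[band])] * 60

# Example: 10 s of synthetic frames pulsing at 1.2 Hz gives ~72 BPM.
t = np.arange(10 * FPS) / FPS
frames = [np.full((8, 8, 3), 100 + 5 * np.sin(2 * np.pi * 1.2 * ti), dtype=float)
          for ti in t]
print(round(heart_rate(pulse_signal(frames)), 1))   # 72.0
```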

Raspberry Pi Arcade

For our project we decided to create an arcade game machine and a corresponding prize dispenser. The system consists of two parts, each containing its own Raspberry Pi for control. The game portion includes a Raspberry Pi 4 with a 7-inch monitor running the MAME emulator so we can play classic arcade games; a Python program reads the emulator's memory so we could include a points-awarding system. The gaming system also includes buttons and a joystick for user control. The prize dispenser is set up as a vending machine using a Raspberry Pi Zero W. The Pi controls motors that rotate coils to dispense prizes, and it runs a GUI that allows users to select and pay for prizes based on the number of tickets they have earned. The interface between the prize dispenser and the gaming console is a set of RFID cards and readers/writers: scanning a card at the console records the points awarded onto the card, and scanning it at the prize dispenser lets the user pay for prizes and have the tickets deducted from the card.
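
As an illustration of the card interface, here is a minimal sketch using the common SimpleMFRC522 driver; storing the balance as plain text on the card is an assumption about the project's format:

```python
from mfrc522 import SimpleMFRC522

reader = SimpleMFRC522()

def award_points(points):
    """Console side: add freshly earned points to the scanned card."""
    card_id, text = reader.read()          # blocks until a card is presented
    balance = int(text.strip() or 0) + points
    reader.write(str(balance))
    return balance

def pay_for_prize(cost):
    """Dispenser side: deduct tickets if the card holds enough."""
    card_id, text = reader.read()
    balance = int(text.strip() or 0)
    if balance < cost:
        return False                       # not enough tickets
    reader.write(str(balance - cost))
    return True
```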

RFID Server Access

In this final project, we designed an RFID system for Cornell students to access the Raspberry Pi. As Figure 1 shows, the system allows students on the allowed list to log in by scanning their cards, after which they can communicate with each other. For security and privacy, we set up SSH before students log in. Meanwhile, student data such as student IDs is recorded so the administrator can manage the RFID system. The administrator, who has the highest authority, can forcibly log users out without their permission. We also designed a GUI on the Raspberry Pi to control the system more efficiently.

Pac Man

In this project, we expanded the classic Pac-Man game with gesture controls. The design replaces the original keyboard input with hand movements that control the direction of Pac-Man's motion; four gestures correspond to the four directions of up, down, left, and right. First, a Pac-Man game was designed with Pygame, with three levels: learning, easy, and hard. Next, the real-time gesture-recognition part was built with the PiCamera, computer vision, and OpenCV. Finally, the game module and the gesture-recognition module were connected through specific command files in Linux (Raspberry Pi).
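
One simple way to classify such gestures (a sketch under assumed color thresholds, not necessarily the project's method) is to track the centroid of the hand between frames and map its dominant motion to a direction:

```python
import cv2
import numpy as np

def hand_centroid(frame):
    """Centroid of the largest skin-colored blob (very rough hand detector)."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 30, 60), (20, 150, 255))   # rough skin range
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]

def gesture_direction(prev, cur, min_move=40):
    """Classify the hand's motion between two frames as a Pac-Man command."""
    dx, dy = cur[0] - prev[0], cur[1] - prev[1]
    if max(abs(dx), abs(dy)) < min_move:
        return None                          # too small to be a gesture
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

print(gesture_direction((100, 100), (180, 110)))   # "right"
```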

Music Notation

We built an automatic music notation tool. It consists of two processes running on a Raspberry Pi: a main program that records the audio and generates notation, and a server that presents the final results. One just needs to sing into the microphone attached to the Raspberry Pi and tap the beats on the touchscreen, and a corresponding staff notation is generated after the end button is pressed. The recorded beats, audio, and corresponding staff notations are accessible through a web page hosted by the server. The tool also allows one to pause the recording to take a break and resume afterwards. Its accuracy is bounded by the quality of the recording, the quality of the tapped beats, and the faults of pitch-detection algorithms, but overall it works.
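
Pitch detection of this kind is often done by autocorrelation; the sketch below shows the idea under an assumed sample rate and search band, and is not the project's exact algorithm:

```python
import numpy as np

RATE = 16000   # assumed recording sample rate

def detect_pitch(samples, fmin=80.0, fmax=1000.0):
    """Fundamental frequency of a sung frame via autocorrelation."""
    samples = samples - samples.mean()
    corr = np.correlate(samples, samples, mode="full")
    corr = corr[len(corr) // 2:]                  # keep non-negative lags
    lo = int(RATE / fmax)                         # shortest plausible period
    hi = int(RATE / fmin)                         # longest plausible period
    lag = lo + np.argmax(corr[lo:hi])
    return RATE / lag

# Example: a sung A3 (220 Hz) is recovered to within one lag step.
t = np.arange(2048) / RATE
print(round(detect_pitch(np.sin(2 * np.pi * 220 * t)), 1))   # approx. 220
```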

Foosball

As avid soccer fans inspired by the 2022 Qatar World Cup, we wanted to experience an intense, exciting game at home. Our project is designed to enhance the traditional table-foosball experience. Our electric foosball table first replaces the traditional scoreboard: the piTFT screen displays both teams' scores, the remaining time, historical scores, and match information such as each team's possession rate. An audio module is also connected to play a cheering sound after each goal. In addition, since the goalkeeper is the last defender and plays an important role in the game, we added an automatic goalkeeper function that "sees" the ball and "moves" to the corresponding location to block it. Our hardware system was built around a Raspberry Pi, which controls hardware such as the camera, speakers, break-beam sensors, and a servo motor. Our software uses OpenCV and Pygame: OpenCV tracks the red ball in the camera image and reports its location, while Pygame displays the user interface on the piTFT, letting the user interact with the program, showing scores and other game information, and playing the cheering sound when a player scores.
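
As an illustration of the goalkeeper pipeline, here is a minimal OpenCV sketch: threshold the red ball in HSV, take the largest blob, and map its position to a servo angle. The thresholds and the servo range are assumptions, not the project's tuned values:

```python
import cv2
import numpy as np

def find_ball(frame):
    """Locate the red ball: threshold in HSV, take the largest blob's center."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Red hue wraps around 180 in OpenCV, so combine two ranges.
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    (x, y), radius = cv2.minEnclosingCircle(max(contours, key=cv2.contourArea))
    return (int(x), int(y)) if radius > 3 else None

def goalkeeper_target(ball_x, frame_width, servo_min=30, servo_max=150):
    """Map the ball's horizontal position to a servo angle along the goal."""
    frac = min(max(ball_x / frame_width, 0.0), 1.0)
    return servo_min + frac * (servo_max - servo_min)

# Example: a ball at x=320 in a 640-pixel-wide frame means a mid-goal angle.
print(goalkeeper_target(320, 640))   # 90.0
```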

Magic Potter

Magic Potter is a player-versus-player game based on the Raspberry Pi. Each player has a magic wand and waves it in front of the camera; the system recognizes the trajectory of the wand and translates it into the corresponding spell. A game consists of three rounds in which the players take turns waving their wands in front of the camera. To present the game clearly, the battle process and the winner are displayed on the piTFT. The relationships between the different spells are left for players to discover; in each round, the stronger spell wins.

Edgetop

In our daily lives, our devices often have idle computing power, such as when we sleep, eat, or commute. Even when we are actively using them, their capacity is usually not fully utilized, leading to wasted computing power. To address this issue, we propose the Edgetop project, which aims to reduce idle power by establishing a central local-area-network node that distributes computing power to our devices. This is achieved by separating input and output devices from the main computing components such as the CPU, GPU, memory, and storage. While wireless keyboards, mice, and earphones have long been practical thanks to technologies such as Bluetooth and Wi-Fi, wireless displays have been held back by bandwidth and latency constraints. With recent advancements in Wi-Fi technology, however, it is now possible to address these constraints and implement a wireless display system. The Edgetop project leverages this technology to let users connect to a central node and use it as a computing unit, much like a laptop or smartphone. A cloud server stores and checks gateway information and user login information, while a laptop serves as the edge node providing the computing power. Raspberry Pi devices act as edge devices, allowing users to connect to the edge node and access its computing power. By separating the input and output devices from the main computing components, Edgetop aims to reduce idle computing power and improve the efficiency of our devices.

Personal Assistant

Our project is an Alexa-style personal-assistant device that receives voice commands to complete specific tasks. The user holds down a button and speaks to the device, and the audio is captured with PyAudio. Some of the commands the device accepts are checking the weather, getting the time, making and reading a note, and setting a timer.
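
A minimal push-to-talk capture loop with PyAudio might look like the sketch below; the sample rate, chunk size, and button wiring are assumptions:

```python
import pyaudio

RATE = 16000      # assumed capture rate for speech-to-text
CHUNK = 1024      # frames per buffer
BUTTON_PIN = 27   # hypothetical GPIO pin for the push-to-talk button

def record_while(pressed):
    """Capture microphone audio for as long as pressed() returns True."""
    pa = pyaudio.PyAudio()
    stream = pa.open(format=pyaudio.paInt16, channels=1, rate=RATE,
                     input=True, frames_per_buffer=CHUNK)
    frames = []
    while pressed():
        frames.append(stream.read(CHUNK, exception_on_overflow=False))
    stream.stop_stream()
    stream.close()
    pa.terminate()
    return b"".join(frames)   # raw PCM, ready for speech-to-text

# On the real device, pressed() would poll the GPIO button, e.g.:
#   pressed = lambda: GPIO.input(BUTTON_PIN) == GPIO.LOW
```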

Access Control

Even though some 'smart' gates and doors can recognize human faces via cameras nowadays, the efficiency and accuracy of facial recognition cannot be 100% guaranteed. This leads to some non-trivial questions: What if the door fails to recognize a person? Should the host just let the person stand by the entrance? What if someone intrudes through the entrance? Does it take too long to find the intruder in the video recordings? Do the recordings take up too much storage? Clearly, there is still much room for such intelligent machines to improve. In this project, we designed and implemented a more reliable and efficient integrated access-control system based on a Raspberry Pi, a Pi Camera, a piTFT, and a servo motor. The system has four main features: facial recognition, user management, remote access control, and activity logging. With our system installed in your house or office building, the embarrassments above will never happen.
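
One way to implement the recognition step (a sketch using the popular face_recognition library, which may or may not be what the project used) is to compare a live frame's encoding against stored encodings of enrolled users; the file paths and names are hypothetical:

```python
import face_recognition

# Enrollment: store one encoding per registered user (paths are hypothetical).
known = {}
for name, path in [("alice", "faces/alice.jpg"), ("bob", "faces/bob.jpg")]:
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:
        known[name] = encodings[0]

def identify(frame):
    """Return the matching user's name for an RGB camera frame, or None."""
    encodings = face_recognition.face_encodings(frame)   # frame must be RGB
    if not encodings:
        return None          # no face in view
    names = list(known)
    matches = face_recognition.compare_faces([known[n] for n in names],
                                             encodings[0], tolerance=0.5)
    for name, matched in zip(names, matches):
        if matched:
            return name      # recognized: log the entry and open the door
    return None              # unknown: keep the door locked, log the attempt
```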

Light Tracker System

 

Navigation Car

In this project, our group designed a navigation car that can guide people through an unfamiliar environment. For example, a freshman new to Phillips Hall struggling to find a professor's office can be led to the destination by this car. We assembled motors, a Raspberry Pi, and an external camera on our robot car. With the help of the OpenCV library and AprilTags, the car runs detection while driving to search for the destination indicated by a tag.
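
As an illustration, a minimal detection-and-steering sketch with the apriltag Python bindings might look as follows; the destination tag ID and the steering dead-band are assumptions:

```python
import cv2
import apriltag

DEST_TAG_ID = 3   # hypothetical tag ID marking the professor's office

detector = apriltag.Detector(apriltag.DetectorOptions(families="tag36h11"))

def steer_toward_tag(frame):
    """Find the destination tag and decide a steering command from its offset."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for det in detector.detect(gray):
        if det.tag_id == DEST_TAG_ID:
            # Positive offset means the tag sits right of the image center.
            offset = det.center[0] - frame.shape[1] / 2
            if abs(offset) < 40:
                return "forward"
            return "right" if offset > 0 else "left"
    return "search"   # tag not visible: keep scanning
```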

Smart BB-8

BB-8 is adored around the world, and we really wanted to create a similar robot that runs inside a ball! So we designed a BB-8 sibling: a rolling transparent ball driven by a Pi-car inside it. Our robot loves singing, so it sings on its way! The robot is controlled from Android over Wi-Fi, and the Android application wraps up all the functional modules: a locomotion module, a line-tracking module, an obstacle-avoidance module, a See History module, and a live video stream from the Pi Camera. The locomotion module has buttons to move the robot forward, backward, left, and right, depending on which button you press. The line-tracking module makes the robot follow a black line on the ground, so it could be used to deliver packages or other items in the future. The obstacle-avoidance module helps the robot avoid obstacles encountered along its way. See History plays a very cute BB-8 video. We can also see what the robot sees through the phone interface by watching the live video stream.

Remote Control Robot

The remote control robot is a car that can be remotely controlled by an Android application on a phone. We built a Raspberry Pi 4 based robot with the following features: TCP communication between the robot and the application; live video sent from the camera to the application; movement controlled by commands from the application; and a configurable IP address and port in the application.
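
The command path could be as simple as a one-line text protocol over a TCP socket; the sketch below assumes such a protocol, which is not necessarily the project's actual one:

```python
import socket

# Hypothetical newline-delimited commands sent by the Android app.
COMMANDS = {"forward", "backward", "left", "right", "stop"}

def drive(command):
    # Placeholder: the real robot would set motor GPIO/PWM outputs here.
    print("motor command:", command)

def serve(host="0.0.0.0", port=5000):
    """Accept a connection from the app and execute newline-delimited commands."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen(1)
        conn, addr = srv.accept()
        with conn, conn.makefile("r") as lines:
            for line in lines:
                cmd = line.strip().lower()
                if cmd in COMMANDS:
                    drive(cmd)

serve()
```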

Animal Recognition

For our 5725 final project, our group built an animal face recognition system using a Raspberry Pi 4 and an attached legacy camera module. The software system is separated into two main sections: a web server on the RPi and a machine-learning algorithm. We first constructed a basic server and trained a simple machine-learning model, then implemented more server features such as button interaction, page navigation, and camera streaming while improving the ML model, and finally connected the two sections together. In the final prototype, users can interact with the server to take photos of themselves, save them to a local path, and analyze the result, which in this case is the animal most similar to the face in the photo. The result is then uploaded to the server and shown to the audience.

Claw Machine

For this project we took a generic arcade claw machine and redesigned it to be fully operational through a Raspberry Pi 4 and piTFT screen. To make the machine more exciting, we implemented three modes the user can choose from. The first mode fully utilizes the piTFT screen: in lieu of traditional joysticks, the user is presented with arrows and a drop button on the screen to retrieve their prize. The second mode utilizes the PiCamera attached to the claw along with the touch screen: the user chooses an object in the machine, which prompts the PiCamera to search for it using OpenCV. The user can see what the camera sees on the piTFT screen, but the Raspberry Pi has full control of the claw's movement. The third mode adds wireless capability: a Raspberry Pi Zero connected to an accelerometer and a button communicates with the Raspberry Pi 4, giving the user full control of the claw machine from a distance. The camera is initialized in accelerometer mode as well, so the user has the option to watch the claw machine directly or through the camera.

Interaction Robot

As advancements in robotics and AI continue, autonomous mobile robots have become increasingly common in the world. However, one of the biggest challenges facing these technologies is the robots' ability to interact effectively with humans. Our project aims to tackle this issue by developing an autonomous mobile robot that can be controlled through both gestures and voice commands.

Pi DJ

I made a Raspberry Pi based drum pad / sequencer. It is a device with a 4×4 keyboard and a display that contains a large number of short audio files and allows the user to bind keys to audio files. With these bindings, the user can play the audio clips rhythmically to create music! The sequencer functionality makes this easier by automatically playing sound clips at a certain tempo, so the user does not have to press so many buttons every measure. It produces audio via the 3.5 mm jack on the Raspberry Pi, allowing it to be used with headphones or speakers, and the keyboard lights up interactively to indicate the current state of the user interface.
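
The sequencer's core is a step loop that fires bound clips at a fixed tempo; here is a minimal pygame.mixer sketch with hypothetical clip files and pattern:

```python
import time
import pygame

BPM = 120
STEPS = 16   # one measure of sixteenth notes

pygame.mixer.init()
# Hypothetical clips; each row of the pattern triggers one sound per step.
sounds = {"kick": pygame.mixer.Sound("kick.wav"),
          "snare": pygame.mixer.Sound("snare.wav")}
pattern = {"kick":  [1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0],
           "snare": [0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0]}

step_time = 60.0 / BPM / 4           # seconds per sixteenth note
while True:
    for step in range(STEPS):
        for name, row in pattern.items():
            if row[step]:
                sounds[name].play()  # overlapping clips mix automatically
        time.sleep(step_time)        # simple timing; small drift is fine here
```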

Ivyponic

During the semester, our team (Vinay Bhamidipati and Eric Moon) built Ivyponic, a sealed hydroponic indoor gardening solution that can automatically grow a plant under customizable conditions, including airflow, lighting, temperature, humidity, and nutrient density. Ivyponic grows plants without the need to micromanage or care for them directly. We used the piTFT screen to allow these conditions to be customized for a specific plant's needs. We created a hydroponic growing system that uses a pump to deliver fertilized water to the plants, with no soil required. The RPi manages all plant-growth variables to maintain the user-defined targets, gathering the necessary environmental data from PPM, temperature, and humidity sensors.
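
Holding each variable at a user-defined target can be done with simple hysteresis (bang-bang) control; the sketch below illustrates the idea with hypothetical target bands and stubbed sensor reads, not the project's actual control code:

```python
import time

# Hypothetical user-defined target bands set through the piTFT interface.
TARGETS = {"humidity": (55.0, 65.0),    # percent
           "ppm":      (800.0, 900.0)}  # nutrient density

def read_sensor(name):
    # Placeholder: a real build would query the humidity or PPM sensor here.
    return {"humidity": 50.0, "ppm": 850.0}[name]

def set_actuator(name, on):
    # Placeholder for the humidifier, nutrient pump, fans, lights, etc.
    print(f"{name} actuator -> {'ON' if on else 'OFF'}")

def control_step():
    """Hysteresis control: act only when a reading leaves its target band."""
    for name, (low, high) in TARGETS.items():
        value = read_sensor(name)
        if value < low:
            set_actuator(name, True)     # raise the variable (e.g. add mist)
        elif value > high:
            set_actuator(name, False)    # stop raising; let it drift down

while True:
    control_step()
    time.sleep(60)    # one control pass per minute
```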

Pi Lane Tracking

With more and more technologies supporting our daily-driven vehicles, it is interesting and fun to build our own lane-tracking system. With a Raspberry Pi 4 and a computer-vision algorithm, we created a simple lane-tracking system of our own. We set up two motors on an acrylic board as the car's wheels, with a Pi Camera attached to the front of the car to detect lanes. By processing the captured images with OpenCV, the car can steer left or right, or stay straight, in order to stay centered in the lane.
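
A typical minimal pipeline for this (a sketch, not the project's exact algorithm) detects lane lines with Canny edges and a Hough transform, then steers from the lane center's offset:

```python
import cv2
import numpy as np

def lane_offset(frame):
    """Estimate how far the car sits from the lane center, in pixels."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    h, w = edges.shape
    edges[: h // 2, :] = 0                      # keep only the road region
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=30,
                            minLineLength=20, maxLineGap=10)
    if lines is None:
        return None
    # Average the bottom endpoints of detected lane lines as the lane center.
    xs = [max((x1, y1), (x2, y2), key=lambda p: p[1])[0]
          for x1, y1, x2, y2 in lines[:, 0]]
    return sum(xs) / len(xs) - w / 2            # positive: center is rightward

def steering_command(offset, dead_band=20):
    """Turn the pixel offset into a motor command with a small dead band."""
    if offset is None or abs(offset) < dead_band:
        return "straight"
    return "right" if offset > 0 else "left"
```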

Nerf Howitzer

The Nerf Howitzer is a Nerf air gun mounted on a white platform that sits on a robot chassis with two wheels powered by DC motors. The gun is operated by two standard servo motors: one to press the trigger and one to change the vertical angle, i.e., the elevation of the barrel. A PiCamera takes images as the program executes and the chassis turns left and right; the images are then processed with the OpenCV (cv2) library for object detection and distance measurement. A voice-recognition library called vosk converts sound into a string for aiming and firing. The Nerf Howitzer uses a recording in place of a live voice command and turns toward the direction of the target; the program then waits for a voice input (also replaced by a recording) and prompts the servo to turn, causing the gun to fire.
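
The vosk step boils down to feeding recorded audio into a recognizer and parsing the JSON result; here is a minimal sketch, with the model directory and command file as assumptions:

```python
import json
import wave
from vosk import Model, KaldiRecognizer

def transcribe(wav_path, model_dir="vosk-model-small-en-us-0.15"):
    """Turn a recorded voice command into text with the vosk recognizer."""
    wf = wave.open(wav_path, "rb")               # 16 kHz mono PCM expected
    rec = KaldiRecognizer(Model(model_dir), wf.getframerate())
    while True:
        data = wf.readframes(4000)
        if not data:
            break
        rec.AcceptWaveform(data)
    return json.loads(rec.FinalResult()).get("text", "")

command = transcribe("command.wav")              # e.g. "target left fire"
if "left" in command:
    pass    # turn the chassis left, then aim and trigger the servos
```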

Plant Watering System

The automatic watering system waters the plant whenever the soil is dry. After each watering, the pump pauses for six hours, a duration chosen to suit potted plants and avoid overwatering. After the user selects a plant, the main UI page shows the desired environment, including the sunlight requirement, humidity, and temperature, alongside the current humidity, temperature, and water level. A buzzer notifies users when the temperature goes out of range. We also use a camera to automatically photograph the plant every day at noon.
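
The watering logic reduces to a moisture check with a six-hour lockout; here is a minimal sketch with a stubbed sensor read and a hypothetical dryness threshold:

```python
import time

LOCKOUT_S = 6 * 60 * 60     # six-hour pause after each watering
DRY_THRESHOLD = 300         # hypothetical soil-moisture ADC reading

def read_moisture():
    # Placeholder: a real build reads the soil-moisture sensor via an ADC.
    return 250

def run_pump(seconds=5):
    print(f"pump on for {seconds} s")   # would drive the pump's relay/GPIO

last_watered = 0.0
while True:
    dry = read_moisture() < DRY_THRESHOLD
    if dry and time.time() - last_watered > LOCKOUT_S:
        run_pump()
        last_watered = time.time()      # start the six-hour lockout
    time.sleep(60)                      # check once a minute
```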
