ECE5725 Spring 2021 Projects

PiDog

PiDog can follow your voice commands, be trained to recognize your hand gestures, and can also be taught new voice commands just like a real dog! Our goal with this project was to create an intelligent robot that simulates the training of a pet with the help of speech recognition, computer vision, and machine learning. PiDog knows how to follow direct voice commands to perform actions like moving forward, rotating, turning, etc., and every time it is given a voice command, it looks for any hand gestures to associate with the command. After a few tries, it will learn to follow the hand gesture directly! Once it knows a few hand gestures, you can also say new commands to it and show a series of gestures for it to perform complex actions like forward -> turn right -> forward -> rotate. Throughout this process, PiDog also interacts with you through its cute expressions and barks! PiDog was developed on a Raspberry Pi 4B as a final project for the ECE 5725 Design with Embedded Operating Systems class at Cornell University.
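
As a flavor of how the training step might work, here is a minimal sketch of the gesture-to-command association; the class name, learning threshold, and labels are hypothetical illustrations, not PiDog's actual code:

```python
# Minimal sketch of gesture-command association (hypothetical names).
# Each time a voice command arrives, the gesture seen alongside it is
# tallied; once a gesture has co-occurred with the same command often
# enough, the dog responds to the gesture alone.
from collections import defaultdict

LEARN_THRESHOLD = 3  # assumed number of co-occurrences before "learning"

class GestureTrainer:
    def __init__(self):
        # counts[gesture][command] -> times the pair was seen together
        self.counts = defaultdict(lambda: defaultdict(int))
        self.learned = {}  # gesture -> command, once learned

    def observe(self, gesture, command):
        """Record that `gesture` was shown while `command` was spoken."""
        self.counts[gesture][command] += 1
        if self.counts[gesture][command] >= LEARN_THRESHOLD:
            self.learned[gesture] = command

    def command_for(self, gesture):
        """Return the learned command for a gesture, or None."""
        return self.learned.get(gesture)

trainer = GestureTrainer()
for _ in range(3):
    trainer.observe("open_palm", "forward")
print(trainer.command_for("open_palm"))  # -> "forward"
```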

PiCat

Are you a busy cat owner who would like to take care of your cats while you’re away? What are those cute felines doing when you are not home? Don’t worry! PiCat meets all your needs. PiCat is a cat reminder and laser toy that makes you a better owner. This multifunctional system implemented on a Raspberry Pi 4 has three functions: reminder, play, and photo. The reminder mode shows critical information about the cat, such as its birthday and the dates for its next health check and insurance update, on a PiTFT display with a colorful user interface. The play mode keeps your cat active and engaged using a laser toy. The toy is implemented using a laser diode actuated by two servo motors. The laser toy can be activated automatically when a Pi Camera detects a cat using computer vision. Alternatively, the toy can be controlled remotely by the user through a webserver. The photo mode takes pictures of the cat periodically once the camera recognizes it. The system saves all the images locally and displays the latest photos on the webserver.
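
A minimal sketch of the detection trigger, assuming OpenCV's stock cat-face Haar cascade and a camera exposed at /dev/video0; activate_laser_toy() is a hypothetical stand-in for the project's servo/laser routine:

```python
# Hedged sketch: detect a cat with OpenCV's bundled cat-face Haar cascade
# and use the detection to kick off the laser-toy routine.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalcatface.xml")
cap = cv2.VideoCapture(0)  # Pi Camera as /dev/video0 is an assumption

def activate_laser_toy():
    print("cat detected: start servo/laser routine")  # placeholder

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    cats = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(cats) > 0:
        activate_laser_toy()
```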

Color By Number

This project implemented a color by number game on the Raspberry Pi using a PiTFT screen. Using the touchscreen, users were able to select an image, which was then processed using OpenCV. Once the image processing was complete, an outline of the image would be displayed on a blank screen. When a user pressed a pixel on the PiTFT, pygame was used to determine which shape the user was pressing and display a selection of colors for the user to choose from. Different modes allowed the user to choose whether to color the picture using the original image colors as guidance (color by number) or to color the picture using any color they wanted (free color). The user could also switch between four images and save their work of art to the Raspberry Pi upon completion.
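
One plausible way to map a PiTFT press to a shape is sketched below: OpenCV labels each enclosed region of the outline once, and pygame then looks up the pressed pixel's region id. The file name and screen size are assumptions:

```python
# Hedged sketch of press-to-shape lookup: label the outline's enclosed
# regions once, then map each touch to its region id in O(1).
import cv2
import pygame

outline = cv2.imread("outline.png", cv2.IMREAD_GRAYSCALE)
# Regions are the white areas between black outline strokes.
_, binary = cv2.threshold(outline, 127, 255, cv2.THRESH_BINARY)
num_labels, labels = cv2.connectedComponents(binary)  # labels[y, x] = region

pygame.init()
screen = pygame.display.set_mode((320, 240))  # PiTFT resolution assumed
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type == pygame.MOUSEBUTTONDOWN:
            x, y = event.pos
            region = labels[y, x]        # which shape was pressed
            if region != 0:              # 0 is the outline/background
                print(f"fill region {region}")  # fill + color palette here
pygame.quit()
```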

Plant Care

For our project, we created a system that performs automatic plant care and monitoring. Currently it is able to support the care of two plants. The system uses various sensors to monitor each plant and its environment and configures its care accordingly. A soil moisture sensor checks each plant’s soil water content, and a DC water pump waters the plant whenever moisture is determined to be lacking. Temperature, humidity, and sunlight sensors are used to notify the user if any of these parameters are sub-optimal for a plant’s needs. The system personalizes its care to each plant by web scraping the website www.mygarden.org/plants. A camera and OpenCV are used to track plant height and health over time. There is also a web application connected to the system that can be used to view a live feed of the plants as well as track the plants’ statuses and environmental factors over time. The web application also allows users to water the plants while they’re away from home. Together, these components let users rely on the system for both watering and monitoring.
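
A minimal sketch of the watering loop, under assumed pin numbers, threshold, and a stubbed sensor read (the project's actual sensor interface may differ):

```python
# Hedged sketch: read the soil moisture sensor and run the DC pump
# briefly when the reading falls below a dryness threshold.
import time
import RPi.GPIO as GPIO

PUMP_PIN = 17            # assumed GPIO pin driving the pump's transistor
DRY_THRESHOLD = 300      # assumed ADC reading below which soil is "dry"

GPIO.setmode(GPIO.BCM)
GPIO.setup(PUMP_PIN, GPIO.OUT, initial=GPIO.LOW)

def read_moisture():
    """Stub for the soil moisture read (assumed ADC interface)."""
    return 512  # replace with the real ADC read, e.g. via an MCP3008

try:
    while True:
        if read_moisture() < DRY_THRESHOLD:
            GPIO.output(PUMP_PIN, GPIO.HIGH)  # pump on
            time.sleep(2)                     # water in a short burst
            GPIO.output(PUMP_PIN, GPIO.LOW)   # pump off
        time.sleep(60)  # re-check each plant once a minute
finally:
    GPIO.cleanup()
```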

Water Pong

The purpose of our final project is to make Water Pong accessible to everyone. This robot can be used as a partner when there are not enough people, or as assistance for people who cannot aim or throw by themselves. The robot detects cups and aims the catapult entirely autonomously; all the user must do is load and release the catapult.
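
A hedged sketch of how cup detection might look with OpenCV's Hough circle transform; the parameter values and the hypothetical aiming call are illustrative, not the project's tuned code:

```python
# Hedged sketch: detect cup rims as circles, then pick a target.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    gray = cv2.medianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 5)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=40,
                               param1=100, param2=30,
                               minRadius=10, maxRadius=60)
    if circles is not None:
        x, y, r = np.round(circles[0, 0]).astype(int)  # first cup found
        print(f"cup at ({x}, {y}), radius {r}")  # aim_catapult(x, y) here
```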

Holo Tracker

HoloTracker is a project involving computer vision and multiprocessing on a Raspberry Pi to track the movements of an athlete performing his or her sport within view of surrounding cameras. The person in view of the cameras can see themselves shown in a holographic, color display that isolates their body and creates the illusion of a 3D rendering of that person. The inspiration for this project is that it combines many of the things that the team, Corson Chao and Jay Chand, enjoys and finds interesting, from sports and computer vision to holograms. HoloTracker is what we came up with. It has a fairly large setup that is visualized in Figure 1. The base station consists of one Raspberry Pi connected through HDMI to an external monitor, on which we launch the Raspberry Pi Desktop by running the startx command on the console. This monitor displays four images, each reflecting off one of the four sides of our display: a reverse-pyramid-shaped structure of plastic sheets on which the “hologram” can be seen. The rest of the setup consists of three USB cameras that connect to the Raspberry Pi. The person stands within view of these cameras and can move around freely. The fourth USB port is used to connect a keyboard from which we control the Raspberry Pi.
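
A minimal sketch of the multi-camera capture, assuming one process per USB camera feeding a shared queue; the device indices and the compositing step are illustrative, not the project's exact pipeline:

```python
# Hedged sketch: each capture process pushes (camera_id, frame) into a
# shared queue; the main process consumes frames to compose the four-view
# hologram image shown on the monitor.
import cv2
from multiprocessing import Process, Queue

def capture(cam_id, queue):
    cap = cv2.VideoCapture(cam_id)
    while True:
        ok, frame = cap.read()
        if ok:
            queue.put((cam_id, frame))

if __name__ == "__main__":
    frames = Queue(maxsize=12)
    workers = [Process(target=capture, args=(i, frames), daemon=True)
               for i in (0, 1, 2)]  # three USB cameras at /dev/video0-2
    for w in workers:
        w.start()
    while True:
        cam_id, frame = frames.get()
        # ...isolate the body, then blit this view into the quadrant of
        # the monitor image that reflects off the matching pyramid face...
```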

Big Buck Revolution

We recreated the video game Dance Dance Revolution (DDR) using the Raspberry Pi. DDR is a rhythm game where the player uses their feet to press on four pressure pads when prompted by the game screen. The player earns points by pressing the appropriate pad in rhythm with the song. We created a dance platform that sends pressure pad presses to the Raspberry Pi and shows colors using LEDs on the top and bottom of the platform. The Raspberry Pi processes the player’s presses, determines the score, and displays a game window that prompts the user to press the pressure pads, shows the score, and provides menu navigation. The software of the game was programmed in Python using Pygame.
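
A sketch of the timing-window scoring such a rhythm game needs; the window widths and point values below are assumptions, not the project's tuned constants:

```python
# Hedged sketch: a press scores if it lands within a timing window of the
# arrow's target time, with tighter windows earning more points.
PERFECT_MS = 50   # |press - target| for a "perfect" (assumed)
GOOD_MS = 120     # ...for a "good" (assumed)

def score_press(press_time_ms, target_time_ms):
    """Return points earned for a pad press near its target beat."""
    error = abs(press_time_ms - target_time_ms)
    if error <= PERFECT_MS:
        return 100
    if error <= GOOD_MS:
        return 50
    return 0  # miss

# e.g. pygame.time.get_ticks() supplies press_time_ms in the event loop
print(score_press(10_030, 10_000))  # -> 100
```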

CharcuterPi

For this project, we designed a system in which a robot retrieves food for a user. We placed three different food items on a rotating charcuterie board and built a robot that could drive between the user and the board. The goal of this project was to have the robot retrieve the food without the user needing to get up. We used a Raspberry Pi Camera mounted above the charcuterie board, a Raspberry Pi Zero W, a servo motor, and OpenCV to rotate the charcuterie board so that the desired food item was aligned with the robot. The robot was built using the Lab 3 parts, a color sensor, and a robot arm consisting of two servo motors. The Raspberry Pi 4 and PiTFT were mounted on the robot, so the user could choose which food item they wanted by pressing a button. The Raspberry Pi 4 and the Raspberry Pi Zero W communicated with each other over WiFi using the Python socket library. The robot used the color sensor to determine when to stop (i.e., when it reached the user or the board). We used inverse kinematics to move the robot arm to pick up the food on the charcuterie board.
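
For a two-servo arm, a standard two-link inverse kinematics solution looks like the sketch below; the link lengths are illustrative assumptions and the project's geometry and servo calibration will differ:

```python
# Hedged sketch of two-link planar inverse kinematics for the arm.
import math

L1, L2 = 10.0, 8.0  # assumed link lengths in cm

def ik_2link(x, y):
    """Return (shoulder, elbow) angles in radians that place the end
    effector at (x, y) in the arm's plane (elbow-down solution)."""
    d2 = x * x + y * y
    cos_elbow = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return shoulder, elbow

print([round(math.degrees(a), 1) for a in ik_2link(12.0, 6.0)])
```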

Robotic Dog

We created a robotic dog that could perform several walking variations, including going forward, going backward, turning left, and turning right. We also implemented a web server for users to control the robot’s movements remotely. We mounted a camera onto the robot to provide users with a first-person perspective and displayed the camera view on the web server. We also used the PiTFT to display a GUI so users could stop/resume the robot’s action or quit the program. The PiTFT screen also showed a history of the robot’s five most recent movements.
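
A minimal sketch of such a control server using Flask (the framework choice, route names, and stubbed gait calls are assumptions), including the five-entry movement history shown on the PiTFT:

```python
# Hedged sketch: each route triggers one of the dog's gaits and records it.
from flask import Flask

app = Flask(__name__)
history = []  # the PiTFT shows the five most recent movements

def do_move(name):
    history.append(name)
    del history[:-5]            # keep only the last five entries
    # ...the real gait routine for `name` would run here...
    return f"ok: {name}"

@app.route("/forward")
def forward():
    return do_move("forward")

@app.route("/left")
def left():
    return do_move("left")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)  # reachable from the user's browser
```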

Distracted Driving Monitor

The purpose of the project is to develop an integrated system that sits in a vehicle, collects vehicle information, and creates a driver report. The goal is to help fleet managers better understand their fleet and provide them with ways to improve their operations. This was a first step into that aspirational project, and we began with one portion specifically: driver attention. At a high level, the code uses OpenCV to detect a face in an image. If a face is detected, we then apply dlib’s facial landmark predictor. If a distracted driving instance is detected, a video buffer captures X seconds before the eyes close and Y seconds after (X and Y are attributes which can be set). This allows the fleet manager to see and confirm whether there was in fact an instance of distracted driving.
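
A common way to detect eye closure from dlib landmarks is the eye aspect ratio (EAR), which falls toward zero when the eyes close. The sketch below assumes the standard 68-point model and uses dlib's own face detector in place of the project's OpenCV detector; the threshold is an assumption:

```python
# Hedged sketch of an eye-closure check via the eye aspect ratio.
from math import dist
import dlib

detector = dlib.get_frontal_face_detector()
# The standard 68-landmark model, downloaded separately from dlib.net:
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

EAR_CLOSED = 0.2  # assumed threshold below which the eye counts as closed

def eye_aspect_ratio(pts):
    """pts: the six landmark (x, y) tuples of one eye, in model order."""
    p1, p2, p3, p4, p5, p6 = pts
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def eyes_closed(gray_frame):
    for face in detector(gray_frame):
        shape = predictor(gray_frame, face)
        left = [(shape.part(i).x, shape.part(i).y) for i in range(36, 42)]
        right = [(shape.part(i).x, shape.part(i).y) for i in range(42, 48)]
        ear = (eye_aspect_ratio(left) + eye_aspect_ratio(right)) / 2.0
        return ear < EAR_CLOSED
    return False  # no face found
```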

Predator and Prey: Hand Held Game

Pygame is a powerful gaming library that we learned to use throughout the semester. With a shared passion for games, we decided to create a handheld game that can be played on the monitor or PiTFT. Our game is a two-player game called Predator and Prey, in which each player controls four buttons to move up, down, left, and right. The game contains four levels – tutorial, easy, medium, and hard – with the basic mechanic being a catch-and-run game between the predator and the prey; additional tools, including traps, speed-ups, portals, and a hidden path, are added as the levels go up. A scoring system is also implemented.

Document Scanner

Not every household has a scanner at home, and many scanning apps nowadays require active subscriptions. Therefore, we plan to implement a free-to-use document scanner system that can be easily used at home and carried in a pocket. The system focuses on two kinds of functions: document scanning and form filling using Optical Character Recognition (OCR) techniques. The document scanning uses image processing techniques to detect document-like contours in the camera’s video frames. When a valid document is found, the user can apply a monochrome filter to the document, as well as scan it. The scanned result is displayed on the PiTFT, and if the user is satisfied with the resulting document, a simple button click saves the image files to a designated Baidu cloud drive.
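
A minimal sketch of the contour search, assuming the common Canny-plus-approxPolyDP approach; the thresholds are illustrative:

```python
# Hedged sketch: find the largest 4-corner contour in an edge map and
# treat it as the page.
import cv2

def find_document(frame):
    """Return the 4-corner contour of a document-like shape, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(cv2.GaussianBlur(gray, (5, 5), 0), 75, 200)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in sorted(contours, key=cv2.contourArea, reverse=True):
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if len(approx) == 4:          # four corners -> document candidate
            return approx
    return None
```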

Remote Desktop Environment

A Minecraft server was implemented on the Raspberry Pi 4 to act as a test of the networking and server-hosting abilities of the machine. This also satisfied our proposed goal of promoting human interactivity through server-oriented and network-based technologies, even if it may be categorized as recreational. After confirming that the Raspberry Pi 4 was up to the task, more advanced development began: designing and deploying a full remote desktop environment on the Raspberry Pi 4. This allows a user to access a desired personal computer from anywhere on the planet, seamlessly streaming its hosted contents while transmitting inputs and changes back to the host.

Scouting Owl

Scouting Owl is a robot that allows rescuers to detect and communicate with survivors in an environment that is not yet suitable for in-person rescue. To perform this critical rescue task, we built a Raspberry Pi 4 based robot with the following functions:
- Establish a reliable half-duplex connection between the robot and the control device.
- Fetch and display a live IR camera feed along with sensor data on the display device.
- Send commands from the control device to the robot.
- Map the surrounding area and display the map on the screen.

Gesture Controlled Bluetooth Speaker

Today, nearly everyone has Bluetooth-enabled devices that carry their music wherever they go. While many people bring a set of headphones to listen to their music, it’s always nice to share it with more people via a good Bluetooth speaker. For many college students, a portable, stylish Bluetooth speaker is an essential item in the dorm. Our project was to create a gesture-controlled Bluetooth speaker using two Raspberry Pis, where users can use hand gestures to control music playback. Additionally, the speaker’s LED lighting visually indicates its state to users.

Robotic Navigation

The goal of the Robotic Navigation project is to create an adaptive robot tour guide that changes its behavior based on the user while navigating a known map. The inspiration for this project is a robot tour guide that can lead a human to selected locations within a predetermined map. The robot extends the applications from Lab 3 of ECE 5725 by adding sensors and functionality to the autonomous robot. Wall collisions are prevented by using ultrasonic distance sensors to detect the walls of a hallway and correct the robot’s direction. Adaptive behavior is implemented by using a camera to track the user’s distance from the robot; the calculated distance then modifies the PWM signal driving the robot’s wheels. Additionally, a GUI was designed to allow the user to select a predetermined destination. The robot then uses Dijkstra’s shortest-path algorithm to plan a path between points, and travels to known locations in the hallway using Bluetooth beacons and beacon scanners.
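
A minimal sketch of the path planner, with a hypothetical four-node hallway graph standing in for the project's actual map:

```python
# Hedged sketch: Dijkstra's algorithm over a graph of hallway waypoints.
import heapq

def dijkstra(graph, start, goal):
    """graph: {node: [(neighbor, cost), ...]}. Returns the shortest path."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nbr, w in graph.get(node, []):
            if nbr not in seen:
                heapq.heappush(queue, (cost + w, nbr, path + [nbr]))
    return None

hall = {"A": [("B", 3)], "B": [("A", 3), ("C", 2), ("D", 5)],
        "C": [("B", 2), ("D", 1)], "D": []}
print(dijkstra(hall, "A", "D"))  # -> ['A', 'B', 'C', 'D']
```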

Touchless Music Player

The Touchless Music Player uses a Raspberry Pi to function as a jukebox, allowing users to build their own personalized playlist and select a song to be played on a speaker connected to the Raspberry Pi. Each user has their own QR code; after scanning it, their playlist is displayed on the PiTFT, from which they can select the song they want to play using hand gestures.
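
A minimal sketch of the QR step, assuming the pyzbar library and a QR payload that simply encodes the user id; the playlist data is hypothetical:

```python
# Hedged sketch: decode a QR code from a camera frame, then look up
# that user's playlist.
import cv2
from pyzbar.pyzbar import decode

playlists = {"alice": ["song1.mp3", "song2.mp3"]}  # hypothetical data

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    for code in decode(frame):
        user = code.data.decode("utf-8")  # QR payload = user id (assumed)
        songs = playlists.get(user, [])
        print(f"{user}: {songs}")         # would be drawn on the PiTFT
```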

ExPiRior

Technology has always played a crucial role in times of crisis, and the current pandemic has challenged innovation like no other time. In the past year, we have seen several instances of the pandemic channeling technological ingenuity: open-source ventilators, 3D-printed masks, experimental vaccine methods brought into production – the list goes on. To add to it, our project “ExPiRior” (derived from the Latin word for “test”) is an autonomous COVID testing site modelled after the coronavirus testing centers at Cornell. It is an embedded system, consisting of a Raspberry Pi and a robotic arm, aimed at replacing human presence at the testing sites. Given the rate of transmission and the prevalence of different infectious strains of the virus, we thought it necessary to minimize human-to-human contact to curb the rise in infections. Since testing centers see the highest number of potential positives, minimizing human interaction at these centers by reducing the number of medical staff needed would help bring down the chances of exposure.

RPiCloud Storage Service

Traditionally, people use cloud drives on the Internet to store their documents. However, traditional cloud drives present several long-standing annoyances. Download speeds are often restricted, so retrieving large files wastes a lot of time. Storage capacity is also limited, so large amounts of data cannot be stored. More seriously, the information inside a cloud drive may be disclosed, damaging our privacy. Some people instead buy a hard drive and carry it with their laptop, but that is inconvenient and burdensome. For these reasons, we came up with a novel solution: a faster, more convenient, and safer cloud drive built from a Raspberry Pi and external drives, which can be accessed from both local and remote networks.

Fair Cake Cutter

For the final project, we built a cake cutting tool that cuts a round cake evenly into a desired number of pieces. The goal of the project was to allow the user to use the PiCamera to align their cake with the cake stand, after which the robot carries out the cutting motion. The full setup includes a separate cake stand that is attached to the robot structure on the left by a gear. The gear is attached to a continuous rotation servo, which is controlled by the Raspberry Pi. The number of slices determines how far the gear rotates, which is controlled by the servo’s duty cycle. A DC motor drives the rack and pinion responsible for the cake cutting motion. Initially, image and video capture from the PiCamera, combined with OpenCV, determine the cake’s circumference and center; the user is then asked to move the cake on the stand until it is centered. The user can then input any number from 1 to 16 as the desired number of cake slices. The gear rotates, and the rack and pinion carry out the cutting motion by sliding the cutter up and down. A panic stop button is also implemented so that the user can stop both the rotating and cutting motions if needed.
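
A minimal sketch of the per-slice rotation, assuming a timed continuous-rotation servo; the pin, duty cycle, and seconds-per-degree calibration are all assumptions the project would tune:

```python
# Hedged sketch: after each cut, rotate the cake by 360/n degrees.
import time
import RPi.GPIO as GPIO

SERVO_PIN = 13        # assumed GPIO pin for the servo signal
SEC_PER_DEG = 0.005   # assumed calibration of the continuous servo

GPIO.setmode(GPIO.BCM)
GPIO.setup(SERVO_PIN, GPIO.OUT)
pwm = GPIO.PWM(SERVO_PIN, 50)          # 50 Hz servo signal
pwm.start(0)

def rotate_degrees(deg):
    pwm.ChangeDutyCycle(8.5)           # assumed slow-rotate duty cycle
    time.sleep(deg * SEC_PER_DEG)      # run long enough to cover `deg`
    pwm.ChangeDutyCycle(0)             # stop

n_slices = 8                            # user input, 1..16
for _ in range(n_slices):
    # ...lower and raise the cutter with the DC motor here...
    rotate_degrees(360.0 / n_slices)
pwm.stop()
GPIO.cleanup()
```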
