Most of you have probably played the Chrome dinosaur game while stuck with a bad Internet connection. The game is about steering a small dinosaur to dodge obstacles by jumping or ducking: very simple, but fun. Inspired by the Chrome dinosaur game, we implemented our own version with extended functionality and control schemes to make the game more interactive and enjoyable. The game can be played in two modes: object control and button control. In object control mode, the user moves a colored object to control the dinosaur's movement. In button control mode, the user simply controls the dinosaur by pressing buttons on the piTFT.
This project develops a plane-fighting game with motion sensing and mouth-opening detection. The game starts with an initial loading screen. A bash script runs the game in the foreground and the detection in the background, so the game starts automatically on the PiTFT when the Pi is powered on. The plane flies freely on the screen, controlled by the user's hand movement as read through the BNO055 IMU sensor. The player starts with three lives and three bombs, and can release a bomb by dropping the sensor: when the sensor reads a z-axis acceleration greater than the preset threshold, the game control code receives a command, releases the bomb, and clears the entire screen of enemies. There are three types of enemies: one can be shot down with a single hit, while the other two carry an indicator of their remaining lives.
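The bomb-release check described above can be sketched as a simple threshold test on the z-axis reading. The threshold value and all names below are our own illustration, not the project's actual code:

```python
Z_ACCEL_THRESHOLD = 15.0  # m/s^2; assumed value for the "preset threshold"

class BombTrigger:
    """Tracks the remaining bomb count and decides when a drop gesture fires one."""

    def __init__(self, bombs=3):
        self.bombs = bombs  # three bombs at the start of a game

    def update(self, z_accel):
        """Return True if this sensor reading should release a bomb."""
        if self.bombs > 0 and z_accel > Z_ACCEL_THRESHOLD:
            self.bombs -= 1
            return True
        return False
```

In the real game loop, a `True` result would be forwarded to the game control code to clear the screen of enemies.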
For our project, we developed a reptile breeding system that can automatically monitor the habitat and send alerts. The system uses DHT11 sensors to measure the temperature and humidity of the reptile's living environment. When the temperature or humidity falls outside the appropriate range, the system automatically sends a warning text message to the breeder's mobile phone. It also uses a camera and OpenCV to monitor the reptiles' motion. When the reptiles move too frequently, the camera captures a fresh photo and uploads it to the system's web page for the breeder to view. The temperature, humidity, and motion counts are stored in a database and displayed as line graphs on the web page. We created a web page where breeders can view real-time temperature and humidity, historical line graphs of temperature, humidity, and motion, and the latest photo. The time range of the historical data can be selected arbitrarily.
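The alert decision reduces to a range check on the two DHT11 readings. The ranges below are placeholders for illustration, not the project's actual thresholds:

```python
# Assumed comfort ranges; a real deployment would tune these per species.
TEMP_RANGE = (24.0, 32.0)      # degrees Celsius
HUMIDITY_RANGE = (40.0, 70.0)  # percent relative humidity

def needs_alert(temp_c, humidity_pct):
    """Return True when either reading falls outside its comfort range."""
    temp_ok = TEMP_RANGE[0] <= temp_c <= TEMP_RANGE[1]
    humidity_ok = HUMIDITY_RANGE[0] <= humidity_pct <= HUMIDITY_RANGE[1]
    return not (temp_ok and humidity_ok)
```

When `needs_alert` returns `True`, the system would hand off to its SMS-sending path.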
Out of a great passion for game development, our team created a handheld game, played on a monitor, using the pygame library we learned this semester. This game is a typical embedded project because of its tight interaction between the software and hardware levels. It is a one-player game called "Jump Jump Jump". The player controls the game character through the hardware: specifically, tilting an accelerometer moves the character left and right. The game contains different levels and power-ups, including rockets and extra lives, and a scoring system is also implemented. The game benefits from efficient human-computer interaction, data processing, and communication between its components.
Playing games that require precise touch positions, like Gomoku, on a small screen often drives players with big fingers mad. Therefore, we redesigned the Gomoku game so that it only requires the player's voice. Busy with your hands but want to play Gomoku in the meantime? All you need to do is open the game and say "[a number] dash(-) [a number]". The game automatically places your chess piece at the exact position you commanded on the board. This hands-free play style brings endless possibilities to game development. On the hardware side, the game runs on a Raspberry Pi connected to a USB microphone and a PiTFT screen that displays the board, and two LEDs represent the two players, indicating whose turn it is to give a speech command. On the software side, a Python-based speech recognition library and the Google Cloud Speech API are used for speech recognition, and the basic Gomoku game is built with the pygame library.
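Turning a recognized transcript like "3 dash 7" into a board coordinate can be sketched as follows. The function name, the accepted spellings, and the 15x15 board size are our assumptions, not the project's actual parser:

```python
import re

BOARD_SIZE = 15  # assumed standard Gomoku board

def parse_move(transcript):
    """Parse a spoken command such as '3 dash 7' (or '3-7') into (row, col).
    Returns None when the transcript is not a valid board position."""
    m = re.fullmatch(r"\s*(\d+)\s*(?:-|dash)\s*(\d+)\s*", transcript.lower())
    if not m:
        return None
    row, col = int(m.group(1)), int(m.group(2))
    if 1 <= row <= BOARD_SIZE and 1 <= col <= BOARD_SIZE:
        return row, col
    return None
```

A valid result would then be handed to the pygame board code to place the piece.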
Since the outbreak of COVID-19 in 2019, the disease has spread globally, the epidemic has continued to the present, and it has become one of the deadliest pandemics in human history. For a long time, people have been closely following COVID-19 case data from around the world. Our project uses web crawlers to scrape epidemic data from the web, then visualizes the scraped data and displays it on the piTFT. We implement the crawler in Python, select a web page carrying the data we are interested in as the target, scrape its data every hour, and store the results locally. We then read the stored data to generate maps and tables for visualization.
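The hourly store step might look like the following sketch. The JSON-lines format and all names are assumed for illustration; the project's actual storage scheme is not specified:

```python
import json
import time

def store_snapshot(records, path):
    """Append one timestamped snapshot of scraped records to a JSON-lines file.
    A cron entry or timed loop would call this once per hour after each crawl."""
    snapshot = {"ts": int(time.time()), "data": records}
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(snapshot) + "\n")
```

Appending one line per crawl keeps the history readable for the later map and table generation.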
The COVID-19 pandemic has changed the way the world works! Not only have people’s jobs moved virtual, but also a lot of recreational activities have adapted themselves to an online platform. One of the activities that most people have not been able to participate in is personal gym training sessions. With this project, we aim to create a virtual gym trainer, ‘Workout Buddy’, which comprises a Raspberry-Pi based system that can let a user decide which body part they want to work on, and then starts counting and recording their activities while displaying the number of repetitions on the screen. Once the user is done with their workout, they can choose to send an email with their entire workout report to themselves with just a click on the touchscreen interface.
We are going to design and implement a game similar to Doodle Jump, based on Pygame and an accelerometer. There is an endless series of platforms, and the goal is to keep the jumping alien from falling. Our final game box acts as an embedded device with a user interface based on the RPi 4. The integrated circuit, comprising the Pi and the accelerometer, carries out computation based on the user's behavior and responds accordingly in real time.
Today, vehicles are widely used and parking has become an unavoidable problem. One situation people may encounter is a car that must park in a designated spot, for example a private spot marked with a license plate number, and drivers can easily forget where their spots are. This project designs a robot car that can recognize license plate numbers in a parking lot and back itself into the spot matching its own plate number. In our parking lot, license plate numbers are composed of letters and digits to simulate the real-world situation. The robot was placed at a starting line to detect the right parking spot, and we used the RPi to control its movement.
Remember the game in which the player controls a snake to eat apples? Here’s a 3D LED version! A 6x6x6 3D LED cube is designed and implemented to run the snake game. The player controls the snake via six push buttons on a Raspberry Pi, which serves as a master device to indicate how the Arduino Mega should drive the 3D LED cube. Come play and try to get the highest score!
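The core of 3D snake is stepping the head one voxel at a time through the 6x6x6 grid in response to the six buttons. This sketch uses our own names, and treating a wall hit as fatal is an assumption about the rules:

```python
GRID = 6  # the cube is 6 x 6 x 6 voxels

# Each of the six push buttons maps to a unit step along one axis.
DIRECTIONS = {
    "x+": (1, 0, 0), "x-": (-1, 0, 0),
    "y+": (0, 1, 0), "y-": (0, -1, 0),
    "z+": (0, 0, 1), "z-": (0, 0, -1),
}

def step(head, button):
    """Advance the snake's head one voxel; return None on a wall collision."""
    dx, dy, dz = DIRECTIONS[button]
    nxt = (head[0] + dx, head[1] + dy, head[2] + dz)
    if all(0 <= c < GRID for c in nxt):
        return nxt
    return None
```

The Raspberry Pi would run this logic and tell the Arduino Mega which voxels to light.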
GoBang is an easy but interesting chess game, yet it is not easy to find someone to play with: your friends are busy, or their skills don't match yours. Although many online GoBang games are available on the Internet, they can only be displayed on a screen and cannot provide the feeling of touching the chess pieces, which is a vital part of playing chess. Therefore, our aim is to combine software with hardware so that people can not only play on a real chessboard with real pieces but also play against a powerful AI that never gets tired. In addition, people can practice their chess skills and prepare for matches.
Osu! is a rhythm game where players perform a series of actions using a keyboard and mouse along to the beat of the music. The basic idea is that circles show up on the player’s screen, and they have to click these circles along to the beat of the song. Clicking in the wrong location or not at the right time will lead to a lower score. For our final project, we wanted to create a system that automatically plays Osu! and is able to achieve high scores. Knowledge from several areas was used, including computer vision, multiprocessing, image processing, and computer networking.
As its name suggests, our final project, the Item Retrieving Robot, is a robot that can identify items and retrieve them to a designated place. The project consists of four main parts: 1) target scanning and feature-point extraction; 2) circuit layout and chassis control; 3) feature loading and target identification; 4) design of the robot's operating algorithm and parameters.
The purpose of this project is to make teaching easier. It embeds the most important functions required in everyday school teaching into a web front end, a server, and a Raspberry Pi. These functions include taking attendance, making surveys, showing timetables, and publishing announcements.
Listening to music can be an enjoyable moment for celebration, an escape from the daily hustle, or simply warm company at midnight. What if the senses we use to feel music could be expanded? Would that enhance the experience a person gains when interacting with music? We set out to find the answer by implementing a remotely controlled music visualizer to explore the relationship between music and the sense of sight. In this project, we built a 6x6x6 RGB LED cube as a music visualizer. It enables users to 'see' the music they are playing on their phone. The LED cube is controlled by six shift registers, six N-channel MOSFETs, and an Arduino Uno. Music control and analysis are done on a Raspberry Pi with packages such as Pandas and Librosa. Bluetooth is used to communicate between the mobile phone and the Raspberry Pi, while the communication between the Raspberry Pi and the Arduino runs over serial.
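One simple way to map the audio analysis onto the cube is to convert each frame's loudness into a number of lit layers. The names, full-scale value, and mapping below are our assumptions; the project's actual Librosa feature pipeline may differ:

```python
import numpy as np

CUBE_HEIGHT = 6  # layers in the 6x6x6 cube

def layers_to_light(frame, full_scale=1.0):
    """Map one audio frame's RMS loudness to a number of lit layers (0-6).
    `frame` is a 1-D array of samples; `full_scale` is the amplitude
    that should light the whole cube."""
    rms = float(np.sqrt(np.mean(np.square(frame))))
    level = min(rms / full_scale, 1.0)  # clamp to the cube's range
    return round(level * CUBE_HEIGHT)
```

The Raspberry Pi would send the resulting layer count to the Arduino over serial each frame.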
The Multi-Raspberry Pi Compute Server is a project that is designed to showcase the computational advantages of a Raspberry Pi cluster computer. The cluster consists of four (4) total Raspberry Pis, out of which, three (3) are configured as “workers” and one (1) is configured as a “manager”. Edge detection using Sobel filters with and without multithreading with OpenMP is utilized to test the cluster’s performance. An end user application for image processing using this cluster is developed to assist programmers who can benefit from the computational advantages of the high number of cores with respect to the low cost of a Raspberry Pi.
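As a single-threaded reference for the Sobel edge detection being benchmarked, a minimal NumPy sketch of the gradient-magnitude computation might look like this (the cluster's actual implementation uses OpenMP, and these names are ours):

```python
import numpy as np

# Sobel kernels for horizontal (KX) and vertical (KY) gradients.
KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
KY = KX.T

def sobel_magnitude(img):
    """Gradient magnitude of a 2-D grayscale image (zero-padded borders)."""
    h, w = img.shape
    padded = np.pad(img.astype(float), 1)
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    # Accumulate the 3x3 correlation as nine shifted, weighted copies.
    for i in range(3):
        for j in range(3):
            window = padded[i:i + h, j:j + w]
            gx += KX[i, j] * window
            gy += KY[i, j] * window
    return np.hypot(gx, gy)
```

In the OpenMP version, the per-row work of this loop is what gets divided among cores, which is what the cluster benchmark measures.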
The purpose of this project is to create a self-contained system that monitors and alerts the driver when distracted. There are several categories within ‘distracted driving’ that we considered. Using OpenCV and DLib to detect a face and extract facial features from picamera footage, we can detect whether the driver is drowsy, is facing away from the road, or is looking away from the road. In addition, a capacitive touch sensor is used to detect whether the driver’s hands are on the steering wheel. An OBDII scanner is used to log the speed of the vehicle in real-time. To avoid unnecessary alarms, the alarms are paused when the car is safely stopped.
This is a final project for Cornell University ECE 5725, Embedded Operating Systems. The project is a Raspberry Pi robot mover with a manual mode and an auto mode. In manual mode, the user controls the robot with an Android mobile phone and watches real-time video from the Pi camera mounted on the front of the robot. When the user switches to auto mode, the robot follows the user by capturing and remembering the color of the user's shoes. The project aims to make people's lives more convenient and help maintain social distancing during the epidemic, for example by carrying heavy goods or delivering objects.
Fruit Ninja is a classic, famous touchscreen game that has won people's love over the years. Inspired by this fascinating game, we set out to develop a motion-controlled Fruit Ninja. Players hold a "sword" in their hands and use it to "cut" the fruits in front of the camera to score points.
We designed a smart guest management system intended to help public buildings, such as apartment complexes and office buildings, improve management efficiency. Unlike modern smart locks for homes, our system gives administrators more flexibility and a wide range of functionality for daily needs, especially in the COVID era, such as retrieving package-delivery and takeout information, managing personnel entry and exit, and activating building alarms.
We began our design process with empathy fieldwork: observations, engagements, and immersions that we hoped would reveal unaddressed needs and pain points. These needs and pain points, in turn, would serve as the motivation for our product and guide the rest of the design process. Eventually, our empathy fieldwork introduced us to an interesting pain point that arises when college students cook meals. We noticed that our roommates and friends often use personal devices to look up recipes or stream TV while cooking. Many students find that using a personal device is difficult because hands become wet or dirty while cooking, and as a result, they end up repeatedly washing and drying their hands in order to touch their phone screen to scroll through a recipe, start the next TV episode, or adjust their device's position as they move around the kitchen. This process is an annoying and significant pain point. Furthermore, due to the pandemic, more and more college students are cooking at home, so this pain point is pressing and a great one to address.
People do not have enough memes in their lives, and we wanted to build a robot that would autonomously find Cornell students and show them memes. This is why we came up with the Memeba. It is powered by Raspberry Pi and Python, and it intelligently traverses unmapped environments to avoid obstacles and find people. Once the Memeba detects a person in front of it, it stops to display the latest memes from the internet!
Over this semester's four lab sessions we became familiar with the Linux operating system, a secure, flexible, and reliable kernel for servers. For this project we use a Raspberry Pi 4, a fully functional Linux machine with a 4-core ARM CPU and 2 GB of RAM. The idea came to us that the RPi 4 could serve as the host for a web application running in its Linux environment. The next decision to consider was what functionality the website should provide. As new Cornell students, we found when we first arrived that we needed to buy a lot of things for daily life. Buying all of these necessities new online is a huge expense, so the demand for a second-hand trading platform is real. Our idea, therefore, was to build a platform where students can sell items they no longer need and other students can buy them.
Driving around Ithaca is challenging due to narrow streets with steep slopes and sharp turns. Speed limits are a good way to ensure safe driving under such road conditions. The PiSpeed Camera project implements a device that can be placed at the roadside to measure the speed of passing vehicles. The project uses a Raspberry Pi 4 as the controller, two ultrasonic sensors to measure and calculate the speed of cars, and a Raspberry Pi Camera to capture an image of the license plate. The OpenCV library processes the captured image to crop the license plate region, and optical character recognition (OCR) then converts the image into text. A local server on the Raspberry Pi was also set up so that the detection history of speeding cars, along with their license plates, can be displayed on a website.
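With two ultrasonic sensors mounted a known distance apart along the road, speed follows directly from the difference in their trigger times. The sensor spacing below is an assumed value, not the project's measured geometry:

```python
SENSOR_SPACING_M = 1.0  # assumed distance between the two ultrasonic sensors, meters

def speed_kmh(t_first, t_second):
    """Vehicle speed in km/h from the timestamps (seconds) at which the
    first and second sensors detected the passing car."""
    dt = t_second - t_first
    if dt <= 0:
        raise ValueError("second sensor must trigger after the first")
    return SENSOR_SPACING_M / dt * 3.6  # m/s -> km/h
```

A speed above the posted limit would then trigger the camera capture and OCR pipeline.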
Having watched several videos of projects developed by previous teams on the course website, and being fanatic game players, we decided without hesitation to make an RPG game with some kind of game controller (later designed around a gyroscope). The first game that came to mind was 'Feeding Frenzy', a very simple game that used to be famous. To make the game more interesting, we also decided to add some functions and characters, which we will discuss later.
We implemented a multifunctional, interactive pan-tilt system, using a Raspberry Pi, a camera, servos, and a gimbal as the hardware, and OpenCV, Flask, and Python for software development. First, the pan-tilt system performs face recognition: the faces of our team members can be recognized after training a Haar cascade. Second, we implemented face tracking, controlling the servos to follow a face in real time. Finally, the Flask framework is used to monitor and control the pan-tilt system through the web, which enables remote video viewing and rotation to specific angles.
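The face-tracking step typically converts the detected face's horizontal offset from the frame center into a small servo correction each frame. The gains, deadband, and names below are hypothetical, not the project's tuned values:

```python
def servo_step(face_cx, frame_w, max_step_deg=2.0, deadband_px=20):
    """Pan correction in degrees to re-center a detected face.
    `face_cx` is the face's center x in pixels; `frame_w` is the frame width.
    A small deadband avoids jitter when the face is already near center."""
    error = face_cx - frame_w / 2
    if abs(error) < deadband_px:
        return 0.0
    step = max_step_deg * error / (frame_w / 2)  # proportional to the offset
    return max(-max_step_deg, min(max_step_deg, step))
```

The same computation with the vertical offset would drive the tilt servo.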
The personal note writer was intended to be a tool to print out user inputs drawn onto a canvas. Gesture control and vision tracking allowed the user full control of a user interface directly connected to a 2D plotter for immediate output. The goal was to use a self-trained machine learning model to recognize words (or numbers) written on the canvas and print their outputs in regular typeface. The inclusion of math and word models would make word identification, as well as solving simple arithmetic, possible. The team created a compact drawing interface controlled by vision tracking and gesture recognition. The interface is connected directly to a 2D plotter which takes image input, mapped to an array of vectors, and translates those vectors into 3-axis motion of a pen. The x- and y-axes were used for drawing, and the z-axis was used to lift the pen off the paper and put it back down when necessary. A significant effort was directed at getting the character recognition models working. Though they worked well offline (not on the Raspberry Pi but on local machines), package incompatibility meant that the team needed to make the hard decision to cut them from the project. In any case, the work that went into building the models will be discussed here so that later iterations of the design can focus more heavily on model integration.
In this project we built a lie-detection system from three biomedical sensors, each tested with different methods and integrated into our design: a high-accuracy temperature sensor, a GSR sensor, and a pulse sensor. Their data are sent to an Arduino Uno, which transmits them to a Raspberry Pi 4. We extract data from these biomedical sensors in real time and then make a reasonable judgment about whether the person is lying. To show a straightforward result in the final demo, we implemented a simple ball-in-cups guessing game and a dashboard showing the nervousness level, both using the pygame library.
In our project we designed, assembled, and tested an automated air hockey table. Our goal was to have one of the players in the traditional air hockey game be controlled by motors, and the other player be a human competing against the "robot" player. We wanted the robot player's movements to be educated, strategic decisions based on the position and movement of the puck. While we all enjoy playing air hockey, the issue with the game is that it always requires having another person with you to play it. For only children, people without many friends, or people isolated due to COVID, this creates a huge restriction on the ability to play air hockey. With our automated air hockey table we solve this problem, allowing people to enjoy the fun game even if they are all alone. At the heart of our system is a Raspberry Pi 4, which does everything from touchscreen UI and motor control to object tracking and strategic planning. We started with a normal air hockey table and finished with a robot opponent able to play against a human on the same table.
In the 5725 Lab 3, our team finished a four-wheel car, controlled it to move forward and backward, and changed its speed to move faster and slower. An intriguing idea then came to our minds: make a two-wheel car that balances itself! Thus, for this final project, we chose to implement our self-balancing car 'WALL-E'. The robot gets its name because we mounted a gimbal with a camera on top of the car, so it looks just like the 'WALL-E' robot. We built an Android app to control the gimbal and rotate the camera, so we can see the world in different directions through WALL-E's eye. WALL-E balances and moves stably by reading its acceleration and computing the corresponding tilt angle; the change in angle determines the motor speed. For example, when the robot tips forward, the controller increases forward speed to keep WALL-E balanced.
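A minimal sketch of the balancing idea follows: estimate the tilt angle from the accelerometer, then drive the motors proportionally to the error. The gain value is assumed for illustration, and a real controller would add integral/derivative terms and gyro fusion:

```python
import math

def tilt_angle(accel_y, accel_z):
    """Pitch angle in degrees estimated from accelerometer readings alone.
    accel_y points along the direction of travel, accel_z points up."""
    return math.degrees(math.atan2(accel_y, accel_z))

KP = 25.0  # assumed proportional gain (motor units per degree of tilt)

def motor_command(angle_deg, target_deg=0.0):
    """Proportional correction: leaning forward drives the wheels forward
    to catch the fall; leaning back drives them backward."""
    return KP * (angle_deg - target_deg)
```

Setting a nonzero `target_deg` is one way to make the robot drive forward while staying balanced.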