ECE5725 Fall 2018 Projects

Violin Virtuoso

Our team set out to create a virtual 'air violin' using the Raspberry Pi 3 as the target platform. With two sensor-equipped gloves, the user can move their arms as if playing a violin to produce the desired note from the Raspberry Pi. The device also uses the PiTFT to show the user which string and note they are about to play, along with whether or not they are producing sound.

Power Saving System

Our project saves power by sensing the presence of a person in a room. People often leave their homes in a rush and forget to turn off appliances such as fans and lights. Our design senses the presence or absence of a person and turns the room's appliances on or off accordingly. A Raspberry Pi serves as the central unit, and each 'Node' consists of a Raspberry Pi Zero W and a passive infrared (PIR) sensor.
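As a rough sketch of the sensing logic on a node (the pin numbers, relay wiring, and timeout below are assumptions, not the team's actual configuration), a PIR-driven switch might look like this:

```python
import time
import RPi.GPIO as GPIO

PIR_PIN = 17      # assumed BCM pin connected to the PIR output
RELAY_PIN = 27    # assumed BCM pin driving a relay for the appliance

GPIO.setmode(GPIO.BCM)
GPIO.setup(PIR_PIN, GPIO.IN)
GPIO.setup(RELAY_PIN, GPIO.OUT, initial=GPIO.LOW)

IDLE_TIMEOUT = 60.0   # seconds of no motion before switching off (assumed)
last_motion = 0.0

try:
    while True:
        if GPIO.input(PIR_PIN):                   # PIR reports motion
            last_motion = time.time()
            GPIO.output(RELAY_PIN, GPIO.HIGH)     # appliance on
        elif time.time() - last_motion > IDLE_TIMEOUT:
            GPIO.output(RELAY_PIN, GPIO.LOW)      # appliance off after idle period
        time.sleep(0.2)
finally:
    GPIO.cleanup()
```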

Space Object Tracker

For space junkies, or just the curious-minded, it is always interesting to know what is flying far above our heads. Unfortunately, city lights and cloudy skies make it almost impossible to spot objects of interest. The goal of this project was to build an interactive display piece that can track and show the movement of nearby space objects. While many thousands of objects are flying around the Earth, we narrowed the objects of interest to active satellites in Low Earth Orbit (LEO); in total, about 1,200 satellites are searched for. The information gathered for each satellite includes its name, unique ID, launch year, corresponding launch number of that year, longitude, latitude, and altitude.

Remote Control Snake

The remotely controlled arcade game is built around a Pi Zero W connected over Bluetooth to a Raspberry Pi 3. The Pi Zero W has a keyboard which the player uses to control the game, and it sends the player's keystrokes to the Raspberry Pi 3, which can change the snake's direction, pause the game, resume the game, or quit the game. The output of the snake game is then drawn on an LED matrix (updated about four times a second) for the player to see. The end result is a small, remote-controlled snake game with an arcade-like feel provided by the LED matrix.
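As a minimal sketch of the keystroke relay on the controller side (shown here over a plain TCP socket as a stand-in for the project's Bluetooth link; the host, port, and key-to-command mapping are assumptions):

```python
import socket
import pygame

HOST, PORT = "192.168.1.50", 5005   # assumed address of the Raspberry Pi 3

# Map pygame keys to single-character commands the Pi 3 would interpret.
KEYMAP = {pygame.K_UP: "U", pygame.K_DOWN: "D",
          pygame.K_LEFT: "L", pygame.K_RIGHT: "R",
          pygame.K_p: "P", pygame.K_q: "Q"}

pygame.init()
pygame.display.set_mode((1, 1))     # a window is needed to receive key events

with socket.create_connection((HOST, PORT)) as conn:
    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.KEYDOWN and event.key in KEYMAP:
                cmd = KEYMAP[event.key]
                conn.sendall(cmd.encode())   # one byte per keystroke
                if cmd == "Q":
                    running = False
pygame.quit()
```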

Number Plate Recognition System

Our project aims to design a parking-barrier model with automatic number plate recognition. We used two Raspberry Pis: one controls a car fitted with a printed number plate, driving it linearly toward the barrier and stopping it if it is not allowed to enter; the other controls two infrared (IR) sensors that trigger the camera and servo, and performs the image processing and recognition. The parking system operates as follows: when the car moves within a certain distance of the barrier, a sensor reacts and sends a signal to the RPi. The RPi then takes a photo with the Pi Camera, processes the image with OpenCV, and converts the segmented character images into a string using OCR, so the input number plate can be checked against the plate strings stored in the "database". Once recognized, the plate is displayed on the PiTFT. If it matches one of the stored plates, the barrier lifts to let the car in; otherwise the car is kept out. After the car passes through the barrier and travels a certain distance, a second sensor behind the barrier reacts and signals the RPi, which lowers the barrier.
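A condensed sketch of the recognition step is shown below; the preprocessing values, the whitelist of allowed plates, and the use of pytesseract (rather than the team's exact OCR setup) are all assumptions:

```python
import cv2
import pytesseract

ALLOWED_PLATES = {"ABC1234", "ECE5725"}   # hypothetical stand-in for the "database"

def read_plate(image_path):
    """Return the plate string recognized in the captured photo."""
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    # Binarize so the characters stand out for the OCR engine.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    text = pytesseract.image_to_string(binary, config="--psm 7")  # single text line
    return "".join(ch for ch in text if ch.isalnum()).upper()

def barrier_should_open(image_path):
    """True if the recognized plate matches a stored plate."""
    return read_plate(image_path) in ALLOWED_PLATES
```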

Smart Trash Can

The trash can detects the user's presence using sensors and opens its lid when the user is nearby. In addition, the trash can supports voice recognition: the user can also open or close the lid by saying "open" or "close". A status display tells the owner whether it is empty, half full, or full, along with emoji faces based on its fullness; the fuller it gets, the unhappier the trash can looks. If it becomes full, it reminds the user to take out the trash at the end of the day by sending an email. With the Raspberry Pi, we turn a traditional trash can into a smart trash can and make life easier.
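The end-of-day reminder could be sent with Python's standard smtplib; the addresses, SMTP server, and credentials below are placeholders, not the team's actual settings:

```python
import smtplib
from email.message import EmailMessage

def send_full_reminder(owner_email):
    """Email the owner that the trash can is full (placeholder account details)."""
    msg = EmailMessage()
    msg["Subject"] = "Smart Trash Can: please take out the trash"
    msg["From"] = "trashcan@example.com"
    msg["To"] = owner_email
    msg.set_content("The trash can is full. Please empty it tonight.")

    with smtplib.SMTP_SSL("smtp.example.com", 465) as server:
        server.login("trashcan@example.com", "app-password")  # placeholder credentials
        server.send_message(msg)
```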

Object Finding Robot

The object-finding robot is inspired by the fact that many people with color blindness have trouble finding objects of the exact color they want, such as clothes and accessories. We designed this color-based object-finding robot to make their lives easier. In short, the robot looks for objects of a given color and drives toward them; a graphical user interface lets users choose the color simply by tapping the PiTFT touchscreen. At a high level, the robot is driven by two Parallax servos, with the Raspberry Pi serving as its brain. By processing input from the Pi Camera and infrared sensors, the robot can recognize different colors, navigate to the target-colored object, and avoid obstacles along the way.
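A minimal sketch of the color-targeting step, assuming an HSV threshold per selectable color (the ranges below are illustrative, not the team's tuned values):

```python
import cv2
import numpy as np

# Illustrative HSV ranges; a real implementation would tune these per color.
COLOR_RANGES = {
    "blue": (np.array([100, 120, 70]), np.array([130, 255, 255])),
    "green": (np.array([40, 80, 70]), np.array([80, 255, 255])),
}

def find_target(frame_bgr, color):
    """Return (cx, cy) of the largest blob of the chosen color, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower, upper = COLOR_RANGES[color]
    mask = cv2.inRange(hsv, lower, upper)
    res = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contours = res[-2]   # works with both OpenCV 3.x and 4.x return signatures
    if not contours:
        return None
    blob = max(contours, key=cv2.contourArea)
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
```

Steering then reduces to comparing the blob centroid against the frame center and adjusting the two servos accordingly.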

Portable Game Console

Our ECE5725 project is a portable game console. The core of the system is a CHIP-8 emulator written in Python. Users interact with the console via a numeric keypad, the PiTFT displays the menu and the game, and an external speaker plays sound. The whole system resides in a handmade cardboard box, which is environmentally friendly. The Pygame module sits at the top level of our implementation, displaying the user interface and coordinating all the other parts such as keypad input, sound output, and game display.
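To give a flavor of the emulator's core fetch-decode-execute loop, here is a generic CHIP-8 sketch covering a few opcodes; it is not the team's implementation:

```python
class Chip8:
    """Skeleton of a CHIP-8 interpreter showing the fetch/decode cycle."""

    def __init__(self, rom_bytes):
        self.memory = bytearray(4096)
        self.memory[0x200:0x200 + len(rom_bytes)] = rom_bytes  # programs load at 0x200
        self.v = [0] * 16      # general-purpose registers V0..VF
        self.i = 0             # index register
        self.pc = 0x200        # program counter

    def step(self):
        # Each opcode is two bytes, stored big-endian.
        op = (self.memory[self.pc] << 8) | self.memory[self.pc + 1]
        self.pc += 2
        x, nn, nnn = (op >> 8) & 0xF, op & 0xFF, op & 0xFFF

        if op & 0xF000 == 0x1000:      # 1NNN: jump to address NNN
            self.pc = nnn
        elif op & 0xF000 == 0x6000:    # 6XNN: set VX = NN
            self.v[x] = nn
        elif op & 0xF000 == 0x7000:    # 7XNN: VX += NN (no carry flag)
            self.v[x] = (self.v[x] + nn) & 0xFF
        elif op & 0xF000 == 0xA000:    # ANNN: set I = NNN
            self.i = nnn
        # ... remaining opcodes (draw, input, timers) omitted in this sketch
```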

Flappy Bird Game

Our project is a new version of the Flappy Bird game. It is a Pygame program written in Python. Users can play either on the PiTFT screen or on their computer. It has two play modes, single-player and dual-player. In single-player mode, each player has three lives, and their NetID and final score are written to a scoreboard that keeps track of the top 10 scores. Dual-player mode gives users the opportunity to compete with their friends. We also implemented two background settings, daytime and night, which the user can choose before playing. One difficulty for players is that the horizontal scrolling speed increases as time goes on. We also implemented a hidden trick in single-player mode: when the player reaches a score of 10, the flappy bird evolves into flappy Joe.

Slam Clean Robot

In this project, we aimed to build an autonomous indoor cleaning robot. First, it uses odometry and distance information to map the indoor environment with a SLAM algorithm. It then combines color and shape information to recognize cans, which serve as our test trash. Given a pixel location in the image, it converts it into world coordinates. Finally, it visits each piece of trash.
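One common way to turn a pixel location into floor coordinates is a ground-plane homography computed from four known reference points; whether the team used this exact method is not stated, and the calibration points below are placeholders:

```python
import cv2
import numpy as np

# Four image pixels and the matching floor positions in centimeters (placeholders).
image_pts = np.float32([[120, 400], [520, 400], [600, 220], [40, 220]])
world_pts = np.float32([[-20, 30], [20, 30], [40, 90], [-40, 90]])

# Homography mapping pixels on the floor plane to world (x, y) on the ground.
H = cv2.getPerspectiveTransform(image_pts, world_pts)

def pixel_to_world(u, v):
    """Project an image pixel of a point on the floor to ground coordinates (cm)."""
    p = np.float32([[[u, v]]])
    wx, wy = cv2.perspectiveTransform(p, H)[0][0]
    return float(wx), float(wy)
```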

Raspberry Pi AR Game

We developed an AR game based on the RPi. In this game, the player controls a robot over WiFi to explore the environment around it. There are monsters and other items for the player to find. The player can fight monsters to gain experience, or be defeated by them. Gaining enough experience levels the player up, and the monsters grow stronger as well.

RecycleBot

The objective of this project is to design and assemble a robotic arm that recognizes and picks out recyclable waste from a workspace of mixed waste. The arm should be able to approach its target with good accuracy; occasional mistakes can be tolerated. This recycling robot would both reduce the cost of waste management and reduce the amount of man-made waste in nature.

The Sweeper

For our ECE 5725 final project, we created a cleaning robot called the "Sweeper", whose job is to pick up "trash", deposit it at a specific location, and perform waste sorting. The hardware of the Sweeper consists of a Raspberry Pi, a camera, and five servos; the software runs on the Raspberry Pi and is written in Python using the OpenCV and Pygame libraries. We use OpenCV to identify the type of trash by its color and to estimate the distance to the target from its area in the video frame. The robot runs automatically and can be controlled via buttons displayed on the PiTFT screen to start, resume, and quit. When it picks up or drops a piece of trash, the Sweeper shows what it got on the screen.

Mindbot

An EEG, or electroencephalogram, is a test that measures the electrical signals output by the brain through electrodes placed on the subject's head. EEGs were historically reserved for use in the medical field as a diagnostic test. In recent years, however, many engineers have discovered exciting applications for EEG technology in research, control engineering, and consumer products, among others. Thanks to the growth in this field, not only has the technology improved, but so have the resources available to independent developers and the price points of commercial headsets. One such headset, the MindFlex, is available at very low cost as part of a fun family game. However, we think this headset is capable of much more than serving as a game controller. With "MindBot" we hope to show some of the possibilities that exist for EEG interfaces in modern technology and future advancements.

Multipurpose Operator

The objective of this project is to create a device that can perform operations on a flat surface. Nowadays, many mobile apps award bonuses for logging in and completing particular tasks every day (for example, earning credit points by logging in, tapping certain buttons, or watching videos). However, the process of collecting those bonuses can be boring and sometimes complicated. This leads to the first function of the device: performing a predefined touch-and-swipe pattern on a mobile phone to complete the tasks described above on the user's behalf. Adapting the idea of a 2D plotter, the second function of the device is to plot an image on paper with a pen mounted on the device.

Robot Cat

In this project, we developed a robot cat based on the Raspberry Pi and implemented several functions. First, we attached three ultrasonic sensors for obstacle avoidance: the cat turns left when the right sensor detects an obstacle, turns right when the left sensor detects one, and backs up when the front sensor detects one. Second, we installed OpenCV and used the Haar cascade algorithm for face detection; when the cat detects its owner, it walks toward them. Third, we used Pygame to display the output of the Pi Camera, so users can take photos with four different cat overlays at the tap of a button.
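A minimal sketch of the face-detection step using OpenCV's bundled Haar cascade (the cascade path and detection parameters are typical defaults, assumed rather than taken from the team's code):

```python
import cv2

# Recent opencv-python builds ship a pre-trained frontal-face Haar cascade.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def find_faces(frame_bgr):
    """Return a list of (x, y, w, h) boxes for detected faces."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5,
                                    minSize=(30, 30))
```

The robot can then steer toward the largest detected box, much as the color-tracking projects steer toward a color blob.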

Self Parking Car

We built a low-cost prototype that implements a parallel-parking algorithm on a mobile robot car using a Raspberry Pi, a camera, ultrasonic sensors, and an optical sensor. A hardware push button starts the self-parking process. On pressing the button, the robot moves forward while scanning for vacant spots on its side using the ultrasonic sensor array and the camera. On finding a spot of suitable dimensions, the robot moves forward and stops at an appropriate distance. It then makes a 45-degree turn and backs up into the spot; the ultrasonic sensors at the back allow it to reverse until it nears the obstacle behind it. The robot then makes another 45-degree turn in the opposite direction to straighten out. Finally, using the front and rear sensors, it positions itself within the spot.
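The gap scanning depends on reading distances from the ultrasonic sensors; a rough HC-SR04-style reading routine on the Pi (pin numbers assumed) looks like this:

```python
import time
import RPi.GPIO as GPIO

TRIG, ECHO = 23, 24   # assumed BCM pins for one ultrasonic sensor

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def read_distance_cm():
    """Trigger one ultrasonic measurement and return the distance in centimeters."""
    GPIO.output(TRIG, True)          # 10 microsecond trigger pulse
    time.sleep(0.00001)
    GPIO.output(TRIG, False)

    start = end = time.time()
    while GPIO.input(ECHO) == 0:     # wait for the echo pulse to start
        start = time.time()
    while GPIO.input(ECHO) == 1:     # wait for the echo pulse to end
        end = time.time()

    # Sound travels roughly 34300 cm/s; divide by 2 for the round trip.
    return (end - start) * 34300 / 2
```

A spot is considered vacant when successive readings jump well beyond the distance to a parked car for a long enough stretch of forward travel.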

Raspberry Pi Smart Home

The project is a Smart Home system: it takes several sensory inputs and processes them to perform "smart" tasks, in this case providing lighting and cooling to the users. The system has four input sources: a video camera, a temperature sensor, an ambient light sensor, and buttons for manual user input to override the automatic features. The temperature and light sensors provide details about the environment. The video camera uses OpenCV, a computer vision library, to detect the direction of motion and keep track of the occupancy of a room; it is meant to be placed near a doorway. The camera runs off a separate processor, a Raspberry Pi Zero, which communicates via Bluetooth with the main Raspberry Pi 3 that reads the other sensory inputs. Based on these inputs, the Raspberry Pi 3 outputs signals to turn the light and fan on or off.

Smart Door System

We designed a smart door system based on the RPi. It allows people to access the door using passwords, voice, or their faces. When someone knocks on the door, the system automatically sends an email with a website link to the owner's email address. The door streams high-resolution, low-delay live security video to that website, and the owner can view the stream by logging in. The owner can also control the door remotely from the website. In addition, the smart door system connects to the mobile app WeChat, which the owner can use to control the system remotely. At each step, the door gives the user audio instructions.

Rubik’s Cube Solver

For this project, we chose to build a robotic Rubik's Cube solver using a Raspberry Pi. Our goal was to build a robot that could scan a Rubik's Cube, compute the algorithmic solution, and execute the moves to solve the cube. We designed a fully custom mechanical chassis consisting of both laser-cut acrylic and 3D-printed parts. We used a Raspberry Pi for all processing and control, a Pi Camera to scan the faces of the cube, four stepper motors to rotate each face, and four solenoids to engage the claws on each face.
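To illustrate the move-execution stage for the four motorized faces (the pin assignments, steps-per-quarter-turn count, and move-string format are assumptions, and the solenoid engagement step is omitted):

```python
import time
import RPi.GPIO as GPIO

# Assumed (step, direction) pin pairs for the four face motors: Up, Down, Left, Right.
MOTOR_PINS = {"U": (5, 6), "D": (13, 19), "L": (20, 21), "R": (12, 16)}
QUARTER_TURN_STEPS = 50          # assumed steps per 90-degree face turn
STEP_DELAY = 0.002               # seconds between step pulses

GPIO.setmode(GPIO.BCM)
for step_pin, dir_pin in MOTOR_PINS.values():
    GPIO.setup(step_pin, GPIO.OUT)
    GPIO.setup(dir_pin, GPIO.OUT)

def turn_face(move):
    """Execute one move such as "R", "R'" (counterclockwise), or "R2" (half turn)."""
    face = move[0]
    step_pin, dir_pin = MOTOR_PINS[face]
    clockwise = not move.endswith("'")
    quarter_turns = 2 if move.endswith("2") else 1

    GPIO.output(dir_pin, GPIO.HIGH if clockwise else GPIO.LOW)
    for _ in range(QUARTER_TURN_STEPS * quarter_turns):
        GPIO.output(step_pin, GPIO.HIGH)
        time.sleep(STEP_DELAY)
        GPIO.output(step_pin, GPIO.LOW)
        time.sleep(STEP_DELAY)

for move in "R U R' U2 L D".split():    # example move-sequence fragment
    turn_face(move)
```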

Claw Pi Machine

Our claw machine, which catches the dolls inside it, has three modes. The first is hand mode, in which the user controls lever switches to move the claw; control signals are sent to the Raspberry Pi, which moves the claw accordingly. The second is voice mode, in which the user controls the claw by voice: the Raspberry Pi records the input, performs speech-to-text, and moves the claw in the specified direction for the specified duration. The last is auto mode, in which the user selects a doll on the PiTFT touchscreen; the Raspberry Pi performs real-time object detection to locate the selected doll, and closed-loop control moves the claw to that location and captures it.

Raspberry Pi Virtual Vibes

Our final project is a music device called "Virtual Vibes". The device consists of two parts: the main part is a Raspberry Pi base connected to 12 pairs of infrared emitters and sensors, and the second part is a specially designed stick with an attached absolute orientation sensor. This intelligent, dynamic device detects how hard you strike the infrared beams and responds with a corresponding volume of sound. With these two parts working together, users can create music with multiple layers of sound and make it smooth and rhythmic. To extend the design, we added two buttons: pressing one of them switches to another octave, and we also experimented with a different timbre played through the external speaker.

Intelligent Home Control System

In this project, we designed a system that controls the home's electrical appliances and arms the house with a tracking camera. The system has two parts. In the 'Controller' part, users can turn the lights and the fan on by touching the screen; the temperature and humidity of the house are shown on the 'Controller' page. In the 'Security' part, users set a PIN for their house, and with the correct PIN they can arm or disarm it. When the house is armed, the magnetic door sensor is active; if the door opens, the camera starts tracking the intruder's face, taking pictures, and sending an alert email to the user.

Rolling Ball

In this project, we created a game using the Raspberry Pi. We use an absolute orientation sensor and turn its accelerometer and gyroscope data into real-time three-dimensional orientation, so that the user can control the game with body motion, a new way to interact with the computer instead of the traditional keyboard and mouse. We use Pygame for game development and the PiTFT as the game's display. Players move the ball from a start point at one end of a randomly chosen map to an end point at the other end. The game has different modes with different difficulty levels and time limits, and for entertainment value the background music changes with the state of the game.

Telepresence Vehicle

For our final project, we built a vehicle system that can be controlled remotely anywhere there is WiFi. The vehicle has a built-in camera that transmits a video feed to the user at a base station, and the user controls the vehicle remotely via motion from the base station. The project was inspired by remote-control drones and vehicles; we wanted a way to control a vehicle even when the user cannot see it. In the end, the controller communicates with the vehicle by exchanging requests between their IP addresses over WiFi. The controller contains a sensor that captures motion data for vehicle control and sends that data to the robot using UDP sockets over WiFi. The vehicle receives the commands from the controller and moves accordingly. The user can see through the vehicle via an onboard camera that hosts a video stream over WiFi, displayed on the controller in a user-friendly way using Pygame.
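A minimal sketch of the UDP command link (the addresses, port, and packet format are assumptions, not the team's actual protocol):

```python
import json
import socket

VEHICLE_ADDR = ("192.168.1.60", 9000)   # assumed IP/port of the vehicle's Pi

# --- controller side: send the latest motion reading as a small JSON datagram ---
def send_motion(sock, pitch, roll):
    packet = json.dumps({"pitch": pitch, "roll": roll}).encode()
    sock.sendto(packet, VEHICLE_ADDR)

# --- vehicle side: receive datagrams and turn them into wheel speeds ---
def vehicle_loop():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", 9000))
    while True:
        data, _ = sock.recvfrom(1024)
        cmd = json.loads(data.decode())
        forward = cmd["pitch"]          # tilt forward/back sets speed
        turn = cmd["roll"]              # tilt left/right steers
        left, right = forward + turn, forward - turn
        # drive_motors(left, right)     # placeholder for the motor-driver call
```

UDP fits this use well: each datagram carries the latest reading, and a dropped packet is simply superseded by the next one rather than stalling the control loop.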

Sand Flow

Our work on the Sand Flow system simulates the motion of sand to realize a sand matrix game on the PiTFT screen. The PiTFT is hand-held, so users can control the direction of sand flow by tilting and rotating the screen. The project divides into hardware and software. The hardware part provides gravity sensitivity using the Adafruit BNO055 accelerometer, which detects how far the PiTFT screen has been tilted and rotated; the platform is a Raspberry Pi with the PiTFT as the display. On the software side, we implemented three modes (sand flowing around barriers, a nozzle mode, and swipe-to-flow) based on a cellular automaton algorithm. The color of the sand and the specific nozzle used to release sand are selected with the GPIO buttons on the board.
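As a simplified sketch of the cellular-automaton update (a plain "fall toward gravity" rule on a 2D grid; the real project also handles barriers, nozzles, and arbitrary tilt angles):

```python
import random

def step_sand(grid, gravity_dx):
    """Advance a 2D sand grid one tick; grid[y][x] is True where a grain sits.

    gravity_dx is -1, 0, or +1 to bias sideways motion from the screen tilt.
    """
    height, width = len(grid), len(grid[0])
    # Scan bottom-up so each grain moves at most once per tick.
    for y in range(height - 2, -1, -1):
        for x in range(width):
            if not grid[y][x]:
                continue
            side = gravity_dx if gravity_dx else random.choice((-1, 1))
            # Prefer straight down, then diagonally toward the tilt direction.
            for dx in (0, side, -side):
                nx = x + dx
                if 0 <= nx < width and not grid[y + 1][nx]:
                    grid[y][x], grid[y + 1][nx] = False, True
                    break
    return grid
```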

Self Balancing Robot

Anyone who's seen Boston Dynamics videos cannot help but be awestruck by (and maybe even a little terrified of) the incredibly graceful and lifelike movement exhibited by their robots. Mobile robots such as these are dynamics and controls masterworks. Drawing inspiration from such great feats, we thought it would be appropriate to try to develop a self-balancing mobile robot. Such a robot, we envisioned, would not only balance itself but also navigate based on user commands, all while carrying other objects. This, we figured, could be accomplished with a balance control algorithm running on a Raspberry Pi: the controller would receive orientation data from an accelerometer/gyroscope chip and issue commands to a motor driver, thereby controlling the wheels at the base of the robot. This seemingly straightforward idea would end up becoming a monumental undertaking. In the following page we detail our first ever attempt at creating a self-balancing mobile robot.
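A compact sketch of the kind of balance loop described above: a complementary filter to estimate tilt from the accelerometer/gyroscope, then a PID on the tilt error. The gains, loop rate, and normalization are placeholders, not taken from the team's controller:

```python
import math

DT = 0.01          # assumed 100 Hz control loop
ALPHA = 0.98       # complementary-filter weight on the gyro
KP, KI, KD = 25.0, 0.5, 0.8   # placeholder PID gains

angle = 0.0        # estimated tilt from vertical, in degrees
integral = 0.0
prev_error = 0.0

def balance_step(accel_x, accel_z, gyro_rate_dps):
    """One iteration: fuse IMU readings, return a motor command in [-1, 1]."""
    global angle, integral, prev_error

    # Tilt from the accelerometer alone (noisy but drift-free).
    accel_angle = math.degrees(math.atan2(accel_x, accel_z))
    # Complementary filter: integrate the gyro, correct slowly with the accelerometer.
    angle = ALPHA * (angle + gyro_rate_dps * DT) + (1 - ALPHA) * accel_angle

    # PID on the tilt error (setpoint = 0 degrees, upright).
    error = -angle
    integral += error * DT
    derivative = (error - prev_error) / DT
    prev_error = error

    command = KP * error + KI * integral + KD * derivative
    return max(-1.0, min(1.0, command / 100.0))   # crude normalization to [-1, 1]
```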

Alarm Robot

We introduce you to the Alarm Robot, a cloud-connected system that wakes you up in an entertaining and effective way. Users select alarm music by their favorite artist and set the alarm time on the official website before they go to bed. In the morning the robot wakes up on time, playing music chosen according to real-time weather conditions; with music, everyone can start a brand-new day in a good mood! To turn the alarm off, users have to catch the robot, which moves in random directions, and press the quit button on the touchscreen. The screen also shows the weather, the alarm time, and information about the song being played. After this pleasant stimulation from music and a bit of body movement, you are ready to start the morning fresh.

Tracking Robotic Car

Based on the Raspberry Pi, we created a tracking robot that tracks a falling red balloon and tries to pop it before it lands. To recognize the balloon, we implemented color detection with OpenCV. Precise and stable locomotion of the robot, which includes a camera tilt kit and two DC motors, is controlled via a PID algorithm. In addition, leveraging Python's multiprocessing module dramatically decreases real-time image-processing latency.

Home Intelligent Assistant

This project develops a Home Intelligent Assistant based on the Raspberry Pi that provides residents with security alerts, daily information, and the ability to interact with the system. The result is a reasonably priced (under 100 USD) Home Intelligent Assistant that works in different modes to provide users with home security, voice output, a web login server, and a user-friendly interface.

Space Protector

Most people played arcade games back when home video game consoles were not yet popular, and one of the most popular arcade games was the light gun game: a light gun connected to the arcade machine is used to shoot the planes, UFOs, and zombies on the screen for a higher score. We decided to build this light gun game machine to recreate an arcade-like system with the basic functions of our teenage memories.

Gesture Controlled Smart Mirror

We made a gesture-recognizing mirror using a Raspberry Pi, OpenCV software, and the MagicMirror API. The goal of this project was to extend the smart mirror by adding gesture recognition, with the intention of transitioning between pages in the mirror user interface. This allowed for end-to-end user interaction without relying on any physical touch.

Email Spam Filtering Using Raspberry Pi

Access to the internet and social media has resulted in an exponential increase in digital marketing and targeted advertising. One way targeted advertising reaches users is through e-mail. Most of the time these advertisement e-mails are unsolicited and provide no useful information; they are better classified as spam. E-mail spam filtering is becoming all the more important and relevant in today's digital age, given the massive amount of targeted advertising in place. Even though e-mail spam filtering isn't a new domain per se, of late it is being treated from the perspective of artificial intelligence, particularly natural language processing and machine learning. This project targets the domain of e-mail spam filtering using machine learning. A classifier is trained with a supervised machine learning algorithm, the Support Vector Machine (SVM), to label each e-mail as spam or not spam. Each e-mail is converted to a feature vector. The SVM is trained on the Raspberry Pi and the result is displayed on the PiTFT screen. In addition to showing whether an e-mail is spam, the display gives the user information about potential reasons why it was classified as spam. The training database is a toned-down version of the SpamAssassin Public Corpus; only the body of each e-mail is used.
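A minimal sketch of the training and classification path using scikit-learn; the tiny inline dataset and bag-of-words features stand in for the project's SpamAssassin-based setup, which is not reproduced here:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import LinearSVC

# Tiny stand-in dataset; the project trains on a trimmed SpamAssassin corpus instead.
emails = [
    "Win a free prize now, click here",
    "Limited time offer, buy cheap meds",
    "Meeting moved to 3pm tomorrow",
    "Here are the lab notes from class",
]
labels = [1, 1, 0, 0]   # 1 = spam, 0 = not spam

# Convert each e-mail body into a bag-of-words feature vector.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(emails)

# Train a linear SVM on the feature vectors.
clf = LinearSVC()
clf.fit(X, labels)

def is_spam(body):
    """Classify a new e-mail body as spam (True) or not spam (False)."""
    return bool(clf.predict(vectorizer.transform([body]))[0])

print(is_spam("Click here for a free offer"))   # likely True with this toy data
```

Inspecting the weights of the trained linear model is one way to report which words pushed a message toward the spam label, which is the kind of explanation the display provides.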
