ECE5725 Fall 2020 Projects

Perfect Shot Backboard

Our objective was to create a mechanism that causes you to make a basket whenever you throw a basketball at the hoop. To do this, we used computer vision techniques to process a live feed of a basketball shot and used the Raspberry Pi to move the backboard with a series of filtering and control algorithms. This project involved computer vision, path estimation, dynamics, motor control, and the use of a multi-threaded operating system.
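
As a rough illustration of the path-estimation step, the sketch below fits a parabola to tracked ball positions and predicts where the ball will cross the hoop plane. The tracked points and the hoop coordinate are made up, and the project's actual filtering and control code is not shown.

# Minimal sketch of projectile path estimation (illustrative only; the
# project's actual filters and controllers may differ). Assumes a list of
# tracked (x, y) ball centroids extracted from the camera feed.
import numpy as np

def predict_crossing_height(points, x_hoop):
    """Fit y = a*x^2 + b*x + c to tracked points and evaluate it at the hoop's x."""
    pts = np.asarray(points, dtype=float)
    a, b, c = np.polyfit(pts[:, 0], pts[:, 1], deg=2)   # parabolic (projectile) fit
    return a * x_hoop**2 + b * x_hoop + c

# Example: tracked centroids in image coordinates, hypothetical hoop plane at x=300
track = [(50, 200), (100, 150), (150, 120), (200, 110)]
print(predict_crossing_height(track, x_hoop=300))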

Under Cover System

During the COVID-19 pandemic, it is essential for everyone to follow the prevention guidelines to stop the spread of this horrible virus. Social distancing and face-mask wearing are extremely important, especially in the service industry. However, it is difficult for convenience store owners to keep track of their customers and ensure that every person has a proper mask on. Doing so either requires a lot of staff or creates unsafe conditions, and it can also be financially costly. Therefore, in order to solve this dilemma and create a safe environment for our community, we introduce our project, “Under Cover”.

Can Collector

Our project aims to design a fully automated robot that collects target objects as part of its routine. The robot classifies cans lying on the ground as either beer cans or cola cans. The robot drives along the ground and detects cans; if a can is detected, the robot recognizes it and pushes it to a designated place according to its category. After classification, our algorithm runs a segmentation step that labels all pixels belonging to the can. From the can segmentation output image, we compute the angle at which the robot should push the can. Finally, we design a way to push the can to its destination.
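
A minimal sketch of the angle computation described above, assuming a binary segmentation mask and a fixed robot reference pixel (both are illustrative assumptions rather than the project's actual values):

# Compute a push angle from a binary can-segmentation mask (illustrative).
import numpy as np

def push_angle(mask, robot_px=(160, 239)):
    """Angle (degrees) from the robot's reference pixel to the can centroid."""
    ys, xs = np.nonzero(mask)            # pixels labeled as "can"
    if len(xs) == 0:
        return None                      # no can detected in this frame
    cx, cy = xs.mean(), ys.mean()        # centroid of the segmented can
    dx, dy = cx - robot_px[0], robot_px[1] - cy   # image y grows downward
    return np.degrees(np.arctan2(dy, dx))

# Example on a toy 240x320 mask with a small blob of "can" pixels
mask = np.zeros((240, 320), dtype=np.uint8)
mask[100:110, 200:210] = 1
print(push_angle(mask))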

Stream Sensor Base Station

This project is part of a larger MEng project to design a network of floating sensors to collect data on plastic pollution in streams. All of that data is transmitted to and collected at this base station for easy retrieval. For this purpose, the base station consists of a LoRa radio receiver, which receives wireless data transmissions from the sensors, connected to a Raspberry Pi 4, which handles data storage and retrieval. For testing, I used a dummy sensor with a LoRa transmitter that transmits sample sensor datasets. A database on the Raspberry Pi stores and organizes the received data, and a piTFT touchscreen displays a user interface that makes it easy to navigate the data stored in the database.
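
A minimal sketch of the storage side of such a base station, assuming packets arrive as short comma-separated strings; the packet format and the receive step are placeholders, not the project's actual LoRa driver code.

# Store received sensor packets in a SQLite database on the Pi (illustrative).
import sqlite3, time

db = sqlite3.connect("stream_sensors.db")
db.execute("""CREATE TABLE IF NOT EXISTS readings
              (sensor_id TEXT, timestamp REAL, value REAL)""")

def store(packet: str):
    """Packets assumed to look like 'sensor3,12.7' (format is hypothetical)."""
    sensor_id, value = packet.split(",")
    db.execute("INSERT INTO readings VALUES (?, ?, ?)",
               (sensor_id, time.time(), float(value)))
    db.commit()

store("sensor3,12.7")                       # e.g. a dummy-sensor transmission
print(db.execute("SELECT * FROM readings").fetchall())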

Hydroponics Monitoring System

The Hydroponics Monitoring System (HMS) is an embedded system designed to assist in the monitoring of plant growth. The system uses a Raspberry Pi at its core and features a controllable pump, a pH sensor, and a ppm sensor, which allow the user to easily understand the state of the hydroponics enclosure. The system also features a website that displays the data collected from these sensors along with pictures taken of each plant over time.
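
A minimal sketch of a sensor-polling loop of the kind described, with hypothetical read_ph() and read_ppm() placeholders standing in for the project's actual sensor drivers:

# Periodically sample the sensors and append the readings to a log (illustrative).
import csv, time, random

def read_ph():   return round(random.uniform(5.5, 6.5), 2)   # placeholder driver
def read_ppm():  return random.randint(700, 900)             # placeholder driver

with open("hms_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    for _ in range(3):                       # one sample per loop iteration
        writer.writerow([time.time(), read_ph(), read_ppm()])
        time.sleep(1)                        # a real system would sample less often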

Ball Tracking Robot

The goal is to create a robot with the ability to autonomously track and follow a tennis ball. The user may choose among several robot modes by showing hand gestures for digits 0 to 3 to the camera or by pressing buttons on the touchscreen display. The modes include autonomous ball tracking as well as manually moving the robot forward and backward. Information such as the ball distance and the current mode is shown on the main display.
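
One common way to implement the ball-tracking step is HSV color thresholding with OpenCV; the sketch below is illustrative, and its color bounds would need tuning for real lighting.

# Detect a tennis ball by color and report its center and radius (illustrative).
import cv2
import numpy as np

def find_ball(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (25, 70, 70), (45, 255, 255))   # yellow-green range
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    (x, y), radius = cv2.minEnclosingCircle(max(contours, key=cv2.contourArea))
    return (int(x), int(y), int(radius))     # the center offset can steer the robot

# Example on a synthetic frame containing a yellow-green square
frame = np.zeros((240, 320, 3), dtype=np.uint8)
frame[100:130, 150:180] = (60, 220, 180)     # BGR roughly tennis-ball colored
print(find_ball(frame))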

Pi Car

This project is based on Lab 3: the robot platform developed in that lab is reused with added functionality and components. Externally, image sensors (resolution undetermined as of now) will be placed around the robot, constantly taking images of its surroundings and analyzing them to determine whether the target object is present. One of the sensors will provide a live feed of the robot's view. This project will have two modes: remote-control and autonomous. In remote-control mode, the user has full control of the robot's movements and, while driving the robot, is immediately alerted through the live feed when the object has been identified. Autonomous mode will leave the movement entirely up to the robot (using an algorithm developed by both team members). When the object has been identified, the robot will stop moving, face the direction of the object, and emit a sound indicating that it has found the object. We also plan to add LEDs to the Raspberry Pi to indicate the status of the recovery robot.

Sparky: Robo Companion

Sparky is an intelligent pet-like robot that listens to its owner and can perform unique tricks, in some ways similar to a real trained dog. The basis for this project is a Raspberry Pi 4 combined with a piTFT display, a Raspberry Pi Camera, a microphone, and a motorized chassis. Using OpenCV and the Python face_recognition library, we display on-screen video feedback from the camera with additional image processing to detect faces. A connected microphone allows Sparky to actively record sound and listen for keywords to act upon before performing the desired action; to do so, Sparky uses the Picovoice library along with custom files created with the Picovoice Console. Lastly, the Raspberry Pi 4 is mounted atop an acrylic frame along with a rechargeable battery pack, 4 AA batteries for the two DC motors, the motor controller, and the two wheels used for locomotion.
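
A minimal sketch of the face-detection step using the face_recognition library mentioned above; the capture and drawing details are illustrative.

# Grab one camera frame, find faces, and save an annotated image (illustrative).
import cv2
import face_recognition

cap = cv2.VideoCapture(0)                    # e.g. the Pi camera via V4L2
ret, frame = cap.read()
if ret:
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    # face_locations returns (top, right, bottom, left) boxes
    for top, right, bottom, left in face_recognition.face_locations(rgb):
        cv2.rectangle(frame, (left, top), (right, bottom), (0, 255, 0), 2)
    cv2.imwrite("sparky_view.jpg", frame)
cap.release()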

Hide and Seek

This project takes the traditional game of hide-and-seek at home and merges it with technology and the outdoors. Players need a phone with a cellular hotspot to play. Each player (a hider and a seeker) will use a Raspberry Pi system and a GPS module. The hider will have 60 seconds to hide before both players gain the ability to see how far away the other player is. Each player will randomly receive power-ups during the game that will "zap" and temporarily disable the other player's screen. Both the hider and seeker can move during the "seeking" phase of the game, but the hider must walk.
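
The "how far away" readout can be computed from the two GPS fixes with the haversine formula; the sketch below uses made-up coordinates and is not the project's exact code.

# Great-circle distance between the hider's and seeker's GPS fixes (illustrative).
from math import radians, sin, cos, asin, sqrt

def distance_m(lat1, lon1, lat2, lon2):
    """Distance in meters between two (lat, lon) points via the haversine formula."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))       # Earth radius ~6371 km

print(round(distance_m(42.4440, -76.5019, 42.4450, -76.5000)))  # hider vs. seeker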

Crowd and Cough Detector

The cough and crowd detector for COVID-19 monitoring is an embedded system that tracks the number of people and the number of coughs in a room. This project is inspired by the current pandemic and by FluSense, a device developed by researchers at UMass Amherst that uses cough data to predict trends in infectious respiratory illnesses such as influenza. We believe that a system that can detect coughing and track the number of people in public places will be useful for COVID forecasting and tracking. We envision this system being used in hospital waiting rooms, common gathering spaces in dorms, school/university buildings, etc.

Pi Kart Live

Pi Kart Live is an augmented reality game developed using the Lab 3 robot. We were inspired by the recent release of Nintendo’s Mario Kart Live: Home Circuit, a mixed-reality version of the popular Mario Kart series of video games. Instead of being situated in a purely virtual video game environment, the game involves a physical RC car with an attached first-person-view camera that can be raced around any indoor space. Only a few physical racetrack gates need to be set up; everything else, including the typical interactive Mario Kart items, is rendered onto the live camera feed in its real-world position. Our project emulates most of the features of Mario Kart Live, albeit with fewer resources at our disposal. The Lab 3 robot, called the Pi Kart, is controlled through VNC using the arrow keys on a laptop. We attached a forward-facing camera to Pi Kart, which streams video to the laptop. Our racetrack “gates” consist of printed fiducial markers which encode game objects. When Pi Kart approaches these markers, it detects them and renders the game objects onto the screen; if it touches an object, Pi Kart responds by changing its motion in a predefined fashion.
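
The write-up does not name the fiducial system used; as one plausible sketch, the code below detects ArUco markers with the opencv-contrib aruco module (legacy 4.x API, circa 2020) and maps marker IDs to hypothetical game items.

# Detect fiducial markers in a camera frame and look up their game objects (illustrative).
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def detect_game_objects(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is None:
        return []
    # Hypothetical ID-to-item mapping; the real game would define its own
    items = {0: "mushroom", 1: "banana", 2: "shell"}
    return [(items.get(int(i), "unknown"), c) for i, c in zip(ids.flatten(), corners)]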

Hurdy Gurdy

In this project, the team connected the worlds of classical acoustic instruments and mechatronics to create something new and unique. The instrument in question is a Hurdy-Gurdy, which was common in the medieval and Renaissance eras in Europe. It is similar to a violin in some respects; however, the bow is replaced by a wheel that is traditionally turned with a crank. To bring this instrument into the digital age, we used a Raspberry Pi along with several motors to emulate the cranking of the wheel and the pressing of keys against the strings to change notes. On the software side, MIDI files as well as live input were translated into the mechanical actions necessary to produce the appropriate notes at the appropriate pace. For example, if we received the note A, we would trigger the appropriate linear actuator to sound A on the string. Finally, a UI application was made to configure and run the project.
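
A minimal sketch of the MIDI-to-actuator mapping described above, using the mido library; the key range and the trigger_key() placeholder are illustrative assumptions.

# Translate MIDI note events into key-actuator commands (illustrative).
import mido

KEY_RANGE = range(60, 73)                    # e.g. one octave of keys, C4..C5

def trigger_key(index, on):                  # placeholder for GPIO/actuator code
    print(f"actuator {index} {'press' if on else 'release'}")

def handle(msg):
    if msg.type in ("note_on", "note_off") and msg.note in KEY_RANGE:
        trigger_key(msg.note - KEY_RANGE.start,
                    msg.type == "note_on" and msg.velocity > 0)

# Example: note A4 (MIDI 69) pressed, then released
handle(mido.Message("note_on", note=69, velocity=80))
handle(mido.Message("note_off", note=69))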

Music Motion

The project design consists of three main components: the game GUI, the accelerometer controllers, and the song movements. The game GUI uses Pygame, and the entire project is written in Python. The GUI has a home screen with options to choose a song, start, or quit. The player can navigate the home screen by touching the piTFT screen or using an external mouse. Choosing a song opens a game session displaying the current motion for each controller as well as upcoming motions. The player's score updates based on the type of motion and the player's timing.
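
A minimal sketch of timing-based scoring of the kind described, with illustrative thresholds rather than the project's actual values:

# Score a detected motion by how close it lands to its scheduled beat (illustrative).
def score_motion(detected_t, target_t, matched_type):
    if not matched_type:
        return 0                              # wrong motion earns nothing
    error = abs(detected_t - target_t)        # seconds off the beat
    if error < 0.15:
        return 100                            # "perfect"
    if error < 0.40:
        return 50                             # "good"
    return 0                                  # too early/late

print(score_motion(10.1, 10.0, matched_type=True))   # -> 100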

Covid Access Control System

We designed our project to meet the following criteria: when a visitor comes in front of our system, a face mask must be detected to proceed to the next step. After successful detection of a face mask, the body-temperature module is initialized to make sure the visitor does not have a fever. Afterward, the fingerprint module checks the visitor's identity to confirm that the visitor is a Cornellian, is COVID-negative, and has followed Cornell University's code of conduct. After all of the above checks pass, the door is unlocked and the visitor gains access to the school facilities.
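
A minimal sketch of the sequential gating logic; each check_* function is a hypothetical stand-in for the corresponding hardware or vision module, and the fever threshold is an assumption.

# Run the mask, temperature, and fingerprint checks in order (illustrative).
def check_mask():        return True          # camera-based mask detection (placeholder)
def check_temperature(): return 36.8          # IR temperature sensor, Celsius (placeholder)
def check_fingerprint(): return "cornellian"  # fingerprint module + lookup (placeholder)

def grant_access():
    if not check_mask():
        return False
    if check_temperature() >= 38.0:           # simple fever threshold (assumed)
        return False
    if check_fingerprint() != "cornellian":
        return False
    return True                               # unlock the door

print(grant_access())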

Home Pi Security System

Home Pi, our project, is a security system designed to protect the front door of a home. The system streams live CCTV video and audio from a Pi Camera and a microphone to a web server, which an Android phone can access. It provides fast face recognition and real-time semantic segmentation based on TensorFlow and OpenCV, accelerated by a Coral TPU. An Android application can access the system remotely.

Raspberry Pi Karaoke System

There are around 6,500 languages in the world. Unfortunately, much of today's artificial intelligence technology exists only for major languages such as English, Spanish, and Mandarin Chinese. A major barrier to developing AI technology for all the other languages is the lack of training data. However, data collection is expensive, and we must invent ways to continuously collect and refine training data while minimizing the cognitive load on the people contributing data for AI models. To this end, we propose a Raspberry Pi karaoke system that displays song lyrics on the piTFT screen and records speech input from users listening to the karaoke track through an earphone connected to the Pi. A piTFT screen, an LED panel, a microphone, and an earphone are the primary components of the karaoke system. After a user chooses a song in the GUI displayed on the piTFT, the music video is played on the piTFT and the real-time frequency spectrum of the music playback is displayed on the LED panel. The user listens to the music through the earphone and sings into the microphone. The singing is recorded, and after the song finishes it is scored based on its correlation and consistency with the playback. The voice recording is then uploaded to the cloud along with other user and song metadata, which can be used to further train an automatic speech recognition (ASR) deep learning model in the cloud. Our goal is to create a diverse, high-quality voice-to-text training corpus for ASR by engaging users in a fun task, karaoke singing, in languages that have musical content with corresponding lyrics but immature ASR technology.
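
A rough sketch of one way to score the singing against the playback by correlating amplitude envelopes; the windowing and scoring scale are illustrative, not the project's actual algorithm.

# Score a vocal recording by correlating its envelope with the reference's (illustrative).
import numpy as np

def envelope(signal, win=1024):
    """Crude amplitude envelope: RMS over non-overlapping windows."""
    n = len(signal) // win * win
    return np.sqrt((signal[:n].reshape(-1, win) ** 2).mean(axis=1))

def karaoke_score(recorded, reference):
    r, p = envelope(recorded), envelope(reference)
    m = min(len(r), len(p))
    corr = np.corrcoef(r[:m], p[:m])[0, 1]    # Pearson correlation of envelopes
    return max(0.0, corr) * 100               # map to a 0-100 score

# Example with synthetic audio
t = np.linspace(0, 5, 5 * 22050)
ref = np.sin(2 * np.pi * 2 * t) * np.sin(2 * np.pi * 440 * t)
print(round(karaoke_score(ref + 0.1 * np.random.randn(len(ref)), ref)))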

Polyphonic Video Theremin

The theremin is an icon of musical experimentation, used in avant-garde movie soundtracks, concertos, and even pop music through the ages. However, we feel its potential for wider use is limited by monophonic-only playback, specialized hardware, and poor configurability of control inputs. The goal of this project was to develop a polyphonic video-based system that preserves the theremin’s distinctive mode of interaction while utilizing polyphonic playback, off-the-shelf hardware, and audio control that is user-mappable to the three orthogonal spatial axes. Using simple cameras, speakers, and a Raspberry Pi, we were able to create a working proof-of-concept, paying homage to the legendary instrument while expanding the toolkit available to the theremin player.
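
A minimal sketch of mapping a tracked hand position to audio parameters, with an illustrative axis-to-parameter assignment and made-up ranges:

# Map a normalized (x, y, z) hand position to synth parameters (illustrative).
def map_axes(pos, mapping={"x": "pitch", "y": "volume", "z": "timbre"}):
    """pos values are assumed normalized to [0, 1] by the camera tracking."""
    params = {}
    for axis, value in zip("xyz", pos):
        target = mapping[axis]
        if target == "pitch":
            params["freq_hz"] = 220 * 2 ** (value * 2)   # 2-octave sweep from A3
        elif target == "volume":
            params["gain"] = value
        elif target == "timbre":
            params["harmonics"] = 1 + int(value * 7)
    return params

print(map_axes((0.5, 0.8, 0.2)))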

Never Lost

The idea for the Never Lost project originated because both of us carry around reusable water bottles that we sometimes misplace. We wanted to create a device that would keep track of whether our water bottle, or any other item, was present and, if not, give the location where it was left. After running through multiple ideas, we decided to create a system that would use RFID to tell whether our item was present and GPS to record our location, which could be displayed to us at a later point if our item was missing.
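
A minimal sketch of the presence-check-and-log loop described above; rfid_sees_tag() and read_gps() are hypothetical placeholders for the actual RFID and GPS drivers.

# While the item is present, keep recording the current location (illustrative).
import json, time

def rfid_sees_tag():  return False                       # placeholder driver
def read_gps():       return (42.4534, -76.4735)         # placeholder (lat, lon)

last_known = None
for _ in range(3):                                       # would loop forever in practice
    if rfid_sees_tag():
        last_known = {"time": time.time(), "lat_lon": read_gps()}
        with open("last_seen.json", "w") as f:
            json.dump(last_known, f)                     # shown later if the item goes missing
    time.sleep(1)
print(last_known)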

Aruco Tag Guided Drawing Robot

The objective of this project was to create a mobile robot that draws shapes on paper, with the shapes input by users on a touchscreen. The robot steers using location feedback from an external camera, which uses computer vision to localize the drawing field and the robot in the camera view. The project is mainly based on the autonomous wheeled robot developed in Lab 3, with added drawing functionality. The user provides the input line drawing on the piTFT touchscreen. The pixel coordinates of the drawn line on the TFT are then translated into world coordinates, and the robot moves along the corresponding coordinates, leaving a trace with the marker attached to it. The robot gets feedback about its location from a camera installed overhead, looking down at the robot, and then decides where to go next.
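
A minimal sketch of the screen-to-world translation, assuming the four corners of the drawing field have already been located in the overhead camera image (the corner values below are made up).

# Map piTFT pixel coordinates to drawing-field coordinates via a homography (illustrative).
import cv2
import numpy as np

tft_corners   = np.float32([[0, 0], [320, 0], [320, 240], [0, 240]])
field_corners = np.float32([[60, 40], [580, 55], [570, 420], [70, 430]])  # camera px

H, _ = cv2.findHomography(tft_corners, field_corners)

def tft_to_field(points_tft):
    pts = np.float32(points_tft).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)   # waypoints for the robot

print(tft_to_field([[160, 120], [200, 120]]))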

 
