Our objective is to design a fully functional cooking robot that automatically dispenses the ingredients needed for a user-input recipe. This design saves users time when cooking and guarantees the correct composition of the dispensed mix of ingredients. Our project focuses on both the functionality and the endurance of the machine, aiming to approximate a real-life appliance.
Our final project is to design and build a health monitor that measures blood pressure, heart rate, and body temperature through an inflatable cuff, and then displays the results on a computer screen. The device consists of four major parts: the hardware (computer monitor, cuff, Tygon tubing, and 3-way splitter), the analog circuit, the motor and valve control circuit, and a Raspberry Pi 3. The goal is to integrate all the circuit elements so that all the health indicators can be measured through the cuff.
The goal of this project is to build a robot that dances to the rhythm of music while displaying a flowing visualization of its waveform. Using a Raspberry Pi as the controller, we designed and built a six-level LED strip panel of 120 LEDs that shows a dynamic, moving music waveform. The LEDs display the waveform in different colors depending on the frequency spectrum. At the same time, the frequency and amplitude of the music decide what movements the robot dances, and how fast.
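The write-up does not include the color-mapping code; as a rough Python sketch (the band split and the blue-to-red blend below are our assumptions, not the project's actual values), the magnitude spectrum of each audio frame could be divided into six bands, one per LED level, and each band's energy mapped to a color:

```python
import numpy as np

def spectrum_bands(samples, n_bands=6):
    """Split the magnitude spectrum of an audio frame into n_bands
    equal-width frequency bands and return each band's mean magnitude.
    Six bands match the six levels of the LED panel."""
    mags = np.abs(np.fft.rfft(samples))
    return [float(b.mean()) for b in np.array_split(mags, n_bands)]

def band_to_color(level, max_level):
    """Map a band's energy to an RGB color: quiet bands render blue,
    loud bands shift toward red (a simple linear blend)."""
    t = min(level / max_level, 1.0) if max_level > 0 else 0.0
    return (int(255 * t), 0, int(255 * (1 - t)))
```

On a pure 440 Hz tone sampled at 8 kHz, nearly all the energy falls into the lowest of the six bands, so that LED level would light up most strongly.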
Near space photography is the practice of taking photos from the upper atmosphere, capturing images of the Earth’s surface and horizon from a very different perspective than surface photography. To do this, a system must be created that can rise to the upper atmosphere while simultaneously capturing these images and reporting its location. The simplest approach is an untethered system, lifted by a weather balloon, that uses radio frequencies to automatically send packets to a ground station network detailing the system’s location as measured by a GPS attachment. Using this system, pictures and position packets can be sent at configurable intervals so the system can be successfully retrieved after landing. Our solution uses a Raspberry Pi as the main embedded system, coupled with a Raspberry Pi camera for pictures, an Adafruit Ultimate GPS Breakout for position acquisition, a Baofeng UV-5R transceiver for position packet transmission, and an external battery for power.
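The Adafruit GPS reports positions as NMEA sentences over serial; a minimal parser for the $GPGGA sentence (a sketch written for illustration, not the project's actual code) converts the ddmm.mmmm fields into the decimal degrees needed for position packets:

```python
def parse_gga(sentence):
    """Parse a NMEA $GPGGA sentence into (lat, lon) in decimal degrees.
    Returns None when there is no fix. NMEA encodes latitude as
    ddmm.mmmm and longitude as dddmm.mmmm; checksum validation is
    omitted for brevity."""
    fields = sentence.split(",")
    if fields[0] != "$GPGGA" or fields[6] == "0":
        return None
    lat = int(fields[2][:2]) + float(fields[2][2:]) / 60.0
    if fields[3] == "S":
        lat = -lat
    lon = int(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[5] == "W":
        lon = -lon
    return (lat, lon)
```

In practice the parsed fix would be formatted into the radio packet along with a timestamp before transmission.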
In this project we built a magic wand game system based on the Raspberry Pi. First, the user is instructed to deliver a spell, either “Wingardium Leviosa”, “Stupefy”, or “Lumos”, and to wave the wand in a certain pattern: “down-left”, “up-left”, or “up-right”. If both the spell and the wand gesture are performed correctly, a video of the corresponding movie scene, in which the leading character casts the spell, is played on the piTFT screen.
We’ve all been there. That moment when the alarm clock rings in the morning, it feels like you’re getting hit on the head with a jackhammer, and you feel the irresistible urge to hit that snooze button for the sweetest 10 minutes of extra sleep, which often turns into 1 or 2 hours… Because this problem is so common and relevant to today’s college students, especially at Cornell (where the weather isn’t usually great), we wanted to create a gadget that solves it. We present to you our Anti-Snooze Alarm Clock: our solution to one of the many problems of the modern-day college student. The Anti-Snooze Alarm Clock is essentially a mobile autonomous robot with the basic functionality of an alarm clock. The system consists of a Raspberry Pi 3 (the central controller), a piTFT screen (for the user interface), motor controls (for the robot’s movement), a speaker (to annoy the sleeper), an RTC (Real-Time Clock) module (to keep track of time offline), a PIR motion sensor (so the robot can recognize that the sleeper is reaching for it and run away), and an IR sensor (so the robot can detect objects in its path and avoid collisions).
We built a bare-metal CNC controller capable of reading in a GCode file, interpolating motor positions, and driving the motors to move a spindle to those positions in a closed feedback loop. To accomplish this in a bare-metal environment, the Raspberry Pi had to be configured to set up the processor peripherals we needed: timers, interrupts, the UART controller, GPIOs, etc. Once configured, we were able to start building out modules to handle path planning and stepper motor control.
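The actual controller runs bare-metal; purely to illustrate the interpolation step, here is a Python sketch of a DDA-style linear move that turns a straight GCode segment into evenly distributed cumulative step positions on each axis (the function name and units are ours):

```python
def interpolate_line(start, end, steps_per_unit):
    """Convert a straight move from start to end (machine units, XY) into
    a list of cumulative per-axis step positions relative to the start,
    spread so both axes arrive at the endpoint together."""
    dx = (end[0] - start[0]) * steps_per_unit
    dy = (end[1] - start[1]) * steps_per_unit
    n = max(abs(round(dx)), abs(round(dy)), 1)  # steps on the major axis
    points = []
    for i in range(1, n + 1):
        points.append((round(dx * i / n), round(dy * i / n)))
    return points
```

Each consecutive pair of positions differs by at most one step per axis, which is what lets a stepper driver execute the move as a stream of single pulses.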
Our project is based on the popular Magic Mirror project available online. We decided to improve this product by adding user interaction. With the incorporation of IR break-beam sensors, users can now interface with the mirror and control its displays. The IR sensors elevate the mirror to a fully-fledged product that users can interact with on a daily basis.
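On the Pi, one would typically register RPi.GPIO edge callbacks for the break-beam inputs; the interaction logic itself reduces to detecting falling edges in the sensor readings, sketched here in plain Python (the function and its True/False convention are our illustration, not the project's code):

```python
def beam_events(readings):
    """Turn a sequence of break-beam readings (True = beam intact,
    False = beam broken by a hand) into the indices where a break
    begins. Each falling edge is one user gesture at the mirror."""
    events = []
    prev = True
    for i, intact in enumerate(readings):
        if prev and not intact:
            events.append(i)
        prev = intact
    return events
```

With two sensors, comparing which one fires first gives a left or right swipe, which can then switch the mirror's display modules.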
For our project, we made a robot that can acquire targets of different colors, drive towards them, and then return to its start position. While driving towards a target, the robot corrects its course based on the position of the target’s centroid. We used the PiCamera to capture images, OpenCV to process them, the microphone on a Logitech webcam to listen for speech commands, and the Google Speech Recognition API for speech-to-text conversion.
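In OpenCV the centroid would come from cv2.moments on a color-thresholded mask; the steering idea can be sketched with plain NumPy, returning a normalized left/right correction (the normalization to [-1, 1] is our choice, not necessarily the project's):

```python
import numpy as np

def steering_correction(mask, frame_width):
    """Given a binary mask of the detected target (1 = target pixel),
    return the horizontal offset of its centroid from the image center,
    normalized to [-1, 1]: negative steers left, positive steers right.
    Returns None when no target pixels are present."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    cx = xs.mean()
    return float((cx - frame_width / 2) / (frame_width / 2))
```

The sign and magnitude of the correction can be fed directly into a differential drive as a speed offset between the two wheels.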
Many homes have plants, all of which need some sort of scheduled watering. A sprinkler or even a drip irrigation system could work, but could not really handle a diverse set of plants. These systems are usually too expensive to implement around the house and are generally more appropriate outdoors, where stray water is absorbed into the surrounding soil. We designed a robot to address these issues by watering specific targets at designated times. The robot monitors the soil and provides water when the moisture level drops too low. It can handle multiple plants spread around the robot.
The goal of the RPiMapper project was to build a mobile robotics platform for autonomous mapping of a static environment using multiple ultrasonic range sensors. Environmental mapping was done using an occupancy grid method based on the algorithm detailed in Probabilistic Robotics. Localization was done by tracking the rotation of the wheel spokes with limited success. Future iterations should focus on improving the method of localization.
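An occupancy grid of this kind typically stores a log-odds value per cell, adding a positive increment when a sonar return falls in the cell and a negative one when the beam passes through it. A minimal sketch of that update (the increment values are illustrative, not the project's tuned constants):

```python
import math

def update_cell(log_odds, hit, l_occ=0.85, l_free=-0.4):
    """One log-odds occupancy update for a grid cell: add l_occ when the
    sonar reports a hit in the cell, l_free when the beam passed through
    it unobstructed."""
    return log_odds + (l_occ if hit else l_free)

def occupancy_prob(log_odds):
    """Convert a cell's log-odds back to an occupancy probability."""
    return 1.0 / (1.0 + math.exp(-log_odds))
```

Starting every cell at log-odds 0 (probability 0.5, i.e. unknown), repeated hits push the probability toward 1 and repeated pass-throughs push it toward 0, which is how the grid converges on a map of the static environment.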
In this project, a cleaning robot butler was successfully built, inspired by Rosie the robot maid from the popular children’s cartoon ‘The Jetsons’. The robot has two modes: manual and autonomous. In the manual mode, the robot is controlled by a Sony PlayStation 3 controller connected via Bluetooth; in the autonomous cleaning mode, the robot uses a Pi Camera to see its environment. The robot observes any object that falls in its line of vision, guides itself to it, picks it up, and carries it to a designated trash area. The project was developed on a Raspberry Pi; the software was written in Python, using the OpenCV module and the picamera module with the Raspberry Pi camera to achieve autonomous control.
Machines such as turbines, pumps, and compressors degrade over time, and many of their faults can be diagnosed with vibration analysis techniques such as the Fast Fourier Transform (FFT). This project aimed to create a Raspberry Pi-based portable system that diagnoses machine health issues using mechanical vibration analysis.
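The core FFT step can be sketched in a few lines of NumPy: take the magnitude spectrum of a vibration window and report the dominant frequency, which a diagnosis would then compare against known fault frequencies such as the shaft's 1x and 2x harmonics (the helper below is our illustration, not the project's code):

```python
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Return the frequency (Hz) of the largest peak in the magnitude
    spectrum of a vibration window, ignoring the DC component."""
    mags = np.abs(np.fft.rfft(signal))
    mags[0] = 0.0  # drop DC so a constant offset never wins
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return float(freqs[np.argmax(mags)])
```

For example, a 50 Hz shaft vibration with a weaker 120 Hz component yields 50 Hz as the dominant peak; an unexpected peak at twice the running speed would instead suggest misalignment.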
Our project uses a Raspberry Pi 3 to build a webcam server that lets users remotely monitor their homes over the internet. If a human is detected, the system uploads the captured images to Dropbox and sends an email notification to warn the user of potential intruders. One use of this project is to provide a home security system for homeowners while they are physically absent. The hardware consists of a Raspberry Pi 3, a Pi Cam, and a pan-tilt kit with two servos that provides 180 degrees of up/down and left/right rotation. Our final software builds on existing OpenCV libraries, experimenting with various combinations of motion, face, and human-body detection to find the best trade-off between performance and accuracy for a practical security system.
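One of the simplest motion detectors such a system can build on is frame differencing: flag motion when enough pixels change between consecutive grayscale frames. A sketch with illustrative thresholds (the project ultimately tuned OpenCV's detectors rather than using exactly this):

```python
import numpy as np

def motion_detected(prev_frame, frame, pixel_thresh=25, area_frac=0.01):
    """Flag motion when the fraction of pixels whose grayscale value
    changed by more than pixel_thresh exceeds area_frac of the frame.
    Both thresholds are illustrative and would need tuning."""
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    changed = (diff > pixel_thresh).mean()
    return changed > area_frac
```

A cheap check like this can gate the more expensive face and body detectors, so the heavier OpenCV passes only run on frames where something actually moved.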
Have you ever wondered how to get your little robot to observe and detect “danger” in its surroundings while it moves? Our final project focused on designing a fully functioning robot with a ball-launching mechanism, using some of OpenCV’s fantastic tools to help the robot identify a pre-defined target along its way. Besides identifying targets automatically, the robot also has a complete and friendly user interface that lets users operate it as they see fit.
In this project, we built a music player and four-track recorder that runs entirely on the Raspberry Pi. Our software lets users play a piano keyboard or a drumpad set on a touchscreen interface, with the option of recording up to four tracks in either instrument mode. Any combination of recorded tracks can be played back simultaneously.
Wearable technology has become a recent trend pursued by major corporations such as Apple, Fitbit, Google, and Garmin (to name a few). In particular, the market for smartwatches has grown over the past four to five years; however, no smartwatch to date has taken advantage of the Raspberry Pi. The purpose of this project was to create such a watch, appropriately called the PiWatch, using the capabilities of the Raspberry Pi (especially the PyGame library) and a TFT touchscreen. Like most smartwatches on the market today, the PiWatch was designed with a passcode-enabled lock screen and a home screen showcasing all the apps developed for this project.
The goal of this project was to design a human-machine interactive system implementing a small game called “Three in a Row”. The rules are simple: the two players (here, a human and the robot) take turns placing a piece on an empty cell of the 3×3 game board, and whoever first gets three pieces in a row wins. In the end, we successfully built a robot that recognizes the state of the game board and uses a robotic arm to grab and place game pieces onto it.
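The board logic itself is compact; a win check over the rows, columns, and diagonals of the 3×3 board might look like this sketch (the list-of-rows board representation is our assumption):

```python
def winner(board):
    """Return 'X' or 'O' if that player has three in a row on the 3x3
    board (a list of 3 rows holding 'X', 'O', or None), else None."""
    lines = [list(row) for row in board]                           # rows
    lines += [[board[r][c] for r in range(3)] for c in range(3)]   # columns
    lines.append([board[i][i] for i in range(3)])                  # diagonal
    lines.append([board[i][2 - i] for i in range(3)])              # anti-diagonal
    for line in lines:
        if line[0] is not None and line.count(line[0]) == 3:
            return line[0]
    return None
```

In the full system, the vision stage fills this board from the camera image and the arm plays a move whenever `winner` still returns None.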
We began this project by exploring ways to allow a user to paint in a virtual environment and produce an image via a computer-aided system. We came across an old plotter and worked to modify the device to meet our needs. Our final system involved the use of a wireless ‘canvas’ – a touch screen onto which a user draws an image – and the plotter system which uses these movements to produce a scaled-up physical rendering of the image drawn. The final prototype involves two Raspberry Pis: one which sits on the plotter and controls the motors, and a second handheld Pi with a touchscreen attached.
Autonomous driving has become a major trend in the automotive industry, and some modern cars already integrate lane-keeping assist into their cruise control. Our final design project expands on this with a more advanced lane-keeping and lane-changing assist system. To simulate a real-life highway situation, we designed a robot car based on the Raspberry Pi that performs automatic lane changing, safe distance keeping, and lane keeping. The algorithm is implemented in Python, and the robot uses ultrasonic distance sensors and infrared line followers.
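Safe distance keeping from an ultrasonic sensor reduces to a feedback controller on the measured gap; a minimal proportional sketch (the gain, target distance, and clamping are illustrative values, not the project's):

```python
def speed_command(distance_cm, target_cm=30.0, gain=0.02, max_speed=1.0):
    """Proportional controller for following the car ahead: slow down
    (or back off) when closer than target_cm, speed up when farther,
    clamped to [-max_speed, max_speed]."""
    error = distance_cm - target_cm
    return max(-max_speed, min(max_speed, gain * error))
```

The same loop structure extends to lane keeping by replacing the distance error with the line followers' lateral offset.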
Objective: to create a wildlife tag for the pangolin, an endangered scaly mammal. This project describes an initial prototype of a novel data collection method that uses an accelerometer to observe the movements, behaviors, and activities of the pangolin. The resulting activity-level data can be used in a myriad of studies. The project is therefore named the Pangorometer, a Pangolin Accelerometer.
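A common way to turn raw accelerometer samples into an activity level is to measure how far the acceleration magnitude deviates from 1 g over a window: near zero when the animal is at rest, larger when it moves. A NumPy sketch of that summary (our illustration, not the tag's actual firmware):

```python
import numpy as np

def activity_level(accel, gravity=1.0):
    """Summarize a window of 3-axis accelerometer samples (shape (N, 3),
    in units of g) as the mean absolute deviation of the acceleration
    magnitude from gravity."""
    mags = np.linalg.norm(accel, axis=1)
    return float(np.mean(np.abs(mags - gravity)))
```

Logging one such number per window keeps the stored data tiny compared to raw samples, which matters for a battery-powered tag carried by the animal.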
The smart music player knows exactly what the user wants to hear by taking in weather and ambient noise information and generating a custom playlist of songs played directly from Spotify. Our system uses the Raspberry Pi to run a music selection algorithm that parses different attributes of songs and sorts them based on weather information (e.g. if it is rainy, relaxing music is played). The Pi then sends the song ID to the user’s personal computer, which plays it directly. The weather is also displayed on an LED matrix so that the user can see what information the algorithm is using to sort the songs. Pushbuttons let the user skip and “dislike” songs. A USB microphone lets the system measure the ambient noise of the environment and potentially add that information to the algorithm. The smart music player reads the environment and interacts with user devices to introduce cool new songs to the user.
In a $15.3B deal in early 2017, Intel acquired Mobileye, a computer vision company that develops technology for self-driving cars. Among other functions, Mobileye’s technology can detect lane departure in a moving vehicle and is used by virtually all car manufacturers, including Tesla and BMW. Our interest in this feature and in driving technology led us to develop an inexpensive, Raspberry Pi-based dashcam with a lane departure warning system. Lane departure is the leading cause of fatal crashes; inexpensive, widely available lane departure warning systems could save many lives.
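Once lane lines are detected (e.g. with OpenCV's Hough transform), the warning itself reduces to comparing the lane center against the image center; a sketch of that final check (the threshold and the centerline camera-mounting assumption are ours):

```python
def lane_departure(left_x, right_x, frame_width, warn_frac=0.2):
    """Given the detected x positions (pixels) of the left and right
    lane lines at the bottom of the frame, warn when the lane center
    drifts from the image center by more than warn_frac of half the
    lane width. Assumes the camera sits on the vehicle's centerline."""
    lane_center = (left_x + right_x) / 2.0
    offset = lane_center - frame_width / 2.0
    half_width = (right_x - left_x) / 2.0
    return abs(offset) > warn_frac * half_width
```

Scaling the threshold by the lane width keeps the warning consistent whether the lane appears wide (camera low, lane close) or narrow in the frame.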