Playing an instrument is a hobby that people around the world can appreciate. It is not, however, easy to learn, and practicing can sometimes be tedious. What if there were a way to enjoy the sounds of live music while only having to supply 12 V and 5 V? BassBot is an important step toward that future.
The purpose of this project is to build a guitar effects pedal that lets a user record a clip of audio, choose from a list of effects, and set parameters for that effect before it is applied to the sampled clip. After the effect has been applied, the user can play back both the original and modified versions of the audio clip. Once the sample finishes playing, it loops over and over in the same vein as a looping backing track or a backup rhythm guitarist. When the user is finished, they can apply new effects to the clip or record a new audio sample.
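As a rough illustration of the effect-then-loop pipeline described above (the function names and parameters here are our own illustrative sketch, not the project's actual code), a simple delay effect followed by looped playback might look like:

```python
import numpy as np

def apply_echo(samples, sample_rate=44100, delay_s=0.25, decay=0.5):
    """Mix a delayed, attenuated copy of the clip back onto itself."""
    delay = int(sample_rate * delay_s)
    out = np.array(samples, dtype=np.float64)
    out[delay:] += decay * out[:-delay]
    # Normalize so the mixed signal cannot clip.
    peak = np.max(np.abs(out))
    if peak > 0:
        out /= peak
    return out

def loop_clip(samples, repeats):
    """Tile the finished clip so it plays back-to-back like a backing track."""
    return np.tile(samples, repeats)
```

In practice the looped buffer would be handed to an audio output library rather than tiled in memory, but the data flow is the same.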
Robotic swarm systems are becoming increasingly popular for accomplishing sophisticated tasks. One area where robotic swarm systems seem to have potential is construction. This project uses Raspberry Pis to control a network of Cozmo robots that build a structure out of blocks. This was accomplished by writing a Python script to control the rovers via the Cozmo SDK, and a controller script to manage the race conditions inherent in group construction (which rover picks up which cube, and where cubes may be placed). Communication between the rovers and the controller is managed by the Linux operating system, which provides a robust and reliable means of asynchronous data transfer between Raspberry Pis. Our system design allows a variable number of cubes and rovers to be used.
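The race condition over cube ownership can be resolved with a lock-protected allocator on the controller side. This is a minimal sketch under our own naming (the class and methods are hypothetical, not the project's actual controller script):

```python
import threading

class CubeAllocator:
    """Hand out cubes to rovers so no two rovers ever chase the same cube."""

    def __init__(self, cube_ids):
        self._free = set(cube_ids)
        self._lock = threading.Lock()

    def claim(self, rover_id):
        """Atomically reserve an arbitrary free cube; None if none remain."""
        with self._lock:
            if not self._free:
                return None
            return self._free.pop()

    def release(self, cube_id):
        """Return a cube to the pool (e.g. after it has been placed)."""
        with self._lock:
            self._free.add(cube_id)
```

Because every claim and release happens under one lock, two rovers asking simultaneously can never receive the same cube.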
Honey bees are the premier agricultural pollinator, bringing in more than $150B annually. They are capable of robust, sustained operation in unpredictable environments far beyond what is possible with state-of-the-art robots, and many fields of swarm research look to them for design guidance. Today they are threatened by a myriad of causes, some still not fully understood. It is crucial, therefore, to gather as much information about them as possible, though this is hard to accomplish given that their natural environment is very hostile to the usual observational equipment. That’s why we’ve become the Big Brother of the bee world. Our system tags and tracks bees as they leave and enter the hive, requiring very little human intervention after set-up. It also detects pollen and is designed so that another user can easily adjust threshold values at will.
The number of electronic devices in every home is increasing by the year, and each of these devices has its own remote: smart TVs, refrigerators, air conditioners, heaters, doors, showers, and more. We want to introduce an infrastructure with one remote for all these appliances, where the control options change based on context. The system displays controls only for the devices you are in range of, and unlike a universal remote, where the user has to remember which button performs which function on a specific device, this system has a touch display that clearly labels each button's function in the new context.
The Laser Scanner is a Raspberry Pi embedded system that digitizes objects into .obj mesh files for reproduction via 3D printing. The device does this by using a line laser and the PiCam to perform computer vision. The laser is positioned 45 degrees off the camera's axis and projects a bright red line onto one vertical slice of the object. The camera measures the line's offset from center to recover that slice's depth profile. The object is spun on a rotating tray and the process is repeated until the full object has been scanned. The generated .obj file is finally emailed to the user.
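The triangulation step above can be sketched in a few lines. This is a simplified model under our own assumptions (idealized pinhole geometry, a known pixel-to-millimeter scale, and hypothetical function names), not the project's actual code:

```python
import math

def slice_depths(pixel_offsets, px_to_mm, laser_angle_deg=45.0):
    """Convert detected laser-line pixel offsets into depths for one slice.

    With the line laser mounted at laser_angle_deg from the camera axis,
    a horizontal image shift of x mm corresponds to a depth change of
    x / tan(angle) mm; at 45 degrees the two are equal.
    """
    t = math.tan(math.radians(laser_angle_deg))
    return [(off * px_to_mm) / t for off in pixel_offsets]

def place_slice(depths, tray_angle_deg, z_step_mm=1.0):
    """Rotate one vertical slice of radii into 3-D at the tray's angle."""
    a = math.radians(tray_angle_deg)
    return [(r * math.cos(a), r * math.sin(a), i * z_step_mm)
            for i, r in enumerate(depths)]
```

Repeating `place_slice` for each tray angle yields the point cloud from which the .obj mesh is built.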
To complete the water irrigation system, we worked with students in the chemical engineering graduate program who had been developing the irrigation model and testing it against a specific set of weather/pressure data. Our main task was to measure and deliver exactly the amount of water reported by their algorithm. To do this, we used a Raspberry Pi, a solenoid valve, a flow meter, a bucket, plastic tubing, and waterproofing materials (silicone caulk and epoxy).
Air Canvas is a hands-free digital drawing canvas that uses a Raspberry Pi, a PiCamera, and OpenCV to recognize hand gestures and map them onto a PiTFT screen. The user’s “brush” can be changed in size and color via built-in buttons. After a calibration screen measures and records the color of the user’s hand, open-source OpenCV code tracks the pointer finger, and Pygame maps it onto the screen to steer the brush. The idea for Air Canvas grew out of our interest in digital drawing and smart photo recognition software.
In this project, we set out to transform a traditional remote control car into a semi-autonomous, self-parallel-parking vehicle. With a myriad of sensors, including a camera, proximity sensors, and line detectors, we aimed to let the car drive around a predetermined track while watching for obstacles and pedestrians that might run in front of the vehicle. The Raspberry Pi serves as the brain of the car, processing sensor inputs and sending PWM drive signals to our motor driver. With 1 GB of RAM and a four-core 1.2 GHz ARM processor, the Pi was the perfect candidate for processing all available information. The system is built on a multithreaded architecture, with all modules written in Python. A modular codebase gave us a clear system architecture and made fixes and improvements easy. It also allows concurrent programs, each signalling back to a single drive thread that continually reacts to variables set in the global space. This polling-reactive design allows quick response times and efficient data transfer.
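The polling-reactive pattern described above, with sensor threads writing shared variables and one drive thread reading them, can be sketched as follows (variable and function names are our own illustration, not the project's code):

```python
import threading

# Shared state written by sensor threads, read by the drive thread.
state = {"obstacle": False, "steer": 0.0}
state_lock = threading.Lock()

def sensor_update(**kwargs):
    """Called from any sensor thread; writes into the shared space."""
    with state_lock:
        state.update(kwargs)

def drive_step():
    """One iteration of the drive thread's polling loop: read the shared
    variables atomically, then decide on a motor command."""
    with state_lock:
        obstacle, steer = state["obstacle"], state["steer"]
    return "STOP" if obstacle else "STEER %+.2f" % steer
```

The real drive thread would call `drive_step` at a fixed rate and convert the returned command into PWM signals for the motor driver.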
In this project, we used a Raspberry Pi to process images and control servos, creating a small robot that mimics a person’s movements. The Raspberry Pi received images as input from a PiCam and processed them in real time. We used OpenCV to identify the person and find their chest, elbow, and arm points. These points were then used to find the angle between the upper arm and forearm as well as the angle of the upper arm at the shoulder. We mapped each angle to a PWM pulse width to control each of our four servos so that the robot copies the movements of the person.
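The two geometric steps above, computing a joint angle from three tracked points and mapping that angle to a servo pulse width, can be sketched like this (the function names and the 1 to 2 ms pulse range are illustrative assumptions; real servos need per-unit calibration):

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at point b formed by points a-b-c
    (e.g. shoulder-elbow-wrist for the elbow angle)."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

def angle_to_pulse_ms(angle_deg, min_ms=1.0, max_ms=2.0, max_angle=180.0):
    """Linearly map a joint angle (0..max_angle deg) to a servo pulse width.
    Typical hobby servos sweep their range for pulses of roughly 1-2 ms."""
    angle_deg = max(0.0, min(max_angle, angle_deg))
    return min_ms + (angle_deg / max_angle) * (max_ms - min_ms)
```

On the Pi, the returned pulse width would set the duty cycle of the hardware PWM channel driving each servo.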
In this project, we designed a security box capable of facial recognition, voice recognition, and NFC detection. When something is detected by the PIR (passive infrared sensor), the alarm goes off and deactivates only when the two-factor authentication is satisfied. In addition, we set up the security box to send SMS notifications whenever the PIR detected something, the box was unlocked, or the box was relocked.
Coopy is a Bluetooth-based system that links a robot to a remote control base station. The robot implements facial detection in order to properly orient itself with people. The images captured by the robot are then wirelessly streamed to the base station for viewing. The base station has a touchscreen and can be used to interact with the robot’s camera feed and driving system. From the base station, Coopy’s new friend has the option to view the video stream, initiate the friendship wobble, or steer Coopy around with a remote control.
We created an embedded system to control the playback of music streaming from Spotify. Our system is able to stream music from any Spotify album and play it through our speakers. Various aspects of the streaming can be controlled through the PiTFT touchscreen and a microphone, such as playing/pausing, adjusting the volume, and switching to the next/previous tracks in the album. Having two methods of control adds to the capabilities of the system and gives the user more flexibility.
The Multiplayer Game Pi is a single-player or two-player gaming console built on a Raspberry Pi using RetroPie, software that combines Raspbian, EmulationStation, and more to run game emulators on the Pi. Featuring two handmade wireless Bluetooth controllers and convenient portability, the console has its own small screen for on-the-go use but can also connect to a larger external monitor via HDMI.
The automated foosball goalie allows one to play foosball without another human player. The goalie is attached to a servo and moves on a linear bearing according to the direction of the incoming ball. A camera is attached to the goalie and uses computer vision to determine the location of the ball. More specifically, color detection is implemented, allowing the goalie to move left or right depending on the location of the ball. The goalie stops the ball by positioning itself directly in front of the approaching ball.
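The color-detection step can be illustrated with an HSV threshold and centroid, here written with NumPy for clarity (on the Pi this is typically done with OpenCV's `cv2.inRange` and `cv2.moments`; the thresholds and function names below are illustrative assumptions):

```python
import numpy as np

def ball_x_position(hsv_frame, h_lo=0, h_hi=10, s_min=120, v_min=80):
    """Mean image column of pixels matching the ball's HSV color range,
    or None when no matching pixels are found."""
    h, s, v = hsv_frame[..., 0], hsv_frame[..., 1], hsv_frame[..., 2]
    mask = (h >= h_lo) & (h <= h_hi) & (s >= s_min) & (v >= v_min)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean())

def goalie_command(ball_x, goalie_x, deadband=5.0):
    """Slide toward the ball's column; hold position inside the deadband."""
    if ball_x is None or abs(ball_x - goalie_x) <= deadband:
        return "HOLD"
    return "LEFT" if ball_x < goalie_x else "RIGHT"
```

Each camera frame yields one command, which the servo translates into motion along the linear bearing.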
Current GPS solutions require users to watch a screen, distracting them from the constantly changing situation on the road. Most common GPS systems provide audio directions to announce upcoming turns, but a bicycle rider using such a system is forced to wear earphones, which makes riding on the road unsafe. Here we propose an interactive system consisting of two wristbands that inform the rider about upcoming turns. Once the Raspberry Pi knows a turn is coming, it informs the user by vibrating the wristbands. The increasing frequency of the vibration tells the user that the turn is getting closer and closer. The right wristband signals right turns, and the left wristband signals left turns.
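The distance-to-vibration mapping could be as simple as a linear interpolation between a slow and a fast pulse rate. This is a hypothetical sketch (the 200 m range and 0.2 to 2 s pulse intervals are our own placeholder values):

```python
def vibration_period_s(distance_m, max_dist=200.0,
                       min_period=0.2, max_period=2.0):
    """Interval between vibration pulses: shorter (faster buzzing)
    as the rider approaches the turn."""
    d = max(0.0, min(max_dist, distance_m))
    return min_period + (d / max_dist) * (max_period - min_period)

def pick_wristband(turn_direction):
    """Route the cue to the wrist matching the turn direction."""
    return {"left": "LEFT_BAND", "right": "RIGHT_BAND"}[turn_direction]
```

The Pi would re-evaluate `vibration_period_s` as the GPS distance to the turn shrinks, pulsing the chosen band's vibration motor at the resulting rate.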
Previous projects that have implemented a gesture-based interface use smart gloves and colored tape to make the fingers easier to track. We wanted to realize gesture-based control simply, and in the most organic way possible. For us, this meant tracking the bare hand with no additional hardware. We wanted a user experience similar to the Leap Motion Controller, the current industry standard in gesture control. One way our system differs from the Leap Motion Controller is that none of the intensive processing is done on that device; rather, it is consigned to a computer whose minimum specifications are consistent with the performance of an Intel i3 processor. We wanted to do all the processing in situ. We used the simplest setup we could think of – just a single Raspberry Pi and camera module – and saw where that took us. We added a PiTFT to act as a convenient user interface, and to demonstrate the computational ability of the Raspberry Pi by showing the output of the realtime image processing.
Our project objective was to create a loop recorder using the Raspberry Pi. As a baseline we aimed to make the loop recorder capable of taking input from MIDI keyboards while having support for multiple MIDI tracks and instruments. Our further goals included expanding the functionality to take vocal and electric guitar input.
One of the struggles of an amateur musician is the inability to detect nuances and identify where a performance falls flat. We help pianists document and visualize where they could use improvement. From the Pi, the student can record a performance, process the recording, and display the results against a professional recording. This lets the musician see gaps in their performance: whether they rushed a certain portion, or missed the expressivity and nuances of a musical phrase. This project is an example of an embedded system because it requires external hardware and makes extensive use of the Raspberry Pi’s Linux kernel.
We process the signal received from a USB microphone, filter out unnecessary noise and smooth amplitude over time, and use an FFT to identify the notes being played, adjusting for harmonics and interval size. The results are displayed in a simple, user-friendly interface without requiring excessive processing time or memory usage.
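The core FFT step, finding the strongest frequency in a window of samples and naming the nearest equal-tempered note, can be sketched as follows (a simplified single-peak version under our own naming; the project's harmonic and interval adjustments are omitted):

```python
import numpy as np

NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def dominant_freq(samples, sample_rate):
    """Frequency (Hz) of the strongest FFT bin, excluding DC."""
    spectrum = np.abs(np.fft.rfft(samples))
    spectrum[0] = 0.0  # ignore the DC offset
    k = int(np.argmax(spectrum))
    return k * sample_rate / len(samples)

def freq_to_note(freq, a4=440.0):
    """Name of the nearest equal-tempered note for a frequency."""
    n = int(round(12 * np.log2(freq / a4)))  # semitones from A4
    return NOTES[(n + 9) % 12]               # A sits at index 9
```

A real pitch tracker would window the signal and check harmonics before trusting the peak, but this captures the FFT-to-note-name pipeline.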
We designed and implemented three systems for automatic dog maintenance. The first is a food refill system: at a pre-programmed time of day, the food tank opens and releases food into a bowl until it is full, as indicated by an IR sensor embedded in the bowl. The second is a fetch-playing system. We use a DC motor to accelerate a ping pong ball inside a cylindrical cage; using a magnet attached to the rim of the cage and a hall effect sensor, we can determine when the cage has reached its maximum speed. At that point a servo motor lifts the ball over the edge of the cage, causing it to fly in a random direction. The final system is an automatic petting arm: when something comes within range, the arm begins a petting motion. The arm has two joints, giving two degrees of freedom. Together, the three parts form a comprehensive system for automatic dog maintenance.
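The hall-effect max-speed check amounts to watching the interval between magnet passes stop shrinking. This is a sketch of that plateau test under our own naming and tolerance values, not the project's actual code (on the Pi the timestamps would come from GPIO edge interrupts):

```python
def at_max_speed(pulse_times, window=5, tol_s=0.002):
    """True once the last `window` magnet-pass intervals have stabilized.

    pulse_times are timestamps (seconds) of hall-effect triggers, one per
    cage revolution. While the cage accelerates, successive intervals
    shrink; when they agree within tol_s, the cage has plateaued at its
    top speed and the ball can be released.
    """
    if len(pulse_times) < window + 1:
        return False
    recent = pulse_times[-(window + 1):]
    intervals = [t2 - t1 for t1, t2 in zip(recent, recent[1:])]
    return max(intervals) - min(intervals) <= tol_s
```

The launcher would poll this after every trigger and fire the servo on the first True.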