Guest Blog: Using the Solar Pi Platter in a Raspberry Pi Based Robotic Car
Editor's Note: We were tickled to receive this guest blog from the fine folks over at Rocket Blue Automation. They are running a Kickstarter for this product and we have already bought one.
[callout size="col-12" title="Check out the Solar Pi Platter Kickstarter" button_title="Go to Kickstarter" button_link="https://www.kickstarter.com/projects/1647124460/solar-pi-platter?utm_source=SwitchDoc&utm_medium=email&utm_campaign=SwitchDoc" button_size="normal" button_rounded="true" button_color="red"]
The Pi Car came about through a convergence of happy circumstances. I needed to test the Pi Platter in a real-world situation, and we needed something to demo it. At the same time, several members of my local makerspace, Solid State Depot in Boulder, CO, were building robotic entries for the SparkFun AVC competition, and it was fun watching their machines whirring along. One of the design goals for the Pi Platter was for the PWM outputs to support motor speed control, so I got to thinking about the Pi Platter as a robot controller.
Solid State Depot has an extensive collection of donated electronic components. Searching netted me both an old Ardubot base and a Sharp distance sensor, a good starting point.
Motor Control
The Ardubot was designed to be controlled by an Arduino, with only forward/reverse control at a fixed speed, but it helpfully contained a prototyping area, so I started hacking. The L293D motor driver has an Enable input for each channel that is often used as a PWM input; on the Ardubot these were tied high. I dremeled the traces and connected the Enable inputs directly to the Pi Platter PWM outputs. A pair of NPN transistor inverters generates the two driver control signals for each motor, so a single signal driven by a Pi GPIO pin controls each motor's direction.
Fairly high PWM frequencies are typically used with motors. The Pi Platter supports three (732 Hz, 2930 Hz, and 46875 Hz). The highest frequency works very well. At 732 Hz the motors ran but tended to stall under load at lower RPMs, and the middle frequency was problematic, with the motors having very little power and being incredibly noisy.
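To give a feel for how this comes together in software, here is a minimal sketch of the drive logic. It assumes the Pi Platter is reached over its USB serial port and that RPi.GPIO drives the direction lines; the PWM command strings and GPIO pin numbers are placeholders rather than the Pi Platter's documented syntax, so check the manual for the real commands.

import serial
import RPi.GPIO as GPIO

LEFT_DIR_PIN = 23       # hypothetical GPIO pins wired to the NPN inverters
RIGHT_DIR_PIN = 24

GPIO.setmode(GPIO.BCM)
GPIO.setup(LEFT_DIR_PIN, GPIO.OUT)
GPIO.setup(RIGHT_DIR_PIN, GPIO.OUT)

# The Pi Platter's microcontroller shows up as a USB serial device
platter = serial.Serial('/dev/ttyACM0', 115200, timeout=1)

def set_motor(side, forward, duty):
    """Set one motor's direction (Pi GPIO) and speed (Pi Platter PWM, 0-255)."""
    pin = LEFT_DIR_PIN if side == 'left' else RIGHT_DIR_PIN
    GPIO.output(pin, GPIO.HIGH if forward else GPIO.LOW)
    channel = 1 if side == 'left' else 2
    # Placeholder command format "P<channel>=<duty>" -- not the real syntax
    platter.write('P{}={}\r'.format(channel, duty).encode('ascii'))

# Drive forward at roughly half speed
set_motor('left', True, 128)
set_motor('right', True, 128)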
Early driving tests led to another change. Originally the motors were powered directly from the Pi Platter's 5 volt output, and it quickly became apparent that they were underpowered. SSD's junk bins yielded another good find: an old LT1171 boost converter IC in a package easily soldered to the proto area. It is easy to use and not particularly demanding of inductor value or layout. I now had a much beefier supply for the motors, one that could also be driven directly from the battery.
A Mouse Eye View
The Pi Car had to have video, of course, for fun as well as part of the Pi Platter testing. One of the Pi Platter's “claims to fame” is a higher-performance USB subsystem that can support multiple high- and low-speed USB peripherals simultaneously. Running video and WiFi as high-speed devices alongside the low-speed Pi Platter microcontroller seemed like a good test.
An inexpensive Logitech USB camera fit perfectly. The Motion software proved too slow, but mjpg-streamer easily handles 15 fps at 320×240 pixels. The resulting stream can be displayed in a web browser or within my control application.
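For reference, here is roughly how the stream gets launched from the control application. The plugin options, install path, and port below are assumptions based on a typical mjpg-streamer setup, so adjust them for your own build.

import subprocess

# Launch mjpg-streamer with the USB camera at 320x240 / 15 fps and serve it
# over HTTP.  Plugin names, web root, and port are assumed typical defaults.
stream = subprocess.Popen([
    'mjpg_streamer',
    '-i', 'input_uvc.so -d /dev/video0 -r 320x240 -f 15',
    '-o', 'output_http.so -p 8080 -w /usr/local/share/mjpg-streamer/www',
])

# The stream is then viewable at http://<pi-address>:8080/?action=stream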
The Sharp distance sensor was mounted above the camera and connected to a Pi Platter analog input with Vref set to 4.096 volts. A table in the control software maps the ADC value to distance.
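The mapping itself is just a small lookup table with interpolation, along these lines. The calibration points below are made up for illustration, and the code assumes a 12-bit reading where one count is roughly 1 mV with the 4.096 volt reference.

# Illustrative (adc_counts, distance_cm) calibration points, sorted by counts
CAL_TABLE = [
    (300, 80.0),
    (400, 60.0),
    (600, 40.0),
    (900, 25.0),
    (1500, 15.0),
    (2300, 10.0),
]

def adc_to_distance(counts):
    """Linearly interpolate between calibration points."""
    if counts <= CAL_TABLE[0][0]:
        return CAL_TABLE[0][1]
    if counts >= CAL_TABLE[-1][0]:
        return CAL_TABLE[-1][1]
    for (c0, d0), (c1, d1) in zip(CAL_TABLE, CAL_TABLE[1:]):
        if c0 <= counts <= c1:
            frac = (counts - c0) / float(c1 - c0)
            return d0 + frac * (d1 - d0)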
Rounding Out
A USB reading lamp made the perfect steampunk headlight and can be turned on and off using the Pi Platter’s per-port USB power switching.
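Switching the lamp from software looks something like the snippet below. The "U<port>=<0|1>" string is just a stand-in for whatever per-port power command the Pi Platter actually documents, and the port number is an assumption.

import serial

platter = serial.Serial('/dev/ttyACM0', 115200, timeout=1)

def headlight(on, port=2):
    # Placeholder per-port USB power command -- see the Pi Platter docs
    platter.write('U{}={}\r'.format(port, 1 if on else 0).encode('ascii'))

headlight(True)   # lamp on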
An inexpensive INA169-based current sensor, placed in series between the battery and the rest of the system and wired to the Pi Platter's other analog input, reports current draw. The Pi Car idles at around 700 mA from a charged battery and can pull upwards of 1500-2000 mA when the motors are running and loaded.
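Converting that analog reading to milliamps is straightforward once you know the sensor's resistor values. The sketch below assumes the common INA169 breakout values (0.1 ohm shunt and 10 k load resistor, giving 1 V per amp) and a 12-bit ADC with the 4.096 volt reference, so roughly 1 mV per count.

SHUNT_OHMS = 0.1              # assumed breakout shunt value
LOAD_OHMS = 10000.0           # assumed breakout load resistor
MV_PER_COUNT = 1.0            # 4.096 V / 4096 counts

def adc_to_current_ma(counts):
    vout_mv = counts * MV_PER_COUNT
    # INA169: Vout = I * Rshunt * (Rload / 1000), so solve for I
    return vout_mv / (SHUNT_OHMS * LOAD_OHMS / 1000.0)

print(adc_to_current_ma(700))   # ~700 mA at idle with these assumed values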
Finally, a Qi receiver module connects to the Pi Platter's USB power input for wireless charging.
Future Plans
I’d like to add an IMU connected directly to the Pi’s I2C interface and modify the Pi Car’s software so it has a sense of its location. I’d then like to use OpenCV to read signs directing the car around, as well as to guide it to a charging station.