Hardware and Software Used on Pi Robot

(Updated Jan 5, 2011)

The sensor inputs and drive motors on Pi Robot are controlled by the Serializer microcontroller made by Robotics Connection. The Serializer itself is essentially just an input-output device and must be connected to a computer to do any useful processing. This can be done using either a USB cable to an onboard computer or wirelessly using Bluetooth or XBee radios. The Serializer is a really nice controller if you work in .NET languages (C#, VB.NET, etc.) or Visual C++. I have also recently released a Python library for the Serializer so that you can use the board on Windows, Linux or Mac OS X. The Python library can be used either on its own or as the basis for a node in ROS (the Robot Operating System from Willow Garage).
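Under the hood, a controller like the Serializer speaks a simple text protocol over a serial link, and a Python library is mostly a matter of building command strings and parsing the replies. Here is a bare-bones sketch of that idea — the command names ("mogo", for example) and the reply format are illustrative assumptions, not taken from the Serializer manual:

```python
# Sketch of talking to a text-protocol motor controller over serial.
# The command names and reply format below are illustrative assumptions --
# check the Serializer documentation for the real protocol.

def build_command(name, *args):
    """Format a command line, e.g. build_command('mogo', '1:50', '2:50')."""
    return (" ".join([name] + [str(a) for a in args]) + "\r").encode("ascii")

def parse_reply(raw):
    """Parse a whitespace-separated numeric reply, e.g. b'1024 1030\\r\\n'."""
    return [int(tok) for tok in raw.decode("ascii").split()]

# In real use you would wrap a pyserial connection, something like:
#   import serial
#   port = serial.Serial('/dev/ttyUSB0', 19200, timeout=1)
#   port.write(build_command('mogo', '1:50', '2:50'))
#   ticks = parse_reply(port.readline())

if __name__ == "__main__":
    print(build_command("mogo", "1:50", "2:50"))  # b'mogo 1:50 2:50\r'
    print(parse_reply(b"1024 1030\r\n"))          # [1024, 1030]
```

Keeping the formatting and parsing separate from the serial port itself also makes the library easy to test without a robot attached.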

Pi's drive motors are 7.2V gearhead motors from Robotics Connection that come with integrated quadrature wheel encoders and custom connectors for the Serializer.  The servos on the pan-and-tilt head and the arms are Dynamixel AX-12+ servos from Robotis (purchased from Trossen Robotics), and they are controlled by the ArbotiX controller from Vanadium Labs.  The video camera on Pi Robot is an 802.11g wireless D-Link 920. This camera is capable of 30 frames per second and seems to work well with a Linksys router.  The sensor near the top of the camera that looks like a pair of eyes is actually a Ping sonar sensor, and the sensor below the camera lens is a Sharp GP2D12 IR sensor.  The camera requires 5V, which is supplied off the main battery using a 5V switching regulator from Robotics Connection.
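The quadrature encoders report wheel motion as raw tick counts, so somewhere in the software those ticks have to be turned into actual distance traveled before you can do odometry. The conversion is simple; the resolution and wheel diameter below are made-up example values, not Pi's actual specs:

```python
import math

# Illustrative calibration values only -- substitute your own encoder
# resolution and wheel diameter.
TICKS_PER_REV = 1000      # encoder ticks per full wheel revolution (assumed)
WHEEL_DIAMETER_M = 0.10   # wheel diameter in meters (assumed)

def ticks_to_meters(ticks):
    """Convert a quadrature encoder tick count to linear wheel travel."""
    revolutions = ticks / TICKS_PER_REV
    return revolutions * math.pi * WHEEL_DIAMETER_M

# One full revolution of a 10 cm wheel covers about 0.314 m:
print(ticks_to_meters(1000))
```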

Pi now runs with an onboard computer, currently a Zotac IONITX-F-E, which uses a 1.6 GHz dual-core Intel Atom 330 CPU and the NVIDIA ION video chipset.  Power is supplied from the onboard 12V battery (see below) through an M3-ATX PicoPSU power supply. (This model is particularly nice because it can run the CPU over a wide range of input voltages, from 6V to 24V, so you don't have to worry if your battery voltage falls below 12V.)

Pi's base frame is built from the lightweight aluminum framing kit from Vex Robotics. The green wheels on Pi are also from Vex, and the white wheels on the old Pi were taken from an "abdominal roller" exercise toy.  The aluminum tubes on Pi's arms are from Lynxmotion: the upper arms are 1.5" tubes and the lower arms are 3.0" tubes, connected to the Bioloid frame pieces using these connectors.  Other sensors include current, voltage, light and force sensors from Phidgets.  The onboard CPU and ArbotiX are powered by a 12V Lithium Ion Portable Power Station from Battery Geeks, and the Serializer and drive motors use a pair of 8.4V NiMH batteries from All-Battery.com connected in parallel.

Pi's new omnidirectional vision system uses a custom-made hyperbolic mirror mounted over a USB webcam that can capture up to 90 frames per second. The mirror is mounted on top of an acrylic tube obtained from McMaster-Carr, and the rest of the mounting is made from hand-cut plastic from Tap Plastics.
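To be useful for navigation, the donut-shaped image from a hyperbolic mirror is usually "unwarped" into a normal panoramic strip. At its core this is just a polar-to-Cartesian coordinate mapping. A bare-bones sketch of that mapping — the mirror center and ring radii are hypothetical calibration parameters you would measure for your own camera and mirror:

```python
import math

def panorama_to_omni(u, v, pano_width, pano_height, cx, cy, r_inner, r_outer):
    """Map a pixel (u, v) in the unwarped panorama back to (x, y) in the
    original omnidirectional image.  (cx, cy) is the mirror center and
    r_inner/r_outer bound the usable ring of the mirror image -- all of
    these are calibration values for your particular setup."""
    theta = 2.0 * math.pi * u / pano_width                # column -> bearing
    r = r_inner + (r_outer - r_inner) * v / pano_height   # row -> radius
    x = cx + r * math.cos(theta)
    y = cy + r * math.sin(theta)
    return x, y
```

An unwarping loop would call this for every panorama pixel and sample (or bilinearly interpolate) the source image at (x, y); OpenCV's remap function does exactly this once you precompute the two coordinate maps.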

In early December 2010, Pi received a Hokuyo URG-04LX-UG01 laser scanner from an anonymous donor.  You can see the scanner at the front of the base in some of the videos; it is used for doing SLAM with ROS.
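A laser scanner like the URG-04LX-UG01 returns a fan of range readings at known bearing angles, and SLAM software works with these after converting each reading into a Cartesian point in the robot's frame. That conversion is one line of trigonometry per point; a minimal sketch (the default angular limits roughly match the Hokuyo's wide field of view, but treat them as illustrative):

```python
import math

def scan_to_points(ranges, angle_min=-2.094, angle_max=2.094):
    """Convert a list of range readings (meters), taken at evenly spaced
    bearings between angle_min and angle_max (radians), into (x, y) points
    in the scanner's frame.  The default limits (~+/-120 degrees) are
    illustrative, not the Hokuyo's exact specification."""
    n = len(ranges)
    step = (angle_max - angle_min) / (n - 1)
    points = []
    for i, r in enumerate(ranges):
        angle = angle_min + i * step
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# Three readings spanning -90 to +90 degrees: left, straight ahead, right.
print(scan_to_points([1.0, 2.0, 1.0],
                     angle_min=-math.pi / 2, angle_max=math.pi / 2))
```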


Software

The code for both Pi and Peppy was originally written in C# using Microsoft's Visual Studio 2005 running on Windows XP.  Recently I began porting the code to Python so that (a) I could run it on either Windows or Linux and (b) I could eventually use the Robot Operating System (ROS) from Willow Garage.  (See the November blog entry, Pi Robot Meets ROS.) For vision processing I am using the amazing RoboRealm package, which comes with a 30-day free trial and is only $89 to buy. However, since RoboRealm only runs on Windows, I will have to gradually shift to OpenCV for cross-platform vision processing. In the C# days, the neural network routines were done using the most excellent open source AForge.NET package.  The histogram analysis was done using EmguCV, which provides a .NET wrapper around the OpenCV vision package. And the Dynamixel servos were controlled using the open source Dynamixel libraries from Forest Moon Productions. These will now be replaced with PyBrain for neural networks, OpenCV for vision, and the ArbotiX Python and ROS drivers for Dynamixel control.
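To give a taste of what the histogram analysis involves: comparing a normalized color histogram against a stored reference is the heart of simple color-based object tracking. A stripped-down, pure-Python sketch of the idea — real code would use OpenCV's calcHist and compareHist on actual image data:

```python
def histogram(values, bins, lo=0, hi=256):
    """Build a normalized histogram of pixel values in [lo, hi)."""
    counts = [0] * bins
    width = (hi - lo) / bins
    for v in values:
        idx = min(int((v - lo) / width), bins - 1)
        counts[idx] += 1
    total = float(len(values))
    return [c / total for c in counts]

def intersection(h1, h2):
    """Histogram intersection: 1.0 for identical distributions, 0.0 for disjoint."""
    return sum(min(a, b) for a, b in zip(h1, h2))

# Two small sets of pixel values that fall into the same bins:
reference = histogram([10, 12, 200, 210], bins=4)
candidate = histogram([11, 14, 205, 220], bins=4)
print(intersection(reference, candidate))  # 1.0
```

A tracker then scans candidate image regions and keeps the one whose histogram scores highest against the reference.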