Overview
Quad Bot is a small robot designed and built from scratch. It has a simple robot arm with 4 degrees of freedom and a small JPEG colour camera. It has evolved and is now controlled over Wifi, allowing control from any Android phone.
There are 4 independently controlled wheels, giving a high degree of manoeuvrability.
Below is a system overview. At the heart is a Xilinx Spartan 3 FPGA, using one of the boards I made here. This is programmed with a Microblaze softcore which controls the rest of the system. The UARTs, servo controllers, LEDs etc. are memory mapped to the Microblaze.
Mechanical
Drive
The drive consists of 4 independent wheels. Each wheel can be individually controlled in speed, direction and angle. This allows a great degree of movement and robust 4 wheel drive. It also means the differential can be implemented electrically rather than mechanically.
Parts
I used Pololu geared motors and wheels, with the 100:1 ratio motor, which provides good torque while still giving a reasonable top speed. The wheels fit the motor shaft, and they have teeth on the inside so an optical encoder can determine speed.
The angle of each wheel is controlled with a metal geared servo – MG955. There are loads of these on ebay, mostly from China/Hong Kong.
I got a pack of various nuts and bolts, and 0.6mm, 1mm, 2mm and 3mm aluminium sheet, all from ebay.
Wheels
This is one of the 4 wheels. It is designed so the wheel rotates about its centre when the servo turns.
Base
The main base is cut from 3mm aluminium. Each wheel is attached to the base with 1mm aluminium, which gives a small amount of suspension. The aluminium is easy to work with and makes it easy to attach new components; I think I got the aluminium from ebay.
The wheels are also arranged in a square, which makes it easier to calculate the angles when turning.
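As a rough illustration of that geometry (this is not code from the project: the coordinates, units and function name are invented), when turning about a point each wheel is steered tangent to its own circle around that point, and the electrical differential simply scales each wheel's speed by its distance from the turn centre.

/* Hypothetical sketch: per-wheel steering angle and speed for a turn
 * about a point on the robot's left/right axis. X is forward, Y is left. */
#include <math.h>

typedef struct { float angle; float speed; } wheel_cmd_t;

/* wx, wy : wheel position relative to the robot centre, in metres
 *          (e.g. the front-left wheel might sit at +0.07, +0.07)
 * R      : signed distance from the robot centre to the turn centre
 *          (positive = turning left); assumed larger than the wheelbase
 * v      : desired speed of the robot centre */
static wheel_cmd_t wheel_setpoint(float wx, float wy, float R, float v)
{
    wheel_cmd_t cmd;
    float dx = wx;                       /* wheel offset from turn centre, X */
    float dy = wy - R;                   /* wheel offset from turn centre, Y */
    float r  = sqrtf(dx * dx + dy * dy); /* this wheel's turning radius */

    cmd.angle = atanf(dx / (R - wy));    /* steer tangent to the circle */
    cmd.speed = v * r / fabsf(R);        /* outer wheels run faster */
    return cmd;
}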
Robot Arm
[NOTE, the grabber end is not finished]
The robot arm is controlled with 4 servos (the 5th servo there is for the camera). It is designed to keep the weight as close to the base as possible.
The base servo is a metal geared type like those used on the wheels, as it needs the most torque. The next is an ordinary hobby servo, and the last 2, which rotate and close the gripper, are micro servos.
Electronics
Brain – Homemade Xilinx FPGA board
The main control board is one of the FPGA boards I made a few years ago, see Project:
Using the Xilinx EDK and ISE it’s possible to make a design with an embedded Microblaze 32bit CPU, and there is still enough space on the FPGA to add custom memory mapped peripherals.
The program can be encoded into the bitstream on the FPGA and automatically run on startup. This limits the program size, but it is big enough for my needs.
- Microblaze CPU instantiated in FPGA
- Custom RTL Wheel speed detectors
- Custom RTL servo controllers, with 256-step resolution
- Custom RTL PWM controller for the motors
- 16MB SDRAM (Not used)
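The peripherals appear to the Microblaze as memory-mapped registers, so driving them from C is just a register write. A minimal sketch of the idea: the base addresses and register layout below are made up, not the ones from this design.

#include <stdint.h>

/* Hypothetical base addresses - the real ones come from the EDK address map. */
#define SERVO_BASE   0x80000000u
#define PWM_BASE     0x80001000u

/* Write an 8-bit position (0-255) to one of the servo controller channels. */
static void servo_set(unsigned channel, uint8_t position)
{
    volatile uint32_t *reg = (volatile uint32_t *)(SERVO_BASE + 4u * channel);
    *reg = position;
}

/* Set a motor PWM duty cycle the same way. */
static void motor_set(unsigned channel, uint8_t duty)
{
    volatile uint32_t *reg = (volatile uint32_t *)(PWM_BASE + 4u * channel);
    *reg = duty;
}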
Power Supply
A 7.4v 4000mAh Lithium Polymer battery is used for power. This gives a range of about 7.0v-8.2v. The picture above shows the DC/DC power supply unit; it contains a 5.0v and a 3.3v 1A DC/DC IC. Most of the digital electronics runs off the 3.3v, the micro servos and camera run from 5v, and the large servos and wheel motors run directly off the battery.
Wheels
The motors are driven by 2 x Dual Pololu Motor Drivers, which allow up to 13v and 1A per motor. These give full H bridge control of each motor.
On each wheel is a Pololu Reflective Sensor, which enables speed measurement of each wheel. There is only one sensor per wheel, so only speed, not direction, can be determined.
The output from the sensor is a varying analogue voltage, which is conditioned with a comparator with hysteresis.
Above shows the trace while the wheel is turning. The top signal is the raw output from the sensor, the bottom signal is the digital output from the comparator.
The circuit above drives all 4 wheels, connects the wheel angle servos and contains the comparators for the wheel speed sensors. It takes two power supplies – 5v for the comparators and motor driver logic, and 7v for the motor and servo power.
All this can be controlled by 3.3v logic via the ribbon cable.
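The RTL speed detectors essentially count these digital pulses; turning a pulse count into a wheel speed is then simple arithmetic. A rough sketch, assuming an invented tooth count and counting window:

#include <stdint.h>

#define TEETH_PER_REV   12u      /* encoder teeth inside the wheel (assumed) */
#define WINDOW_MS       100u     /* counting window of the RTL block (assumed) */

/* Convert a pulse count from the FPGA speed detector into RPM. */
static uint32_t wheel_rpm(uint32_t pulse_count)
{
    /* pulses per window -> revolutions per minute */
    return (pulse_count * 60000u) / (TEETH_PER_REV * WINDOW_MS);
}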
AVR Servo control board
A special board was made to control the servos for the arm and camera, which reduced the number of control pins needed on the main FPGA board. This board provides power and signal to the servos and allows the servo positions to be set via an RS232 interface. An AVR ATMEGA64 microcontroller is used.
- Controls 8 servos
- 2 banks (4 servos in each) can have their power switched
- 2 x high power switches controlled by MOSFET
- Can set max and min range for each servo, stored in EEPROM
- Max servo speed can be set (see the sketch after this list)
- Controlled by one wire (RS232)
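Here is a rough sketch of how the range and speed limits could fit together on the AVR. The struct, units and update rate are assumptions for illustration, not the actual firmware:

#include <stdint.h>

typedef struct {
    uint8_t min;       /* range limits loaded from EEPROM */
    uint8_t max;
    uint8_t max_step;  /* biggest move allowed per update tick */
    uint8_t current;
    uint8_t target;    /* last position received over RS232 */
} servo_t;

/* Called once per update tick: walk the output toward the target,
 * never faster than max_step and never outside [min, max]. */
static void servo_update(servo_t *s)
{
    uint8_t goal = s->target;
    if (goal < s->min) goal = s->min;
    if (goal > s->max) goal = s->max;

    if (s->current < goal) {
        uint8_t step = goal - s->current;
        s->current += (step > s->max_step) ? s->max_step : step;
    } else if (s->current > goal) {
        uint8_t step = s->current - goal;
        s->current -= (step > s->max_step) ? s->max_step : step;
    }
    /* s->current is then converted into a PWM pulse width. */
}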
STDIO Display
In order to debug and have a status output I salvaged an old project. The LCD is a 128 x 64 matrix display, connected to an ATMEGA128. I reprogrammed it to act as a serial terminal which accepts a subset of escape sequences. There is no character generation built into the LCD, so I needed to generate the text in the assembly code.
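The character generation idea, sketched here in C rather than the original assembly (the font data and pixel routine are placeholders, not the actual code):

#include <stdint.h>

/* 5x7 font: one byte per column, LSB = top pixel. Only 'A' shown here. */
static const uint8_t font5x7_A[5] = { 0x7C, 0x12, 0x11, 0x12, 0x7C };

/* Hypothetical pixel routine provided by the LCD driver. */
extern void lcd_set_pixel(uint8_t x, uint8_t y, uint8_t on);

/* Draw one glyph at (x, y), column by column. */
static void lcd_draw_glyph(uint8_t x, uint8_t y, const uint8_t *glyph)
{
    for (uint8_t col = 0; col < 5; col++)
        for (uint8_t row = 0; row < 7; row++)
            lcd_set_pixel(x + col, y + row, (glyph[col] >> row) & 1u);
}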
Wireless
Wireless connection is done with a WiFly RN-131G wifi module. I made a simple board to break out the required pins and mount the status LEDs.
The pins bring out power and the serial port. The device is configured to automatically connect to the ad-hoc network set up on the Android device. Once a connection is established, all received data is sent out of the serial port to the FPGA to be read by the Microblaze core, and data sent from the Microblaze is relayed back through the TCP wifi connection.
Camera
The camera is a LinkSprite JPEG serial module. It has a maximum baud rate of 115200, so you cannot get a great frame rate, but it’s almost good enough. At the lowest resolution of 160×120 it produces images of about 3.5KB.
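As a rough sanity check: a 3.5KB frame is about 3,584 bytes, or roughly 35,840 bits once UART start/stop bits are counted, so transferring it at 115200 baud takes around 0.3s. That puts the ceiling somewhere near 3 frames per second before capture time and protocol overhead are included.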
The camera is connected to a UART generated in the FPGA which the Microblaze core has control over.
Software
FPGA Microblaze core
The main software on the robot runs in the Microblaze softcore. The program is burned into the PROM with the FPGA bitstream, so there is no need to load it at run time; this limits the size, but it’s enough for these needs.
The main tasks of the software are to coordinate the UARTs and to take pictures with the camera. The Wifi module sends a data stream to the FPGA; the core has to decode the stream and send back any picture data.
Here are some example commands:
enum rf_commands_e
{
    //Variable length commands
    COM_SEND_TEXT_MESSAGE = 0, //NOTE, MUST stay at 0
    COM_SEND_PICTURE_DATA,

    //Sent from host to robot
    COM_SET_SPEED,
    COM_SET_ANGLE,
    COM_SET_DIR,
    COM_CAM_ARM_SERVO_DIRECT,
    COM_SET_CAM_ARM_SERVO,
    COM_WRITE_TO_MICROBLAZE,
    COM_TAKE_PICTURE,
    COM_SEND_PIC_PACKET,

    //Sent from robot to host
    COM_HARD_RESET = 0x80,
    COM_CHILD_CONNECTED,
    COM_CHILD_DISCONNECTED,
    COM_PROG_PIECE_RECIVED,
    COM_PICTURE_SIZE,
    COM_PICTURE_SENT
};
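On the Microblaze side the incoming stream is decoded and dispatched on the command byte. The sketch below is illustrative only: the payload layout and helper functions are assumptions, not the project's actual wire format.

#include <stdint.h>

/* Hypothetical helpers - stand-ins for the real memory-mapped accesses. */
extern void motor_set(unsigned channel, uint8_t duty);
extern void servo_set(unsigned channel, uint8_t position);
extern void camera_trigger(void);
extern void send_next_picture_chunk(void);

static void handle_command(uint8_t cmd, const uint8_t *payload)
{
    switch (cmd) {
    case COM_SET_SPEED:
        /* payload assumed to carry one speed byte per wheel */
        for (unsigned w = 0; w < 4; w++)
            motor_set(w, payload[w]);
        /* motion data also paces the picture stream: reply with the next
         * 512-byte chunk of the last JPEG so the two never collide */
        send_next_picture_chunk();
        break;

    case COM_SET_ANGLE:
        for (unsigned w = 0; w < 4; w++)
            servo_set(w, payload[w]);
        break;

    case COM_TAKE_PICTURE:
        camera_trigger();
        break;

    default:
        break;
    }
}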
The wifi link is high bandwidth but high latency from robot to Android, and low bandwidth, low latency from Android to robot. The wifi module does not like data being sent in both directions at the same time, so it’s important to coordinate the data.
The picture data is broken up into 512 byte chunks; a chunk is sent to the Android device each time the robot receives motion data, which ensures they don’t both get sent at the same time.
I had to make careful use of interrupts on the Microblaze core and ping-pong between two buffers for the camera JPEG image. This means the last image frame can be sent back at the SAME TIME as the next picture is being taken, which almost doubled the FPS over a simple loop.
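A minimal sketch of that ping-pong arrangement, with invented buffer sizes and helper names: one buffer is filled from the camera UART while the previous frame streams out of the other, then the two swap roles.

#include <stdint.h>

#define FRAME_MAX 4096   /* assumed upper bound for a 160x120 JPEG */

static uint8_t  frame_buf[2][FRAME_MAX];
static uint16_t frame_len[2];
static uint8_t  capture_idx = 0;          /* buffer the camera is filling */

/* Hypothetical helpers: interrupt-driven capture and blocking wifi send. */
extern void camera_start_capture(uint8_t *dst, uint16_t *len);
extern void camera_wait_capture_done(void);
extern void wifi_send_frame(const uint8_t *buf, uint16_t len);

void camera_loop_once(void)
{
    uint8_t sending_idx = capture_idx ^ 1;    /* frame captured last time */

    /* Start filling one buffer from the camera (driven by interrupts)... */
    camera_start_capture(frame_buf[capture_idx], &frame_len[capture_idx]);

    /* ...while the previous frame is sent out of the other buffer. */
    if (frame_len[sending_idx])
        wifi_send_frame(frame_buf[sending_idx], frame_len[sending_idx]);

    camera_wait_capture_done();               /* both halves now finished */
    capture_idx ^= 1;                         /* swap buffer roles */
}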
Android App
A very simple app was made to connect to the robot and control it over a TCP connection. It sends the motion data, receives the JPEG image data and reassembles the 512 byte chunks for display.