Robot Magellan

Update 5/21/2007

After getting all the bits implemented, the image processing written in Python, and all the major components working together on the day before the PDXBot event (I had finalized my image processing code on Friday; the event was Sunday and I was helping at the Saturday event), I noticed that my cone recognition wasn't working as expected...  So that night I added image logging logic to my program, and at 7 AM before the start of the 11 AM event I finally got to see what kinds of images I was getting from the camera connected to the NSLU2.  They were awful!  I started some debugging on the slug and found that none of my attempts to adjust the camera brightness settings worked.  My robot was blind!
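For what it's worth, the image logging I bolted on that morning amounts to nothing more than writing every captured frame to a timestamped file so it can be inspected after a run.  A minimal sketch (the directory and function name are illustrative, not the actual code in magellan.py):

    import os
    import time

    FRAME_DIR = "/tmp/robot_frames"   # illustrative log location on the slug

    def log_frame(raw_bytes):
        """Dump one captured frame to disk so bad images can be examined post-run."""
        if not os.path.isdir(FRAME_DIR):
            os.makedirs(FRAME_DIR)
        name = os.path.join(FRAME_DIR, "frame_%d.jpg" % int(time.time() * 1000))
        f = open(name, "wb")
        f.write(raw_bytes)   # raw_bytes is whatever the camera capture call returned
        f.close()
        return name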

My guess is that because SlugOS is built as a big-endian OS, the camera control failed: the byte order of the two-byte control values that get programmed into the camera registers assumed little-endian byte order.  I will rebuild SlugOS as little-endian and see if things clear up.
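To illustrate the suspected failure mode (this is just a demonstration, not the camera driver code): a 16-bit control value packed in the host's big-endian order gets read as a wildly different number by hardware that expects little-endian.

    import struct

    brightness = 0x0180   # example 16-bit control value

    big = struct.pack(">H", brightness)      # what a big-endian host writes: b'\x01\x80'
    little = struct.pack("<H", brightness)   # what little-endian hardware expects: b'\x80\x01'

    # The camera interprets the big-endian bytes as 0x8001 = 32769 -- far out of range.
    misread = struct.unpack("<H", big)[0]
    print(hex(misread))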

Anyway, there was a ton of scrambling in the very last minutes, and I was very nervous going into the event.

See my PDXBot 2007 SRS-Magellan results page for more information and post-event analysis.

I'll start a new web page for the ongoing updates and applications of lessons learned, and archive this page.
The following is the archive, including the code I ran at the event: pdxbot07/markiii_jalv2_pdxbot07.tar.bz2

Update 5/6/2007
I'm feeling pretty good about my PIC software and the PIC/NSLU2/Python integration.  I've implemented some data gathering to allow post-run analysis and investigation, as well as some Python programs for parsing and graphing the saved data from each run.  See the analysis page for some details :)
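The analysis scripts are in the tarball; in spirit they just read a run log and plot one field against time.  A stripped-down sketch, assuming a whitespace-separated log of time/ticks/heading samples and matplotlib for the graphing (the actual column layout in my logs may differ):

    import matplotlib.pyplot as plt

    def load_run(path):
        """Parse a run log of whitespace-separated 'time ticks heading' samples."""
        times, ticks, headings = [], [], []
        for line in open(path):
            if not line.strip() or line.startswith("#"):
                continue
            t, tk, hd = line.split()[:3]
            times.append(float(t))
            ticks.append(int(tk))
            headings.append(float(hd))
        return times, ticks, headings

    times, ticks, headings = load_run("run01.log")
    plt.plot(times, headings)
    plt.xlabel("time (s)")
    plt.ylabel("compass heading (deg)")
    plt.savefig("run01_heading.png")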

As usual, the current updated code base (including the analysis code) is in the markiii_jalv2.tar.bz2 file.  I've updated my JalV2 compiler to v2.2.  The PIC code is stabilizing, and now I'm going to focus on the image processing this week.  I'm also going to rebuild the sensor turret and create a nice, neat circuit board so that I don't have the huge rat's nest of jumper wires that can be yanked by a passer-by.

This week: turret, image processing, circuit board, bump sensor.
Next week: debug, test, calibration.


Update 4/30/2007
This weekend I did the first real integration between the PIC/JALV2 code and the NSLU2-OpenEmbedded/Python code.  I got to re-discover the fun differences between big- and little-endian 16-bit integers, and I hit a bug in the JALV2 compiler that thankfully has been fixed if you pull down the latest version of JALv2.  It's a spooky bug that wasn't exposed until I added a global for endian-ness; then the heading tracking code went crazy.  It seems that some data space was getting shared that shouldn't have been.  Oh well, the good news is that the new version works well again.  It would have sucked to be forced to use a non-open compiler, as doing this entirely in open source is one of my project goals.
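The lesson on the Python side is to never unpack multi-byte values in native order; declare the wire format explicitly.  A sketch of the NSLU2 end of the serial link, assuming the PIC sends a 16-bit heading low byte first (the port name, baud rate, and units are placeholders):

    import struct
    import serial   # pyserial

    ser = serial.Serial("/dev/ttyS0", 9600, timeout=1)   # placeholder port and baud

    def read_heading():
        """Read a 16-bit heading (tenths of a degree) sent low-byte-first by the PIC."""
        raw = ser.read(2)
        if len(raw) != 2:
            return None
        (tenths,) = struct.unpack("<H", raw)   # '<' forces little-endian, whatever the host is
        return tenths / 10.0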

I've updated the markiii_jalv2.tar.bz2 file with the current code base.  I've gotten to the point where I'm now using CVS to keep from losing a working version and to support developing on my desktop or laptop without having to guess which copy is the right version.

The Magellan_py/magellan.py has the current program that runs on the NSLU2.  (Running it at boot is currently done by a start-up script in /etc/rc3.d/S99zzRobot; TODO: find a nicer way.)

The waypoint test of running out, turning around, and coming back the way it came didn't work so well.  The compass readings don't seem to be consistent between one heading and the heading 180 deg away.  I.e., lined up on my desk facing the front it reads 44.6 deg; if I rotate the robot 180.0 deg I expect to read 224.6, but what I get is 192.6 degrees.  I'm going to attempt a compass calibration and see if this fixes things for me.
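The arithmetic on that test, plus the crudest possible correction I could fall back on if calibration doesn't help (purely an illustrative sketch, not code from the tarball):

    # The error I'm seeing: facing front reads 44.6; after a 180 deg turn I expect
    # (44.6 + 180.0) % 360.0 = 224.6 but read 192.6, i.e. 32.0 degrees of error.
    CAL_POINTS = [(44.6, 44.6), (224.6, 192.6)]   # (true_deg, measured_deg)

    def correct_heading(measured):
        """Linearly interpolate between the two calibration points (crude fallback)."""
        (t0, m0), (t1, m1) = CAL_POINTS
        frac = (measured - m0) / (m1 - m0)
        return (t0 + frac * (t1 - t0)) % 360.0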

Lastly, I'm not happy with those "mattracks" I'm using to slow the robot down.  I'm going to try plain wheels to see if they are good enough.

Update 4/22/2007
I have the acoustic sensors working to locate and track targets (provided the ground scatter isn't causing trouble).  The idea is to have the turret center the camera on an object, take a picture, and identify the cone.  The ground-scatter problem will be solved by putting some black foam under the sensors... I hope...
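The centering itself can be as dumb as nudging the turret toward whichever range finder reports the nearer echo until the two readings roughly agree; at that point the camera is pointed at the object.  A sketch of the idea with made-up callback names, not the code in the tarball:

    STEP_DEG = 2      # turret nudge per iteration (illustrative)
    BALANCE_CM = 5    # how closely the two ranges must agree (illustrative)

    def center_turret(read_left_cm, read_right_cm, get_angle, set_angle):
        """Pan the turret until the left/right sonar ranges roughly balance.

        The four arguments are assumed callbacks: the two reads return the latest
        SRF08 ranges in cm, and get/set_angle move the turret servo.
        """
        while True:
            left, right = read_left_cm(), read_right_cm()
            if abs(left - right) <= BALANCE_CM:
                return get_angle()                    # object is (roughly) centered
            if left < right:
                set_angle(get_angle() - STEP_DEG)     # target is off toward the left sensor
            else:
                set_angle(get_angle() + STEP_DEG)     # target is off toward the right sensor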

I have made my first attempt at controlled motion and odometry calibration.  The robot goes 670 cm in 30,000 encoder ticks.  I'm now fiddling with the steering control logic.  The brain-dead algorithm for controlling the speed applied to the steering is unstable and oscillates badly.  My next attempt, which is a bit better, takes the error, computes a target servo value, and averages it with the current value.  See videos of its performance below.  It works "ok" but still falls into some oscillating modes.  I think I'm just going to have to get out the graph paper and do some math on this to make it work the way I want.
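The calibration works out to about 0.022 cm per tick, and the "compute a target, average toward it" steering update is about as simple as it sounds.  A sketch under the assumption that the heading error is in degrees and the servo value is a raw pulse width (the names and gains are mine):

    CM_PER_TICK = 670.0 / 30000.0   # ~0.0223 cm per encoder tick from the calibration run

    SERVO_CENTER = 1500             # illustrative servo units (e.g. microseconds)
    STEER_GAIN = 5.0                # illustrative servo units per degree of heading error

    def distance_cm(ticks):
        return ticks * CM_PER_TICK

    def steer_update(current_servo, heading_error_deg):
        """Compute a target servo value from the error, then average with the current value."""
        target = SERVO_CENTER + STEER_GAIN * heading_error_deg
        return (current_servo + target) / 2.0   # the averaging that tames (some of) the oscillation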

My solid state relay didn't work at first.  It turned on, but when I grounded the input pins together the switch remained closed.  I got some advice from the PARTS mailing list.  It seems that my SSR is for AC output and needs the load voltage to go to zero or change direction for the switch to turn off.  So putting the relay between the H-bridge and the motor allowed it to work, whereas between the battery and the H-bridge it fails.

See the updated markiii_jalv2.tar.bz2 file for my current test software (I'm currently working in Magellan_test within the tarball).  The code is not well commented and does some odd things with timers and encoder tick counters to pace the compass sampling and how often the steering is updated per wheel rotation.  (BTW, it's 1477 encoder ticks per rotation of the front wheels, and I want to update the steering 8 times per revolution.)
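The pacing math is just 1477 / 8, about 184 ticks between steering updates; the update can then be gated off the free-running encoder count.  A sketch with names of my own choosing, not the (uncommented) code in Magellan_test:

    TICKS_PER_REV = 1477
    UPDATES_PER_REV = 8
    TICKS_PER_UPDATE = TICKS_PER_REV // UPDATES_PER_REV   # 184 ticks, ~8 updates per revolution

    next_update_tick = TICKS_PER_UPDATE

    def maybe_update_steering(encoder_ticks, do_update):
        """Call do_update() roughly UPDATES_PER_REV times per front-wheel revolution."""
        global next_update_tick
        while encoder_ticks >= next_update_tick:
            do_update()
            next_update_tick += TICKS_PER_UPDATE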

Update 4/6/2007:
I've tested the servos, and now I've tested the I2C interfaces to the motor controller, both ultrasonic range finders, and the digital compass.  See the updated test code in the markiii_jalv2.tar.bz2 file.  Getting the I2C handshake for the ultrasonic rangers took me much of the weekend.  The Acroname site doesn't have good info unless you are using a BrainStem (I'm not).  However, some useful and more complete documentation and sample code is available at http://www.robot-electronics.co.uk/shop/Examples.htm.  Getting the second SRF08 to use I2C address 0xF0 took a number of tries to get right.
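For the record, the robot-electronics documentation boils down to: ranging is "write 0x51 to the command register, wait about 70 ms, read the 16-bit range from registers 2 and 3", and the address change is the 0xA0 / 0xAA / 0xA5 sequence followed by the new address, with only that one SRF08 on the bus.  On the robot the PIC does these transactions in JAL; the sketch below shows the same idea from Python with smbus purely as an illustration (note smbus wants the 7-bit address, so 0xE0 becomes 0x70):

    import time
    import smbus   # illustration only; on the real robot the PIC drives the I2C bus

    bus = smbus.SMBus(0)

    def srf08_range_cm(addr7):
        """Start a ranging in cm and read the 16-bit first-echo result from registers 2-3."""
        bus.write_byte_data(addr7, 0, 0x51)   # command register 0: range in cm
        time.sleep(0.07)                      # ranging takes up to ~65 ms
        high = bus.read_byte_data(addr7, 2)
        low = bus.read_byte_data(addr7, 3)
        return (high << 8) | low

    def srf08_change_address(old7, new8):
        """Change an SRF08's address; new8 is the 8-bit form, e.g. 0xF0.

        Only this one SRF08 may be on the bus while doing this.
        """
        for cmd in (0xA0, 0xAA, 0xA5, new8):
            bus.write_byte_data(old7, 0, cmd)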

The MD03 motor controller was pretty easy to drive.  However, I discovered that full power spun my poor WheelWatcher encoder wheel so fast that it spun off the mounting I had rigged up :(  {note to self: don't drive the motor at full power}

I also modified my servo code to not use spin loops within the timer ISR.  The code is a bit more complex and I needed the oscilloscope to debug it, but at least I'm no longer losing 20% of my processing power to the servo logic.  (Something that always bugged me with my initial sumo code, but I never got around to changing it because for sumo it didn't matter.)  Also, I learned that the servos I'm using take a lot of power.  (My bench supply can't source the transient load, and I get brown-outs when moving the turret and steering when not on NiCads.)

It seems that the power electronics and all the important sensors and components are working.  It's ready to move on its own once I get it programmed!

Update 4/5/2007:
I have the electro-mechanical design 98% finished.  The thing looks like a robot.  The only missing bits are the bump sensor (a whisker attached to a switch) and the failsafe switch connected to a DC relay (on order).  Besides these two items the robot is ready to program and debug.  I expect some integration issues to pop up, but hopefully I'll be able to address those before mid-May.  See the bottom of this page for some pictures of my creation... ;)

Update 3/31/2007
My goal is to have something that works by mid-May for the PDXBot '07 Magellan contest.  So far it's looking like I may just make it!  (At least I think I'll have something that will move.)

The "current" Plan:

The plan is now to use an NSLU2 as the Linux computer and interface the NSLU2 and the Mark-III over the serial port.  See my page on interfacing a Python program and the JAL code.  It's a USB host and is well supported and tested by the NSLU2 community.  All this and the stinking thing is < $100.  The OpenEmbedded community also supports the NSLU2.

See my Feb 3, 2006 high-level presentation to PARTS.

Getting started with OpenEmbedded and the NSLU2: my notes from bringing up the OpenSlug build on my Ubuntu 6.x systems.

I flip-flopped on my plan to move my robot to a Traxxas E-Maxx platform.  I had all that stuff from my attempt at this a few years ago, and with kids in college I couldn't bring myself to waste the parts and money I had purchased the last time I was planning to make this robot, just to replace them with a newer but similar mechanical design.

I'm using a Radio Shack F-350 Super Duty radio-controlled monster truck, featuring "MATTRACKS".  I've managed to hack (gut) the Radio Shack F-350 RC toy in a manner similar to the PDFs in the above link.

Currently my Robot design consists of:
Mechanical design:
I've removed all the electronics from the toy, keeping the suspension and motor.  I removed the front-wheel drive to make the steering servo work better.  I've attached an encoder disk to the shaft of the drive motor using a propeller coupling (from Tammies Hobbies) meant for an RC airplane.  I'm planning to use a hacked-up WheelWatcher with that for some speed control and odometry.

The original steering servo from the RC car had a non-standard interface, so I ripped it out and used a standard Futaba steering servo.  I discovered the wonders of "all-thread" and ball couplers.  I hadn't looked at hobby shops before this; now I wish I had.  The RC car and airplane accessories and components are much the same as what I needed to get my robot working.

At this point my TODOs are:

Sensor design:

My sensor design is cool.  I'm sort of proud of it.  I have one digital camera and two ultrasonic range finders mounted on a servo turret.  The plan is to have the turret center on an object, then have the camera take a picture, analyze it, and if it's not a cone, look for another object.  It will execute this process when it thinks it's close to the cone.  Once it thinks it's locked onto a cone, it will use a proportional navigation algorithm to maneuver in and touch the cone.
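Proportional navigation in its simplest robot-sized form commands a turn rate proportional to how fast the bearing to the cone is changing; drive that line-of-sight rate to zero and you converge on the target.  The sketch below is just that textbook rule with made-up units and gain, not the code I'll actually run:

    NAV_GAIN = 3.0   # classic PN gains are around 3-5 (illustrative)

    def pn_turn_rate(bearing_deg, prev_bearing_deg, dt):
        """Turn-rate command proportional to the line-of-sight (bearing) rate.

        bearing_deg is the cone's bearing relative to the robot, e.g. the turret
        angle at which the camera has the cone centered.
        """
        los_rate = (bearing_deg - prev_bearing_deg) / dt   # degrees per second
        return NAV_GAIN * los_rate                         # commanded turn rate, deg/s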

The idea sounds cool in theory.  We'll see if I can make it work by mid-May...

Control and compute design:

[todo: add this section]

Pictures:

March 31, 2007:
April 5, 2007:
April 22, 2007
Old notes: Notes   <-- some useful tidbits here.