Thunderborg on DiddyBorgV2 movement variability
Hi!
I am currently doing a project to create a testbed for Intelligent Transportation Systems using DiddyBorg V2. The general idea is that the robots should follow an internal map with help from camera input and drive in a regular traffic scenario.
My question is related to the control of movements. When I try to calibrate (especially spin movements) the robot performs accurate 90, 180 and 360 degree turns. The problem comes when the robot has multiple commands before and after a spin. Let's say the robot goes 20 cm forward, performs a 90 degree turn (perfectly), goes another 20 cm forward and then makes another 90 degree turn. The last 90 degree turn often ends up somewhere between 85 and 95 degrees. The surface is the same and it turns in the same direction, but with different results. I have tested with normal AA batteries and also a lithium battery pack, with the same results.
Is there any way to define a finer-grained calibration? Is it possible to read some metric from the motors with respect to rotation or ticks?
piborg
Wed, 02/28/2018 - 11:27
Improving movement repeatability
Unfortunately there is no way for the DiddyBorg V2 to measure how far it has really gone; all we can do is try to improve the movement itself.
Given that the first move is correct, my first suggestion would be to try waiting for a short period once the motors have been turned off after each move. This will give them an opportunity to stop between commands and may improve the consistency.
Look for the PerformMove function in the script. All of the moves use this function to actually set the motor speeds. What you want to do is add another sleep at the end of the function for a fixed time like this:
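A minimal sketch of the change, assuming the PerformMove helper from the DiddyBorg V2 example scripts. In the real script TB is the ThunderBorg board object and maxPower is set during calibration; both are stubbed here so the snippet stands on its own:

```python
import time

# Stand-in for the ThunderBorg board object used in the real example
# script (there it is created with ThunderBorg.ThunderBorg()).
class FakeThunderBorg:
    def __init__(self):
        self.calls = []
    def SetMotor1(self, power):
        self.calls.append(('SetMotor1', power))
    def SetMotor2(self, power):
        self.calls.append(('SetMotor2', power))
    def MotorsOff(self):
        self.calls.append(('MotorsOff', None))

TB = FakeThunderBorg()
maxPower = 1.0
settleTime = 1.0  # the new fixed delay after each move

def PerformMove(driveLeft, driveRight, numSeconds):
    # Set the motors to the requested speeds
    TB.SetMotor1(driveRight * maxPower)
    TB.SetMotor2(driveLeft * maxPower)
    # Run the move for the requested time
    time.sleep(numSeconds)
    # Turn the motors off
    TB.MotorsOff()
    # New: wait for the robot to settle before the next command
    time.sleep(settleTime)
```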
If that does solve the problem you can try reducing the 1.0 second delay between moves until it starts becoming unreliable again. If it does not improve things at all then this is not the problem and we need to try something else instead :)
oleah
Wed, 02/28/2018 - 14:28
Thanks for your reply!
Thanks for your reply!
This might work and I will try it out. On the other hand, the robots are supposed to drive continuously, and if the second sleep is notably long then some other solution might be needed. If you know of any way the robot can adjust itself (a sensor or something else), it would be highly appreciated :)
il diavolo
Wed, 02/28/2018 - 22:30
My DiddyBorg V1 uses rotary
My DiddyBorg V1 uses rotary encoders (repurposed mouse scroll wheels) running on the middle wheels, one on each side, with a simple mathematical "differential". The results are not perfect but are better than the "dead reckoning" method that you are using.
I also have an XLoBorg which gives compass bearings, which I use to get reasonably accurate angular movements.
I also work with a low "throttle" (maxPower * 0.6); the slow speed allows the software time to work it all out, especially when using a camera and OpenCV to find and move towards a target.
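For reference, the kind of arithmetic such a two-wheel "differential" involves can be sketched as below. The tick length and wheel spacing are made-up example values, not measurements from any particular robot, so calibrate them yourself:

```python
import math

# Hypothetical calibration constants - measure these on your own robot
TICK_LENGTH_CM = 0.4     # distance travelled per encoder tick
WHEEL_SPACING_CM = 20.0  # distance between the left and right encoders

def update_pose(x, y, heading, left_ticks, right_ticks):
    """Dead-reckoning update from one pair of encoder tick counts.

    heading is in radians; x and y are in centimetres.
    """
    d_left = left_ticks * TICK_LENGTH_CM
    d_right = right_ticks * TICK_LENGTH_CM
    # Average travel of the two sides gives the forward distance
    distance = (d_left + d_right) / 2.0
    # Differential term: unequal wheel travel turns the robot
    heading += (d_right - d_left) / WHEEL_SPACING_CM
    x += distance * math.cos(heading)
    y += distance * math.sin(heading)
    return x, y, heading
```

A straight move gives equal tick counts on both sides and leaves the heading unchanged; a spin in place gives opposite counts and changes only the heading.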
oleah
Thu, 03/01/2018 - 10:18
Interesting! Is any of the
Interesting! Is any of the mentioned stuff open source? What kind of compass are you using? I have done some tests with this [1] but it seems to be too inaccurate for indoor use.
[1] https://www.seeedstudio.com/Grove-3-Axis-Digital-Compass-p-759.html
piborg
Thu, 03/01/2018 - 18:07
Do you have any photos?
If you have any photos of your customised DiddyBorg we would love to see them :)
il diavolo
Thu, 03/01/2018 - 22:35
Photos
I did post some photos on the forum a year or so ago but I can't remember where! Things have moved on since then anyway, I'll sort out some new ones soon.
Oleah, everything is based on modified PiBorg code and what can be found on Google; I can't claim much originality. The compass is on the PiBorg XLoBorg, but I had to work on the software to correct the output for variation and deviation. It was some time ago that I did this, and although I use the resulting code frequently to "aim" Diddy, my brain struggles to remember the details.
The same is true for the rotary encoders; both the compass and the encoders run in separate threads. I'll try to publish some de-cluttered scripts in a few days.
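As an illustration of the kind of variation/deviation correction described above, here is a sketch that turns raw X/Y magnetometer readings (such as the XLoBorg provides) into a bearing. The offsets and declination below are made-up example values; you would find the hard-iron offsets by spinning the robot in place and averaging the min/max of each raw axis, and look up the magnetic declination for your location:

```python
import math

# Hypothetical calibration values - replace with your own measurements
OFFSET_X = 12.0          # hard-iron offset on X ("deviation" correction)
OFFSET_Y = -7.5          # hard-iron offset on Y
DECLINATION_DEG = 1.5    # local magnetic declination ("variation")

def heading_degrees(raw_x, raw_y):
    """Heading in degrees (0-360) from raw X/Y field components.

    The zero direction and sense of rotation depend on how the
    magnetometer is mounted on the robot.
    """
    x = raw_x - OFFSET_X
    y = raw_y - OFFSET_Y
    heading = math.degrees(math.atan2(y, x)) + DECLINATION_DEG
    return heading % 360.0
```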
piborg
Fri, 03/02/2018 - 13:01
Found them :)
I had a bit of a hunt at lunchtime and found those older images :)
My DiddyBorg pictures
il diavolo
Fri, 03/02/2018 - 20:24
Photos of my DiddyBorg
Yes, thanks PiBorg, I remember them now. These are as Diddy is now; it looks pretty much the same, but the webcam is now fixed rather than on a servo. The four ultrasonic sensors are now all arrayed on the front rather than two each fore and aft. The Gertboard has gone and I use a stand-alone Darlington transistor array to switch the torches on and off.
Diddy's current task is to head towards a target (a coloured light) set at the same level as the webcam while avoiding low obstructions (empty coffee tins which the webcam can see over) using the ultrasonics. The compass is used to calculate the approximate bearing of the target if it gets out of sight after taking avoiding action so that Diddy knows which way to turn to get back on course.
I'm now working on the next task, which will be to identify an object or picture at the target. This will use the Pi camera.
The rotary encoders are now better and more reliably mounted than previously.
Software is mainly modified PiBorg scripts in Python 2.7 with OpenCV 3.3.0 for dealing with image recognition and shape following. The OS is Raspbian Stretch and the brain is a Pi 3B. Manual control is by a PS3 controller, and Diddy is set up as a WiFi access point and run headless via VNC, either to my Windows 10 laptop or my "PortaPi", the latter made up of a Pi 2B, an official 7" screen, 2 BattBorgs and a Kano wireless keyboard.
If there's any interest I can post examples of my code for dealing with the compass and the rotary encoders.