Subject: Re: Gear lash....
Newsgroups: lugnet.org.ca.rtltoronto
Date: Thu, 2 Mar 2006 18:32:26 GMT
In lugnet.org.ca.rtltoronto, Wayne Young wrote:
> > Can I ask a stupid question from a non-moving bot owner?  Why do you need
> > that much precision?  The light readings are going to be pretty variable,
> > and the robot targets and obstacles are all moving, so do you really need
> > that much precision to know the general direction you need to be going in?
> >
> > Calum
>
> Are you thinking the robot could adjust its direction on the fly? In my
> limited experience, precision matters when the robot is close to the TO,
> especially if it can cover a lot of ground in the time it takes for the
> sensors to come back with a reading (or to average a series of readings in
> my case).

But if you are "close" to the TO... then your precision is usually higher
anyway--less movement, less slop?

Here are some made-up numbers to illustrate what I don't understand...

Say I get a reading at time 0: a "contact" at 30 degrees. By the time I come
back to it at time 20, the contact is likely at 60 degrees. Why should it
matter whether, at time 20, I'm pointed at 31, 34, or for that matter 40? The
target's already at 60 degrees--that far off, any extra precision is useless?

Obviously the next step is to scan again and try to reacquire the light. Every
time I scan, I should be getting closer, and on the final "steps" my turning
accuracy should be higher, if only because I don't "lose" as much as I do on a
large turn. So there's no need for high accuracy on large scans (eg, 360's)?
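To put my made-up numbers into a toy calculation (mine, not anything from the
'bot itself): the lateral miss you get from a pointing error shrinks with range,
while the target's own motion between scans doesn't care how well you aimed.

```python
import math

def miss_distance(range_m, heading_err_deg):
    """Lateral offset (m) from aiming heading_err_deg off a target at range_m."""
    return range_m * math.sin(math.radians(heading_err_deg))

# Far away: a 5 degree turn error on a target 5 m out misses by ~0.44 m,
# but the target's own 30 degrees of bearing change between scans is
# worth ~2.5 m of offset--the turn error is noise by comparison.
far_err = miss_distance(5.0, 5)
target_motion = miss_distance(5.0, 30)

# Close in: the same 5 degree error at 0.5 m is only ~4 cm, and each
# re-scan shrinks the turn needed, so accuracy improves on its own.
near_err = miss_distance(0.5, 5)
```

Under these assumptions the numbers back the intuition: coarse aim is fine on
the big 360 sweeps, and precision only starts to matter in the last short hops.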

Am I smoking crack here?

BTW I deliberately did not build such a robot because I knew I wouldn't know how
to write the code to "process" the sensors and movement.  That's why I'm super
impressed with Dave's so far.

Since I'm taking the time to delurk, I'd like to give a shout out to David K. I
abandoned my previous 'bot platform (AB with castors) in favour of Dave's AB
skid steer with 4 driven wheels and 4 motors after reading his posts. The
castors were throwing off my aim, which made me wonder whether my software was
faulty. There's only one light sensor, and the algorithm was stop, scan, drive.
With the new platform, if the 'bot doesn't drive towards the light, I know it's
the software, instead of having to factor out the castors.
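In rough pseudocode, that stop/scan/drive loop looks something like this. The
spin, read_light, and drive helpers are hypothetical stand-ins for whatever the
firmware provides--this is a sketch of the idea, not the actual code on the 'bot:

```python
SCAN_STEP_DEG = 10   # coarse steps are fine on a full sweep (see above)

def scan_for_light(spin, read_light, sweep_deg=360):
    """Rotate in place, sampling the one light sensor at each step.
    Returns the heading (degrees, relative to where the sweep started)
    that gave the brightest reading.  A full sweep ends back at the
    starting heading."""
    best_heading, best_reading = 0, -1
    for heading in range(0, sweep_deg, SCAN_STEP_DEG):
        reading = read_light()
        if reading > best_reading:
            best_heading, best_reading = heading, reading
        spin(SCAN_STEP_DEG)
    return best_heading

def stop_scan_drive(spin, read_light, drive, bursts=3):
    """One-sensor chase loop: stop, sweep for the light, turn to the
    brightest heading, drive a short burst, then re-scan.  Short bursts
    keep the target's motion between scans small, which is where the
    precision argument above bites."""
    for _ in range(bursts):
        best = scan_for_light(spin, read_light)
        spin(best)   # sweep ended at the start heading, so turn to best
        drive(1)     # short forward burst, then stop and re-scan
```

With a skid-steer base, spin really does rotate in place, so the relative
heading from the sweep maps straight onto a turn--no castor slop to factor out.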

I'm really happy to hear people are excited about this contest!

Calum


