Subject: Re: Navigation using landmarks (Was: Re: lasers and RCX)
Newsgroups: lugnet.robotics
Date: Thu, 27 Jun 2002 19:49:16 GMT
Original-From: PeterBalch <PeterBalch@compuserve.#stopspam#com>
Steve

> ultrasonic beacons
> If you could measure the arrival time to within (say) a millisecond, then
> you'd know where you were to within about 30 centimeters. I guess that's
> do-able with ultrasound and an RCX programmed in C or machine code using
> LegOS or something.

If it were me, I'd program a PIC chip to do everything and then deliver the
results to the RCX. But I'm not a Lego purist.
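For scale, the 30 cm figure follows directly from the speed of sound; a minimal sketch (assuming roughly 343 m/s in air at room temperature):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C

def range_uncertainty_m(timing_uncertainty_s):
    """Distance uncertainty implied by a given arrival-time uncertainty."""
    return SPEED_OF_SOUND * timing_uncertainty_s

# a 1 ms timing resolution gives roughly a third of a metre
print(range_uncertainty_m(0.001))
```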

> I wonder how you estimate that error?

Tradition had it that with three radio beacons and three hyperbolae you'd
get three lines intersecting to form a triangle and the boat was most
likely inside the triangle. That was nonsense. You've a 50% chance of
being on either side of each line, so naively a 1/8 chance of being in the
triangle. (A more careful analysis of this "cocked hat" gives 1/4, since
the sides aren't independent of the errors - but either way, the boat is
probably outside.)
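This is easy to check numerically. The sketch below (my own construction, not from the thread) draws three position lines through a true position, offsets each by an independent Gaussian error, and counts how often the true position lands inside the resulting triangle. It comes out near 1/4 rather than 1/8: the true point is inside exactly when all three errors push the same way, which is two of the eight sign patterns. Either way, you're most likely outside.

```python
import math
import random

def cocked_hat_fraction(trials=20000, sigma=1.0, seed=1):
    """Fraction of trials in which the true position (the origin) lies
    inside the triangle formed by three error-displaced position lines."""
    rng = random.Random(seed)
    # three line normals 120 degrees apart; line i is n_i . x = e_i
    normals = [(math.cos(t), math.sin(t))
               for t in (0.0, 2 * math.pi / 3, 4 * math.pi / 3)]
    inside = 0
    for _ in range(trials):
        e = [rng.gauss(0.0, sigma) for _ in range(3)]
        ok = True
        for i in range(3):
            j, k = (i + 1) % 3, (i + 2) % 3
            (aj, bj), (ak, bk) = normals[j], normals[k]
            d = aj * bk - ak * bj
            # vertex opposite line i: intersection of lines j and k
            vx = (e[j] * bk - e[k] * bj) / d
            vy = (aj * e[k] - ak * e[j]) / d
            ai, bi = normals[i]
            # origin must be on the same side of line i as that vertex
            if (-e[i]) * (ai * vx + bi * vy - e[i]) <= 0:
                ok = False
                break
        inside += ok
    return inside / trials
```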

Estimating error and keeping track of it as you do calculations is
fiendishly complicated of course.

With radio Hyperfix, we used to keep a running average over the past N
fixes. My software watched where the ship was and tried to estimate what
the helmsman was doing - heading for port, seismic surveying or positioning
an oil-rig over a wellhead. It would adjust the algorithm parameters
accordingly. The helmsmen loved it but, unfortunately, the company lost the
documentation for the code. I'd long since left and no-one could work out
how it worked, so the same code was copied and translated line-for-line
through several generations of products, languages and operating systems.
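The running average itself is simple, even if the lost heuristics weren't. A minimal sketch of N-fix averaging - the window size n being the kind of parameter the software tuned, small while manoeuvring, large while holding station:

```python
from collections import deque

class RunningFix:
    """Average of the last n position fixes."""
    def __init__(self, n):
        self.buf = deque(maxlen=n)

    def update(self, x, y):
        """Add a new fix and return the smoothed position."""
        self.buf.append((x, y))
        k = len(self.buf)
        return (sum(p[0] for p in self.buf) / k,
                sum(p[1] for p in self.buf) / k)
```

A ship positioning over a wellhead wants a long window to kill jitter; one heading for port wants a short one so the smoothed fix doesn't lag the real position.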

> If you had the sound generation system emit an IR or radio pulse in time
> with each 'squeak', you could possibly get all the information you need
> with just two beacons.

More or less.

Two beacons plus IR gives you two circles of constant range (the IR pulse
marks the emission time, so each beacon gives an absolute range rather
than a hyperbola) with two intersections. You could arrange it so one of
the intersections was outside the room.
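Finding the two candidate positions is then a standard two-circle intersection. A sketch, assuming the beacon positions are known and the ranges come from the synchronised timing:

```python
import math

def circle_intersections(p0, r0, p1, r1):
    """Intersect two range circles (centre, radius).
    Returns 0, 1 or 2 candidate positions."""
    (x0, y0), (x1, y1) = p0, p1
    d = math.hypot(x1 - x0, y1 - y0)
    # no solution if the circles are concentric, too far apart, or nested
    if d == 0 or d > r0 + r1 or d < abs(r0 - r1):
        return []
    a = (r0 * r0 - r1 * r1 + d * d) / (2 * d)   # distance to chord midpoint
    h = math.sqrt(max(r0 * r0 - a * a, 0.0))    # half-chord length
    mx = x0 + a * (x1 - x0) / d
    my = y0 + a * (y1 - y0) / d
    ox = h * (y1 - y0) / d
    oy = h * (x0 - x1) / d
    pts = [(mx + ox, my + oy), (mx - ox, my - oy)]
    return pts if h > 0 else [pts[0]]
```

If one of the two returned points falls outside the room, the other is the robot.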

> How hard is it to reject sounds from sources other than the acoustic
> emitters?

With ultrasonics, probably not too hard. And you've got a good idea when to
expect each sound.

It might annoy the cat of course.

> we use acoustic head-tracking systems

Hmmm. What about all the other ways people keep track of human heads,
hands, etc.?

How about three large coils under the carpet? Each generates a small
magnetic field at a different frequency. The relative strengths of the
fields tell you where you are.
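A rough sketch of the idea, assuming each coil looks like a magnetic dipole (field amplitude falling off as 1/r^3) and a per-coil calibration constant k measured at a known distance - both assumptions of mine, not from the thread:

```python
def coil_range_m(amplitude, k):
    """Estimated distance to a coil from its measured field amplitude,
    assuming dipole falloff: amplitude = k / r**3."""
    return (k / amplitude) ** (1.0 / 3.0)
```

Three such ranges, one per frequency, and you can fix position the same way as with any other set of range measurements.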

Do you remember how bitpads used to work, with an x-y grid of magnetic
wires being activated in sequence?

Or a camera in the ceiling, a flashing LED on the robot, and the x-y
coordinates transmitted back to the robot via IR. (That's how we used to
keep track of robot hands when I did artificial intelligence research.)

A rotating barcode scanner and barcodes on the walls has been used several
times.


I hope everyone in this RCX group has read all Rodney Brooks's papers.
(They're available for download on the MIT website and are mandatory for
anyone interested in mobile robots.) He recommends that the robot doesn't
make a traditional map but learns the relationships between experiences.
"If I hit that corner where both antennae get touched then turn right and
go forward for three seconds I'll reach a wide open space with a lot of
light coming from the left."

So you build up a linked network of places you've been, what happened there
and how to get to other places from there. You definitely don't make an
internal model of the world.
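A minimal sketch of such a place network (my own construction in the spirit of the description, not Brooks's code): places are remembered experiences, edges are the actions that connected them, and a route is just a replay of actions - no coordinates anywhere.

```python
from collections import deque

class PlaceGraph:
    """Linked network of experienced places and the actions between them."""
    def __init__(self):
        self.edges = {}  # place -> list of (action, next_place)

    def connect(self, place, action, next_place):
        """Record that doing `action` at `place` led to `next_place`."""
        self.edges.setdefault(place, []).append((action, next_place))

    def route(self, start, goal):
        """Breadth-first search over remembered transitions;
        returns the list of actions to perform, or None."""
        frontier = deque([(start, [])])
        seen = {start}
        while frontier:
            place, actions = frontier.popleft()
            if place == goal:
                return actions
            for action, nxt in self.edges.get(place, []):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, actions + [action]))
        return None
```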

Brooks's papers "Intelligence without Representation" and "Elephants don't
play chess" explain the philosophy.

Peter

(I'm going to be out of touch for a while so I'm signing-off
unfortunately.)


