Subject: Re: direct manipulation of bits in RCX registers using NQC
Newsgroups: lugnet.robotics.rcx.nqc
Date: Fri, 16 Sep 2005 02:29:55 GMT

In lugnet.robotics.rcx.nqc, Mathew V. Jones wrote:


> I guess that "firmware" must be the code that tells
> RCX how to interpret user instructions (NQC, C, IC,
> opcodes, or whatever), and convert them to binary
> machine-specific code. Right?

   Correct. An NQC command (say, "Wait(10);") is converted to one or more
"bytecodes". In this case it becomes a single bytecode, a string of four bytes
(0x43 0x02 0x0a 0x00): the first byte (0x43) is the command, while the
following three carry the information that command needs (the last two bytes,
for instance, encode the "10" from the NQC source). These four bytes are not
valid machine-language instructions either. Instead the firmware (think of it
as the RCX's operating system, like Windows but much more stable) executes
these bytecode instructions one at a time, interpreting them as it goes. In
other words, it does not generate yet another layer of compiled code; it simply
does what the bytecode sequence asks, one step at a time.
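   The encoding above can be sketched in a few lines of Python. The 0x43 opcode
and the byte layout come straight from the example; treating the second byte as
a source-type field is my assumption for illustration, not something the post
spells out:

```python
import struct

def encode_wait(ticks):
    """Sketch of how Wait(n) might be turned into a bytecode:
    opcode 0x43, then a byte assumed to be a source-type field
    (0x02 here, matching the post's example), then the tick
    count as a 16-bit little-endian value."""
    return bytes([0x43, 0x02]) + struct.pack("<H", ticks)

print(encode_wait(10).hex(" "))  # 43 02 0a 00, as in the example above
```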

> The firmware version will dictate exactly what
> communication is and is not available between the
> user and the RCX.

   And a lot of other things, like what types of variables are available, how
many there are, how they are referenced, how fast things happen, etc.

> now my bot avoids flat white walls 1-20 cm away, but
> runs straight into 2 cm diameter chair legs), b) would
> be great to do some actual distance estimation instead
> of just turning away from everything that seems to be
> "in the way".

   Tough. First, all you have to go on is the strength of the reflection. You
can't distinguish between a bright or large object (both producing a powerful
reflection) far away and a smaller or less reflective object close. At least,
not with a single reading.
   Try this. When the robot detects an object, have it pan from side to side.
That way you can get the angular size of the object, and a little more
information: a wall will not have a tightly defined angular size, instead
returning some non-zero reflection even at angles other than "head on", while a
chair leg will have essentially zero return unless you are pointing a sensor
right at it... unless, of course, there's a wall behind it as well... can you
see why this gets tough?
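   A minimal sketch of that pan-and-classify idea, assuming canned reading
values; the detection threshold and the "wide enough to be a wall" cutoff are
illustrative numbers, not calibrated ones:

```python
def classify_scan(readings, threshold):
    """Given IR reflection readings taken at evenly spaced pan angles,
    count how many angles show a significant return. A wall reflects
    over a wide range of angles; a chair leg only when aimed right at it.
    The 0.6 width fraction is an arbitrary illustrative cutoff."""
    hits = sum(1 for r in readings if r > threshold)
    if hits / len(readings) > 0.6:
        return "wall-like (broad return)"
    elif hits > 0:
        return "narrow object (chair leg?)"
    return "nothing detected"

# Simulated pan over 9 angles:
print(classify_scan([5, 40, 42, 45, 44, 43, 41, 38, 6], threshold=20))  # wall-like
print(classify_scan([5, 4, 6, 48, 5, 4, 6, 5, 4], threshold=20))        # narrow object
```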
   As to distance estimation, that you can do better on, at least for walls. The
strength of the reflection alone isn't terribly accurate (is it a black wall? a
white wall? a wall at an angle?). But particularly if you are head-on to the
wall, the strength of the return* depends on the distance from the wall. So take
one reading at location X, and then move slightly forward a known amount and get
a second reading. If there's almost no change between them, the object is large
and far away. But if the second closer reading is much larger than the first
(say, double) and you only moved a couple of inches, then you must have been
very close to the wall in the first place. Does that make sense?
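   That two-reading trick can be turned into a rough formula if you assume the
return strength falls off as the inverse square of distance (a guess at the
sensor's behavior for this sketch, not something established above). Then
r2/r1 = (d/(d-step))^2, which solves to d = step / (1 - sqrt(r1/r2)):

```python
import math

def estimate_distance(r1, r2, step):
    """Estimate the initial distance to a wall from two head-on IR
    readings: r1 taken first, r2 after moving 'step' units closer.
    Assumes return strength ~ 1/distance**2, which is only a rough
    model; the real falloff would need to be calibrated."""
    if r2 <= r1:
        return float("inf")  # no increase: object is large and far away
    return step / (1.0 - math.sqrt(r1 / r2))

# The reading doubles after moving 2 inches closer:
d = estimate_distance(50, 100, 2.0)
print(round(d, 1))  # ~6.8 inches: we started quite close to the wall
```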

*yes, this depends on getting the "noise" level down rather low, either with a
good system, or by taking a lot of readings at each location to look for the
"brightest".
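   The footnote's "take a lot of readings and keep the brightest" approach is
easy to sketch; read_sensor here is a hypothetical stand-in for whatever raw
sensor-read routine you have:

```python
def max_of_n(read_sensor, n=10):
    """Take n raw readings at one location and keep the brightest,
    pushing the effective noise floor down as the footnote suggests."""
    return max(read_sensor() for _ in range(n))

# Demo with canned noisy readings:
samples = iter([37, 41, 36, 44, 39, 42, 38, 40, 43, 37])
print(max_of_n(lambda: next(samples)))  # 44, the brightest of the ten
```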

> I also tried the 2x1 beam, it blanked off the lamp,
> but the baseline sensor reading didn't change much.

   No. But it also dramatically decreases the field of view of the sensor (now
the return has to get down the "tunnel" of the 2x1 beam, so only "head on"
reflections get in). For non-wall-like objects, this can actually make the
return *decrease* at close range (which can, again, be useful).
   Once I had a working system, I tried shielding the phototransistor from the
LED, masking with a 1x2, and various "filters" (transparent LEGO bricks &
windows; I was going for a LEGO-only solution if possible). Everything I tried
reduced the accuracy and efficiency. Everything.

> I'll keep plugging away, and let y'all know how I get on.

   Enjoy. It's a great problem.

--
Brian Davis


