Subject: Re: Navigation using landmarks (Was: Re: lasers and RCX)
Newsgroups: lugnet.robotics
Date: Sun, 30 Jun 2002 03:53:46 GMT
Original-From: Steve Baker <sjbaker1@airmail.SPAMCAKEnet>
Reply-To: sjbaker1@airmail.NOSPAMnet
  
Steve Baker wrote:

> I was wondering whether one of those new fancy optical mice which use a
> tiny camera to watch the motion of your desktop might be adapted with a
> suitable lens to track the motion of a robot.

So, according to the Agilent Technologies web site, these sensors are tiny
cameras that take 1,500 pictures per second (!) - they claim 400 'clicks'
per inch at speeds up to 12 inches per second.

That seems pretty good for a robot - not many will need to travel faster than
12"/sec, and 1/400th-inch precision is clearly overkill.

I guess you could give it an alternative lens to reduce the resolution and
increase the maximum speed in proportion - but making the right lens for
this thing and getting the optical alignment right could be tricky.
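
To put rough numbers on that proportionality (again, my own illustration
rather than anything from Agilent): a lens that images twice as large a patch
of floor onto the same sensor should roughly halve the resolution and double
the speed limit,

   resolution: 400 / 2 = 200 counts per inch
   max speed : 12 x 2  =  24 inches per second

i.e. the maximum count rate out of the chip stays about the same - the lens
just changes what each count is worth.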

It needs a specific red LED to illuminate the scene.

The biggest problem seems to be that the optics that come with the chip
expect a 2.4 millimeter separation between the bottom of the optics and
the surface it's sensing.  My son's PC has an optical mouse (I don't know
whether it uses the Agilent chipset); I'd say you could maybe get to half a
centimeter between the bottom of the actual lens and a hardwood floor and
still get some kind of readout.

Ground clearance is definitely going to be a problem. You'd maybe have
to mount it on a sprung skid that would keep it correctly aligned to the
ground.

A better lens would definitely help - you could increase both the ground
clearance and the maximum robot speed - at the cost of resolution.  But
we don't need 1/400th inch precision so that's no problem.

So, I guess the next issue is how to extract the data from the device and
get it into the RCX.  The chip produces quadrature signals - so it ought
to be possible.
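
For the decoding side, here's a minimal sketch of the usual quadrature state
machine in C.  The read_phase_a()/read_phase_b() functions are hypothetical
placeholders for however the two quadrature lines end up being sampled - on a
real RCX you'd need custom sensor electronics or replacement firmware to poll
anywhere near fast enough - so treat this as the decoding idea only:

   /* One axis of quadrature decoding.  The state is the 2-bit value
    * (A << 1) | B; the table maps (previous state, new state) to a
    * step of -1, 0 or +1.  Invalid jumps count as 0. */

   static int  prev_state = 0;
   static long position   = 0;

   static const int quad_table[16] = {
        0, +1, -1,  0,
       -1,  0,  0, +1,
       +1,  0,  0, -1,
        0, -1, +1,  0
   };

   extern int read_phase_a(void);   /* hypothetical: sample line A */
   extern int read_phase_b(void);   /* hypothetical: sample line B */

   void quad_poll(void)
   {
       int state = (read_phase_a() << 1) | read_phase_b();
       position += quad_table[(prev_state << 2) | state];
       prev_state = state;
   }

Call quad_poll() often enough that the surface can't move more than one count
between calls, and 'position' tracks that axis's displacement in counts.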

Then you'd want to use two of these devices to derive rotation information
as well as forward and lateral speed.
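
A rough sketch of how the two readings could be combined (all assumptions
mine: the two sensors sit a known distance apart across the robot, sensor 1
on the left and sensor 2 on the right, both with their x axis pointing
forward, and the counts have already been converted to real units):

   #include <math.h>

   #define BASELINE 0.10   /* metres between the two sensors (assumed) */

   /* dx/dy are the displacements since the last update from the left (1)
    * and right (2) sensors, in the robot's own frame.  Small-motion
    * approximation: translation is the average of the two displacements,
    * rotation is the difference in forward motion over the baseline. */
   void update_pose(double dx1, double dy1, double dx2, double dy2,
                    double *x, double *y, double *heading)
   {
       double dxm    = 0.5 * (dx1 + dx2);
       double dym    = 0.5 * (dy1 + dy2);
       double dtheta = (dx2 - dx1) / BASELINE;

       /* Rotate the body-frame displacement into world coordinates,
        * then advance the heading. */
       *x       += dxm * cos(*heading) - dym * sin(*heading);
       *y       += dxm * sin(*heading) + dym * cos(*heading);
       *heading += dtheta;
   }

Floating point like this wouldn't fly on the RCX itself, but it shows the
geometry.  The nice part compared with wheel encoders is that the lateral
(dy) component comes for free, so sideways slip shows up in the position
estimate instead of silently corrupting it.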

----------------------------- Steve Baker -------------------------------
Mail : <sjbaker1@airmail.net>   WorkMail: <sjbaker@link.com>
URLs : http://www.sjbaker.org
        http://plib.sf.net http://tuxaqfh.sf.net http://tuxkart.sf.net
        http://prettypoly.sf.net http://freeglut.sf.net
        http://toobular.sf.net   http://lodestone.sf.net


