Subject: Re: Brickshelf Backup Crawler for Mac
Newsgroups: lugnet.general, lugnet.cad.dev.mac
Date: Wed, 18 Jul 2007 16:28:14 GMT

> > It requires wget (which Mac OS X doesn’t ship with), so it would have to be distributed in a DMG for Mac users.
>
> On the other hand, Mac OS X does come with curl, so maybe it could be adapted to use that.

There are many instructions for installing wget, and even some installers. But curl can’t beat ‘wget -m’, AFAIK.
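For context, ‘wget -m’ gives you recursive mirroring, timestamping, and unlimited recursion depth with one flag, while curl has no recursive mode at all, so a curl-based crawler has to fetch each page, pull out the links itself, and recurse. A rough, untested sketch of the difference (the link-extraction pattern is a guess, not something from any of the scripts discussed here):

# wget: recursive mirroring is built in
wget -m "http://www.brickshelf.com/cgi-bin/gallery.cgi?f=169512"

# curl: fetch a page, scrape its links, then fetch each one by hand
# (assumes site-relative links; a real crawler would need to handle more)
curl -s "http://www.brickshelf.com/cgi-bin/gallery.cgi?f=169512" \
  | grep -o 'href="[^"]*"' \
  | sed 's/^href="//;s/"$//' \
  | while read -r link; do
      curl -s -O "http://www.brickshelf.com$link"
    done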

> Speaking of hosting LDraw files: what exactly is the story with the “Official Model Repository”? Is there any such actual repository? If not, would this be a good time for the LDraw community to implement the idea? (Forgive my ignorance if this topic has been resolved one way or the other; I’m just curious.)

No official repository, AFAIK. Last I heard, we want to build it, but that requires a developer with the time to do so!

> Anyway, Brickshelf must be going out with a bang as far as bandwidth is concerned.

Does anyone know the chances of just taking all the Brickshelf files and hosting them somewhere else? Not Brickshelf’s functionality, just the files? That way they could be left up much longer than July 31st.

James

Subject: Re: Brickshelf Backup Crawler for Mac
Newsgroups: lugnet.general, lugnet.cad.dev.mac
Date: Wed, 18 Jul 2007 17:28:25 GMT

In lugnet.general, James Reynolds wrote:
> > > It requires wget (which Mac OS X doesn’t ship with), so it would have to be distributed in a DMG for Mac users.
> >
> > On the other hand, Mac OS X does come with curl, so maybe it could be adapted to use that.
>
> There are many instructions for installing wget, and even some installers. But curl can’t beat ‘wget -m’, AFAIK.

I didn’t have any problem compiling and installing wget (although I did have to change the default install path by hand from /usr/local to /opt/local so that it would be in my pre-existing path). However, I suspect that most Mac users would be averse to doing that. Having said that, here is Peter’s sh script (all 844 bytes of it):

http://www.halibut.com/~tcobbs/ldraw/private/bsbackup.sh

To use the above script, run it with the URL of the gallery you want backed up as the only command-line argument. I think it will only work if the script is in the current directory when you run it. Also, it seems to only back up images (not sure why that is). Make sure to put quotes around the URL on the command line, since Brickshelf URLs usually include a question mark, which the shell will try to interpret.

Example:

./bsbackup.sh "http://www.brickshelf.com/cgi-bin/gallery.cgi?f=169512"

As mentioned, you must have wget installed on your system (it is there by default on most Linux distros). This script doesn’t use the -m option, and could fairly easily be modified to work with curl. I haven’t done that, though, and don’t plan to.
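For reference, the basic fetch-a-URL-into-a-file forms of the two tools map directly onto each other, which is most of what such a port would involve. These are generic invocations from the two tools’ manuals, not lines from Peter’s actual script:

# wget: quiet fetch of a URL into a named file
wget -q -O page.html "$url"

# curl equivalent: silent fetch into a named file
curl -s -o page.html "$url"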

--Travis

Subject: Re: Brickshelf Backup Crawler for Mac
Newsgroups: lugnet.general, lugnet.cad.dev.mac
Date: Wed, 18 Jul 2007 17:53:48 GMT

In lugnet.general, Travis Cobbs wrote:
> Also, it seems to only back up images (not sure why that is).

OK, if you download and run the script now, it should back up all files (not just images).

--Travis

Subject: Re: Brickshelf Backup Crawler for Mac
Newsgroups: lugnet.general, lugnet.cad.dev.mac
Date: Wed, 18 Jul 2007 19:08:45 GMT

In lugnet.general, Travis Cobbs wrote:

> here is Peter’s sh script (all 844 bytes of it):
> http://www.halibut.com/~tcobbs/ldraw/private/bsbackup.sh

Boy, that is a lot more compact than mine!

> Also, it seems to only back up images (not sure why that is).

It looks like it’s only considering links to /cgi-bin/gallery.cgi. However, links to non-image files from the Brickshelf folder pages point directly to /gallery/username/foo/bar.dat. (Ah, I see you have fixed this.)
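In other words, a complete crawler has to harvest two kinds of hrefs from each folder page: gallery.cgi links to recurse into, and direct /gallery/... links to download. A sketch of that split (the patterns are guesses at Brickshelf’s markup, not lines from either script):

# sub-folders and image pages are linked through the CGI
wget -q -O - "$url" | grep -o 'href="/cgi-bin/gallery\.cgi[^"]*"'

# non-image files (bar.dat and the like) are linked directly
wget -q -O - "$url" | grep -o 'href="/gallery/[^"]*"'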

> Make sure to put quotes around the URL on the command line, since Brickshelf URLs usually include a question mark, which the shell will try to interpret.

That advice probably applies to my script’s -url option as well.

> As mentioned, you must have wget installed on your system (which is there by default on most Linux distros).

For anyone interested, I can confirm that wget 1.10.2 builds on my Mac with the following:

./configure
make
make install

As Travis noted, though, if you don’t have a /usr/local directory (I happen to have one), you may want to configure a different installation location.
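The install location can also be set at configure time instead of by hand; --prefix is standard autoconf behaviour, so it should work for wget 1.10.2 too:

./configure --prefix=/opt/local
make
make install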

For what it’s worth, I just did an informal comparison between this script and mine. I timed the download of my account with each script. Keep in mind that network variations and in particular Brickshelf’s heavy traffic make this a relative comparison, not an absolute measure.

cd ~/bsbackup-test
time ./bsbackup.sh "http://www.brickshelf.com/cgi-bin/gallery.cgi?m=anoved"

This took 24:32 (using the initial version, which did not retrieve my LDraw files).

time bscrawl.tcl -url "http://www.brickshelf.com/cgi-bin/gallery.cgi?m=anoved" -dir ~/bscrawl-test -pause 0.01

This took 31:20 to download all my files.

So, it looks like your script is a bit faster as well as a bit smaller.

Jim

Subject: Re: Brickshelf Backup Crawler for Mac
Newsgroups: lugnet.general, lugnet.cad.dev.mac
Date: Wed, 18 Jul 2007 20:17:57 GMT

In lugnet.general, Jim DeVona wrote:

> For what it’s worth, I just did an informal comparison between this script and mine. I timed the download of my account with each script. Keep in mind that network variations and in particular Brickshelf’s heavy traffic make this a relative comparison, not an absolute measure.
>
> cd ~/bsbackup-test
> time ./bsbackup.sh "http://www.brickshelf.com/cgi-bin/gallery.cgi?m=anoved"
>
> This took 24:32 (using the initial version, which did not retrieve my LDraw files).
>
> time bscrawl.tcl -url "http://www.brickshelf.com/cgi-bin/gallery.cgi?m=anoved" -dir ~/bscrawl-test -pause 0.01
>
> This took 31:20 to download all my files.
>
> So, it looks like your script is a bit faster as well as a bit smaller.

Of course, I just tried bsbackup.sh again (the current version, which gets all files) and it took 32:05. There’s a little QuickTime movie in there that probably accounts for most of the difference, as the MPD files are quite small. So the actual performance difference may be negligible.

And now I’ll stop wasting bandwidth.

Cheers,

Jim

 
