Subject: Re: Brickshelf Backup Crawler for Mac
Newsgroups: lugnet.general, lugnet.cad.dev.mac
Date: Tue, 17 Jul 2007 16:33:14 GMT
Viewed: 12550 times

Thanks for that Jim. I got the script working for user backup, and Peter Bartfai (the author of the original script) also got one working for user backup (before seeing my email that I’d gotten it working). However, while I haven’t looked at Peter’s yet, mine definitely appears more difficult to use than yours. It requires wget (which Mac OSX doesn’t ship with), so it would have to be distributed in a DMG for Mac users.

I’ll probably post the scripts somewhere anyway, so that at least Linux people can use them if they so desire.

The original script grabbed all .ldr, .mpd, and .dat files from all of Brickshelf, and I ran that last night, so at least none of the CAD models will be lost. I don’t really have the ability to host them at the moment, but I may make them available at some point in the future.
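
Roughly speaking, that kind of crawl boils down to wget’s recursive options. As a sketch only (the starting URL below is just a placeholder, not the actual invocation the script uses):

# Rough sketch: recursive crawl that keeps only .ldr, .mpd and .dat files
wget -r -np -l inf -A "ldr,mpd,dat" "http://www.brickshelf.com/gallery/"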

--Travis

Subject: Re: Brickshelf Backup Crawler for Mac
Newsgroups: lugnet.general, lugnet.cad.dev.mac
Date: Tue, 17 Jul 2007 17:47:21 GMT
Viewed: 12728 times

In lugnet.general, Travis Cobbs wrote:

   Thanks for that Jim. I got the script working for user backup, and Peter Bartfai (the author of the original script) also got one working for user backup (before seeing my email that I’d gotten it working). However, while I haven’t looked at Peter’s yet, mine definitely appears more difficult to use than yours.

Well, I had the advantage of writing it from scratch specifically for this purpose. It’s funny because I had been thinking about writing a Brickshelf upload utility (like Flickr Uploadr), but circumstances resulted in a download utility instead.

   It requires wget (which Mac OSX doesn’t ship with), so it would have to be distributed in a DMG for Mac users.

On the other hand, Mac OS X does come with curl, so maybe it could be adapted to use that.
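
For a single download the two are pretty much interchangeable; for example (the file name and URL here are just for illustration):

# Fetch one gallery page with wget vs. curl
wget -q -O page.html "http://www.brickshelf.com/cgi-bin/gallery.cgi?f=169512"
curl -s -o page.html "http://www.brickshelf.com/cgi-bin/gallery.cgi?f=169512"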

   I’ll probably post the scripts somewhere anyway, so that at least Linux people can use them if they so desire.

OK. The more the merrier! (I think that Tcl and tcllib come with some Linux distributions, too, but I don’t know for sure, so a plain shell script could be a useful alternative.)

   The original script grabbed all .ldr, .mpd, and .dat files from all of Brickshelf, and I ran that last night, so at least none of the CAD models will be lost. I don’t really have the ability to host them at the moment, but I may make them available at some point in the future.

Wow! I can see how that would be a useful script for testing LDView, too.

Speaking of hosting LDraw files - what exactly is the story with the “Official Model Repository”? Is there any such actual repository? If not, would this be a good time for the LDraw community to implement the idea? (Forgive my ignorance if this topic has been resolved one way or the other - I’m just curious.)

Anyway, Brickshelf must be going out with a bang as far as bandwidth is concerned.

Jim

Subject: Re: Brickshelf Backup Crawler for Mac
Newsgroups: lugnet.general, lugnet.cad.dev.mac
Date: Wed, 18 Jul 2007 16:28:14 GMT
Viewed: 14009 times

  
   It requires wget (which Mac OSX doesn’t ship with), so it would have to be distributed in a DMG for Mac users.

   On the other hand, Mac OS X does come with curl, so maybe it could be adapted to use that.

There are many instructions on how to install wget, and even some installers. curl can’t beat ‘wget -m’, afaik.
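
For reference, -m is wget’s mirror mode, which is shorthand for recursive retrieval with timestamping and unlimited depth (-r -N -l inf --no-remove-listing), e.g.:

# Mirror a whole gallery; the username is just a placeholder
wget -m "http://www.brickshelf.com/cgi-bin/gallery.cgi?m=username"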

   Speaking of hosting LDraw files - what exactly is the story with the “Official Model Repository”? Is there any such actual repository? If not, would this be a good time for the LDraw community to implement the idea? (Forgive my ignorance if this topic has been resolved one way or the other - I’m just curious.)

No official repository, afaik. Last I heard, we want to build one, but that requires a developer with the time to build it!

   Anyway, Brickshelf must be going out with a bang as far as bandwidth is concerned.

Does anyone know the chances of just taking all the Brickshelf files and hosting them somewhere else? Not Brickshelf’s functionality, just the files? That way they can be left up for much longer than July 31st.

James

Subject: Re: Brickshelf Backup Crawler for Mac
Newsgroups: lugnet.general, lugnet.cad.dev.mac
Date: Wed, 18 Jul 2007 17:28:25 GMT
Viewed: 15603 times

In lugnet.general, James Reynolds wrote:
  
  
   It requires wget (which Mac OSX doesn’t ship with), so it would have to be distributed in a DMG for Mac users.

   On the other hand, Mac OS X does come with curl, so maybe it could be adapted to use that.

   There are many instructions on how to install wget, and even some installers. curl can’t beat ‘wget -m’, afaik.

I didn’t have any problem compiling and installing wget (although I did have to change the default install path by hand from /usr/local to /opt/local so that it would be in my pre-existing path). However, I suspect that most Mac users would be averse to doing that. Having said that, here is Peter’s sh script (all 844 bytes of it):

http://www.halibut.com/~tcobbs/ldraw/private/bsbackup.sh

To use the above script, run it with the URL of the gallery you want backed up as the only argument on the command line. I think it will only work if the script is in the current directory when you run it. Also, it seems to only back up images (not sure why that is). Make sure to put quotes around the URL on the command line, since Brickshelf URLs usually include a question mark, which the shell will try to interpret.

Example:
./bsbackup.sh "http://www.brickshelf.com/cgi-bin/gallery.cgi?f=169512"
As mentioned, you must have wget installed on your system (which is there by default on most Linux distros). This script doesn’t use the -m option, and could be (fairly easily) modified to work with curl. I haven’t done that, though, and don’t plan to.
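
For anyone who wants to try, the change would amount to swapping the wget call for a curl one; a hypothetical helper along these lines could pick whichever tool is installed (this is just a sketch, not how the script is actually written):

# Hypothetical helper: use wget if available, otherwise fall back to curl
fetch() {
    url="$1"
    out="$2"
    if command -v wget >/dev/null 2>&1; then
        wget -q -O "$out" "$url"
    else
        curl -s -o "$out" "$url"
    fi
}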

--Travis

Subject: Re: Brickshelf Backup Crawler for Mac
Newsgroups: lugnet.general, lugnet.cad.dev.mac
Date: Wed, 18 Jul 2007 17:53:48 GMT
Viewed: 14130 times

In lugnet.general, Travis Cobbs wrote:
   it’s in the current directory when you run it. Also, it seems to only back up images (not sure why that is). Make sure to put quotes around the URL on

OK, if you download and run the script now, it should back up all files (not just images).

--Travis

Subject: Re: Brickshelf Backup Crawler for Mac
Newsgroups: lugnet.general, lugnet.cad.dev.mac
Date: Wed, 18 Jul 2007 19:08:45 GMT
Viewed: 15924 times

In lugnet.general, Travis Cobbs wrote:

   here is Peter’s sh script (all 844 bytes of it): http://www.halibut.com/~tcobbs/ldraw/private/bsbackup.sh

Boy, that is a lot more compact than mine!

   Also, it seems to only back up images (not sure why that is).

It looks like it’s only considering links to /cgi-bin/gallery.cgi. However, links to non-image files from the Brickshelf folder pages point directly to /gallery/username/foo/bar.dat. (Ah, I see you have fixed this.)
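
To illustrate the two link forms, something like the following would pull both kinds out of a saved folder page (page.html and the patterns are just for illustration, not what the script actually does):

# Links to sub-galleries vs. direct links to files
grep -o 'href="/cgi-bin/gallery.cgi?[^"]*"' page.html
grep -o 'href="/gallery/[^"]*"' page.html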

   Make sure to put quotes around the URL on the command line, since Brickshelf URLs usually include a question mark, which the shell will try to interpret.

That advice probably stands for my script’s -url option as well.

   As mentioned, you must have wget installed on your system (which is there by default on most Linux distros).

For anyone interested, I can confirm that wget 1.10.2 builds on my Mac with the following:

./configure
make
make install

As Travis noted, though, if you don’t have a /usr/local directory (as I do) you may want to configure a different installation location.
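
If you do need a different location, the usual way is to pass a prefix to configure rather than editing paths by hand (the path below is just an example):

./configure --prefix=/opt/local
make
make install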

For what it’s worth, I just did an informal comparison between this script and mine. I timed the download of my account with each script. Keep in mind that network variations and in particular Brickshelf’s heavy traffic make this a relative comparison, not an absolute measure.

cd ~/bsbackup-test
time ./bsbackup.sh "http://www.brickshelf.com/cgi-bin/gallery.cgi?m=anoved"

This took 24:32 (using the initial version which did not retrieve my LDraw files).

time bscrawl.tcl -url  "http://www.brickshelf.com/cgi-bin/gallery.cgi?m=anoved" -dir ~/bscrawl-test  -pause 0.01

This took 31:20 to download all my files.

So, it looks like your script is a bit faster as well as a bit smaller.

Jim

Subject: Re: Brickshelf Backup Crawler for Mac
Newsgroups: lugnet.general, lugnet.cad.dev.mac
Date: Wed, 18 Jul 2007 20:17:57 GMT
Viewed: 14636 times

In lugnet.general, Jim DeVona wrote:

   For what it’s worth, I just did an informal comparison between this script and mine. I timed the download of my account with each script. Keep in mind that network variations and in particular Brickshelf’s heavy traffic make this a relative comparison, not an absolute measure.

   cd ~/bsbackup-test
   time ./bsbackup.sh "http://www.brickshelf.com/cgi-bin/gallery.cgi?m=anoved"

   This took 24:32 (using the initial version which did not retrieve my LDraw files).

   time bscrawl.tcl -url "http://www.brickshelf.com/cgi-bin/gallery.cgi?m=anoved" -dir ~/bscrawl-test -pause 0.01

   This took 31:20 to download all my files.

   So, it looks like your script is a bit faster as well as a bit smaller.

Of course, I just tried bsbackup.sh again - the current version that gets all files - and it took 32:05. There’s a little QuickTime movie in there that probably accounts for most of the difference, as the MPD files are quite small. So the actual performance difference may be negligible.

And now I’ll stop wasting bandwidth.

Cheers,

Jim
