|
In lugnet.general, James Reynolds wrote:
|
|
|
It requires wget (which Mac OS X doesn't ship with), so it would
have to be distributed in a DMG for Mac users.
|
On the other hand, Mac OS X does come with
curl, so maybe it could be
adapted to use that.
|
There are many sets of instructions for installing wget, and even some installers.
curl can't beat wget -m, AFAIK.
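For context, `wget -m` ("mirror") recursively follows the links it finds, while curl only fetches the URLs it is given, so a curl-based equivalent needs its own link extraction. A rough sketch of that missing piece (the sed pattern is illustrative only, not from any of the scripts discussed, and only catches the last href on each line):

```shell
#!/bin/sh
# extract_hrefs: list the href targets in HTML read from stdin.
# `wget -m` follows these links itself; a curl-based mirror would
# have to extract them and fetch each one in a loop.
# (Only the last href on a line is caught; real HTML is messier.)
extract_hrefs() {
    sed -n 's/.*href="\([^"]*\)".*/\1/p'
}

# Canned example, no network needed:
printf '<a href="/gallery/foo/bar.dat">bar</a>\n' | extract_hrefs
```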
|
I didn't have any problem compiling and installing wget (although I did have to
change the default install path by hand from /usr/local to /opt/local so that it
would be in my pre-existing path). However, I suspect that most Mac users would
be averse to doing that. Having said that, here is Peter's sh script (all 844
bytes of it):
http://www.halibut.com/~tcobbs/ldraw/private/bsbackup.sh
To use the above script, run it with the URL of the gallery you want backed up
as the only argument on the command line. I think it will only work if it's in
the current directory when you run it. Also, it seems to only back up images
(not sure why that is). Make sure to put quotes around the URL on the command
line, since Brickshelf URLs usually include a question mark, which the shell
will try to interpret.
Example:
./bsbackup.sh "http://www.brickshelf.com/cgi-bin/gallery.cgi?f=169512"
As mentioned, you must have wget installed on your system (which is there by
default on most Linux distros). This script doesn't use the -m option, and
could be (fairly easily) modified to work with curl. I haven't done that,
though, and don't plan to.
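For anyone tempted to try, the single-file fetches are the easy part of such a port: `wget URL` corresponds roughly to `curl -s -L -o FILE URL` (curl's `-O`, which derives the filename from the URL, goes wrong with Brickshelf's `?f=...` query strings, so naming the output explicitly is safer). The recursive walk over folder pages would still have to be written by hand. A minimal sketch of the fetch step only:

```shell
#!/bin/sh
# fetch_to URL FILE: curl equivalent of a single wget download.
# -s silent, -L follow redirects, -o explicit output filename.
fetch_to() {
    curl -s -L -o "$2" "$1"
}

# Demo with a local file:// URL so no network is required:
printf 'hello\n' > /tmp/bsdemo_src
fetch_to "file:///tmp/bsdemo_src" /tmp/bsdemo_dst
cat /tmp/bsdemo_dst
```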
--Travis
|
|
|
In lugnet.general, Travis Cobbs wrote:
|
it's in the current directory when you run it. Also, it seems to only back
up images (not sure why that is). Make sure to put quotes around the URL on
|
OK, if you download and run the script now, it should back up all files (not
just images).
--Travis
|
|
|
In lugnet.general, Travis Cobbs wrote:
Boy, that is a lot more compact than mine!
|
Also, it seems to only back up images (not sure why that is).
|
It looks like it's only considering links to /cgi-bin/gallery.cgi. However,
links to non-image files from the Brickshelf folder pages point directly to
/gallery/username/foo/bar.dat. (Ah, I see you have fixed this.)
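In other words, a crawler has to treat the two link shapes differently: `/cgi-bin/gallery.cgi?...` links lead to further folder pages (recurse into them), while `/gallery/...` links are the files themselves (download them). A toy illustration of that split, using canned input rather than a live page:

```shell
#!/bin/sh
# classify: tag Brickshelf-style hrefs as folder pages or files.
classify() {
    while read -r href; do
        case $href in
            /cgi-bin/gallery.cgi*) echo "FOLDER $href" ;;
            /gallery/*)            echo "FILE $href" ;;
        esac
    done
}

printf '%s\n' '/cgi-bin/gallery.cgi?f=169512' \
              '/gallery/username/foo/bar.dat' | classify
```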
|
Make sure to put quotes around the URL on
the command line, since brickshelf URLs usually include a question mark,
which the shell will try to interpret.
|
That advice probably stands for my script's -url option as well.
|
As mentioned, you must have wget installed on your system (which is there by
default on most Linux distros).
|
For anyone interested, I can confirm that wget 1.10.2 builds on my Mac with the following:
./configure
make
make install
|
|
As Travis noted, though, if you don't have a /usr/local directory (as I do)
you may want to configure a different installation location.
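For the record, the install location can be changed with configure's standard `--prefix` switch rather than by editing paths by hand; to reproduce Travis's /opt/local layout, the build would look like:

```shell
./configure --prefix=/opt/local
make
make install   # may need sudo, depending on who owns /opt/local
```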
For what it's worth, I just did an informal comparison between this script and
mine. I timed the download of my account with each script. Keep in mind that
network variations and in particular Brickshelf's heavy traffic make this a
relative comparison, not an absolute measure.
cd ~/bsbackup-test
time ./bsbackup.sh "http://www.brickshelf.com/cgi-bin/gallery.cgi?m=anoved"
|
|
This took 24:32 (using the initial version which did not retrieve my LDraw
files).
time bscrawl.tcl -url "http://www.brickshelf.com/cgi-bin/gallery.cgi?m=anoved" -dir ~/bscrawl-test -pause 0.01
|
|
This took 31:20 to download all my files.
So, it looks like your script is a bit faster as well as a bit smaller.
Jim
|
|
|
In lugnet.general, Jim DeVona wrote:
|
For what it's worth, I just did an informal comparison between this script
and mine. I timed the download of my account with each script. Keep in mind
that network variations and in particular Brickshelf's heavy traffic make
this a relative comparison, not an absolute measure.
cd ~/bsbackup-test
time ./bsbackup.sh "http://www.brickshelf.com/cgi-bin/gallery.cgi?m=anoved"
|
|
This took 24:32 (using the initial version which did not retrieve my LDraw
files).
time bscrawl.tcl -url "http://www.brickshelf.com/cgi-bin/gallery.cgi?m=anoved" -dir ~/bscrawl-test -pause 0.01
|
|
This took 31:20 to download all my files.
So, it looks like your script is a bit faster as well as a bit smaller.
|
Of course, I just tried bsbackup.sh again - the current version that gets all
files - and it took 32:05. There's a little QuickTime movie in there that
probably accounts for most of the difference, as the MPD files are quite small.
So the actual performance difference may be negligible.
And now I'll stop wasting bandwidth.
Cheers,
Jim
|
|
|