|
In lugnet.general, Travis Cobbs wrote:
|
Thanks for that, Jim. I got the script working for user backup, and Peter
Bartfai (the author of the original script) also got one working for user
backup (before seeing my email that I'd gotten it working). However, while I
haven't looked at Peter's yet, mine definitely appears more difficult to use
than yours.
|
Well, I had the advantage of writing it from scratch specifically for this
purpose. It's funny, because I had been thinking about writing a Brickshelf
upload utility (like Flickr Uploadr), but
circumstances resulted in a download utility instead.
|
It requires wget (which Mac OS X doesn't ship with), so it would
have to be distributed in a DMG for Mac users.
|
On the other hand, Mac OS X does come with
curl, so maybe it could be
adapted to use that.
|
I'll probably post the scripts somewhere anyway, so that at least Linux
people can use them if they so desire.
|
OK. The more the merrier! (I think that Tcl and
tcllib come with some Linux distributions,
too, but I don't know for sure, so a plain shell script could be a useful
alternative.)
|
The original script grabbed all .ldr, .mpd, and .dat files from all of
Brickshelf, and I ran that last night, so at least none of the CAD models
will be lost. I don't really have the ability to host them at the moment,
but I may make them available at some point in the future.
|
Wow! I can see how that would be a useful script for testing LDView, too.
Speaking of hosting LDraw files - what exactly is the story with the
Official Model Repository? Is there any
such actual repository? If not, would this be a good time for the LDraw
community to implement the idea? (Forgive my ignorance if this topic has been
resolved one way or the other - I'm just curious.)
Anyway, Brickshelf must be going out with a bang as far as bandwidth is
concerned.
Jim
|
|
|
|
|
It requires wget (which Mac OS X doesn't ship with), so it would
have to be distributed in a DMG for Mac users.
|
On the other hand, Mac OS X does come with
curl, so maybe it could be
adapted to use that.
|
There are many instructions on how to install wget, and even some installers. curl
can't beat wget -m afaik.
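For readers who haven't used it, -m is wget's mirror mode (shorthand for recursive download with timestamping and unlimited depth). A sketch of what an invocation might look like; the username in the URL is a made-up example, and you'd likely want to tune the politeness options:

```shell
# -m    mirror mode: recursive retrieval, timestamping, unlimited depth
# -np   never ascend to the parent directory while recursing
# -w 1  wait one second between requests, to go easy on the server
wget -m -np -w 1 "http://www.brickshelf.com/cgi-bin/gallery.cgi?m=someuser"
```

Note the quotes around the URL, for the same question-mark reason discussed below.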
|
Speaking of hosting LDraw files - what exactly is the story with the
Official Model Repository? Is there
any such actual repository? If not, would this be a good time for the LDraw
community to implement the idea? (Forgive my ignorance if this topic has been
resolved one way or the other - I'm just curious.)
|
No official repository afaik. Last I heard was that we want to build it, but
that requires a developer with the time to build it!
|
Anyway, Brickshelf must be going out with a bang as far as bandwidth is
concerned.
|
Does anyone know the chances of just taking all the Brickshelf files and hosting
them somewhere else? Not Brickshelf's functionality, just the files? That way
they can be left up for much longer than July 31st.
James
|
|
|
In lugnet.general, James Reynolds wrote:
|
|
|
It requires wget (which Mac OS X doesn't ship with), so it would
have to be distributed in a DMG for Mac users.
|
On the other hand, Mac OS X does come with
curl, so maybe it could be
adapted to use that.
|
There are many instructions on how to install wget, and even some installers.
curl can't beat wget -m afaik.
|
I didn't have any problem compiling and installing wget (although I did have to
change the default install path by hand from /usr/local to /opt/local so that it
would be in my pre-existing path). However, I suspect that most Mac users would
be averse to doing that. Having said that, here is Peter's sh script (all 844
bytes of it):
http://www.halibut.com/~tcobbs/ldraw/private/bsbackup.sh
To use the above script, run it with the URL of the gallery you want backed up
as the only argument on the command line. I think it will only work if it's in
the current directory when you run it. Also, it seems to only back up images
(not sure why that is). Make sure to put quotes around the URL on the command
line, since Brickshelf URLs usually include a question mark, which the shell
will try to interpret.
Example:
./bsbackup.sh "http://www.brickshelf.com/cgi-bin/gallery.cgi?f=169512"
As mentioned, you must have wget installed on your system (which is there by
default on most Linux distros). This script doesn't use the -m option, and
could be (fairly easily) modified to work with curl. I haven't done that,
though, and don't plan to.
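For anyone who wants to attempt that adaptation, here is a rough, untested sketch of what a curl-based fetch loop might look like. The link-extraction pattern, the function names, and the /gallery/ path check are my assumptions, not taken from bsbackup.sh:

```shell
#!/bin/sh
# Hypothetical curl-based gallery backup loop -- a sketch, not Peter's script.

# extract_links: pull href targets out of HTML read from stdin
extract_links() {
    grep -o 'href="[^"]*"' | sed 's/^href="//; s/"$//'
}

# backup_page: fetch one gallery page and download every direct /gallery/ link
backup_page() {
    curl -s "$1" | extract_links | while read -r link; do
        case "$link" in
            /gallery/*) curl -s -O "http://www.brickshelf.com$link" ;;
        esac
    done
}
```

curl -O saves each file under its remote name, roughly matching wget's default behavior; recursion into sub-folder pages would still need to be added.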
--Travis
|
|
|
In lugnet.general, Travis Cobbs wrote:
|
it's in the current directory when you run it. Also, it seems to only back
up images (not sure why that is). Make sure to put quotes around the URL on
|
OK, if you download and run the script now, it should back up all files (not
just images).
--Travis
|
|
|
In lugnet.general, Travis Cobbs wrote:
Boy, that is a lot more compact than mine!
|
Also, it seems to only back up images (not sure why that is).
|
It looks like it's only considering links to /cgi-bin/gallery.cgi. However,
links to non-image files from the Brickshelf folder pages point directly to
/gallery/username/foo/bar.dat. (Ah, I see you have fixed this.)
|
Make sure to put quotes around the URL on
the command line, since Brickshelf URLs usually include a question mark,
which the shell will try to interpret.
|
That advice probably stands for my script's -url option as well.
|
As mentioned, you must have wget installed on your system (which is there by
default on most Linux distros).
|
For anyone interested, I can confirm that wget 1.10.2 builds on my Mac with the following:
./configure
make
make install
|
|
As Travis noted, though, if you don't have a /usr/local directory (as I do)
you may want to configure a different installation location.
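The standard autoconf flag for that is --prefix. Here is a small, hypothetical helper (the on_path function and the /opt/local choice are illustrative assumptions) for checking whether a candidate prefix's bin directory is already on your PATH before building:

```shell
#!/bin/sh
# on_path: succeed if the given directory appears as a PATH component.
on_path() {
    case ":$PATH:" in
        *":$1:"*) return 0 ;;
        *)        return 1 ;;
    esac
}

prefix=/opt/local   # example candidate install location, not a requirement
if on_path "$prefix/bin"; then
    echo "$prefix/bin is on PATH; ./configure --prefix=$prefix should work"
else
    echo "$prefix/bin is NOT on PATH; installed binaries would not be found"
fi
```

After that, the usual ./configure --prefix=$prefix, make, make install sequence applies.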
For what it's worth, I just did an informal comparison between this script and
mine. I timed the download of my account with each script. Keep in mind that
network variations and in particular Brickshelf's heavy traffic make this a
relative comparison, not an absolute measure.
cd ~/bsbackup-test
time ./bsbackup.sh "http://www.brickshelf.com/cgi-bin/gallery.cgi?m=anoved"
|
|
This took 24:32 (using the initial version which did not retrieve my LDraw
files).
time bscrawl.tcl -url "http://www.brickshelf.com/cgi-bin/gallery.cgi?m=anoved" -dir ~/bscrawl-test -pause 0.01
|
|
This took 31:20 to download all my files.
So, it looks like your script is a bit faster as well as a bit smaller.
Jim
|
|
|
In lugnet.general, Jim DeVona wrote:
|
For what it's worth, I just did an informal comparison between this script
and mine. I timed the download of my account with each script. Keep in mind
that network variations and in particular Brickshelf's heavy traffic make
this a relative comparison, not an absolute measure.
cd ~/bsbackup-test
time ./bsbackup.sh "http://www.brickshelf.com/cgi-bin/gallery.cgi?m=anoved"
|
|
This took 24:32 (using the initial version which did not retrieve my LDraw
files).
time bscrawl.tcl -url "http://www.brickshelf.com/cgi-bin/gallery.cgi?m=anoved" -dir ~/bscrawl-test -pause 0.01
|
|
This took 31:20 to download all my files.
So, it looks like your script is a bit faster as well as a bit smaller.
|
Of course, I just tried bsbackup.sh again - the current version that gets all
files - and it took 32:05. There's a little QuickTime movie in there that
probably accounts for most of the difference, as the MPD files are quite small.
So the actual performance difference may be negligible.
And now I'll stop wasting bandwidth.
Cheers,
Jim
|
|
|