Subject: perl file copy question
Newsgroups: lugnet.off-topic.geek
Date: Wed, 5 Oct 2005 19:10:06 GMT
I need, from within a Perl module, to copy a large (and variable; it's
wildcard-driven) number of files from one directory to another. TMTOWTDI,
of course. What's the best way under the following constraints?
- The OS is Linux.
- The wildcard varies from run to run, so it's going to need to be a variable
(that seems like no big deal at all; just mentioning it).
- I want to avoid system() or backticks, as I do not want to spawn a
subprocess; that's too expensive in the context I'm in.
- I'd prefer not to schlep around digging filenames out of what glob returned
and separating them from their directories.
- I prefer to use only standard Perl and the default module set, not stuff
I'll have to install; getting stuff installed is a major pain (it took months
to get DBI and DBD onto the production systems).
I thought of using glob($wildcardExpression) to get the list of matching
pathnames and then using File::Copy, but where I broke down is that copy()
wants two filehandles or file names, and the target names aren't individually
known, necessarily, unless I took each path glob returned, separated the
directory off, and built the target name myself.
This is something that seems easy in .sh or .ksh. I bet it's easy here too;
I'm just not finding it, although I have been reading docs and searching for
a while.
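For concreteness, here is a minimal sketch of the shape of thing I'm trying
to avoid hand-rolling (the paths are made up for illustration):

    use strict;
    use warnings;
    use File::Copy;

    my $wildcard = '/data/incoming/*.log';   # hypothetical; varies per run
    my $dest_dir = '/data/archive';          # hypothetical target directory

    for my $path ( glob($wildcard) ) {
        # the "schlep": peel the filename off the path string by hand
        ( my $name = $path ) =~ s{.*/}{};
        copy( $path, "$dest_dir/$name" )
            or warn "copy $path -> $dest_dir/$name failed: $!";
    }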
Message has 1 Reply:
Re: perl file copy question
(...) Try File::NCopy (although I don't think it's a standard module) - (URL)
Note, however, that you can easily obtain the directory and filename from the
path string - you don't need the file handle (I'm pretty sure File::Basename
is standard): use (...) (5-Oct-05, to lugnet.off-topic.geek)
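A minimal sketch of what that reply is suggesting, assuming File::Copy,
File::Basename, and File::Spec (all part of the standard distribution; the
paths are hypothetical):

    use strict;
    use warnings;
    use File::Copy;
    use File::Basename;
    use File::Spec;

    my $wildcard = '/data/incoming/*.log';   # hypothetical; supplied at run time
    my $dest_dir = '/data/archive';          # hypothetical target directory

    for my $path ( glob($wildcard) ) {
        # basename() splits the filename off the path string; no filehandle needed
        my $target = File::Spec->catfile( $dest_dir, basename($path) );
        copy( $path, $target ) or warn "copy $path -> $target failed: $!";
    }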