Can I MGET multiple files into single MVS data set?
- Posts: 39
- Joined: Wed Feb 11, 2009 10:23 am
Can I MGET multiple files into single MVS data set?
Remote company is placing multiple files into a directory for me to pull from. At the time I invoke cozsftp I don't know how many files will be there.
Can I mget * //DD:dd-name
or is there a better way?
David
"mget" is an FTP command.
In OpenSSH sftp, the "get" command can be used to download multiple files. Here is the documentation:

get [-P] remote-path [local-path]
Retrieve the remote-path and store it on the local machine. If the local path name is not specified, it is given the same name it has on the remote machine. remote-path may contain glob(3) characters and may match multiple files. If it does and local-path is specified, then local-path must specify a directory. If the -P flag is specified, then full file permissions and access times are copied too.

So, you can do this:
sftp user@remote.host
sftp> cd /dir
sftp> get * localdir
But "localdir" would have to be either:
- a Unix directory
- a z/OS dataset qualifier, which is treated as a pseudo directory. In this case, remote file names would be mapped into dataset names under this qualifier, so they would have to conform to z/OS dataset name rules and lengths.
- a z/OS partitioned dataset, which is also treated as a pseudo directory. Remote file names would have to be valid member names.
In your case, it would be nice if a local dataset name could be given for the target instead of a directory with "disp=mod" set so that all of the remote files were concatenated to the same local dataset. Unfortunately, this is not currently supported. This might be an enhancement that we could consider in the future.
An alternative mechanism would be to first connect with sftp or ssh to get a list of file names and then have a shell script use that list to generate a second sftp connection which downloads those files to a concatenated sequential dataset. We could probably help and provide a sample if this would work for you.
Another alternative would be to use Co:Z Launcher to run a remote script which can process files and copy them to a dataset or DD in the launching job step. This would require that the remote system install the Co:Z Target system toolkit, which may not be an option for an external client.
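As an illustration of the two pseudo-directory forms described above (the names here are invented, and the exact name-mapping rules are governed by the Co:Z documentation), the target might be given as:

```
sftp> get *.TXT //HLQ.INCOMING   # each remote file maps to a dataset under the HLQ.INCOMING qualifier
sftp> get *.TXT //MY.PDS         # each remote file maps to a member of the PDS MY.PDS
```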
Using the get command to retrieve multiple files
This post is close to the question I am facing today.
I also need to perform a multiple get of files, but I want to know if I can narrow it down to a specific date, since the remote directory I am pulling from contains a week's worth of files.
Any feedback would be appreciated.
Thanks.
You could use some sort of file-name wildcard matching, but if the names do not indicate the date/time, that wouldn't help you.
Another alternative would be to make two connections in the job -
The first would be an sftp or ssh "ls -l" command that lists the remote directory and redirects the output to a pipe or temporary file.
A local shell script or program would read the directory listing and build "get" commands.
A second sftp connection would run the generated get commands to download the selected files.
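The listing-and-filter step above can be sketched roughly like this (a hypothetical sketch, assuming a typical Unix "ls -l" layout where fields 6-8 hold the modification date and field 9 the file name; the DD name is a placeholder):

```shell
# Sample "ls -l" output as it might be captured from the first connection
listing='-rw-r--r-- 1 user grp 1024 Apr 27 09:00 day1.txt
-rw-r--r-- 1 user grp 2048 Apr 28 09:15 day2a.txt
-rw-r--r-- 1 user grp 4096 Apr 28 11:30 day2b.txt'

# Keep only files dated Apr 28 and emit one "get" per file,
# all targeting the same DD (which would have DISP=MOD to append)
printf '%s\n' "$listing" | awk -v mon="Apr" -v day="28" '
  $6 == mon && $7 == day { print "get " $9 " //DD:MYDSN" }
'
# Output:
#   get day2a.txt //DD:MYDSN
#   get day2b.txt //DD:MYDSN
```

The generated commands would then be fed to the second sftp connection in batch mode.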
Remote site does not support scripts
Thanks for the quick response. I like the idea, but in the scenario I am working with, nothing comes easy.
I just got off the phone with the admin at the remote site, and essentially I cannot host a script on their end.
So my question has now shifted: if I retrieve all of the files, could I land the file bundle in a GDG dataset? How would this work?
Here's some example JCL that demonstrates how to do this:
The example below just downloads all files that match a name pattern to a DD that has DISP=MOD. For what you are asking for (all files in a certain date range), you will need to modify the awk script to pick off the files in your desired date range. Get someone who knows awk programming to help you if necessary.
//SFTPCAT JOB (),'GOETZE',MSGCLASS=H,NOTIFY=&SYSUID
//JOBLIB DD DISP=SHR,DSN=GOETZE.COZ.LOADLIB
//*
//*********************************************************************
//*
//* Batch job to run the Co:Z SFTP client
//*
//* Tailor the proc and job for your installation:
//* 1.) Modify the Job card per your installation's requirements
//* 2.) Modify the PROCLIB card to point to this PDS, or wherever
//* the COZPROC procedure has been installed.
//* 3.) Customize the shell script below, and the //DOWNLOAD DD
//*
//*********************************************************************
//*
//COZSFTP EXEC PGM=COZBATCH
//STDIN DD *
# Customize these ...
coz_bin="/u/vendor/coz/bin"
export zopts="mode=text"
r_pattern="/home/goetze/my_files/*.txt"
r_user="goetze"
r_server="myco.server.com"
# These can be used to read the ssh password from a (secured) dataset
# if you don't want to setup public/private keypairs
export PASSWD_DSN='//GOETZE.COZ.SAMPJCL(PW)'
export SSH_ASKPASS=$coz_bin/read_passwd_dsn.sh
export DISPLAY=none
ssh_opts="-oBatchMode=no" # allows ssh to use SSH_ASKPASS program
ssh_opts="$ssh_opts -oConnectTimeout=60"
ssh_opts="$ssh_opts -oServerAliveInterval=60"
ssh_opts="$ssh_opts -oStrictHostKeyChecking=no" # accept initial host keys
# Use cozsftp to get a list of files matching the shell variable r_pattern,
# then pipe the results to awk, where the desired cozsftp options are set and
# the files are transferred one by one, appending to the local dataset.
# Note that "-oBatchMode=no" must be specified before "-b"
# since ssh opts are first-sticky
{
$coz_bin/cozsftp $ssh_opts -b- $r_user@$r_server <<EOB
ls -l $r_pattern
EOB
} | awk '
BEGIN {
print "lzopts " ENVIRON["zopts"]
skip = 1
}
{
if (skip == 1) {
if ($1 == "cozsftp>") skip = 0
next;
}
print "get " $9 " //DD:MYDSN"
#(optional) delete the remote file after downloading
#print "rm " $9
}' | $coz_bin/cozsftp $ssh_opts -b- $r_user@$r_server
/*
//MYDSN DD DSN=GOETZE.SFTPCAT.DATA,DISP=(MOD,KEEP),
// DCB=(LRECL=80,RECFM=FB),SPACE=(CYL,(3,1))
An alternative to this would be to get a list of all files with a certain file pattern and then to generate a "get" plus a "rename" for each file so that the file gets renamed (moved) into a "processed" directory after it is successfully downloaded.
In the future, we think that a REXX API to the Co:Z SFTP client might be nice for writing automated scripts like this. For those that are really interested in this, please post your comments to this thread or contact us at info@dovetail.com.
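The get-plus-rename pairs generated for that approach might look like this (file and directory names are purely illustrative):

```
get report_0428.txt //DD:MYDSN
rename report_0428.txt processed/report_0428.txt
```

Since sftp's rename moves the remote file, files already moved into processed/ would not match the pattern on the next run.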
Re: Can I MGET multiple files into single MVS data set?
Team,
Does the new Co:Z SFTP release support disp=mod from a remote server to a mainframe local dataset?
We utilize //DD:FILENAME within our control cards as a workaround to this original issue.
thanks
Running:
Co:Z SFTP version: 3.0.1 (5.0p1) 2017-04-13 (patched)
Re: Can I MGET multiple files into single MVS data set?
I'm not sure that I understand your question.
Using Co:Z SFTP client:
- This was already supported:
lzopts disp=mod
get remote-file //MY.DSN
- And this was already supported, and IMO preferable:
get remote-file //DD:MYDD # DD card has DISP=MOD
Using Co:Z SFTP server:
- This was already supported:
ls /+disp=mod
put local-file //MY.DSN
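Put together, the DD form preferred in the client examples above might look like this in a job (a sketch only; dataset, DD, and file names are invented):

```
//MYDD     DD DSN=MY.APPEND.DATA,DISP=(MOD,CATLG),
//            DCB=(LRECL=80,RECFM=FB),SPACE=(CYL,(1,1))
```

with the sftp command stream issuing repeated gets against the same DD, each one appending because of DISP=MOD:

```
get file1.txt //DD:MYDD
get file2.txt //DD:MYDD
```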
Re: Can I MGET multiple files into single MVS data set?
I see. //DD:MYDSN is the solution to the mget for sftp.
I guess I was slightly confused by a statement made in an earlier post that made it seem that perhaps in the future we would have the ability to perform a simple get to a dataset without coding //DD:MYDSN.
Apr 28, 2011 9:06 am
[In your case, it would be nice if a local dataset name could be given for the target instead of a directory with "disp=mod" set so that all of the remote files were concatenated to the same local dataset. Unfortunately, this is not currently supported. This might be an enhancement that we could consider in the future. ]
Re: Can I MGET multiple files into single MVS data set?
This was a response related to using wildcards with get, which was the original question on this thread:

In your case, it would be nice if a local dataset name could be given for the target instead of a directory with "disp=mod" set so that all of the remote files were concatenated to the same local dataset. Unfortunately, this is not currently supported. This might be an enhancement that we could consider in the future.