I am attempting to sftp a large file (119M) to a z/OS dataset defined as follows:
ls /+unit=approved
ls /+space=cyl.150.10
ls /+overflow=wrap
ls /+recfm=vb,lrecl=255,blksize=0,dsorg=ps
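For context, here is the full session shape I am using; the host name and local file name below are placeholders (a sketch, not my exact transcript):

```
sftp user1@zoshost
sftp> ls /+unit=approved
sftp> ls /+space=cyl.150.10
sftp> ls /+overflow=wrap
sftp> ls /+recfm=vb,lrecl=255,blksize=0,dsorg=ps
sftp> put bigfile.dat //user1.mvs.dataset
```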
I receive the following error message:
Couldn't write to remote file '//user1.mvs.dataset' : Failure
ID mismatch (19 != 12)
The following messages were posted to the server log:
ZosDataset Opening dataset '//user1.mvs.dataset' for write with options blksize(0) dsorg(ps) recfm(v,b) cylspace(150,10) unit(approved)
ZosDataset[E]: InputBuffer will not hold 32768 bytes of new data
zosUtil[E]: Unexpected exception caught in zos_write
[00348] process_write failed
ZosDataset[E]: dataset write error: seek not allowed
Input Buffer will not hold 32768 bytes of new data
What version of Co:Z SFTP server are you using?
What other options do you have set?
(display them with:)
ls /+
How much data is transferred before you get this failure? If it is not too much, then please turn on a trace:
ls /+loglevel=RecordExtractor=F
Also, please indicate which sftp client and version you are using (psftp, etc.)
And email the log (trace) to us at info@dovetail.com and we'll take a look.
Re: Input Buffer will not hold 32768 bytes of new data
Hi, was there a solution to this? I've just hit the same problem.
Thanks,
Stefan
Re: Input Buffer will not hold 32768 bytes of new data
Stefan,
Please try to collect the information requested on the previous post and we will try to diagnose.
Re: Input Buffer will not hold 32768 bytes of new data
I'm not sure what your parameters and data look like, but we are able to reproduce this error with:
- mode=text with linerule=flexible (the default) or linerule=LF/NL/CRLF
- overflow=wrap (the default)
- a remote PUT or a local GET that writes to a z/OS dataset
- an input file that has a line > 128K bytes
What should happen is that these huge lines are wrapped onto multiple records, but this wrapping fails once a line exceeds 128K.
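To illustrate what the wrap should do: with overflow=wrap, a line longer than the record length should be split across several records. The standard `fold` utility shows an equivalent byte-wise split for an lrecl of 255 (the file name here is a placeholder):

```shell
# Split each line of the input into 255-byte pieces, the way
# overflow=wrap is expected to spread a long line over records.
# "bigfile.dat" is a placeholder for your transfer source file.
fold -b -w 255 bigfile.dat | head -3
```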
We have identified the problem and will likely fix it in an upcoming release.
But, check to see if you are really using the correct mode and encoding for your data.
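As a quick way to check whether your input actually contains such a line, you can scan for the longest line with awk (the file name is a placeholder):

```shell
# Print the length of the longest line in the input file.
# If this is greater than 131072 (128K), you are hitting the bug above.
# "bigfile.dat" is a placeholder for your transfer source file.
awk '{ if (length($0) > max) max = length($0) } END { print "longest line:", max }' bigfile.dat
```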
Re: Input Buffer will not hold 32768 bytes of new data
Thanks very much for the quick response. We are using mode=text and defaulting the others. This is a z/OS batch job doing a get from a Linux server into a z/OS dataset. This was just one, repeatable failure from a large number of cycles of this job. So I will check whether the users expect this kind of fluctuation in the data.
Re: Input Buffer will not hold 32768 bytes of new data
It seems that we have the same problem. Did you fix this in some release? We are still on version 1.7.
dovetail wrote: I'm not sure what your parameters and data look like, but we are able to reproduce this error with:
- mode=text with linerule=flexible (the default) or linerule=LF/NL/CRLF
- overflow=wrap (the default)
- a remote PUT or a local GET that writes to a z/OS dataset
- an input file that has a line > 128K bytes
What should happen is that these huge lines should be wrapped onto multiple records, but they are not if > 128K.
We have identified the problem and will likely fix it in an upcoming release.
But, check to see if you are really using the correct mode and encoding for your data.
Re: Input Buffer will not hold 32768 bytes of new data
Yes, this was fixed in the 2.0.1 release.
Re: Input Buffer will not hold 32768 bytes of new data
SteveGoetze wrote: Yes, this was fixed in the 2.0.1 release.
Thanks for the info. That's one more reason to start the migration.