RecordStreamException: unclaimed data error from todsn

Posted: Thu Apr 07, 2011 12:19 pm
by bradv
We are using todsn to migrate some mainframe data that was archived on a UNIX server back to the mainframe where it came from. Out of 11,000+ files we're having great results but received the following error this morning:

todsn(RF01O.UL1DB.RF0121DP.JAN07)[E]: caught RecordStreamException: unclaimed data at end of input stream; len=1181, maxLen=0, streamOffset=956955244, recordNumber=4087312

todsn(RF01O.UL1DB.RF0121DP.JAN07)[N]: 956955244 bytes read; 4087312 records/940604815 bytes written in 76.112 seconds (11.982 MBytes/sec).

Any idea what causes this unclaimed data condition or how to resolve it?

Thanks,
Brad

Posted: Thu Apr 07, 2011 1:14 pm
by dovetail
Brad,

This is most likely a data corruption problem, but please post the Co:Z version and the details of the todsn command (arguments, target dataset attributes, etc.) that you are using.

Posted: Thu Apr 07, 2011 3:34 pm
by dovetail
To be more clear: the problem is most likely due to corrupted input data.
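For context on what "unclaimed data" means here, a minimal sketch of RDW scanning (a hypothetical parser, not Co:Z's actual code) shows how trailing bytes that don't form a complete record get left over. It assumes standard z/OS RDWs: a 4-byte prefix whose first two bytes are a big-endian length that includes the RDW itself, followed by two reserved bytes.

```python
import struct

def split_rdw_records(data: bytes):
    """Split an RDW-delimited (RECFM=V/VB style) byte stream into records.

    Returns (records, leftover). `leftover` is non-empty when the stream
    ends mid-record or with a bad length field -- the kind of condition a
    copy tool could report as "unclaimed data at end of input stream".
    """
    records, pos = [], 0
    while pos + 4 <= len(data):
        reclen, _reserved = struct.unpack_from(">HH", data, pos)
        if reclen < 4 or pos + reclen > len(data):
            break  # implausible length, or final record is truncated
        records.append(data[pos + 4:pos + reclen])
        pos += reclen
    return records, data[pos:]
```

In the error above, len=1181 would correspond to 1,181 such leftover bytes after the last complete record was written.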

Posted: Thu Apr 07, 2011 4:00 pm
by bradv
We are using Co:Z version 1.7.5. Years ago the files were copied from z/OS and stored on a UNIX server in bzip2 format. When I run an integrity check against the file, bzip2 doesn't report any errors (though I know that doesn't rule out some other type of corruption).

Here is the command we are executing in batch JCL:

BPXSPOOL +
cd /bto/sys/coz/bin; +
/bin/time /usr/local/bin/bunzip2 -v -1 < +
/bto/appls/rf01/O_crisacct_200703_01.bz2 | todsn -a -l rdw +
//RF01O.UL1DB.RF0101DP.JAN07

And the target dataset is pre-allocated on virtual tape just prior to the z/OS UNIX step (above) executing:

//ALLOC21 EXEC PGM=IEBGENER
//SYSUT1 DD *
//SYSUT2 DD DSN=RF01O.UL1DB.RF0121DP.JAN07,
// UNIT=TAPE,
// DISP=(NEW,CATLG),
// DCB=(LRECL=32756,BLKSIZE=32760,RECFM=VB)

Thanks,
Brad

Posted: Thu Apr 07, 2011 4:31 pm
by dovetail
Right; since you are using RDWs, check the file at the offset given in the error message and see what the problem is with the data.
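One way to do that check, sketched below with a hypothetical helper: since streamOffset=956955244 bytes were read and len=1181 bytes were left unclaimed, the leftover data should begin at decompressed offset 956955244 - 1181 = 956954063. Decompress the .bz2 file first (e.g. bunzip2 -k), then decode the 4 bytes at that offset as an RDW and see whether the length field is plausible.

```python
import struct

def inspect_rdw(path: str, offset: int):
    """Read the 4 bytes at `offset` in a decompressed RDW-delimited file
    and interpret them as an RDW (2-byte big-endian length including the
    RDW itself, then 2 reserved bytes). Returns None if fewer than 4
    bytes remain at that offset."""
    with open(path, "rb") as f:
        f.seek(offset)
        rdw = f.read(4)
    if len(rdw) < 4:
        return None  # truncated trailing data: not even a full RDW
    reclen, reserved = struct.unpack(">HH", rdw)
    return {
        "offset": offset,
        "reclen": reclen,
        "reserved": reserved,
        # plausible for an LRECL=32756 VB dataset, reserved bytes zero
        "plausible": 4 <= reclen <= 32756 and reserved == 0,
    }
```

If the length field at that offset is wildly large, zero, or the reserved bytes are non-zero, the data at that point is not a valid RDW, which would point to corruption in (or before) the archived file.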