OK, I see how a forward-only stream just uses the local headers rather than the trailing central directory.
I had a look at the AbstractReader.Skip() code, which appears to open the decompression stream and read through it to get past the entry's data and on to the next local header.
Why does it do this when the current local header already holds the compressed size of the entry's data?
I know the stream is not seekable, but surely it just needs a way of reading and ignoring that number of bytes on the 'current' stream, rather than letting the decompression stream do it (see the sketch after the list below).
Doing it this way would have two big benefits:
1) It should speed up skipped items immensely
2) It would allow entries with unsupported compression types to be ignored (I have a Zip64-compressed file nested in the ZIPX file somewhere which is causing the whole thing to bomb)
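To be concrete, here is a minimal sketch of what I mean. It assumes the local header's compressed-size field is actually populated (if the archive was written in streaming mode with data descriptors, that field may be zero and this approach wouldn't apply). `RawSkip` and `SkipBytes` are hypothetical names for illustration, not existing SharpCompress API:

```csharp
using System;
using System.IO;

// Hypothetical helper (not SharpCompress API): skip 'count' bytes of a
// non-seekable stream by reading into a scratch buffer and discarding them,
// so no decompression stream is ever created for the skipped entry.
static class RawSkip
{
    public static void SkipBytes(Stream source, long count)
    {
        var buffer = new byte[81920];
        while (count > 0)
        {
            // Read at most one buffer's worth, never more than what remains.
            int toRead = (int)Math.Min(buffer.Length, count);
            int read = source.Read(buffer, 0, toRead);
            if (read <= 0)
                throw new EndOfStreamException("Stream ended before the entry was fully skipped.");
            count -= read;
        }
    }
}
```

The reader would call something like `RawSkip.SkipBytes(rawStream, localHeader.CompressedSize)` instead of draining the decompression stream, which is what makes both benefits above fall out of the same change.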
Cheers
Simon