
Commit da7d3ed

committed
Bugfix for #15 for compressing large files
This is a potential quick fix for Issue #15, which occurs when attempting to compress files larger than ZIP64_LIMIT. `zinfo.file_size` is never initialized to the correct file size, so the determination of whether zip64 is required is made against a file size of 0. This later raises an exception as though the file size had increased during compression, because the actual size is counted during compression and then written over `zinfo.file_size`.

Note that this fix may not be fully cross-platform: different versions of Python handle the `st_size` field of the `os.stat` result differently on Windows, so that may be worth investigating further. In the short term, though, it fixes the problem on Linux, Mac, and some Windows platforms without making things worse where it still doesn't work. I would leave it to the maintainers to decide whether this fix is appropriate or whether a better solution is desired. I'm happy to help.
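To make the failure mode concrete, here is a minimal sketch of the zip64 decision described above. `zipfile.ZIP64_LIMIT` is the real CPython constant; the helper function name is illustrative, not part of zipstream's API:

```python
import zipfile

def needs_zip64(file_size):
    # zipfile compares the entry size against ZIP64_LIMIT to decide whether
    # zip64 extensions are required. If file_size was left at 0, this check
    # says "no" even for a multi-gigabyte file, and the mismatch surfaces
    # later as an error once the true size is counted during compression.
    return file_size > zipfile.ZIP64_LIMIT

print(needs_zip64(0))                          # the buggy case: always False
print(needs_zip64(zipfile.ZIP64_LIMIT + 1))    # what the real size would say
```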
1 parent 8abfa34 commit da7d3ed

File tree

1 file changed

+4
-1
lines changed


zipstream/__init__.py

Lines changed: 4 additions & 1 deletion
@@ -262,7 +262,10 @@ def __write(self, filename=None, iterable=None, arcname=None, compress_type=None
         else:
             zinfo.compress_type = compress_type
 
-        zinfo.file_size = 0
+        if st:
+            zinfo.file_size = st[6]
+        else:
+            zinfo.file_size = 0
         zinfo.flag_bits = 0x00
         zinfo.flag_bits |= 0x08          # ZIP flag bits, bit 3 indicates presence of data descriptor
         zinfo.header_offset = self.fp.tell()      # Start of header bytes
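The patched logic can be sketched in isolation as follows. This is an assumption-laden illustration, not zipstream's actual method: `st` stands for the `os.stat()` result on the source file (or `None` when there is no backing file, e.g. when writing from an iterable), and `st[6]` is `os.stat_result.st_size`:

```python
import os
import zipfile

def init_file_size(zinfo, st):
    """Initialize zinfo.file_size from stat info so the zip64 decision is
    made against the real size rather than a placeholder of 0."""
    if st:
        zinfo.file_size = st[6]   # st[6] == st_size: file size in bytes
    else:
        zinfo.file_size = 0       # size unknown (e.g. streaming an iterable)
    return zinfo

# Usage: stat the file first, then seed the ZipInfo with its true size.
# zi = init_file_size(zipfile.ZipInfo("big.bin"), os.stat("big.bin"))
```

As the commit message notes, `st_size` behavior on Windows has varied between Python versions, which is why this sketch should be read as platform-dependent.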
