[file_packager] split data files when file size exceeds 2Gi ArrayBuffer limit #24802
Conversation
@sbc100 Is this aligned with what you were thinking?
Can we add some tests for this? What are the specific limits you are running into? Are those limits not fixed? i.e. does it need to be configurable?
I wanted to make sure I was on the right track before adding tests.
This does seem like a reasonable approach. I'm still a little fuzzy on exactly why and when this might be needed in the real world; I think I need to go re-read our original discussion. But I also think including more information in the PR (i.e. in the description, or in comments) would be good.
Updated the description. Based on that, if you want me to hard-code the limit into the packager, I can take that approach.
…ackages
# Conflicts:
#	tools/file_packager.py
@sbc100 Going to look at adding tests for this. Do you have any recommendations? Would you like me to generate temporary file(s) so that I can reach the 2Gi limit?
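For context, one lightweight way a test could produce a file past the 2 GiB boundary is a sparse file. Below is a minimal sketch; `make_sparse_file` is a hypothetical helper, not an existing Emscripten test utility, and nothing here is part of this PR.

```python
# Sketch only: create a file just over 2 GiB without writing 2 GiB of data.
import os

TWO_GIB = 2 * 1024 * 1024 * 1024

def make_sparse_file(path, size=TWO_GIB + 1):
    # Hypothetical helper: seeking past the end and writing a single byte
    # creates a sparse file on most filesystems, so the test stays fast
    # and uses almost no disk space.
    with open(path, 'wb') as f:
        f.seek(size - 1)
        f.write(b'\0')

make_sparse_file('big.dat')
assert os.path.getsize('big.dat') == TWO_GIB + 1
```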
@sbc100 This should be ready for review. I'm not sure whether the test failures are flaky or related to my change; I don't see them fail locally.
Addresses #24691
The ECMAScript spec caps an ArrayBuffer's length at Number.MAX_SAFE_INTEGER, but in practice engines enforce a much lower ceiling (around 2 GiB). So when file_packager.py bundles a large set of files, the resulting `.data` file can exceed that limit and will fail to load. Breaking the output up into multiple data files bypasses this issue.
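To illustrate the splitting idea, here is a minimal sketch that streams one large payload into multiple output files, none larger than a fixed 2 GiB cap. The names (`write_chunks`, the `.data.N` suffix scheme, `MAX_CHUNK`, `BLOCK`) are illustrative assumptions, not necessarily what this PR implements in file_packager.py.

```python
import os

MAX_CHUNK = 2 * 1024 * 1024 * 1024  # assumed per-file cap (practical ArrayBuffer limit)
BLOCK = 1 << 20                     # copy in 1 MiB blocks to bound memory use

def write_chunks(src_path, basename):
    """Stream src_path into basename.data, basename.data.1, ... so that
    no single output file exceeds MAX_CHUNK bytes."""
    paths = []
    index = 0
    with open(src_path, 'rb') as src:
        while True:
            suffix = '' if index == 0 else '.%d' % index
            path = '%s.data%s' % (basename, suffix)
            written = 0
            with open(path, 'wb') as out:
                while written < MAX_CHUNK:
                    block = src.read(min(BLOCK, MAX_CHUNK - written))
                    if not block:
                        break
                    out.write(block)
                    written += len(block)
            if written == 0:
                # Nothing left to copy; drop the empty file we just opened.
                os.remove(path)
                break
            paths.append(path)
            index += 1
    return paths
```

The loader side would then fetch each chunk into its own ArrayBuffer and mount the pieces back-to-back, so no single allocation crosses the engine's limit.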