I have a ~100 GB SQLite database (MBTiles) containing ~262 million files that I am attempting to convert into a cotar.
The tar creation process fails after ~120M files because the hash map holding the previously seen hashes overflows (NodeJS Map size limit).
Some options:
- Split the output tar into multiple smaller chunks (20-30 GB each) so that each contains roughly 100M files
- Switch the hash table to a tree to support much larger table sizes (would have creation performance impacts)
- Rewrite the creation process in a language that supports BigMaps by default.
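A fourth option may be worth noting: stay in NodeJS but shard the single Map into many smaller Maps keyed by the low bits of the hash, so no individual Map hits the per-Map entry limit. A minimal sketch (the `ShardedMap` class and numeric keys are illustrative assumptions, not cotar's actual API or hash type):

```typescript
// Hypothetical sketch: split one huge Map into 2^shardBits smaller Maps,
// routing each key by its low bits so no single Map grows past V8's limit.
class ShardedMap {
  private shards: Map<number, number>[];
  private mask: number;

  constructor(shardBits = 4) {
    this.mask = (1 << shardBits) - 1;
    this.shards = Array.from({ length: 1 << shardBits }, () => new Map());
  }

  // Pick the shard from the key's low bits (assumes numeric hash keys)
  private shard(key: number): Map<number, number> {
    return this.shards[key & this.mask];
  }

  set(key: number, value: number): void {
    this.shard(key).set(key, value);
  }

  get(key: number): number | undefined {
    return this.shard(key).get(key);
  }

  has(key: number): boolean {
    return this.shard(key).has(key);
  }

  // Total entry count across all shards
  get size(): number {
    return this.shards.reduce((n, m) => n + m.size, 0);
  }
}
```

This keeps Map-like O(1) lookups and avoids the tree option's creation-performance cost, at the price of a small amount of wrapper overhead per operation.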