-
Hi folks, can anybody share their experience with extracting big osm.pbf files? In particular the biggest one, europe-latest.osm.pbf, which is over 30 GB.
I use the ghcr.io/project-osrm/osrm-backend v6.0.0 image. Are there any tricks for extracting such huge data? How much memory would be needed? Thanks.
UPDATE: Just tried on a 64 GB, 8-core CPU machine. The container reaches about 31 GB, keeps working at that level for some time, but then fails the same way. It also takes about 20 minutes before failing.
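For reference, this is the kind of invocation I mean (a sketch following the OSRM quick-start docs; the car profile at /opt/car.lua and the /data mount are the documented defaults, not something specific to my setup):

```
docker run -t -v "${PWD}:/data" ghcr.io/project-osrm/osrm-backend:v6.0.0 \
  osrm-extract -p /opt/car.lua /data/europe-latest.osm.pbf
```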
Replies: 1 comment
-
OK, the problem is solved by limiting the memory usage for WSL in the .wslconfig file (yes, this was on Windows).
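For example, something along these lines in the Windows user profile directory (a minimal sketch; the exact values don't matter much, and memory=20GB is just the figure from the 32 GB machine mentioned below):

```
# %UserProfile%\.wslconfig
[wsl2]
memory=20GB   # hard cap on the RAM WSL 2 may use
swap=32GB     # optional: lets processing continue by swapping once the cap is hit (size is illustrative)
```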
It took about 19 hours to process Europe on my 64 GB, 8-core machine. The memory limit can actually be smaller, but it will impact the processing time: for example, with memory=20GB it took me about 3.5 days to do the same on a 6-core machine with 32 GB RAM. Nevertheless, no matter how much physical memory you have, you can still process pbf files with the osrm-backend container; you just need to set memory limits explicitly.