Performance bottleneck when adding 8000+ PMTiles layers dynamically #5988
-
Hi MapLibre team 👋, we’re working on a bathymetric map application that dynamically loads around 8,000 individual PMTiles archives, each added as its own source and layer. Our current workflow iterates over a catalog of entries and adds a source/layer pair for each one.
Here’s an example entry:

```json
{
  "url": "https://bmapi.bathymaps.com.au/layers/Z1_very_low/region_1008_combined_shaded_relief_Deep_very_low.pmtiles",
  "bounds": [135.9896909, -36.5179775, 136.3769589, -35.9886269],
  "minzoom": 9,
  "maxzoom": 10
}
```

🚨 Problem

Once we pass a few hundred layers, map performance tanks. Adding all sources/layers takes minutes, and interaction becomes sluggish, even though most tiles aren’t immediately visible. We assume the bottleneck is not tile loading but the cumulative cost of adding thousands of sources and layers to the style.
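For reference, the per-entry loop looks roughly like this; names such as `map` and `catalog`, plus the raster/`pmtiles://` assumptions, are illustrative rather than our exact code:

```typescript
import { Map as MapLibreMap } from "maplibre-gl";

declare const map: MapLibreMap; // existing map instance
declare const catalog: Array<{  // the ~8000 entries shaped like the JSON above
  url: string;
  bounds: [number, number, number, number];
  minzoom: number;
  maxzoom: number;
}>;

// One addSource + one addLayer per archive; with ~8000 entries these
// style mutations add up even when most tiles are never requested.
for (const entry of catalog) {
  const id = entry.url; // the archive URL doubles as a unique id here
  map.addSource(id, {
    type: "raster",                // assuming raster shaded-relief tiles
    url: `pmtiles://${entry.url}`, // pmtiles protocol registered elsewhere
    bounds: entry.bounds,
    minzoom: entry.minzoom,
    maxzoom: entry.maxzoom,
  });
  map.addLayer({ id, type: "raster", source: id });
}
```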
🧩 What we’re exploring

🙏 Questions
Thanks in advance, and thank you for maintaining such a powerful open-source mapping library!
-
I'm not sure I fully understand the setup, but you should be able to use `addProtocol` to load only the relevant data from the relevant PMTiles archives and avoid adding all the layers and sources. But I might be wrong...
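Something along these lines is what I have in mind. This is only a sketch: the `bathy://` scheme, the `catalog` shape, and the assumption that each tile is covered by at most one archive are mine, not tested against your data.

```typescript
import maplibregl from "maplibre-gl";
import { PMTiles } from "pmtiles";

// Catalog of archive descriptors, same shape as the JSON entry in the question.
interface ArchiveEntry {
  url: string;
  bounds: [number, number, number, number]; // [west, south, east, north]
  minzoom: number;
  maxzoom: number;
}
declare const catalog: ArchiveEntry[]; // the ~8000 entries, loaded up front

// PMTiles handles are opened lazily and cached per archive URL.
const archives = new Map<string, PMTiles>();
function getArchive(url: string): PMTiles {
  let archive = archives.get(url);
  if (!archive) {
    archive = new PMTiles(url);
    archives.set(url, archive);
  }
  return archive;
}

// Geographic bounds of a Web Mercator tile, used to match tiles to archives.
function tileBounds(z: number, x: number, y: number) {
  const n = 2 ** z;
  const lat = (ty: number) =>
    (Math.atan(Math.sinh(Math.PI * (1 - (2 * ty) / n))) * 180) / Math.PI;
  return {
    west: (x / n) * 360 - 180,
    east: ((x + 1) / n) * 360 - 180,
    north: lat(y),
    south: lat(y + 1),
  };
}

// Custom "bathy://{z}/{x}/{y}" protocol: pick the archive whose bounds and zoom
// range cover the requested tile and serve its data, so the style needs only
// one source and one layer instead of ~8000.
maplibregl.addProtocol("bathy", async (params) => {
  const [z, x, y] = params.url.replace("bathy://", "").split("/").map(Number);
  const t = tileBounds(z, x, y);
  const entry = catalog.find(
    (e) =>
      z >= e.minzoom && z <= e.maxzoom &&
      e.bounds[0] < t.east && e.bounds[2] > t.west &&
      e.bounds[1] < t.north && e.bounds[3] > t.south
  );
  if (!entry) {
    return { data: new Uint8Array() }; // no coverage: empty payload (simplification)
  }
  const resp = await getArchive(entry.url).getZxy(z, x, y);
  return { data: resp ? new Uint8Array(resp.data) : new Uint8Array() };
});
```

With that in place the style would need just one source, e.g. `{ "type": "raster", "tiles": ["bathy://{z}/{x}/{y}"], "minzoom": 9, "maxzoom": 10 }`, plus a single layer referencing it; whether a raster or vector source fits depends on what the archives actually contain.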
-
I take it you also have thousands of PMTiles archives? Why the need for so many? Can you combine the data before it makes its way to MapLibre? As for lazy-loading sources/layers: are you setting …
The following is not yet in production and is used to load offline files sliced by zoom-7 areas, but it is practically the same idea:
https://github.com/IsraelHikingMap/Site/blob/2030fa3290df62ac7c41e22f98d3b5e10abe5a34/IsraelHiking.Web/src/application/services/database.service.ts#L97
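Stripped of the app-specific parts, the pattern is roughly the following sketch; `catalog`, the raster assumption, and the zoom margin are placeholders rather than what the linked service does:

```typescript
import { Map as MapLibreMap } from "maplibre-gl";

// Illustrative types/names; `catalog` holds entries shaped like the JSON above.
interface ArchiveEntry {
  url: string;
  bounds: [number, number, number, number]; // [west, south, east, north]
  minzoom: number;
  maxzoom: number;
}
declare const map: MapLibreMap;
declare const catalog: ArchiveEntry[];

const activeIds = new Set<string>(); // archives currently present in the style

// Keep only the archives that intersect the current view (and zoom range)
// registered as sources/layers; everything else is removed again.
function syncVisibleArchives(): void {
  const view = map.getBounds();
  const zoom = map.getZoom();

  const wanted = new Map<string, ArchiveEntry>();
  for (const e of catalog) {
    const inZoom = zoom >= e.minzoom - 1 && zoom <= e.maxzoom + 1; // small margin
    const intersects =
      e.bounds[0] < view.getEast() && e.bounds[2] > view.getWest() &&
      e.bounds[1] < view.getNorth() && e.bounds[3] > view.getSouth();
    if (inZoom && intersects) wanted.set(e.url, e);
  }

  // Drop archives that scrolled out of view.
  for (const id of Array.from(activeIds)) {
    if (wanted.has(id)) continue;
    if (map.getLayer(id)) map.removeLayer(id);
    if (map.getSource(id)) map.removeSource(id);
    activeIds.delete(id);
  }

  // Add archives that scrolled into view.
  for (const [id, entry] of wanted) {
    if (activeIds.has(id)) continue;
    map.addSource(id, {
      type: "raster",                 // assuming raster shaded-relief tiles
      url: `pmtiles://${entry.url}`,  // pmtiles protocol registered elsewhere
      bounds: entry.bounds,
      minzoom: entry.minzoom,
      maxzoom: entry.maxzoom,
    });
    map.addLayer({ id, type: "raster", source: id });
    activeIds.add(id);
  }
}

map.on("moveend", syncVisibleArchives);
syncVisibleArchives();
```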