Model loading taking too much time #1374
Unanswered
vibsid0986 asked this question in Q&A
Replies: 1 comment · 2 replies
Hey @vibsid0986, can you tell us how large your model is in terms of byte size / lines of code? A thousand files isn't really a lot in the context of Langium, but depending on the file sizes it might take a while. Our own benchmarks show that Langium processes roughly 20,000 LoC per second, but this depends heavily on the complexity of your grammar, the performance of your scoping algorithm, etc. It might make sense to benchmark your workspace initialization to see where most of the time is spent. There are quite a few possible bottlenecks, some of which are language specific, others runtime related. Giving a bit more context would help here.
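In case it helps, this is roughly what such a benchmark could look like. The sketch below assumes a Langium version where `DocumentBuilder.onBuildPhase` and the `DocumentState` enum are exported from the `langium` package; it registers one listener per build phase and logs how long the workspace build takes to reach each state. The function name `logBuildPhaseTimings` and the way you obtain the builder (e.g. `services.shared.workspace.DocumentBuilder` from your generated module) are placeholders to adapt to your own setup.

```ts
import { DocumentState } from 'langium';
import type { DocumentBuilder } from 'langium';

// Log the elapsed time whenever the workspace build reaches a document state.
// Call this once during startup, before the workspace is loaded.
export function logBuildPhaseTimings(builder: DocumentBuilder): void {
    const start = performance.now();
    let previous = start;
    const phases: Array<[DocumentState, string]> = [
        [DocumentState.Parsed, 'Parsed'],
        [DocumentState.IndexedContent, 'IndexedContent'],
        [DocumentState.ComputedScopes, 'ComputedScopes'],
        [DocumentState.Linked, 'Linked'],
        [DocumentState.IndexedReferences, 'IndexedReferences'],
        [DocumentState.Validated, 'Validated']
    ];
    for (const [state, label] of phases) {
        builder.onBuildPhase(state, documents => {
            const now = performance.now();
            console.log(
                `${label}: +${(now - previous).toFixed(0)} ms, ` +
                `total ${(now - start).toFixed(0)} ms, ${documents.length} documents`
            );
            previous = now;
        });
    }
}
```

If the jump to ComputedScopes or Linked dominates, the scoping implementation is the first thing to profile; if Parsed already dominates, grammar complexity or very large files are the more likely cause. This is just one way to narrow it down, not an official benchmark.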
Original question from vibsid0986:
Hello Team,
We ran a trial loading our model in a browser window with a workspace of approximately 1000 files. Loading took roughly 3 to 3.5 minutes. We would like to know whether this aligns with the results you have observed in bulk testing on your end.
Please help us with it.
Thanks in advance.