Improving performance of a VS Code extension targeting a pre-existing language (caching, scope computation, built-in functions, etc.) #1621
Replies: 2 comments 4 replies
-
Hey @BjAlvestad,
Generally yes. That's often the approach we go for when offering built-in functions.
I would recommend using the get-with-provider methods available in the caches (see langium/packages/langium/src/utils/caching.ts, lines 49 to 62 in b2c40e3). Supplying a provider means the value is only computed on a cache miss and stored automatically, so you don't need to check for the key yourself.
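For illustration, a minimal sketch of that pattern (assuming the DocumentCache API referenced above; the value type, computeNames, and the shared-services wiring are placeholders, and exact type names may differ between Langium versions):

```ts
import { DocumentCache, LangiumSharedCoreServices } from 'langium';
import { URI } from 'vscode-uri';

// Assume the shared services are injected elsewhere in the language module.
declare const shared: LangiumSharedCoreServices;

// One value per (document URI, key); entries for a document are evicted
// automatically when that document changes.
const cache = new DocumentCache<string, string[]>(shared);

function cachedNames(documentUri: URI, key: string): string[] {
    // The provider callback runs only on a cache miss; its result is stored
    // in the cache before being returned.
    return cache.get(documentUri, key, () => computeNames(documentUri, key));
}

// Placeholder for the actual computation.
declare function computeNames(documentUri: URI, key: string): string[];
```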
Yes, that's usually the way to go about this. We have a utility function for that in langium/packages/langium/src/utils/ast-utils.ts (lines 84 to 92 in b2c40e3).
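For example, something along these lines (assuming the AstUtils.getDocument helper the linked snippet points at; in older Langium versions it is exposed as a plain getDocument export):

```ts
import { AstNode, AstUtils } from 'langium';
import { URI } from 'vscode-uri';

// The document of any AST node can be obtained by walking up its containers
// to the root, where the document is attached; the utility does exactly that.
function uriOf(node: AstNode): URI {
    return AstUtils.getDocument(node).uri;
}
```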
Exactly. Essentially, any change in a document might result in a broken reference being fixed. This is why relinking is performed on files with broken references after any change. Note that this behavior can be adjusted in the IndexManager. The default is just there to work well with how the default global scoping works. In case your global scope works differently (like in most languages actually, which are often based on some import functionality), it makes sense to adjust the behavior to take the differences in scope behavior into account.
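A rough sketch of such an adjustment for an import-based language (assuming an overridable isAffected hook on DefaultIndexManager; the exact method name and signature may differ between Langium versions, and collectImportedUris is a placeholder for language-specific logic):

```ts
import { DefaultIndexManager, LangiumDocument } from 'langium';

export class MyDslIndexManager extends DefaultIndexManager {
    // Only treat a document as affected when it actually imports one of the
    // changed documents, instead of always re-linking documents that currently
    // contain broken references.
    override isAffected(document: LangiumDocument, changedUris: Set<string>): boolean {
        const imports = collectImportedUris(document);
        for (const uri of changedUris) {
            if (imports.has(uri)) {
                return true;
            }
        }
        return false;
    }
}

// Placeholder: collect the URIs that a document imports (language-specific).
declare function collectImportedUris(document: LangiumDocument): Set<string>;
```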
It depends on the use case. For example, if the builtins only evolve with the language, we sometimes just put the builtins into a TypeScript file as a string. That way, they are always available for the language server to process. See also our guide on this topic.

If you want to read the builtins from the filesystem, that's also completely valid. IIRC TypeScript does something like this. Note that this doesn't work as easily if you want your language server to run in the browser, where you would potentially need to fetch the builtins from a remote server.

A general note from me: once you start working on large workspaces with your language, optimization will always be a core part of your work. We have projects where 90% of our work is essentially just profiling a language and implementing optimizations for it. This isn't only about Langium (or Xtext); it applies to most language projects.
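As a concrete illustration of the builtins-as-a-string approach, closely following the built-in library guide (the langium/lsp import path applies to newer Langium versions, and the builtin: URI scheme and .mydsl extension are made up for the example):

```ts
import { DefaultWorkspaceManager, LangiumDocument, LangiumDocumentFactory } from 'langium';
import type { LangiumSharedServices } from 'langium/lsp';
import type { WorkspaceFolder } from 'vscode-languageserver';
import { URI } from 'vscode-uri';

// Builtin declarations kept as a string inside the extension, so they are
// always available to the language server (also when running in the browser).
const builtins = `
// ...declarations of the built-in functions, written in the DSL itself...
`;

export class MyDslWorkspaceManager extends DefaultWorkspaceManager {
    private readonly documentFactory: LangiumDocumentFactory;

    constructor(services: LangiumSharedServices) {
        super(services);
        this.documentFactory = services.workspace.LangiumDocumentFactory;
    }

    protected override async loadAdditionalDocuments(
        folders: WorkspaceFolder[],
        collector: (document: LangiumDocument) => void
    ): Promise<void> {
        await super.loadAdditionalDocuments(folders, collector);
        // Register the builtins under a synthetic URI so they are indexed and
        // scoped like any other document in the workspace.
        collector(this.documentFactory.fromString(builtins, URI.parse('builtin:///library.mydsl')));
    }
}
```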
-
Avoid linking error on built-ins I have not defined yet
This section may get updated with more notes as I investigate further.
Currently, assuming I have not added a built-in for a given function, its reference produces a linking error. So I am trying to see if I could override something in the DefaultLinker code so that, if it is a normal ID and linking fails, then instead of creating an error object with createLinkingError I could somehow make it behave as if there should be no cross-reference (or make the jump go to a "generic built-in" that says this built-in has not been added yet).
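Roughly what I have in mind (a sketch only, assuming DefaultLinker.getCandidate is the right place to hook in; isLikelyBuiltin and findGenericBuiltinDescription are placeholders, not existing API):

```ts
import { AstNodeDescription, DefaultLinker, LinkingError, ReferenceInfo } from 'langium';

export class MyDslLinker extends DefaultLinker {
    override getCandidate(refInfo: ReferenceInfo): AstNodeDescription | LinkingError {
        const scope = this.scopeProvider.getScope(refInfo);
        const description = scope.getElement(refInfo.reference.$refText);
        if (description) {
            return description;
        }
        // Placeholder heuristic: redirect names that look like built-ins to a
        // generic "not yet defined built-in" node instead of reporting an error.
        if (isLikelyBuiltin(refInfo.reference.$refText)) {
            const fallback = findGenericBuiltinDescription(refInfo);
            if (fallback) {
                return fallback;
            }
        }
        return this.createLinkingError(refInfo);
    }
}

// Placeholders for language-specific logic.
declare function isLikelyBuiltin(name: string): boolean;
declare function findGenericBuiltinDescription(refInfo: ReferenceInfo): AstNodeDescription | undefined;
```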
-
Summary
Linking errors triggering document updates, and their performance impact
In a workspace with many errors where references cannot be resolved (because scoping for some scenarios is not handled yet): is it expected that a change in a healthy file, with no cross-references between it and the faulty ones, will trigger an update where scoping gets recomputed for all the files with broken references as well?
I.e. is it not worth looking into caching to improve performance before I have first expanded my scope provider to cover all scenarios and added built-in libraries? That alone should then greatly improve performance before I even implement any caching.
Bundling built-in libraries
Regarding bundling built-in libraries: would the best option be to just make some "fake" DSL files which describe the interfaces of the available built-in functions, and inject these via loadAdditionalDocuments in the WorkspaceManager?
Caching
At its most basic level, would this just be to initialize an instance of DocumentCache<string, VariableDeclaration[]>, where an array of VariableDeclaration is what the scope gets created from in the end? And then check if the key exists in the cache, return from it if so, and otherwise compute the value, store it in the cache, and then return it for the scope?
We need to provide a URI to the DocumentCache in addition to the key when accessing it. Is the way to go about this to just walk along the containers until we reach the root, and then get the URI from that? E.g. something like the sketch below.
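A rough sketch of what I mean (assuming the DocumentCache and AstUtils.getDocument utilities; the LangiumCoreServices naming may differ between Langium versions, VariableDeclaration comes from my generated AST, and collectVariableDeclarations is a placeholder for the real computation):

```ts
import { AstUtils, DefaultScopeProvider, DocumentCache, LangiumCoreServices, ReferenceInfo, Scope } from 'langium';
import type { VariableDeclaration } from './generated/ast.js';

export class MyDslScopeProvider extends DefaultScopeProvider {
    private readonly variableCache: DocumentCache<string, VariableDeclaration[]>;

    constructor(services: LangiumCoreServices) {
        super(services);
        // Entries for a document are invalidated automatically when it changes.
        this.variableCache = new DocumentCache<string, VariableDeclaration[]>(services.shared);
    }

    override getScope(context: ReferenceInfo): Scope {
        // Walk up the containers to the document root to obtain the URI.
        const uri = AstUtils.getDocument(context.container).uri;
        const declarations = this.variableCache.get(uri, context.property, () =>
            collectVariableDeclarations(context));
        // Simplified: a real implementation would fall back to super.getScope
        // for reference types that are not variable references.
        return this.createScopeForNodes(declarations);
    }
}

// Placeholder for the actual, language-specific computation.
declare function collectVariableDeclarations(context: ReferenceInfo): VariableDeclaration[];
```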
Some more background information
I now have a working extension that gives me cross-references for several parts (local variables, structs, and UDTs defined in other files). But I have not yet fixed cross-references for global variables and built-in functions. The program is in a state where I could actually start using it, were it not that performance is abysmal for very large projects (~25 MB). So I was originally planning to implement caching before continuing to fix scoping for the remaining parts, since I assumed that more resolved cross-references would decrease performance even further. But then I noticed something that made me question this assumption.
When I save a file with no external cross-references, it seems that the scope function also gets called on other files with broken cross-references, but not if I remove the broken references. Is it the expected behavior that broken links will be re-resolved every time any file changes?
If so, I guess I should look into fixing scoping for those before continuing with caching.
I note that this is described under the "Modifications of a document" section on the Document Lifecycle page.
So the system does not really know whether the broken references are affected or not, by virtue of them being broken, and will therefore always trigger an update of those files as well?
Also in that spirit: there are several built-in functions in the language that will not be able to resolve their cross-references in the workspace the user opens.
What would be the best way to build this into my program? Reading from additional target-language code files, which I bundle with the extension, that define these interfaces? Coding them directly into the Langium-based code? Or something else entirely?
I am currently leaning towards the first one, and then just blending these bundled library files in via loadAdditionalDocuments in the WorkspaceManager.