Hi!
So I was trying to set up a searchfox instance for Servo, and I hit a problem that may be annoying to work around (well, I can always get a bigger server).
The save-analysis files generated by Servo are at https://crisal.io/tmp/save-analysis.zip, and they're 1.8GB in total, uncompressed.
It seems somewhat surprising that rls-analysis needs more than 8GB of RAM for that, but if that's expected feel free to close; I guess we just need a bigger machine... Or maybe the API could be extended to ignore some of the data? I see that the implementation of read_crate_data is mostly JSON decoding, so I'm not sure how actionable this is.
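To make the "ignore some of the data" idea a bit more concrete, here's a rough sketch (this is not the rls-data / rls-analysis code, and the field names are only illustrative): since serde drops any field a target struct doesn't declare, decoding into a trimmed-down struct would keep the unneeded parts of each file out of the heap entirely:

```rust
use serde::Deserialize;
use std::{fs::File, io::BufReader, path::Path};

/// Illustrative only: a trimmed mirror of a save-analysis file that keeps
/// just the pieces a consumer cares about. Fields not declared here are
/// silently dropped by serde while decoding, so they never hit the heap.
#[derive(Deserialize)]
struct TrimmedAnalysis {
    #[serde(default)]
    defs: Vec<serde_json::Value>,
    #[serde(default)]
    refs: Vec<serde_json::Value>,
    // imports, relations, macro_refs, ... intentionally omitted
}

fn read_trimmed(path: &Path) -> serde_json::Result<TrimmedAnalysis> {
    let file = File::open(path).expect("readable analysis file");
    serde_json::from_reader(BufReader::new(file))
}
```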
The analysis loader implementation is at:
https://github.com/mozsearch/mozsearch/blob/e45c3d6ccfc3d5986b3e84a9fff5a53178b92484/tools/src/bin/rust-indexer.rs#L120
And the usage is just below:
https://github.com/mozsearch/mozsearch/blob/e45c3d6ccfc3d5986b3e84a9fff5a53178b92484/tools/src/bin/rust-indexer.rs#L420
It OOMs when input_dirs contains just a single entry pointing at that directory; the process gets killed without ever getting past read_analysis_from_files.
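For reference, a stripped-down stand-in for what that step amounts to memory-wise (this is plain serde_json, not the actual rls-analysis code): every file in the directory gets deserialized into an in-memory representation, and all of them are live at the same time, so the resident size can easily be a multiple of the 1.8GB on disk:

```rust
use std::{env, fs, fs::File, io::BufReader, path::Path};

/// Decode every .json file under `dir` into memory at once. This is NOT
/// the rls-analysis code, just a minimal stand-in showing the shape of
/// the work: all decoded crates are kept alive simultaneously.
fn decode_all(dir: &Path) -> serde_json::Result<Vec<serde_json::Value>> {
    let mut crates = Vec::new();
    for entry in fs::read_dir(dir).expect("readable analysis dir") {
        let path = entry.expect("dir entry").path();
        if path.extension().map_or(false, |ext| ext == "json") {
            let reader = BufReader::new(File::open(&path).expect("readable file"));
            crates.push(serde_json::from_reader(reader)?);
        }
    }
    Ok(crates)
}

fn main() {
    let dir = env::args().nth(1).expect("usage: decode_all <analysis-dir>");
    let crates = decode_all(Path::new(&dir)).expect("valid save-analysis JSON");
    println!("decoded {} crate files", crates.len());
}
```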
I'll try with a bigger machine, but I thought this might be worth reporting.
cc @jdm