Hi,
I ended up with a 755 GB index saved on disk after encoding the whole Wikipedia passage collection. This large index takes huge storage and a long time to load onto the GPU. However, it occupies less than 100 GB after loading, which could be due to the index compression mentioned in your paper. Is it possible to save and load the compressed index directly, for better time and storage consumption?
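For reference, here is a minimal sketch of the kind of save/load path I have in mind. It uses a simple int8 scalar quantization over numpy arrays purely as an illustration (the actual compression scheme in your paper may be different, e.g. product quantization); the names and the in-memory buffer are my own, not from this repo:

```python
import io
import numpy as np

# Stand-in for the dense passage embeddings (the real index is wiki-scale).
emb = np.random.rand(1000, 768).astype(np.float32)

# Hypothetical scalar quantization: map float32 to int8 for ~4x smaller storage.
scale = np.abs(emb).max() / 127.0
codes = np.round(emb / scale).astype(np.int8)

# Save the compressed codes plus the scale needed to restore them.
buf = io.BytesIO()  # stands in for a file on disk
np.savez_compressed(buf, codes=codes, scale=scale)
buf.seek(0)

# Later: reload the compressed index and dequantize before (or after) moving to GPU.
data = np.load(buf)
restored = data["codes"].astype(np.float32) * data["scale"]
```

If the compressed representation you build on the GPU could be serialized in a similar way, loading it directly would avoid both the 755 GB disk footprint and the long load time.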