@mc112611 Are you using a GPU for inference? That alone should speed things up for you. There are other techniques as well, such as using a smaller Retriever model, a different FAISS index, or a smaller Reader model, but each of these choices may trade some accuracy for speed.
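One of the suggestions above, switching to a different FAISS index, usually means moving from an exhaustive (flat) index to an approximate one such as IVF, which scores only the documents in a few coarse cells near the query. A minimal NumPy sketch of that tradeoff, using random vectors as stand-in embeddings (the corpus size, dimension, and cell counts below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus of unit-norm "document embeddings" (stand-ins for Retriever output).
n_docs, dim, n_cells = 10_000, 64, 100
docs = rng.standard_normal((n_docs, dim)).astype(np.float32)
docs /= np.linalg.norm(docs, axis=1, keepdims=True)

query = rng.standard_normal(dim).astype(np.float32)
query /= np.linalg.norm(query)

# Flat (exhaustive) search: score every document against the query.
exact_best = int(np.argmax(docs @ query))

# IVF-style search: assign each doc to its nearest coarse centroid,
# then score only docs in the cells closest to the query.
centroids = docs[rng.choice(n_docs, n_cells, replace=False)]
assignments = np.argmax(docs @ centroids.T, axis=1)

n_probe = 10  # more probed cells -> better recall, slower search
probed = np.argsort(centroids @ query)[-n_probe:]
candidates = np.flatnonzero(np.isin(assignments, probed))

approx_best = int(candidates[np.argmax(docs[candidates] @ query)])

print(len(candidates))  # only a fraction of the corpus was scored
```

With `n_probe` of the 100 cells probed, only roughly a tenth of the corpus is scored, which is where the speedup comes from; raising `n_probe` recovers accuracy at the cost of speed, mirroring the accuracy/latency tradeoff mentioned above.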

@mayankjobanputra
Answer selected by mc112611