OpenAI Embedding -> batch size is limited to self.batch_size = 64 #5274
-
@michaelfeil Thank you for bringing this to our attention. I am currently looking into this issue and have found the commit that introduced the limitation in Haystack: 5ca9635#diff-0e5bbc2d22af73487fde67b66b74be8970dc3ff499bff5199200f5f4843794b1R394. I couldn't find any limitation on the batch size on OpenAI's side; I searched, for example, here: https://platform.openai.com/docs/api-reference/embeddings/create. @vblagoje, are you aware of any known limitation on the batch size? If not, I'd suggest we drop the limitation in Haystack and replace the line.
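If the limitation is dropped, the batching could be made configurable instead of hardcoded. A minimal sketch of what that might look like (the helper name `embed_in_batches` and the default of 64 are illustrative assumptions, not the actual Haystack code):

```python
from typing import List, Sequence

def embed_in_batches(texts: Sequence[str], batch_size: int = 64) -> List[List[str]]:
    """Split `texts` into consecutive batches of at most `batch_size` items.

    Each inner list would then be sent as one request to the embeddings
    endpoint, so a larger `batch_size` means fewer sequential requests.
    """
    if batch_size < 1:
        raise ValueError("batch_size must be a positive integer")
    return [list(texts[i : i + batch_size]) for i in range(0, len(texts), batch_size)]

# Example: 150 documents with batch_size=64 yields batches of 64, 64, and 22.
batches = embed_in_batches([f"doc {i}" for i in range(150)], batch_size=64)
print([len(b) for b in batches])  # [64, 64, 22]
```

With a user-facing `batch_size` parameter, callers who see high per-request latency could raise the value themselves rather than being capped at 64.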
-
I noticed that the batch size is limited to only 64. My latency is quite high, so a fair amount of time is spent just waiting for the sequential requests. Is there any way this can be increased?
haystack/haystack/nodes/retriever/_openai_encoder.py
Lines 41 to 44 in 00efa51