Commit 7c175e3

Update ascend.py (#30060)
Add a `batch_size` field so `embed_documents` processes texts in batches, fixing out-of-memory errors when embedding a large number of texts.
1 parent 3b066dc commit 7c175e3
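
With this change, callers can cap peak memory by tuning the batch size at construction time. A minimal usage sketch follows; the import path is inferred from the file location, and `model_path` is an assumed constructor argument not shown in this diff:

```python
# Hypothetical usage sketch; only batch_size is introduced by this commit.
from langchain_community.embeddings.ascend import AscendEmbeddings

embedder = AscendEmbeddings(
    model_path="/path/to/ascend/model",  # assumed argument, not part of this diff
    batch_size=16,  # smaller batches lower peak memory at some throughput cost
)
vectors = embedder.embed_documents(["first document", "second document"])
```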

File tree

1 file changed (+13, −1)

  • libs/community/langchain_community/embeddings/ascend.py

libs/community/langchain_community/embeddings/ascend.py

Lines changed: 13 additions & 1 deletion
```diff
@@ -30,6 +30,7 @@ class AscendEmbeddings(Embeddings, BaseModel):
     document_instruction: str = ""
     use_fp16: bool = True
     pooling_method: Optional[str] = "cls"
+    batch_size: int = 32
     model: Any
     tokenizer: Any

@@ -119,7 +120,18 @@ def pooling(self, last_hidden_state: Any, attention_mask: Any = None) -> Any:
         )

     def embed_documents(self, texts: List[str]) -> List[List[float]]:
-        return self.encode([self.document_instruction + text for text in texts])
+        try:
+            import numpy as np
+        except ImportError as e:
+            raise ImportError(
+                "Unable to import numpy, please install with `pip install -U numpy`."
+            ) from e
+        embedding_list = []
+        for i in range(0, len(texts), self.batch_size):
+            texts_ = texts[i : i + self.batch_size]
+            emb = self.encode([self.document_instruction + text for text in texts_])
+            embedding_list.append(emb)
+        return np.concatenate(embedding_list)

     def embed_query(self, text: str) -> List[float]:
         return self.encode([self.query_instruction + text])[0]
```
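
The new body is the standard chunk-and-concatenate pattern: encode fixed-size slices of the input so only one batch of activations is resident at a time, then stitch the per-batch arrays into a single matrix. A standalone sketch of the same idea, with illustrative names that are not part of the LangChain API:

```python
import numpy as np

def batched_encode(encode, texts, batch_size=32):
    """Encode texts in fixed-size chunks to bound peak memory."""
    parts = []
    for i in range(0, len(texts), batch_size):
        # Each call returns a (batch, dim) array; chunking keeps memory use flat.
        parts.append(encode(texts[i : i + batch_size]))
    return np.concatenate(parts)  # shape (len(texts), dim)
```

Note that `np.concatenate` returns an ndarray rather than the annotated `List[List[float]]`; most downstream code accepts either, though a strict caller would need `.tolist()`. It also raises `ValueError` on an empty input list, a behavior the patched method inherits.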
