[community][fix](DeepInfraEmbeddings): Implement chunking for large batches (#21189)

**Description:**
This PR introduces chunking logic to the `DeepInfraEmbeddings` class so that
large batches no longer exceed the backend's maximum batch size. Embedding
generation now splits large batches into smaller, manageable chunks, each
conforming to the maximum batch size limit.
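The chunking approach can be sketched as follows. This is a minimal illustration, not the actual implementation: the `MAX_BATCH_SIZE` value and the `embed_in_chunks` helper name are hypothetical.

```python
from typing import Callable, List

# Hypothetical limit; the real batch size used by DeepInfraEmbeddings may differ.
MAX_BATCH_SIZE = 1024


def embed_in_chunks(
    texts: List[str],
    embed_batch: Callable[[List[str]], List[List[float]]],
) -> List[List[float]]:
    """Embed `texts` by sending at most MAX_BATCH_SIZE items per backend call."""
    embeddings: List[List[float]] = []
    for start in range(0, len(texts), MAX_BATCH_SIZE):
        chunk = texts[start : start + MAX_BATCH_SIZE]
        embeddings.extend(embed_batch(chunk))
    return embeddings
```

Each chunk is embedded independently and the results are concatenated, so the output order matches the input order.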

**Issue:**
Fixes #21189

**Dependencies:**
No new dependencies introduced.
Authored by Oguz Vuruskaner on 2024-05-09 00:45:42 +03:00, committed by GitHub.
Commit 5b35f077f9 (parent f4ddf64faa). 2 changed files with 27 additions and 2 deletions.

@@ -17,3 +17,13 @@ def test_deepinfra_call() -> None:
     assert len(r1[1]) == 768
     r2 = deepinfra_emb.embed_query("What is the third letter of Greek alphabet")
     assert len(r2) == 768
+
+
+def test_deepinfra_call_with_large_batch_size() -> None:
+    deepinfra_emb = DeepInfraEmbeddings(model_id="BAAI/bge-base-en-v1.5")
+    texts = 2000 * [
+        "Alpha is the first letter of Greek alphabet",
+    ]
+    r1 = deepinfra_emb.embed_documents(texts)
+    assert len(r1) == 2000
+    assert len(r1[0]) == 768