langchain_openai: clean duplicate code for openai embedding. (#30872)

The `_chunk_size` value is not changed by `self._tokenize`, so the second
assignment is duplicate code.

Signed-off-by: zhanluxianshen <zhanluxianshen@163.com>
湛露先生 2025-04-28 03:07:41 +08:00 committed by GitHub
parent 79a537d308
commit 5fb8fd863a


@@ -517,7 +517,6 @@ class OpenAIEmbeddings(BaseModel, Embeddings):
         _chunk_size = chunk_size or self.chunk_size
         _iter, tokens, indices = self._tokenize(texts, _chunk_size)
         batched_embeddings: list[list[float]] = []
-        _chunk_size = chunk_size or self.chunk_size
         for i in range(0, len(tokens), _chunk_size):
             response = await self.async_client.create(
                 input=tokens[i : i + _chunk_size], **self._invocation_params