langchain/libs/partners/openai/tests/unit_tests
Aubrey Ford b344f34635
partners/openai: OpenAIEmbeddings not respecting chunk_size argument (#30757)
When calling `embed_documents` and providing a `chunk_size` argument,
that argument is ignored when `OpenAIEmbeddings` is instantiated with
its default configuration (where `check_embedding_ctx_length=True`).

`_get_len_safe_embeddings` specifies a `chunk_size` parameter but it's
not being passed through in `embed_documents`, which is its only caller.
This appears to be an oversight, especially given that the
`_get_len_safe_embeddings` docstring states it should respect "the set
embedding context length and chunk size."
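
The pass-through described above can be sketched as follows. This is a simplified stand-in, not the actual `OpenAIEmbeddings` class: the names `embed_documents`, `_get_len_safe_embeddings`, and `chunk_size` mirror the real ones, but the batching body is stubbed out for illustration.

```python
# Simplified sketch of the fix: forward the caller-supplied chunk_size
# to the length-safe helper instead of silently dropping it.
from typing import Optional


class OpenAIEmbeddingsSketch:
    def __init__(self, chunk_size: int = 1000) -> None:
        self.chunk_size = chunk_size  # instance-level default

    def _get_len_safe_embeddings(
        self, texts: list[str], *, chunk_size: Optional[int] = None
    ) -> list[str]:
        # Fall back to the instance default only when no override is given.
        effective = chunk_size or self.chunk_size
        # ... real code batches `texts` into groups of `effective` tokens
        # and calls the embeddings API; stubbed here for illustration ...
        return [f"batched with chunk_size={effective}"] * len(texts)

    def embed_documents(
        self, texts: list[str], chunk_size: Optional[int] = None
    ) -> list[str]:
        # Before the fix, chunk_size was not forwarded, so the instance
        # default always won. The fix is to pass it through:
        return self._get_len_safe_embeddings(texts, chunk_size=chunk_size)
```

With the pass-through in place, an explicit `chunk_size` overrides the instance default, while calls that omit it behave exactly as before.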

Developers typically expect an explicitly provided method parameter to take
effect (and to take precedence over instance defaults), especially when the
object was instantiated with its default configuration. I was confused as to
why my API calls were being rejected regardless of the chunk size I provided.

This bug also exists in the langchain_community package. I can add that fix to
this PR if requested; otherwise I will create a new one once this passes.
2025-04-18 15:27:27 -04:00
chat_models multiple: permit optional fields on multimodal content blocks (#30887) 2025-04-17 12:48:46 +00:00
embeddings partners/openai: OpenAIEmbeddings not respecting chunk_size argument (#30757) 2025-04-18 15:27:27 -04:00
fake partners[lint]: run pyupgrade to get code in line with 3.9 standards (#30781) 2025-04-11 07:18:44 -04:00
llms partners[lint]: run pyupgrade to get code in line with 3.9 standards (#30781) 2025-04-11 07:18:44 -04:00
__init__.py
test_imports.py
test_load.py openai[patch]: update imports in test (#30828) 2025-04-14 19:33:38 +00:00
test_secrets.py partners[lint]: run pyupgrade to get code in line with 3.9 standards (#30781) 2025-04-11 07:18:44 -04:00
test_token_counts.py openai[patch]: upgrade tiktoken and fix test (#30621) 2025-04-02 10:44:48 -04:00