This is a follow-on PR applying the same change that was previously made in partners/openai. Previous PR: https://github.com/langchain-ai/langchain/pull/30757

When embed_documents is called with an explicit chunk_size argument, that argument is ignored whenever OpenAIEmbeddings is instantiated with its default configuration (check_embedding_ctx_length=True). _get_len_safe_embeddings accepts a chunk_size parameter, but embed_documents, its only caller, never passes it through. This appears to be an oversight, especially since the _get_len_safe_embeddings docstring states it respects "the set embedding context length and chunk size."

Developers generally expect an explicitly provided method argument to take effect (and to take precedence over the instance default), especially when the class was instantiated with defaults. I was confused as to why my API calls kept being rejected regardless of the chunk size I provided.
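For context, here is a minimal sketch of the behaviour being fixed. The class, batching logic, and return values below are simplified placeholders rather than the real langchain implementation; only the method names embed_documents and _get_len_safe_embeddings and the chunk_size / check_embedding_ctx_length attributes come from the description above.

```python
from typing import List, Optional


class FakeOpenAIEmbeddings:
    """Simplified stand-in for OpenAIEmbeddings to illustrate the chunk_size fix.

    Hypothetical class and batching logic; the real implementation lives in
    langchain's OpenAI integration and also handles tokenization and the API call.
    """

    chunk_size: int = 1000  # instance default, used when no chunk_size is passed
    check_embedding_ctx_length: bool = True

    def _get_len_safe_embeddings(
        self, texts: List[str], *, chunk_size: Optional[int] = None
    ) -> List[List[float]]:
        # Respect the caller's chunk_size if given, otherwise the instance default.
        batch_size = chunk_size or self.chunk_size
        embeddings: List[List[float]] = []
        for i in range(0, len(texts), batch_size):
            batch = texts[i : i + batch_size]
            # The real code tokenizes, splits over-long inputs, and calls the
            # embeddings API here; we just record the batch size actually used.
            embeddings.extend([[float(len(batch))]] * len(batch))
        return embeddings

    def embed_documents(
        self, texts: List[str], chunk_size: Optional[int] = None
    ) -> List[List[float]]:
        # Before the fix, the argument was silently dropped:
        #     return self._get_len_safe_embeddings(texts)
        # After the fix, the explicit chunk_size is forwarded and takes precedence:
        return self._get_len_safe_embeddings(texts, chunk_size=chunk_size)


if __name__ == "__main__":
    emb = FakeOpenAIEmbeddings()
    # With the fix, batches of 5 are used instead of the instance default of 1000.
    print(emb.embed_documents(["doc"] * 10, chunk_size=5))
```

With the one-line change in embed_documents, an explicit chunk_size now controls the batch size used by the length-safe path instead of being overridden by the instance default.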
Test files in this directory:

__init__.py
test_baichuan.py
test_deterministic_embedding.py
test_edenai.py
test_embaas.py
test_gpt4all.py
test_gradient_ai.py
test_huggingface.py
test_imports.py
test_infinity_local.py
test_infinity.py
test_llamacpp.py
test_llamafile.py
test_llm_rails.py
test_model2vec.py
test_naver.py
test_oci_gen_ai_embedding.py
test_ollama.py
test_openai.py
test_ovhcloud.py
test_premai.py
test_sparkllm.py
test_vertexai.py
test_yandex.py