langchain/libs/core/tests/unit_tests/load

Latest commit: b1a7f4e106 by ccurme (2025-02-24 09:34:27 -05:00)

core, openai[patch]: support serialization of pydantic models in messages (#29940)
Resolves https://github.com/langchain-ai/langchain/issues/29003 and
https://github.com/langchain-ai/langchain/issues/27264.
Related: https://github.com/langchain-ai/langchain-redis/issues/52

The example below shows that chat responses carrying structured (pydantic) output can now be serialized, so LLM caches work with `with_structured_output`:

```python
from langchain.chat_models import init_chat_model
from langchain.globals import set_llm_cache
from langchain_community.cache import SQLiteCache
from pydantic import BaseModel

cache = SQLiteCache()

set_llm_cache(cache)

class Temperature(BaseModel):
    value: int
    city: str

llm = init_chat_model("openai:gpt-4o-mini")
structured_llm = llm.with_structured_output(Temperature)
```
```python
# First call: cache miss, the request goes to the model API (~681 ms)
response = structured_llm.invoke("What is the average temperature of Rome in May?")
```
```python
# Second call: cache hit, the response is revived from SQLiteCache (~6.98 ms)
response = structured_llm.invoke("What is the average temperature of Rome in May?")
```
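For context, the cache hit above works because the generation, an `AIMessage` whose `additional_kwargs` carry the parsed pydantic object, can now be round-tripped through the core serializer. Below is a minimal sketch of that round trip; the `"parsed"` key and the sample values are illustrative assumptions, not guaranteed fields of the OpenAI integration.

```python
from langchain_core.load import dumps, loads
from langchain_core.messages import AIMessage
from pydantic import BaseModel


class Temperature(BaseModel):
    value: int
    city: str


# A chat message carrying a pydantic instance, similar to what structured-output
# responses attach to the raw message ("parsed" is used here for illustration).
message = AIMessage(
    content="",
    additional_kwargs={"parsed": Temperature(value=21, city="Rome")},
)

# The cache stores generations as JSON produced by the core serializer; pydantic
# payloads inside messages no longer break this step.
blob = dumps(message)

# On a cache hit the stored JSON is revived into a message again. The parsed
# payload comes back as plain data, not necessarily the original class.
restored = loads(blob)
```

Since SQLiteCache persists to a local file (`.langchain.db` by default), the cached response also survives across processes.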
| File | Last commit message | Last commit date |
| --- | --- | --- |
| __init__.py | BUG: more core fixes (#13665) | 2023-11-21 15:15:48 -08:00 |
| test_imports.py | BUG: more core fixes (#13665) | 2023-11-21 15:15:48 -08:00 |
| test_serializable.py | core, openai[patch]: support serialization of pydantic models in messages (#29940) | 2025-02-24 09:34:27 -05:00 |