Mirror of https://github.com/hwchase17/langchain.git
Synced 2025-11-05 10:45:45 +00:00
Commit a2863f87573e33258a97ebdf2deaf13bb490bf2f

6 Commits

**ada740b5b9** - community: Add ruff rule PGH003 (#30812)

See https://docs.astral.sh/ruff/rules/blanket-type-ignore/

---------

Co-authored-by: Chester Curme <chester.curme@gmail.com>
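
For reference, PGH003 (blanket-type-ignore) flags bare `# type: ignore` comments and asks for the specific mypy error code instead. A minimal illustration (hypothetical snippet, not taken from the langchain codebase):

```python
from typing import Optional


def parse_port(raw: Optional[str]) -> int:
    # PGH003 flags this: a bare "# type: ignore" silences every mypy error on the line.
    return int(raw)  # type: ignore


def parse_port_scoped(raw: Optional[str]) -> int:
    # Preferred: name the specific error code being suppressed.
    return int(raw)  # type: ignore[arg-type]
```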

**723031d548** - community: Bump ruff version to 0.9 (#29206)

Co-authored-by: Erick Friis <erick@langchain.dev>

**c2a3021bb0** - multiple: pydantic 2 compatibility, v0.3 (#26443)

Signed-off-by: ChengZi <chen.zhang@zilliz.com>
Co-authored-by: Eugene Yurtsev <eyurtsev@gmail.com>
Co-authored-by: Bagatur <22008038+baskaryan@users.noreply.github.com>
Co-authored-by: Dan O'Donovan <dan.odonovan@gmail.com>
Co-authored-by: Tom Daniel Grande <tomdgrande@gmail.com>
Co-authored-by: Grande <Tom.Daniel.Grande@statsbygg.no>
Co-authored-by: Bagatur <baskaryan@gmail.com>
Co-authored-by: ccurme <chester.curme@gmail.com>
Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
Co-authored-by: Tomaz Bratanic <bratanic.tomaz@gmail.com>
Co-authored-by: ZhangShenao <15201440436@163.com>
Co-authored-by: Friso H. Kingma <fhkingma@gmail.com>
Co-authored-by: ChengZi <chen.zhang@zilliz.com>
Co-authored-by: Nuno Campos <nuno@langchain.dev>
Co-authored-by: Morgante Pell <morgantep@google.com>
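
Pydantic 2 compatibility work typically involves API changes like the following; this is a generic illustration (class and field names are placeholders, not code from this commit):

```python
from pydantic import BaseModel, field_validator


class ModelConfig(BaseModel):
    temperature: float = 0.7

    # Pydantic v1 used @validator; v2 renames it to @field_validator.
    @field_validator("temperature")
    @classmethod
    def check_range(cls, v: float) -> float:
        if not 0.0 <= v <= 2.0:
            raise ValueError("temperature must be between 0 and 2")
        return v


# v1's .parse_obj() / .dict() become .model_validate() / .model_dump() in v2.
config = ModelConfig.model_validate({"temperature": 0.5})
print(config.model_dump())
```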

**ab527027ac** - community: Resolve refs recursively when generating openai_fn from OpenAPI spec (#19002)

- **Description:** This PR improves the generation of payloads for OpenAI
functions when converting from an OpenAPI spec file. The solution is to
recursively resolve `$ref`s (a sketch of this idea follows the examples below).
Currently, when converting OpenAPI specs into OpenAI functions using
`openapi_spec_to_openai_fn`, schemas with nested references produce functions
that still contain `$ref` entries, which causes the LLM to generate payloads
with an incorrect schema.
For example, for the OpenAPI spec:
```
text = """
{
"openapi": "3.0.3",
"info": {
"title": "Swagger Petstore - OpenAPI 3.0",
"termsOfService": "http://swagger.io/terms/",
"contact": {
"email": "apiteam@swagger.io"
},
"license": {
"name": "Apache 2.0",
"url": "http://www.apache.org/licenses/LICENSE-2.0.html"
},
"version": "1.0.11"
},
"externalDocs": {
"description": "Find out more about Swagger",
"url": "http://swagger.io"
},
"servers": [
{
"url": "https://petstore3.swagger.io/api/v3"
}
],
"tags": [
{
"name": "pet",
"description": "Everything about your Pets",
"externalDocs": {
"description": "Find out more",
"url": "http://swagger.io"
}
},
{
"name": "store",
"description": "Access to Petstore orders",
"externalDocs": {
"description": "Find out more about our store",
"url": "http://swagger.io"
}
},
{
"name": "user",
"description": "Operations about user"
}
],
"paths": {
"/pet": {
"post": {
"tags": [
"pet"
],
"summary": "Add a new pet to the store",
"description": "Add a new pet to the store",
"operationId": "addPet",
"requestBody": {
"description": "Create a new pet in the store",
"content": {
"application/json": {
"schema": {
"$ref": "#/components/schemas/Pet"
}
}
},
"required": true
},
"responses": {
"200": {
"description": "Successful operation",
"content": {
"application/json": {
"schema": {
"$ref": "#/components/schemas/Pet"
}
}
}
}
}
}
}
},
"components": {
"schemas": {
"Tag": {
"type": "object",
"properties": {
"id": {
"type": "integer",
"format": "int64"
},
"model_type": {
"type": "number"
}
}
},
"Category": {
"type": "object",
"required": [
"model",
"year",
"age"
],
"properties": {
"year": {
"type": "integer",
"format": "int64",
"example": 1
},
"model": {
"type": "string",
"example": "Ford"
},
"age": {
"type": "integer",
"example": 42
}
}
},
"Pet": {
"required": [
"name"
],
"type": "object",
"properties": {
"id": {
"type": "integer",
"format": "int64",
"example": 10
},
"name": {
"type": "string",
"example": "doggie"
},
"category": {
"$ref": "#/components/schemas/Category"
},
"tags": {
"type": "array",
"items": {
"$ref": "#/components/schemas/Tag"
}
},
"status": {
"type": "string",
"description": "pet status in the store",
"enum": [
"available",
"pending",
"sold"
]
}
}
}
}
}
}
"""
```
Executing:
```
spec = OpenAPISpec.from_text(text)
pet_openai_functions, pet_callables = openapi_spec_to_openai_fn(spec)
response = model.invoke("Create a pet named Scott", functions=pet_openai_functions)
```
`pet_openai_functions` contains unresolved `$ref`s:
```
[
{
"name": "addPet",
"description": "Add a new pet to the store",
"parameters": {
"type": "object",
"properties": {
"json": {
"properties": {
"id": {
"type": "integer",
"schema_format": "int64",
"example": 10
},
"name": {
"type": "string",
"example": "doggie"
},
"category": {
"ref": "#/components/schemas/Category"
},
"tags": {
"items": {
"ref": "#/components/schemas/Tag"
},
"type": "array"
},
"status": {
"type": "string",
"enum": [
"available",
"pending",
"sold"
],
"description": "pet status in the store"
}
},
"type": "object",
"required": [
"name",
"photoUrls"
]
}
}
}
}
]
```
and the generated JSON has an incorrect schema (e.g. `category` is filled
with `id` and `name` instead of `model`, `year`, and `age`):
```
{
"id": 1,
"name": "Scott",
"category": {
"id": 1,
"name": "Dogs"
},
"tags": [
{
"id": 1,
"name": "tag1"
}
],
"status": "available"
}
```
With this change, `pet_openai_functions` becomes:
```
[
{
"name": "addPet",
"description": "Add a new pet to the store",
"parameters": {
"type": "object",
"properties": {
"json": {
"properties": {
"id": {
"type": "integer",
"schema_format": "int64",
"example": 10
},
"name": {
"type": "string",
"example": "doggie"
},
"category": {
"properties": {
"year": {
"type": "integer",
"schema_format": "int64",
"example": 1
},
"model": {
"type": "string",
"example": "Ford"
},
"age": {
"type": "integer",
"example": 42
}
},
"type": "object",
"required": [
"model",
"year",
"age"
]
},
"tags": {
"items": {
"properties": {
"id": {
"type": "integer",
"schema_format": "int64"
},
"model_type": {
"type": "number"
}
},
"type": "object"
},
"type": "array"
},
"status": {
"type": "string",
"enum": [
"available",
"pending",
"sold"
],
"description": "pet status in the store"
}
},
"type": "object",
"required": [
"name"
]
}
}
}
}
]
```
and the JSON generated by the LLM is:
```
{
"id": 1,
"name": "Scott",
"category": {
"year": 2022,
"model": "Dog",
"age": 42
},
"tags": [
{
"id": 1,
"model_type": 1
}
],
"status": "available"
}
```
which has the intended schema.
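
The fix boils down to recursively dereferencing `$ref` entries against the spec's `components` section before emitting the function schema. A minimal sketch of the idea, working on the spec as plain dicts (names are illustrative; this is not the actual `langchain_community` implementation):

```python
from typing import Any


def resolve_refs(node: Any, components: dict) -> Any:
    """Recursively replace {"$ref": "#/components/schemas/X"} with the referenced schema."""
    if isinstance(node, dict):
        if "$ref" in node:
            # "#/components/schemas/Pet" -> components["schemas"]["Pet"]
            keys = node["$ref"].split("/")[2:]  # drop "#" and "components"
            target: Any = components
            for key in keys:
                target = target[key]
            # Resolve refs nested inside the target too (e.g. Pet -> Category, Tag).
            # A real implementation would also guard against circular references.
            return resolve_refs(target, components)
        return {key: resolve_refs(value, components) for key, value in node.items()}
    if isinstance(node, list):
        return [resolve_refs(item, components) for item in node]
    return node
```

Applied to the `Pet` schema above, this inlines `Category` and `Tag`, so the function definition passed to the model contains no `$ref` at all.
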
- **Twitter handle:** @brunoalvisio

---------

Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>

**a0c2281540** - infra: update mypy 1.10, ruff 0.5 (#23721)

```python
"""python scripts/update_mypy_ruff.py"""
import glob
import tomllib
from pathlib import Path
import toml
import subprocess
import re
ROOT_DIR = Path(__file__).parents[1]
def main():
for path in glob.glob(str(ROOT_DIR / "libs/**/pyproject.toml"), recursive=True):
print(path)
with open(path, "rb") as f:
pyproject = tomllib.load(f)
try:
pyproject["tool"]["poetry"]["group"]["typing"]["dependencies"]["mypy"] = (
"^1.10"
)
pyproject["tool"]["poetry"]["group"]["lint"]["dependencies"]["ruff"] = (
"^0.5"
)
except KeyError:
continue
with open(path, "w") as f:
toml.dump(pyproject, f)
cwd = "/".join(path.split("/")[:-1])
completed = subprocess.run(
"poetry lock --no-update; poetry install --with typing; poetry run mypy . --no-color",
cwd=cwd,
shell=True,
capture_output=True,
text=True,
)
logs = completed.stdout.split("\n")
to_ignore = {}
for l in logs:
if re.match("^(.*)\:(\d+)\: error:.*\[(.*)\]", l):
path, line_no, error_type = re.match(
"^(.*)\:(\d+)\: error:.*\[(.*)\]", l
).groups()
if (path, line_no) in to_ignore:
to_ignore[(path, line_no)].append(error_type)
else:
to_ignore[(path, line_no)] = [error_type]
print(len(to_ignore))
for (error_path, line_no), error_types in to_ignore.items():
all_errors = ", ".join(error_types)
full_path = f"{cwd}/{error_path}"
try:
with open(full_path, "r") as f:
file_lines = f.readlines()
except FileNotFoundError:
continue
file_lines[int(line_no) - 1] = (
file_lines[int(line_no) - 1][:-1] + f" # type: ignore[{all_errors}]\n"
)
with open(full_path, "w") as f:
f.write("".join(file_lines))
subprocess.run(
"poetry run ruff format .; poetry run ruff --select I --fix .",
cwd=cwd,
shell=True,
capture_output=True,
text=True,
)
if __name__ == "__main__":
main()
```

**ed58eeb9c5** - community[major], core[patch], langchain[patch], experimental[patch]: Create langchain-community (#14463)

Moved the following modules to the new package langchain-community in a backwards compatible fashion:

```
mv langchain/langchain/adapters community/langchain_community
mv langchain/langchain/callbacks community/langchain_community/callbacks
mv langchain/langchain/chat_loaders community/langchain_community
mv langchain/langchain/chat_models community/langchain_community
mv langchain/langchain/document_loaders community/langchain_community
mv langchain/langchain/docstore community/langchain_community
mv langchain/langchain/document_transformers community/langchain_community
mv langchain/langchain/embeddings community/langchain_community
mv langchain/langchain/graphs community/langchain_community
mv langchain/langchain/llms community/langchain_community
mv langchain/langchain/memory/chat_message_histories community/langchain_community
mv langchain/langchain/retrievers community/langchain_community
mv langchain/langchain/storage community/langchain_community
mv langchain/langchain/tools community/langchain_community
mv langchain/langchain/utilities community/langchain_community
mv langchain/langchain/vectorstores community/langchain_community
mv langchain/langchain/agents/agent_toolkits community/langchain_community
mv langchain/langchain/cache.py community/langchain_community
```

Moved the following to core:

```
mv langchain/langchain/utils/json_schema.py core/langchain_core/utils
mv langchain/langchain/utils/html.py core/langchain_core/utils
mv langchain/langchain/utils/strings.py core/langchain_core/utils
cat langchain/langchain/utils/env.py >> core/langchain_core/utils/env.py
rm langchain/langchain/utils/env.py
```

See .scripts/community_split/script_integrations.sh for all changes
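
For context on the "backwards compatible fashion" part: the usual pattern is to leave a thin re-export shim at the old import path that points to the new package. A minimal sketch of that pattern (illustrative placeholder code, not the actual langchain shim):

```python
# langchain/llms/__init__.py -- illustrative shim, not the actual langchain source.
import warnings

# Re-export from the new home so `from langchain.llms import OpenAI` keeps working.
from langchain_community.llms import OpenAI

warnings.warn(
    "Importing OpenAI from `langchain.llms` is deprecated; "
    "import it from `langchain_community.llms` instead.",
    DeprecationWarning,
    stacklevel=2,
)

__all__ = ["OpenAI"]
```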