Erick Friis
42d40d694b
partners/openai: release 0.2.11 ( #28461 )
2024-12-02 23:35:18 +00:00
Bagatur
e9c16552fa
openai[patch]: bump core dep ( #28361 )
2024-11-26 08:37:05 -08:00
Bagatur
e7dc26aefb
openai[patch]: Release 0.2.10 ( #28360 )
2024-11-26 08:30:29 -08:00
Erick Friis
d9d689572a
openai: release 0.2.9, o1 streaming ( #28197 )
2024-11-18 23:54:38 +00:00
Erick Friis
6d2004ee7d
multiple: langchain-standard-tests -> langchain-tests ( #28139 )
2024-11-15 11:32:04 -08:00
ccurme
5eaa0e8c45
openai[patch]: release 0.2.8 ( #28062 )
2024-11-12 14:57:11 -05:00
Bagatur
9611f0b55d
openai[patch]: Release 0.2.7 ( #28047 )
2024-11-12 15:16:15 +00:00
ccurme
66966a6e72
openai[patch]: release 0.2.6 ( #27924 )
...
Some additions in support of the [predicted
outputs](https://platform.openai.com/docs/guides/latency-optimization#use-predicted-outputs)
feature:
- Bump openai sdk version
- Add integration test
- Add example to integration docs
The `prediction` kwarg is already plumbed through model invocation.
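As a rough sketch (model name, prompt, and predicted text are illustrative, not taken from this PR), passing the `prediction` kwarg through `ChatOpenAI` invocation might look like:
```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o")  # illustrative model; any model with predicted-output support

code = "def add(a, b):\n    return a + b\n"

# Supply the expected output as a prediction; useful when the response is
# expected to largely repeat the provided content (e.g. small code edits).
response = llm.invoke(
    [("user", f"Rename the function to `sum_two`. Return only the code:\n\n{code}")],
    prediction={"type": "content", "content": code},
)
print(response.content)
```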
2024-11-05 23:02:24 +00:00
Bagatur
06420de2e7
integrations[patch]: bump core to 0.3.15 ( #27805 )
2024-10-31 11:27:05 -07:00
Bagatur
d5306899d3
openai[patch]: Release 0.2.4 ( #27652 )
2024-10-25 20:26:21 +00:00
Erick Friis
2cf2cefe39
partners/openai: release 0.2.3 ( #27457 )
2024-10-18 08:16:01 -07:00
Erick Friis
7d65a32ee0
openai: audio modality, remove sockets from unit tests ( #27436 )
2024-10-18 08:02:09 -07:00
Erick Friis
92ae61bcc8
multiple: rely on asyncio_mode auto in tests ( #27200 )
2024-10-15 16:26:38 +00:00
Bagatur
98942edcc9
openai[patch]: Release 0.2.2 ( #27119 )
2024-10-04 11:54:01 -07:00
Bagatur
eaffa92c1d
openai[patch]: Release 0.2.1 ( #26858 )
2024-09-25 15:55:49 +00:00
Bagatur
f7ae12fa1f
openai[minor]: Release 0.2.0 ( #26464 )
2024-09-13 15:38:10 -07:00
Erick Friis
c2a3021bb0
multiple: pydantic 2 compatibility, v0.3 ( #26443 )
...
Signed-off-by: ChengZi <chen.zhang@zilliz.com>
Co-authored-by: Eugene Yurtsev <eyurtsev@gmail.com>
Co-authored-by: Bagatur <22008038+baskaryan@users.noreply.github.com>
Co-authored-by: Dan O'Donovan <dan.odonovan@gmail.com>
Co-authored-by: Tom Daniel Grande <tomdgrande@gmail.com>
Co-authored-by: Grande <Tom.Daniel.Grande@statsbygg.no>
Co-authored-by: Bagatur <baskaryan@gmail.com>
Co-authored-by: ccurme <chester.curme@gmail.com>
Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
Co-authored-by: Tomaz Bratanic <bratanic.tomaz@gmail.com>
Co-authored-by: ZhangShenao <15201440436@163.com>
Co-authored-by: Friso H. Kingma <fhkingma@gmail.com>
Co-authored-by: ChengZi <chen.zhang@zilliz.com>
Co-authored-by: Nuno Campos <nuno@langchain.dev>
Co-authored-by: Morgante Pell <morgantep@google.com>
2024-09-13 14:38:45 -07:00
Bagatur
d9813bdbbc
openai[patch]: Release 0.1.25 ( #26439 )
2024-09-13 12:00:01 -07:00
Erick Friis
1d98937e8d
partners/openai: release 0.1.24 ( #26417 )
2024-09-12 21:54:13 -07:00
Bagatur
85aef7641c
openai[patch]: Release 0.1.23 ( #25804 )
2024-08-28 08:52:08 +00:00
Bagatur
a06818a654
openai[patch]: update core dep ( #25502 )
2024-08-16 18:30:17 +00:00
Bagatur
9f0c76bf89
openai[patch]: Release 0.1.22 ( #25496 )
2024-08-16 16:53:04 +00:00
Bagatur
fd546196ef
openai[patch]: Release 0.1.21 ( #25269 )
2024-08-10 16:37:31 -07:00
Bagatur
a4086119f8
openai[patch]: Release 0.1.21rc2 ( #25146 )
2024-08-07 16:59:15 +00:00
Bagatur
09fbce13c5
openai[patch]: ChatOpenAI.with_structured_output json_schema support ( #25123 )
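A minimal sketch of what this enables, assuming a recent `langchain-openai` (the schema and model name are illustrative):
```python
from pydantic import BaseModel, Field

from langchain_openai import ChatOpenAI


class Joke(BaseModel):
    """A joke with a setup and a punchline."""

    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline of the joke")


llm = ChatOpenAI(model="gpt-4o-mini")
# method="json_schema" opts into OpenAI's native Structured Outputs support.
structured_llm = llm.with_structured_output(Joke, method="json_schema")
joke = structured_llm.invoke("Tell me a joke about cats")
```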
2024-08-07 08:09:07 -07:00
Bagatur
7882d5c978
openai[patch]: Release 0.1.21rc1 ( #25116 )
2024-08-06 21:50:36 +00:00
Bagatur
752a71b688
integrations[patch]: release model packages ( #24900 )
2024-07-31 20:48:20 +00:00
Bagatur
b3a23ddf93
integration releases ( #24725 )
...
Release anthropic, openai, groq, mistralai, robocorp
2024-07-26 12:30:10 -07:00
Erick Friis
3dce2e1d35
all: add release notes to pypi ( #24519 )
2024-07-22 13:59:13 -07:00
Erick Friis
80f3d48195
openai: release 0.1.18 ( #24369 )
2024-07-17 22:26:33 +00:00
Erick Friis
81639243e2
openai: release 0.1.17 ( #24361 )
2024-07-17 18:50:42 +00:00
Bagatur
13b0d7ec8f
openai[patch]: Release 0.1.16 ( #24202 )
2024-07-12 13:58:39 -07:00
Bagatur
cb5031f22f
integrations[patch]: require core >=0.2.17 ( #24207 )
2024-07-12 20:54:01 +00:00
Erick Friis
71c2221f8c
openai: release 0.1.15 ( #24097 )
2024-07-10 16:45:42 -07:00
Bagatur
a0c2281540
infra: update mypy 1.10, ruff 0.5 ( #23721 )
...
```python
"""python scripts/update_mypy_ruff.py"""
import glob
import re
import subprocess
import tomllib
from pathlib import Path

import toml

ROOT_DIR = Path(__file__).parents[1]


def main():
    for path in glob.glob(str(ROOT_DIR / "libs/**/pyproject.toml"), recursive=True):
        print(path)
        with open(path, "rb") as f:
            pyproject = tomllib.load(f)
        # Bump the mypy and ruff dev-dependency constraints in each package.
        try:
            pyproject["tool"]["poetry"]["group"]["typing"]["dependencies"]["mypy"] = (
                "^1.10"
            )
            pyproject["tool"]["poetry"]["group"]["lint"]["dependencies"]["ruff"] = (
                "^0.5"
            )
        except KeyError:
            continue
        with open(path, "w") as f:
            toml.dump(pyproject, f)

        cwd = "/".join(path.split("/")[:-1])

        # Re-lock, reinstall, and run mypy to collect the new errors.
        completed = subprocess.run(
            "poetry lock --no-update; poetry install --with typing; poetry run mypy . --no-color",
            cwd=cwd,
            shell=True,
            capture_output=True,
            text=True,
        )
        logs = completed.stdout.split("\n")

        # Group mypy errors by (file, line) so they can be ignored in place.
        to_ignore = {}
        for l in logs:
            match = re.match(r"^(.*):(\d+): error:.*\[(.*)\]", l)
            if match:
                path, line_no, error_type = match.groups()
                if (path, line_no) in to_ignore:
                    to_ignore[(path, line_no)].append(error_type)
                else:
                    to_ignore[(path, line_no)] = [error_type]
        print(len(to_ignore))

        # Append a `# type: ignore[...]` comment to every offending line.
        for (error_path, line_no), error_types in to_ignore.items():
            all_errors = ", ".join(error_types)
            full_path = f"{cwd}/{error_path}"
            try:
                with open(full_path, "r") as f:
                    file_lines = f.readlines()
            except FileNotFoundError:
                continue
            file_lines[int(line_no) - 1] = (
                file_lines[int(line_no) - 1][:-1] + f" # type: ignore[{all_errors}]\n"
            )
            with open(full_path, "w") as f:
                f.write("".join(file_lines))

        subprocess.run(
            "poetry run ruff format .; poetry run ruff --select I --fix .",
            cwd=cwd,
            shell=True,
            capture_output=True,
            text=True,
        )


if __name__ == "__main__":
    main()
```
2024-07-03 10:33:27 -07:00
Bagatur
6168c846b2
openai[patch]: Release 0.1.14 ( #23782 )
2024-07-02 18:17:15 -04:00
Bagatur
af2c05e5f3
openai[patch]: Release 0.1.13 ( #23651 )
2024-06-28 17:10:30 -07:00
ccurme
5d93916665
openai[patch]: release 0.1.12 ( #23641 )
2024-06-28 19:51:16 +00:00
ccurme
bffc3c24a0
openai[patch]: release 0.1.11 ( #23596 )
2024-06-27 18:48:40 +00:00
ccurme
5bfcb898ad
openai[patch]: bump sdk version ( #23592 )
...
Tests were failing with `TypeError: Completions.create() got an unexpected
keyword argument 'parallel_tool_calls'`.
2024-06-27 11:57:24 -04:00
Bagatur
92ac0fc9bd
openai[patch]: Release 0.1.10 ( #23410 )
2024-06-25 17:40:02 +00:00
ccurme
75c7c3a1a7
openai: release 0.1.9 ( #23263 )
2024-06-21 11:15:29 -04:00
Bagatur
8698cb9b28
infra: add more formatter rules to openai ( #23189 )
...
Turns on
https://docs.astral.sh/ruff/settings/#format_docstring-code-format and
https://docs.astral.sh/ruff/settings/#format_skip-magic-trailing-comma
```toml
[tool.ruff.format]
docstring-code-format = true
skip-magic-trailing-comma = true
```
2024-06-19 11:39:58 -07:00
Bagatur
0a4ee864e9
openai[patch]: image token counting ( #23147 )
...
Resolves #23000
---------
Co-authored-by: isaac hershenson <ihershenson@hmc.edu>
Co-authored-by: ccurme <chester.curme@gmail.com>
2024-06-19 10:41:47 -07:00
ccurme
42257b120f
partners: fix numpy dep ( #22858 )
...
Following https://github.com/langchain-ai/langchain/pull/22813, which
added Python 3.12 to CI, here we update numpy accordingly in partner
packages.
2024-06-13 14:46:42 -04:00
ccurme
6e1df72a88
openai[patch]: Release 0.1.8 ( #22291 )
2024-05-29 20:08:30 +00:00
ccurme
9a010fb761
openai: read stream_options ( #21548 )
...
OpenAI recently added a `stream_options` parameter to its chat
completions API (see [release
notes](https://platform.openai.com/docs/changelog/added-chat-completions-stream-usage)).
When this parameter is set to `{"include_usage": True}`, an extra "empty"
chunk containing token usage is added to the end of the stream. Here we
propagate that token usage to `AIMessage.usage_metadata`.
We enable this feature by default. Streams now include an extra
chunk at the end, **after** the chunk with
`response_metadata={'finish_reason': 'stop'}`.
New behavior:
```
[AIMessageChunk(content='', id='run-4b20dbe0-3817-4f62-b89d-03ef76f25bde'),
AIMessageChunk(content='Hello', id='run-4b20dbe0-3817-4f62-b89d-03ef76f25bde'),
AIMessageChunk(content='!', id='run-4b20dbe0-3817-4f62-b89d-03ef76f25bde'),
AIMessageChunk(content='', response_metadata={'finish_reason': 'stop'}, id='run-4b20dbe0-3817-4f62-b89d-03ef76f25bde'),
AIMessageChunk(content='', id='run-4b20dbe0-3817-4f62-b89d-03ef76f25bde', usage_metadata={'input_tokens': 8, 'output_tokens': 9, 'total_tokens': 17})]
```
Old behavior (accessible by passing `stream_options={"include_usage":
False}` into `(a)stream`):
```
[AIMessageChunk(content='', id='run-1312b971-c5ea-4d92-9015-e6604535f339'),
AIMessageChunk(content='Hello', id='run-1312b971-c5ea-4d92-9015-e6604535f339'),
AIMessageChunk(content='!', id='run-1312b971-c5ea-4d92-9015-e6604535f339'),
AIMessageChunk(content='', response_metadata={'finish_reason': 'stop'}, id='run-1312b971-c5ea-4d92-9015-e6604535f339')]
```
From what I can tell this is not yet implemented in Azure, so we enable
only for ChatOpenAI.
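A brief sketch of the new default and the opt-out (model name and prompt are illustrative):
```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")  # illustrative model name

# Default: the final chunk of the stream carries token usage.
for chunk in llm.stream("Say hello!"):
    if chunk.usage_metadata:
        print(chunk.usage_metadata)

# Opt out of the extra usage chunk to restore the old behavior.
for chunk in llm.stream("Say hello!", stream_options={"include_usage": False}):
    print(chunk.content, end="")
```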
2024-05-24 13:20:56 -04:00
ccurme
152c8cac33
anthropic, openai: cut pre-releases ( #22083 )
2024-05-23 15:02:23 -04:00
ccurme
4470d3b4a0
partners: bump core in packages implementing ls_params ( #21868 )
...
These packages all import `LangSmithParams`, which was released in
langchain-core==0.2.0.
N.B. we will need to release `openai` and then bump `langchain-openai`
in `together` and `upstage`.
2024-05-20 11:51:43 -07:00
Bagatur
af284518bc
openai[patch]: Release 0.1.7, bump tiktoken 0.7.0 ( #21723 )
2024-05-15 12:19:29 -07:00