Erick Friis
42d40d694b
partners/openai: release 0.2.11 ( #28461 )
2024-12-02 23:35:18 +00:00
Bagatur
e9c16552fa
openai[patch]: bump core dep ( #28361 )
2024-11-26 08:37:05 -08:00
Erick Friis
d9d689572a
openai: release 0.2.9, o1 streaming ( #28197 )
2024-11-18 23:54:38 +00:00
Erick Friis
6d2004ee7d
multiple: langchain-standard-tests -> langchain-tests ( #28139 )
2024-11-15 11:32:04 -08:00
ccurme
5eaa0e8c45
openai[patch]: release 0.2.8 ( #28062 )
2024-11-12 14:57:11 -05:00
Bagatur
9611f0b55d
openai[patch]: Release 0.2.7 ( #28047 )
2024-11-12 15:16:15 +00:00
ccurme
66966a6e72
openai[patch]: release 0.2.6 ( #27924 )
...
Some additions in support of the [predicted
outputs](https://platform.openai.com/docs/guides/latency-optimization#use-predicted-outputs)
feature:
- Bump openai sdk version
- Add integration test
- Add example to integration docs
The `prediction` kwarg is already plumbed through model invocation.
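A minimal sketch of the payload the `prediction` kwarg carries; the code snippet and model name below are illustrative assumptions, not taken from this PR:

```python
# Predicted outputs: when most of the response is known in advance
# (e.g. editing existing code), the known text is sent as a prediction
# so the API can skip regenerating unchanged spans.
code = '''
class User:
    first_name: str
    last_name: str
'''

# Shape of the prediction payload the OpenAI API expects:
prediction = {"type": "content", "content": code}

# Since the kwarg is already plumbed through invocation, usage would be
# roughly (hypothetical call, requires an API key):
#   llm = ChatOpenAI(model="gpt-4o")
#   llm.invoke(messages, prediction=prediction)
print(prediction["type"])
```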
2024-11-05 23:02:24 +00:00
Bagatur
06420de2e7
integrations[patch]: bump core to 0.3.15 ( #27805 )
2024-10-31 11:27:05 -07:00
Bagatur
d5306899d3
openai[patch]: Release 0.2.4 ( #27652 )
2024-10-25 20:26:21 +00:00
Erick Friis
2cf2cefe39
partners/openai: release 0.2.3 ( #27457 )
2024-10-18 08:16:01 -07:00
Erick Friis
7d65a32ee0
openai: audio modality, remove sockets from unit tests ( #27436 )
2024-10-18 08:02:09 -07:00
Bagatur
98942edcc9
openai[patch]: Release 0.2.2 ( #27119 )
2024-10-04 11:54:01 -07:00
Bagatur
f7ae12fa1f
openai[minor]: Release 0.2.0 ( #26464 )
2024-09-13 15:38:10 -07:00
Erick Friis
c2a3021bb0
multiple: pydantic 2 compatibility, v0.3 ( #26443 )
...
Signed-off-by: ChengZi <chen.zhang@zilliz.com>
Co-authored-by: Eugene Yurtsev <eyurtsev@gmail.com>
Co-authored-by: Bagatur <22008038+baskaryan@users.noreply.github.com>
Co-authored-by: Dan O'Donovan <dan.odonovan@gmail.com>
Co-authored-by: Tom Daniel Grande <tomdgrande@gmail.com>
Co-authored-by: Grande <Tom.Daniel.Grande@statsbygg.no>
Co-authored-by: Bagatur <baskaryan@gmail.com>
Co-authored-by: ccurme <chester.curme@gmail.com>
Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
Co-authored-by: Tomaz Bratanic <bratanic.tomaz@gmail.com>
Co-authored-by: ZhangShenao <15201440436@163.com>
Co-authored-by: Friso H. Kingma <fhkingma@gmail.com>
Co-authored-by: ChengZi <chen.zhang@zilliz.com>
Co-authored-by: Nuno Campos <nuno@langchain.dev>
Co-authored-by: Morgante Pell <morgantep@google.com>
2024-09-13 14:38:45 -07:00
Bagatur
d9813bdbbc
openai[patch]: Release 0.1.25 ( #26439 )
2024-09-13 12:00:01 -07:00
Erick Friis
1d98937e8d
partners/openai: release 0.1.24 ( #26417 )
2024-09-12 21:54:13 -07:00
Bagatur
85aef7641c
openai[patch]: Release 0.1.23 ( #25804 )
2024-08-28 08:52:08 +00:00
Bagatur
a06818a654
openai[patch]: update core dep ( #25502 )
2024-08-16 18:30:17 +00:00
Bagatur
a4086119f8
openai[patch]: Release 0.1.21rc2 ( #25146 )
2024-08-07 16:59:15 +00:00
Bagatur
09fbce13c5
openai[patch]: ChatOpenAI.with_structured_output json_schema support ( #25123 )
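As a hedged illustration (the schema below is invented for this example, not from the PR), the `json_schema` method lets `with_structured_output` pass a raw JSON Schema through to OpenAI's structured-output API:

```python
# A strict JSON schema as OpenAI's structured-output endpoint expects it.
schema = {
    "name": "Person",
    "schema": {
        "type": "object",
        "properties": {
            "name": {"type": "string"},
            "age": {"type": "integer"},
        },
        "required": ["name", "age"],
        "additionalProperties": False,
    },
    "strict": True,
}

# Hypothetical usage with langchain-openai (requires an API key):
#   structured = ChatOpenAI(model="gpt-4o").with_structured_output(
#       schema, method="json_schema"
#   )
print(schema["name"])
```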
2024-08-07 08:09:07 -07:00
Bagatur
7882d5c978
openai[patch]: Release 0.1.21rc1 ( #25116 )
2024-08-06 21:50:36 +00:00
Bagatur
752a71b688
integrations[patch]: release model packages ( #24900 )
2024-07-31 20:48:20 +00:00
Bagatur
b3a23ddf93
integration releases ( #24725 )
...
Release anthropic, openai, groq, mistralai, robocorp
2024-07-26 12:30:10 -07:00
Erick Friis
80f3d48195
openai: release 0.1.18 ( #24369 )
2024-07-17 22:26:33 +00:00
Erick Friis
81639243e2
openai: release 0.1.17 ( #24361 )
2024-07-17 18:50:42 +00:00
Bagatur
cb5031f22f
integrations[patch]: require core >=0.2.17 ( #24207 )
2024-07-12 20:54:01 +00:00
Erick Friis
71c2221f8c
openai: release 0.1.15 ( #24097 )
2024-07-10 16:45:42 -07:00
Bagatur
a0c2281540
infra: update mypy 1.10, ruff 0.5 ( #23721 )
...
```python
"""python scripts/update_mypy_ruff.py"""
import glob
import re
import subprocess
import tomllib
from pathlib import Path

import toml

ROOT_DIR = Path(__file__).parents[1]


def main():
    for path in glob.glob(str(ROOT_DIR / "libs/**/pyproject.toml"), recursive=True):
        print(path)
        with open(path, "rb") as f:
            pyproject = tomllib.load(f)
        try:
            pyproject["tool"]["poetry"]["group"]["typing"]["dependencies"]["mypy"] = (
                "^1.10"
            )
            pyproject["tool"]["poetry"]["group"]["lint"]["dependencies"]["ruff"] = (
                "^0.5"
            )
        except KeyError:
            continue
        with open(path, "w") as f:
            toml.dump(pyproject, f)
        cwd = "/".join(path.split("/")[:-1])

        # Re-lock and run mypy to collect the new errors.
        completed = subprocess.run(
            "poetry lock --no-update; poetry install --with typing; poetry run mypy . --no-color",
            cwd=cwd,
            shell=True,
            capture_output=True,
            text=True,
        )
        logs = completed.stdout.split("\n")

        # Group error codes by (file, line) so each line gets one ignore comment.
        to_ignore = {}
        for l in logs:
            if re.match(r"^(.*):(\d+): error:.*\[(.*)\]", l):
                path, line_no, error_type = re.match(
                    r"^(.*):(\d+): error:.*\[(.*)\]", l
                ).groups()
                if (path, line_no) in to_ignore:
                    to_ignore[(path, line_no)].append(error_type)
                else:
                    to_ignore[(path, line_no)] = [error_type]
        print(len(to_ignore))

        # Append "# type: ignore[...]" to each offending line.
        for (error_path, line_no), error_types in to_ignore.items():
            all_errors = ", ".join(error_types)
            full_path = f"{cwd}/{error_path}"
            try:
                with open(full_path, "r") as f:
                    file_lines = f.readlines()
            except FileNotFoundError:
                continue
            file_lines[int(line_no) - 1] = (
                file_lines[int(line_no) - 1][:-1] + f"  # type: ignore[{all_errors}]\n"
            )
            with open(full_path, "w") as f:
                f.write("".join(file_lines))

        subprocess.run(
            "poetry run ruff format .; poetry run ruff --select I --fix .",
            cwd=cwd,
            shell=True,
            capture_output=True,
            text=True,
        )


if __name__ == "__main__":
    main()
```
2024-07-03 10:33:27 -07:00
ccurme
5bfcb898ad
openai[patch]: bump sdk version ( #23592 )
...
Tests failing with `TypeError: Completions.create() got an unexpected
keyword argument 'parallel_tool_calls'`
2024-06-27 11:57:24 -04:00
Bagatur
0a4ee864e9
openai[patch]: image token counting ( #23147 )
...
Resolves #23000
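For context, OpenAI's documented token formula for images can be sketched as follows; this is an approximation from the public docs, not the exact code added in this PR:

```python
import math


def count_image_tokens(width: int, height: int, detail: str = "high") -> int:
    """Approximate OpenAI's published image token formula (gpt-4o family)."""
    if detail == "low":
        # Low-detail images cost a flat 85 tokens regardless of size.
        return 85
    # Scale to fit within a 2048 x 2048 square, preserving aspect ratio.
    if max(width, height) > 2048:
        scale = 2048 / max(width, height)
        width, height = int(width * scale), int(height * scale)
    # Scale so the shortest side is at most 768 px.
    if min(width, height) > 768:
        scale = 768 / min(width, height)
        width, height = int(width * scale), int(height * scale)
    # 170 tokens per 512 px tile, plus a fixed 85-token base.
    tiles = math.ceil(width / 512) * math.ceil(height / 512)
    return 170 * tiles + 85


print(count_image_tokens(1024, 1024))  # scales to 768x768, 4 tiles -> 765
```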
---------
Co-authored-by: isaac hershenson <ihershenson@hmc.edu>
Co-authored-by: ccurme <chester.curme@gmail.com>
2024-06-19 10:41:47 -07:00
ccurme
42257b120f
partners: fix numpy dep ( #22858 )
...
Following https://github.com/langchain-ai/langchain/pull/22813, which
added Python 3.12 to CI, here we update numpy accordingly in partner
packages.
2024-06-13 14:46:42 -04:00
ccurme
6e1df72a88
openai[patch]: Release 0.1.8 ( #22291 )
2024-05-29 20:08:30 +00:00
ccurme
9a010fb761
openai: read stream_options ( #21548 )
...
OpenAI recently added a `stream_options` parameter to its chat
completions API (see [release
notes](https://platform.openai.com/docs/changelog/added-chat-completions-stream-usage)).
When this parameter is set to `{"include_usage": True}`, an extra "empty"
message is added to the end of a stream containing token usage. Here we
propagate token usage to `AIMessage.usage_metadata`.
We enable this feature by default. Streams now include an extra
chunk at the end, **after** the chunk with
`response_metadata={'finish_reason': 'stop'}`.
New behavior:
```
[AIMessageChunk(content='', id='run-4b20dbe0-3817-4f62-b89d-03ef76f25bde'),
AIMessageChunk(content='Hello', id='run-4b20dbe0-3817-4f62-b89d-03ef76f25bde'),
AIMessageChunk(content='!', id='run-4b20dbe0-3817-4f62-b89d-03ef76f25bde'),
AIMessageChunk(content='', response_metadata={'finish_reason': 'stop'}, id='run-4b20dbe0-3817-4f62-b89d-03ef76f25bde'),
AIMessageChunk(content='', id='run-4b20dbe0-3817-4f62-b89d-03ef76f25bde', usage_metadata={'input_tokens': 8, 'output_tokens': 9, 'total_tokens': 17})]
```
Old behavior (accessible by passing `stream_options={"include_usage":
False}` into `(a)stream`):
```
[AIMessageChunk(content='', id='run-1312b971-c5ea-4d92-9015-e6604535f339'),
AIMessageChunk(content='Hello', id='run-1312b971-c5ea-4d92-9015-e6604535f339'),
AIMessageChunk(content='!', id='run-1312b971-c5ea-4d92-9015-e6604535f339'),
AIMessageChunk(content='', response_metadata={'finish_reason': 'stop'}, id='run-1312b971-c5ea-4d92-9015-e6604535f339')]
```
From what I can tell this is not yet implemented in Azure, so we enable
it only for ChatOpenAI.
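A sketch of how a consumer can pick up the trailing usage chunk; a stand-in `Chunk` dataclass is used here instead of the real `AIMessageChunk`:

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Chunk:
    """Stand-in for AIMessageChunk; illustrative only."""
    content: str
    usage_metadata: Optional[dict] = None
    response_metadata: dict = field(default_factory=dict)


# Mirrors the new stream shape: finish_reason chunk, then a usage chunk.
stream = [
    Chunk(""),
    Chunk("Hello"),
    Chunk("!"),
    Chunk("", response_metadata={"finish_reason": "stop"}),
    Chunk("", usage_metadata={"input_tokens": 8, "output_tokens": 9, "total_tokens": 17}),
]

# Concatenating content is unaffected: the usage chunk is empty text.
text = "".join(c.content for c in stream)
# The usage chunk arrives last, after the finish_reason chunk.
usage = next(c.usage_metadata for c in reversed(stream) if c.usage_metadata)
print(text, usage["total_tokens"])  # Hello! 17
```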
2024-05-24 13:20:56 -04:00
ccurme
152c8cac33
anthropic, openai: cut pre-releases ( #22083 )
2024-05-23 15:02:23 -04:00
ccurme
4470d3b4a0
partners: bump core in packages implementing ls_params ( #21868 )
...
These packages all import `LangSmithParams`, which was released in
langchain-core==0.2.0.
N.B. we will need to release `openai` and then bump `langchain-openai`
in `together` and `upstage`.
2024-05-20 11:51:43 -07:00
Bagatur
af284518bc
openai[patch]: Release 0.1.7, bump tiktoken 0.7.0 ( #21723 )
2024-05-15 12:19:29 -07:00
Erick Friis
c77d2f2b06
multiple: core 0.2 nonbreaking dep, check_diff community->langchain dep ( #21646 )
...
0.2 is not a breaking release for core (but it is for langchain and
community).
To keep the core+langchain+community packages in sync at 0.2, we will
relax deps throughout the ecosystem to tolerate `langchain-core` 0.2.
2024-05-13 19:50:36 -07:00
Bagatur
6ac6158a07
openai[patch]: support tool_choice="required" ( #21216 )
...
Co-authored-by: ccurme <chester.curme@gmail.com>
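A hedged sketch of what this option maps to in the request payload; the tool definition is invented for illustration:

```python
# An example tool in OpenAI's function-tool format.
tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

# tool_choice="required" tells the API the model MUST call at least one
# of the provided tools, as opposed to "auto" (model may answer in plain
# text instead) or naming one specific function to force.
payload = {"model": "gpt-4o", "tools": [tool], "tool_choice": "required"}
print(payload["tool_choice"])
```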
2024-05-02 18:33:25 -04:00
Bagatur
bef50ded63
openai[patch]: fix special token default behavior ( #21131 )
...
By default, handle special sequences as regular text
2024-04-30 20:08:24 -04:00
ccurme
465fbaa30b
openai: release 0.1.4 ( #20939 )
2024-04-26 09:56:49 -07:00
Bagatur
799714c629
release anthropic, fireworks, openai, groq, mistral ( #20333 )
2024-04-11 09:19:52 -07:00
Erick Friis
9eb6f538f0
infra, multiple: rc release versions ( #20252 )
2024-04-09 17:54:58 -07:00
Bagatur
a8eb0f5b1b
openai[patch]: pre-release 0.1.3-rc.1 ( #20249 )
2024-04-10 00:22:08 +00:00
Bagatur
0b2f0307d7
openai[patch]: Release 0.1.2 ( #20241 )
2024-04-09 21:55:19 +00:00
Erick Friis
855ba46f80
standard-tests: a standard unit and integration test set ( #20182 )
...
just chat models for now
2024-04-09 12:43:00 -07:00
Erick Friis
e71daa7a03
openai[patch]: add test coverage to output ( #19462 )
2024-03-22 15:33:10 -07:00
Erick Friis
a9cda536ad
openai[patch]: fix core min version ( #19366 )
2024-03-20 15:38:29 -07:00
Erick Friis
f6c8700326
openai[patch]: release 0.1.0, message id and name support ( #19363 )
2024-03-20 15:11:39 -07:00
Erick Friis
69e9610f62
openai[patch]: pass message name ( #17537 )
2024-03-19 19:57:27 +00:00
Erick Friis
687d27567d
openai[patch]: unit test azure init ( #18703 )
2024-03-06 14:17:09 -08:00