Bagatur
236e957abb
core,groq,openai,mistralai,robocorp,fireworks,anthropic[patch]: Update BaseModel subclass and instance checks to handle both v1 and proper namespaces ( #24417 )
...
After this PR, chat models will correctly handle pydantic 2 models with
`bind_tools` and `with_structured_output`.
```python
import pydantic
print(pydantic.__version__)
```
2.8.2
```python
from langchain_openai import ChatOpenAI
from pydantic import BaseModel, Field

class Add(BaseModel):
    x: int
    y: int

model = ChatOpenAI().bind_tools([Add])
print(model.invoke('2 + 5').tool_calls)

model = ChatOpenAI().with_structured_output(Add)
print(type(model.invoke('2 + 5')))
```
```
[{'name': 'Add', 'args': {'x': 2, 'y': 5}, 'id': 'call_PNUFa4pdfNOYXxIMHc6ps2Do', 'type': 'tool_call'}]
<class '__main__.Add'>
```
```python
from langchain_openai import ChatOpenAI
from pydantic.v1 import BaseModel, Field

class Add(BaseModel):
    x: int
    y: int

model = ChatOpenAI().bind_tools([Add])
print(model.invoke('2 + 5').tool_calls)

model = ChatOpenAI().with_structured_output(Add)
print(type(model.invoke('2 + 5')))
```
```
[{'name': 'Add', 'args': {'x': 2, 'y': 5}, 'id': 'call_hhiHYP441cp14TtrHKx3Upg0', 'type': 'tool_call'}]
<class '__main__.Add'>
```
Addresses issue: https://github.com/langchain-ai/langchain/issues/22782
---------
Co-authored-by: Eugene Yurtsev <eyurtsev@gmail.com>
2024-07-22 20:07:39 +00:00
maang-h
6c7d9f93b9
feat: Add ChatTongyi structured output ( #24187 )
...
- **Description:** Add `with_structured_output` method to ChatTongyi to
support structured output.
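A minimal usage sketch (the model name and schema below are illustrative, not taken from the PR):
```python
from langchain_community.chat_models.tongyi import ChatTongyi
from langchain_core.pydantic_v1 import BaseModel

class Joke(BaseModel):
    setup: str
    punchline: str

llm = ChatTongyi(model="qwen-max")  # assumes DASHSCOPE_API_KEY is set
structured_llm = llm.with_structured_output(Joke)
print(structured_llm.invoke("Tell me a joke about cats"))
```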
2024-07-15 15:57:21 -04:00
Eugene Yurtsev
2c180d645e
core[minor],community[minor]: Upgrade all @root_validator() to @pre_init ( #23841 )
...
This PR introduces a @pre_init decorator that's a @root_validator(pre=True) but with all the defaults populated!
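A hedged sketch of the pattern change (assuming `pre_init` is importable from `langchain_core.utils`; the model and fields are illustrative):
```python
from langchain_core.pydantic_v1 import BaseModel
from langchain_core.utils import pre_init

class MyClient(BaseModel):
    api_key: str = ""
    timeout: int = 60

    @pre_init
    def validate_environment(cls, values: dict) -> dict:
        # Unlike a bare @root_validator(pre=True), `values` here already
        # includes the declared defaults (e.g. timeout=60), so the check
        # can rely on every field key being present.
        if not values["api_key"]:
            values["api_key"] = "dummy-key"
        return values

print(MyClient())  # api_key='dummy-key' timeout=60
```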
2024-07-08 16:09:29 -04:00
mackong
11483b0fb8
community[patch]: set tool name for tongyi&qianfan llm ( #22889 )
...
- **Description:** The name of ToolMessage defaults to None, which makes the tool message sent to the LLM look like
```json
{"role": "tool",
 "tool_call_id": "",
 "content": "{\"time\": \"12:12\"}",
 "name": null}
```
But the name is essential for some LLMs like Tongyi Qwen, so we need to set the name using the agent_action's tool value (see the sketch below).
- **Issue:** N/A
- **Dependencies:** N/A
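As an illustration of the intended shape (the tool name and call id below are hypothetical), the point is that `name` is now populated from the agent action's tool instead of defaulting to None:
```python
from langchain_core.messages import ToolMessage

# Hypothetical values; what matters is that `name` is no longer null
# in the serialized message sent to Tongyi/Qianfan.
msg = ToolMessage(
    content='{"time": "12:12"}',
    tool_call_id="call_abc123",
    name="get_current_time",
)
print(msg.name)  # "get_current_time"
```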
2024-06-28 09:17:05 -04:00
mackong
70834cd741
community[patch]: support convert FunctionMessage for Tongyi ( #23569 )
...
**Description:** For a function-call agent with Tongyi, the AgentAction will be
converted to a FunctionMessage by
47f69fe0d8/libs/core/langchain_core/agents.py (L188)
But Tongyi's *convert_message_to_dict* doesn't support
FunctionMessage
47f69fe0d8/libs/community/langchain_community/chat_models/tongyi.py (L184-L207)
so the next round of conversation fails with a *TypeError*
exception.
This patch adds support for converting FunctionMessage for Tongyi (see the sketch below).
**Issue:** N/A
**Dependencies:** N/A
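A rough sketch of the kind of conversion this patch adds (the role string and dict shape are assumptions, not the literal diff):
```python
from langchain_core.messages import FunctionMessage

def convert_function_message_to_dict(message: FunctionMessage) -> dict:
    # Assumed OpenAI-style shape for the dict sent back to dashscope
    # on the next conversation round.
    return {
        "role": "function",
        "name": message.name,
        "content": message.content,
    }

print(convert_function_message_to_dict(
    FunctionMessage(name="get_current_time", content='{"time": "12:12"}')
))
```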
2024-06-27 15:49:26 -04:00
maang-h
5070004e8a
docs: Update Tongyi ChatModel docstring ( #23540 )
...
- **Description:** Update the Tongyi ChatModel docstring to the rich format
- **Issue:** #22296
2024-06-26 13:07:13 -04:00
HuiyuanYan
bf3aefce93
community[patch]: Update tongyi.py to support MultimodalConversation in dashscope. ( #21249 )
...
Add support for multimodal conversation in dashscope. Now we can use the
multimodal language models "qwen-vl-v1", "qwen-vl-chat-v1", and
"qwen-audio-turbo" to process pictures and audio. :)
- **Description:** add multimodal conversation support in dashscope
- **Issue:**
- **Dependencies:** dashscope≥1.18.0
- **Twitter handle:** none :)
- **How to use it?**:
```python
from langchain_community.chat_models import ChatTongyi

Tongyi_chat = ChatTongyi(
    top_p=0.5,
    dashscope_api_key=api_key,  # assumes api_key is defined
    model="qwen-vl-v1"
)
response = Tongyi_chat.invoke(
    input=[
        {
            "role": "user",
            "content": [
                {"image": "https://dashscope.oss-cn-beijing.aliyuncs.com/images/dog_and_girl.jpeg"},
                {"text": "这是什么?"},  # "What is this?"
            ]
        }
    ]
)
```
---------
Co-authored-by: Bagatur <baskaryan@gmail.com>
2024-05-22 22:04:58 +00:00
Pengcheng Liu
4cf523949a
community[patch]: Update model client to support vision model in Tong… ( #21474 )
...
- **Description:** Tongyi uses different clients for the chat model and the
vision model. This PR chooses the proper client based on the model name so
that both chat and vision models are supported. See the [tongyi
document](https://help.aliyun.com/zh/dashscope/developer-reference/tongyi-qianwen-vl-plus-api?spm=a2c4g.11186623.0.0.27404c9a7upm11)
for details.
```python
from langchain_core.messages import HumanMessage
from langchain_community.chat_models import ChatTongyi

llm = ChatTongyi(model_name='qwen-vl-max')
image_message = {
    "image": "https://lilianweng.github.io/posts/2023-06-23-agent/agent-overview.png"
}
text_message = {
    "text": "summarize this picture",
}
message = HumanMessage(content=[text_message, image_message])
llm.invoke([message])
```
- **Issue:** None
- **Dependencies:** None
- **Twitter handle:** None
2024-05-21 11:58:27 -07:00
ccurme
19e6bf814b
community: fix CI ( #21766 )
2024-05-16 15:41:03 +00:00
Cheese
0ead09f84d
community: Implement bind_tools for ChatTongyi ( #20725 )
...
## Description
Implement `bind_tools` in ChatTongyi. Usage example:
```py
from langchain_core.tools import tool
from langchain_community.chat_models.tongyi import ChatTongyi

@tool
def multiply(first_int: int, second_int: int) -> int:
    """Multiply two integers together."""
    return first_int * second_int

llm = ChatTongyi(model="qwen-turbo")
llm_with_tools = llm.bind_tools([multiply])

msg = llm_with_tools.invoke("What's 5 times forty two")
print(msg)
```
Streaming is also supported.
## Dependencies
No Dependency is required for this change.
---------
Co-authored-by: Bagatur <22008038+baskaryan@users.noreply.github.com>
Co-authored-by: Chester Curme <chester.curme@gmail.com>
2024-05-16 10:39:35 -04:00
Eugene Yurtsev
25fbe356b4
community[patch]: upgrade to recent version of mypy ( #21616 )
...
This PR upgrades community to a recent version of mypy. It inserts
`type: ignore` comments on all existing failures.
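For illustration, the inserted suppressions look like a standard mypy comment (the function and error code here are just an example):
```python
def answer() -> int:
    return "forty-two"  # type: ignore[return-value]
```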
2024-05-13 14:55:07 -04:00
Pengcheng Liu
ecd19a9e58
community[patch]: Add function call support in Tongyi chat model. ( #20119 )
...
- **Description:** This PR adds function calling support to the Tongyi chat
model (see the sketch below).
- **Issue:** None
- **Dependencies:** None
- **Twitter handle:** None
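A hypothetical usage sketch, assuming a dashscope-style tool definition is forwarded through bound model kwargs (the tool schema and model name are illustrative):
```python
from langchain_community.chat_models.tongyi import ChatTongyi

# Illustrative dashscope/OpenAI-style function schema
tools = [{
    "type": "function",
    "function": {
        "name": "get_current_time",
        "description": "Return the current time.",
        "parameters": {"type": "object", "properties": {}},
    },
}]

llm = ChatTongyi(model="qwen-turbo")  # assumes DASHSCOPE_API_KEY is set
print(llm.bind(tools=tools).invoke("What time is it?"))
```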
Co-authored-by: Bagatur <22008038+baskaryan@users.noreply.github.com>
2024-04-17 20:42:23 +00:00
Guangdong Liu
3729bec1a2
community[patch]: standardize init args ( #20210 )
...
Related to https://github.com/langchain-ai/langchain/issues/20085
@baskaryan
2024-04-16 18:29:57 -07:00
Leonid Ganeline
7cf2d2759d
community[patch]: docstrings update ( #20301 )
...
Added missing docstrings. Formatted docstrings into a consistent form.
2024-04-11 16:23:27 -04:00
htaoruan
bcc771e37c
docs: ChatTongyi example error ( #19013 )
2024-03-17 01:55:56 +00:00
Shuai Liu
c244e1a50b
community[patch]: Fixed bug in merging generation_info
during chunk concatenation in Tongyi and ChatTongyi ( #19014 )
...
- **Description:**
In #16218 , during `GenerationChunk` and `ChatGenerationChunk`
concatenation, the `generation_info` merging changed from simple key-and-value
replacement to using the util method
[`merge_dicts`](https://github.com/langchain-ai/langchain/blob/master/libs/core/langchain_core/utils/_merge.py).
The `merge_dicts` method could not handle merging values of `int` or
some other types, and would raise a
[`TypeError`](https://github.com/langchain-ai/langchain/blob/master/libs/core/langchain_core/utils/_merge.py#L55).
This PR fixes the issue in the **Tongyi and ChatTongyi models** by
adopting the `generation_info` of the last chunk
and discarding the `generation_info` of the intermediate chunks,
ensuring that `stream` and `astream` function correctly (see the illustration after this list).
- **Issue:**
- Related issues or PRs about Tongyi & ChatTongyi: #16605 , #17105
- Other models or cases: #18441 , #17376
- **Dependencies:** No new dependencies
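A minimal illustration of the failure mode described above (importing the private helper purely for demonstration; behavior is as of the affected versions):
```python
from langchain_core.utils._merge import merge_dicts

# Conflicting int values (e.g. per-chunk token counts) cannot be merged the
# way strings are concatenated, so in the affected versions this raised TypeError
# during chunk concatenation.
merge_dicts({"input_tokens": 10}, {"input_tokens": 12})
```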
2024-03-15 16:27:53 -07:00
mackong
9678797625
community[patch]: callback before yield for _stream/_astream ( #17907 )
...
- Description: call on_llm_new_token before yielding the chunk in
_stream/_astream for some chat models, so that all chat models behave
consistently (see the sketch below).
- Issue: N/A
- Dependencies: N/A
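A hedged sketch of the ordering this change enforces (simplified; a real `_stream` also parses the provider's streaming responses):
```python
from typing import Iterator, Optional

from langchain_core.callbacks import CallbackManagerForLLMRun
from langchain_core.messages import AIMessageChunk
from langchain_core.outputs import ChatGenerationChunk

def _stream_tokens(
    tokens: Iterator[str],
    run_manager: Optional[CallbackManagerForLLMRun] = None,
) -> Iterator[ChatGenerationChunk]:
    for token in tokens:
        chunk = ChatGenerationChunk(message=AIMessageChunk(content=token))
        # Fire the callback *before* yielding, so every chat model reports
        # new tokens at the same point in the stream.
        if run_manager:
            run_manager.on_llm_new_token(token, chunk=chunk)
        yield chunk
```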
2024-02-22 16:15:21 -08:00
Bagatur
66e45e8ab7
community[patch]: chat model mypy fixes ( #17061 )
...
Related to #17048
2024-02-05 13:42:59 -08:00
Harrison Chase
4eda647fdd
infra: add -p to mkdir in lint steps ( #17013 )
...
Previously, if this step did not find a mypy cache it wouldn't run; this makes it always run.
Also adds mypy ignore comments for existing uncaught issues to unblock other PRs.
---------
Co-authored-by: Erick Friis <erick@langchain.dev>
Co-authored-by: Bagatur <22008038+baskaryan@users.noreply.github.com>
2024-02-05 11:22:06 -08:00
Funkeke
7220124368
community[patch]: fix tongyi completion and params error ( #15544 )
...
Fix Tongyi completion JSON parse error and prompt params error.
---------
Co-authored-by: fangkeke <3339698829@qq.com>
Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
2024-01-15 11:43:13 -08:00
Kai
5d05df4bce
community: Fixed bug of "system message check" in chat_models/tongyi. ( #15631 )
...
- **Description:** This PR fixes a bug in the "system message check" in
langchain_community/chat_models/tongyi.py (see the sketch below).
- **Issue:** With the current logic, if there is no system message in
the chat messages, the error "System message can only be the first
message." is wrongly raised.
- **Dependencies:** No.
- **Twitter handle:** I don't have a Twitter account.
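An illustrative version of the corrected check (not the literal patch): raise only when a system message appears somewhere other than the first position, and accept conversations with no system message at all.
```python
from langchain_core.messages import BaseMessage, HumanMessage, SystemMessage

def check_system_message(messages: list[BaseMessage]) -> None:
    # Only a system message placed after the first position is an error.
    if any(isinstance(m, SystemMessage) for m in messages[1:]):
        raise ValueError("System message can only be the first message.")

check_system_message([HumanMessage(content="hi")])         # ok: no system message
check_system_message([SystemMessage(content="be brief"),
                      HumanMessage(content="hi")])         # ok: system message is first
```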
2024-01-07 08:30:18 -08:00
chyroc
37ad6ec248
Refactor: use SecretStr for tongyi chat-model ( #15102 )
2024-01-02 15:45:23 -08:00
Shuai Liu
4b53440e70
Upgrades the Tongyi LLM and ChatTongyi Model ( #14793 )
...
- **Description:** fixes and upgrades for the Tongyi LLM and ChatTongyi
Model
- Fixed typos; it should be `Tongyi`, not `OpenAI`.
- Fixed a bug in `stream_generate_with_retry`; it's a real stream
generator now.
- Fixed a bug in `validate_environment`; the `dashscope_api_key` should
be properly handled when set by environment variables or initialization
parameters.
- Changed the `dashscope` response to incremental output by setting the
parameter `incremental_output`, which eliminates the need for the
prefix-removal trick (a raw-SDK sketch follows at the end of this entry).
- Removed some unused parameters, like `n`, `prefix_messages`.
- Added `_stream` method.
- Added async methods support, such as `_astream`, `_agenerate`,
`_abatch`.
- **Dependencies:** No new dependencies.
- **Tag maintainer:** @hwchase17
> PS: Some may be confused about the terms `dashscope`, `tongyi`, and
`Qwen`:
> - `dashscope`: A platform to deploy LLMs and provide APIs to invoke
the LLM.
> - `tongyi`: A brand name or overall term about Alibaba Cloud's LLM/AI.
> - `Qwen`: An LLM that is open-sourced and deployed in `dashscope`.
>
> We use the `dashscope` SDK to interact with the `tongyi`-`Qwen` LLM.
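A raw-SDK sketch of the incremental output behavior mentioned above (assumes the `dashscope` package is installed and `DASHSCOPE_API_KEY` is set; the prompt and model name are illustrative):
```python
import dashscope

# With incremental_output=True each streamed chunk carries only the newly
# generated tokens, so no prefix-removal trick is needed on the caller's side.
responses = dashscope.Generation.call(
    model="qwen-turbo",
    prompt="Tell me a short joke",
    stream=True,
    incremental_output=True,
)
for resp in responses:
    print(resp.output)  # each chunk's output contains only the new text
```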
---------
Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
2023-12-29 12:06:12 -08:00
Leonid Ganeline
b2fd41331e
docs: docstrings langchain_community update ( #14889 )
...
Added missing docstrings. Fixed inconsistencies in docstrings.
**Note** CC @efriis
There were PR errors on
`langchain_experimental/prompt_injection_identifier/hugging_face_identifier.py`,
but I didn't touch this file in this PR! Could it be a caching problem?
I fixed this error.
2023-12-19 08:58:24 -05:00
Bagatur
ed58eeb9c5
community[major], core[patch], langchain[patch], experimental[patch]: Create langchain-community ( #14463 )
...
Moved the following modules to the new package langchain-community in a backwards-compatible fashion:
```
mv langchain/langchain/adapters community/langchain_community
mv langchain/langchain/callbacks community/langchain_community/callbacks
mv langchain/langchain/chat_loaders community/langchain_community
mv langchain/langchain/chat_models community/langchain_community
mv langchain/langchain/document_loaders community/langchain_community
mv langchain/langchain/docstore community/langchain_community
mv langchain/langchain/document_transformers community/langchain_community
mv langchain/langchain/embeddings community/langchain_community
mv langchain/langchain/graphs community/langchain_community
mv langchain/langchain/llms community/langchain_community
mv langchain/langchain/memory/chat_message_histories community/langchain_community
mv langchain/langchain/retrievers community/langchain_community
mv langchain/langchain/storage community/langchain_community
mv langchain/langchain/tools community/langchain_community
mv langchain/langchain/utilities community/langchain_community
mv langchain/langchain/vectorstores community/langchain_community
mv langchain/langchain/agents/agent_toolkits community/langchain_community
mv langchain/langchain/cache.py community/langchain_community
```
Moved the following to core
```
mv langchain/langchain/utils/json_schema.py core/langchain_core/utils
mv langchain/langchain/utils/html.py core/langchain_core/utils
mv langchain/langchain/utils/strings.py core/langchain_core/utils
cat langchain/langchain/utils/env.py >> core/langchain_core/utils/env.py
rm langchain/langchain/utils/env.py
```
See .scripts/community_split/script_integrations.sh for all changes
2023-12-11 13:53:30 -08:00