docs: update IBM WatsonxLLM docs with deprecated LLMChain (#21960)

Thank you for contributing to LangChain!

- [x] **PR title**: "update IBM WatsonxLLM docs with deprecated
LLMChain"

- [x] **PR message**: 
- **Description:** Update the IBM WatsonxLLM docs to replace the deprecated `LLMChain` with LCEL-style (`prompt | llm`) chaining

- [x] **Lint and test**: Run `make format`, `make lint` and `make test`
from the root of the package(s) you've modified. See contribution
guidelines for more: https://python.langchain.com/docs/contributing/
Mateusz Szewczyk 2024-05-22 01:43:02 +02:00 committed by GitHub
parent eb096675a8
commit 80f8fe1793


```diff
@@ -24,7 +24,7 @@
  },
  {
   "cell_type": "code",
-  "execution_count": 4,
+  "execution_count": 1,
   "id": "2f1fff4e",
   "metadata": {},
   "outputs": [],
@@ -45,7 +45,7 @@
  },
  {
   "cell_type": "code",
-  "execution_count": 1,
+  "execution_count": 2,
   "id": "11d572a1",
   "metadata": {},
   "outputs": [],
@@ -93,7 +93,7 @@
  },
  {
   "cell_type": "code",
-  "execution_count": 2,
+  "execution_count": 3,
   "id": "407cd500",
   "metadata": {},
   "outputs": [],
@@ -194,6 +194,28 @@
    ")"
   ]
  },
+ {
+  "cell_type": "markdown",
+  "id": "7c4a632b",
+  "metadata": {},
+  "source": [
+   "You can also pass the IBM's [`ModelInference`](https://ibm.github.io/watsonx-ai-python-sdk/fm_model_inference.html) object into `WatsonxLLM` class."
+  ]
+ },
+ {
+  "cell_type": "code",
+  "execution_count": null,
+  "id": "5335b148",
+  "metadata": {},
+  "outputs": [],
+  "source": [
+   "from ibm_watsonx_ai.foundation_models import ModelInference\n",
+   "\n",
+   "model = ModelInference(...)\n",
+   "\n",
+   "watsonx_llm = WatsonxLLM(watsonx_model=model)"
+  ]
+ },
  {
   "cell_type": "markdown",
   "id": "c25ecbd1",
@@ -213,6 +235,7 @@
   "from langchain_core.prompts import PromptTemplate\n",
   "\n",
   "template = \"Generate a random question about {topic}: Question: \"\n",
+  "\n",
   "prompt = PromptTemplate.from_template(template)"
  ]
 },
@@ -221,31 +244,32 @@
   "id": "79056d8e",
   "metadata": {},
   "source": [
-   "Provide a topic and run the `LLMChain`."
+   "Provide a topic and run the chain."
  ]
 },
 {
  "cell_type": "code",
- "execution_count": 10,
+ "execution_count": 9,
  "id": "dc076c56",
  "metadata": {},
  "outputs": [
   {
    "data": {
     "text/plain": [
-     "{'topic': 'dog', 'text': 'Why do dogs howl?'}"
+     "'What is the difference between a dog and a wolf?'"
     ]
    },
-   "execution_count": 10,
+   "execution_count": 9,
    "metadata": {},
    "output_type": "execute_result"
   }
  ],
  "source": [
-  "from langchain.chains import LLMChain\n",
-  "\n",
-  "llm_chain = LLMChain(prompt=prompt, llm=watsonx_llm)\n",
-  "llm_chain.invoke(\"dog\")"
+  "llm_chain = prompt | watsonx_llm\n",
+  "\n",
+  "topic = \"dog\"\n",
+  "\n",
+  "llm_chain.invoke(topic)"
  ]
 },
 {
```
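The migrated cell swaps the deprecated `LLMChain(prompt=..., llm=...)` for LCEL pipe composition, where `prompt | llm` builds a chain whose `invoke` feeds the prompt's output into the LLM. Below is a minimal stdlib sketch of that composition pattern, with toy stand-ins for the prompt and the model; this is illustrative only and is not LangChain's actual `Runnable` implementation, and the canned answer mimics the notebook output rather than calling watsonx:

```python
class Runnable:
    """Toy stand-in for a LangChain-style runnable (not the real API)."""

    def __init__(self, fn):
        self.fn = fn

    def invoke(self, value):
        return self.fn(value)

    def __or__(self, other):
        # `self | other` composes left to right:
        # the output of self.invoke becomes the input of other.invoke
        return Runnable(lambda value: other.invoke(self.invoke(value)))


# Toy "prompt template": formats the topic into a prompt string
prompt = Runnable(
    lambda topic: f"Generate a random question about {topic}: Question: "
)

# Toy "LLM": returns a canned completion instead of calling watsonx
llm = Runnable(
    lambda text: "What is the difference between a dog and a wolf?"
)

chain = prompt | llm
print(chain.invoke("dog"))
```

With the real classes, `PromptTemplate.from_template(template) | WatsonxLLM(...)` follows the same shape, which is why `llm_chain.invoke(topic)` works without `LLMChain`.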