docs: revamp ChatOpenAI (#22253)

You can build the API reference docs by running:
```bash
make api_docs_clean; make api_docs_quick_preview API_PKG=openai
```
This only builds the OpenAI reference and takes ~20 seconds.
This commit is contained in:
Bagatur
2024-05-29 10:20:14 -07:00
committed by GitHub
parent 00c70d98c2
commit 6dd0f095c3
4 changed files with 436 additions and 41 deletions

View File

@@ -220,7 +220,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.4"
"version": "3.9.1"
}
},
"nbformat": 4,

View File

@@ -12,38 +12,130 @@
},
{
"cell_type": "markdown",
"id": "e49f1e0d",
"id": "cb4dd00a-8893-4a45-96f7-9a9fc341cd61",
"metadata": {},
"source": [
"# ChatOpenAI\n",
"\n",
"This notebook covers how to get started with OpenAI chat models."
"This notebook provides a quick overview for getting started with OpenAI [chat models](/docs/concepts/#chat-models). For detailed documentation of all ChatOpenAI features and configurations head to the [API reference](https://api.python.langchain.com/en/latest/chat_models/langchain_openai.chat_models.base.ChatOpenAI.html).\n",
"\n",
"OpenAI has several chat models. You can find information about their latest models and their costs, context windows, and supported input types in the [OpenAI docs](https://platform.openai.com/docs/models).\n",
"\n",
":::info Azure OpenAI\n",
"\n",
"Note that certain OpenAI models can also be accessed via the [Microsoft Azure platform](https://azure.microsoft.com/en-us/products/ai-services/openai-service). To use the Azure OpenAI service use the [AzureChatOpenAI integration](/docs/integrations/chat/azure_chat_openai/).\n",
"\n",
":::"
]
},
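For the Azure note above, a minimal sketch of the equivalent `AzureChatOpenAI` setup might look like the following; the endpoint, deployment name, and API version shown are placeholders, and the exact parameter names are worth double-checking against the `AzureChatOpenAI` API reference.

```python
# Rough AzureChatOpenAI sketch -- the endpoint, deployment, and API version
# below are placeholders for your own Azure OpenAI resource.
from langchain_openai import AzureChatOpenAI

llm = AzureChatOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com/",
    azure_deployment="YOUR_DEPLOYMENT_NAME",
    openai_api_version="2024-02-01",  # use a version supported by your resource
    temperature=0,
)
# Expects AZURE_OPENAI_API_KEY in the environment (or pass api_key=...).
# From here it behaves like ChatOpenAI, e.g. llm.invoke("Hello")
```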
{
"cell_type": "markdown",
"id": "e49f1e0d",
"metadata": {},
"source": [
"## Overview\n",
"\n",
"### Integration details\n",
"| Class | Package | Local | Serializable | [JS support](https://js.langchain.com/v0.2/docs/integrations/chat/openai) | Package downloads | Package latest |\n",
"| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n",
"| [ChatOpenAI](https://api.python.langchain.com/en/latest/chat_models/langchain_openai.chat_models.base.ChatOpenAI.html) | [langchain-openai](https://api.python.langchain.com/en/latest/openai_api_reference.html) | ❌ | beta | ✅ | ![PyPI - Downloads](https://img.shields.io/pypi/dm/langchain-openai?style=flat-square&label=%20) | ![PyPI - Version](https://img.shields.io/pypi/v/langchain-openai?style=flat-square&label=%20) |\n",
"\n",
"### Model features\n",
"| [Tool calling](/docs/how_to/tool_calling/) | [Structured output](/docs/how_to/structured_output/) | JSON mode | Image input | Audio input | Video input | [Native streaming](/docs/how_to/chat_streaming/) | Native async | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n",
"| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n",
"| ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ | ✅ | ✅ | ✅ | \n",
"\n",
"## Setup\n",
"\n",
"To access OpenAI models you'll need to create an OpenAI account, get an API key, and install the `langchain-openai` integration package.\n",
"\n",
"### Credentials\n",
"\n",
"Head to https://platform.openai.com to sign up to OpenAI and generate an API key. Once you've done this set the OPENAI_API_KEY environment variable:"
]
},
{
"cell_type": "code",
"execution_count": 3,
"execution_count": 2,
"id": "e817fe2e-4f1d-4533-b19e-2400b1cf6ce8",
"metadata": {},
"outputs": [
{
"name": "stdin",
"output_type": "stream",
"text": [
"Enter your OpenAI API key: ········\n"
]
}
],
"source": [
"import getpass\n",
"import os\n",
"\n",
"os.environ[\"OPENAI_API_KEY\"] = getpass.getpass(\"Enter your OpenAI API key: \")"
]
},
{
"cell_type": "markdown",
"id": "c2a3ce99-a44a-4ea6-8d23-8a88e332f0f9",
"metadata": {},
"source": [
"If you want to get automated tracing of your model calls you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "85255d53-ac8a-44e1-aa26-8e567bb77ae7",
"metadata": {},
"outputs": [],
"source": [
"# os.environ[\"LANGSMITH_API_KEY\"] = getpass.getpass(\"Enter your LangSmith API key: \")\n",
"# os.environ[\"LANGSMITH_TRACING\"] = \"true\""
]
},
{
"cell_type": "markdown",
"id": "c59722a9-6dbb-45f7-ae59-5be50ca5733d",
"metadata": {},
"source": [
"### Installation\n",
"\n",
"The LangChain OpenAI integration lives in the `langchain-openai` package:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "2113471c-75d7-45df-b784-d78da4ef7aba",
"metadata": {},
"outputs": [],
"source": [
"%pip install -qU langchain-openai"
]
},
{
"cell_type": "markdown",
"id": "1098bc9d-ce83-462b-8c19-f85bf3a159dc",
"metadata": {},
"source": [
"## Instantiation\n",
"\n",
"Now we can instantiate our model object and generate chat completions:"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "522686de",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"from langchain_core.messages import HumanMessage, SystemMessage\n",
"from langchain_core.prompts import ChatPromptTemplate\n",
"from langchain_openai import ChatOpenAI"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "62e0dbc3",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"llm = ChatOpenAI(model=\"gpt-3.5-turbo-0125\", temperature=0)"
"from langchain_openai import ChatOpenAI\n",
"\n",
"llm = ChatOpenAI(model=\"gpt-4o\", temperature=0)"
]
},
{
@@ -51,17 +143,30 @@
"id": "4e5fe97e",
"metadata": {},
"source": [
"The above cell assumes that your OpenAI API key is set in your environment variables. If you would rather manually specify your API key and/or organization ID, use the following code:\n",
"The above cell that your OpenAI API key is set in your environment variables. If you would prefer you can specify credentials like API key, organization ID, base url, etc. as init params:\n",
"\n",
"```python\n",
"llm = ChatOpenAI(model=\"gpt-3.5-turbo-0125\", temperature=0, api_key=\"YOUR_API_KEY\", openai_organization=\"YOUR_ORGANIZATION_ID\")\n",
"```\n",
"Remove the openai_organization parameter should it not apply to you."
"llm = ChatOpenAI(\n",
" model=\"gpt-4o\", \n",
" temperature=0, \n",
" api_key=\"YOUR_API_KEY\", \n",
" organization=\"YOUR_ORGANIZATION_ID\", \n",
" base_url=\"YOUR_BASE_URL\"\n",
")\n",
"```"
]
},
{
"cell_type": "markdown",
"id": "6511982a-734a-4193-a47d-254f8dcaff5e",
"metadata": {},
"source": [
"## Invocation"
]
},
{
"cell_type": "code",
"execution_count": 5,
"execution_count": 6,
"id": "ce16ad78-8e6f-48cd-954e-98be75eb5836",
"metadata": {
"tags": []
@@ -70,20 +175,42 @@
{
"data": {
"text/plain": [
"AIMessage(content=\"J'adore programmer.\", response_metadata={'token_usage': {'completion_tokens': 6, 'prompt_tokens': 34, 'total_tokens': 40}, 'model_name': 'gpt-3.5-turbo-0125', 'system_fingerprint': 'fp_b28b39ffa8', 'finish_reason': 'stop', 'logprobs': None}, id='run-8591eae1-b42b-402b-a23a-dfdb0cd151bd-0')"
"AIMessage(content=\"J'adore la programmation.\", response_metadata={'token_usage': {'completion_tokens': 5, 'prompt_tokens': 31, 'total_tokens': 36}, 'model_name': 'gpt-4o', 'system_fingerprint': 'fp_43dfabdef1', 'finish_reason': 'stop', 'logprobs': None}, id='run-012cffe2-5d3d-424d-83b5-51c6d4a593d1-0', usage_metadata={'input_tokens': 31, 'output_tokens': 5, 'total_tokens': 36})"
]
},
"execution_count": 5,
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"messages = [\n",
" (\"system\", \"You are a helpful assistant that translates English to French.\"),\n",
" (\"human\", \"Translate this sentence from English to French. I love programming.\"),\n",
" (\n",
" \"system\",\n",
" \"You are a helpful assistant that translates English to French. Translate the user sentence.\",\n",
" ),\n",
" (\"human\", \"I love programming.\"),\n",
"]\n",
"llm.invoke(messages)"
"ai_msg = llm.invoke(messages)\n",
"ai_msg"
]
},
{
"cell_type": "code",
"execution_count": 7,
"id": "2cd224b8-4499-41fb-a604-d53a7ff17b2e",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"J'adore la programmation.\n"
]
}
],
"source": [
"print(ai_msg.content)"
]
},
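The tool-calling column in the features table above is exposed through `bind_tools`. A minimal sketch, assuming the `llm` instance created earlier and a toy `multiply` tool invented purely for illustration:

```python
# Minimal tool-calling sketch; `multiply` is a toy tool defined only for
# this example, and `llm` is the ChatOpenAI instance created above.
from langchain_core.tools import tool


@tool
def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b


llm_with_tools = llm.bind_tools([multiply])
msg = llm_with_tools.invoke("What is 6 times 7?")
# The model should respond with a tool call rather than plain text:
print(msg.tool_calls)
```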
{
@@ -116,6 +243,8 @@
}
],
"source": [
"from langchain_core.prompts import ChatPromptTemplate\n",
"\n",
"prompt = ChatPromptTemplate.from_messages(\n",
" [\n",
" (\n",
@@ -277,13 +406,23 @@
"\n",
"fine_tuned_model(messages)"
]
},
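Likewise, the structured-output feature (marked beta in the integration table) can be reached via `with_structured_output`; a short sketch with a throwaway `Translation` schema, again assuming the `llm` defined earlier:

```python
# Illustrative structured-output sketch; the Translation schema is invented
# for this example, and `llm` is the ChatOpenAI instance created above.
from langchain_core.pydantic_v1 import BaseModel, Field


class Translation(BaseModel):
    """A translation of the user's sentence into French."""

    french: str = Field(description="The sentence translated into French")


structured_llm = llm.with_structured_output(Translation)
structured_llm.invoke("Translate to French: I love programming.")
# Returns a Translation instance rather than a raw AIMessage.
```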
{
"cell_type": "markdown",
"id": "a796d728-971b-408b-88d5-440015bbb941",
"metadata": {},
"source": [
"## API reference\n",
"\n",
"For detailed documentation of all ChatOpenAI features and configurations head to the API reference: https://api.python.langchain.com/en/latest/chat_models/langchain_openai.chat_models.base.ChatOpenAI.html"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"display_name": "poetry-venv-2",
"language": "python",
"name": "python3"
"name": "poetry-venv-2"
},
"language_info": {
"codemirror_mode": {