To use the Azure OpenAI API, we need to set multiple env vars. But as can be
seen in the openai package
[here](48b69293a3/openai/__init__.py (L35)),
the env var for setting the base URL is named `OPENAI_API_BASE`, not
`OPENAI_API_BASE_URL`. This PR fixes that part in the documentation.
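For reference, a minimal sketch of the corrected configuration done from Python, mirroring the bash block in the notebook below (the resource name and key are placeholders):

```python
import os

# Configure the openai package for Azure OpenAI.
# Note: the openai package reads OPENAI_API_BASE, not OPENAI_API_BASE_URL.
os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["OPENAI_API_VERSION"] = "2022-12-01"
os.environ["OPENAI_API_BASE"] = "https://your-resource-name.openai.azure.com"
os.environ["OPENAI_API_KEY"] = "<your Azure OpenAI API key>"
```
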
{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "9e9b7651",
   "metadata": {},
   "source": [
    "# Azure OpenAI LLM Example\n",
    "\n",
    "This notebook goes over how to use Langchain with [Azure OpenAI](https://aka.ms/azure-openai).\n",
    "\n",
    "The Azure OpenAI API is compatible with OpenAI's API.  The `openai` Python package makes it easy to use both OpenAI and Azure OpenAI.  You can call Azure OpenAI the same way you call OpenAI with the exceptions noted below.\n",
    "\n",
    "## API configuration\n",
    "You can configure the `openai` package to use Azure OpenAI using environment variables.  The following is for `bash`:\n",
    "\n",
    "```bash\n",
    "# Set this to `azure`\n",
    "export OPENAI_API_TYPE=azure\n",
    "# The API version you want to use: set this to `2022-12-01` for the released version.\n",
    "export OPENAI_API_VERSION=2022-12-01\n",
    "# The base URL for your Azure OpenAI resource.  You can find this in the Azure portal under your Azure OpenAI resource.\n",
    "export OPENAI_API_BASE=https://your-resource-name.openai.azure.com\n",
    "# The API key for your Azure OpenAI resource.  You can find this in the Azure portal under your Azure OpenAI resource.\n",
    "export OPENAI_API_KEY=<your Azure OpenAI API key>\n",
    "```\n",
    "\n",
    "Alternatively, you can configure the API right within your running Python environment:\n",
    "\n",
    "```python\n",
    "import os\n",
    "os.environ[\"OPENAI_API_TYPE\"] = \"azure\"\n",
    "...\n",
    "```\n",
    "\n",
    "## Deployments\n",
    "With Azure OpenAI, you set up your own deployments of the common GPT-3 and Codex models.  When calling the API, you need to specify the deployment you want to use.\n",
    "\n",
    "Let's say your deployment name is `text-davinci-002-prod`.  In the `openai` Python API, you can specify this deployment with the `engine` parameter.  For example:\n",
    "\n",
    "```python\n",
    "import openai\n",
    "\n",
    "response = openai.Completion.create(\n",
    "    engine=\"text-davinci-002-prod\",\n",
    "    prompt=\"This is a test\",\n",
    "    max_tokens=5\n",
    ")\n",
    "```\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "id": "8fad2a6e",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Import Azure OpenAI\n",
    "from langchain.llms import AzureOpenAI"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "id": "8c80213a",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Create an instance of Azure OpenAI\n",
    "# Replace the deployment name with your own\n",
    "llm = AzureOpenAI(deployment_name=\"text-davinci-002-prod\", model_name=\"text-davinci-002\")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "id": "592dc404",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "'\\n\\nWhy did the chicken cross the road?\\n\\nTo get to the other side.'"
      ]
     },
     "execution_count": 3,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# Run the LLM\n",
    "llm(\"Tell me a joke\")"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "bbfebea1",
   "metadata": {},
   "source": [
    "We can also print the LLM and see its custom print."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "id": "9c33fa19",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\u001b[1mAzureOpenAI\u001b[0m\n",
      "Params: {'deployment_name': 'text-davinci-002', 'model_name': 'text-davinci-002', 'temperature': 0.7, 'max_tokens': 256, 'top_p': 1, 'frequency_penalty': 0, 'presence_penalty': 0, 'n': 1, 'best_of': 1}\n"
     ]
    }
   ],
   "source": [
    "print(llm)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "5a8b5917",
   "metadata": {},
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.10.9"
  },
  "vscode": {
   "interpreter": {
    "hash": "3bae61d45a4f4d73ecea8149862d4bfbae7d4d4a2f71b6e609a1be8f6c8d4298"
   }
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}