docs: fix lets typos in multiple files (#31481)

Fix typo
This commit is contained in:
Michael Li
2025-06-05 00:27:16 +10:00
committed by GitHub
parent f97e1825b7
commit 21d6f1fc6a
7 changed files with 11 additions and 11 deletions

View File

@@ -40,7 +40,7 @@
"from langchain_core.globals import set_llm_cache\n",
"from langchain_openai import OpenAI\n",
"\n",
"# To make the caching really obvious, lets use a slower and older model.\n",
"# To make the caching really obvious, let's use a slower and older model.\n",
"# Caching supports newer chat models as well.\n",
"llm = OpenAI(model=\"gpt-3.5-turbo-instruct\", n=2, best_of=2)"
]
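For context, the surrounding notebook turns caching on before invoking this `llm`. A minimal sketch of the full flow, assuming `langchain_core`'s `InMemoryCache`:

```python
from langchain_core.caches import InMemoryCache
from langchain_core.globals import set_llm_cache
from langchain_openai import OpenAI

# To make the caching really obvious, let's use a slower and older model.
llm = OpenAI(model="gpt-3.5-turbo-instruct", n=2, best_of=2)

# Route every LLM call through an in-memory cache.
set_llm_cache(InMemoryCache())

llm.invoke("Tell me a joke")  # first call goes to the API
llm.invoke("Tell me a joke")  # identical prompt is answered from the cache
```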

View File

@@ -51,7 +51,7 @@
"from langchain.globals import set_llm_cache\n",
"from langchain_openai import OpenAI\n",
"\n",
"# To make the caching really obvious, lets use a slower and older model.\n",
"# To make the caching really obvious, let's use a slower and older model.\n",
"# Caching supports newer chat models as well.\n",
"llm = OpenAI(model=\"gpt-3.5-turbo-instruct\", n=2, best_of=2)"
]
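The same snippet also works with a persistent cache. A sketch, assuming `langchain_community` is installed (its `SQLiteCache` is the standard on-disk option):

```python
from langchain.globals import set_llm_cache
from langchain_community.cache import SQLiteCache
from langchain_openai import OpenAI

llm = OpenAI(model="gpt-3.5-turbo-instruct", n=2, best_of=2)

# Cache responses in a local SQLite file so they survive restarts.
set_llm_cache(SQLiteCache(database_path=".langchain.db"))

llm.invoke("Tell me a joke")
```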

View File

@@ -211,7 +211,7 @@
"id": "b6e7b9cf-8ce5-4f87-b4bf-100321ad2dd1",
"metadata": {},
"source": [
"***The result is usually closer to the JSON object of the schema definition, rather than a json object conforming to the schema. Lets try to enforce proper output.***"
"***The result is usually closer to the JSON object of the schema definition, rather than a json object conforming to the schema. Let's try to enforce proper output.***"
]
},
{
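For readers wondering what "enforce proper output" can look like: one common LangChain pattern (a sketch, not necessarily the method this notebook uses) is `with_structured_output` with a Pydantic schema:

```python
from pydantic import BaseModel, Field
from langchain_openai import ChatOpenAI  # assumption: any chat model with structured-output support works

class Person(BaseModel):
    """Schema the model output must conform to."""
    name: str = Field(description="The person's name")
    age: int = Field(description="The person's age in years")

llm = ChatOpenAI(model="gpt-4o-mini")
structured_llm = llm.with_structured_output(Person)

structured_llm.invoke("Anna is 23 years old and lives in Berlin.")
# -> Person(name='Anna', age=23)
```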

View File

@@ -49,7 +49,7 @@ The power of the AI gateway comes when you're able to use the above code snippet
 Let's modify the code above to make a call to Anthropic's `claude-3-opus-20240229` model.
-Portkey supports **[Virtual Keys](https://docs.portkey.ai/docs/product/ai-gateway-streamline-llm-integrations/virtual-keys)**, which are an easy way to store and manage API keys in a secure vault. Lets try using a Virtual Key to make LLM calls. You can navigate to the Virtual Keys tab in Portkey and create a new key for Anthropic.
+Portkey supports **[Virtual Keys](https://docs.portkey.ai/docs/product/ai-gateway-streamline-llm-integrations/virtual-keys)**, which are an easy way to store and manage API keys in a secure vault. Let's try using a Virtual Key to make LLM calls. You can navigate to the Virtual Keys tab in Portkey and create a new key for Anthropic.
 The `virtual_key` parameter sets the authentication and provider for the AI provider being used. In our case, we're using the Anthropic Virtual Key.
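A sketch of what that call can look like, following Portkey's documented LangChain pattern (the key values below are placeholders):

```python
from langchain_openai import ChatOpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

# Placeholders: substitute your own Portkey API key and Anthropic Virtual Key.
portkey_headers = createHeaders(
    api_key="PORTKEY_API_KEY",
    virtual_key="ANTHROPIC_VIRTUAL_KEY",
)

llm = ChatOpenAI(
    api_key="X",  # the Virtual Key supplies the real provider credential
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=portkey_headers,
    model="claude-3-opus-20240229",
)

llm.invoke("What is the meaning of life, the universe and everything?")
```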

View File

@@ -61,7 +61,7 @@
"id": "34318164-7a6f-47b6-8690-3b1d71e1fcfc",
"metadata": {},
"source": [
"Lets ask a question, and compare to 2 documents. The first contains the answer to the question, and the second one does not. \n",
"Let's ask a question, and compare to 2 documents. The first contains the answer to the question, and the second one does not. \n",
"\n",
"We can check better suits our query."
]
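Concretely, "compare" here means embedding the query and both documents and scoring them, e.g. by cosine similarity. A minimal sketch, with `OpenAIEmbeddings` standing in for whichever embedding model the notebook uses:

```python
import numpy as np
from langchain_openai import OpenAIEmbeddings

embeddings = OpenAIEmbeddings()

query = "What is the capital of France?"
docs = [
    "Paris is the capital and largest city of France.",        # contains the answer
    "Photosynthesis converts sunlight into chemical energy.",  # does not
]

query_vec = np.array(embeddings.embed_query(query))
doc_vecs = np.array(embeddings.embed_documents(docs))

# Cosine similarity between the query and each document.
scores = doc_vecs @ query_vec / (
    np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec)
)
print(scores)  # the first document should score noticeably higher
```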

View File

@@ -25,7 +25,7 @@
"source": [
"## PremEmbeddings\n",
"\n",
"In this section we are going to dicuss how we can get access to different embedding model using `PremEmbeddings` with LangChain. Lets start by importing our modules and setting our API Key. "
"In this section we are going to dicuss how we can get access to different embedding model using `PremEmbeddings` with LangChain. Let's start by importing our modules and setting our API Key. "
]
},
{
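The import-and-key setup that sentence refers to can look like the following sketch (the import path, `project_id`, and `model` values are assumptions based on the notebook's naming, not verified against the current API):

```python
import getpass
import os

# Assumption: PremEmbeddings is exposed by langchain_community, as the
# notebook's title suggests; project_id and model are illustrative values.
from langchain_community.embeddings import PremEmbeddings

if os.environ.get("PREMAI_API_KEY") is None:
    os.environ["PREMAI_API_KEY"] = getpass.getpass("PremAI API Key:")

embedder = PremEmbeddings(project_id=8, model="text-embedding-3-large")
print(embedder.embed_query("Hello, this is a test query")[:5])
```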