From 652eae2cd1dc37832f50f0e855bd3d78aa576ceb Mon Sep 17 00:00:00 2001
From: Harrison Chase
Date: Fri, 22 Dec 2023 10:21:50 -0800
Subject: [PATCH] revamp getting started

---
 docs/docs/get_started/quickstart.mdx | 37 +++++++++++++++++++++++--------------
 1 file changed, 23 insertions(+), 14 deletions(-)

diff --git a/docs/docs/get_started/quickstart.mdx b/docs/docs/get_started/quickstart.mdx
index 2b5dec3f190..a9b02e3331d 100644
--- a/docs/docs/get_started/quickstart.mdx
+++ b/docs/docs/get_started/quickstart.mdx
@@ -33,27 +33,36 @@ For more details, see our [Installation guide](/docs/get_started/installation).
 
 ### Environment
 
-Using LangChain will usually require integrations with one or more model providers, data stores, APIs, etc. For this example, we'll use OpenAI's model APIs.
+Using LangChain will usually require integrations with one or more model providers, data stores, APIs, etc. For this getting started guide, we will provide two options: using OpenAI (a popular model available via API) or using a local open source model.
 
-First we'll need to install their Python package:
+<Tabs>
+  <TabItem value="openai" label="OpenAI" default>
+  First we'll need to install their Python package:
 
-```bash
-pip install openai
-```
+  ```bash
+  pip install openai
+  ```
 
-Accessing the API requires an API key, which you can get by creating an account and heading [here](https://platform.openai.com/account/api-keys). Once we have a key we'll want to set it as an environment variable by running:
+  Accessing the API requires an API key, which you can get by creating an account and heading [here](https://platform.openai.com/account/api-keys). Once we have a key we'll want to set it as an environment variable by running:
 
-```bash
-export OPENAI_API_KEY="..."
-```
+  ```bash
+  export OPENAI_API_KEY="..."
+  ```
 
-If you'd prefer not to set an environment variable you can pass the key in directly via the `openai_api_key` named parameter when initiating the OpenAI LLM class:
+  If you'd prefer not to set an environment variable you can pass the key in directly via the `openai_api_key` named parameter when initiating the OpenAI LLM class:
 
+  ```python
+  from langchain.chat_models import ChatOpenAI
+
+  llm = ChatOpenAI(openai_api_key="...")
+  ```
+
+  </TabItem>
+  <TabItem value="local" label="Local">
+  conda install langchain -c conda-forge
+  </TabItem>
+</Tabs>
-```python
-from langchain.chat_models import ChatOpenAI
-llm = ChatOpenAI(openai_api_key="...")
-```
 
 
 ### LangSmith
 
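
For readers following along with the OpenAI option, here is a minimal sanity-check sketch of the environment this hunk sets up (not part of the patch itself): it assumes the `langchain` and `openai` packages are installed and `OPENAI_API_KEY` is exported as shown above, and the prompt string is arbitrary.

```python
# Minimal sanity check of the OpenAI setup described in the patch.
# Assumes `pip install langchain openai` and an exported OPENAI_API_KEY.
from langchain.chat_models import ChatOpenAI

# With no openai_api_key argument, the key is read from the environment.
llm = ChatOpenAI()

# Chat models implement the Runnable interface, so a single call is .invoke().
print(llm.invoke("Translate 'hello' to French."))
```

The call returns a chat message object rather than a bare string; its `.content` attribute holds the generated text.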