Compare commits

...

3 Commits

Author SHA1 Message Date
Brace Sproul
21c8a68ac1 Merge branch 'master' into template-readme-missing-env 2023-10-30 13:24:06 -07:00
Brace Sproul
95cd3d7d07 fix: missing context 2023-10-30 13:22:00 -07:00
Brace Sproul
c2d15c8fbb Added env vars section to missing template readmes 2023-10-30 13:18:16 -07:00
31 changed files with 282 additions and 17 deletions

View File

@@ -1,3 +1,11 @@
# anthropic-iterative-search
Heavily inspired by [this notebook](https://github.com/anthropics/anthropic-cookbook/blob/main/long_context/wikipedia-search-cookbook.ipynb)
## Environment variables
You need to define the following environment variable
```shell
ANTHROPIC_API_KEY=<YOUR_ANTHROPIC_API_KEY>
```
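Templates like this read the key from the process environment at startup. As a minimal sketch (the `missing_env` helper is hypothetical, not part of the template), you can fail fast before serving:

```python
import os

def missing_env(*names: str) -> list[str]:
    """Return the names of required environment variables that are not set."""
    return [name for name in names if not os.environ.get(name)]

# Example: check before starting the template server.
missing = missing_env("ANTHROPIC_API_KEY")
if missing:
    print(f"Missing environment variables: {', '.join(missing)}")
```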

View File

@@ -42,6 +42,14 @@ To put this to the test, experiment with these example questions:
"Do birds have wings?" <-- no entomology here!
```
## Environment variables
You need to define the following environment variable
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
```
## Reference
Stand-alone repo with LangServe chain: [here](https://github.com/hemidactylus/langserve_cassandra_entomology_rag).

View File

@@ -13,7 +13,16 @@ You need:
_Note:_ you can alternatively use a regular Cassandra cluster: to do so, make sure you provide the `USE_CASSANDRA_CLUSTER` entry as shown in `.env.template` and the subsequent environment variables to specify how to connect to it.
You need to provide the connection parameters and secrets through environment variables. Please refer to `.env.template` for what variables are required.
## Environment variables
You need to define the following environment variables
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
ASTRA_DB_APPLICATION_TOKEN=<YOUR_ASTRA_DB_APPLICATION_TOKEN>
ASTRA_DB_KEYSPACE=<YOUR_ASTRA_DB_KEYSPACE>
ASTRA_DB_ID=<YOUR_ASTRA_DB_ID>
```
## Reference

View File

@@ -3,3 +3,11 @@
This is a csv agent that uses both a Python REPL as well as a vectorstore to allow for interaction with text data.
The only setup required is running `ingest.py` to ingest the data into a vectorstore.
## Environment variables
You need to define the following environment variable
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
```

View File

@@ -22,6 +22,14 @@ With a deployment, update the connection string.
Password and connection (elasticsearch url) can be found on the deployment console. Th
## Environment variables
You need to define the following environment variable
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
```
## Populating with data
If you want to populate the DB with some example info, you can run `python ingest.py`.

View File

@@ -11,4 +11,12 @@ By default, it will extract the title and author of papers.
This template will use `Claude2` by default.
Be sure that `ANTHROPIC_API_KEY` is set in your environment.
## Environment variables
You need to define the following environment variable
```shell
ANTHROPIC_API_KEY=<YOUR_ANTHROPIC_API_KEY>
```

View File

@@ -10,4 +10,12 @@ By default, it will extract the title and author of papers.
This template will use `OpenAI` by default.
Be sure that `OPENAI_API_KEY` is set in your environment.
## Environment variables
You need to define the following environment variable
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
```

View File

@@ -7,3 +7,11 @@ The idea behind this is that the hypothetical document may be closer in the embe
For a more detailed description, read the full paper [here](https://arxiv.org/abs/2212.10496).
For this example, we use a simple RAG architecture, although you can easily use this technique in other more complicated architectures.
## Environment variables
You need to define the following environment variable
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
```

View File

@@ -15,4 +15,12 @@ This template will use a `Replicate` [hosted version](https://replicate.com/andr
Based on the `Replicate` example, the JSON schema is supplied directly in the prompt.
Be sure that `REPLICATE_API_TOKEN` is set in your environment.
## Environment variables
You need to define the following environment variable
```shell
REPLICATE_API_TOKEN=<YOUR_REPLICATE_API_TOKEN>
```

View File

@@ -14,3 +14,12 @@ Be sure that `OPENAI_API_KEY` is set in your environment.
This template will use `Tavily` by default.
Be sure that `TAVILY_API_KEY` is set in your environment.
## Environment variables
You need to define the following environment variables
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
TAVILY_API_KEY=<YOUR_TAVILY_API_KEY>
```

View File

@@ -7,3 +7,11 @@ This simple application converts user input into pirate speak
This template will use `OpenAI` by default.
Be sure that `OPENAI_API_KEY` is set in your environment.
## Environment variables
You need to define the following environment variable
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
```

View File

@@ -1 +1,9 @@
# plate-chain
## Environment variables
You need to define the following environment variable
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
```

View File

@@ -18,10 +18,19 @@ You need to install the `faiss-cpu` package to work with the FAISS vector store.
pip install faiss-cpu
```
## LLM and Embeddings
The code assumes that you are working with the `default` AWS profile and `us-east-1` region. If not, specify these environment variables to reflect the correct region and AWS profile.
* `AWS_DEFAULT_REGION`
* `AWS_PROFILE`
## Environment variables
If you are not using the defaults, you need to define the following environment variables
```shell
AWS_DEFAULT_REGION=<YOUR_AWS_DEFAULT_REGION>
AWS_PROFILE=<YOUR_AWS_PROFILE>
```

View File

@@ -15,7 +15,10 @@ You will need a Kendra Index setup before using this template. For setting up a
The code assumes that you are working with the `default` AWS profile and `us-east-1` region. If not, specify these environment variables to reflect the correct region and AWS profile.
* `AWS_DEFAULT_REGION`
* `AWS_PROFILE`
This code also requires specifying the `KENDRA_INDEX_ID` env variable which should have the Index ID of the Kendra index. Note that the Index ID is a 36 character alphanumeric value that can be found in the index detail page.
```shell
AWS_DEFAULT_REGION=<YOUR_AWS_DEFAULT_REGION>
AWS_PROFILE=<YOUR_AWS_PROFILE>
KENDRA_INDEX_ID=<YOUR_KENDRA_INDEX_ID>
```
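The README above describes the Index ID as a 36-character value from the index detail page. As a quick sanity check (a hypothetical helper, assuming the usual UUID shape of Kendra index IDs: hex digits and hyphens), you could validate it before starting:

```python
import re

# Kendra index IDs are UUID-shaped: 36 characters of hex digits and hyphens.
# (Hypothetical validator based on the description above.)
_KENDRA_ID = re.compile(
    r"^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$"
)

def looks_like_kendra_index_id(value: str) -> bool:
    """Return True if the value matches the expected index ID shape."""
    return bool(_KENDRA_ID.match(value))
```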

View File

@@ -13,3 +13,11 @@ These documents can be loaded from [many sources](https://python.langchain.com/d
## LLM
Be sure that `OPENAI_API_KEY` is set in order to use the OpenAI models.
## Environment variables
You need to define the following environment variable
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
```

View File

@@ -10,4 +10,15 @@ Be sure that `OPENAI_API_KEY` is set in order to use the OpenAI models.
## Pinecone
This template uses Pinecone as a vectorstore and requires that `PINECONE_API_KEY`, `PINECONE_ENVIRONMENT`, and `PINECONE_INDEX` are set.
## Environment variables
You need to define the following environment variables
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
PINECONE_API_KEY=<YOUR_PINECONE_API_KEY>
PINECONE_ENVIRONMENT=<YOUR_PINECONE_ENVIRONMENT>
PINECONE_INDEX=<YOUR_PINECONE_INDEX>
```

View File

@@ -57,3 +57,14 @@ However, you can choose from a large number of document loaders [here](https://p
# from inside your LangServe instance
poe add rag-elasticsearch
```
## Environment variables
You need to define the following environment variables
```shell
ELASTIC_CLOUD_ID=<YOUR_ELASTIC_CLOUD_ID>
ELASTIC_USERNAME=<YOUR_ELASTIC_USERNAME>
ELASTIC_PASSWORD=<YOUR_ELASTIC_PASSWORD>
ES_URL=http://localhost:9200
```

View File

@@ -3,3 +3,14 @@
Re-implemented from [this GitHub repo](https://github.com/Raudaschl/rag-fusion), all credit to the original author.
> RAG-Fusion, a search methodology that aims to bridge the gap between traditional search paradigms and the multifaceted dimensions of human queries. Inspired by the capabilities of Retrieval Augmented Generation (RAG), this project goes a step further by employing multiple query generation and Reciprocal Rank Fusion to re-rank search results.
## Environment variables
You need to define the following environment variables
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
PINECONE_API_KEY=<YOUR_PINECONE_API_KEY>
PINECONE_ENVIRONMENT=<YOUR_PINECONE_ENVIRONMENT>
PINECONE_INDEX=<YOUR_PINECONE_INDEX>
```

View File

@@ -48,4 +48,15 @@ See `rag_pinecone_multi_query.ipynb` for example usage -
from langserve.client import RemoteRunnable
rag_app_pinecone = RemoteRunnable('http://0.0.0.0:8001/rag_pinecone_multi_query')
rag_app_pinecone.invoke("What are the different types of agent memory")
```
## Environment variables
You need to define the following environment variables
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
PINECONE_API_KEY=<YOUR_PINECONE_API_KEY>
PINECONE_ENVIRONMENT=<YOUR_PINECONE_ENVIRONMENT>
PINECONE_INDEX=<YOUR_PINECONE_INDEX>
```

View File

@@ -12,7 +12,7 @@ Be sure that you have set a few env variables in `chain.py`:
* `PINECONE_API_KEY`
* `PINECONE_ENV`
* `PINECONE_INDEX`
## LLM
@@ -20,4 +20,17 @@ Be sure that `OPENAI_API_KEY` is set in order to use the OpenAI models.
## Cohere
Be sure that `COHERE_API_KEY` is set in order to use the ReRank endpoint.
## Environment variables
You need to define the following environment variables
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
COHERE_API_KEY=<YOUR_COHERE_API_KEY>
PINECONE_API_KEY=<YOUR_PINECONE_API_KEY>
PINECONE_ENVIRONMENT=<YOUR_PINECONE_ENVIRONMENT>
PINECONE_INDEX=<YOUR_PINECONE_INDEX>
```

View File

@@ -10,8 +10,19 @@ Be sure that you have set a few env variables in `chain.py`:
* `PINECONE_API_KEY`
* `PINECONE_ENV`
* `PINECONE_INDEX`
## LLM
Be sure that `OPENAI_API_KEY` is set in order to use the OpenAI models.
## Environment variables
You need to define the following environment variables
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
PINECONE_API_KEY=<YOUR_PINECONE_API_KEY>
PINECONE_ENVIRONMENT=<YOUR_PINECONE_ENVIRONMENT>
PINECONE_INDEX=<YOUR_PINECONE_INDEX>
```

View File

@@ -74,4 +74,17 @@ langchain serve add rag-redis
Start the server:
```bash
langchain start
```
## Environment variables
You need to define the following environment variables
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
REDIS_HOST=<YOUR_REDIS_HOST>
REDIS_PORT=<YOUR_REDIS_PORT>
REDIS_USER=<YOUR_REDIS_USER>
REDIS_PASSWORD=<YOUR_REDIS_PASSWORD>
INDEX_NAME=<YOUR_INDEX_NAME>
```
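These variables are often combined into a single connection URL. A small sketch of one way to do that (the `redis://` URL format and the local defaults are assumptions; check how the template actually reads these variables):

```python
import os

def redis_url_from_env() -> str:
    """Assemble a redis:// URL from the environment, with local defaults."""
    host = os.environ.get("REDIS_HOST", "localhost")
    port = os.environ.get("REDIS_PORT", "6379")
    user = os.environ.get("REDIS_USER", "")
    password = os.environ.get("REDIS_PASSWORD", "")
    # Only include the credentials segment when a password is set.
    auth = f"{user}:{password}@" if password else ""
    return f"redis://{auth}{host}:{port}"
```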

View File

@@ -44,4 +44,12 @@ Start server:
langchain start
```
See the Jupyter notebook `rag_semi_structured` for various ways to connect to the template.
## Environment variables
You need to define the following environment variable
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
```

View File

@@ -61,6 +61,9 @@ Use these steps to setup your Supabase database if you haven't already.
Since we are using [`SupabaseVectorStore`](https://python.langchain.com/docs/integrations/vectorstores/supabase) and [`OpenAIEmbeddings`](https://python.langchain.com/docs/integrations/text_embedding/openai), we need to load their API keys.
## Environment variables
Create a `.env` file in the root of your project:
_.env_
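If you are not using a loader such as `python-dotenv`, a minimal stdlib sketch of reading such a file is below (illustrative only: it assumes simple `KEY=value` lines and ignores `#` comments, which is a subset of what real loaders support):

```python
import os

def load_dotenv_file(path: str = ".env") -> dict[str, str]:
    """Parse simple KEY=value lines from a .env file into os.environ."""
    loaded: dict[str, str] = {}
    with open(path) as handle:
        for line in handle:
            line = line.strip()
            # Skip blanks, comments, and lines without an assignment.
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            loaded[key.strip()] = value.strip().strip('"')
    os.environ.update(loaded)
    return loaded
```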

View File

@@ -13,4 +13,14 @@ Be sure that you have set a few env variables in `chain.py`:
## LLM
Be sure that `OPENAI_API_KEY` is set in order to use the OpenAI models.
## Environment variables
You need to define the following environment variables
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
WEAVIATE_ENVIRONMENT=<YOUR_WEAVIATE_ENVIRONMENT>
WEAVIATE_API_KEY=<YOUR_WEAVIATE_API_KEY>
```

View File

@@ -5,3 +5,12 @@
> Because the original query can not be always optimal to retrieve for the LLM, especially in the real world... we first prompt an LLM to rewrite the queries, then conduct retrieval-augmented reading
We show how you can easily do that with LangChain Expression Language.
## Environment variables
You need to define the following environment variables
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
```

View File

@@ -61,6 +61,9 @@ Use these steps to setup your Supabase database if you haven't already.
Since we are using [`SupabaseVectorStore`](https://python.langchain.com/docs/integrations/vectorstores/supabase) and [`OpenAIEmbeddings`](https://python.langchain.com/docs/integrations/text_embedding/openai), we need to load their API keys.
## Environment variables
Create a `.env` file in the root of your project:
_.env_

View File

@@ -18,4 +18,12 @@ You can see instructions to build this DB [here](https://github.com/facebookrese
This template will use a `Replicate` [hosted version](https://replicate.com/meta/llama-2-13b-chat/versions/f4e2de70d66816a838a89eeeb621910adffb0dd0baba3976c96980970978018d) of LLaMA2.
Be sure that `REPLICATE_API_TOKEN` is set in your environment.
## Environment variables
You need to define the following environment variable
```shell
REPLICATE_API_TOKEN=<YOUR_REPLICATE_API_TOKEN>
```

View File

@@ -7,3 +7,11 @@ Read the paper [here](https://arxiv.org/abs/2310.06117)
See an excellent blog post on this by Cobus Greyling [here](https://cobusgreyling.medium.com/a-new-prompt-engineering-technique-has-been-introduced-called-step-back-prompting-b00e8954cacb)
In this template we will replicate this technique. We modify the prompts used slightly to work better with chat models.
## Environment variables
You need to define the following environment variable
```shell
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
```

View File

@@ -14,3 +14,11 @@ To do this, we can use various prompts from LangChain hub, such as:
This template will use `Claude2` by default.
Be sure that `ANTHROPIC_API_KEY` is set in your environment.
## Environment variables
You need to define the following environment variable
```shell
ANTHROPIC_API_KEY=<YOUR_ANTHROPIC_API_KEY>
```

View File

@@ -15,3 +15,12 @@ Be sure that `ANTHROPIC_API_KEY` is set in your environment.
This template will use `You.com` by default.
Be sure that `YDC_API_KEY` is set in your environment.
## Environment variables
You need to define the following environment variables
```shell
ANTHROPIC_API_KEY=<YOUR_ANTHROPIC_API_KEY>
YDC_API_KEY=<YOUR_YDC_API_KEY>
```