Mirror of https://github.com/hwchase17/langchain.git, synced 2025-06-30 10:23:30 +00:00
[DOCS] Assorted wording, punctuation, and consistency revisions (#1443)
Contributing some small fixes I noticed while reading through the documentation. Thank you for creating and maintaining this project!
Parent: 519f0187b6 · Commit: 494c9d341a
@@ -1,12 +1,12 @@
 # AtlasDB
 
-This page covers how to Nomic's Atlas ecosystem within LangChain.
+This page covers how to use Nomic's Atlas ecosystem within LangChain.
 It is broken into two parts: installation and setup, and then references to specific Atlas wrappers.
 
 ## Installation and Setup
 - Install the Python package with `pip install nomic`
 - Nomic is also included in langchains poetry extras `poetry install -E all`
 
 ## Wrappers
 
 ### VectorStore
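For readers following along, here is a minimal sketch of the VectorStore wrapper this page points to (not part of this commit). It assumes the `AtlasDB` class in `langchain.vectorstores` follows the standard `from_texts` / `similarity_search` VectorStore interface; the project name and API key shown are placeholders.

```python
# Sketch only: assumes the AtlasDB wrapper accepts `name` and `api_key` like other
# LangChain vector stores; replace the placeholders with real values.
from langchain.vectorstores import AtlasDB

texts = [
    "LangChain integrates with Nomic Atlas.",
    "Atlas projects store and visualize embedded documents.",
]

db = AtlasDB.from_texts(
    texts=texts,
    name="langchain_demo_project",   # hypothetical Atlas project name
    api_key="YOUR_NOMIC_API_KEY",    # from your Nomic account
)

# Query the indexed texts
docs = db.similarity_search("What does Atlas store?", k=1)
print(docs[0].page_content)
```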
@@ -5,7 +5,7 @@ It is broken into two parts: installation and setup, and then references to spec
 
 ## Installation and Setup
 
-- Install with `pip3 install banana-dev`
+- Install with `pip install banana-dev`
 - Get an Banana api key and set it as an environment variable (`BANANA_API_KEY`)
 
 ## Define your Banana Template
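For context, a minimal sketch of calling a deployed Banana model through LangChain (not part of this commit). It assumes the `Banana` LLM wrapper and its `model_key` parameter; the key values are placeholders for your own deployment.

```python
# Sketch only: the model key comes from the Banana template you deploy, and the
# API key is read from the BANANA_API_KEY environment variable described above.
import os
from langchain.llms import Banana

os.environ["BANANA_API_KEY"] = "your-banana-api-key"   # placeholder

llm = Banana(model_key="your-model-key")  # key of the model deployed from your template
print(llm("Tell me a joke about serverless GPUs."))
```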
@@ -1,6 +1,6 @@
 # Graphsignal
 
-This page covers how to use the Graphsignal to trace and monitor LangChain.
+This page covers how to use the Graphsignal ecosystem to trace and monitor LangChain.
 
 ## Installation and Setup
 
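For context, a minimal sketch of tracing a LangChain run with Graphsignal (not part of this commit). It assumes `graphsignal.configure()` and `graphsignal.start_trace()` as the entry points; the deployment name is a placeholder and the API key is read from an environment variable.

```python
# Sketch only: configure once at startup, then wrap LangChain calls in a trace.
import os
import graphsignal

os.environ["GRAPHSIGNAL_API_KEY"] = "your-graphsignal-api-key"  # placeholder
graphsignal.configure(deployment="my-langchain-app")  # label shown in the Graphsignal dashboard

with graphsignal.start_trace("demo-run"):
    # Run your chain or LLM call here; the trace records latency, errors, and data stats.
    ...
```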
@@ -1,6 +1,6 @@
 # Helicone
 
-This page covers how to use the [Helicone](https://helicone.ai) within LangChain.
+This page covers how to use the [Helicone](https://helicone.ai) ecosystem within LangChain.
 
 ## What is Helicone?
 
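For context, a minimal sketch of routing LangChain's OpenAI calls through Helicone (not part of this commit). The proxy base URL is drawn from Helicone's own documentation of the time and may have changed, so verify it before use.

```python
# Sketch only: Helicone works as a proxy, so the only change is the OpenAI base URL.
import openai
from langchain.llms import OpenAI

openai.api_base = "https://oai.hconeai.com/v1"  # route requests through Helicone

llm = OpenAI(temperature=0.9)
print(llm("What is Helicone used for?"))
```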
@@ -10,7 +10,7 @@ One of the simpler forms of memory occurs in chatbots, where they remember previ
 There are a few different ways to accomplish this:
 - Buffer: This is just passing in the past `N` interactions in as context. `N` can be chosen based on a fixed number, the length of the interactions, or other!
 - Summary: This involves summarizing previous conversations and passing that summary in, instead of the raw dialouge itself. Compared to `Buffer`, this compresses information: meaning it is more lossy, but also less likely to run into context length limits.
-- Combination: A combination of the above two approaches, where you compute a summary but also pass in some previous interfactions directly!
+- Combination: A combination of the above two approaches, where you compute a summary but also pass in some previous interactions directly!
 
 ## Entity Memory
 A more complex form of memory is remembering information about specific entities in the conversation.
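For context, a minimal sketch of the three memory styles described in the hunk above (not part of this commit). The class names follow LangChain's memory module, but import paths have moved between versions, so treat the imports as illustrative.

```python
# Sketch only: Buffer keeps recent raw turns, Summary keeps a rolling summary,
# and the combination class keeps a summary plus the most recent raw turns.
from langchain.llms import OpenAI
from langchain.chains import ConversationChain
from langchain.memory import (
    ConversationBufferWindowMemory,    # Buffer: pass the last N interactions as context
    ConversationSummaryMemory,         # Summary: pass a summary instead of raw dialogue
    ConversationSummaryBufferMemory,   # Combination: summary plus recent interactions
)

llm = OpenAI(temperature=0)

chain = ConversationChain(
    llm=llm,
    memory=ConversationBufferWindowMemory(k=3),  # keep the last 3 exchanges verbatim
)
chain.predict(input="Hi, my name is Ada.")
print(chain.predict(input="What is my name?"))  # answered from the buffered turn
```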