mirror of
https://github.com/hwchase17/langchain.git
synced 2025-08-08 12:31:49 +00:00
Fix use case sentence for bash util doc (#1295)
Thanks for all your hard work! I noticed a small typo in the bash util doc, so here's a quick update. Additionally, my formatter caught some spacing in the `.md` as well. Happy to revert that if it's an issue. The main change is just

```diff
- A common use case this is for letting it interact with your local file system.
+ A common use case for this is letting the LLM interact with your local file system.
```

## Testing

`make docs_build` succeeds locally and the changes show as expected ✌️

<img width="704" alt="image" src="https://user-images.githubusercontent.com/17773666/221376160-e99e59a6-b318-49d1-a1d7-89f5c17cdab4.png">
This commit is contained in:
parent
fd9975dad7
commit
648b3b3909
@@ -6,7 +6,7 @@
   "metadata": {},
   "source": [
    "# Bash\n",
-   "It can often be useful to have an LLM generate bash commands, and then run them. A common use case this is for letting it interact with your local file system. We provide an easy util to execute bash commands."
+   "It can often be useful to have an LLM generate bash commands, and then run them. A common use case for this is letting the LLM interact with your local file system. We provide an easy util to execute bash commands."
   ]
  },
  {
@@ -1,6 +1,7 @@
 # Key Concepts

 ## Python REPL

 Sometimes, for complex calculations, rather than have an LLM generate the answer directly,
 it can be better to have the LLM generate code to calculate the answer, and then run that code to get the answer.
 In order to easily do that, we provide a simple Python REPL to execute commands in.
@@ -8,22 +9,27 @@ This interface will only return things that are printed -
 therefore, if you want to use it to calculate an answer, make sure to have it print out the answer.
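The REPL behavior described here — execute code, return only what it prints — can be sketched in a few lines of standard-library Python. This is an illustrative sketch, not LangChain's actual class; the `run_python` name is hypothetical.

```python
import contextlib
import io

# Hypothetical sketch of a "return only printed output" REPL utility.
def run_python(command: str) -> str:
    buffer = io.StringIO()
    # Capture anything the code prints; values that are merely computed
    # (but never printed) are not returned, matching the note above.
    with contextlib.redirect_stdout(buffer):
        exec(command, {})  # executes arbitrary code; unsafe outside a sandbox
    return buffer.getvalue()

print(run_python("print(2 ** 10)"))  # -> 1024
```

Note that `exec` runs arbitrary code with the process's privileges, so a real tool would sandbox or restrict it.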
 ## Bash

 It can often be useful to have an LLM generate bash commands, and then run them.
-A common use case this is for letting it interact with your local file system.
+A common use case for this is letting the LLM interact with your local file system.
 We provide an easy component to execute bash commands.
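A bash-execution component of this kind can be sketched with `subprocess` from the standard library. The `run_bash` helper below is illustrative only (it is not the util's real API) and assumes `bash` is on the PATH.

```python
import subprocess

# Hypothetical sketch of a bash-execution util: run a command string
# through bash and return its stdout, e.g. for file-system tasks.
def run_bash(command: str) -> str:
    result = subprocess.run(
        ["bash", "-c", command],
        capture_output=True,  # collect stdout/stderr instead of inheriting
        text=True,            # decode bytes to str
        check=True,           # raise CalledProcessError on nonzero exit
    )
    return result.stdout

print(run_bash("echo hello"))  # prints "hello"
```

As with the REPL, running model-generated shell commands is a security risk, so real deployments constrain what the command can touch.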
 ## Requests Wrapper

 The web contains a lot of information that LLMs do not have access to.
 In order to easily let LLMs interact with that information,
 we provide a wrapper around the Python Requests module that takes in a URL and fetches data from that URL.
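A URL-in, text-out wrapper of this shape can be sketched with the standard library alone (the real component wraps the third-party Requests module); `fetch_url` is a hypothetical name.

```python
from urllib.request import urlopen

# Hypothetical sketch of a URL-fetching wrapper: take a URL, return
# the body as text for the LLM to consume.
def fetch_url(url: str, timeout: float = 10.0) -> str:
    with urlopen(url, timeout=timeout) as response:
        return response.read().decode("utf-8", errors="replace")
```

With Requests itself this collapses to `requests.get(url).text`, which is presumably why the wrapper exists as a thin convenience layer.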
 ## Google Search

 This uses the official Google Search API to look up information on the web.

 ## SerpAPI

 This uses SerpAPI, a third party search API engine, to interact with Google Search.

 ## Searx Search

 This uses the Searx (SearxNG fork) meta search engine API to look up information
 on the web. It supports 139 search engines and is easy to self-host,
 which makes it a good choice for privacy-conscious users.