docs: update Jina ecosystem (#7149)

Documentation update for the [Jina ecosystem](https://python.langchain.com/docs/ecosystem/integrations/jina) page and for `langchain-serve` in the deployments section, to reflect the latest features. @hwchase17

This commit is contained in:
parent b39017dc11
commit da69a6771f

@@ -16,3 +16,59 @@ There exists a Jina Embeddings wrapper, which you can access with
from langchain.embeddings import JinaEmbeddings
```
For a more detailed walkthrough of this, see [this notebook](/docs/modules/data_connection/text_embedding/integrations/jina.html)
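
As a quick usage sketch (not part of the original page): the wrapper implements LangChain's standard `Embeddings` interface, so it exposes `embed_query` and `embed_documents`. The constructor arguments shown here (`jina_auth_token`, `model_name`) are assumptions and may differ in your installed version.

```python
# Hypothetical usage sketch — the constructor arguments are assumptions;
# embed_query/embed_documents are the standard Embeddings interface methods.
from langchain.embeddings import JinaEmbeddings

embeddings = JinaEmbeddings(
    jina_auth_token="<your-jina-auth-token>",  # assumed parameter name
    model_name="ViT-B-32::openai",             # assumed model identifier
)

query_vector = embeddings.embed_query("What is LangChain?")
doc_vectors = embeddings.embed_documents(["LangChain integrates with Jina."])
print(len(query_vector), len(doc_vectors))
```
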
## Deployment

[Langchain-serve](https://github.com/jina-ai/langchain-serve), powered by Jina, helps take LangChain apps to production with easy-to-use REST/WebSocket APIs and Slack bots.

### Usage

Install the package from PyPI.

```bash
pip install langchain-serve
```

Wrap your LangChain app with the `@serving` decorator.

```python
# app.py
from lcserve import serving


@serving
def ask(input: str) -> str:
    from langchain import LLMChain, OpenAI
    from langchain.agents import AgentExecutor, ZeroShotAgent

    tools = [...]  # list of tools
    # Build a zero-shot agent prompt over the tools with the expected input variables.
    prompt = ZeroShotAgent.create_prompt(
        tools, input_variables=["input", "agent_scratchpad"],
    )
    llm_chain = LLMChain(llm=OpenAI(temperature=0), prompt=prompt)
    agent = ZeroShotAgent(
        llm_chain=llm_chain, allowed_tools=[tool.name for tool in tools]
    )
    agent_executor = AgentExecutor.from_agent_and_tools(
        agent=agent,
        tools=tools,
        verbose=True,
    )
    return agent_executor.run(input)
```

Deploy on Jina AI Cloud with `lc-serve deploy jcloud app`. Once deployed, we can send a POST request to the API endpoint to get a response.

```bash
curl -X 'POST' 'https://<your-app>.wolf.jina.ai/ask' \
  -d '{
    "input": "Your Question here?",
    "envs": {
      "OPENAI_API_KEY": "sk-***"
    }
  }'
```

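For reference, the same request can be issued from Python. This is a minimal sketch that simply mirrors the curl call above; the response format is not specified here, so the raw response text is printed as-is.

```python
# Minimal sketch mirroring the curl example above with the `requests` library.
# The endpoint URL and JSON body come from the example; the response shape is
# an assumption, so only the status code and raw text are printed.
import requests

resp = requests.post(
    "https://<your-app>.wolf.jina.ai/ask",
    json={
        "input": "Your Question here?",
        "envs": {"OPENAI_API_KEY": "sk-***"},
    },
)
print(resp.status_code, resp.text)
```
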
You can also self-host the app on your own infrastructure with Docker Compose or Kubernetes. See [here](https://github.com/jina-ai/langchain-serve#-self-host-llm-apps-with-docker-compose-or-kubernetes) for more details.

Langchain-serve also lets you deploy apps with WebSocket APIs and Slack bots, both on [Jina AI Cloud](https://cloud.jina.ai/) and on self-hosted infrastructure.
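
As a hedged sketch of the WebSocket path: the langchain-serve README describes a `websocket=True` flag on the `@serving` decorator, but the exact signature and the `streaming_handler` keyword below are assumptions and should be verified against the installed version.

```python
# app.py — assumed WebSocket variant: the websocket=True flag and the
# streaming_handler kwarg are assumptions based on the langchain-serve README.
from lcserve import serving


@serving(websocket=True)
def talk(input: str, **kwargs) -> str:
    from langchain import LLMChain, OpenAI, PromptTemplate

    # If lcserve provides a streaming_handler, wire it into the LLM's callbacks
    # so tokens are streamed over the WebSocket connection as they are generated.
    streaming_handler = kwargs.get("streaming_handler")
    llm = OpenAI(
        temperature=0,
        streaming=streaming_handler is not None,
        callbacks=[streaming_handler] if streaming_handler else None,
    )
    chain = LLMChain(llm=llm, prompt=PromptTemplate.from_template("{input}"))
    return chain.run(input=input)
```
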
@@ -24,6 +24,7 @@ Understanding these components is crucial when assessing serving systems. LangCh
- [BentoML](https://github.com/bentoml/BentoML)
- [OpenLLM](/docs/ecosystem/integrations/openllm.html)
- [Modal](/docs/ecosystem/integrations/modal.html)
- [Jina](/docs/ecosystem/integrations/jina.html#deployment)

These links will provide further information on each ecosystem, assisting you in finding the best fit for your LLM deployment needs.

@@ -61,7 +61,7 @@ This repository contains LangChain adapters for Steamship, enabling LangChain de

## [Langchain-serve](https://github.com/jina-ai/langchain-serve)

This repository allows users to serve local chains and agents as RESTful, gRPC, or WebSocket APIs, thanks to [Jina](https://docs.jina.ai/). Deploy your chains & agents with ease and enjoy independent scaling, serverless and autoscaling APIs, as well as a Streamlit playground on Jina AI Cloud.
This repository allows users to deploy any LangChain app as REST/WebSocket APIs or Slack bots with ease. Benefit from the scalability and serverless architecture of Jina AI Cloud, or deploy on-premise with Kubernetes.

## [BentoML](https://github.com/ssheng/BentoChain)