diff --git a/docs/docs/integrations/platforms/microsoft.mdx b/docs/docs/integrations/platforms/microsoft.mdx
index 8eaf7a3a04a..5cf1c36c452 100644
--- a/docs/docs/integrations/platforms/microsoft.mdx
+++ b/docs/docs/integrations/platforms/microsoft.mdx
@@ -3,6 +3,15 @@ All functionality related to `Microsoft Azure` and other `Microsoft` products.
 ## LLMs
 
+
+### Azure ML
+
+See a [usage example](/docs/integrations/llms/azure_ml).
+
+```python
+from langchain_community.llms.azureml_endpoint import AzureMLOnlineEndpoint
+```
+
 ### Azure OpenAI
 
 See a [usage example](/docs/integrations/llms/azure_openai).
 
diff --git a/docs/docs/integrations/providers/arcee.mdx b/docs/docs/integrations/providers/arcee.mdx
new file mode 100644
index 00000000000..b685dd9b2d7
--- /dev/null
+++ b/docs/docs/integrations/providers/arcee.mdx
@@ -0,0 +1,30 @@
+# Arcee
+
+>[Arcee](https://www.arcee.ai/about/about-us) enables the development and advancement
+> of what it calls SLMs: small, specialized, secure, and scalable language models.
+> By offering an SLM Adaptation System and seamless, secure integration,
+> `Arcee` empowers enterprises to harness the full potential of
+> domain-adapted language models, driving transformative
+> innovation in their operations.
+
+
+## Installation and Setup
+
+Get your `Arcee API` key.
+
+
+## LLMs
+
+See a [usage example](/docs/integrations/llms/arcee).
+
+```python
+from langchain_community.llms import Arcee
+```
+
+## Retrievers
+
+See a [usage example](/docs/integrations/retrievers/arcee).
+
+```python
+from langchain_community.retrievers import ArceeRetriever
+```
diff --git a/docs/docs/integrations/providers/baidu.mdx b/docs/docs/integrations/providers/baidu.mdx
new file mode 100644
index 00000000000..20cc3a5c763
--- /dev/null
+++ b/docs/docs/integrations/providers/baidu.mdx
@@ -0,0 +1,50 @@
+# Baidu
+
+>[Baidu Cloud](https://cloud.baidu.com/) is a cloud service provided by `Baidu, Inc.`,
+> headquartered in Beijing. It offers a cloud storage service, client software,
+> file management, resource sharing, and third-party integration.
+
+
+## Installation and Setup
+
+Register and get the `Qianfan` `AK` and `SK` keys [here](https://cloud.baidu.com/product/wenxinworkshop).
+
+## LLMs
+
+### Baidu Qianfan
+
+See a [usage example](/docs/integrations/llms/baidu_qianfan_endpoint).
+
+```python
+from langchain_community.llms import QianfanLLMEndpoint
+```
+
+## Chat models
+
+### Qianfan Chat Endpoint
+
+See a [usage example](/docs/integrations/chat/baidu_qianfan_endpoint).
+
+```python
+from langchain_community.chat_models import QianfanChatEndpoint
+```
+
+## Embedding models
+
+### Baidu Qianfan
+
+See a [usage example](/docs/integrations/text_embedding/baidu_qianfan_endpoint).
+
+```python
+from langchain_community.embeddings import QianfanEmbeddingsEndpoint
+```
+
+## Vector stores
+
+### Baidu Cloud ElasticSearch VectorSearch
+
+See a [usage example](/docs/integrations/vectorstores/baiducloud_vector_search).
+
+```python
+from langchain_community.vectorstores import BESVectorStore
+```
diff --git a/docs/docs/integrations/providers/ctranslate2.mdx b/docs/docs/integrations/providers/ctranslate2.mdx
new file mode 100644
index 00000000000..0e3c3a9319e
--- /dev/null
+++ b/docs/docs/integrations/providers/ctranslate2.mdx
@@ -0,0 +1,30 @@
+# CTranslate2
+
+>[CTranslate2](https://opennmt.net/CTranslate2/quickstart.html) is a C++ and Python library
+> for efficient inference with Transformer models.
+>
+>The project implements a custom runtime that applies many performance optimization
+> techniques such as weights quantization, layers fusion, batch reordering, etc.,
+> to accelerate and reduce the memory usage of Transformer models on CPU and GPU.
+>
+>A full list of features and supported models is included in the
+> [project’s repository](https://opennmt.net/CTranslate2/guides/transformers.html).
+> To start, please check out the official [quickstart guide](https://opennmt.net/CTranslate2/quickstart.html).
+
+
+## Installation and Setup
+
+Install the Python package:
+
+```bash
+pip install ctranslate2
+```
+
+
+## LLMs
+
+See a [usage example](/docs/integrations/llms/ctranslate2).
+
+```python
+from langchain_community.llms import CTranslate2
+```
diff --git a/docs/docs/integrations/providers/deepsparse.mdx b/docs/docs/integrations/providers/deepsparse.mdx
index aa6905a1eb0..879b07c55c9 100644
--- a/docs/docs/integrations/providers/deepsparse.mdx
+++ b/docs/docs/integrations/providers/deepsparse.mdx
@@ -8,9 +8,8 @@ It is broken into two parts: installation and setup, and then examples of DeepSparse usage.
 - Install the Python package with `pip install deepsparse`
 - Choose a [SparseZoo model](https://sparsezoo.neuralmagic.com/?useCase=text_generation) or export a support model to ONNX [using Optimum](https://github.com/neuralmagic/notebooks/blob/main/notebooks/opt-text-generation-deepsparse-quickstart/OPT_Text_Generation_DeepSparse_Quickstart.ipynb)
 
-## Wrappers
 
-### LLM
+## LLMs
 
 There exists a DeepSparse LLM wrapper, which you can access with:
 
diff --git a/docs/docs/integrations/providers/edenai.mdx b/docs/docs/integrations/providers/edenai.mdx
new file mode 100644
index 00000000000..a33e92ec6a9
--- /dev/null
+++ b/docs/docs/integrations/providers/edenai.mdx
@@ -0,0 +1,61 @@
+# Eden AI
+
+>[Eden AI](https://docs.edenai.co/docs/getting-started-with-eden-ai) provides a user interface (UI)
+> designed for handling AI projects. With the `Eden AI Portal`,
+> you can perform no-code AI using the best engines on the market.
+
+
+## Installation and Setup
+
+Accessing the Eden AI API requires an API key, which you can get by
+[creating an account](https://app.edenai.run/user/register) and
+heading [here](https://app.edenai.run/admin/account/settings).
+
+## LLMs
+
+See a [usage example](/docs/integrations/llms/edenai).
+
+```python
+from langchain_community.llms import EdenAI
+```
+
+## Chat models
+
+See a [usage example](/docs/integrations/chat/edenai).
+
+```python
+from langchain_community.chat_models.edenai import ChatEdenAI
+```
+
+## Embedding models
+
+See a [usage example](/docs/integrations/text_embedding/edenai).
+
+```python
+from langchain_community.embeddings.edenai import EdenAiEmbeddings
+```
+
+## Tools
+
+Eden AI provides a list of tools that grant your agent the ability to perform multiple tasks, such as:
+* speech to text
+* text to speech
+* text explicit content detection
+* image explicit content detection
+* object detection
+* OCR invoice parsing
+* OCR ID parsing
+
+See a [usage example](/docs/integrations/tools/edenai_tools).
+
+```python
+from langchain_community.tools.edenai import (
+    EdenAiExplicitImageTool,
+    EdenAiObjectDetectionTool,
+    EdenAiParsingIDTool,
+    EdenAiParsingInvoiceTool,
+    EdenAiSpeechToTextTool,
+    EdenAiTextModerationTool,
+    EdenAiTextToSpeechTool,
+)
+```
diff --git a/docs/docs/integrations/providers/elevenlabs.mdx b/docs/docs/integrations/providers/elevenlabs.mdx
new file mode 100644
index 00000000000..56352730478
--- /dev/null
+++ b/docs/docs/integrations/providers/elevenlabs.mdx
@@ -0,0 +1,27 @@
+# ElevenLabs
+
+>[ElevenLabs](https://elevenlabs.io/about) is a voice AI research & deployment company
+> with a mission to make content universally accessible in any language & voice.
+>
+>`ElevenLabs` creates the most realistic, versatile, and contextually-aware
+> AI audio, providing the ability to generate speech in hundreds of
+> new and existing voices in 29 languages.
+
+## Installation and Setup
+
+First, you need to set up an ElevenLabs account. You can follow the
+[instructions here](https://docs.elevenlabs.io/welcome/introduction).
+
+Install the Python package:
+
+```bash
+pip install elevenlabs
+```
+
+## Tools
+
+See a [usage example](/docs/integrations/tools/eleven_labs_tts).
+
+```python
+from langchain_community.tools import ElevenLabsText2SpeechTool
+```
diff --git a/docs/docs/integrations/providers/pygmalionai.mdx b/docs/docs/integrations/providers/pygmalionai.mdx
new file mode 100644
index 00000000000..2d98fdf38c0
--- /dev/null
+++ b/docs/docs/integrations/providers/pygmalionai.mdx
@@ -0,0 +1,39 @@
+# PygmalionAI
+
+>[PygmalionAI](https://pygmalion.chat/) is a company supporting
+> open-source models by serving an inference endpoint
+> for the [Aphrodite Engine](https://github.com/PygmalionAI/aphrodite-engine).
+
+
+## Installation and Setup
+
+Install the Python package:
+
+```bash
+pip install aphrodite-engine
+```
+
+## LLMs
+
+See a [usage example](/docs/integrations/llms/aphrodite).
+
+```python
+from langchain_community.llms import Aphrodite
+```
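+
+The wrapper follows the standard LLM interface. A minimal sketch (the model id and
+sampling parameters below are illustrative placeholders, not recommendations):
+
+```python
+from langchain_community.llms import Aphrodite
+
+# Load a model with the locally installed Aphrodite Engine (requires a GPU);
+# "PygmalionAI/pygmalion-2-7b" is only an example model id.
+llm = Aphrodite(
+    model="PygmalionAI/pygmalion-2-7b",
+    max_tokens=128,
+    temperature=0.8,
+)
+
+print(llm.invoke("What is the Aphrodite Engine?"))
+```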