# Hugging Face
All functionality related to the [Hugging Face Platform](https://huggingface.co/).
## Installation
Most of the Hugging Face integrations are available in the `langchain-huggingface` package.
```bash
pip install langchain-huggingface
```
## Chat models
### Models from Hugging Face
The `ChatHuggingFace` class wraps the Hugging Face LLM classes (such as `HuggingFacePipeline` and `HuggingFaceEndpoint`) so they can be used as chat models.
See a [usage example](/docs/integrations/chat/huggingface).
```python
from langchain_huggingface import ChatHuggingFace
```
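A minimal sketch of wrapping an LLM in `ChatHuggingFace` (the repo ID, generation settings, and prompt are illustrative, and a Hugging Face API token is assumed to be set in the environment):
```python
from langchain_huggingface import ChatHuggingFace, HuggingFaceEndpoint

# Any Hugging Face LLM can be wrapped; this repo ID is only an example.
llm = HuggingFaceEndpoint(
    repo_id="HuggingFaceH4/zephyr-7b-beta",
    task="text-generation",
    max_new_tokens=64,
)
chat_model = ChatHuggingFace(llm=llm)
chat_model.invoke("What is the capital of France?")
```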
## LLMs
### Hugging Face Local Pipelines
Hugging Face models can be run locally through the `HuggingFacePipeline` class.
See a [usage example](/docs/integrations/llms/huggingface_pipelines).
```python
from langchain_huggingface import HuggingFacePipeline
```
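A minimal local sketch (the model ID and generation settings are illustrative; the model is downloaded from the Hub on first use):
```python
from langchain_huggingface import HuggingFacePipeline

# Build a local text-generation pipeline from a Hub model ID.
llm = HuggingFacePipeline.from_model_id(
    model_id="gpt2",
    task="text-generation",
    pipeline_kwargs={"max_new_tokens": 20},
)
llm.invoke("Once upon a time")
```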
## Embedding Models
### HuggingFaceEmbeddings
See a [usage example](/docs/integrations/text_embedding/huggingfacehub).
```python
from langchain_huggingface import HuggingFaceEmbeddings
```
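A minimal sketch (the sentence-transformers model name is illustrative):
```python
from langchain_huggingface import HuggingFaceEmbeddings

# Embeds text locally with a sentence-transformers model.
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-mpnet-base-v2")
vector = embeddings.embed_query("Hello, world!")
```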
### HuggingFaceInstructEmbeddings
See a [usage example](/docs/integrations/text_embedding/instruct_embeddings).
```python
from langchain_community.embeddings import HuggingFaceInstructEmbeddings
```
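A minimal sketch (the model name and query instruction are illustrative; the `InstructorEmbedding` and `sentence_transformers` packages are assumed to be installed):
```python
from langchain_community.embeddings import HuggingFaceInstructEmbeddings

# Instruction-tuned embeddings; the instruction prefixes every query.
embeddings = HuggingFaceInstructEmbeddings(
    model_name="hkunlp/instructor-large",
    query_instruction="Represent the query for retrieval: ",
)
vector = embeddings.embed_query("Hello, world!")
```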
### HuggingFaceBgeEmbeddings
>[BGE models on Hugging Face](https://huggingface.co/BAAI/bge-large-en) are [among the best open-source embedding models](https://huggingface.co/spaces/mteb/leaderboard).
>BGE models are created by the [Beijing Academy of Artificial Intelligence (BAAI)](https://en.wikipedia.org/wiki/Beijing_Academy_of_Artificial_Intelligence). `BAAI` is a private non-profit organization engaged in AI research and development.
See a [usage example](/docs/integrations/text_embedding/bge_huggingface).
```python
from langchain_community.embeddings import HuggingFaceBgeEmbeddings
```
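A minimal sketch (the model name and settings are illustrative; BGE models are typically used with normalized embeddings):
```python
from langchain_community.embeddings import HuggingFaceBgeEmbeddings

# Runs a BGE model locally via sentence-transformers.
embeddings = HuggingFaceBgeEmbeddings(
    model_name="BAAI/bge-small-en",
    model_kwargs={"device": "cpu"},
    encode_kwargs={"normalize_embeddings": True},
)
vector = embeddings.embed_query("Hello, world!")
```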
### Hugging Face Text Embeddings Inference (TEI)
>[Hugging Face Text Embeddings Inference (TEI)](https://huggingface.co/docs/text-embeddings-inference/index) is a toolkit for deploying and serving open-source
> text embeddings and sequence classification models. `TEI` enables high-performance extraction for the most popular models,
>including `FlagEmbedding`, `Ember`, `GTE` and `E5`.
We need to install the `huggingface-hub` python package.
```bash
pip install huggingface-hub
```
See a [usage example](/docs/integrations/text_embedding/text_embeddings_inference).
```python
from langchain_community.embeddings import HuggingFaceHubEmbeddings
```
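A minimal sketch, assuming a TEI server is already running and reachable at a local URL (the URL is illustrative):
```python
from langchain_community.embeddings import HuggingFaceHubEmbeddings

# Point the embeddings class at a self-hosted TEI endpoint instead of the Hub.
embeddings = HuggingFaceHubEmbeddings(model="http://localhost:8080")
vector = embeddings.embed_query("Hello, world!")
```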
## Document Loaders
### Hugging Face dataset
>[Hugging Face Hub](https://huggingface.co/docs/hub/index) is home to over 75,000
> [datasets](https://huggingface.co/docs/hub/index#datasets) in more than 100 languages
> that can be used for a broad range of tasks across NLP, Computer Vision, and Audio.
> They are used for a diverse range of tasks such as translation, automatic speech
> recognition, and image classification.
We need to install the `datasets` python package.
```bash
pip install datasets
```
See a [usage example](/docs/integrations/document_loaders/hugging_face_dataset).
```python
from langchain_community.document_loaders.hugging_face_dataset import HuggingFaceDatasetLoader
```
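A minimal sketch (the dataset name and text column are illustrative):
```python
from langchain_community.document_loaders.hugging_face_dataset import HuggingFaceDatasetLoader

# Load a Hub dataset, mapping one column to the document page content.
loader = HuggingFaceDatasetLoader(path="imdb", page_content_column="text")
docs = loader.load()
```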
## Tools
### Hugging Face Hub Tools
>[Hugging Face Tools](https://huggingface.co/docs/transformers/v4.29.0/en/custom_tools)
> support text I/O and are loaded using the `load_huggingface_tool` function.
We need to install several python packages.
```bash
pip install transformers huggingface_hub
```
See a [usage example](/docs/integrations/tools/huggingface_tools).
```python
from langchain.agents import load_huggingface_tool
```
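A minimal sketch (the tool repo ID and task name are illustrative):
```python
from langchain.agents import load_huggingface_tool

# Load a tool hosted on the Hub and run it on an input string.
tool = load_huggingface_tool("lysandre/hf-model-downloads")
tool.run("text-classification")
```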