{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Infinity\n",
    "\n",
    "`Infinity` lets you create `Embeddings` using an MIT-licensed embedding server.\n",
    "\n",
    "This notebook goes over how to use LangChain embeddings with the [Infinity GitHub project](https://github.com/michaelfeil/infinity).\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Imports"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {},
   "outputs": [],
   "source": [
    "from langchain_community.embeddings import InfinityEmbeddings, InfinityEmbeddingsLocal"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Option 1: Use Infinity from Python"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### Optional: install Infinity\n",
    "\n",
    "To install Infinity, use the following command. For further details, check out the [docs on GitHub](https://github.com/michaelfeil/infinity).\n",
    "This installs the torch and onnx dependencies.\n",
    "\n",
    "```bash\n",
    "pip install infinity_emb[torch,optimum]\n",
    "```"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {},
   "outputs": [],
   "source": [
    "documents = [\n",
    "    \"Baguette is a dish.\",\n",
    "    \"Paris is the capital of France.\",\n",
    "    \"numpy is a lib for linear algebra\",\n",
    "    \"You escaped what I've escaped - You'd be in Paris getting fucked up too\",\n",
    "]\n",
    "query = \"Where is Paris?\""
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {},
   "outputs": [
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/home/michael/langchain/libs/langchain/.venv/lib/python3.10/site-packages/tqdm/auto.py:21: TqdmWarning: IProgress not found. Please update jupyter and ipywidgets. See https://ipywidgets.readthedocs.io/en/stable/user_install.html\n",
      " from .autonotebook import tqdm as notebook_tqdm\n",
      "The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.\n",
      "/home/michael/langchain/libs/langchain/.venv/lib/python3.10/site-packages/optimum/bettertransformer/models/encoder_models.py:301: UserWarning: The PyTorch API of nested tensors is in prototype stage and will change in the near future. (Triggered internally at ../aten/src/ATen/NestedTensorImpl.cpp:177.)\n",
      " hidden_states = torch._nested_tensor_from_mask(hidden_states, ~attention_mask)\n"
     ]
    }
   ],
   "source": [
    "embeddings = InfinityEmbeddingsLocal(\n",
    "    model=\"sentence-transformers/all-MiniLM-L6-v2\",\n",
    "    # revision of the model on the Hugging Face Hub (None = latest)\n",
    "    revision=None,\n",
    "    # best to keep at 32\n",
    "    batch_size=32,\n",
    "    # for AMD/Nvidia GPUs via torch\n",
    "    device=\"cuda\",\n",
    ")\n",
    "\n",
    "\n",
    "async def embed():\n",
    "    # TODO: This function is just to showcase that your call can run async.\n",
    "\n",
    "    # important: use the engine inside an `async with` statement to start/stop the batching engine.\n",
    "    async with embeddings:\n",
    "        # avoid closing and starting the engine often;\n",
    "        # rather, keep it running.\n",
    "        # you may call `await embeddings.__aenter__()` and `await embeddings.__aexit__()`\n",
    "        # if you want to start/stop execution manually in a more granular way\n",
    "\n",
    "        documents_embedded = await embeddings.aembed_documents(documents)\n",
    "        query_result = await embeddings.aembed_query(query)\n",
    "        print(\"embeddings created successfully\")\n",
    "        return documents_embedded, query_result"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# run the async code however you like;\n",
    "# in a Jupyter notebook, you can await it directly:\n",
    "documents_embedded, query_result = await embed()"
   ]
  },
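  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Outside Jupyter there is no running event loop, so the coroutine has to be driven explicitly. A minimal sketch for a plain Python script (assuming the `embed` function defined above):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# in a plain .py script (not inside Jupyter, which already runs an event loop):\n",
    "#\n",
    "# import asyncio\n",
    "#\n",
    "# documents_embedded, query_result = asyncio.run(embed())"
   ]
  },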
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# (demo) compute similarity\n",
    "import numpy as np\n",
    "\n",
    "scores = np.array(documents_embedded) @ np.array(query_result).T\n",
    "dict(zip(documents, scores))"
   ]
  },
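  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The raw dot product equals cosine similarity only when the embeddings are unit-length. If you are unsure whether your model returns normalized vectors, normalizing explicitly makes the scores comparable across models. A small sketch using only numpy:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# (demo) cosine similarity with explicit normalization\n",
    "import numpy as np\n",
    "\n",
    "doc_arr = np.array(documents_embedded)\n",
    "query_arr = np.array(query_result)\n",
    "cosine = (doc_arr @ query_arr) / (\n",
    "    np.linalg.norm(doc_arr, axis=1) * np.linalg.norm(query_arr)\n",
    ")\n",
    "dict(zip(documents, cosine))"
   ]
  },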
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Option 2: Run the server, and connect via the API"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### Optional: Make sure to start the Infinity instance\n",
    "\n",
    "To install Infinity, use the following command. For further details, check out the [docs on GitHub](https://github.com/michaelfeil/infinity).\n",
    "```bash\n",
    "pip install infinity_emb[all]\n",
    "```"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Install the infinity package\n",
    "%pip install --upgrade --quiet infinity_emb[all]"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Start up the server - this is best done from a separate terminal, not inside a Jupyter notebook:\n",
    "\n",
    "```bash\n",
    "model=sentence-transformers/all-MiniLM-L6-v2\n",
    "port=7797\n",
    "infinity_emb --port $port --model-name-or-path $model\n",
    "```\n",
    "\n",
    "or alternatively just use Docker:\n",
    "```bash\n",
    "model=sentence-transformers/all-MiniLM-L6-v2\n",
    "port=7797\n",
    "docker run -it --gpus all -p $port:$port michaelf34/infinity:latest --model-name-or-path $model --port $port\n",
    "```"
   ]
  },
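  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Before wiring up LangChain, you can sanity-check the server directly: Infinity serves embeddings on an OpenAI-compatible `/v1/embeddings` route. The sketch below assumes the server from the commands above is listening on port 7797 and uses only the standard library:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# (optional) sanity-check the /v1/embeddings route with the standard library\n",
    "import json\n",
    "import urllib.request\n",
    "\n",
    "payload = json.dumps(\n",
    "    {\"model\": \"sentence-transformers/all-MiniLM-L6-v2\", \"input\": [\"Where is Paris?\"]}\n",
    ").encode()\n",
    "req = urllib.request.Request(\n",
    "    \"http://localhost:7797/v1/embeddings\",\n",
    "    data=payload,\n",
    "    headers={\"Content-Type\": \"application/json\"},\n",
    ")\n",
    "try:\n",
    "    with urllib.request.urlopen(req, timeout=5) as resp:\n",
    "        dim = len(json.load(resp)[\"data\"][0][\"embedding\"])\n",
    "    print(f\"server is up, embedding dimension: {dim}\")\n",
    "except OSError as ex:\n",
    "    print(f\"server not reachable: {ex}\")"
   ]
  },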
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Embed your documents using your Infinity instance"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [],
   "source": [
    "documents = [\n",
    "    \"Baguette is a dish.\",\n",
    "    \"Paris is the capital of France.\",\n",
    "    \"numpy is a lib for linear algebra\",\n",
    "    \"You escaped what I've escaped - You'd be in Paris getting fucked up too\",\n",
    "]\n",
    "query = \"Where is Paris?\""
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Make sure the infinity instance is running. Verify by clicking on http://localhost:7797/docs Exception: HTTPConnectionPool(host='localhost', port=7797): Max retries exceeded with url: /v1/embeddings (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f91c35dbd30>: Failed to establish a new connection: [Errno 111] Connection refused')). \n"
     ]
    }
   ],
   "source": [
    "infinity_api_url = \"http://localhost:7797/v1\"\n",
    "# model is currently not validated.\n",
    "embeddings = InfinityEmbeddings(\n",
    "    model=\"sentence-transformers/all-MiniLM-L6-v2\", infinity_api_url=infinity_api_url\n",
    ")\n",
    "try:\n",
    "    documents_embedded = embeddings.embed_documents(documents)\n",
    "    query_result = embeddings.embed_query(query)\n",
    "    print(\"embeddings created successfully\")\n",
    "except Exception as ex:\n",
    "    print(\n",
    "        \"Make sure the infinity instance is running. Verify by clicking on \"\n",
    "        f\"{infinity_api_url.replace('v1', 'docs')} Exception: {ex}. \"\n",
    "    )"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "{'Baguette is a dish.': 0.31344215908661155,\n",
       " 'Paris is the capital of France.': 0.8148670296896388,\n",
       " 'numpy is a lib for linear algebra': 0.004429399861302009,\n",
       " \"You escaped what I've escaped - You'd be in Paris getting fucked up too\": 0.5088476180154582}"
      ]
     },
     "execution_count": 5,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# (demo) compute similarity\n",
    "import numpy as np\n",
    "\n",
    "scores = np.array(documents_embedded) @ np.array(query_result).T\n",
    "dict(zip(documents, scores))"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.10.12"
  },
  "vscode": {
   "interpreter": {
    "hash": "a0a0263b650d907a3bfe41c0f8d6a63a071b884df3cfdc1579f00cdc1aed6b03"
   }
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}