{
 "cells": [
  {
   "cell_type": "raw",
   "id": "afaf8039",
   "metadata": {},
   "source": [
    "---\n",
    "sidebar_label: Outlines\n",
    "---"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "e49f1e0d",
   "metadata": {},
   "source": [
    "# ChatOutlines\n",
    "\n",
    "This will help you get started with Outlines [chat models](/docs/concepts/chat_models/). For detailed documentation of all ChatOutlines features and configurations, head to the [API reference](https://python.langchain.com/api_reference/community/chat_models/langchain_community.chat_models.outlines.ChatOutlines.html).\n",
    "\n",
    "[Outlines](https://github.com/outlines-dev/outlines) is a library for constrained language generation. It lets you run large language models (LLMs) on various backends while applying constraints to the generated output.\n",
    "\n",
    "## Overview\n",
    "### Integration details\n",
    "\n",
    "| Class | Package | Local | Serializable | JS support | Package downloads | Package latest |\n",
    "| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n",
    "| [ChatOutlines](https://python.langchain.com/api_reference/community/chat_models/langchain_community.chat_models.outlines.ChatOutlines.html) | [langchain-community](https://python.langchain.com/api_reference/community/index.html) | ✅ | ❌ | ❌ |  |  |\n",
    "\n",
    "### Model features\n",
    "| [Tool calling](/docs/how_to/tool_calling) | [Structured output](/docs/how_to/structured_output/) | JSON mode | [Image input](/docs/how_to/multimodal_inputs/) | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | Native async | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n",
    "| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n",
    "| ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ | ❌ | ❌ | ❌ |\n",
    "\n",
    "## Setup\n",
    "\n",
    "To access Outlines models you'll need an internet connection to download the model weights from Hugging Face. Depending on the backend, you also need to install the corresponding dependencies (see the [Outlines docs](https://dottxt-ai.github.io/outlines/latest/installation/)).\n",
    "\n",
    "### Credentials\n",
    "\n",
    "There is no built-in auth mechanism for Outlines.\n",
    "\n",
    "### Installation\n",
    "\n",
    "The LangChain Outlines integration lives in the `langchain-community` package and requires the `outlines` library:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "652d6238-1f87-422a-b135-f5abbb8652fc",
   "metadata": {},
   "outputs": [],
   "source": [
    "%pip install -qU langchain-community outlines"
   ]
  },
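  {
   "cell_type": "markdown",
   "id": "backend-deps-note",
   "metadata": {},
   "source": [
    "Each backend also needs its own Python package. The cell below is a minimal, optional sketch of the usual installs per backend — treat the package names as the commonly used Outlines backend dependencies rather than an exhaustive or authoritative list, and check the Outlines installation docs for your platform:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "backend-deps-install",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Optional backend dependencies -- install only what you plan to use.\n",
    "# These are the packages commonly required per backend; see the Outlines\n",
    "# installation docs for your platform before relying on them.\n",
    "# %pip install -qU llama-cpp-python    # llamacpp backend\n",
    "# %pip install -qU vllm                # vllm backend (not available on Mac)\n",
    "# %pip install -qU mlx-lm              # mlxlm backend (Apple Silicon only)\n",
    "# %pip install -qU transformers torch  # default transformers backend"
   ]
  },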
  {
   "cell_type": "markdown",
   "id": "a38cde65-254d-4219-a441-068766c0d4b5",
   "metadata": {},
   "source": [
    "## Instantiation\n",
    "\n",
    "Now we can instantiate our model object and generate chat completions:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "cb09c344-1836-4e0c-acf8-11d13ac1dbae",
   "metadata": {},
   "outputs": [],
   "source": [
    "from langchain_community.chat_models.outlines import ChatOutlines\n",
    "\n",
    "# For llamacpp backend\n",
    "model = ChatOutlines(model=\"TheBloke/phi-2-GGUF/phi-2.Q4_K_M.gguf\", backend=\"llamacpp\")\n",
    "\n",
    "# For vllm backend (not available on Mac)\n",
    "model = ChatOutlines(model=\"meta-llama/Llama-3.2-1B\", backend=\"vllm\")\n",
    "\n",
    "# For mlxlm backend (only available on Mac)\n",
    "model = ChatOutlines(model=\"mistralai/Ministral-8B-Instruct-2410\", backend=\"mlxlm\")\n",
    "\n",
    "# For huggingface transformers backend\n",
    "model = ChatOutlines(model=\"microsoft/phi-2\")  # defaults to transformers backend"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "2b4f3e15",
   "metadata": {},
   "source": [
    "## Invocation"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "62e0dbc3",
   "metadata": {
    "tags": []
   },
   "outputs": [],
   "source": [
    "from langchain_core.messages import HumanMessage\n",
    "\n",
    "messages = [HumanMessage(content=\"What will the capital of Mars be called?\")]\n",
    "response = model.invoke(messages)\n",
    "\n",
    "response.content"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "18e2bfc0-7e78-4528-a73f-499ac150dca8",
   "metadata": {},
   "source": [
    "## Streaming\n",
    "\n",
    "ChatOutlines supports streaming of tokens:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "e197d1d7-a070-4c96-9f8a-a0e86d046e0b",
   "metadata": {},
   "outputs": [],
   "source": [
    "messages = [HumanMessage(content=\"Count to 10 in French:\")]\n",
    "\n",
    "for chunk in model.stream(messages):\n",
    "    print(chunk.content, end=\"\", flush=True)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "ccc3e2f6",
   "metadata": {},
   "source": [
    "## Chaining"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "5a032003",
   "metadata": {},
   "outputs": [],
   "source": [
    "from langchain_core.prompts import ChatPromptTemplate\n",
    "\n",
    "prompt = ChatPromptTemplate.from_messages(\n",
    "    [\n",
    "        (\n",
    "            \"system\",\n",
    "            \"You are a helpful assistant that translates {input_language} to {output_language}.\",\n",
    "        ),\n",
    "        (\"human\", \"{input}\"),\n",
    "    ]\n",
    ")\n",
    "\n",
    "chain = prompt | model\n",
    "chain.invoke(\n",
    "    {\n",
    "        \"input_language\": \"English\",\n",
    "        \"output_language\": \"German\",\n",
    "        \"input\": \"I love programming.\",\n",
    "    }\n",
    ")"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "d1ee55bc-ffc8-4cfa-801c-993953a08cfd",
   "metadata": {},
   "source": [
    "## Constrained Generation\n",
    "\n",
    "ChatOutlines allows you to apply various constraints to the generated output:\n",
    "\n",
    "### Regex Constraint"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "3a5bb5ca-c3ae-4a58-be67-2cd18574b9a3",
   "metadata": {},
   "outputs": [],
   "source": [
    "model.regex = r\"((25[0-5]|2[0-4]\\d|[01]?\\d\\d?)\\.){3}(25[0-5]|2[0-4]\\d|[01]?\\d\\d?)\"\n",
    "\n",
    "response = model.invoke(\"What is the IP address of Google's DNS server?\")\n",
    "\n",
    "response.content"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "4a5bb5ca-c3ae-4a58-be67-2cd18574b9a3",
   "metadata": {},
   "source": [
    "### Type Constraints"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "5a5bb5ca-c3ae-4a58-be67-2cd18574b9a3",
   "metadata": {},
   "outputs": [],
   "source": [
    "model.type_constraints = int\n",
    "response = model.invoke(\"What is the answer to life, the universe, and everything?\")\n",
    "\n",
    "response.content"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "6a5bb5ca-c3ae-4a58-be67-2cd18574b9a3",
   "metadata": {},
   "source": [
    "### Pydantic and JSON Schemas"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "7a5bb5ca-c3ae-4a58-be67-2cd18574b9a3",
   "metadata": {},
   "outputs": [],
   "source": [
    "from pydantic import BaseModel\n",
    "\n",
    "\n",
    "class Person(BaseModel):\n",
    "    name: str\n",
    "\n",
    "\n",
    "model.json_schema = Person\n",
    "response = model.invoke(\"Who are the main contributors to LangChain?\")\n",
    "person = Person.model_validate_json(response.content)\n",
    "\n",
    "person"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "8a5bb5ca-c3ae-4a58-be67-2cd18574b9a3",
   "metadata": {},
   "source": [
    "### Context Free Grammars"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "9a5bb5ca-c3ae-4a58-be67-2cd18574b9a3",
   "metadata": {},
   "outputs": [],
   "source": [
    "model.grammar = \"\"\"\n",
    "?start: expression\n",
    "?expression: term ((\"+\" | \"-\") term)*\n",
    "?term: factor ((\"*\" | \"/\") factor)*\n",
    "?factor: NUMBER | \"-\" factor | \"(\" expression \")\"\n",
    "%import common.NUMBER\n",
    "%import common.WS\n",
    "%ignore WS\n",
    "\"\"\"\n",
    "response = model.invoke(\"Give me a complex arithmetic expression:\")\n",
    "\n",
    "response.content"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "aa5bb5ca-c3ae-4a58-be67-2cd18574b9a3",
   "metadata": {},
   "source": [
    "## LangChain's Structured Output\n",
    "\n",
    "You can also use LangChain's Structured Output with ChatOutlines:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "ba5bb5ca-c3ae-4a58-be67-2cd18574b9a3",
   "metadata": {},
   "outputs": [],
   "source": [
    "from pydantic import BaseModel\n",
    "\n",
    "\n",
    "class AnswerWithJustification(BaseModel):\n",
    "    answer: str\n",
    "    justification: str\n",
    "\n",
    "\n",
    "_model = model.with_structured_output(AnswerWithJustification)\n",
    "result = _model.invoke(\"What weighs more, a pound of bricks or a pound of feathers?\")\n",
    "\n",
    "result"
   ]
  },
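  {
   "cell_type": "markdown",
   "id": "tool-calling-note",
   "metadata": {},
   "source": [
    "## Tool Calling\n",
    "\n",
    "The features table above marks tool calling as supported. Below is a minimal, untested sketch that assumes ChatOutlines follows LangChain's standard `bind_tools` interface; the `GetWeather` schema is a made-up example, and exact behavior depends on your backend and model:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "tool-calling-sketch",
   "metadata": {},
   "outputs": [],
   "source": [
    "from pydantic import BaseModel, Field\n",
    "\n",
    "\n",
    "class GetWeather(BaseModel):\n",
    "    \"\"\"Get the current weather for a city.\"\"\"\n",
    "\n",
    "    city: str = Field(..., description=\"Name of the city\")\n",
    "\n",
    "\n",
    "# Bind the example tool schema via LangChain's standard interface and invoke;\n",
    "# the response's tool_calls (if any) hold the structured call arguments.\n",
    "model_with_tools = model.bind_tools([GetWeather])\n",
    "ai_msg = model_with_tools.invoke(\"What's the weather like in Paris?\")\n",
    "\n",
    "ai_msg.tool_calls"
   ]
  },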
  {
   "cell_type": "markdown",
   "id": "ca5bb5ca-c3ae-4a58-be67-2cd18574b9a3",
   "metadata": {},
   "source": [
    "## API reference\n",
    "\n",
    "For detailed documentation of all ChatOutlines features and configurations, head to the API reference: https://python.langchain.com/api_reference/community/chat_models/langchain_community.chat_models.outlines.ChatOutlines.html\n",
    "\n",
    "## Full Outlines documentation\n",
    "\n",
    "https://dottxt-ai.github.io/outlines/latest/"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.9.9"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}