{
 "cells": [
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Titan Takeoff Pro\n",
    "\n",
    "`TitanML` helps businesses build and deploy better, smaller, cheaper, and faster NLP models through our training, compression, and inference optimization platform.\n",
    "\n",
    ">Note: These docs are for the Pro version of Titan Takeoff. For the community version, see the page for Titan Takeoff.\n",
    "\n",
    "Our inference server, [Titan Takeoff (Pro Version)](https://docs.titanml.co/docs/titan-takeoff/pro-features/feature-comparison), enables deployment of LLMs locally on your hardware in a single command. Most generative model architectures are supported, such as Falcon, Llama 2, GPT2, T5, and many more."
   ]
  },
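  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Installation\n",
    "To run the examples below you need the `langchain-community` package, which provides the `TitanTakeoffPro` integration:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "%pip install --upgrade --quiet langchain-community"
   ]
  },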
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Example usage\n",
    "Here are some helpful examples to get started using the Pro version of Titan Takeoff Server.\n",
"No parameters are needed by default, but a baseURL that points to your desired URL where Takeoff is running can be specified and generation parameters can be supplied."
|
|
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
"from langchain.callbacks.manager import CallbackManager\n",
|
|
"from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler\n",
|
|
"from langchain.prompts import PromptTemplate\n",
|
|
"from langchain_community.llms import TitanTakeoffPro\n",
|
|
"\n",
|
|
"# Example 1: Basic use\n",
|
|
"llm = TitanTakeoffPro()\n",
|
|
"output = llm(\"What is the weather in London in August?\")\n",
|
|
"print(output)\n",
|
|
"\n",
|
|
"\n",
|
|
"# Example 2: Specifying a port and other generation parameters\n",
|
|
"llm = TitanTakeoffPro(\n",
|
|
" base_url=\"http://localhost:3000\",\n",
|
|
" min_new_tokens=128,\n",
|
|
" max_new_tokens=512,\n",
|
|
" no_repeat_ngram_size=2,\n",
|
|
" sampling_topk=1,\n",
|
|
" sampling_topp=1.0,\n",
|
|
" sampling_temperature=1.0,\n",
|
|
" repetition_penalty=1.0,\n",
|
|
" regex_string=\"\",\n",
|
|
")\n",
|
|
"output = llm(\"What is the largest rainforest in the world?\")\n",
|
|
"print(output)\n",
|
|
"\n",
|
|
"\n",
|
|
"# Example 3: Using generate for multiple inputs\n",
|
|
"llm = TitanTakeoffPro()\n",
|
|
"rich_output = llm.generate([\"What is Deep Learning?\", \"What is Machine Learning?\"])\n",
|
|
"print(rich_output.generations)\n",
|
|
"\n",
|
|
"\n",
|
|
"# Example 4: Streaming output\n",
|
|
"llm = TitanTakeoffPro(\n",
|
|
" streaming=True, callback_manager=CallbackManager([StreamingStdOutCallbackHandler()])\n",
|
|
")\n",
|
|
"prompt = \"What is the capital of France?\"\n",
|
|
"llm(prompt)\n",
|
|
"\n",
|
|
"# Example 5: Using LCEL\n",
|
|
"llm = TitanTakeoffPro()\n",
|
|
"prompt = PromptTemplate.from_template(\"Tell me about {topic}\")\n",
|
|
"chain = prompt | llm\n",
|
|
"chain.invoke({\"topic\": \"the universe\"})"
|
|
]
|
|
}
|
|
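  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a follow-up to Example 5, the same LCEL chain can also be streamed chunk by chunk. The sketch below is illustrative: `.stream()` is the generic LCEL interface available on any chain, not a Takeoff-specific API, and it assumes a Takeoff server is reachable at the default `base_url`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Illustrative sketch: stream the LCEL chain from Example 5 chunk by chunk.\n",
    "# Assumes a Takeoff server is already running at the default base_url.\n",
    "llm = TitanTakeoffPro()\n",
    "prompt = PromptTemplate.from_template(\"Tell me about {topic}\")\n",
    "chain = prompt | llm\n",
    "for chunk in chain.stream({\"topic\": \"the universe\"}):\n",
    "    print(chunk, end=\"\", flush=True)"
   ]
  }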
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.10.12"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}