mirror of https://github.com/hwchase17/langchain.git synced 2025-04-29 20:35:43 +00:00
langchain/libs/cli/langchain_cli/integration_template
Latest commit: 0d2cea747c by Oskar Stark, 2025-03-28 15:13:22 -04:00

docs: streamline LangSmith teasing ()

This can only be reviewed by [hiding whitespaces](https://github.com/langchain-ai/langchain/pull/30302/files?diff=unified&w=1).

The motivation behind this PR is to get my hands on the docs and make the LangSmith teasing short and clear.

Right now I don't know how to do it, but this could be an include in the future.

---------

Co-authored-by: Eugene Yurtsev <eyurtsev@gmail.com>
| Name | Last commit message | Date |
|---|---|---|
| docs | docs: streamline LangSmith teasing () | 2025-03-28 15:13:22 -04:00 |
| integration_template | standard-tests[patch]: require model_name in response_metadata if returns_usage_metadata () | 2025-03-26 12:20:53 -04:00 |
| scripts | multiple: pydantic 2 compatibility, v0.3 () | 2024-09-13 14:38:45 -07:00 |
| tests | docs: standard tests to markdown, load templates from files () | 2024-12-07 01:37:21 +00:00 |
| .gitignore | | |
| LICENSE | cli[patch]: copyright 2024 default () | 2024-02-07 14:52:37 -08:00 |
| Makefile | cli: standard tests in cli, test that they run, skip vectorstore tests () | 2024-12-05 00:38:32 -08:00 |
| pyproject.toml | cli: release 0.0.34 () | 2024-12-05 15:35:49 +00:00 |
| README.md | cli[patch], google-vertexai[patch]: readme template () | 2024-01-23 12:08:17 -07:00 |

# __package_name__

This package contains the LangChain integration with __ModuleName__.

## Installation

```bash
pip install -U __package_name__
```

And you should configure credentials by setting the following environment variables:

  • TODO: fill this out
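As a minimal sketch of the step above: credentials can also be exported from Python before the model is constructed. The variable name `__MODULE_NAME___API_KEY` here is a hypothetical placeholder, not something this template defines — substitute whatever credential your integration actually reads (the TODO above).

```python
import os

# Hypothetical variable name -- replace with the environment variable
# your integration's client actually checks for credentials.
os.environ["__MODULE_NAME___API_KEY"] = "my-api-key"
```

From a shell, the equivalent is `export __MODULE_NAME___API_KEY=my-api-key`.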

## Chat Models

The `Chat__ModuleName__` class exposes chat models from __ModuleName__.

```python
from __module_name__ import Chat__ModuleName__

llm = Chat__ModuleName__()
llm.invoke("Sing a ballad of LangChain.")
```

## Embeddings

The `__ModuleName__Embeddings` class exposes embeddings from __ModuleName__.

```python
from __module_name__ import __ModuleName__Embeddings

embeddings = __ModuleName__Embeddings()
embeddings.embed_query("What is the meaning of life?")
```

## LLMs

The `__ModuleName__LLM` class exposes LLMs from __ModuleName__.

```python
from __module_name__ import __ModuleName__LLM

llm = __ModuleName__LLM()
llm.invoke("The meaning of life is")
```