Add RunnableSequence documentation (#13094)

parent 869df62736
commit b0e8cbe0b3
@@ -1217,16 +1217,97 @@ class RunnableSerializable(Serializable, Runnable[Input, Output]):
class RunnableSequence(RunnableSerializable[Input, Output]):
    """A sequence of runnables, where the output of each is the input of the next.
    RunnableSequence is the most important composition operator in LangChain, as it is
    used in virtually every chain.

    A RunnableSequence can be instantiated directly or, more commonly, by using the `|`
    operator, where either the left or right operand (or both) must be a Runnable.

    Any RunnableSequence automatically supports sync, async, and batch execution.

    The default implementations of `batch` and `abatch` utilize threadpools and
    asyncio.gather, and will be faster than naive invocation of `invoke` or `ainvoke`
    for IO-bound runnables.
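
    As a rough, illustrative sketch (not the actual library implementation;
    `sketch_default_batch` is a hypothetical helper), the default `batch` behaves
    roughly like mapping `invoke` over the inputs with a thread pool:

    .. code-block:: python

        from concurrent.futures import ThreadPoolExecutor

        def sketch_default_batch(runnable, inputs):
            # Run invoke concurrently for each input and collect the results in order.
            with ThreadPoolExecutor() as executor:
                return list(executor.map(runnable.invoke, inputs))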

    Batching is implemented by invoking the batch method on each component of the
    RunnableSequence in order.
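
    Conceptually (again only a simplified sketch, with `sketch_sequence_batch` as a
    hypothetical helper), sequence-level batching passes the whole batch of values
    through each step in turn:

    .. code-block:: python

        def sketch_sequence_batch(sequence, inputs):
            values = inputs
            for step in sequence.steps:
                # Each component maps a list of inputs to a list of outputs.
                values = step.batch(values)
            return values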

    A RunnableSequence preserves the streaming properties of its components, so if all
    components of the sequence implement a `transform` method -- which
    is the method that implements the logic to map a streaming input to a streaming
    output -- then the sequence will be able to stream input to output!

    If any component of the sequence does not implement `transform`, then
    streaming will only begin after that component is run. If there are
    multiple blocking components, streaming begins after the last one.

    Please note: RunnableLambdas do not support `transform` by default! So if
    you need to use a RunnableLambda, be careful about where you place it in a
    RunnableSequence (if you need to use the .stream()/.astream() methods).

    If you need arbitrary logic and need streaming, you can subclass
    Runnable and implement `transform` for whatever logic you need.
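
    For example, here is a minimal sketch (illustrative only; `UpperCaseStreaming`
    is not part of the library) of a streaming-capable Runnable that upper-cases
    text chunk by chunk:

    .. code-block:: python

        from typing import Iterator, Optional

        from langchain.schema.runnable import Runnable, RunnableConfig

        class UpperCaseStreaming(Runnable[str, str]):
            def invoke(
                self, input: str, config: Optional[RunnableConfig] = None
            ) -> str:
                return input.upper()

            def transform(
                self,
                input: Iterator[str],
                config: Optional[RunnableConfig] = None,
                **kwargs,
            ) -> Iterator[str]:
                # Yield each chunk as soon as it arrives, preserving streaming.
                for chunk in input:
                    yield chunk.upper()

    A component like this can sit anywhere in a sequence without blocking
    `.stream()`/`.astream()`.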

    Here is a simple example that uses two functions to illustrate the use of
    RunnableSequence:

    .. code-block:: python

        from langchain.schema.runnable import RunnableLambda

        def add_one(x: int) -> int:
            return x + 1

        def mul_two(x: int) -> int:
            return x * 2

        runnable_1 = RunnableLambda(add_one)
        runnable_2 = RunnableLambda(mul_two)
        sequence = runnable_1 | runnable_2
        # Or equivalently:
        # sequence = RunnableSequence(first=runnable_1, last=runnable_2)
        sequence.invoke(1)
        await sequence.ainvoke(1)

        sequence.batch([1, 2, 3])
        await sequence.abatch([1, 2, 3])

    Here's an example that streams JSON output generated by an LLM:

    .. code-block:: python

        from langchain.chat_models.openai import ChatOpenAI
        from langchain.output_parsers.json import SimpleJsonOutputParser
        from langchain.prompts import PromptTemplate

        prompt = PromptTemplate.from_template(
            'In JSON format, give me a list of {topic} and their '
            'corresponding names in French, Spanish and in a '
            'Cat Language.'
        )

        model = ChatOpenAI()
        chain = prompt | model | SimpleJsonOutputParser()

        async for chunk in chain.astream({'topic': 'colors'}):
            print('-')
            print(chunk, sep='', flush=True)
    """

    # The steps are broken into first, middle, and last solely for type-checking
    # purposes. It allows specifying the `Input` type of the first step and the
    # `Output` type of the last step.
    first: Runnable[Input, Any]
    """The first runnable in the sequence."""
    middle: List[Runnable[Any, Any]] = Field(default_factory=list)
    """The middle runnables in the sequence."""
    last: Runnable[Any, Output]
    """The last runnable in the sequence."""

    @property
    def steps(self) -> List[Runnable[Any, Any]]:
        """All the runnables that make up the sequence in order."""
        return [self.first] + self.middle + [self.last]

    @classmethod