NLPCloud client integration (#81)

Lots of kwargs! Generation docs here:
https://docs.nlpcloud.com/#generation
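
For a quick sense of what this looks like in use, here is an illustrative sketch. The parameter names mirror the NLPCloud generation docs linked above; which of them the wrapper actually exposes, and with what defaults, is defined in the diff below.

from langchain.llms.nlpcloud import NLPCloud

# Illustrative only: these kwargs follow the NLPCloud generation docs;
# check the wrapper's fields in this diff for the exact names it supports.
llm = NLPCloud(
    max_length=100,
    temperature=0.8,
    top_p=1.0,
    length_no_input=True,
    remove_input=True,
    remove_end_sequence=True,
)
print(llm("Tell me a joke:"))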

This somewhat breaks the paradigm introduced in the LLM base class: the stop sequence here isn't a list, and it should rightfully be set at initialization time, along with the other kwargs that depend on its presence (e.g. remove_end_sequence). Curious whether you'd want to refactor the LLM base class to take out stop as a specific named kwarg?
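
To make the tension concrete, here's a rough sketch (hypothetical, not the code in this diff) of why a per-call stop list doesn't fit cleanly:

from typing import List, Optional

class NLPCloudSketch:
    """Hypothetical illustration of the init-time stop handling described above."""

    def __init__(self, end_sequence: Optional[str] = None, remove_end_sequence: bool = False):
        # NLPCloud takes a single end_sequence string plus flags that depend on it,
        # so the natural place to configure stopping behaviour is construction time.
        self.end_sequence = end_sequence
        self.remove_end_sequence = remove_end_sequence

    def __call__(self, prompt: str, stop: Optional[List[str]] = None) -> str:
        # The base-class convention of passing a list of stop strings per call then
        # has to be either rejected or reconciled with what was set in __init__.
        if stop is not None:
            raise ValueError("stop is configured at initialization for NLPCloud.")
        return ""  # call the NLPCloud generation endpoint here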
Samantha Whitmore
2022-11-08 06:24:23 -08:00
committed by GitHub
parent 6d8a657676
commit efbc03bda8
4 changed files with 149 additions and 1 deletion


@@ -0,0 +1,10 @@
"""Test NLPCloud API wrapper."""
from langchain.llms.nlpcloud import NLPCloud
def test_nlpcloud_call() -> None:
"""Test valid call to nlpcloud."""
llm = NLPCloud(max_length=10)
output = llm("Say foo:")
assert isinstance(output, str)
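
Note that this is a live integration test: it calls the real NLPCloud generation endpoint, so a valid NLPCloud API token has to be available to the wrapper before running it (how the token is supplied, e.g. via an environment variable, is defined in the wrapper's __init__ in this diff).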