diff --git a/docs/snippets/modules/model_io/models/llms/get_started.mdx b/docs/snippets/modules/model_io/models/llms/get_started.mdx
index 5553a7faa21..54d6a96b930 100644
--- a/docs/snippets/modules/model_io/models/llms/get_started.mdx
+++ b/docs/snippets/modules/model_io/models/llms/get_started.mdx
@@ -43,7 +43,7 @@ llm("Tell me a joke")
 
 ### `generate`: batch calls, richer outputs
 
-`generate` lets you can call the model with a list of strings, getting back a more complete response than just the text. This complete response can include things like multiple top responses and other LLM provider-specific information:
+`generate` lets you call the model with a list of strings, getting back a more complete response than just the text. This complete response can include things like multiple top responses and other LLM provider-specific information:
 
 ```python
 llm_result = llm.generate(["Tell me a joke", "Tell me a poem"]*15)