docs: Update with LCEL examples to Ollama & ChatOllama Integration notebook (#16194)

- **Description:** Updated the Ollama and ChatOllama integration docs notebooks with LCEL chain
examples (a minimal sketch of the pattern follows this list)

- **Issue:** #15664 (I'm a new contributor 😊)

- **Dependencies:** No dependencies

- **Twitter handle:** 
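
Below is a minimal, hedged sketch of the kind of LCEL chain the updated notebook demonstrates; the model name and prompt are illustrative placeholders, and it assumes a local Ollama server with the model already pulled.

```python
# Minimal LCEL chain with ChatOllama (assumes a local Ollama server with
# the "llama2" model already pulled via `ollama pull llama2`).
from langchain_community.chat_models import ChatOllama
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_template("Tell me a short joke about {topic}")
llm = ChatOllama(model="llama2")
chain = prompt | llm | StrOutputParser()

# Run the chain once and print the parsed string output...
print(chain.invoke({"topic": "space travel"}))

# ...or stream chunks as they arrive.
for chunk in chain.stream({"topic": "space travel"}):
    print(chunk, end="", flush=True)
```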

Comments:

- How do I truncate the streamed output in the notebook when it runs on and on,
even for the most basic of prompts? (One possible approach is sketched below.)
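
A hedged sketch of one way to handle this, reusing the `chain` from the sketch above: stop consuming the stream once a character budget is reached (the 400-character limit is an arbitrary illustration), or alternatively cap generation on the Ollama side via its `num_predict` option.

```python
# Stop consuming the stream once a character budget is reached.
# The 400-character limit is an arbitrary illustration.
max_chars = 400
received = 0

for chunk in chain.stream({"topic": "space travel"}):
    print(chunk, end="", flush=True)
    received += len(chunk)
    if received >= max_chars:
        print("\n[output truncated]")
        break
```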

Edit:

Looking forward to feedback @baskaryan

---------

Co-authored-by: Bagatur <baskaryan@gmail.com>
Commit fb41b68ea1 (parent 3b0226b2c6) by KhoPhi, committed by GitHub on 2024-01-23 06:05:59 +00:00.
3 changed files with 402 additions and 189 deletions.

New binary file: docs/static/img/ollama_example_img.jpg (64 KiB).