ccurme
481d3855dc
patch: remove usage of llm, chat model __call__ (#20788)
- `llm(prompt)` -> `llm.invoke(prompt)`
- `llm(prompt=prompt)` -> `llm.invoke(prompt)` (same with `messages=`)
- `llm(prompt, callbacks=callbacks)` -> `llm.invoke(prompt, config={"callbacks": callbacks})`
- `llm(prompt, **kwargs)` -> `llm.invoke(prompt, **kwargs)`
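The mappings above can be illustrated with a minimal stand-in class; `FakeLLM` below is hypothetical and only sketches the call-site change (deprecated `__call__` style vs. `.invoke()` with callbacks moved into `config`), not the real LangChain API surface.

```python
# Hedged sketch: FakeLLM is a made-up stand-in, not a real LangChain class.
# It only demonstrates the migrated call shapes from the commit message.
class FakeLLM:
    def invoke(self, input, config=None, **kwargs):
        # Callbacks now arrive via config={"callbacks": [...]}
        for cb in (config or {}).get("callbacks", []):
            cb(input)
        return f"echo: {input}"

llm = FakeLLM()

# Old: llm(prompt)                    New:
print(llm.invoke("hello"))           # echo: hello

# Old: llm(prompt, callbacks=cbs)    New:
seen = []
llm.invoke("hi", config={"callbacks": [seen.append]})
print(seen)                          # ['hi']
```

The key design point of the migration is that per-call options like `callbacks` move out of the call signature and into the `config` dict, leaving `**kwargs` for model parameters.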
2024-04-24 19:39:23 -04:00