From 618611f4ddd12e90fc6c42ca21b3a5396550285b Mon Sep 17 00:00:00 2001
From: Harrison Chase
Date: Sat, 5 Nov 2022 08:44:37 -0700
Subject: [PATCH] update glossary (#63)

---
 docs/glossary.md | 6 ++++--
 1 file changed, 4 insertions(+), 2 deletions(-)

diff --git a/docs/glossary.md b/docs/glossary.md
index 11621a36ac7..c6f2fc5a6d8 100644
--- a/docs/glossary.md
+++ b/docs/glossary.md
@@ -45,9 +45,11 @@ Resources:
 Combining multiple LLM calls together, with the output of one step being the input to the next.
 
 Resources:
-- [Paper](https://arxiv.org/pdf/2203.06566.pdf)
+- [PromptChainer Paper](https://arxiv.org/pdf/2203.06566.pdf)
+- [Language Model Cascades](https://arxiv.org/abs/2207.10342)
+- [ICE Primer Book](https://primer.ought.org/)
 
-### Mimetic Proxy
+### Memetic Proxy
 
 Encouraging the LLM to respond in a certain way framing the discussion in a context that the model knows of and that will result in that type of response. For example, as a conversation between a student and a teacher.
 
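
For context on the "Chaining" entry touched by this hunk, here is a minimal sketch of the idea: the output of one LLM call is fed in as the input to the next. The `call_llm` helper below is a hypothetical stand-in, not part of this patch or of the library; swap in a real completion client to use it against a model.

```python
def call_llm(prompt: str) -> str:
    # Hypothetical stand-in for a real completion API; returns a canned
    # string so the sketch runs without network access.
    return f"[model output for: {prompt[:40]}...]"


def chain(question: str) -> str:
    # Step 1: ask the model for a draft answer.
    draft = call_llm(f"Answer the following question:\n{question}")
    # Step 2: pass the first output back in as input to a second call.
    return call_llm(f"Improve this answer for clarity:\n{draft}")


if __name__ == "__main__":
    print(chain("What is prompt chaining?"))
```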