From e9aa6e9a08766876e8cbf9572e7b1384981df2d9 Mon Sep 17 00:00:00 2001
From: pudepiedj
Date: Thu, 5 Oct 2023 11:17:28 +0100
Subject: [PATCH] Yet more LLM-questions

---
 prompts/LLM-questions.txt | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/prompts/LLM-questions.txt b/prompts/LLM-questions.txt
index 9d47283e4..fdf3d52f4 100644
--- a/prompts/LLM-questions.txt
+++ b/prompts/LLM-questions.txt
@@ -18,6 +18,9 @@ In the context of LLMs, what is "RoPe" and what is it used for?
 In the context of LLMs, what is "LoRA" and what is it used for?
 In the context of LLMs, what are weights?
 In the context of LLMs, what are biases?
+In the context of LLMs, what are checkpoints?
+In the context of LLMs, what is "perplexity"?
+In the context of LLMs, what are models?
 In the context of machine-learning, what is "catastrophic forgetting"?
 In the context of machine-learning, what is "elastic weight consolidation (EWC)"?
 In the context of neural nets, what is a hidden layer?