From 016548dc6074b3413deaeed32d628101270633d9 Mon Sep 17 00:00:00 2001
From: ltoniazzi
Date: Mon, 10 Jun 2024 22:54:51 +0100
Subject: [PATCH] first notes

---
 NOTES.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/NOTES.md b/NOTES.md
index 9deb5bcd7..634e7aadc 100644
--- a/NOTES.md
+++ b/NOTES.md
@@ -4,4 +4,4 @@
 2. How to wrap Lauren's suggestion on matmul (need to see how to find the LoRA info to pick up). Something about the LoRA being loaded in the context? How to pick a specific LoRA
 3. Check the PR: "It was removed in [#7204](https://github.com/ggerganov/llama.cpp/pull/7204)". `convert-lora-to-ggml.py` seems to write LoRAs to GGUF without the base model? Should check the train script and see how it matches LoRA layers with base layers
 4. https://github.com/ggerganov/llama.cpp/discussions/3489
-5. check lora example in examples
\ No newline at end of file
+5. Check the LoRA example in `examples/export-lora/export-lora.cpp`, ask GPT whether it can be extended to apply multiple LoRAs, then follow up with Lauren
\ No newline at end of file