Create example bash script for LLaMA 2 Chat

Builds on top of PR ggerganov#2304 to create a working script for system
prompt integration with interactive mode.
lionelchg 2023-07-26 21:31:30 +02:00
parent 6df1f5940f
commit e9c17039db
3 changed files with 16 additions and 0 deletions

README.md

@@ -555,6 +555,10 @@ Here is an example of a few-shot interaction, invoked with the command
# custom arguments using a 13B model
./main -m ./models/13B/ggml-model-q4_0.bin -n 256 --repeat_penalty 1.0 --color -i -r "User:" -f prompts/chat-with-bob.txt
# chat with LLaMA 2 chat models (handles the special system and instruction tokens)
# the second argument is the system prompt file and the third is the first user prompt
./examples/chat-llama-2.sh models/llama-2-13b-chat.ggmlv3.q4_0.bin ./prompts/pirate.txt "Hello there"
```
Note the use of `--color` to distinguish between user input and generated text. Other parameters are explained in more detail in the [README](examples/main/README.md) for the `main` example program.
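
For reference, with `./prompts/pirate.txt` as the system prompt file and "Hello there" as the first user message, the initial prompt the script hands to `main` follows the LLaMA 2 chat template and should look roughly like this (a sketch reconstructed from the script's `-p` argument shown below):

```
[INST] <<SYS>>
You are a helpful assistant that speaks pirate
<</SYS>>

Hello there [/INST]
```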

11
examples/chat-llama-2.sh Executable file

@@ -0,0 +1,11 @@
#!/bin/bash
# Usage: ./examples/chat-llama-2.sh <model> <system_prompt_file> <first_user_prompt>
# e.g.:  ./examples/chat-llama-2.sh models/llama-2-13b-chat.ggmlv3.q4_0.bin ./prompts/pirate.txt "Hello there"

# Load the system prompt from the file given as the second argument
SYSTEM_PROMPT=$(cat "$2")

# Assemble the first prompt in the LLaMA 2 chat format (printf turns the \n escapes into real newlines)
FIRST_PROMPT=$(printf '[INST] <<SYS>>\n%s\n<</SYS>>\n\n%s [/INST]' "$SYSTEM_PROMPT" "$3")

# Run the model in interactive mode; --in-prefix/--in-suffix wrap each subsequent user turn in [INST] ... [/INST]
./main -m "$1" -c 4096 -n -1 --in-prefix-bos --in-prefix ' [INST] ' --in-suffix ' [/INST]' -ngl 40 -i \
    -p "$FIRST_PROMPT"

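In interactive mode, each later message the user types is wrapped by the `--in-prefix ' [INST] '` and `--in-suffix ' [/INST]'` flags above (with `--in-prefix-bos` additionally prepending the BOS token to the user's input), so a follow-up turn reaches the model roughly as sketched here:

```
 [INST] <follow-up user message> [/INST]
```
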
1
prompts/pirate.txt Normal file

@@ -0,0 +1 @@
You are a helpful assistant that speaks pirate