Fix inference example lacks required parameters (#9035)
Signed-off-by: Aisuko <urakiny@gmail.com>
parent 23fd453544
commit c8ddce8560
1 changed file with 1 addition and 1 deletion
@@ -34,7 +34,7 @@ Run the quantized model:
 ```bash
 # start inference on a gguf model
-./llama-cli -m ./models/mymodel/ggml-model-Q4_K_M.gguf -n 128
+./llama-cli -m ./models/mymodel/ggml-model-Q4_K_M.gguf -cnv -p "You are a helpful assistant"
 ```
 
 When running the larger models, make sure you have enough disk space to store all the intermediate files.