From c8d4b6b54ee22eb09a7653dd15f462cfd1cfd2d5 Mon Sep 17 00:00:00 2001
From: Ting Sun
Date: Wed, 27 Mar 2024 10:46:07 +0700
Subject: [PATCH] doc: fix outdated default value of batch size

---
 examples/main/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/examples/main/README.md b/examples/main/README.md
index 9c83fd3bf..57058be61 100644
--- a/examples/main/README.md
+++ b/examples/main/README.md
@@ -296,7 +296,7 @@ These options help improve the performance and memory usage of the LLaMA models.
 
 ### Batch Size
 
-- `-b N, --batch-size N`: Set the batch size for prompt processing (default: 512). This large batch size benefits users who have BLAS installed and enabled it during the build. If you don't have BLAS enabled ("BLAS=0"), you can use a smaller number, such as 8, to see the prompt progress as it's evaluated in some situations.
+- `-b N, --batch-size N`: Set the batch size for prompt processing (default: 2048). This large batch size benefits users who have BLAS installed and enabled it during the build. If you don't have BLAS enabled ("BLAS=0"), you can use a smaller number, such as 8, to see the prompt progress as it's evaluated in some situations.
 
 ### Prompt Caching
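
A minimal usage sketch of the option this patch documents, assuming the standard `main` flags `-m`, `-p`, and `-n` (the model path and prompt are placeholders, not part of this patch):

```sh
# Explicitly set the batch size for prompt processing to the documented default of 2048.
# Without BLAS, a smaller value such as -b 8 makes prompt evaluation progress visible.
./main -m models/7B/ggml-model.gguf -p "Why is the sky blue?" -n 128 -b 2048
```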