convert: Fix detection of LLAMA2
In recent downloads of the LLAMA2 weights, norm_eps is set to 1e-06; this leads convert.py to erroneously consider the model to be LLAMA1 and set the context to 2k tokens. Fix this by extending the existing hack to also check for the 1e-06 value.
This commit is contained in:
parent 2833a6f63c
commit d6d905b242
1 changed file with 1 addition and 1 deletion
@@ -250,7 +250,7 @@ class Params:
         if config.get("rope_theta") == 1000000:
             # CodeLlama
             n_ctx = 16384
-        elif config["norm_eps"] == 1e-05:
+        elif config["norm_eps"] in (1e-05, 1e-06):
             # LLaMA v2
             n_ctx = 4096
         else:
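For context, a minimal standalone sketch of the heuristic as it reads after this change (the function name guess_n_ctx is illustrative; in convert.py this logic sits inline in Params rather than in a separate helper):

def guess_n_ctx(config: dict) -> int:
    # Heuristic context-length detection based on values from config.json.
    if config.get("rope_theta") == 1000000:
        # CodeLlama ships rope_theta = 1e6 and a 16k training context.
        return 16384
    elif config["norm_eps"] in (1e-05, 1e-06):
        # LLaMA v2 checkpoints carry either eps value; both mean a 4k context.
        return 4096
    else:
        # Otherwise assume LLaMA v1 with its 2k context.
        return 2048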