convert.py : fix llama/llama2 conversion due to vocab_size=-1 - take 2

PR #4818 (merged last week) reintroduced a `vocab_size` config check whose failure mode had already been fixed in PR #4258 (merged 2023-11-30).

Without the fix, llama2 models can't be converted. The error is:

`ValueError: The model's vocab size is set to -1 in params.json. Please update it manually. Maybe 32000?`
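The failure mode can be sketched in a few lines (illustrative only, not the actual convert.py code paths): `dict.get` only falls back to its default when the key is *missing*, so a `params.json` that explicitly sets `"vocab_size": -1` returns -1 as-is, and the embedding-shape fallback never fires.

```python
# Sketch of the bug, assuming a params.json shaped like llama2's.
tok_embeddings_rows = 32000  # rows of model["tok_embeddings.weight"]

# Some llama2 checkpoints ship params.json with vocab_size set to -1:
config = {"dim": 4096, "n_layers": 32, "vocab_size": -1}

# Pre-fix behaviour: the key exists, so .get returns -1 and the
# fallback default is never used.
n_vocab_old = config.get("vocab_size", tok_embeddings_rows)
assert n_vocab_old == -1  # invalid vocab size slips through

# Post-fix behaviour (this commit): trust the embedding tensor's shape.
n_vocab_new = tok_embeddings_rows
assert n_vocab_new == 32000
```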
David Sommers 2024-01-18 11:35:02 -05:00
parent ad19812cda
commit 2c36544741


@@ -348,7 +348,7 @@ class Params:
             f_rope_freq_base = 1e6
         return Params(
-            n_vocab=config.get("vocab_size", model["tok_embeddings.weight"].shape[0]),
+            n_vocab=model["tok_embeddings.weight"].shape[0],
             n_embd=config["dim"],
             n_layer=config["n_layers"],
             n_ctx=n_ctx,