ChatON: Initial go at vicuna chat template in meta.json

Have looked at tokenizer_config.json, the jinja file and the default
hardcoded template in llama.cpp.

This is also one of the models where a Global BoS is needed.

NOTE: Have taken the liberty to also add a SYSTEM: prefix to the
system message, even though default vicuna doesn't seem to need it,
while vicuna-orca does, so that both models can be driven from the
same chat template config. I am assuming the system prefix should
not create any problem even for default vicuna; however, if it does,
one can duplicate the existing vicuna block in chaton_meta.json and
make the system prefix empty in it.
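For reference, such a duplicated block would look roughly like the
sketch below (the "vicuna-nosysprefix" key is just a hypothetical
name; only the system prefix differs from the block added in this
commit):

    "vicuna-nosysprefix": {
        "global": { "begin": "<s>", "end": "" },
        "system": { "begin": "", "prefix": "", "suffix": "\n\n", "end": "" },
        "user": { "begin": "", "prefix": "USER: ", "suffix": "\n", "end": "" },
        "assistant": { "begin": "", "prefix": "ASSISTANT: ", "suffix": "</s>\n", "end": "" },
        "reverse-prompt": "</s>",
        "systemuser-system-has-suffix": true,
        "systemuser-system-has-end": true,
        "systemuser-1st-user-has-begin": true,
        "systemuser-1st-user-has-prefix": true
    }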
HanishKVC 2024-05-05 15:44:39 +05:30
parent 0f8f2a18c2
commit b875b02979

@@ -378,6 +378,35 @@
        "systemuser-system-has-end": true,
        "systemuser-1st-user-has-begin": true,
        "systemuser-1st-user-has-prefix": true
    },
    "vicuna": {
        "global": {
            "begin": "<s>",
            "end": ""
        },
        "system": {
            "begin": "",
            "prefix": "SYSTEM: ",
            "suffix": "\n\n",
            "end": ""
        },
        "user": {
            "begin": "",
            "prefix": "USER: ",
            "suffix": "\n",
            "end": ""
        },
        "assistant": {
            "begin": "",
            "prefix": "ASSISTANT: ",
            "suffix": "</s>\n",
            "end": ""
        },
        "reverse-prompt": "</s>",
        "systemuser-system-has-suffix": true,
        "systemuser-system-has-end": true,
        "systemuser-1st-user-has-begin": true,
        "systemuser-1st-user-has-prefix": true
    }
}
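
For reference, a rough sketch of the prompt this block should produce
for a system message followed by one user turn, assuming ChatON
concatenates begin + prefix + message + suffix + end per role and
emits the global begin (the BoS) once at the very start (the messages
below are placeholders):

    <s>SYSTEM: You are a helpful assistant.

    USER: What is the capital of France?
    ASSISTANT:

Generation of the assistant reply then stops when the reverse-prompt
</s> is produced, and the assistant suffix "</s>\n" is what wraps that
reply when it is fed back as part of the chat history.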