llama : fix session saving/loading (#3400)

* llama : fix session saving/loading

* llama : temp fix for clearing "future" tokens from the KV cache

* llama : fix handling of "future" tokens when loading sessions

* llama : fix comments for llama_kv_cache API
Georgi Gerganov 2023-10-03 21:04:01 +03:00 committed by GitHub
parent 48be797ffb
commit ac2219fef3
7 changed files with 106 additions and 59 deletions

@@ -543,6 +543,9 @@ int main(int argc, char ** argv) {
                 if (i > 0) {
                     embd.erase(embd.begin(), embd.begin() + i);
                 }
+
+                // remove any "future" tokens that we might have inherited from the session from the KV cache
+                llama_kv_cache_tokens_rm(ctx, n_past, -1);
             }

             // evaluate tokens in batches
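For context, below is a minimal sketch (not the exact main.cpp logic) of where this trim fits when resuming from a saved session: the session file may contain more tokens than the part of the prompt that actually matches, and those extra "future" tokens must be removed from the KV cache before evaluating new tokens. The function and variable names (resume_from_session, embd_inp, session_tokens) are illustrative; llama_load_session_file and llama_kv_cache_tokens_rm are the API as declared in llama.h at this commit.

#include <vector>
#include "llama.h"

static void resume_from_session(llama_context * ctx, const char * path_session,
                                const std::vector<llama_token> & embd_inp) {
    // load the tokens that were saved together with the KV cache state
    std::vector<llama_token> session_tokens(embd_inp.size() + 1);
    size_t n_token_count = 0;
    if (!llama_load_session_file(ctx, path_session,
                                 session_tokens.data(), session_tokens.size(),
                                 &n_token_count)) {
        return; // no usable session file
    }
    session_tokens.resize(n_token_count);

    // reuse only the longest prefix of the session that matches the current prompt
    size_t n_past = 0;
    while (n_past < session_tokens.size() && n_past < embd_inp.size() &&
           session_tokens[n_past] == embd_inp[n_past]) {
        n_past++;
    }

    // drop any "future" tokens inherited from the session: everything in the
    // KV cache from position n_past onward (c1 = -1 means "up to the end")
    llama_kv_cache_tokens_rm(ctx, (int32_t) n_past, -1);

    // ... evaluate the remaining prompt tokens starting at n_past ...
}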