Mirror of https://github.com/jart/cosmopolitan.git, synced 2025-09-10 02:33:49 +00:00
Use Companion AI in llama.com by default
parent d9e27203d4
commit 3dac9f8999
8 changed files with 310 additions and 193 deletions
third_party/ggml/README.cosmo (vendored) | 1
@@ -19,6 +19,7 @@ LOCAL CHANGES

- Make it possible for loaded prompts to be cached to disk
- Introduce -v and --verbose flags
- Reduce batch size from 512 to 32
- Allow --n_keep to specify a substring of prompt
- Don't print stats / diagnostics unless -v is passed
- Reduce --top_p default from 0.95 to 0.70
- Change --reverse-prompt to no longer imply --interactive
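Taken together, these notes describe llama.com's local command-line behavior. As a rough usage sketch only (the model file name and prompt string below are hypothetical, and the -m/-p options are assumed to follow upstream llama.cpp conventions; only the flags named in the notes above come from this diff):

    ./llama.com -m llama.bin -p "User: hello" \
        -v --top_p 0.70 --n_keep "User:" --reverse-prompt "User:" --interactive

Per the notes, -v re-enables the stats and diagnostics that are otherwise suppressed, --top_p 0.70 restates the lowered default, --n_keep names a substring of the prompt to retain, and --interactive must be passed explicitly because --reverse-prompt no longer implies it.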