Aneesh Joy | 66c27f3120 | Fixed CUBLAS DLL load issue on Windows | 2023-05-31 16:37:55 -07:00
Andrei Betlen | aae6c03e94 | Update llama.cpp | 2023-05-31 16:37:55 -07:00
Andrei Betlen | a83d117507 | Add winmode arg only on Windows if Python version supports it | 2023-05-31 16:37:55 -07:00
Andrei Betlen | 7609c73ee6 | Update llama.cpp (remove min_keep default value) | 2023-05-31 15:56:55 -07:00
Andrei Betlen | 59f80d2a0d | Fix mlock_supported and mmap_supported return type | 2023-05-31 15:56:55 -07:00
Andrei Betlen | 3808a73751 | Fix obscure Windows DLL issue. Closes #208 | 2023-05-31 15:56:55 -07:00
Andrei Betlen | 690588410e | Fix return type | 2023-05-31 15:56:55 -07:00
Andrei Betlen | 4885e55ccd | Fix: runtime type errors | 2023-05-31 15:56:55 -07:00
Andrei Betlen | 0c2fb05361 | Fix: types | 2023-05-31 15:56:55 -07:00
Andrei Betlen | ff31330d7f | Fix candidates type | 2023-05-31 15:56:55 -07:00
Andrei Betlen | 7862b520ec | Fix llama_cpp types | 2023-05-31 15:56:55 -07:00
Andrei Betlen | f20b34a3be | Add return type annotations for embeddings and logits | 2023-05-31 15:56:55 -07:00
Andrei Betlen | 731c71255b | Add types for all low-level API functions | 2023-05-31 15:56:55 -07:00
Andrei Betlen | a439fe1529 | Allow model to tokenize strings longer than context length and set add_bos. Closes #92 | 2023-05-31 15:56:55 -07:00
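Commit a439fe1529 relies on llama.cpp's convention that `llama_tokenize` returns the negative of the required token count when the caller's buffer is too small, so long strings can be handled by retrying with a bigger buffer. A minimal sketch of that resize-and-retry pattern (`fake_llama_tokenize` is a hypothetical stand-in for the real C function, not the repo's code):

```python
from typing import List

def fake_llama_tokenize(text: bytes, buf: List[int], add_bos: bool) -> int:
    """Stand-in for the C API: one 'token' per byte, BOS token id 1.

    Returns the token count on success, or the negative of the required
    buffer size when `buf` is too small (llama.cpp's convention).
    """
    tokens = ([1] if add_bos else []) + list(text)
    if len(tokens) > len(buf):
        return -len(tokens)
    buf[: len(tokens)] = tokens
    return len(tokens)

def tokenize(text: bytes, add_bos: bool = True) -> List[int]:
    buf = [0] * 8                     # deliberately small initial buffer
    n = fake_llama_tokenize(text, buf, add_bos)
    if n < 0:                         # too small: retry with the exact size
        buf = [0] * (-n)
        n = fake_llama_tokenize(text, buf, add_bos)
    return buf[:n]
```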
Don Mahurin | b5531e1435 | low_level_api_chat_cpp.py: Fix missing antiprompt output in chat | 2023-05-31 15:56:55 -07:00
Mug | fb79c567d2 | Fix session loading and saving in low-level example chat | 2023-05-31 15:56:55 -07:00
Mug | 0bf36a77ae | Fix mirostat requiring c_float | 2023-05-31 15:56:55 -07:00
Mug | f8ba031576 | Fix LoRA | 2023-05-31 15:56:55 -07:00
Mug | bbf6848cb0 | Fix wrong logit_bias parsed type | 2023-05-31 15:56:55 -07:00
Mug | 335cd8d947 | Rename postfix to suffix to match upstream | 2023-05-31 15:56:55 -07:00
Mug | 32cf0133c9 | Update low-level examples | 2023-05-31 15:56:55 -07:00
Andrei Betlen | 9e79465b21 | Prefer explicit imports | 2023-05-31 15:56:55 -07:00
Andrei Betlen | d15578e63e | Update llama.cpp (session version) | 2023-05-31 15:56:55 -07:00
Andrei Betlen | c26e9bf1c1 | Update sampling API | 2023-05-31 15:56:55 -07:00
Andrei Betlen | 78531e5d05 | Fix return types and import comments | 2023-05-31 15:56:55 -07:00
Andrei Betlen | d0031edbd2 | Update llama.cpp | 2023-05-31 15:56:55 -07:00
Mug | 441d30811a | Detect multi-byte responses and wait | 2023-05-31 15:56:55 -07:00
Mug | 36b3494332 | Also ignore errors on input prompts | 2023-05-31 15:56:55 -07:00
Andrei Betlen | c8e6ac366a | Update llama.cpp (llama_load_session_file) | 2023-05-31 15:56:55 -07:00
Andrei Betlen | 66ad132575 | Update llama.cpp | 2023-05-31 15:56:55 -07:00
Andrei Betlen | 656190750d | Update llama.cpp | 2023-05-31 15:56:55 -07:00
Andrei Betlen | 80c18cb665 | Update llama.cpp (remove llama_get_kv_cache) | 2023-05-31 15:56:55 -07:00
Andrei Betlen | bf9f02d8ee | Update llama.cpp | 2023-05-31 15:56:55 -07:00
Andrei Betlen | 5bbf40aa47 | Update llama.cpp | 2023-05-31 15:56:55 -07:00
Mug | fd64310276 | Fix decode errors permanently | 2023-05-31 15:56:55 -07:00
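The multi-byte and decode-error commits above (441d30811a, 36b3494332, fd64310276) address one underlying problem: generated tokens arrive as raw bytes, and a multi-byte UTF-8 character can be split across two tokens, so a naive per-token `bytes.decode()` fails mid-character. A sketch of the buffering approach using Python's incremental decoder (an illustration of the technique, not the repo's exact code; `errors="ignore"` mirrors the ignore-errors commits):

```python
import codecs

def stream_chunks(chunks):
    """Decode a stream of byte chunks, holding back incomplete UTF-8 sequences."""
    decoder = codecs.getincrementaldecoder("utf-8")(errors="ignore")
    out = []
    for chunk in chunks:
        text = decoder.decode(chunk)  # returns "" until the sequence completes
        if text:
            out.append(text)
    out.append(decoder.decode(b"", final=True))  # flush; drops a truncated tail
    return "".join(out)
```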
Mug | bdbaf5dc76 | Fix wrong end-of-text type, and fix n_predict behaviour | 2023-05-31 15:56:55 -07:00
Andrei Betlen | 81c4c10389 | Update type signature to allow a null pointer to be passed | 2023-05-31 15:56:55 -07:00
Mug | 8229410a4e | More reasonable defaults | 2023-05-31 15:56:55 -07:00
Andrei Betlen | b6ce5133d9 | Add bindings for LoRA adapters. Closes #88 | 2023-05-31 15:56:55 -07:00
Andrei Betlen | 3693449c07 | Update llama.cpp | 2023-05-31 15:56:55 -07:00
Andrei Betlen | d595f330e2 | Update llama.cpp | 2023-05-31 15:56:55 -07:00
Andrei Betlen | ce0ca60b56 | Update llama.cpp (llama_mmap_supported) | 2023-05-31 15:56:49 -07:00
Mug | d0a7ce9abf | Make Windows users happy (hopefully) | 2023-05-31 15:25:57 -07:00
Mug | 848b4021a3 | Better custom library debugging | 2023-05-31 15:25:57 -07:00
Mug | c8b5d0b963 | Use environment variable for library override | 2023-05-31 15:25:57 -07:00
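The library-loading commits (c8b5d0b963 for the env-var override, a83d117507 for the version-gated winmode argument) together describe how the shared llama library is located and opened. A minimal sketch of that strategy; the variable name `LLAMA_CPP_LIB` and the `winmode=0` value are assumptions about details the log does not state:

```python
import ctypes
import os
import pathlib
import sys

def find_library_path(base_dir: pathlib.Path) -> pathlib.Path:
    """Resolve the shared llama library, honouring an env-var override."""
    override = os.environ.get("LLAMA_CPP_LIB")  # assumed variable name
    if override:
        return pathlib.Path(override)
    # Fall back to the platform's conventional shared-library suffix.
    suffix = {"win32": ".dll", "darwin": ".dylib"}.get(sys.platform, ".so")
    return base_dir / f"libllama{suffix}"

def load_shared_library(path: pathlib.Path) -> ctypes.CDLL:
    """Open the library, passing winmode only where it exists.

    ctypes.CDLL gained the winmode parameter in Python 3.8, and it is only
    meaningful on Windows, hence the version and platform gate.
    """
    cdll_kwargs = {}
    if sys.platform == "win32" and sys.version_info >= (3, 8):
        cdll_kwargs["winmode"] = 0  # assumed value; controls DLL search flags
    return ctypes.CDLL(str(path), **cdll_kwargs)
```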
Mug | d1b3517477 | Allow local llama library usage | 2023-05-31 15:25:57 -07:00
Mug | b36c04c99e | Add iterative search to prevent instructions from being echoed; add ignore-eos and no-mmap options; fix bug echoing one character too many | 2023-05-31 15:25:57 -07:00
Andrei Betlen | f25a81309e | Update model paths to make clear they should point to a file | 2023-05-31 15:25:57 -07:00
Mug | e19909249d | More interoperability with the original llama.cpp; arguments now work | 2023-05-31 15:25:57 -07:00
Andrei Betlen | d5680144c5 | Bugfix: wrong size of embeddings. Closes #47 | 2023-05-31 15:25:57 -07:00