Shijie | 37c746d687 | llama : add Qwen support (#4281) | 2023-12-01 20:16:31 +02:00
  * enable qwen to llama.cpp
  * llama : do not GPU split bias tensors
  Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>

Georgi Gerganov | 0e89203b51 | speculative : add tree-based sampling example (#3624) | 2023-10-18 16:21:57 +03:00
  * sampling : one sequence per sampling context
  * speculative : add tree-based sampling support
  * speculative : reuse the n_parallel CLI param
  * speculative : refactor sampling
  * examples : fix build after sampling refactoring
  * batched : fix n_seq_id
  * sampling : fix malloc
  * swift : fix build
  * swift : try to fix build
  * prompts : add assistant.txt
  * common : add llama_batch_add() and llama_batch_clear() helpers
  * speculative : minor refactor
  * minor : comments + rename
  * speculative : fix off-by-one for n_drafted
  * speculative : fix the n_drafted fix + p constants

Georgi Gerganov | 6b3ae4da92 | prompts : add mnemonics.txt | 2023-10-12 09:35:30 +03:00

Georgi Gerganov | 0c731ca403 | prompts : fix editorconfig checks after #3416 | 2023-10-06 16:36:32 +03:00

pudepiedj | a8777ad84e | parallel : add option to load external prompt file (#3416) | 2023-10-06 16:16:38 +03:00
  * Enable external file and add datestamp
  * Add name of external file at end
  * Upload ToK2024
  * Delete ToK2024.txt
  * Experiments with jeopardy
  * Move ParallelQuestions to /prompts and rename
  * Interim commit
  * Interim commit
  * Final revision
  * Remove trailing whitespace
  * remove cmake_all.sh
  * Remove cmake_all.sh
  * Changed .gitignore
  * Improved reporting and new question files
  * Corrected typo
  * More LLM questions
  * Update LLM-questions.txt
  * Yet more LLM-questions
  * Remove jeopardy results file
  * Reinstate original jeopardy.sh
  * Update examples/parallel/parallel.cpp
  Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>

jameswu2014 | 4c8643dd6e | feature : support Baichuan serial models (#3009) | 2023-09-14 12:32:10 -04:00

CRD716 | b608b55a3e | prompts : model agnostic DAN (#1304) | 2023-05-11 18:10:19 +03:00
  * add model-agnostic dan prompt
  * quick readme update
  * save a token
  * Revert "quick readme update" (reverts commit 8dc342c069)

khimaros | 6daa09d879 | examples : read chat prompts from a template file (#1196) | 2023-05-03 20:58:11 +03:00

CRD716 | a8a2efdc81 | examples : various prompt and example fixes (#1298) | 2023-05-03 18:26:47 +03:00
  * fix dan.txt
  * miku prompt improvements
  * use common characters

Pavol Rusnak | c85e03d12e | Revert "main : alternative instruct mode (Vicuna support, etc.) (#863)" (#982) | 2023-04-14 22:58:43 +03:00
  This reverts commit f4d277ae17.

Tomáš Pazdiora | f4d277ae17 | main : alternative instruct mode (Vicuna support, etc.) (#863) | 2023-04-14 18:19:17 +03:00
  * Add support for configs, add configurable prefixes / suffixes, deprecate instruct mode, add stop prompt
  * Add multiline mode, update text input.
  * bugfix
  * update implementation
  * typos
  * Change --multiline implementation to be toggled by EOF.
  * bugfix
  * default multiline mode
  * add more configs
  * update formatting
  * apply suggestions

Pavol Rusnak | 82d146df9b | do not force the prompt file to end with a new line (#908) | 2023-04-13 11:33:16 +02:00

Tobias Lütke | a6956b25a1 | add example of re-act pattern (#583) | 2023-03-29 10:10:24 -05:00
  * add example of re-act pattern
  * spelling...
  * fixed whitespace in reverse prompt issue

Georgi Gerganov | ab77d76312 | Add longer DAN prompt for testing big batch numbers | 2023-03-25 16:49:09 +02:00

Georgi Gerganov | 9e1707218a | Add "--instruct" argument for usage with Alpaca (#240) | 2023-03-19 18:37:02 +02:00
  Also start adding prompts in "./prompts"