75a1ef4304  kokansei  2024-05-22 20:46:52 +03:00
    Add DRY Samplers to ST Staging (#2211)

    * Add files via upload
    * Add files via upload
    * Delete public/index.html
    * Add files via upload
    * Delete public/scripts/textgen-settings.js
    * Add files via upload
    * Delete public/scripts/power-user.js
    * Add files via upload
    * Delete public/scripts/power-user.js
    * Add files via upload
    * Update power-user.js
    * Update index.html
    * Fix control attribution
    * Fix app loading
    * Put sequence breakers under DRY block
    * DRY for DRY
    * Update public/index.html
      Co-authored-by: Philipp Emanuel Weidmann <pew@worldwidemann.com>
    * Merge fix
    * Add llamacpp control. Add default value for sequence breakers
    * Forgot reset
    ---------
    Co-authored-by: Cohee <18619528+Cohee1207@users.noreply.github.com>
    Co-authored-by: Philipp Emanuel Weidmann <pew@worldwidemann.com>

74b6ed97c2  kingbri  2024-05-22 00:09:10 -04:00
    Textgen: Add repetition decay for TabbyAPI

    Repetition decay softens the drop-off for repetition penalty. It's
    best paired with rep pen range.
    Signed-off-by: kingbri <bdashore3@proton.me>
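
For context, a rough sketch of the kind of TabbyAPI request body this setting feeds into. The field names (repetition_penalty, repetition_range, repetition_decay) and the endpoint are assumptions for illustration, not taken from this commit.

```js
// Hypothetical TabbyAPI completion payload illustrating repetition decay.
// Field names and endpoint are assumptions; check TabbyAPI's docs for the real schema.
const payload = {
    prompt: 'Once upon a time',
    max_tokens: 200,
    repetition_penalty: 1.15,   // base penalty for repeated tokens
    repetition_range: 2048,     // only the last N tokens are penalized ("rep pen range")
    repetition_decay: 256,      // penalty fades out over this many tokens instead of a hard cutoff
};

const response = await fetch('http://127.0.0.1:5000/v1/completions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json', 'x-api-key': 'your-api-key' },
    body: JSON.stringify(payload),
});
console.log((await response.json()).choices?.[0]?.text);
```
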
99d143263d  kingbri  2024-05-21 23:48:33 -04:00
    Textgen: Add skew sampling

    Adds the skew sampling option from exllamaV2.
    Signed-off-by: kingbri <bdashore3@proton.me>

a12df762a0  kingbri  2024-05-21 23:37:36 -04:00
    Textgen: Add speculative_ngram for TabbyAPI

    Speculative ngram allows for a different method of speculative
    decoding. Using a draft model is still preferred.
    Signed-off-by: kingbri <bdashore3@proton.me>

ee913be46b  Cohee  2024-05-19 14:23:07 +03:00
    Merge pull request #2266 from sasha0552/vllm-fixes

    vLLM fixes

c7232ae23c  RossAscends  2024-05-19 15:06:29 +09:00
    WIP textgen API custom sampler display

db5e2d95c2  sasha0552  2024-05-19 04:34:11 +00:00
    vLLM fixes

    * Enable seed field for vLLM
    * Enable beam search for vLLM
    * Set the default length penalty to 1
      (There is a validation error from vLLM when beam search is disabled and the value is not equal to 1)
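
The length-penalty note above implies a guard along these lines: only send a non-default length_penalty when beam search is actually enabled. This is an illustrative sketch, not SillyTavern's actual code; the `settings` shape is made up, while use_beam_search and length_penalty mirror vLLM's sampling parameters.

```js
// Illustrative sketch: build vLLM sampling params so length_penalty only
// deviates from 1 when beam search is enabled, avoiding vLLM's validation error.
// The `settings` shape is made up for this example.
function buildVllmParams(settings) {
    const params = {
        seed: settings.seed ?? -1,
        use_beam_search: Boolean(settings.useBeamSearch),
        length_penalty: 1, // required default while beam search is off
    };

    if (params.use_beam_search) {
        params.length_penalty = settings.lengthPenalty ?? 1;
    }

    return params;
}

console.log(buildVllmParams({ seed: 42, useBeamSearch: false, lengthPenalty: 1.2 }));
// -> { seed: 42, use_beam_search: false, length_penalty: 1 }
```
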
4227968dfa  Cohee  2024-05-18 18:50:48 +03:00
    Allow using JSON schema with llamacpp server

c7d75b7789  Cohee  2024-05-12 21:41:07 +03:00
    llamacpp broke

27ccc6b090  Cohee  2024-05-11 11:38:22 +03:00
    Minor stylistic changes

62faddac8d  kingbri  2024-05-11 00:58:29 -04:00
    Textgen: Add banned_strings

    TabbyAPI supports the ability to ban the presence of strings during
    a generation. Add this support in SillyTavern by handling lines
    enclosed in quotes as a special case.
    Signed-off-by: kingbri <bdashore3@proton.me>
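
A rough sketch of the quote handling described above: lines wrapped in double quotes become banned strings, and every other line keeps its existing meaning (for example, banned token IDs). The function and field names are illustrative, not the actual implementation.

```js
// Illustrative sketch of splitting a "banned tokens" textarea into banned
// strings (lines enclosed in quotes) and the remaining entries. Names are
// assumptions, not the actual SillyTavern code.
function splitBannedInput(text) {
    const bannedStrings = [];
    const otherEntries = [];

    for (const rawLine of text.split('\n')) {
        const line = rawLine.trim();
        if (!line) continue;

        if (line.length >= 2 && line.startsWith('"') && line.endsWith('"')) {
            // Quoted line: ban the literal string (quotes stripped).
            bannedStrings.push(line.slice(1, -1));
        } else {
            // Unquoted line: keep the existing behavior (e.g. token IDs).
            otherEntries.push(line);
        }
    }

    return { bannedStrings, otherEntries };
}

// Example:
// splitBannedInput('"as an AI"\n[27,49]')
// -> { bannedStrings: ['as an AI'], otherEntries: ['[27,49]'] }
```
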
c73bfbd7b0  Cohee  2024-05-06 21:21:03 +03:00
    Safari bruh moment

7063fce2af  Cohee  2024-05-06 19:26:20 +03:00
    Selectable openrouter providers

05db2552b3  Cohee  2024-05-04 02:37:05 +03:00
    Fix Top K disabled state for Infermatic.

    Also an icon.

7bfd666321  Cohee  2024-05-03 23:59:39 +03:00
    Add llama 3 tokenizer

7b87f44518  Cohee  2024-05-03 20:02:13 +03:00
    Clean-up API-specific settings

2bd239fe81  sasha0552  2024-05-02 22:40:40 +00:00
    Initial vLLM support

022c180b62  Cohee  2024-04-15 00:39:15 +03:00
    Lint and clean-up

3e60919289  Cohee  2024-04-14 17:13:54 +03:00
    Specify LLM prompt in case JSON schema is not supported

b8b49f0012  kingbri  2024-04-09 22:15:00 -04:00
    TextgenSettings: Fix JSON schema fallback

    Did not fall back if the provided string was empty, resulting in
    errors.
    Signed-off-by: kingbri <bdashore3@proton.me>

51b3b8bfaa  Cohee  2024-04-02 14:56:15 +03:00
    Add smooth streaming

04edf32ef0  Cohee  2024-04-02 11:29:49 +03:00
    Do not send dynatemp to backends if disabled

0b76e1d350  Cohee  2024-04-02 11:23:29 +03:00
    Fix schema not loading from presets. Fix ESLint warnings

5210db5679  kingbri  2024-04-02 01:01:59 -04:00
    Format

    Signed-off-by: kingbri <bdashore3@proton.me>

4f0322351e  kingbri  2024-04-02 00:59:21 -04:00
    Sampling: Add ability to send JSON schemas

    TabbyAPI supports the ability to send JSON schemas with prompts in
    addition to EBNF strings supported by outlines. Add an extra box
    for TabbyAPI only.
    Signed-off-by: kingbri <bdashore3@proton.me>
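
A sketch of the kind of request this enables against TabbyAPI. Treat the json_schema field name and the endpoint as assumptions for illustration; only the general idea (a JSON Schema constraining the completion, as an alternative to an EBNF grammar string) comes from the commit.

```js
// Illustrative TabbyAPI completion request with a JSON schema constraining the
// output. The json_schema field name is an assumption; consult TabbyAPI's docs.
const schema = {
    type: 'object',
    properties: {
        name: { type: 'string' },
        mood: { type: 'string', enum: ['happy', 'sad', 'angry'] },
    },
    required: ['name', 'mood'],
};

const res = await fetch('http://127.0.0.1:5000/v1/completions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json', 'x-api-key': 'your-api-key' },
    body: JSON.stringify({
        prompt: 'Describe the character as JSON:',
        max_tokens: 150,
        json_schema: schema, // constrained generation instead of an EBNF grammar string
    }),
});
console.log((await res.json()).choices?.[0]?.text);
```
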
a3ec0938c5  Cohee  2024-03-29 17:28:28 +02:00
    KoboldCpp grammar fix

9bd3a526aa  Alexander Abushady  2024-03-26 23:57:24 -04:00
    Fix for unique swipes

    Fix for unique swipes in Aphrodite.

8b092adc14  50h100a  2024-03-25 12:25:03 -04:00
    Use mode enum to toggle dynatemp behavior.

df805d692b  Cohee  2024-03-24 21:42:27 +02:00
    Fix some code

6f7e7b85ab  50h100a  2024-03-24 14:45:37 -04:00
    For Mancer:

    - Allow logprobs (works)
    - Allow multiswipe (not yet)
    - Adjust visible samplers
    Fix: 0 logprob is 100% chance, handle accordingly.
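
The logprob note refers to the fact that a log-probability of 0 means e^0 = 1, i.e. a 100% chance, so it must not be treated as a missing or zero value. A minimal illustration of the conversion (not the actual SillyTavern code):

```js
// Convert a log-probability to a display percentage. Math.exp(0) === 1, so a
// logprob of exactly 0 must render as 100%, not be discarded as falsy.
function logprobToPercent(logprob) {
    return Math.exp(logprob) * 100;
}

console.log(logprobToPercent(0));      // 100
console.log(logprobToPercent(-0.105)); // ~90.0
console.log(logprobToPercent(-2.3));   // ~10.0
```
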
c8f84bd413  Cohee  2024-03-19 01:38:55 +02:00
    Textgen setting refactors

dc74f546d5  Cohee  2024-03-17 02:20:51 +02:00
    Merge pull request #1875 from kalomaze/cubic-curve

    smoothing_curve UI support

a0279b636b  Cohee  2024-03-08 08:41:54 +02:00
    Remove dead code

2cdfda9d69  Cohee  2024-03-08 08:40:03 +02:00
    Actually use getCurrentDreamGenModelTokenizer

bc8d41b530  DreamGenX  2024-03-07 17:28:38 +01:00
    Implement suggestions

5c410986a4  DreamGenX  2024-03-07 12:25:48 +01:00
    Add support for DreamGen API.

    API docs: https://dreamgen.com/docs/models/opus/v1
    API keys: https://dreamgen.com/account/api-keys
    I decided to base this on the text-completion API since it's more
    flexible with SillyTavern's prompt formatting capabilities.
    This also means that custom context and instruct settings are required.
    Will add documentation in a follow-up PR.

95c49029f7  Cohee  2024-03-01 23:02:43 +02:00
    Add aphrodite model selector

45776de1d5  kalomaze  2024-03-01 00:06:34 -06:00
    Smoothing curve support for ooba

76669ff8bb  gabriel dhimoila  2024-02-29 00:55:25 +01:00
    add max_tokens_second

f962ad5c02  Cohee  2024-02-25 22:47:07 +02:00
    Add OpenRouter as a text completion source

fc289126fa  Cohee  2024-02-24 21:45:33 +02:00
    Add event type for text completion generation request settings ready

d140b8d5be  Cohee  2024-02-24 20:10:53 +02:00
    Parse non-streaming tabby logprobs

8848818d67  Cohee  2024-02-24 15:32:12 +02:00
    Fix dynatemp neutralization

299bd9d563  Cohee  2024-02-24 15:10:58 +02:00
    Merge branch 'staging' into llamacpp-sampler-order

13aebc623a  Cohee  2024-02-24 15:06:28 +02:00
    Merge pull request #1854 from deciare/llamacpp-probs

    Request and display token probabilities from llama.cpp backend

9287ff18de  Cohee  2024-02-24 14:50:06 +02:00
    Fix for non-streaming

9eba076ae4  Deciare  2024-02-23 23:01:04 -05:00
    Sampler order for llama.cpp server backend

936fbac6c5  Deciare  2024-02-23 17:45:54 -05:00
    Merge remote-tracking branch 'origin/staging' into llamacpp-probs

344b9eedbc  Deciare  2024-02-23 14:01:46 -05:00
    Request token probabilities from llama.cpp backend

    llama.cpp server token probabilities are given as values ranging from
    0 to 1 instead of as logarithms.
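
Since the llama.cpp server reports plain probabilities in [0, 1] while other backends report log-probabilities, a normalization step along these lines is implied. This is an illustrative sketch, not the actual code; the tok_str/prob field names reflect the llama.cpp server's completion_probabilities shape at the time, but treat the details as assumptions.

```js
// Normalize llama.cpp's plain probabilities (0..1) into log-probabilities so
// they can be displayed alongside backends that already return logprobs.
function probToLogprob(prob) {
    // Guard against log(0); clamp to a tiny positive value instead.
    return Math.log(Math.max(prob, Number.EPSILON));
}

const candidates = [
    { tok_str: 'Hello', prob: 0.72 },
    { tok_str: 'Hi', prob: 0.18 },
    { tok_str: 'Hey', prob: 0.10 },
];

const logprobs = candidates.map(c => ({ token: c.tok_str, logprob: probToLogprob(c.prob) }));
console.log(logprobs); // e.g. { token: 'Hello', logprob: -0.328... }
```
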
f569424f3e  NWilson  2024-02-22 08:32:10 -06:00
    Merge branch 'staging' into InfermaticAI