Commit Graph

186 Commits

Author SHA1 Message Date
Cohee 50de678980 Hide beam search for vllm. It never worked. 2024-09-14 16:53:21 +03:00
Cohee f0d361bc7a Remove unused beam search 2024-09-14 16:41:22 +03:00
Cohee 28837ff883 Hard code include_stop_str_in_output 2024-09-14 16:32:50 +03:00
AlpinDale 1cc935796f fix early_stopping 2024-09-14 12:45:29 +00:00
AlpinDale fde76069e0 remove beam search 2024-09-14 12:42:21 +00:00
AlpinDale 9c94348491 clean up 2024-09-14 12:38:19 +00:00
AlpinDale efd477da04 chore: slight refactor of aphrodite samplers 2024-09-13 10:34:06 +00:00
Cohee 7534e137ae Parse Tabby streaming error. 2024-09-08 22:24:03 +03:00
Cohee 42fa3c79d7 Add Tabby model selection 2024-09-08 22:23:25 +03:00
Cohee b16915cfb9 Remove truncation_length from textgen settings 2024-09-03 14:06:10 +00:00
Cohee ae2d0f04ed Add XTC for koboldcpp 2024-08-31 20:18:51 +03:00
Cohee 696c83f96d [chore] Fix eslint 2024-08-19 21:36:28 +03:00
Cohee d77363cd7c Merge branch 'staging' into feat/xtc 2024-08-19 21:35:35 +03:00
Vitor e28257096a added xtc parameter for ooba 2024-08-19 01:32:45 -03:00
Cohee 9215dfd0c6 Replace macros in DRY sequence breakers 2024-08-18 13:50:58 +03:00
Wolfsblvt 28a9c45c31 /api-url slash command to get/set server url 2024-08-18 01:05:25 +02:00
Cohee 8ff4a4a36a Don't modify legacy URL path for inappropriate API types 2024-08-01 15:01:38 +03:00
Cohee e6e8d7726b Generate random seed for HF endpoint 2024-07-26 12:50:39 +00:00
Cohee 5f2a73ac9f Expose "Allow fallback providers" for OpenRouter 2024-07-19 23:34:16 +03:00
Cohee b66e589b30 Don't use dynatemp for unsupported backends 2024-07-02 14:17:10 +00:00
Cohee b62cbdeebd Merge branch 'staging' into DarokCx/release 2024-06-28 19:09:12 +03:00
Cohee bbb1a6e578 Add huggingface inference as text completion source 2024-06-28 18:17:27 +03:00
DarokCx 29ff0876a7 Added additional headers 2024-06-28 08:20:15 -04:00
DarokCx bd5592de7b Added featherless, connect button not working 2024-06-27 09:06:11 -04:00
Cohee 41ab90bb8e Support more parameters for Infermatic 2024-06-24 19:16:20 +03:00
Cohee a3dbcf3c2a Fix context and response size not being passed to Ollama 2024-06-24 03:48:34 +03:00
Cohee 1dd21caa66 Adjust number of VLLM logprobs 2024-06-09 00:59:40 +03:00
Cohee ff680f46cc Add rep_pen_slope control for koboldcpp 2024-06-05 22:05:41 +03:00
kingbri 4528655bb7 Textgen: Add multiswipe support for TabbyAPI
Tabby now supports batching and the "n" parameter for both non-streaming
and streaming requests. Add this to SillyTavern.

Signed-off-by: kingbri <bdashore3@proton.me>
2024-05-28 00:55:57 -04:00
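The "n" parameter above maps one generation per requested swipe. A minimal sketch of how such a request might look against TabbyAPI's OpenAI-compatible completions endpoint; the URL, helper name, and every field other than "n" are assumptions for illustration, not taken from the commit.

```js
// Hypothetical sketch: request several completions ("swipes") in one call.
async function requestSwipes(prompt, swipeCount) {
    const response = await fetch('http://127.0.0.1:5000/v1/completions', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
            prompt: prompt,
            max_tokens: 300,
            n: swipeCount, // one generation per requested swipe
            stream: false,
        }),
    });
    const data = await response.json();
    // Each returned choice becomes a separate swipe in the UI.
    return data.choices.map(choice => choice.text);
}
```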
Cohee e8b96fec02 Merge branch 'staging' into new-samplers 2024-05-22 23:26:47 +03:00
kokansei 75a1ef4304 Add DRY Samplers to ST Staging (#2211)
* Add files via upload

* Add files via upload

* Delete public/index.html

* Add files via upload

* Delete public/scripts/textgen-settings.js

* Add files via upload

* Delete public/scripts/power-user.js

* Add files via upload

* Delete public/scripts/power-user.js

* Add files via upload

* Update power-user.js

* Update index.html

* Fix control attribution

* Fix app loading

* Put sequence breakers under DRY block

* DRY for DRY

* Update public/index.html

Co-authored-by: Philipp Emanuel Weidmann <pew@worldwidemann.com>

* Merge fix

* Add llamacpp control. Add default value for sequence breakers

* Forgot reset

---------

Co-authored-by: Cohee <18619528+Cohee1207@users.noreply.github.com>
Co-authored-by: Philipp Emanuel Weidmann <pew@worldwidemann.com>
2024-05-22 20:46:52 +03:00
kingbri 74b6ed97c2 Textgen: Add repetition decay for TabbyAPI
Repetition decay softens the drop-off of the repetition penalty. It's
best paired with rep pen range.

Signed-off-by: kingbri <bdashore3@proton.me>
2024-05-22 00:09:10 -04:00
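A hedged sketch of the pairing described above: repetition decay sent alongside a repetition penalty range. The field names are assumptions for illustration only.

```js
// Hypothetical sampler payload fragment for TabbyAPI (field names assumed).
const samplerParams = {
    repetition_penalty: 1.15,        // base penalty for repeated tokens
    repetition_penalty_range: 2048,  // only penalize repeats within this window
    repetition_decay: 512,           // soften the penalty drop-off near the window edge
};
```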
kingbri 99d143263d Textgen: Add skew sampling
Adds the skew sampling option from exllamaV2

Signed-off-by: kingbri <bdashore3@proton.me>
2024-05-21 23:48:33 -04:00
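A small sketch of how the new setting might be forwarded into the request parameters; the helper and the `skew` field name (borrowed from the exllamaV2 sampler) are assumptions.

```js
// Hypothetical: only send skew when it is set, since 0 disables it.
function applySkew(params, settings) {
    if (typeof settings.skew === 'number' && settings.skew !== 0) {
        params.skew = settings.skew;
    }
    return params;
}
```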
kingbri a12df762a0 Textgen: Add speculative_ngram for TabbyAPI
Speculative ngram allows for a different method of speculative
decoding. Using a draft model is still preferred.

Signed-off-by: kingbri <bdashore3@proton.me>
2024-05-21 23:37:36 -04:00
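A minimal sketch of the toggle described above; only the flag name comes from the commit message, the rest is an assumption.

```js
// Hypothetical request fragment: ngram speculation instead of a draft model.
const speculativeParams = {
    speculative_ngram: true, // ngram-based speculative decoding; a draft model is still preferred
};
```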
Cohee ee913be46b Merge pull request #2266 from sasha0552/vllm-fixes
vLLM fixes
2024-05-19 14:23:07 +03:00
RossAscends c7232ae23c WIP textgen API custom sampler display 2024-05-19 15:06:29 +09:00
sasha0552 db5e2d95c2 vLLM fixes
* Enable seed field for vLLM

* Enable beam search for vLLM

* Set the default length penalty to 1
(vLLM raises a validation error when beam search is disabled and the value is not equal to 1)
2024-05-19 04:34:11 +00:00
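A sketch of the sanitization implied by the last bullet, assuming vLLM's OpenAI-compatible extra parameters `use_beam_search` and `length_penalty`; the helper itself is hypothetical.

```js
// Hypothetical guard: vLLM rejects length_penalty !== 1 when beam search is off.
function sanitizeVllmParams(params) {
    if (!params.use_beam_search) {
        params.length_penalty = 1;
    }
    return params;
}
```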
Cohee 4227968dfa Allow using JSON schema with llamacpp server 2024-05-18 18:50:48 +03:00
Cohee c7d75b7789 llamacpp broke 2024-05-12 21:41:07 +03:00
Cohee 27ccc6b090 Minor stylistic changes 2024-05-11 11:38:22 +03:00
kingbri 62faddac8d Textgen: Add banned_strings
TabbyAPI supports banning the presence of specific strings during
a generation. Add this support in SillyTavern by handling lines
enclosed in quotes as a special case.

Signed-off-by: kingbri <bdashore3@proton.me>
2024-05-11 00:58:29 -04:00
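A minimal sketch of the quote handling described above: lines wrapped in double quotes are collected as banned strings, everything else is treated as banned token IDs. The function name and output shape are assumptions.

```js
// Hypothetical parser for a user-provided ban list.
function parseBannedList(rawText) {
    const bannedStrings = [];
    const bannedTokens = [];
    for (const line of rawText.split('\n').map(x => x.trim()).filter(Boolean)) {
        if (line.length > 1 && line.startsWith('"') && line.endsWith('"')) {
            bannedStrings.push(line.slice(1, -1)); // strip the surrounding quotes
        } else {
            bannedTokens.push(line);
        }
    }
    return { banned_strings: bannedStrings, banned_tokens: bannedTokens };
}
```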
Cohee c73bfbd7b0 Safari bruh moment 2024-05-06 21:21:03 +03:00
Cohee 7063fce2af Selectable openrouter providers 2024-05-06 19:26:20 +03:00
Cohee 05db2552b3 Fix Top K disabled state for Infermatic.
Also fix an icon.
2024-05-04 02:37:05 +03:00
Cohee 7bfd666321 Add llama 3 tokenizer 2024-05-03 23:59:39 +03:00
Cohee 7b87f44518 Clean-up API-specific settings 2024-05-03 20:02:13 +03:00
sasha0552 2bd239fe81 Initial vLLM support 2024-05-02 22:40:40 +00:00
Cohee 022c180b62 Lint and clean-up 2024-04-15 00:39:15 +03:00
Cohee 3e60919289 Specify LLM prompt in case JSON schema is not supported 2024-04-14 17:13:54 +03:00
kingbri b8b49f0012 TextgenSettings: Fix JSON schema fallback
The fallback was not applied if the provided string was empty,
resulting in errors

Signed-off-by: kingbri <bdashore3@proton.me>
2024-04-09 22:15:00 -04:00
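A sketch of the empty-string guard implied by the fix above; the function name is hypothetical.

```js
// Hypothetical guard: fall back when the schema text is empty or invalid JSON,
// instead of letting JSON.parse('') throw.
function tryParseJsonSchema(schemaText) {
    if (!schemaText || !schemaText.trim()) {
        return null;
    }
    try {
        return JSON.parse(schemaText);
    } catch {
        return null;
    }
}
```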