Commit Graph

3976 Commits

Author SHA1 Message Date
SDS 1f56f0d64a
Update simple-proxy-for-tavern.settings
Go back down to 2048 tokens instead of 4096 to stay in line with the other non-Llama-2-specific presets
2023-08-03 11:47:59 +02:00
SDS 905131c764
Update Deterministic.settings
Go back down to 2048 tokens instead of 4096 to stay in line with the other non-Llama-2-specific presets
2023-08-03 11:46:53 +02:00
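Note on the two commits above: these .settings presets are plain JSON files, so the change amounts to lowering the context-size field back to 2048. A minimal hedged sketch (the field names follow the usual KoboldAI-preset convention and are assumed, not copied from these commits):

```json
{
  "genamt": 300,
  "max_length": 2048
}
```

Here max_length is the context window going back from 4096 (Llama 2) to 2048; genamt is the response length and is only shown for shape.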
Cohee 31feaee805 Enter to submit dialogue popup input 2023-08-03 11:32:08 +03:00
Cohee a07cbe5f7f
Merge pull request #860 from gd1551/staging 2023-08-03 11:11:36 +03:00
gd1551 67fa7b9607 Update Custom Stop Strings to note NovelAI support 2023-08-03 11:06:29 +03:00
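For reference, the Custom Stop Strings field mentioned in this commit expects (as far as I recall) a JSON array of strings; the example values below are illustrative only, not taken from the commit:

```json
["\nUser:", "\n### Instruction:", "</s>"]
```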
Cohee 1b005ef47f
Merge pull request #856 from mweldon/preamble 2023-08-03 10:24:31 +03:00
Mike Weldon c8b5b7da22 Use prose augmenter by default for Kayra 2023-08-02 23:07:17 -07:00
RossAscends 5a67d72fea /qr, /qrset & ctrl+1~9 hotkeys for QRs 2023-08-03 14:44:23 +09:00
RossAscends 68e5ae63d6 move closechat/togglepanels to slashcommands 2023-08-03 13:21:38 +09:00
city-unit 9712e4bbb0 Moved bulk edit from external to internal extension. 2023-08-03 00:15:09 -04:00
50h100a 61c0e3b08b Merge branch 'staging' of https://github.com/SillyTavern/SillyTavern into mancer-api 2023-08-02 23:46:03 -04:00
50h100a d4278388f7 remove non-changes 2023-08-02 23:38:50 -04:00
50h100a 2fdec7eb03 Added authentication variant to WebUI API. 2023-08-02 23:25:24 -04:00
Mike Weldon 1d0f67c144 Add NAI preamble to start of chat buffer 2023-08-02 18:22:06 -07:00
Mike Weldon 14ef5d9a6b Add new NAI presets TeaTime and ProWriter 2023-08-02 18:21:14 -07:00
Cohee1207 143b4347c2 Extend cases for OAI status code message pulling 2023-08-02 23:46:09 +03:00
Cohee1207 2a08e199d2 Merge branch 'release' of http://github.com/cohee1207/SillyTavern into release 2023-08-02 23:46:05 +03:00
Cohee1207 f198f5eb6e Fix localization hiding Usage Stats button 2023-08-02 23:04:52 +03:00
Cohee1207 65a16970f4 Extend cases for OAI status code message pulling 2023-08-02 23:02:29 +03:00
Cohee1207 5a7c4947b3 Merge branch 'staging' of http://github.com/cohee1207/SillyTavern into staging 2023-08-02 23:00:55 +03:00
Cohee bb3fc5be62
Merge pull request #853 from gd1551/staging
Add stop sequences support to NovelAI generations
2023-08-02 22:56:05 +03:00
gd1551 ea800d1550 Add stop sequences support to NovelAI generations 2023-08-02 22:42:11 +03:00
Cohee 90e08e08de
Merge pull request #852 from StefanDanielSchwarz/fix-inconsistent-newline-trimming
Fix inconsistent newline trimming
2023-08-02 21:55:12 +03:00
SDS 9d99b89c9c
Fix inconsistent newline trimming
Newlines weren't trimmed on the first generation in a new chat, only on subsequent generations. Commenting out this check makes trimming work consistently for all generations.

(I noticed because even with my deterministic preset, a regen/swipe would give a different output than the very first generation, so I went looking and found this if-clause as the source of the inconsistent behavior.)
2023-08-02 20:00:01 +02:00
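An illustrative sketch of the pattern described in this commit, with hypothetical names rather than the actual SillyTavern code: a trim that used to be guarded by a first-generation check now runs unconditionally.

```js
// Illustrative sketch only; variable and function names are made up.
function trimLeadingNewlines(generatedText /*, isFirstGenerationOfNewChat */) {
    // Before the fix (hypothetical guard): the trim was skipped on the very
    // first generation of a new chat, so only later generations were trimmed.
    // if (!isFirstGenerationOfNewChat) {
    //     generatedText = generatedText.replace(/^\n+/, '');
    // }

    // After the fix: the guard is commented out and the trim always runs.
    return generatedText.replace(/^\n+/, '');
}
```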
Cohee baddee8082
Merge pull request #850 from StefanDanielSchwarz/improved-roleplay-preset 2023-08-02 19:10:13 +03:00
Cohee a51653e8b5
Merge pull request #851 from StefanDanielSchwarz/proxy-preset+some-fixes 2023-08-02 19:09:40 +03:00
SDS 7dfaf6f0b0
Fix Storywriter-Llama2.settings
Removed "can_use_streaming" since that should be determined by SillyTavern and not be hardcoded in the preset.
2023-08-02 18:06:06 +02:00
SDS 3f015f4bd2
Fix Deterministic.settings
Removed "can_use_streaming" since that should be determined by SillyTavern and not be hardcoded in the preset.
2023-08-02 18:05:27 +02:00
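In concrete terms, both of the fixes above drop a line like the following from the .settings JSON (the value shown is a placeholder), leaving streaming capability to be detected by SillyTavern itself:

```json
{
  "can_use_streaming": false
}
```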
SDS 6f18c457fc
simple-proxy-for-tavern default preset
These are the same settings as [simple-proxy-for-tavern's default preset](https://github.com/anon998/simple-proxy-for-tavern/blob/main/presets/default.json). I've fixed the sampler order and raised the context size to Llama 2's 4096 tokens.
2023-08-02 18:03:14 +02:00
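A hedged sketch of the shape of such a preset with the Llama 2 context size: the sampler values below are placeholders rather than the actual simple-proxy defaults (those are in the linked default.json), and the sampler order shown is the commonly recommended KoboldAI one, not necessarily the one committed here.

```json
{
  "temp": 0.7,
  "rep_pen": 1.1,
  "rep_pen_range": 1024,
  "top_p": 0.9,
  "top_k": 0,
  "typical": 1,
  "tfs": 1,
  "sampler_order": [6, 0, 1, 3, 4, 2, 5],
  "genamt": 300,
  "max_length": 4096
}
```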
SDS c57de3d47b
Improved Roleplay Preset
Now closer to simple-proxy-for-tavern's default verbose prompt, and it works with SillyTavern's AutoFormat Overrides either on or off, making this version more compatible.
2023-08-02 17:42:42 +02:00
mashwell cd02abe205 textgen simple-1 preset fix 2023-08-02 14:23:16 +03:00
50h100a aac7525204 Add secret key storage 2023-08-02 03:31:17 -04:00
50h100a 42cc66f06e First pass UI for extending webui 2023-08-02 03:30:57 -04:00
Cohee f6f51d21c5
Merge pull request #841 from ouoertheo/ouoertheo/objectives6
Objective: currentTask fix in MESSAGE_RECEIVED, ignore swipes
2023-08-02 01:45:20 +03:00
ouoertheo 9a4d62ca6f
add lastMessageWasSwipe=false to resetState 2023-08-01 16:16:52 -05:00
Cohee 9d023dc3b1 Load live2d by posting a file 2023-08-01 23:57:04 +03:00
Cohee ac98ebcc6c Fix npm audit 2023-08-01 23:25:36 +03:00
Cohee 29a3c5d590 Fix npm audit 2023-08-01 23:25:09 +03:00
SDS 0c0e24323c proxy-replacement-preset: Roleplay Instruct Mode Preset
In [[Feature Request] Add Simple Proxy functionality into Silly Tavern directly · Issue #831 · SillyTavern/SillyTavern](https://github.com/SillyTavern/SillyTavern/issues/831), I explained how to replace [simple-proxy-for-tavern](https://github.com/anon998/simple-proxy-for-tavern) with SillyTavern's built-in functionality. To make this easier, here's an Instruct Mode preset that helps set this up.
2023-08-01 23:14:50 +03:00
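For readers who haven't seen one, a SillyTavern Instruct Mode preset is a small JSON file roughly along these lines; this is only a hedged sketch of the format, not the actual contents of the Roleplay preset committed here:

```json
{
  "name": "Roleplay",
  "system_prompt": "Write {{char}}'s next reply in this roleplay with {{user}}.",
  "input_sequence": "### Instruction:",
  "output_sequence": "### Response:",
  "wrap": true
}
```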
SDS 32f605e413 Llama-2-KoboldAI-presets
Here are two presets I've found very useful for Llama 2-based models:

- Deterministic takes away the randomness and is good for testing/comparing models, because the same input always yields the same output.

- Storywriter-Llama2 is the Storywriter preset adjusted for Llama 2's 4K context size. It also works well against Llama 2's repetition/looping issues.
2023-08-01 23:13:37 +03:00
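As a hedged illustration of what "Deterministic" means in practice: samplers configured so that decoding is effectively greedy (e.g. top_k of 1), so the same input always produces the same output. The field names and values below are illustrative, not copied from the committed preset:

```json
{
  "temp": 1.0,
  "top_k": 1,
  "top_p": 1.0,
  "rep_pen": 1.0,
  "max_length": 4096
}
```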
Cohee af8c21fea2 Send middle-out transform strategy to OpenRouter 2023-08-01 18:49:03 +03:00
Cohee 72974d8a54 More clear message for character import failure 2023-08-01 18:13:50 +03:00
Cohee 7f86551ab4 Don't try to load live2d if variable is disabled or module is not loaded to Extras 2023-08-01 16:33:30 +03:00
Cohee c5d87e4808 ParseImgDimensions for Showdown 2023-08-01 16:24:54 +03:00
Cohee e5f3a70860 #843 Wait for group to stop generating before checking objectives 2023-08-01 15:53:10 +03:00
Cohee 7596d78322 #844 Properly handle KoboldAI errors 2023-08-01 15:22:51 +03:00
Cohee 5645432e9d #823 Allow arbitrary line breaks in instruct sequences and custom chat separators 2023-08-01 14:17:33 +03:00
Cohee bad7892baa Adjust chromadb auto% for character description 2023-08-01 14:16:03 +03:00
Cohee a0c8ac54dd Lower max context for Clio and Kayra presets 2023-08-01 13:30:30 +03:00
SDS 73fd306b8b Fixed persona_description linebreak
When the persona description is positioned after the character card, there are two line breaks before it and none after it, because storyString already ends with a trailing line break whereas persona_description doesn't add one. This fixes that by putting the line break where it belongs.
2023-08-01 12:54:46 +03:00
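An illustrative sketch of the assembly order being fixed above, with made-up variable names rather than the actual SillyTavern code: the newline has to come after the persona description instead of before it, since the story string already ends with one.

```js
// Illustrative sketch only; names are hypothetical.
function buildPrompt(storyString, personaDescription) {
    // Before the fix: a newline was prepended to the persona description even though
    // storyString already ends with one, giving a double break before it and none after.
    // return storyString + '\n' + personaDescription;

    // After the fix: the line break goes after the persona description instead.
    return storyString + personaDescription + '\n';
}
```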