Commit Graph

4808 Commits

Author SHA1 Message Date
Cohee1207 65a16970f4 Extend cases for OAI status code message pulling 2023-08-02 23:02:29 +03:00
Cohee1207 5a7c4947b3 Merge branch 'staging' of http://github.com/cohee1207/SillyTavern into staging 2023-08-02 23:00:55 +03:00
Cohee bb3fc5be62
Merge pull request #853 from gd1551/staging
Add stop sequences support to NovelAI generations
2023-08-02 22:56:05 +03:00
gd1551 ea800d1550 Add stop sequences support to NovelAI generations 2023-08-02 22:42:11 +03:00
Cohee 90e08e08de
Merge pull request #852 from StefanDanielSchwarz/fix-inconsistent-newline-trimming
Fix inconsistent newline trimming
2023-08-02 21:55:12 +03:00
SDS 9d99b89c9c
Fix inconsistent newline trimming
Newlines weren't trimmed on the first generation in a new chat, only on subsequent generations. Commenting out this check makes trimming consistent for all generations.

(I noticed because even with my deterministic preset, a regen/swipe would give a different output than the very first generation, so I went looking and found this if-clause as the source of the inconsistent behavior.)
2023-08-02 20:00:01 +02:00
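A minimal TypeScript sketch of the idea described in this commit, using hypothetical names rather than SillyTavern's actual identifiers: trimming is applied unconditionally instead of being gated behind a chat-length check, so the first generation and later regens/swipes are treated the same way.

```typescript
// Hypothetical illustration of unconditional newline trimming.
// Before the fix, a guard such as `if (chat.length > 1)` skipped trimming
// for the very first generation in a new chat.
function cleanGeneratedText(text: string): string {
    // Strip leading and trailing newlines from the generated text.
    return text.replace(/^\n+|\n+$/g, '');
}

console.log(JSON.stringify(cleanGeneratedText('\n\nHello there.\n')));
// -> "Hello there."
```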
Cohee baddee8082
Merge pull request #850 from StefanDanielSchwarz/improved-roleplay-preset 2023-08-02 19:10:13 +03:00
Cohee a51653e8b5
Merge pull request #851 from StefanDanielSchwarz/proxy-preset+some-fixes 2023-08-02 19:09:40 +03:00
SDS 7dfaf6f0b0
Fix Storywriter-Llama2.settings
Removed "can_use_streaming" since that should be determined by SillyTavern and not be hardcoded in the preset.
2023-08-02 18:06:06 +02:00
SDS 3f015f4bd2
Fix Deterministic.settings
Removed "can_use_streaming" since that should be determined by SillyTavern and not be hardcoded in the preset.
2023-08-02 18:05:27 +02:00
SDS 6f18c457fc
simple-proxy-for-tavern default preset
These are the same settings as [simple-proxy-for-tavern's default preset](https://github.com/anon998/simple-proxy-for-tavern/blob/main/presets/default.json). I've fixed the sampler order and raised the context size to Llama 2's 4096 tokens.
2023-08-02 18:03:14 +02:00
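As a rough illustration of the kind of settings this commit describes (sampler order and context size), here is a hedged TypeScript sketch; the key names and the order value are assumptions for illustration, not the exact contents of the preset file.

```typescript
// Assumed key names; illustrative values only.
const simpleProxyDefaultPreset = {
    // Example KoboldAI-style sampler order, not necessarily the corrected one.
    sampler_order: [6, 0, 1, 2, 3, 4, 5],
    // Context size raised to Llama 2's 4096-token window.
    max_context: 4096,
};
```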
SDS c57de3d47b
Improved Roleplay Preset
Closer to simple-proxy-for-tavern's default verbose prompt, and it now works with SillyTavern's AutoFormat Overrides on or off, making this version more compatible.
2023-08-02 17:42:42 +02:00
mashwell cd02abe205 textgen simple-1 preset fix 2023-08-02 14:23:16 +03:00
50h100a aac7525204 Add secret key storage 2023-08-02 03:31:17 -04:00
50h100a 42cc66f06e First pass UI for extending webui 2023-08-02 03:30:57 -04:00
Cohee f6f51d21c5
Merge pull request #841 from ouoertheo/ouoertheo/objectives6
Objective: currentTask fix in MESSAGE_RECEIVED, ignore swipes
2023-08-02 01:45:20 +03:00
ouoertheo 9a4d62ca6f
add lastMessageWasSwipe=false to resetState 2023-08-01 16:16:52 -05:00
Cohee 9d023dc3b1 Load live2d by posting a file 2023-08-01 23:57:04 +03:00
Cohee ac98ebcc6c Fix npm audit 2023-08-01 23:25:36 +03:00
Cohee 29a3c5d590 Fix npm audit 2023-08-01 23:25:09 +03:00
SDS 0c0e24323c proxy-replacement-preset: Roleplay Instruct Mode Preset
In [[Feature Request] Add Simple Proxy functionality into Silly Tavern directly · Issue #831 · SillyTavern/SillyTavern](https://github.com/SillyTavern/SillyTavern/issues/831), I explained how to replace the [simple-proxy-for-tavern](https://github.com/anon998/simple-proxy-for-tavern) with SillyTavern's built-in functionality. To make this easier, here's an Instruct Mode Preset that helps set this up.
2023-08-01 23:14:50 +03:00
SDS 32f605e413 Llama-2-KoboldAI-presets
Here are two presets I've found very useful for Llama 2-based models:

- Deterministic removes the randomness and is good for testing/comparing models, because the same input always produces the same output.

- Storywriter-Llama2 is the Storywriter preset adjusted for Llama 2's 4K context size. It also helps with Llama 2's repetition/looping issues.
2023-08-01 23:13:37 +03:00
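To make the "same input, same output" point concrete, here is a hedged TypeScript sketch of what greedy (deterministic) sampling settings generally look like; the field names are assumptions for illustration and may not match the actual preset files added here.

```typescript
// Greedy decoding: always pick the single most likely token, so a fixed
// prompt always produces the same completion. Field names are assumed.
const deterministicSampling = {
    temperature: 1.0,        // irrelevant once only one candidate token remains
    top_k: 1,                // keep only the most likely token
    top_p: 1.0,
    repetition_penalty: 1.0, // no extra penalty, keeping model comparisons neutral
};
```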
Cohee af8c21fea2 Send middle-out transform strategy to OpenRouter 2023-08-01 18:49:03 +03:00
Cohee 72974d8a54 Clearer message for character import failure 2023-08-01 18:13:50 +03:00
Cohee 7f86551ab4 Don't try to load live2d if the variable is disabled or the module is not loaded in Extras 2023-08-01 16:33:30 +03:00
Cohee c5d87e4808 ParseImgDimensions for Showdown 2023-08-01 16:24:54 +03:00
Cohee e5f3a70860 #843 Wait for group to stop generating before checking objectives 2023-08-01 15:53:10 +03:00
Cohee 7596d78322 #844 Properly handle KoboldAI errors 2023-08-01 15:22:51 +03:00
Cohee 5645432e9d #823 Allow arbitrary line breaks in instruct sequences and custom chat separators 2023-08-01 14:17:33 +03:00
Cohee bad7892baa Adjust chromadb auto% for character description 2023-08-01 14:16:03 +03:00
Cohee a0c8ac54dd Lower max context for Clio and Kayra presets 2023-08-01 13:30:30 +03:00
SDS 73fd306b8b Fixed persona_description linebreak
When the persona description is positioned after the character card, there are double linebreaks before it and none after it, because storyString already includes a trailing linebreak whereas persona_description doesn't. This fixes that by putting the linebreak where it belongs.
2023-08-01 12:54:46 +03:00
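A minimal sketch of the linebreak placement described in this commit (hypothetical variable names, not the actual template code): since storyString already ends with a newline, prepending another one to the persona description yields a double blank line before it and no separator after it; appending the newline instead keeps exactly one linebreak on each side.

```typescript
const storyString = 'Character card text...\n'; // already ends with a linebreak
const personaDescription = 'Persona text...';

// Before: double linebreak before the persona description, none after it.
const before = storyString + '\n' + personaDescription;

// After: a single linebreak before and after the persona description.
const after = storyString + personaDescription + '\n';
```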
Cohee 72213add56 #833 Sort tags list alphabetically 2023-08-01 12:26:28 +03:00
ouoertheo 6f4fd15095 currentTask fix in MESSAGE_RECEIVED, ignore swipes 2023-08-01 04:24:55 -05:00
Cohee 78d62d7be2 Merge branch 'staging' of https://github.com/SillyLossy/TavernAI into staging 2023-08-01 11:58:42 +03:00
Cohee 99af6ed472 Update default settings 2023-08-01 11:57:25 +03:00
Cohee 16b45f1ea9 Reformat new code 2023-07-31 20:56:05 +03:00
Cohee 435d319090
Merge pull request #835 from pyrater/staging
Live2d Changes
2023-07-31 20:53:45 +03:00
Cohee e7148c41a9
Merge pull request #839 from Tony-sama/staging
Restored speech-recognition streaming mode as a new provider "Streaming"
2023-07-31 20:51:02 +03:00
Tony Ribeiro 192c82b180 Restored streaming mode as a new provider "Streaming"; recording is done on the server side, with voice detection via vosk and transcription via whisper. 2023-07-31 18:47:33 +02:00
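A hedged TypeScript sketch of the flow this commit describes; every name below is a hypothetical stand-in, not the actual vosk/whisper APIs or the extension's real code.

```typescript
type AudioChunk = Uint8Array;

const recording: AudioChunk[] = [];

// Stand-in for vosk-based voice detection on the server.
function voskDetectedEndOfSpeech(chunk: AudioChunk): boolean {
    return chunk.length === 0; // placeholder: an empty chunk marks end of speech
}

// Stand-in for whisper transcription of the recorded utterance.
async function whisperTranscribe(audio: AudioChunk[]): Promise<string> {
    return `transcript of ${audio.length} chunks`; // placeholder result
}

// Recording happens server-side: chunks arrive from the client, vosk decides
// when speech has ended, and whisper turns the finished utterance into text.
async function onAudioChunk(chunk: AudioChunk): Promise<string | null> {
    recording.push(chunk);
    if (!voskDetectedEndOfSpeech(chunk)) {
        return null;
    }
    return whisperTranscribe(recording.splice(0, recording.length));
}
```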
Cohee 8aff89de30
Merge pull request #838 from ouoertheo/ouoertheo/objective-next-task-bugfix
Objective: Current task fixes, {{parent}} prompt template variable
2023-07-31 17:01:32 +03:00
ouoertheo 6768c56e2b fix regression on task selection 2023-07-31 07:56:49 -05:00
joe 4939387bbf Updated based on feedback 2023-07-31 19:14:15 +09:00
joe 4c14b8ee2d Updated Static URL 2023-07-31 19:01:45 +09:00
joe 0bbcf0db83 Updated non-static URL calls 2023-07-31 18:54:50 +09:00
Cohee 29d841a50b Bump package version 2023-07-31 12:54:49 +03:00
joe 0c919bf32d Talking Animation 2023-07-31 18:21:32 +09:00
pyrater 9f92b19004
Merge branch 'SillyTavern:staging' into staging 2023-07-31 16:10:22 +09:00
joe 7824a18103 Live2d Commits 2023-07-31 16:09:36 +09:00
Cohee 18e6e578dd
Merge pull request #834 from ouoertheo/ouoertheo/objective-auto-check-fix 2023-07-31 09:54:22 +03:00