Commit Graph

110 Commits

SHA1 Message Date
b8084eac65 Add per-character override of JB prompts. 2023-06-08 23:12:58 +03:00
6f83128bd6 Dynamic toggling of chat completion forms 2023-06-08 13:55:05 +03:00
62674707eb Dynamically hide incompatible chat completion elements depending on selected API 2023-06-08 13:38:04 +03:00
6ac4e2db0b Fix generation when group contains a deleted character 2023-06-08 02:00:51 +03:00
0642abbfe5 Add per-character system prompt overrides 2023-06-08 01:47:24 +03:00
e8ef60ff47 Claude disclaimer 2023-06-07 13:48:15 +03:00
821273630b Restore Claude model on load 2023-06-07 11:15:15 +03:00
7cff1a92fa Clamp Claude's max temp when using Window AI 2023-06-07 10:53:24 +03:00
a1b130fc9a Streaming for Claude. 2023-06-06 20:18:28 +03:00
e205323482 Clamp Claude's max temperature. 2023-06-06 19:43:59 +03:00
960bc32340 [RAW / UNTESTED / NO STREAMING] Native Claude API supported 2023-06-06 19:16:46 +03:00
5df7d2d1dc Fix /sys and /sendas attribution when converting to groups. Fix context line with /sys with OpenAI 2023-06-01 11:18:19 +03:00
da30b69471 Fix unlocked context breaking OAI tokenizer 2023-05-28 13:42:30 +03:00
5a7daedfca Stop button fix for window.ai. Refactor the generation function 2023-05-28 02:33:34 +03:00
8ab1b68c52 Add WI prompt format 2023-05-28 00:01:35 +03:00
158fdfb140 Add NSFW avoidance prompt to UI 2023-05-27 22:12:19 +03:00
be64b3469f Properly fall back when w.ai model doesn't support streaming 2023-05-27 21:42:28 +03:00
a415deb8fa Unlock context size of OAI 2023-05-27 20:45:22 +03:00
0ab097711b Fix window.ai streaming 2023-05-27 19:50:08 +03:00
53d6c58b15 Support Window.ai extension 2023-05-27 17:37:25 +03:00
8cce0d0ce7 Merge branch 'main' into dev 2023-05-21 14:32:19 +03:00
6fa4c2c1c8 Make OpenAI credit error a catch-all 2023-05-21 14:16:30 +05:30
91315b4a74 Merge branch 'main' into dev 2023-05-21 01:37:34 +03:00
f0c7c96d3c Added switch to unbrick streaming on some unsupported proxies 2023-05-21 01:36:35 +03:00
299b9a04bc Replace info popups with toasts 2023-05-20 23:59:39 +03:00
df0734aac4 #336 Slash commands / bias adjustments 2023-05-19 23:05:22 +03:00
ab5e555d62 Add reverse proxy to presets; remove token breakdown from OAI options (it's now globally active). #345 #109 2023-05-19 18:31:59 +03:00
b626417a73 Merge branch 'main' into dev 2023-05-19 12:14:11 +03:00
7e59745dfc buffers partial SSE messages from Readable 2023-05-19 03:20:27 -05:00
0d1f291003 Add /sendas command 2023-05-18 18:49:49 +03:00
cb43fe13aa Somewhat usable system message narrator 2023-05-17 20:24:35 +03:00
741c7b6568 Merge branch 'dev' of github.com:Cohee1207/SillyTavern into dev 2023-05-15 16:14:00 -03:00
dade3fa17d Merge branch 'dev' of https://github.com/SillyLossy/TavernAI into dev 2023-05-14 19:47:34 +03:00
6a94bb5063 Old anchors removed 2023-05-14 19:47:32 +03:00
133caa58d2 Add in-progress files for OAI tokenization merge 2023-05-15 01:45:36 +09:00
30a43f96de OAI token itemization WIP (integrate PR299) 2023-05-15 01:08:45 +09:00
1b2e113a34 Feature: Auto Swipe 2023-05-13 22:15:47 -03:00
e374703798 Refactor API keys handling. Remove ST hosting from colab 2023-05-11 21:08:22 +03:00
22f4e6f1fe Fixed trailing whitespace on join 2023-05-09 13:51:01 -03:00
941781719b Fix: extra space on prompt (due to join(" ") on array) 2023-05-09 13:38:18 -03:00
14105bc4dd More reliable bias cancellation 2023-05-07 15:55:44 +03:00
28b5aa75a4 #255 Fix WI tokenization if character is not selected 2023-05-07 15:36:55 +03:00
7f718e09be #228 Don't use a selected tokenizer for parallel prompt building of OAI prompts 2023-05-06 15:30:15 +03:00
9d94936e3e Merge pull request #242 from random-username-423/dev: Add top_p to OpenAI parameters UI 2023-05-06 13:24:33 +03:00
7453eb8872 #228 Add delay microtasks in OAI tokenization to unblock the CPU thread during prompt building 2023-05-06 13:19:22 +03:00
a61369b52a Avoid an empty line in OpenAI when bottom anchor is empty 2023-05-06 00:25:53 +02:00
20f43b2c17 Fix bottom anchor never being included for OpenAI 2023-05-06 00:25:53 +02:00
a5ae9cc7fb Simplify message owner checks a bit 2023-05-06 00:25:53 +02:00
f86f42fad7 Add top_p to OpenAI parameters UI 2023-05-06 00:49:47 +09:00
e639666b34 Move chat renaming logic to server side. Add "quiet" reply generation mode. 2023-05-03 21:02:23 +03:00
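One commit above (7e59745dfc, "buffers partial SSE messages from Readable") names a concrete technique: server-sent events are delimited by a blank line, but a Node `Readable` can deliver chunks split anywhere, so incomplete events must be buffered until their terminator arrives. The following is a minimal hypothetical sketch of that idea, not the repository's actual implementation; the function name `makeSSEBuffer` is invented for illustration.

```javascript
// Hypothetical sketch of buffering partial SSE messages from a stream.
// SSE events end with a blank line ("\n\n"), but network chunks may cut
// an event in half, so we accumulate text and emit only complete events.
function makeSSEBuffer() {
  let buffer = '';
  return function feed(chunk) {
    buffer += chunk;
    const events = buffer.split('\n\n');
    buffer = events.pop(); // the last piece may be an incomplete event
    return events.filter((event) => event.length > 0);
  };
}
```

A consumer would call `feed` on every `data` event of the `Readable` and process only the complete events it returns, leaving any trailing partial event in the buffer for the next chunk.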