Commit Graph

139 Commits (50 most recent shown below)

Author             SHA1        Message  Date
Cohee              5c39327450  Display Claude tokenizer in the UI if Claude model is used  2023-07-05 14:49:45 +03:00
Cohee              9e3c55805f  Better chat completion continue  2023-07-03 18:22:12 +03:00
Cohee              b14a85a96b  Lower PaLM max context size  2023-07-02 21:37:44 +03:00
Cohee              8eb82cdcd9  Continue  2023-07-02 20:21:42 +03:00
Cohee              1d2dc19359  #600 Add gpt-4-32k to OpenRouter selection  2023-06-30 11:44:23 +03:00
Cohee              e2bbc7fbcf  Infer model settings from Window extension settings  2023-06-30 00:36:39 +03:00
Cohee              f532192726  Add direct OpenRouter connection and PaLM models to Window selection  2023-06-30 00:32:52 +03:00
Cohee              931fffaa5c  Fix chat.comp models not saving on Safari  2023-06-28 16:16:49 +03:00
Cohee              68f967ea78  Add Claude tokenizer  2023-06-26 13:36:56 +03:00
Cohee              e78abf9269  #557 Only add user name to chat completion name if it was sent as another persona in the same chat.  2023-06-23 23:05:41 +03:00
Cohee              1672824416  Spec v2: {{original}} macro for prompt overrides.  2023-06-22 23:24:22 +03:00
Cohee              df4586811d  Don't query OpenAI status if it's not a currently selected API on load  2023-06-21 12:50:43 +03:00
Cohee              4a29072e1c  Import / export chat completion presets  2023-06-20 23:53:52 +03:00
Cohee              fda152cef0  Placeholders for import/export chat completion preset  2023-06-20 22:41:15 +03:00
Cohee              ec05937dd4  #540 Save streaming flag to Chat Completion preset  2023-06-20 18:58:09 +03:00
Cohee              b85605cac8  Don't auto-select chat.comp preset if it's already selected  2023-06-17 20:28:02 +03:00
Cohee              e91cbe009f  Correctly clamp max_context value on saving a chat completion preset  2023-06-15 18:32:56 +03:00
Cohee              183bf442d4  Add Turbo 16k to Window AI  2023-06-13 23:10:59 +03:00
Cohee              356e85fedd  Add new OpenAI models  2023-06-13 22:33:35 +03:00
Cohee              6207bdb671  Fix presets not respecting Claude source selection  2023-06-13 00:47:18 +03:00
breathingmanually  98092222fd  Add preset setting to avoid sending empty messages  2023-06-11 11:49:32 -03:00
Cohee              46c1fde423  Get appropriate tokenizer for WAI and handle streaming rejection properly  2023-06-10 18:41:02 +03:00
Cohee              ccefee6cee  Restore WAI model on load  2023-06-10 18:18:23 +03:00
Cohee              d292f6ee87  Add Window AI model settings  2023-06-10 18:16:13 +03:00
Cohee              acee302b09  Wrap Chat Completion buttons  2023-06-10 17:02:40 +03:00
Cohee              51919cff5d  Add Test API connection button  2023-06-10 16:35:22 +03:00
Cohee              f894176e14  Add Top K and Top P for Claude  2023-06-10 16:21:45 +03:00
Cohee              c4a2809849  Fix Window AI settings  2023-06-09 02:23:10 +03:00
Cohee              2ca5f43bc6  Fix mixing Claude models with OpenAI  2023-06-09 02:20:04 +03:00
Cohee              b8084eac65  Add per-character override of JB prompts.  2023-06-08 23:12:58 +03:00
SillyLossy         6f83128bd6  Dynamic toggling of chat completion forms  2023-06-08 13:55:05 +03:00
SillyLossy         62674707eb  Dynamically hide incompatible chat completion elements depending on selected API  2023-06-08 13:38:04 +03:00
Cohee              6ac4e2db0b  Fix generation when group contains a deleted character  2023-06-08 02:00:51 +03:00
Cohee              0642abbfe5  Add per-character system prompt overrides  2023-06-08 01:47:24 +03:00
Cohee              e8ef60ff47  Claude disclaimer  2023-06-07 13:48:15 +03:00
Cohee              821273630b  Restore Claude model on load  2023-06-07 11:15:15 +03:00
Cohee              7cff1a92fa  Clamp Claude's max temp when using Window AI  2023-06-07 10:53:24 +03:00
Cohee              a1b130fc9a  Streaming for Claude.  2023-06-06 20:18:28 +03:00
Cohee              e205323482  Clamp Claude's max temperature.  2023-06-06 19:43:59 +03:00
Cohee              960bc32340  [RAW / UNTESTED / NO STREAMING] Native Claude API supported  2023-06-06 19:16:46 +03:00
SillyLossy         5df7d2d1dc  Fix /sys and /sendas attribution when converting to groups. Fix context line with /sys with OpenAI  2023-06-01 11:18:19 +03:00
SillyLossy         da30b69471  Fix unlocked context breaking OAI tokenizer  2023-05-28 13:42:30 +03:00
SillyLossy         5a7daedfca  Stop button fix for window.ai. Refactor the generation function  2023-05-28 02:33:34 +03:00
SillyLossy         8ab1b68c52  Add WI prompt format  2023-05-28 00:01:35 +03:00
SillyLossy         158fdfb140  Add NSFW avoidance prompt to UI  2023-05-27 22:12:19 +03:00
SillyLossy         be64b3469f  Properly fallback when w.ai model doesn't support streaming  2023-05-27 21:42:28 +03:00
SillyLossy         a415deb8fa  Unlock context size of OAI  2023-05-27 20:45:22 +03:00
SillyLossy         0ab097711b  Fix window.ai streaming  2023-05-27 19:50:08 +03:00
Cohee1207          53d6c58b15  Support Window.ai extension  2023-05-27 17:37:25 +03:00
SillyLossy         8cce0d0ce7  Merge branch 'main' into dev  2023-05-21 14:32:19 +03:00