Commit Graph

310 Commits

Author SHA1 Message Date
Cohee
9106696f2f Render prompt manager when switching APIs 2024-01-01 17:06:10 +02:00
valadaptive
0d3505c44b Remove OAI_BEFORE_CHATCOMPLETION
Not used in any internal code or extensions I can find.
2023-12-25 03:48:49 -05:00
Cohee
f8dece9d88 Always remove logit bias and stop from vision 2023-12-24 20:01:59 +02:00
Cohee
89d70539b9 Alternative continue method for chat completions 2023-12-22 20:24:54 +02:00
Cohee
a85a6cf606 Allow displaying unreferenced macro in message texts 2023-12-21 20:49:03 +02:00
Cohee
b5e59c819c Merge branch 'staging' into claude-rework 2023-12-21 16:52:43 +02:00
Cohee
3001db3a47 Add additional parameters for custom endpoints 2023-12-20 23:39:10 +02:00
Cohee
ae64c99835 Add custom caption source 2023-12-20 21:05:20 +02:00
Cohee
5734dbd17c Add custom endpoint type 2023-12-20 18:29:03 +02:00
DonMoralez
50ece13752 Add restore button, def hum message, claude check 2023-12-18 02:25:17 +02:00
DonMoralez
7835a1360f Merge remote-tracking branch 'upstream/staging' into staging 2023-12-17 19:46:47 +02:00
based
ed96ec5c3e reverse proxy condition fix 2023-12-16 12:02:34 +10:00
DonMoralez
6b59014892 (Fix) "squash sys. messages" processed empty messages, adding \n 2023-12-16 00:24:48 +02:00
based
583f786d74 finish mistral frontend integration + apikey status check 2023-12-16 07:15:57 +10:00
based
041957975a add mistral completion source to UI 2023-12-16 06:08:41 +10:00
DonMoralez
10fb83ee53 Merge remote-tracking branch 'upstream/staging' into staging 2023-12-15 13:12:15 +02:00
Cohee
cde9903fcb Fix Bison models 2023-12-14 22:18:34 +02:00
DonMoralez
6f16ccf01f Merge branch 'staging' of https://github.com/DonMoralez/SillyTavern into staging 2023-12-14 20:17:41 +02:00
Cohee
6bb894286e Migrate palm source to makersuite 2023-12-14 19:54:31 +02:00
based
5071b9a369 webstorm moment 2023-12-15 02:01:42 +10:00
based
60880cfd4d merge 2023-12-15 01:39:12 +10:00
based
698850b514 Merge remote-tracking branch 'fork/staging' into gemini
# Conflicts:
#	server.js
#	src/endpoints/prompt-converters.js
#	src/endpoints/tokenizers.js
2023-12-15 01:35:17 +10:00
based
d5bcd96eef message inlining vision support 2023-12-15 01:28:54 +10:00
based
0b7c1a98cd added google vision caption support 2023-12-14 22:37:53 +10:00
based
ca87f29771 added streaming for google models 2023-12-14 21:03:41 +10:00
based
3e82a7d439 tokenizer changes and fixes. + a toggle 2023-12-14 16:31:08 +10:00
based
e26159c00d refactor and rework palm request to work with the 'content' format and add an endpoint for Google's tokenizer 2023-12-14 15:49:50 +10:00
based
be396991de finish implementing ui changes for google models 2023-12-14 11:53:26 +10:00
based
69e24c9686 change palm naming in UI 2023-12-14 11:14:41 +10:00
valadaptive
22e048b5af Rename generate_altscale endpoint 2023-12-13 18:53:46 -05:00
valadaptive
92bd766bcb Rename chat completions endpoints
OpenAI calls this the "Chat Completions API", in contrast to their
previous "Text Completions API", so that's what I'm naming it; both
because other services besides OpenAI implement it, and to avoid
confusion with the existing /api/openai route used for OpenAI extras.
2023-12-13 18:52:08 -05:00
DonMoralez
fec27820ff (claude) reworked prefix assignment, sysprompt mode, console message display 2023-12-13 21:19:26 +02:00
Cohee
9160de7714 Run macros on impersonation prompt 2023-12-12 19:24:32 +02:00
Cohee
9176f46caf Add /preset command 2023-12-12 19:14:17 +02:00
Cohee
b0e7b73a32 Fix streaming processor error handler hooks 2023-12-08 02:01:08 +02:00
valadaptive
5569a63595 Remove legacy_streaming setting
This was a workaround for older versions of Slaude that implemented SSE
improperly. This was fixed in Slaude 7 months ago, so the workaround can
be removed.
2023-12-07 18:00:36 -05:00
valadaptive
cdcd913805 Don't stream events if the API returned a 4xx code 2023-12-07 18:00:36 -05:00
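The fix above stops the server from treating an error body as an event stream. A minimal sketch of the idea (the function name and shape are illustrative assumptions, not the repository's actual code):

```javascript
// Hypothetical guard: only parse the body as a server-sent events stream
// when the upstream status is not an error. 4xx/5xx responses carry JSON
// error payloads, and forwarding them as SSE hides the real error.
function shouldStreamResponse(status) {
    return status >= 200 && status < 400;
}
```

A caller would read `response.text()` and report the error when this returns false, instead of attaching a stream parser.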
valadaptive
5540c165cf Refactor server-sent events parsing
Create one server-sent events stream class which implements the entire
spec (different line endings, chunking, etc) and use it in all the
streaming generators.
2023-12-07 18:00:36 -05:00
Cohee
72adb4c8aa Fix window.ai streaming 2023-12-07 17:42:06 +02:00
Cohee
671df1f62e Fix constant usage 2023-12-04 00:24:23 +02:00
valadaptive
e33c8bd955 Replace use_[source] with chat_completion_source
Same as the is_[api] replacement: it's easier to have one enum field
than several mutually exclusive boolean ones.
2023-12-03 15:03:39 -05:00
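The rationale above (one enum field instead of several mutually exclusive booleans) can be sketched as follows; the object shapes and helper are illustrative, not the repository's exact code:

```javascript
// Before: one boolean per source. Nothing prevents two flags being true
// at once, and every new source adds another flag to keep in sync.
const settingsBefore = { use_openai: true, use_claude: false, use_palm: false };

// After: a single enum-style field makes contradictory states unrepresentable.
const chat_completion_sources = {
    OPENAI: 'openai',
    CLAUDE: 'claude',
    MAKERSUITE: 'makersuite',
};
const settingsAfter = { chat_completion_source: chat_completion_sources.OPENAI };

// Checks reduce to a single comparison instead of flag bookkeeping.
function isClaude(settings) {
    return settings.chat_completion_source === chat_completion_sources.CLAUDE;
}
```

Adding a new backend then means adding one enum value rather than a new boolean and the code to keep it exclusive with the others.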
Cohee
8a1ead531c
Merge pull request #1439 from valadaptive/prompt-manager-class
Convert PromptManagerModule to a class
2023-12-03 21:52:27 +02:00
Cohee
1786b0d340 #1403 Add Aphrodite multi-swipe 2023-12-03 20:40:09 +02:00
valadaptive
b8b24540a9 Rename PromptManagerModule to PromptManager
The one place where it was imported renamed it to PromptManager anyway.
2023-12-03 12:14:56 -05:00
Cohee
a3bc51bcea Fix type-in max context for OAI 2023-12-03 13:56:22 +02:00
Cohee
64a3564892 lint: Comma dangle 2023-12-02 22:06:57 +02:00
Cohee
c63cd87cc0 lint: Require semicolons 2023-12-02 21:11:06 +02:00
valadaptive
a37f874e38 Require single quotes 2023-12-02 13:04:51 -05:00
valadaptive
518bb58d5a Enable no-unused-vars lint
This is the big one. Probably needs thorough review to make sure I
didn't accidentally remove any setInterval or fetch calls.
2023-12-02 12:11:19 -05:00
valadaptive
c893e2165e Enable no-prototype-builtins lint 2023-12-02 12:10:31 -05:00