Commit Graph

406 Commits

Author SHA1 Message Date
Cohee 965bb54f7d Option to add names to completion contents 2024-03-19 21:53:40 +02:00
Cohee 4a5c1a5ac8 Remove vision model restrictions from OpenRouter. 2024-03-19 20:48:49 +02:00
Cohee 46993384a3 Allow any model to send inline images in OpenAI custom endpoint mode 2024-03-14 00:33:04 +02:00
Cohee e24fbfdc1d Update default OAI sampler parameters 2024-03-13 02:25:20 +02:00
Cohee 15eb18740e Pass char/user names for Claude example messages converter 2024-03-08 08:31:36 +02:00
Cohee 00a4a12d7d Remove "Exclude Assistant suffix" option 2024-03-05 20:41:53 +02:00
based 3e1d44fc29 remove deprecated option + reverse proxy support in claude vision 2024-03-06 02:47:00 +10:00
based de0e0dad27 he forgor oop 2024-03-05 15:33:07 +10:00
based 8d9175f3f2 update default settings preset 2024-03-05 15:22:16 +10:00
based 64d9c9dc5d anthropic captioning 2024-03-05 07:07:38 +10:00
based 7bb8741cfa fix older model naming for the new api 2024-03-05 05:50:29 +10:00
based 04bb882e90 implement messages api and selector for nuclaude models 2024-03-05 04:40:19 +10:00
based 149a65cf62 migrate model name in old presets to new naming scheme 2024-02-27 02:23:07 +10:00
Cohee c9f0d61f19 #1851 Substitute macros in new example chat 2024-02-22 18:45:50 +02:00
Cohee ece3b2a7c1 Fix Chat Completions status check on settings loading if another API is selected 2024-02-22 04:36:06 +02:00
Cohee 2e00a1baaf [FEATURE_REQUEST] Can the unlocked max context size for OpenAI completion be increased from 102k to 200k for example? #1842 2024-02-19 19:37:18 +02:00
Cohee 9d713825c2 #1827 Consolidate {{group}} macro behavior 2024-02-12 16:23:01 +02:00
Cohee 33d93b9761 #1813 Fix squash system messages 2024-02-11 15:56:48 +02:00
Cohee 18f84979f2 Use SSE streaming for MakerSuite 2024-02-10 02:43:50 +02:00
Cohee 90231680a9 Remove extra space 2024-02-08 00:05:23 +02:00
Cohee dfc1719c3f Use fuzzy name matching 2024-02-08 00:04:48 +02:00
EX3-0 b2eb361028
Update openai.js: added /proxy command
Added a "proxy" slash command to openai.js to switch between proxy presets via STscript.
2024-02-07 13:52:48 -05:00
anon 634c9aad3b add logprobs support for custom OpenAI APIs 2024-02-04 23:11:45 +00:00
Cohee 7ac6ed267f #1782 OpenAI multiswipe 2024-02-04 03:36:37 +02:00
Cohee da7b435b7c
Merge pull request #1751 from kingbased/proxypreset
Reverse proxy presets
2024-01-29 22:09:33 +02:00
based ee7ee9f60a merged mistral proxy support 2024-01-27 06:26:23 +10:00
based aa976d0de2 implemented proxy preset manager 2024-01-27 06:21:00 +10:00
Cohee 4d534e3042 [BUG] Incorrect context size for gpt-4-turbo-0125 #1748 2024-01-26 18:51:20 +02:00
Cohee 1647e5ae49
Merge pull request #1734 from khanonnie/alternative-tokens
Implement Token Probabilities UI panel using logprobs
2024-01-26 03:39:25 +02:00
khanon 60044c18a4 Implement Token Probabilities UI using logprobs 2024-01-25 18:34:46 -06:00
Cohee 40476dca3b New OAI models 2024-01-25 22:01:02 +02:00
Cohee 515e3859ec
Merge pull request #1689 from h-a-s-k/staging
Group chat fixes
2024-01-25 20:51:55 +02:00
Cohee 107fe85543 Add OpenRouter filtered reason display 2024-01-23 00:10:53 +02:00
Cohee 0b322c0e3d Add repetition penalty control for OpenRouter 2024-01-18 23:55:09 +02:00
h-a-s-k 074cc13e60 Fix group chat example messages not including character name 2024-01-13 14:34:17 -03:00
h-a-s-k 9354697753 Actually call them example chats 2024-01-13 13:06:51 -03:00
Cohee f4c7fff8c0 Mistral API got fixed 2024-01-12 18:17:43 +02:00
Cohee e33ac6a78a Add min_p and top_a for OpenRouter 2024-01-12 17:15:13 +02:00
Cohee 747a7824c0 OpenRouter model dropdown facelift 2024-01-11 20:27:59 +02:00
Cohee 9b24e7dc67
Merge pull request #1596 from DonMoralez/staging
added exclude prefixes, modified sequence checker
2024-01-01 23:33:58 +02:00
Cohee 9106696f2f Render prompt manager when switching APIs 2024-01-01 17:06:10 +02:00
based 42aa7fd316 mistral proxy support 2023-12-31 06:21:40 +10:00
valadaptive 0d3505c44b Remove OAI_BEFORE_CHATCOMPLETION
Not used in any internal code or extensions I can find.
2023-12-25 03:48:49 -05:00
DonMoralez a8e5285ff7 Merge remote-tracking branch 'upstream/staging' into staging 2023-12-25 01:19:30 +02:00
Cohee f8dece9d88 Always remove logit bias and stop from vision 2023-12-24 20:01:59 +02:00
DonMoralez 6fb69d5929 Merge remote-tracking branch 'upstream/staging' into staging 2023-12-23 00:25:57 +02:00
Cohee 89d70539b9 Alternative continue method for chat completions 2023-12-22 20:24:54 +02:00
DonMoralez e95482aea1 Merge remote-tracking branch 'upstream/staging' into staging 2023-12-22 17:12:59 +02:00
DonMoralez ee06a488b0 Add exclude prefixes checkbox, modified sequence checker 2023-12-22 17:04:58 +02:00
Cohee a85a6cf606 Allow displaying unreferenced macro in message texts 2023-12-21 20:49:03 +02:00
DonMoralez 1c9643806b Merge remote-tracking branch 'upstream/staging' into staging 2023-12-21 17:30:37 +02:00
Cohee b5e59c819c Merge branch 'staging' into claude-rework 2023-12-21 16:52:43 +02:00
Cohee 3001db3a47 Add additional parameters for custom endpoints 2023-12-20 23:39:10 +02:00
Cohee ae64c99835 Add custom caption source 2023-12-20 21:05:20 +02:00
Cohee 5734dbd17c Add custom endpoint type 2023-12-20 18:29:03 +02:00
DonMoralez 50ece13752 Add restore button, default human message, Claude check 2023-12-18 02:25:17 +02:00
DonMoralez 7835a1360f Merge remote-tracking branch 'upstream/staging' into staging 2023-12-17 19:46:47 +02:00
based ed96ec5c3e reverse proxy condition fix 2023-12-16 12:02:34 +10:00
DonMoralez 6b59014892 (Fix) "squash sys. messages" processed empty messages, adding \n 2023-12-16 00:24:48 +02:00
based 583f786d74 finish mistral frontend integration + apikey status check 2023-12-16 07:15:57 +10:00
based 041957975a add mistral completion source to UI 2023-12-16 06:08:41 +10:00
DonMoralez 10fb83ee53 Merge remote-tracking branch 'upstream/staging' into staging 2023-12-15 13:12:15 +02:00
Cohee cde9903fcb Fix Bison models 2023-12-14 22:18:34 +02:00
DonMoralez 6f16ccf01f Merge branch 'staging' of https://github.com/DonMoralez/SillyTavern into staging 2023-12-14 20:17:41 +02:00
Cohee 6bb894286e Migrate palm source to makersuite 2023-12-14 19:54:31 +02:00
based 5071b9a369 webstorm moment 2023-12-15 02:01:42 +10:00
based 60880cfd4d merge 2023-12-15 01:39:12 +10:00
based 698850b514 Merge remote-tracking branch 'fork/staging' into gemini
# Conflicts:
#	server.js
#	src/endpoints/prompt-converters.js
#	src/endpoints/tokenizers.js
2023-12-15 01:35:17 +10:00
based d5bcd96eef message inlining vision support 2023-12-15 01:28:54 +10:00
based 0b7c1a98cd added google vision caption support 2023-12-14 22:37:53 +10:00
based ca87f29771 added streaming for google models 2023-12-14 21:03:41 +10:00
based 3e82a7d439 tokenizer changes and fixes, plus a toggle 2023-12-14 16:31:08 +10:00
based e26159c00d refactor and rework the PaLM request to work with the 'content' format and add an endpoint for Google's tokenizer 2023-12-14 15:49:50 +10:00
based be396991de finish implementing ui changes for google models 2023-12-14 11:53:26 +10:00
based 69e24c9686 change palm naming in UI 2023-12-14 11:14:41 +10:00
valadaptive 22e048b5af Rename generate_altscale endpoint 2023-12-13 18:53:46 -05:00
valadaptive 92bd766bcb Rename chat completions endpoints
OpenAI calls this the "Chat Completions API", in contrast to their
previous "Text Completions API", so that's what I'm naming it; both
because other services besides OpenAI implement it, and to avoid
confusion with the existing /api/openai route used for OpenAI extras.
2023-12-13 18:52:08 -05:00
DonMoralez fec27820ff (claude) reworked prefix assignment, sysprompt mode, console message display 2023-12-13 21:19:26 +02:00
Cohee 9160de7714 Run macros on impersonation prompt 2023-12-12 19:24:32 +02:00
Cohee 9176f46caf Add /preset command 2023-12-12 19:14:17 +02:00
Cohee b0e7b73a32 Fix streaming processor error handler hooks 2023-12-08 02:01:08 +02:00
valadaptive 5569a63595 Remove legacy_streaming setting
This was a workaround for older versions of Slaude that implemented SSE
improperly. This was fixed in Slaude 7 months ago, so the workaround can
be removed.
2023-12-07 18:00:36 -05:00
valadaptive cdcd913805 Don't stream events if the API returned a 4xx code 2023-12-07 18:00:36 -05:00
valadaptive 5540c165cf Refactor server-sent events parsing
Create one server-sent events stream class which implements the entire
spec (different line endings, chunking, etc) and use it in all the
streaming generators.
2023-12-07 18:00:36 -05:00
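
The commit above (5540c165cf) describes consolidating streaming into one server-sent events stream class that handles the whole spec, including mixed line endings and events split across network chunks. Purely as an illustrative sketch of that kind of parser, not the repository's actual class (all names here are hypothetical), it might look like:

```js
// Minimal SSE parser sketch: tolerates \n, \r\n and \r line endings and
// events that arrive split across network chunks.
class SimpleSseParser {
    constructor() {
        this.buffer = '';
    }

    /**
     * Feed one decoded network chunk; returns the "data:" payloads of every complete event.
     * @param {string} chunk Decoded text from the response stream.
     * @returns {string[]}
     */
    feed(chunk) {
        this.buffer += chunk;

        // Hold back a trailing '\r' in case a '\r\n' pair is split across chunks.
        let pendingCr = '';
        if (this.buffer.endsWith('\r')) {
            pendingCr = '\r';
            this.buffer = this.buffer.slice(0, -1);
        }

        // The spec allows \r\n, \n or \r as line endings; normalize before splitting.
        const normalized = this.buffer.replace(/\r\n|\r/g, '\n');
        // Events are separated by a blank line; the last element is still incomplete.
        const events = normalized.split('\n\n');
        this.buffer = events.pop() + pendingCr;

        return events
            .map(event => event
                .split('\n')
                .filter(line => line.startsWith('data:'))
                .map(line => line.slice('data:'.length).replace(/^ /, ''))
                .join('\n'))
            .filter(data => data.length > 0);
    }
}

// Usage: feed each decoded chunk from the response stream and handle the payloads.
// const parser = new SimpleSseParser();
// for (const data of parser.feed(decodedChunk)) {
//     if (data === '[DONE]') break;
//     console.log(JSON.parse(data));
// }
```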
Cohee 72adb4c8aa Fix window.ai streaming 2023-12-07 17:42:06 +02:00
Cohee 671df1f62e Fix constant usage 2023-12-04 00:24:23 +02:00
valadaptive e33c8bd955 Replace use_[source] with chat_completion_source
Same as the is_[api] replacement: it's easier to have one enum field
than several mutually exclusive boolean ones.
2023-12-03 15:03:39 -05:00
Cohee 8a1ead531c
Merge pull request #1439 from valadaptive/prompt-manager-class
Convert PromptManagerModule to a class
2023-12-03 21:52:27 +02:00
Cohee 1786b0d340 #1403 Add Aphrodite multi-swipe 2023-12-03 20:40:09 +02:00
valadaptive b8b24540a9 Rename PromptManagerModule to PromptManager
The one place where it was imported renamed it to PromptManager anyway.
2023-12-03 12:14:56 -05:00
Cohee a3bc51bcea Fix type-in max context for OAI 2023-12-03 13:56:22 +02:00
Cohee 64a3564892 lint: Comma dangle 2023-12-02 22:06:57 +02:00
Cohee c63cd87cc0 lint: Require semicolons 2023-12-02 21:11:06 +02:00
valadaptive a37f874e38 Require single quotes 2023-12-02 13:04:51 -05:00
valadaptive 518bb58d5a Enable no-unused-vars lint
This is the big one. Probably needs thorough review to make sure I
didn't accidentally remove any setInterval or fetch calls.
2023-12-02 12:11:19 -05:00
valadaptive c893e2165e Enable no-prototype-builtins lint 2023-12-02 12:10:31 -05:00
valadaptive 0a27275772 Enable no-extra-semi lint 2023-12-02 10:32:26 -05:00
valadaptive 367f3dba27 Enable no-unsafe-finally lint 2023-12-02 10:32:07 -05:00
Cohee 19c6370fa5 Revert preset checkbox update logic 2023-12-01 11:55:05 +02:00
Cohee b96054f337 Update max token limit for palm2 2023-11-30 19:02:31 +02:00