Cohee
747a7824c0
OpenRouter model dropdown facelift
2024-01-11 20:27:59 +02:00
Cohee
9b24e7dc67
Merge pull request #1596 from DonMoralez/staging
...
added exclude prefixes, modified sequence checker
2024-01-01 23:33:58 +02:00
Cohee
9106696f2f
Render prompt manager when switching APIs
2024-01-01 17:06:10 +02:00
based
42aa7fd316
mistral proxy support
2023-12-31 06:21:40 +10:00
valadaptive
0d3505c44b
Remove OAI_BEFORE_CHATCOMPLETION
...
Not used in any internal code or extensions I can find.
2023-12-25 03:48:49 -05:00
DonMoralez
a8e5285ff7
Merge remote-tracking branch 'upstream/staging' into staging
2023-12-25 01:19:30 +02:00
Cohee
f8dece9d88
Always remove logit bias and stop from vision
2023-12-24 20:01:59 +02:00
DonMoralez
6fb69d5929
Merge remote-tracking branch 'upstream/staging' into staging
2023-12-23 00:25:57 +02:00
Cohee
89d70539b9
Alternative continue method for chat completions
2023-12-22 20:24:54 +02:00
DonMoralez
e95482aea1
Merge remote-tracking branch 'upstream/staging' into staging
2023-12-22 17:12:59 +02:00
DonMoralez
ee06a488b0
Add exclude prefixes checkbox, modified sequence checker
2023-12-22 17:04:58 +02:00
Cohee
a85a6cf606
Allow displaying unreferenced macro in message texts
2023-12-21 20:49:03 +02:00
DonMoralez
1c9643806b
Merge remote-tracking branch 'upstream/staging' into staging
2023-12-21 17:30:37 +02:00
Cohee
b5e59c819c
Merge branch 'staging' into claude-rework
2023-12-21 16:52:43 +02:00
Cohee
3001db3a47
Add additional parameters for custom endpoints
2023-12-20 23:39:10 +02:00
Cohee
ae64c99835
Add custom caption source
2023-12-20 21:05:20 +02:00
Cohee
5734dbd17c
Add custom endpoint type
2023-12-20 18:29:03 +02:00
DonMoralez
50ece13752
Add restore button, def hum message, claude check
2023-12-18 02:25:17 +02:00
DonMoralez
7835a1360f
Merge remote-tracking branch 'upstream/staging' into staging
2023-12-17 19:46:47 +02:00
based
ed96ec5c3e
reverse proxy condition fix
2023-12-16 12:02:34 +10:00
DonMoralez
6b59014892
(Fix) "squash sys. messages" processed empty messages, adding \n
2023-12-16 00:24:48 +02:00
based
583f786d74
finish mistral frontend integration + apikey status check
2023-12-16 07:15:57 +10:00
based
041957975a
add mistral completion source to UI
2023-12-16 06:08:41 +10:00
DonMoralez
10fb83ee53
Merge remote-tracking branch 'upstream/staging' into staging
2023-12-15 13:12:15 +02:00
Cohee
cde9903fcb
Fix Bison models
2023-12-14 22:18:34 +02:00
DonMoralez
6f16ccf01f
Merge branch 'staging' of https://github.com/DonMoralez/SillyTavern into staging
2023-12-14 20:17:41 +02:00
Cohee
6bb894286e
Migrate palm source to makersuite
2023-12-14 19:54:31 +02:00
based
5071b9a369
webstorm moment
2023-12-15 02:01:42 +10:00
based
60880cfd4d
merge
2023-12-15 01:39:12 +10:00
based
698850b514
Merge remote-tracking branch 'fork/staging' into gemini
...
# Conflicts:
# server.js
# src/endpoints/prompt-converters.js
# src/endpoints/tokenizers.js
2023-12-15 01:35:17 +10:00
based
d5bcd96eef
message inlining vision support
2023-12-15 01:28:54 +10:00
based
0b7c1a98cd
added google vision caption support
2023-12-14 22:37:53 +10:00
based
ca87f29771
added streaming for google models
2023-12-14 21:03:41 +10:00
based
3e82a7d439
tokenizer changes and fixes. + a toggle
2023-12-14 16:31:08 +10:00
based
e26159c00d
refactor and rework palm request to work with the 'content' format and added an endpoint for googles tokenizer
2023-12-14 15:49:50 +10:00
based
be396991de
finish implementing ui changes for google models
2023-12-14 11:53:26 +10:00
based
69e24c9686
change palm naming in UI
2023-12-14 11:14:41 +10:00
valadaptive
22e048b5af
Rename generate_altscale endpoint
2023-12-13 18:53:46 -05:00
valadaptive
92bd766bcb
Rename chat completions endpoints
...
OpenAI calls this the "Chat Completions API", in contrast to their
previous "Text Completions API", so that's what I'm naming it; both
because other services besides OpenAI implement it, and to avoid
confusion with the existing /api/openai route used for OpenAI extras.
2023-12-13 18:52:08 -05:00
DonMoralez
fec27820ff
(claude) reworked prefix assignment, sysprompt mode, console message display
2023-12-13 21:19:26 +02:00
Cohee
9160de7714
Run macros on impersonation prompt
2023-12-12 19:24:32 +02:00
Cohee
9176f46caf
Add /preset command
2023-12-12 19:14:17 +02:00
Cohee
b0e7b73a32
Fix streaming processor error handler hooks
2023-12-08 02:01:08 +02:00
valadaptive
5569a63595
Remove legacy_streaming setting
...
This was a workaround for older versions of Slaude that implemented SSE
improperly. This was fixed in Slaude 7 months ago, so the workaround can
be removed.
2023-12-07 18:00:36 -05:00
valadaptive
cdcd913805
Don't stream events if the API returned a 4xx code
2023-12-07 18:00:36 -05:00
valadaptive
5540c165cf
Refactor server-sent events parsing
...
Create one server-sent events stream class which implements the entire
spec (different line endings, chunking, etc) and use it in all the
streaming generators.
2023-12-07 18:00:36 -05:00
Cohee
72adb4c8aa
Fix window.ai streaming
2023-12-07 17:42:06 +02:00
Cohee
671df1f62e
Fix constant usage
2023-12-04 00:24:23 +02:00
valadaptive
e33c8bd955
Replace use_[source] with chat_completion_source
...
Same as the is_[api] replacement--it's easier to have one enum field
than several mutually-exclusive boolean ones
2023-12-03 15:03:39 -05:00
Cohee
8a1ead531c
Merge pull request #1439 from valadaptive/prompt-manager-class
...
Convert PromptManagerModule to a class
2023-12-03 21:52:27 +02:00
Cohee
1786b0d340
#1403 Add Aphrodite multi-swipe
2023-12-03 20:40:09 +02:00
valadaptive
b8b24540a9
Rename PromptManagerModule to PromptManager
...
The one place where it was imported renamed it to PromptManager anyway.
2023-12-03 12:14:56 -05:00
Cohee
a3bc51bcea
Fix type-in max context for OAI
2023-12-03 13:56:22 +02:00
Cohee
64a3564892
lint: Comma dangle
2023-12-02 22:06:57 +02:00
Cohee
c63cd87cc0
lint: Require semicolons
2023-12-02 21:11:06 +02:00
valadaptive
a37f874e38
Require single quotes
2023-12-02 13:04:51 -05:00
valadaptive
518bb58d5a
Enable no-unused-vars lint
...
This is the big one. Probably needs thorough review to make sure I
didn't accidentally remove any setInterval or fetch calls.
2023-12-02 12:11:19 -05:00
valadaptive
c893e2165e
Enable no-prototype-builtins lint
2023-12-02 12:10:31 -05:00
valadaptive
0a27275772
Enable no-extra-semi lint
2023-12-02 10:32:26 -05:00
valadaptive
367f3dba27
Enable no-unsafe-finally lint
2023-12-02 10:32:07 -05:00
Cohee
19c6370fa5
Revert preset checkbox update logic
2023-12-01 11:55:05 +02:00
Cohee
b96054f337
Update max token limit for palm2
2023-11-30 19:02:31 +02:00
Cohee
e9ad55aef2
Add seed input field for OpenAI settings #1412
2023-11-30 02:54:52 +02:00
Cohee
d263760b25
#1393 Configurable group nudges, scenario and personality templates for prompt manager
2023-11-27 23:57:56 +02:00
Cohee
61908935f5
Stop string for user-continue. Trim spaces after name2
2023-11-22 16:16:48 +02:00
Cohee
5f77b2f816
Add Claude 2.1
2023-11-21 20:07:37 +02:00
Cohee
73e081dd99
Don't use global state to build Chat Completion prompts
2023-11-21 14:38:15 +02:00
Cohee
0608c0afac
Add OpenRouter and Llava to captioning plugin.
2023-11-17 23:19:21 +02:00
Cohee
323b338cdd
Add images to quiet prompts if inlining enabled
2023-11-17 01:30:32 +02:00
Cohee
d114ebf6fa
Add default role for Message class if not set.
2023-11-16 16:20:33 +02:00
Cohee
314aca3f2c
Allow disabling system marker prompts
2023-11-14 22:27:07 +02:00
Cohee
d3e5f6ebc0
#1343 Move bypass check up
2023-11-12 23:08:24 +02:00
Cohee
9a1d1594d6
Fix formatting in openai.js
2023-11-12 22:14:35 +02:00
artisticMink
cc0b4e8174
Access oai_settings instead of dom
2023-11-12 20:55:29 +01:00
artisticMink
3bbbf0d8e4
Put openrouter model sorting in drawer
...
Renames 'Infinity'k tokens to 'Free'
2023-11-12 19:02:41 +01:00
artisticMink
cb2644cdea
Add sorting for openrouter models
...
Alphabetically (default), price, context size
2023-11-12 18:27:56 +01:00
artisticMink
a16e34bcef
Add optional toggle for grouping openrouter models
...
By vendor
2023-11-12 15:15:30 +01:00
Cohee
7afe9e6481
#1343 Add status check bypass
2023-11-12 13:23:46 +02:00
Cohee
4c0b3fb7ae
Add null checks for OR pricing
2023-11-12 13:07:57 +02:00
Cohee
879502c1e7
Only allow inlining if OAI is the selected API
2023-11-12 00:13:30 +02:00
Cohee
2c4f53e7b5
Add native GPT-4V image inlining
2023-11-12 00:09:48 +02:00
Cohee
2f5e7778cc
Don't add items of unknown type to chat completion
2023-11-10 01:08:18 +02:00
Cohee
0e89bf90bc
Use correct tokenizers for logit bias for Mistral and Llama models over OpenRouter
2023-11-09 01:03:54 +02:00
Cohee
d81354e2a5
Merge branch 'staging' of https://github.com/SillyTavern/SillyTavern into staging
2023-11-08 16:29:02 +02:00
Cohee
dbf995fd24
Add character card macros
2023-11-08 16:28:55 +02:00
RossAscends
a5fd33d08a
Kobold sampler restyle
2023-11-08 23:24:28 +09:00
Cohee
740f6548a2
Increase timeouts of OAI out of quota requests
2023-11-08 12:07:14 +02:00
Cohee
b2629d9718
Refactor status checks and indicators.
2023-11-08 02:52:03 +02:00
Cohee
2020d12217
Add new GPT 3.5 turbo model
2023-11-07 00:10:32 +02:00
Cohee
9b0ac48cda
Add GPT-4 preview model
2023-11-06 23:29:45 +02:00
Cohee
21e0a42060
Fix arch in models list, remove log
2023-11-05 22:03:20 +02:00
Cohee
c1e126985d
Merge branch 'staging' of https://github.com/SillyTavern/SillyTavern into staging
2023-11-05 21:54:23 +02:00
Cohee
fedc3b887f
Add llama2 tokenizer for OpenRouter models
2023-11-05 21:54:19 +02:00
Cohee
88df8501b3
Fix continue on forced OR instruct. Display proper itemized prompt
2023-11-05 02:20:15 +02:00
Cohee
f10833a516
Add prompt format override for OpenRouter
2023-11-03 00:34:22 +02:00
RossAscends
d50124e937
appwide slider overhaul
2023-10-26 13:20:47 +09:00
Cohee
d0637750e7
Add system message collapse for chat comps
2023-10-14 22:05:09 +03:00
Cohee
84098ae933
Fix injection order (again)
2023-10-11 22:56:17 +03:00
Cohee
e2f0162e5a
Fix injection order
2023-10-11 22:42:25 +03:00
Cohee
59ae661f62
Fix itemization viewer
2023-10-11 22:09:24 +03:00
Cohee
abb78d1d6b
Add at-depth position for custom Prompt Manager prompts
2023-10-11 16:03:36 +03:00
Cohee
82182015e2
Allow group nudge in chat completions.
2023-10-02 00:24:16 +03:00
Brian Dashore
bfda5a5492
Extra fixes (#1185)
2023-09-26 09:53:04 +03:00
Cohee
03e5ca054d
Limit number of custom stop strings for Palm API
2023-09-25 23:12:14 +03:00
Cohee
3e29d39f05
Fix Palm when streaming is enabled
2023-09-25 20:24:56 +03:00
Cohee
edb79d8c53
Synchronize max depths for plugins
2023-09-25 19:29:24 +03:00
Cohee
a081f78bd8
(WIP) PaLM API
2023-09-23 20:48:56 +03:00
Cohee
902acc44a2
Support "before main prompt" extension position in prompt manager
2023-09-21 20:46:08 +03:00
Maks
158aa79aed
add model gpt-3.5-turbo-instruct and 0914 variant (#1154)
2023-09-19 23:50:27 +03:00
Cohee
ab9aa28fe4
Move missed endpoints
2023-09-16 18:03:31 +03:00
Cohee
61995bb33f
Move preset management into a separate file
2023-09-16 17:36:54 +03:00
Cohee
eaca6ddaf0
Don't try to resolve unknown tiktoken models
2023-09-15 19:31:17 +03:00
Jason Wu
7a3869c476
Enable Smart Context (ChromaDB) support within OpenAI API (#1125)
...
* Add JetBrains IDE .idea folder to .gitignore
* Enable Smart Context (ChromaDB) support within OpenAI API
2023-09-13 13:01:56 +03:00
Cohee
65b4551864
Reserve 3 extra tokens for each chat completion
2023-09-11 17:22:31 +03:00
Cohee
a5acc7872d
Add OpenAI vector source.
2023-09-08 13:57:27 +03:00
Cohee
96df705409
Change insertion strategy to an extension block
2023-09-08 01:26:26 +03:00
Cohee
5ef79bd64d
Remove NSFW avoidance prompt from Prompt Manager
2023-09-05 18:14:56 +03:00
Cohee
67c8476cdf
Set 0 tokens for prompts with no content
2023-09-04 02:40:16 +03:00
Cohee
80e286fed2
Fix double insertion of persona description to prompt manager if position set to A/N
2023-09-04 02:26:15 +03:00
Cohee
4a6705cea8
Prompt manager configuration fixes (#1078)
...
* Refactor oai preset change event into before and after
* Simplify and reinforce prompt manager render without character
* Check if main prompt exists before adding nsfwAvoidance
* Sanitize prompt manager configuration on preset loading
---------
Co-authored-by: maver <kentucky@posteo.de>
2023-09-01 23:23:03 +03:00
Cohee
636c06ffdd
Autosize prompt manager boxes
2023-08-27 21:28:13 +03:00
Cohee
e74bca88f4
Disable OpenRouter fallback by default
2023-08-27 18:39:04 +03:00
Cohee
44661d0e2b
Merge branch 'staging' into generate-array
2023-08-26 01:07:19 +03:00
Cohee
42e6da4a36
Add support of stop strings to OpenAI / Claude
2023-08-26 00:12:11 +03:00
Cohee
12f1cdb3fd
#1020 Fix summarize + prompt manager. Clarify naming for insertion position.
2023-08-25 20:03:31 +03:00
Cohee
685e9b7e79
Stabilize extension injections order for prompt manager
2023-08-25 17:15:55 +03:00
Cohee
aceb32cfe9
Fix freq pen overwrite by pres pen
2023-08-25 02:52:38 +03:00
Cohee
4aa31fcba9
Add fallback option for OpenRouter
2023-08-24 03:21:17 +03:00
Cohee
52c2fcd407
Fix OpenRouter model not persisting on page reload
2023-08-24 00:59:57 +03:00
maver
65e595ad48
Increase prompt order dummy user id by 1
2023-08-23 20:41:13 +02:00
maver
5a02250a1f
Add persona description to prompt manager order
2023-08-23 20:40:26 +02:00
Cohee
e77da62b85
Add padding to cache key. Fix Safari display issues. Fix 400 on empty translate. Reset bias cache on changing model.
2023-08-23 10:32:48 +03:00
Cohee
b385bd190a
Less strict rules for logit bias files import
2023-08-23 03:41:58 +03:00
Cohee
f633f62065
Don't save null values to OpenAI logit bias
2023-08-23 03:36:04 +03:00
Cohee
bc5fc67906
Put tokenizer functions to a separate file. Cache local models token counts
2023-08-23 02:38:43 +03:00
Cohee
7250770c5d
Don't reduce Claude token counts by 2
2023-08-22 23:20:53 +03:00
Cohee
41cc86af9f
Add example extension for chat variables. Allow registering custom text processing functions for extensions
2023-08-22 17:46:37 +03:00
based
3716fd51ef
add example names to initial system prompt
2023-08-22 22:29:57 +10:00
based
ba925f388c
added more options to the scale request
2023-08-22 21:29:18 +10:00
based
06902279a9
merge
2023-08-22 21:17:18 +10:00
Cohee
57b126bfbf
Save chat completions settings to an object. Update numeric setting types
2023-08-22 00:35:46 +03:00
maver
cb5b410daf
Fix group nudge causing error
2023-08-20 20:18:10 +02:00
Cohee
be6fedd626
Simplify Claude prefill code
2023-08-20 19:26:49 +03:00
Cohee
a27bef8b12
Merge branch 'staging' into qolfeatures
2023-08-20 18:47:43 +03:00
Cohee
c2c0007ad1
Merge pull request #982 from SillyTavern/prompt-manager-hotfix
...
Prompt manager hotfix
2023-08-20 18:41:09 +03:00
maver
07c24f363f
Render prompt manager before a character is selected
...
When prompt order strategy is global
2023-08-20 16:28:42 +02:00
maver
58ab266365
Make sure new example chat is not added without messages
2023-08-20 15:53:42 +02:00
maver
5fee1f6f96
Add group nudge to prompts array
2023-08-20 15:35:15 +02:00
maver
58a018deae
Check if newExample can be afforded before adding it
2023-08-20 15:35:01 +02:00
Cohee
efa0f12349
Fix prompt manager issues
2023-08-20 16:25:16 +03:00