DonMoralez
10fb83ee53
Merge remote-tracking branch 'upstream/staging' into staging
2023-12-15 13:12:15 +02:00
Cohee
cde9903fcb
Fix Bison models
2023-12-14 22:18:34 +02:00
DonMoralez
6f16ccf01f
Merge branch 'staging' of https://github.com/DonMoralez/SillyTavern into staging
2023-12-14 20:17:41 +02:00
Cohee
6bb894286e
Migrate palm source to makersuite
2023-12-14 19:54:31 +02:00
based
5071b9a369
webstorm moment
2023-12-15 02:01:42 +10:00
based
60880cfd4d
merge
2023-12-15 01:39:12 +10:00
based
698850b514
Merge remote-tracking branch 'fork/staging' into gemini
...
# Conflicts:
# server.js
# src/endpoints/prompt-converters.js
# src/endpoints/tokenizers.js
2023-12-15 01:35:17 +10:00
based
d5bcd96eef
Message inlining vision support
2023-12-15 01:28:54 +10:00
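For context on the inlining approach above, here is a minimal sketch of attaching a base64-encoded image to a user turn in a MakerSuite/Gemini generateContent payload. The snake_case field names follow the REST convention and are an assumption about the wire format, not this commit's code.

```js
// Sketch only: build a vision-capable user turn for a generateContent payload.
// REST uses snake_case (inline_data/mime_type); the official SDK uses camelCase.
function buildVisionTurn(promptText, base64Image, mimeType = 'image/png') {
    return {
        role: 'user',
        parts: [
            { text: promptText },
            { inline_data: { mime_type: mimeType, data: base64Image } },
        ],
    };
}
```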
based
0b7c1a98cd
Added Google vision caption support
2023-12-14 22:37:53 +10:00
based
ca87f29771
Added streaming for Google models
2023-12-14 21:03:41 +10:00
based
3e82a7d439
Tokenizer changes and fixes, plus a toggle
2023-12-14 16:31:08 +10:00
based
e26159c00d
Refactor and rework the PaLM request to work with the 'content' format; add an endpoint for Google's tokenizer
2023-12-14 15:49:50 +10:00
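A hedged sketch of the 'content' format conversion described above: OpenAI-style messages are mapped onto Google's contents array, which only accepts user and model roles. The helper name is hypothetical and system-message handling is simplified.

```js
// Hypothetical helper: convert OpenAI-style chat messages to Google 'contents'.
// Google accepts only 'user' and 'model' roles, so 'assistant' is renamed and
// 'system' messages are treated as user turns here (a simplification).
function toGoogleContents(messages) {
    return messages.map((msg) => ({
        role: msg.role === 'assistant' ? 'model' : 'user',
        parts: [{ text: String(msg.content) }],
    }));
}
// A matching tokenizer endpoint could proxy the API's countTokens method.
```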
based
be396991de
Finish implementing UI changes for Google models
2023-12-14 11:53:26 +10:00
based
69e24c9686
Change PaLM naming in UI
2023-12-14 11:14:41 +10:00
valadaptive
22e048b5af
Rename generate_altscale endpoint
2023-12-13 18:53:46 -05:00
valadaptive
92bd766bcb
Rename chat completions endpoints
...
OpenAI calls this the "Chat Completions API", in contrast to their
previous "Text Completions API", so that's what I'm naming it: both
because other services besides OpenAI implement it, and to avoid
confusion with the existing /api/openai route used for OpenAI extras.
2023-12-13 18:52:08 -05:00
DonMoralez
fec27820ff
(Claude) Reworked prefix assignment, sysprompt mode, console message display
2023-12-13 21:19:26 +02:00
Cohee
9160de7714
Run macros on impersonation prompt
2023-12-12 19:24:32 +02:00
Cohee
9176f46caf
Add /preset command
2023-12-12 19:14:17 +02:00
Cohee
b0e7b73a32
Fix streaming processor error handler hooks
2023-12-08 02:01:08 +02:00
valadaptive
5569a63595
Remove legacy_streaming setting
...
This was a workaround for older versions of Slaude that implemented SSE
improperly. This was fixed in Slaude 7 months ago, so the workaround can
be removed.
2023-12-07 18:00:36 -05:00
valadaptive
cdcd913805
Don't stream events if the API returned a 4xx code
2023-12-07 18:00:36 -05:00
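A minimal sketch of the 4xx guard above, assuming an Express route handler and a node-fetch style upstream response; the function and parameter names are illustrative.

```js
// Illustrative only: forward upstream client errors as plain JSON instead of
// opening an SSE stream that would never carry valid events.
async function relayStream(upstreamResponse, clientResponse) {
    if (upstreamResponse.status >= 400) {
        const error = await upstreamResponse.json().catch(() => ({}));
        return clientResponse.status(upstreamResponse.status).json(error);
    }
    // node-fetch exposes the body as a Node readable stream.
    upstreamResponse.body.pipe(clientResponse);
}
```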
valadaptive
5540c165cf
Refactor server-sent events parsing
...
Create one server-sent events stream class which implements the entire
spec (different line endings, chunking, etc) and use it in all the
streaming generators.
2023-12-07 18:00:36 -05:00
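A simplified sketch of such a parser, handling the spec's different line endings and chunk boundaries. Unlike the full spec it yields each data line instead of assembling multi-line events, and the class name is hypothetical.

```js
// Simplified SSE parser sketch: buffers partial chunks (decoded strings),
// splits on \r\n, \r, or \n, and yields the payload of each 'data:' line.
class SseLineParser {
    constructor() {
        this.buffer = '';
    }

    *feed(chunk) {
        this.buffer += chunk;
        const lines = this.buffer.split(/\r\n|\r|\n/);
        this.buffer = lines.pop(); // keep the trailing partial line for the next chunk
        for (const line of lines) {
            if (line.startsWith('data:')) {
                yield line.slice('data:'.length).trimStart();
            }
        }
    }
}
```

Usage per received chunk: `for (const data of parser.feed(decodedChunk)) { /* handle event data */ }`.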
Cohee
72adb4c8aa
Fix window.ai streaming
2023-12-07 17:42:06 +02:00
Cohee
671df1f62e
Fix constant usage
2023-12-04 00:24:23 +02:00
valadaptive
e33c8bd955
Replace use_[source] with chat_completion_source
...
Same as the is_[api] replacement: it's easier to have one enum field
than several mutually exclusive boolean ones.
2023-12-03 15:03:39 -05:00
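To illustrate the enum-over-booleans rationale, a sketch with illustrative values (the actual source list in the codebase may differ):

```js
// Illustrative values only: one source field replaces several mutually
// exclusive flags such as use_openai / use_claude / use_openrouter.
const chat_completion_sources = {
    OPENAI: 'openai',
    CLAUDE: 'claude',
    OPENROUTER: 'openrouter',
    MAKERSUITE: 'makersuite',
};

function getSourceLabel(settings) {
    // A single switch on the enum replaces a chain of boolean checks.
    switch (settings.chat_completion_source) {
        case chat_completion_sources.CLAUDE: return 'Claude';
        case chat_completion_sources.MAKERSUITE: return 'Google MakerSuite';
        default: return 'OpenAI-compatible';
    }
}
```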
Cohee
8a1ead531c
Merge pull request #1439 from valadaptive/prompt-manager-class
...
Convert PromptManagerModule to a class
2023-12-03 21:52:27 +02:00
Cohee
1786b0d340
#1403 Add Aphrodite multi-swipe
2023-12-03 20:40:09 +02:00
valadaptive
b8b24540a9
Rename PromptManagerModule to PromptManager
...
The one place where it was imported renamed it to PromptManager anyway.
2023-12-03 12:14:56 -05:00
Cohee
a3bc51bcea
Fix type-in max context for OAI
2023-12-03 13:56:22 +02:00
Cohee
64a3564892
lint: Comma dangle
2023-12-02 22:06:57 +02:00
Cohee
c63cd87cc0
lint: Require semicolons
2023-12-02 21:11:06 +02:00
valadaptive
a37f874e38
Require single quotes
2023-12-02 13:04:51 -05:00
valadaptive
518bb58d5a
Enable no-unused-vars lint
...
This is the big one. Probably needs thorough review to make sure I
didn't accidentally remove any setInterval or fetch calls.
2023-12-02 12:11:19 -05:00
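For reference, enabling the rule in .eslintrc.js looks roughly like this; the argsIgnorePattern escape hatch is a common convention and an assumption, not necessarily this repo's configuration.

```js
// .eslintrc.js (sketch): turn on no-unused-vars. The argsIgnorePattern option
// is an assumed convenience for intentionally unused callback arguments.
module.exports = {
    rules: {
        'no-unused-vars': ['error', { argsIgnorePattern: '^_' }],
    },
};
```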
valadaptive
c893e2165e
Enable no-prototype-builtins lint
2023-12-02 12:10:31 -05:00
valadaptive
0a27275772
Enable no-extra-semi lint
2023-12-02 10:32:26 -05:00
valadaptive
367f3dba27
Enable no-unsafe-finally lint
2023-12-02 10:32:07 -05:00
Cohee
19c6370fa5
Revert preset checkbox update logic
2023-12-01 11:55:05 +02:00
Cohee
b96054f337
Update max token limit for palm2
2023-11-30 19:02:31 +02:00
Cohee
e9ad55aef2
Add seed input field for OpenAI settings #1412
2023-11-30 02:54:52 +02:00
Cohee
d263760b25
#1393 Configurable group nudges, scenario and personality templates for prompt manager
2023-11-27 23:57:56 +02:00
Cohee
61908935f5
Stop string for user-continue. Trim spaces after name2
2023-11-22 16:16:48 +02:00
Cohee
5f77b2f816
Add Claude 2.1
2023-11-21 20:07:37 +02:00
Cohee
73e081dd99
Don't use global state to build Chat Completion prompts
2023-11-21 14:38:15 +02:00
Cohee
0608c0afac
Add OpenRouter and Llava to captioning plugin.
2023-11-17 23:19:21 +02:00
Cohee
323b338cdd
Add images to quiet prompts if inlining enabled
2023-11-17 01:30:32 +02:00
Cohee
d114ebf6fa
Add default role for Message class if not set.
2023-11-16 16:20:33 +02:00
Cohee
314aca3f2c
Allow disabling system marker prompts
2023-11-14 22:27:07 +02:00
Cohee
d3e5f6ebc0
#1343 Move bypass check up
2023-11-12 23:08:24 +02:00
Cohee
9a1d1594d6
Fix formatting in openai.js
2023-11-12 22:14:35 +02:00