Commit Graph

1839 Commits

Author SHA1 Message Date
Cohee b315778e32 Chunkify NovelAI TTS 2024-01-01 21:31:08 +02:00
Cohee 58462d96d2 Fix RVC after converting to group 2024-01-01 20:22:48 +02:00
Cohee 52637ccd39 Merge pull request #1619 from LenAnderson/worldinfo_updated-event 2024-01-01 18:35:23 +02:00
    Add event when world info is updated
Cohee 9106696f2f Render prompt manager when switching APIs 2024-01-01 17:06:10 +02:00
Cohee 908bf7a61d Merge branch 'staging' into generate-cleanups-3 2024-01-01 16:49:35 +02:00
LenAnderson 8cd75cf03d add event when world info is updated 2024-01-01 14:34:09 +00:00
Cohee 30732ada32 Lint fix 2024-01-01 16:08:24 +02:00
Cohee 213ff4b89a Merge pull request #1613 from LenAnderson/expressions-get-last 2024-01-01 16:06:57 +02:00
    Add export and slash command for last set expressions
Cohee a2e4dc2950 Add chunking of vector storage messages 2023-12-31 04:00:04 +02:00
LenAnderson a2aa8ba6a0 add export and slash command for last set expressions 2023-12-30 11:37:13 +00:00
LenAnderson 0590b36838 only reload ST after extension popup closed 2023-12-28 10:46:25 +00:00
valadaptive 7899549754 Make "send message from chat box" into a function 2023-12-25 03:48:49 -05:00
    Right now all it does is handle returning if there's already a message being generated, but I'll extend it with more logic that I want to move out of Generate().
valadaptive 0d3505c44b Remove OAI_BEFORE_CHATCOMPLETION 2023-12-25 03:48:49 -05:00
    Not used in any internal code or extensions I can find.
valadaptive d2f8632368 Remove populateLegacyTokenCounts 2023-12-25 03:48:49 -05:00
    Unused and the documentation says it should probably be removed
Cohee 47cb017a45 #1589 Add 'cache_prompt' for l.cpp 2023-12-25 02:42:03 +02:00
Cohee 352b00caca Merge branches 'staging' and 'staging' of https://github.com/SillyTavern/SillyTavern into staging 2023-12-24 23:11:11 +02:00
RigbyB b6570e775d ComfyUI request/prompt fix 2023-12-24 21:02:04 +00:00
Cohee f8dece9d88 Always remove logit bias and stop from vision 2023-12-24 20:01:59 +02:00
Cohee a8fb306c12 Add multimodal captioning for ooba 2023-12-24 01:43:29 +02:00
Doa 41ac2c07b2 Adding negative character prompts for img sources that support it 2023-12-23 16:19:22 +00:00
Cohee b7a338e130 Move all Horde requests to server 2023-12-22 22:10:09 +02:00
Cohee 89d70539b9 Alternative continue method for chat completions 2023-12-22 20:24:54 +02:00
Cohee 45f6cb0fa8 Add chunked translate for long messages 2023-12-22 00:05:23 +02:00
Cohee a85a6cf606 Allow displaying unreferenced macro in message texts 2023-12-21 20:49:03 +02:00
Cohee 39e0b0f5cb Remove custom Handlebars helpers for extensions. 2023-12-21 20:33:50 +02:00
Cohee bddccd0356 Missed several context bind cases 2023-12-21 17:19:42 +02:00
Cohee b5e59c819c Merge branch 'staging' into claude-rework 2023-12-21 16:52:43 +02:00
Cohee e1afe41c91 Fix custom expression duplication 2023-12-21 16:50:30 +02:00
Cohee dd661cf879 Instruct "Bind to context" is now an option 2023-12-21 15:12:30 +02:00
Cohee ee75adbd2d Update persona name if it is bound by user name input 2023-12-21 14:56:32 +02:00
Cohee 3001db3a47 Add additional parameters for custom endpoints 2023-12-20 23:39:10 +02:00
Cohee e42daa4098 Add caption ask prompt mode 2023-12-20 21:23:59 +02:00
Cohee ae64c99835 Add custom caption source 2023-12-20 21:05:20 +02:00
Cohee cf8d7e7d35 Merge branch 'staging' into custom 2023-12-20 18:37:47 +02:00
Cohee ebec26154c Welcome message fixed 2023-12-20 18:37:34 +02:00
Cohee 5734dbd17c Add custom endpoint type 2023-12-20 18:29:03 +02:00
Cohee 041b9d4b01 Add style sanitizer to message renderer 2023-12-20 17:03:37 +02:00
Cohee c212a71425 Fix ignore list of preset manager 2023-12-20 15:51:00 +02:00
Cohee b0a4341571 Merge pull request #1574 from artisticMink/feature/before-combine-event 2023-12-20 15:46:34 +02:00
    Allow extensions to alter the context order.
Cohee 93db2bf953 Simplify extras summary settings 2023-12-20 01:56:35 +02:00
Cohee 4b131067e4 Add local multimodal caption sources 2023-12-20 00:45:45 +02:00
Cohee 029cf598ce Fix /peek command 2023-12-19 23:12:14 +02:00
maver 8d63ce5559 Log Novel Ai prompt to console 2023-12-19 19:27:24 +01:00
    When prompt logging is enabled.
Cohee 67dd52c21b #1309 Ollama text completion backend 2023-12-19 16:38:11 +02:00
Cohee edd737e8bd #371 Add llama.cpp inference server support 2023-12-18 22:38:28 +02:00
DonMoralez 37807acc60 Merge remote-tracking branch 'upstream/staging' into staging 2023-12-18 22:01:38 +02:00
Cohee 6e8104873e #1569 Add logit bias for text completions 2023-12-18 18:57:10 +02:00
Cohee 08ea2095f8 Refactor Novel logit bias 2023-12-18 17:32:10 +02:00
Cohee be5d428706 Merge pull request #1565 from SillyTavern/togetherai 2023-12-18 14:52:36 +02:00
    Add TogetherAI as a text completion source
Carsten Kragelund Jørgensen c2ad90eb2a fix: verify QR exists when deleting through /qr-delete 2023-12-18 13:29:27 +01:00