9cf6cef0a4
Fix aborting generation on KoboldCpp via Text Completion
2023-12-21 23:14:28 +02:00
b782a8cc03
Add util for trim v1
2023-12-21 22:40:08 +02:00
09fd772a20
#1579 Add ooba character YAML import
2023-12-21 21:46:09 +02:00
4621834c87
Short formatting path for empty messages
2023-12-21 20:50:30 +02:00
a85a6cf606
Allow displaying unreferenced macros in message texts
2023-12-21 20:49:03 +02:00
39e0b0f5cb
Remove custom Handlebars helpers for extensions
2023-12-21 20:33:50 +02:00
343c33e331
Stricter Horde prompt sanitization
2023-12-21 20:22:21 +02:00
8fb26284e2
Clean up Generate(), part 2 (#1578)
...
* Move StreamingProcessor constructor to the top
Typical code style is to declare the constructor at the top of the class
definition.
* Remove removePrefix
cleanupMessage does this already.
* Make message_already_generated local
We can pass it into StreamingProcessor so it doesn't have to be a global
variable.
* Consolidate setting isStopped and abort signal
Various places were doing some combination of setting isStopped, calling
abort on the streaming processor's abort controller, and calling
onStopStreaming. Let's consolidate all that functionality into
onStopStreaming/onErrorStreaming.
* More cleanly separate streaming/nonstreaming paths
* Replace promise with async function w/ handlers
By using onSuccess and onError as promise handlers, we can use normal
control flow and don't need to remember to use try/catch blocks or call
onSuccess every time (see the sketch after this list).
* Remove runGenerate
Placing the rest of the code in a separate function doesn't really do
anything for its structure.
* Move StreamingProcessor() into streaming code path
* Fix return from circuit breaker
* Fix non-streaming chat completion request
* Fix Horde generation and quiet unblocking
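A minimal sketch of the "async function with promise handlers" shape referenced above, not the actual SillyTavern code; the names generateText, fakeBackendCall, onSuccess, and onError are illustrative assumptions:

```js
// Illustrative sketch only: the generation body is a plain async function,
// and onSuccess/onError are attached once as promise handlers, so normal
// return/throw replaces per-branch callback bookkeeping.
async function fakeBackendCall(prompt) {
    if (!prompt) throw new Error('Empty prompt');
    return `echo: ${prompt}`;
}

async function generateText(prompt) {
    // Normal control flow: return on success, throw on failure.
    const reply = await fakeBackendCall(prompt);
    return reply.trim();
}

function onSuccess(text) {
    console.log('Generated:', text);
}

function onError(err) {
    console.error('Generation failed:', err.message);
}

// Handlers are attached exactly once, instead of being called from every branch.
generateText('Hello').then(onSuccess, onError);
```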
---------
Co-authored-by: Cohee <18619528+Cohee1207@users.noreply.github.com>
2023-12-21 20:20:28 +02:00
75eaa09cc3
Merge remote-tracking branch 'upstream/staging' into staging
2023-12-21 17:57:08 +02:00
1c9643806b
Merge remote-tracking branch 'upstream/staging' into staging
2023-12-21 17:30:37 +02:00
bddccd0356
Missed several context bind cases
2023-12-21 17:19:42 +02:00
fac4169dd8
Merge pull request #1568 from DonMoralez/staging
...
(Claude) Reworked prefix, sysprompt, console messages, sequence check
2023-12-21 17:02:05 +02:00
ffb711d802
Unify Claude request logging with other API sources
2023-12-21 16:59:43 +02:00
b5e59c819c
Merge branch 'staging' into claude-rework
2023-12-21 16:52:43 +02:00
e1afe41c91
Fix custom expression duplication
2023-12-21 16:50:30 +02:00
b3dfe16706
#1575 Fix clean-up of WI depth injections
2023-12-21 16:33:21 +02:00
e087f29496
Log MistralAI prompts to server console
2023-12-21 16:08:58 +02:00
dd661cf879
Instruct "Bind to context" is now an option
2023-12-21 15:12:30 +02:00
ee75adbd2d
Update persona name if it is bound to the user name input
2023-12-21 14:56:32 +02:00
f3099ac270
Remove model icon fill colors
2023-12-21 14:43:36 +02:00
747867c6f4
Merge pull request #1580 from SillyTavern/custom
...
Custom API endpoint type for Chat Completion
2023-12-21 14:42:18 +02:00
1456ebd4bb
Merge branch 'staging' of https://github.com/DonMoralez/SillyTavern into staging
2023-12-21 13:39:30 +02:00
940da09fd4
Merge remote-tracking branch 'upstream/staging' into staging
2023-12-21 12:32:04 +02:00
afdd9d823e
Merge branch 'staging' of https://github.com/Cohee1207/SillyTavern into staging
2023-12-21 12:43:59 +09:00
348cc5f2a5
Placeholder API icon for Tabby
2023-12-21 12:43:57 +09:00
3001db3a47
Add additional parameters for custom endpoints
2023-12-20 23:39:10 +02:00
e42daa4098
Add caption ask prompt mode
2023-12-20 21:23:59 +02:00
ae64c99835
Add custom caption source
2023-12-20 21:05:20 +02:00
cf8d7e7d35
Merge branch 'staging' into custom
2023-12-20 18:37:47 +02:00
ebec26154c
Fix welcome message
2023-12-20 18:37:34 +02:00
5734dbd17c
Add custom endpoint type
2023-12-20 18:29:03 +02:00
041b9d4b01
Add style sanitizer to message renderer
2023-12-20 17:03:37 +02:00
34decf1c05
Add creation of new QR sets
2023-12-20 14:04:28 +00:00
c212a71425
Fix ignore list of preset manager
2023-12-20 15:51:00 +02:00
b0a4341571
Merge pull request #1574 from artisticMink/feature/before-combine-event
...
Allow extensions to alter the context order.
2023-12-20 15:46:34 +02:00
69d6b9379a
Implement QR basics
2023-12-20 13:40:44 +00:00
e19bf1afdd
Clean out QR extension
2023-12-20 13:39:09 +00:00
93db2bf953
Simplify extras summary settings
2023-12-20 01:56:35 +02:00
4b131067e4
Add local multimodal caption sources
2023-12-20 00:45:45 +02:00
d3024d3b9a
Merge remote-tracking branch 'upstream/staging' into staging
2023-12-20 00:06:24 +02:00
029cf598ce
Fix /peek command
2023-12-19 23:12:14 +02:00
8d63ce5559
Log NovelAI prompt to console
...
When prompt logging is enabled.
2023-12-19 19:27:24 +01:00
f30f75b310
Add GENERATE_BEFORE_COMBINE_PROMPTS event
...
Allows the context to be ordered by extensions (see the sketch after this entry)
2023-12-19 19:11:36 +01:00
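A self-contained sketch of the idea behind GENERATE_BEFORE_COMBINE_PROMPTS as described above: an event is emitted with the prompt pieces before they are combined, so extension listeners can reorder them. The emitter, payload shape, and listener below are illustrative assumptions, not SillyTavern's actual API surface.

```js
const { EventEmitter } = require('node:events');

const eventSource = new EventEmitter();
const GENERATE_BEFORE_COMBINE_PROMPTS = 'generate_before_combine_prompts';

// Hypothetical extension listener: reorder pieces by a custom priority.
eventSource.on(GENERATE_BEFORE_COMBINE_PROMPTS, (pieces) => {
    pieces.sort((a, b) => a.order - b.order);
});

function combinePrompts(pieces) {
    // Let listeners mutate/reorder the pieces in place, then join them.
    eventSource.emit(GENERATE_BEFORE_COMBINE_PROMPTS, pieces);
    return pieces.map((p) => p.text).join('\n');
}

console.log(combinePrompts([
    { text: 'Example dialogue', order: 2 },
    { text: 'World info', order: 1 },
]));
```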
da1e9cb3b2
Use const where possible
2023-12-19 19:48:42 +02:00
a78875ca08
Use native color util
2023-12-19 19:47:23 +02:00
3b22159f53
Fix spelling
2023-12-19 19:45:28 +02:00
423c2b70dc
Camel case variable name
2023-12-19 19:44:52 +02:00
6859e4443e
Fix ollama chunk wrapper
2023-12-19 19:17:19 +02:00
c7b93b690f
Merge pull request #1573 from StefanDanielSchwarz/Llama-2-Chat-separator-fix
...
Llama 2 Chat separator fix
2023-12-19 19:14:54 +02:00
44318fef22
Fix double logging of non-streamed replies
2023-12-19 16:49:21 +02:00