Cohee
6086cedf2b
Use XHR to load HTML templates
2024-01-12 22:00:08 +02:00
Cohee
4fe13fab8e
Customizable /gen instruct name
2024-01-12 19:16:42 +02:00
Cohee
4e5f01d785
Merge pull request #1668 from valadaptive/macro-cleanups-1
...
Move substituteParams into its own module
2024-01-12 11:57:21 +02:00
valadaptive
05003ccf78
Remove silly debug logging
2024-01-12 04:38:40 -05:00
valadaptive
89a999cfd4
Move macro substitution to new module
...
substituteParams has become a thin wrapper around the new evaluateMacros
function, and will become more of a compatibility shim as refactorings
and rewrites are done.
2024-01-10 22:22:30 -05:00
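The compatibility-shim idea described above can be pictured roughly as follows; this is a minimal sketch, assuming a macros.js module and a simple two-name environment, not the actual module contents:

```js
// Minimal compatibility-shim sketch: only substituteParams and evaluateMacros
// come from the commit, everything else is an assumption.
import { evaluateMacros } from './macros.js';

export function substituteParams(content, name1, name2) {
    // The legacy entry point keeps its old signature, builds the environment
    // the new evaluator expects, and simply delegates to it.
    const env = { user: name1, char: name2 };
    return evaluateMacros(content, env);
}
```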
Cohee
3f6f32edad
Add {{mesExamplesRaw}} macro for story string
2024-01-10 14:11:02 +02:00
Cohee
adf82f2ba8
#1663 Add last prompt line to quiet prompts
2024-01-09 01:14:23 +02:00
Cohee
f7b1b490c7
Larger alternate greetings window
2024-01-06 19:59:48 +02:00
Cohee
5f93c30a96
#1627 Bypass status check and custom model for textgen type
2024-01-05 19:15:07 +02:00
Cohee
c69724e1da
Fix GUI Kobold
2024-01-02 10:28:34 +02:00
Cohee
52637ccd39
Merge pull request #1619 from LenAnderson/worldinfo_updated-event
...
Add event when world info is updated
2024-01-01 18:35:23 +02:00
Cohee
f53d937782
Fix mistral undefined name
2024-01-01 18:31:17 +02:00
Cohee
9106696f2f
Render prompt manager when switching APIs
2024-01-01 17:06:10 +02:00
Cohee
908bf7a61d
Merge branch 'staging' into generate-cleanups-3
2024-01-01 16:49:35 +02:00
LenAnderson
8cd75cf03d
add event when world info is updated
2024-01-01 14:34:09 +00:00
Cohee
30732ada32
Lint fix
2024-01-01 16:08:24 +02:00
maver
ee70593a7e
Add world info to generate_before_combine_prompts event data
2023-12-28 17:03:36 +01:00
Cohee
8dd4543e93
Remove macro from user messages when using bias
2023-12-28 11:19:56 +02:00
valadaptive
77b02a8d4b
Extract data.error check
2023-12-26 12:41:35 -05:00
valadaptive
0f8a16325b
Extract dryRun early return from finishGenerating
...
This means we only have to handle it in one place rather than two.
2023-12-25 03:48:49 -05:00
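As an illustration of handling the dry-run exit in one place (names besides dryRun and finishGenerating are invented for this sketch):

```js
// Illustrative only: with the dry-run exit extracted, the rest of the
// pipeline never branches on dryRun again.
async function generateSketch(options) {
    const prompt = await buildPrompt(options);      // assumed helper

    if (options.dryRun) {
        return prompt;                              // single early return
    }

    const response = await sendToBackend(prompt);   // assumed helper
    return finishGenerating(response);              // no dryRun handling inside
}
```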
valadaptive
3c0207f6cb
Move "continue on send" logic out of Generate()
2023-12-25 03:48:49 -05:00
valadaptive
7899549754
Make "send message from chat box" into a function
...
Right now all it does is handle returning if there's already a message
being generated, but I'll extend it with more logic that I want to move
out of Generate().
2023-12-25 03:48:49 -05:00
valadaptive
1029ad90a2
Extract "not in a chat" check into guard clause
...
This lets us remove a layer of indentation, and reveal the error
handling logic that was previously hidden below a really long block of
code.
2023-12-25 03:48:49 -05:00
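The pattern applied here is a guard clause; a schematic before/after, with all names assumed rather than taken from the actual Generate() code:

```js
// Before (schematic): the happy path is nested and the error branch sits
// below a very long block of code.
function sendBefore(message) {
    if (currentChatId !== undefined) {
        // ...long generation logic...
    } else {
        toastr.warning('No chat selected');
    }
}

// After: a guard clause returns early, removing one level of indentation
// and keeping the error handling visible at the top.
function sendAfter(message) {
    if (currentChatId === undefined) {
        toastr.warning('No chat selected');
        return;
    }
    // ...generation logic at top level...
}
```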
valadaptive
4fc2f15448
Reformat Generate() group logic
...
The first two conditions in the group if/else blocks are the same, so we
can combine them.
2023-12-25 03:48:49 -05:00
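One common shape of this kind of merge, shown schematically (the conditions and helpers are illustrative, not the real group logic):

```js
// Before (schematic): two arms of the chain start from the same check.
if (selected_group && isStreamingEnabled()) {
    prepareStreaming();        // assumed helper
} else if (selected_group && !isStreamingEnabled()) {
    prepareBlocking();         // assumed helper
}

// After: the shared condition is written once; only the differing part
// stays inside.
if (selected_group) {
    if (isStreamingEnabled()) {
        prepareStreaming();
    } else {
        prepareBlocking();
    }
}
```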
valadaptive
0d3505c44b
Remove OAI_BEFORE_CHATCOMPLETION
...
Not used in any internal code or extensions I can find.
2023-12-25 03:48:49 -05:00
valadaptive
f53e051cbf
Lift precondition check out of processCommands
...
Instead of passing type and dryRun into processCommands, do the check in
Generate, the only function that calls it. This makes the logic clearer.
2023-12-25 03:48:49 -05:00
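A sketch of the shape of this change; only processCommands, Generate, type, and dryRun come from the commit message, while the specific conditions and helpers are assumptions:

```js
// Before (schematic): the callee receives type and dryRun just to decide
// whether it should run at all.
async function processCommandsBefore(message, type, dryRun) {
    if (dryRun || type === 'regenerate' || type === 'quiet') {  // invented conditions
        return null;
    }
    return executeSlashCommands(message);
}

// After: the single caller performs the check, and the callee keeps one job.
async function processCommands(message) {
    return executeSlashCommands(message);
}

async function Generate(type, { dryRun } = {}) {
    if (!dryRun && type !== 'regenerate' && type !== 'quiet') {
        await processCommands(getUserInput());                  // assumed helper
    }
    // ...rest of generation...
}
```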
Cohee
a9e074dae1
Don't recreate first message if generation was run at least once
2023-12-24 02:47:00 +02:00
Cohee
db3bf42d63
Fix Firefox number arrows not updating the slider
2023-12-23 16:09:03 +02:00
Cohee
09fd772a20
#1579 Add ooba character yaml import
2023-12-21 21:46:09 +02:00
Cohee
4621834c87
Short formatting path for empty messages
2023-12-21 20:50:30 +02:00
Cohee
a85a6cf606
Allow displaying unreferenced macro in message texts
2023-12-21 20:49:03 +02:00
Cohee
39e0b0f5cb
Remove custom Handlebars helpers for extensions.
2023-12-21 20:33:50 +02:00
valadaptive
8fb26284e2
Clean up Generate(), part 2 (#1578)
...
* Move StreamingProcessor constructor to the top
Typical code style is to declare the constructor at the top of the class
definition.
* Remove removePrefix
cleanupMessage does this already.
* Make message_already_generated local
We can pass it into StreamingProcessor so it doesn't have to be a global
variable.
* Consolidate setting isStopped and abort signal
Various places were doing some combination of setting isStopped, calling
abort on the streaming processor's abort controller, and calling
onStopStreaming. Let's consolidate all that functionality into
onStopStreaming/onErrorStreaming.
* More cleanly separate streaming/nonstreaming paths
* Replace promise with async function w/ handlers
By using onSuccess and onError as promise handlers, we can use normal
control flow and don't need to remember to use try/catch blocks or call
onSuccess every time.
* Remove runGenerate
Placing the rest of the code in a separate function doesn't really do
anything for its structure.
* Move StreamingProcessor() into streaming code path
* Fix return from circuit breaker
* Fix non-streaming chat completion request
* Fix Horde generation and quiet unblocking
---------
Co-authored-by: Cohee <18619528+Cohee1207@users.noreply.github.com>
2023-12-21 20:20:28 +02:00
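Of the steps above, the promise-to-async conversion is the easiest to picture; a rough schematic, not the actual Generate() body:

```js
// Before (schematic): a hand-built promise where each path must remember to
// call the right handler.
function runGenerateBefore(onSuccess, onError) {
    return new Promise((resolve, reject) => {
        sendRequest()                                  // assumed helper
            .then(data => resolve(onSuccess(data)))
            .catch(err => reject(onError(err)));
    });
}

// After: ordinary async/await control flow; onSuccess runs exactly once at
// the end and errors fall through to a single catch.
async function runGenerateAfter(onSuccess, onError) {
    try {
        const data = await sendRequest();
        return onSuccess(data);
    } catch (err) {
        return onError(err);
    }
}
```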
Cohee
b3dfe16706
#1575 Fix clean-up of WI depth injections
2023-12-21 16:33:21 +02:00
Cohee
ee75adbd2d
Update persona name if it is bound by user name input
2023-12-21 14:56:32 +02:00
Cohee
cf8d7e7d35
Merge branch 'staging' into custom
2023-12-20 18:37:47 +02:00
Cohee
ebec26154c
Welcome message fixed
2023-12-20 18:37:34 +02:00
Cohee
5734dbd17c
Add custom endpoint type
2023-12-20 18:29:03 +02:00
Cohee
041b9d4b01
Add style sanitizer to message renderer
2023-12-20 17:03:37 +02:00
Cohee
b0a4341571
Merge pull request #1574 from artisticMink/feature/before-combine-event
...
Allow extensions to alter the context order.
2023-12-20 15:46:34 +02:00
maver
f30f75b310
Add GENERATE_BEFORE_COMBINE_PROMPTS event
...
Allows for context to be ordered by extensions
2023-12-19 19:11:36 +01:00
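An extension consuming this event might look roughly like this; only the event name comes from the commit, while the import path and payload fields are assumptions:

```js
// Hypothetical extension-side listener: the event exposes the prompt parts
// before they are combined, so a listener may reorder them in place.
import { eventSource, event_types } from '../../../script.js';

eventSource.on(event_types.GENERATE_BEFORE_COMBINE_PROMPTS, (data) => {
    // data.mesSend: array of chat-message prompt parts (assumed field name)
    if (Array.isArray(data.mesSend) && data.mesSend.length > 1) {
        // Example reordering: move the newest part to the front.
        data.mesSend.unshift(data.mesSend.pop());
    }
});
```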
Cohee
67dd52c21b
#1309 Ollama text completion backend
2023-12-19 16:38:11 +02:00
Cohee
edd737e8bd
#371 Add llama.cpp inference server support
2023-12-18 22:38:28 +02:00
Cohee
b0d9f14534
Re-add Together as a text completion source
2023-12-17 23:38:03 +02:00
Cohee
180061337e
Merge branch 'staging' into anachronous/release
2023-12-17 21:35:49 +02:00
LenAnderson
fb25a90532
add GENERATION_STARTED event
2023-12-17 17:45:23 +00:00
anachronos
1e88c8922a
Merge branch 'staging' into release
2023-12-17 10:38:04 +01:00
Fayiron
9f2d32524c
Add TogetherAI as a chat completion source, basic
2023-12-16 14:39:30 +01:00
based
583f786d74
Finish Mistral frontend integration + API key status check
2023-12-16 07:15:57 +10:00
Cohee
ef17702f6a
Merge branch 'staging' into bg-load-improvements
2023-12-15 17:02:10 +02:00