Cohee
eaadfea639
Extend debounce duration of logprobs renderer
2024-02-24 15:03:57 +02:00
Cohee
9287ff18de
Fix for non-streaming
2024-02-24 14:50:06 +02:00
Cohee
dab9bbb514
Merge pull request #1844 from infermaticAI/InfermaticAI
Add InfermaticAI as a text completion source
2024-02-24 14:28:09 +02:00
Deciare
445cbda02f
If token probability is a logarithm it'll be < 0
No need to read settings to find out if llama.cpp backend is in use...
2024-02-24 00:13:33 -05:00
Deciare
9eba076ae4
Sampler order for llama.cpp server backend
2024-02-23 23:01:04 -05:00
Deciare
936fbac6c5
Merge remote-tracking branch 'origin/staging' into llamacpp-probs
2024-02-23 17:45:54 -05:00
Cohee
737a0bd3ae
Fix purge extras and mistral vectors
2024-02-23 22:37:00 +02:00
Cohee
9b34ac1bde
Merge pull request #1852 from berbant/staging
Display TranslateProvider link
2024-02-23 21:43:59 +02:00
Cohee
cb536a7611
Save a list of safe to export secret keys
2024-02-23 21:41:54 +02:00
Cohee
82c5042bad
Prevent extra loop iterations on buffer init
2024-02-23 21:23:44 +02:00
Cohee
4baefeba68
Extend per-entry scan depth limit, add warnings on overflow
2024-02-23 21:18:40 +02:00
Deciare
344b9eedbc
Request token probabilities from llama.cpp backend
llama.cpp server token probabilities are given as values ranging from
0 to 1 instead of as logarithms.
2024-02-23 14:01:46 -05:00
NWilson
f82740a238
Change Non-streaming Handler
2024-02-22 15:51:11 -06:00
berbant
bc2010a762
Update secrets.js
2024-02-22 23:55:57 +04:00
berbant
eb89337f51
Update index.js
2024-02-22 23:49:47 +04:00
Cohee
c9f0d61f19
#1851 Substitute macros in new example chat
2024-02-22 18:45:50 +02:00
NWilson
f569424f3e
Merge branch 'staging' into InfermaticAI
2024-02-22 08:32:10 -06:00
Cohee
beb5e470a2
#1069 Fix hoisting of pristine cards in newest sort
2024-02-22 04:48:46 +02:00
Cohee
ece3b2a7c1
Fix Chat Completions status check on settings loading if another API is selected
2024-02-22 04:36:06 +02:00
Cohee
06c3ea7c51
Merge pull request #1811 from kalomaze/sampler-order-ooba
Sampler priority support (for text-generation-webui)
2024-02-22 02:55:38 +02:00
Cohee
0ccdfe4bb7
Fix duped line
2024-02-22 02:45:35 +02:00
Cohee
40aa971d11
Merge branch 'staging' into sampler-order-ooba
2024-02-22 02:44:32 +02:00
Cohee
fb6fa54c7f
Fix import fetch HTTP method
2024-02-21 19:57:38 +02:00
Cohee
fcf171931a
Merge pull request #1846 from SillyTavern/pygimport
Pygimport
2024-02-21 19:55:57 +02:00
Cohee
92af4137a9
Use new export endpoint
2024-02-21 11:28:59 +02:00
Cohee
711fd0517f
Merge branch 'staging' into pygimport
2024-02-21 11:26:47 +02:00
Cohee
d31195a704
Apply same width for Kobold order
Just in case
2024-02-21 01:02:23 +02:00
Cohee
10fb69f36a
Widen the block
2024-02-21 00:59:38 +02:00
Cohee
d353fa58d0
Close div properly
2024-02-21 00:56:40 +02:00
Cohee
96f1ce1fce
Skill issue?
2024-02-21 00:55:30 +02:00
Cohee
0c1cf9ff2e
Send sampler priority as array
2024-02-21 00:53:54 +02:00
NWilson
7c12c836f2
Implement Key Filter
2024-02-20 09:40:35 -06:00
NWilson
48b9eb8542
Revert "Add InfermaticAI Profile"
This reverts commit 1e7c2820da.
2024-02-20 09:37:39 -06:00
Cohee
f43e686301
Merge pull request #1845 from underscorex86/patch-1
Update slash-commands.js
2024-02-20 16:57:25 +02:00
Cohee
f0141b4dd1
Update slash-commands.js
2024-02-20 16:57:00 +02:00
NWilson
1e7c2820da
Add InfermaticAI Profile
2024-02-20 08:12:59 -06:00
Sneha C
095cd873de
Update slash-commands.js
added the word "persona" to the /sync description to make it easier for users to find.
2024-02-20 16:48:43 +04:00
Cohee
8ba9b5c38b
Merge branch 'staging' into sampler-order-ooba
2024-02-20 02:32:33 +02:00
Cohee
8e66a14e37
Add hints to doc strings about additional command prompts
2024-02-20 02:29:14 +02:00
Cohee
79ba026486
Merge pull request #1840 from Wolfsblvt/slash-commands-menu-actions-allow-custom-prompts
Extend impersonate/continue/regenerate with possible custom prompts (via slash commands and popup menu)
2024-02-20 02:26:41 +02:00
kalomaze
cec0698400
Oopsie
2024-02-19 18:24:04 -06:00
kalomaze
f3971686ea
Move text-gen-webui sampler order under kcpp order
2024-02-19 18:18:57 -06:00
kalomaze
32ee58e5e6
fix kcpp order reset
2024-02-19 18:12:56 -06:00
kalomaze
0d8858285f
Merge branch 'SillyTavern:release' into sampler-order-ooba
2024-02-19 18:11:30 -06:00
Cohee
061b7c6922
Don't try to execute script commands if the message doesn't start with slash
2024-02-20 02:09:01 +02:00
Wolfsblvt
a5ee46cb2a
Only respect slash command, ignore text field
2024-02-19 22:36:32 +01:00
Wolfsblvt
550d8483cc
Extend impersonate/continue/regenerate with possible custom prompts
- Use custom prompt provided via slash command arguments (similar to /sysgen and others)
- Use written text from textbox, if the popout menu actions are clicked
2024-02-19 22:23:58 +01:00
Cohee
2e00a1baaf
[FEATURE_REQUEST] Can the unlocked max context size for OpenAI completion be increased from 102k to 200k for example? #1842
2024-02-19 19:37:18 +02:00
NWilson
030806bf1e
Merge remote-tracking branch 'origin/staging' into InfermaticAI
2024-02-19 10:14:06 -06:00
NWilson
e55d903613
Support more settings
2024-02-19 09:53:26 -06:00