Cohee
e8985c259c
Merge branch 'EugeoSynthesisThirtyTwo/release' into staging
2024-02-29 11:34:38 +02:00
Cohee
184fd1622f
Limit to ooba only. Exclude from presets
2024-02-29 11:33:47 +02:00
gabriel dhimoila
76669ff8bb
add max_tokens_second
2024-02-29 00:55:25 +01:00
Cohee
d024d7c700
Allow max value for per-entry depth
2024-02-27 23:34:07 +02:00
based
149a65cf62
migrate model name in old presets to new naming scheme
2024-02-27 02:23:07 +10:00
Cohee
f962ad5c02
Add OpenRouter as a text completion source
2024-02-25 22:47:07 +02:00
Cohee
9e5505a7d4
Autocomplete for WI automation IDs
2024-02-25 03:54:40 +02:00
Cohee
fc289126fa
Add event type for text completion generation request settings ready
2024-02-24 21:45:33 +02:00
Cohee
d5bf9fc28c
Non-streaming logprobs for Aphrodite
2024-02-24 20:53:23 +02:00
Cohee
d140b8d5be
Parse non-streaming tabby logprobs
2024-02-24 20:10:53 +02:00
Cohee
3cedf64f66
Add autocomplete for WI inclusion groups
2024-02-24 19:04:44 +02:00
Cohee
3441667336
#1853 Add WI/Script link by entry automation id
2024-02-24 17:22:51 +02:00
Cohee
7b8ac8f4c4
Properly use vector insert setting
2024-02-24 15:57:26 +02:00
Cohee
8848818d67
Fix dynatemp neutralization
2024-02-24 15:32:12 +02:00
Cohee
299bd9d563
Merge branch 'staging' into llamacpp-sampler-order
2024-02-24 15:10:58 +02:00
Cohee
13aebc623a
Merge pull request #1854 from deciare/llamacpp-probs
Request and display token probabilities from llama.cpp backend
2024-02-24 15:06:28 +02:00
Cohee
eaadfea639
Extend debounce duration of logprobs renderer
2024-02-24 15:03:57 +02:00
Cohee
9287ff18de
Fix for non-streaming
2024-02-24 14:50:06 +02:00
Cohee
dab9bbb514
Merge pull request #1844 from infermaticAI/InfermaticAI
Add InfermaticAI as a text completion source
2024-02-24 14:28:09 +02:00
Deciare
445cbda02f
If the token probability is a logarithm, it will be < 0
No need to read settings to find out if llama.cpp backend is in use...
2024-02-24 00:13:33 -05:00
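The sign check described in this commit can be sketched as follows; the function name and the direction of the conversion are illustrative, not SillyTavern's actual code:

```js
// Normalize a candidate token score to a probability in [0, 1].
// A log-probability is always < 0, while llama.cpp reports raw
// probabilities in [0, 1], so the sign alone distinguishes the two
// formats -- no need to check which backend is selected in settings.
function toProbability(value) {
    return value < 0 ? Math.exp(value) : value;
}
```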
Deciare
9eba076ae4
Sampler order for llama.cpp server backend
2024-02-23 23:01:04 -05:00
Deciare
936fbac6c5
Merge remote-tracking branch 'origin/staging' into llamacpp-probs
2024-02-23 17:45:54 -05:00
Cohee
737a0bd3ae
Fix purge extras and mistral vectors
2024-02-23 22:37:00 +02:00
Cohee
9b34ac1bde
Merge pull request #1852 from berbant/staging
Display TranslateProvider link
2024-02-23 21:43:59 +02:00
Cohee
cb536a7611
Save a list of safe to export secret keys
2024-02-23 21:41:54 +02:00
Cohee
82c5042bad
Prevent extra loop iterations on buffer init
2024-02-23 21:23:44 +02:00
Cohee
4baefeba68
Extend per-entry scan depth limit, add warnings on overflow
2024-02-23 21:18:40 +02:00
Deciare
344b9eedbc
Request token probabilities from llama.cpp backend
llama.cpp server token probabilities are given as values ranging from
0 to 1 instead of as logarithms.
2024-02-23 14:01:46 -05:00
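A minimal sketch of what this commit describes, assuming the llama.cpp server API as of early 2024 (`/completion` endpoint, `n_probs` request field, `completion_probabilities` in the response); field names may differ in newer builds, and the function is illustrative rather than SillyTavern's actual code:

```js
// Request per-token probabilities from a llama.cpp server and convert
// them to logprobs for display. llama.cpp reports raw probabilities in
// [0, 1] rather than logarithms, hence the Math.log() below.
async function getTokenLogprobs(baseUrl, prompt) {
    const response = await fetch(`${baseUrl}/completion`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ prompt, n_predict: 32, n_probs: 5 }),
    });
    const data = await response.json();

    return (data.completion_probabilities ?? []).map(token => ({
        token: token.content,
        topLogprobs: (token.probs ?? []).map(p => [p.tok_str, Math.log(p.prob)]),
    }));
}
```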
berbant
eb89337f51
Update index.js
2024-02-22 23:49:47 +04:00
Cohee
c9f0d61f19
#1851 Substitute macros in new example chat
2024-02-22 18:45:50 +02:00
NWilson
f569424f3e
Merge branch 'staging' into InfermaticAI
2024-02-22 08:32:10 -06:00
Cohee
ece3b2a7c1
Fix Chat Completions status check on settings loading if another API is selected
2024-02-22 04:36:06 +02:00
Cohee
0ccdfe4bb7
Fix duped line
2024-02-22 02:45:35 +02:00
Cohee
40aa971d11
Merge branch 'staging' into sampler-order-ooba
2024-02-22 02:44:32 +02:00
Cohee
0c1cf9ff2e
Send sampler priority as array
2024-02-21 00:53:54 +02:00
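A small sketch of the change named above, assuming the sampler priority is typed into a newline-separated text field and sent to text-generation-webui as `sampler_priority`; the splitting rule and helper name are assumptions, not SillyTavern's actual code:

```js
// Convert the sampler-priority text field into the array form
// expected by the backend, dropping blank lines and extra whitespace.
function getSamplerPriorityArray(rawText) {
    return rawText
        .split('\n')
        .map(name => name.trim())
        .filter(Boolean);
}

// e.g. params.sampler_priority = getSamplerPriorityArray(textareaValue);
```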
Cohee
f0141b4dd1
Update slash-commands.js
2024-02-20 16:57:00 +02:00
Sneha C
095cd873de
Update slash-commands.js
Added the word "persona" to the /sync description to make it easier for users to find.
2024-02-20 16:48:43 +04:00
Cohee
8ba9b5c38b
Merge branch 'staging' into sampler-order-ooba
2024-02-20 02:32:33 +02:00
Cohee
8e66a14e37
Add hints to doc strings about additional command prompts
2024-02-20 02:29:14 +02:00
kalomaze
32ee58e5e6
fix kcpp order reset
2024-02-19 18:12:56 -06:00
Wolfsblvt
550d8483cc
Extend impersonate/continue/regenerate with possible custom prompts
- Use custom prompt provided via slash command arguments (similar to /sysgen and others)
- Use written text from textbox, if the popout menu actions are clicked
2024-02-19 22:23:58 +01:00
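The selection rule described by the two bullets above can be sketched as follows; the argument names and fallback order are taken from the commit message, everything else is illustrative:

```js
// Pick the custom prompt for impersonate/continue/regenerate:
// a prompt passed via slash-command argument wins; otherwise, when the
// action was triggered from the popout menu, fall back to the text
// currently written in the textbox. Returns '' if neither applies.
function resolveCustomPrompt(slashCommandArg, textboxValue, fromPopoutMenu) {
    if (slashCommandArg) {
        return slashCommandArg;
    }
    if (fromPopoutMenu && textboxValue && textboxValue.trim().length > 0) {
        return textboxValue.trim();
    }
    return '';
}
```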
Cohee
2e00a1baaf
[FEATURE_REQUEST] Can the unlocked max context size for OpenAI completion be increased from 102k to 200k for example? #1842
2024-02-19 19:37:18 +02:00
NWilson
030806bf1e
Merge remote-tracking branch 'origin/staging' into InfermaticAI
2024-02-19 10:14:06 -06:00
Cohee
3c2113a6e7
Add ability to preserve file names when loading from assets downloader
2024-02-19 00:17:23 +02:00
Cohee
29b971a986
Merge branch 'staging' into slash-fix-bleed
2024-02-16 20:48:32 +02:00
Cohee
c06fe6abfc
Add character asset type
2024-02-16 20:42:56 +02:00
Cohee
a8cd6c9fe7
Allow finding characters in slash commands by exact PNG name
2024-02-16 20:24:47 +02:00
NWilson
8075e4cd1e
Changes
2024-02-16 09:07:06 -06:00
NWilson
b5887960b6
Merge branch 'release' into InfermaticAI
2024-02-16 08:53:04 -06:00
Cohee
0da0d490c7
#1796 Attempt to fix alltalk on remote servers
2024-02-14 19:44:47 +02:00