Commit Graph

4150 Commits

SHA1 Message Date
2a4b8ac438 Update displayed prompt bias when auto-saving edits.
When Auto-save Message Edits is enabled, the prompt bias string
displayed beneath the textarea wasn't being updated.
2024-02-27 05:17:38 -05:00
7885f19e86 Perform macro substitution while updating message.
This addresses 3 issues:
1. Prompt bias string was not removed from the text of the edited
  message.
2. Macro substitution was not performed in the prompt bias string.
3. Macro substitution was not performed in the edited message text.
2024-02-27 05:17:31 -05:00
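The three issues above all come down to running macro substitution consistently on both the prompt bias string and the edited message text. A minimal sketch of what such a substitution step looks like (the function name and macro shape here are hypothetical, not taken from the actual codebase):

```javascript
// Hypothetical sketch: replace {{placeholder}} macros in message text,
// leaving unrecognized macros untouched.
function substituteMacros(text, macros) {
    return text.replace(/{{(\w+)}}/g, (match, name) =>
        name in macros ? macros[name] : match);
}

// Applied to both the prompt bias string and the edited message text,
// so the two never drift apart.
substituteMacros('Hello {{user}}!', { user: 'Alice' });
```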
1898192d37 Update readme-ja_jp.md 2024-02-27 18:00:56 +09:00
29c4334c46 #1859 Set keep_alive for ollama 2024-02-26 21:09:21 +02:00
73886c9fff Merge pull request #1863 from kingbased/mistral
mistral-large
2024-02-26 20:10:51 +02:00
149a65cf62 migrate model name in old presets to new naming scheme 2024-02-27 02:23:07 +10:00
617ae7d02c ??? 2024-02-27 01:42:22 +10:00
c58d0b2b94 subvers 2024-02-27 01:12:17 +10:00
e86fd08d0f update mistral models 2024-02-27 01:02:02 +10:00
f962ad5c02 Add OpenRouter as a text completion source 2024-02-25 22:47:07 +02:00
3c620effaf Update script.js 2024-02-25 21:19:28 +04:00
670f08fad2 Update group-chats.js
After deleting a group chat, the oldest chat became active. I've fixed it so that the most recent chat becomes active instead.
2024-02-25 21:11:56 +04:00
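The fix described above amounts to selecting by recency rather than taking the first (oldest) entry. A sketch under assumed chat metadata (the `lastActive` field is illustrative, not the project's actual schema):

```javascript
// Hypothetical sketch: after deleting a group chat, activate the most
// recently used remaining chat instead of the oldest one.
function mostRecentChat(chats) {
    // Copy before sorting so the caller's array order is preserved.
    return chats.slice().sort((a, b) => b.lastActive - a.lastActive)[0];
}
```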
9e5505a7d4 Autocomplete for WI automation IDs 2024-02-25 03:54:40 +02:00
fc289126fa Add event type for text completion generation request settings ready 2024-02-24 21:45:33 +02:00
d5bf9fc28c Non-streaming logprobs for Aphrodite 2024-02-24 20:53:23 +02:00
d140b8d5be Parse non-streaming tabby logprobs 2024-02-24 20:10:53 +02:00
3cedf64f66 Add autocomplete for WI inclusion groups 2024-02-24 19:04:44 +02:00
0e357c191b Align label margins 2024-02-24 18:23:58 +02:00
3441667336 #1853 Add WI/Script link by entry automation id 2024-02-24 17:22:51 +02:00
7b8ac8f4c4 Properly use vector insert setting 2024-02-24 15:57:26 +02:00
16833fc238 Merge pull request #1855 from deciare/llamacpp-sampler-order
Sampler order for llama.cpp server backend
2024-02-24 15:45:44 +02:00
8848818d67 Fix dynatemp neutralization 2024-02-24 15:32:12 +02:00
299bd9d563 Merge branch 'staging' into llamacpp-sampler-order 2024-02-24 15:10:58 +02:00
13aebc623a Merge pull request #1854 from deciare/llamacpp-probs
Request and display token probabilities from llama.cpp backend
2024-02-24 15:06:28 +02:00
eaadfea639 Extend debounce duration of logprobs renderer 2024-02-24 15:03:57 +02:00
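Debouncing the logprobs renderer means a rapid stream of token updates triggers only one re-render after the stream pauses. A generic debounce sketch (not the project's implementation):

```javascript
// Postpone `fn` until `delay` ms pass without another call; each new
// call resets the timer, so only the final call in a burst fires.
function debounce(fn, delay) {
    let timer = null;
    return (...args) => {
        clearTimeout(timer);
        timer = setTimeout(() => fn(...args), delay);
    };
}
```

Extending the duration trades render latency for fewer redraws while tokens stream in.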
9287ff18de Fix for non-streaming 2024-02-24 14:50:06 +02:00
dab9bbb514 Merge pull request #1844 from infermaticAI/InfermaticAI
Add InfermaticAI as a text completion source
2024-02-24 14:28:09 +02:00
445cbda02f If token probability is a logarithm it'll be < 0
No need to read settings to find out if llama.cpp backend is in use...
2024-02-24 00:13:33 -05:00
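The heuristic described in this commit: log-probabilities are always ≤ 0, while llama.cpp's server reports linear probabilities in [0, 1], so the sign of the value distinguishes the two without consulting backend settings. A sketch (the helper name is hypothetical):

```javascript
// Negative values are already log-probabilities; non-negative values
// are linear probabilities from llama.cpp and get converted.
function toLogProb(value) {
    return value < 0 ? value : Math.log(value);
}
```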
9eba076ae4 Sampler order for llama.cpp server backend 2024-02-23 23:01:04 -05:00
936fbac6c5 Merge remote-tracking branch 'origin/staging' into llamacpp-probs 2024-02-23 17:45:54 -05:00
737a0bd3ae Fix purge extras and mistral vectors 2024-02-23 22:37:00 +02:00
9b34ac1bde Merge pull request #1852 from berbant/staging
Display TranslateProvider link
2024-02-23 21:43:59 +02:00
cb536a7611 Save a list of safe to export secret keys 2024-02-23 21:41:54 +02:00
82c5042bad Prevent extra loop iterations on buffer init 2024-02-23 21:23:44 +02:00
4baefeba68 Extend per-entry scan depth limit, add warnings on overflow 2024-02-23 21:18:40 +02:00
344b9eedbc Request token probabilities from llama.cpp backend
llama.cpp server token probabilities are given as values ranging from
0 to 1 instead of as logarithms.
2024-02-23 14:01:46 -05:00
f82740a238 Change Non-streaming Handler 2024-02-22 15:51:11 -06:00
bc2010a762 Update secrets.js 2024-02-22 23:55:57 +04:00
eb89337f51 Update index.js 2024-02-22 23:49:47 +04:00
c9f0d61f19 #1851 Substitute macros in new example chat 2024-02-22 18:45:50 +02:00
f569424f3e Merge branch 'staging' into InfermaticAI 2024-02-22 08:32:10 -06:00
beb5e470a2 #1069 Fix hoisting of pristine cards in newest sort 2024-02-22 04:48:46 +02:00
ece3b2a7c1 Fix Chat Completions status check on settings loading if another API is selected 2024-02-22 04:36:06 +02:00
06c3ea7c51 Merge pull request #1811 from kalomaze/sampler-order-ooba
Sampler priority support (for text-generation-webui)
2024-02-22 02:55:38 +02:00
0ccdfe4bb7 Fix duped line 2024-02-22 02:45:35 +02:00
40aa971d11 Merge branch 'staging' into sampler-order-ooba 2024-02-22 02:44:32 +02:00
fb6fa54c7f Fix import fetch HTTP method 2024-02-21 19:57:38 +02:00
fcf171931a Merge pull request #1846 from SillyTavern/pygimport
Pygimport
2024-02-21 19:55:57 +02:00
92af4137a9 Use new export endpoint 2024-02-21 11:28:59 +02:00
711fd0517f Merge branch 'staging' into pygimport 2024-02-21 11:26:47 +02:00