Commit Graph

1028 Commits

Author SHA1 Message Date
kingbri 1dd1607b94 Generate: Migrate to array-based message concatenation
String concatenation is very limited in how flexible it can be. Using
an array instead allows extension prompts and insertion depths to be
shifted around in various ways.

To preserve message order, each mesSend object contains both a message
and an array of extension prompts which are later added on top of
the message itself.

Signed-off-by: kingbri <bdashore3@proton.me>
2023-08-24 18:02:17 -04:00
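
The array-based layout this commit describes can be pictured with a minimal TypeScript sketch; the mesSend field names and the flattening helper below are illustrative assumptions, not the actual SillyTavern implementation.

    // Illustrative sketch only: field names are assumed, not taken from the codebase.
    interface MesSend {
        message: string;            // the chat message itself
        extensionPrompts: string[]; // extension prompts layered on top of it
    }

    // Build the final prompt by walking the array instead of concatenating strings,
    // which keeps message order intact and lets prompts be inserted at any depth.
    function flattenMesSend(items: MesSend[]): string {
        return items
            .map((item) => [...item.extensionPrompts, item.message].join('\n'))
            .join('\n');
    }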
Cohee 1900ab9726 #1005 Replace mobile detection method in get sortable delay. Make deviceInfo loading sync. Init Ross mods via function call. 2023-08-24 23:52:03 +03:00
Cohee ce67101651
Merge pull request #1012 from StefanDanielSchwarz/llama-2-chat-instruct-preset-fixes
Llama 2 chat instruct preset fixes
2023-08-24 22:51:55 +03:00
Stefan Daniel Schwarz 4b389eba61 no eol separator after replaced sys prompt 2023-08-24 20:30:24 +02:00
Cohee c91ab3b5e0 Add Kobold tokenization to best match logic. Fix not being able to stop group chat regeneration 2023-08-24 21:23:35 +03:00
Stefan Daniel Schwarz 56a6398189 system prompt in system sequence 2023-08-24 19:33:04 +02:00
Cohee 14d94d9108 Merge branch 'staging' of https://github.com/SillyTavern/SillyTavern into staging 2023-08-24 20:20:03 +03:00
Cohee ab52af4fb5 Add support for Koboldcpp tokenization endpoint 2023-08-24 20:19:57 +03:00
Stefan Daniel Schwarz 582464a2e7 fix name 2023-08-24 18:47:15 +02:00
Cohee d4bd91f6ec Merge branch 'staging' of http://github.com/cohee1207/SillyTavern into staging 2023-08-24 18:32:46 +03:00
Cohee 7010e05f8e Sync bottom form height with the font size 2023-08-24 18:32:42 +03:00
Cohee cd2faea2b2 Fix group chats with streaming 2023-08-24 17:46:44 +03:00
Cohee 8fea486e57 #1009 Update Vicuna 1.1 template 2023-08-24 15:50:14 +03:00
Cohee f5fd15ffd2 #976 Return "Continue on send". Allow continuing the first chat message. Add debug function for i18n. 2023-08-24 15:13:04 +03:00
Cohee 3e25c3f51c
Merge pull request #1008 from mweldon/novel-generate-until-sentence
Novel generate until sentence
2023-08-24 11:25:56 +03:00
Cohee d147bc40dc Fix alternate greetings 2023-08-24 11:04:46 +03:00
Mike Weldon 8202fab376 Remove commented lines I added by mistake 2023-08-23 18:08:55 -07:00
Mike Weldon 1d1109e43b Set generate_until_sentence for NovelAI
* Set generate_until_sentence true for NovelAI
* Add a Story String file for NovelAI with persona before character
  which works better
* Remove hardcoded dinkus for chat_start since it is in the Story String
2023-08-23 18:04:56 -07:00
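
As a rough illustration of the first bullet point, a NovelAI generation payload carrying the flag might look like the sketch below; the surrounding parameter names and values are placeholders, not taken from the commit.

    // Hypothetical request parameters; only generate_until_sentence is the point here.
    const novelAiParameters = {
        temperature: 1.0,
        max_length: 40,
        // ask NovelAI to finish the current sentence rather than cut off mid-sentence
        generate_until_sentence: true,
    };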
Cohee 4aa31fcba9 Add fallback option for OpenRouter 2023-08-24 03:21:17 +03:00
Cohee ffc8150eef Add missing space on continue 2023-08-24 02:38:02 +03:00
Cohee 4038e7f9e3 #999 Instruct mode fixes 2023-08-24 02:22:37 +03:00
Cohee e2507e8840 #976 Add "quick continue" button. Remove "send to continue" option. 2023-08-24 01:37:44 +03:00
Cohee 52c2fcd407 Fix OpenRouter model not persisting on page reload 2023-08-24 00:59:57 +03:00
Cohee d64c5880c8 Fix new chat reference not getting saved when starting a new chat 2023-08-24 00:54:36 +03:00
Cohee de0dbfb394 Merge branch 'roleplay-context' into staging 2023-08-24 00:30:04 +03:00
Cohee 17925423ae
Merge pull request #999 from StefanDanielSchwarz/roleplay-context
Return of the Roleplay Context
2023-08-24 00:29:05 +03:00
Cohee 1ce848c1c3 Move before / after char WI to story strings 2023-08-24 00:26:47 +03:00
Cohee fa7bb9143d
Merge pull request #1006 from deffcolony/staging
added Dutch (NL) translation
2023-08-23 23:40:44 +03:00
Stefan Daniel Schwarz dd7b21c63d renamed roleplay instruct preset 2023-08-23 22:23:51 +02:00
Stefan Daniel Schwarz 7cafa5d374 improved preset selection logic 2023-08-23 22:22:52 +02:00
deffcolony 58911e9eb8 added Dutch (NL) translation 2023-08-23 22:15:17 +02:00
Cohee 9aa03402fa
Merge pull request #1004 from SillyTavern/prompt-manager-persona-description
Prompt manager persona description
2023-08-23 22:11:47 +03:00
maver 65e595ad48 Increase prompt order dummy user id by 1 2023-08-23 20:41:13 +02:00
maver 5a02250a1f Add persona description to prompt manager order 2023-08-23 20:40:26 +02:00
Cohee 460127ed3c
Merge pull request #1002 from qHiyokop/staging
Added Italian translation by qHiyokop
2023-08-23 21:39:15 +03:00
Cohee b1c1ac465c Merge branch 'release' into staging 2023-08-23 21:37:52 +03:00
RossAscends 632d03228f Merge branch 'staging' of https://github.com/Cohee1207/SillyTavern into staging 2023-08-24 03:34:25 +09:00
RossAscends 6c56fb0a6d 500ms delay for sliders on touch devices 2023-08-24 03:34:20 +09:00
Cohee 1d6b7c9947 Merge branch 'staging' of https://github.com/SillyTavern/SillyTavern into staging 2023-08-23 21:32:46 +03:00
Cohee 031a6cb2a4 Performance and data integrity improvements 2023-08-23 21:32:38 +03:00
Stefan Daniel Schwarz 9b932dfa15 byebye wizard 2023-08-23 20:24:45 +02:00
Stefan Daniel Schwarz 8938ea1d72 auto select presets 2023-08-23 20:17:45 +02:00
qHiyokop 2ccf029c9b Added Italian translation by qHiyokop 2023-08-24 04:08:28 +10:00
qHiyokop b0b6d925c0 Merge branch 'staging' of https://github.com/qHiyokop/SillyTavern into release 2023-08-24 03:54:33 +10:00
Cohee 2c58f9d903
Merge pull request #1001 from bdashore3/staging
More CFG fixes
2023-08-23 18:49:36 +03:00
kingbri 0460375647 CFG: Don't inject anything when guidance scale doesn't exist
If the guidance scale is 1, completely disable sending CFG and creating
a negative prompt.

Signed-off-by: kingbri <bdashore3@proton.me>
2023-08-23 11:27:58 -04:00
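
A minimal sketch of that guard, assuming the generation parameters are assembled in one place (the function and field names below are illustrative, not from the codebase):

    // Skip CFG entirely when no meaningful guidance scale is set.
    function buildCfgParams(guidanceScale?: number, negativePrompt?: string) {
        if (!guidanceScale || guidanceScale === 1) {
            return {}; // inject nothing: no CFG scale, no negative prompt
        }
        return {
            guidance_scale: guidanceScale,
            negative_prompt: negativePrompt ?? '',
        };
    }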
Cohee f48cc0db31 Unify cfgValues parsing between ooba/Novel 2023-08-23 18:26:56 +03:00
Cohee 2c2a68ef76 Fix instruct system sequence missing + {{original}} 2023-08-23 18:04:22 +03:00
Cohee fad6c164cb Don't set negative prompt from CFG extension to ooba at guidance scale 1.0 2023-08-23 17:44:38 +03:00
Cohee b751643364 Save pagination state on return to list from card 2023-08-23 16:48:44 +03:00