Commit Graph

1018 Commits

Author SHA1 Message Date
Henk
d041ec0921 Safer defaults and more flexibility
There have been a lot of reports from newer users who experience AI breakdown because not all models properly handle 2048 max tokens. 1024 is the only value that all models support and was the original value KoboldAI used. This commit reverts the decision to increase this to 2048; any existing configurations are not affected. Users who wish to increase the max tokens can do so themselves. Most models handle up to 1900 well (the GPT-2 models are excluded), and for many you can go all the way. (It is currently not yet known why some finetunes cause a decrease in max token support.)

In addition, this commit fulfills a request for more consistent slider behavior, allowing all sliders to be adjusted in 0.01 intervals instead of some sliders being capped to 0.05.
2022-07-03 15:07:54 +02:00
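
A minimal sketch of the kind of guard this describes, assuming hypothetical names (SAFE_MAX_TOKENS, HARD_CAP, clamp_max_tokens are not part of KoboldAI):

```python
# Illustrative sketch only; these names are assumptions, not KoboldAI code.
SAFE_MAX_TOKENS = 1024  # the only value every supported model handles
HARD_CAP = 2048         # users can raise the limit themselves at their own risk

def clamp_max_tokens(requested: int) -> int:
    """Fall back to the safe default, but allow an explicit user override."""
    if requested <= 0:
        return SAFE_MAX_TOKENS
    return min(requested, HARD_CAP)
```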
henk717
a99518d0a8 Merge branch 'KoboldAI:main' into united 2022-07-02 12:59:53 +02:00
Henk
e2f7fed99f Don't turn gamestarted off 2022-07-02 12:59:14 +02:00
vfbd
aeed9bd8f7 Fix base fairseq dense models when using accelerate with a GPU 2022-07-01 20:16:39 -04:00
ebolam
b79ec8b1c5 Fix 2022-07-01 19:54:33 -04:00
ebolam
63f44f8204 Fix for select option 2022-07-01 19:40:34 -04:00
ebolam
516564ef6c Initial Load Story dialog 2022-07-01 19:24:20 -04:00
ebolam
0161966cea Env Fix 2022-07-01 17:24:06 -04:00
ebolam
6e841b87eb Fix for env 2022-07-01 16:53:51 -04:00
ebolam
73ad11c6d7 Fix for env variables 2022-07-01 16:49:47 -04:00
ebolam
ca07fdbe44 Added ability to use env variables instead of argparse (command argument = docker env variable) 2022-07-01 16:09:13 -04:00
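
The "command argument = docker env variable" mapping described above is commonly done by letting each argparse option default to an environment variable of the same name. The sketch below is an assumed illustration of that pattern, not the actual KoboldAI argument list:

```python
# Minimal sketch of argparse falling back to environment variables; the
# specific options (model, port) are assumptions, not KoboldAI's full set.
import argparse
import os

parser = argparse.ArgumentParser()
parser.add_argument("--model", default=os.environ.get("model"))
parser.add_argument("--port", type=int, default=int(os.environ.get("port", 5000)))
args = parser.parse_args()
# Running the container with `-e model=... -e port=...` now behaves the same
# as passing --model/--port on the command line.
```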
ebolam
40bcf893d5 Preset Updates 2022-07-01 14:54:40 -04:00
ebolam
a56ef086e4 Estimated chunks going to generate 2022-07-01 11:27:43 -04:00
ebolam
9170aa7a4e Model Loading functional
Fix for mobile display
2022-07-01 08:09:10 -04:00
ebolam
3f8a7ab4bb Allowing edit in userscripts 2022-06-30 19:41:11 -04:00
ebolam
813540fe9b Added folder browser for softprompts and userscripts 2022-06-30 19:13:05 -04:00
ebolam
97e0df45d7 File Dialog complete 2022-06-30 15:57:27 -04:00
ebolam
16c5c580db Checkin 2022-06-30 13:40:47 -04:00
ebolam
58418c4aa5 Basic file browser with edit and delete functionality
Can be shown by going to /popup_test in a second tab.
2022-06-30 09:44:04 -04:00
ebolam
ce1bff1b84 TPU fixes 2022-06-29 17:56:25 -04:00
ebolam
72827ed149 Colab fix and send_to_ui fix 2022-06-29 17:44:22 -04:00
ebolam
de73aa2364 Single vars working with disabled framework for multi-story multi-user environment (LUA breaks) 2022-06-29 14:15:06 -04:00
vfbd
048bd0ff3b Add support for setting the RNG seed and full determinism 2022-06-28 13:21:05 -04:00
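
For context, a general-purpose determinism recipe in a PyTorch application looks roughly like the following; this is only an illustration of the technique, not KoboldAI's actual implementation:

```python
# Illustrative determinism sketch, assuming a PyTorch-based app.
import random

import numpy as np
import torch

def set_full_determinism(seed: int) -> None:
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)  # also seeds all CUDA devices
    torch.backends.cudnn.benchmark = False
    # Some CUDA ops additionally require CUBLAS_WORKSPACE_CONFIG to be set
    # in the environment before deterministic algorithms can be enforced.
    torch.use_deterministic_algorithms(True)
```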
ebolam
0ffaa1bfcf Presets and Remaining time updates 2022-06-27 18:36:22 -04:00
ebolam
edd6dd7cd7 Fix for saved breakmodel settings on custom models
Fix for unit tests with new disk breakmodel
2022-06-27 10:12:54 -04:00
ebolam
057f3dd92d back, redo, retry functional 2022-06-26 21:06:06 -04:00
ebolam
b906742f61 Working options. 2022-06-26 16:36:07 -04:00
Henk
46678931b2 Better sentence spacing 2022-06-26 20:27:21 +02:00
vfbd
ebba79fed6 Remove trailing whitespace from submissions
(cherry picked from commit b99d1449c9)
2022-06-26 14:06:34 -04:00
vfbd
2a4d37ce60 Clean up whitespace at the end of actions when loading story
Specifically, we merge blank actions into the next action and we move
whitespace at the end of non-blank actions to the beginning of the next
action.

(cherry picked from commit 4b16600e49)
2022-06-26 14:04:36 -04:00
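
The two rules described in this commit message can be sketched as a small standalone function (an assumed illustration, not the actual KoboldAI code):

```python
def clean_actions(actions: list[str]) -> list[str]:
    """Simplified sketch of the two cleanup rules described above."""
    cleaned: list[str] = []
    carry = ""  # text to prepend to the next action
    for action in actions:
        action = carry + action
        if not action.strip():
            # Rule 1: a blank action is merged into the next action.
            carry = action
            continue
        stripped = action.rstrip()
        # Rule 2: trailing whitespace moves to the start of the next action.
        carry = action[len(stripped):]
        cleaned.append(stripped)
    return cleaned
```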
vfbd
b99d1449c9 Remove trailing whitespace from submissions 2022-06-26 13:15:55 -04:00
Henk
fa97d28cb3 Nerys V2 for United 2022-06-25 14:06:51 +02:00
Henk
9e7eb80db4 Nerys V2 part 2 2022-06-25 14:03:19 +02:00
Henk
ecc6ee9474 Nerys V2 2022-06-25 13:47:49 +02:00
henk717
10e85db89d Merge pull request #162 from VE-FORBRYDERNE/whitespace-cleanup
Story whitespace cleanup
2022-06-25 13:36:03 +02:00
Henk
d3fce44095 Merge branch 'main' into united 2022-06-24 18:31:45 +02:00
Henk
8be0964427 AIDG Import Fix 2022-06-24 18:29:06 +02:00
vfbd
4b16600e49 Clean up whitespace at the end of actions when loading story
Specifically, we merge blank actions into the next action and we move
whitespace at the end of non-blank actions to the beginning of the next
action.
2022-06-24 12:03:35 -04:00
ebolam
4c357abd78 metadata merged with actions 2022-06-24 09:22:59 -04:00
vfbd
3da885d408 GPT-NeoX HF model badwords fix 2022-06-23 15:02:43 -04:00
henk717
8098f4ec8f Merge branch 'KoboldAI:main' into united 2022-06-23 17:20:48 +02:00
vfbd
0eb9f8a879 Account for lnheader in budget calculation 2022-06-22 19:16:24 -04:00
ebolam
b0ac4581de UI v2 Initial Commit 2022-06-22 18:39:09 -04:00
ebolam
86553d329c Merge United 2022-06-22 14:32:58 -04:00
ebolam
83c0b9ee1e Vars Migration Fix for back/redo
Fix for pytest for back/redo and model loading with disk caching
2022-06-22 14:13:44 -04:00
vfbd
53034ee533 Delete all torch tensors before loading model 2022-06-22 12:07:36 -04:00
vfbd
922394c68f Don't blacklist </s> token in "s" newline mode 2022-06-22 11:23:03 -04:00
ebolam
13fcf462e9 Moved VARS to koboldai_settings and broke it into model, story, user, and system variables. The story class was also rewritten to include options (actions_metadata). actions_metadata will be removed in UI2. 2022-06-22 11:14:37 -04:00
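
A rough sketch of splitting one global vars object into scoped settings groups; all class and field names below are assumptions, not the actual koboldai_settings definitions:

```python
# Hypothetical illustration of grouping settings by scope.
from dataclasses import dataclass, field

@dataclass
class ModelSettings:
    model_name: str = ""

@dataclass
class StorySettings:
    actions: list = field(default_factory=list)
    actions_metadata: dict = field(default_factory=dict)  # slated for removal in UI2

@dataclass
class UserSettings:
    username: str = ""

@dataclass
class SystemSettings:
    port: int = 5000
```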
8c594c6869 Correct the padding token for GPT-NeoX 2022-06-21 19:37:43 -04:00
a7f667c34c Use NeoX badwords when loading from HF GPT-NeoX model 2022-06-21 19:33:25 -04:00