Commit Graph

868 Commits

SHA1 Message Date
610257b36e Output Streaming on by Default 2022-08-06 16:47:04 +02:00
8bcf4187ac Merge pull request #178 from one-some/token-prob
Add token probability visualizer
2022-08-05 14:27:46 +02:00
f6d046fe1b Add token probability visualizer 2022-08-04 13:49:37 -05:00
71e119f0b7 Fix for secondary model loads leaking settings into secondary model's settings file. 2022-08-02 19:45:36 -04:00
050e195420 Merge pull request #173 from one-some/token-streaming
Add token streaming option
2022-07-30 18:32:51 +02:00
a63f7cfa5a Merge pull request #174 from ebolam/united
Fix for blank model info box when downloading model
2022-07-29 22:15:58 +02:00
f97c10b794 Fix for blank model info box when downloading model 2022-07-28 19:40:27 -04:00
a4d81292f8 Add token streaming option 2022-07-27 22:13:08 -05:00
fe64e480ee Merge pull request #171 from ebolam/united
Add Download Model Status
2022-07-26 00:52:12 +02:00
7721b72184 Merge branch 'KoboldAI:main' into united 2022-07-26 00:42:35 +02:00
4d8a633351 Aetherroom instead of aidg.club 2022-07-26 00:41:51 +02:00
12acb50ee0 Fix for getting "model download status" when downloading config to figure out layer counts 2022-07-25 18:29:14 -04:00
9dc9966433 Added functionality to add any/all args via json 2022-07-23 22:02:03 -06:00
907cf74b13 Added status bar for downloading models 2022-07-22 13:58:20 -04:00
2b53598307 Fixes for file editor (#170)
Various fixes for the file editor by Ebolam
2022-07-20 00:50:03 +02:00
f58064e72c Revert "Fix for aidg.club website being taken read-only"
This reverts commit 23a031d852.
2022-07-19 16:54:32 -04:00
23a031d852 Fix for aidg.club website being taken read-only 2022-07-19 13:40:55 -04:00
68d143b80c Merge branch 'united' of https://github.com/ebolam/KoboldAI into united 2022-07-15 12:30:18 -04:00
d91ed3141d Fix for non-ASCII files in edit mode 2022-07-15 12:30:02 -04:00
e8c39992a1 Merge pull request #166 from ebolam/united
Add file browser to soft prompts and user scripts
2022-07-04 19:52:05 +02:00
328c0a38d7 Removed breadcrumbs on file browser before the jail directory 2022-07-03 16:02:55 -04:00
fd44f0ded3 Merge branch 'KoboldAI:main' into united 2022-07-03 15:12:12 +02:00
d041ec0921 Safer defaults and more flexibility
There have been a lot of reports from newer users who experience AI breakdown because not all models properly handle 2048 max tokens. 1024 is the only value that all models support and was the original value KoboldAI used. This commit reverts the decision to increase this to 2048; existing configurations are not affected. Users who wish to increase the max tokens can do so themselves. Most models handle up to 1900 well (the GPT-2 models are excluded), and for many you can go all the way. (It is currently not known why some finetunes cause a decrease in max token support.)

In addition, this commit addresses a request for more consistent slider behavior, allowing the sliders to be changed in 0.01 intervals instead of some sliders being capped at 0.05.
2022-07-03 15:07:54 +02:00
a99518d0a8 Merge branch 'KoboldAI:main' into united 2022-07-02 12:59:53 +02:00
e2f7fed99f Don't turn gamestarted off 2022-07-02 12:59:14 +02:00
aeed9bd8f7 Fix base fairseq dense models when using accelerate with a GPU 2022-07-01 20:16:39 -04:00
3f8a7ab4bb Allowing edit in userscripts 2022-06-30 19:41:11 -04:00
813540fe9b Added folder browser for softprompts and userscripts 2022-06-30 19:13:05 -04:00
97e0df45d7 File Dialog complete 2022-06-30 15:57:27 -04:00
58418c4aa5 Basic file browser with edit and delete functionality
Can be shown by going to /popup_test in a second tab.
2022-06-30 09:44:04 -04:00
048bd0ff3b Add support for setting the RNG seed and full determinism 2022-06-28 13:21:05 -04:00
edd6dd7cd7 Fix for saved breakmodel settings on custom models
Fix for unit tests with new disk breakmodel
2022-06-27 10:12:54 -04:00
46678931b2 Better sentence spacing 2022-06-26 20:27:21 +02:00
ebba79fed6 Remove trailing whitespace from submissions
(cherry picked from commit b99d1449c9)
2022-06-26 14:06:34 -04:00
2a4d37ce60 Clean up whitespace at the end of actions when loading story
Specifically, we merge blank actions into the next action and we move
whitespace at the end of non-blank actions to the beginning of the next
action.

(cherry picked from commit 4b16600e49)
2022-06-26 14:04:36 -04:00
b99d1449c9 Remove trailing whitespace from submissions 2022-06-26 13:15:55 -04:00
fa97d28cb3 Nerys V2 for United 2022-06-25 14:06:51 +02:00
9e7eb80db4 Nerys V2 part 2 2022-06-25 14:03:19 +02:00
ecc6ee9474 Nerys V2 2022-06-25 13:47:49 +02:00
10e85db89d Merge pull request #162 from VE-FORBRYDERNE/whitespace-cleanup
Story whitespace cleanup
2022-06-25 13:36:03 +02:00
d3fce44095 Merge branch 'main' into united 2022-06-24 18:31:45 +02:00
8be0964427 AIDG Import Fix 2022-06-24 18:29:06 +02:00
4b16600e49 Clean up whitespace at the end of actions when loading story
Specifically, we merge blank actions into the next action and we move
whitespace at the end of non-blank actions to the beginning of the next
action.
2022-06-24 12:03:35 -04:00
3da885d408 GPT-NeoX HF model badwords fix 2022-06-23 15:02:43 -04:00
8098f4ec8f Merge branch 'KoboldAI:main' into united 2022-06-23 17:20:48 +02:00
0eb9f8a879 Account for lnheader in budget calculation 2022-06-22 19:16:24 -04:00
53034ee533 Delete all torch tensors before loading model 2022-06-22 12:07:36 -04:00
922394c68f Don't blacklist </s> token in "s" newline mode 2022-06-22 11:23:03 -04:00
8c594c6869 Correct the padding token for GPT-NeoX 2022-06-21 19:37:43 -04:00
a7f667c34c Use NeoX badwords when loading from HF GPT-NeoX model 2022-06-21 19:33:25 -04:00