henk717
050e195420
Merge pull request #173 from one-some/token-streaming
Add token streaming option
2022-07-30 18:32:51 +02:00
henk717
a63f7cfa5a
Merge pull request #174 from ebolam/united
Fix for blank model info box when downloading model
2022-07-29 22:15:58 +02:00
ebolam
f97c10b794
Fix for blank model info box when downloading model
2022-07-28 19:40:27 -04:00
somebody
a4d81292f8
Add token streaming option
2022-07-27 22:13:08 -05:00
henk717
fe64e480ee
Merge pull request #171 from ebolam/united
Add Download Model Status
2022-07-26 00:52:12 +02:00
henk717
7721b72184
Merge branch 'KoboldAI:main' into united
2022-07-26 00:42:35 +02:00
Henk
4d8a633351
Aetherroom instead of aidg.club
2022-07-26 00:41:51 +02:00
ebolam
12acb50ee0
Fix for getting "model download status" when downloading config to figure out layer counts
2022-07-25 18:29:14 -04:00
scott-ca
9dc9966433
Added functionality to add any/all args via json
2022-07-23 22:02:03 -06:00
ebolam
907cf74b13
Added status bar for downloading models
2022-07-22 13:58:20 -04:00
ebolam
2b53598307
Fixes for file editor (#170)
Various fixes for the file editor by Ebolam
2022-07-20 00:50:03 +02:00
ebolam
f58064e72c
Revert "Fix for aidg.club website being taken read-only"
This reverts commit 23a031d852.
2022-07-19 16:54:32 -04:00
ebolam
23a031d852
Fix for aidg.club website being taken read-only
2022-07-19 13:40:55 -04:00
ebolam
68d143b80c
Merge branch 'united' of https://github.com/ebolam/KoboldAI into united
2022-07-15 12:30:18 -04:00
ebolam
d91ed3141d
Fix for non-ASCII files in edit mode
2022-07-15 12:30:02 -04:00
henk717
e8c39992a1
Merge pull request #166 from ebolam/united
Add file browser to soft prompts and user scripts
2022-07-04 19:52:05 +02:00
ebolam
328c0a38d7
Removed breadcrumbs on file browser before the jail directory
2022-07-03 16:02:55 -04:00
henk717
fd44f0ded3
Merge branch 'KoboldAI:main' into united
2022-07-03 15:12:12 +02:00
Henk
d041ec0921
Safer defaults and more flexibility
There have been a lot of reports from newer users who experience AI breakdown because not all models properly handle 2048 max tokens. 1024 is the only value that all models support and was the original value KoboldAI used. This commit reverts the decision to increase this to 2048; any existing configurations are not affected. Users who wish to increase the max tokens can do so themselves. Most models handle up to 1900 well (the GPT-2 models are excluded), and for many you can go all the way. (It is currently not yet known why some finetunes cause a decrease in max-token support.)
In addition, this commit contains a request for more consistent slider behavior, allowing the sliders to be changed at 0.01 intervals instead of some sliders being capped to 0.05.
2022-07-03 15:07:54 +02:00
henk717
a99518d0a8
Merge branch 'KoboldAI:main' into united
2022-07-02 12:59:53 +02:00
Henk
e2f7fed99f
Don't turn gamestarted off
2022-07-02 12:59:14 +02:00
vfbd
aeed9bd8f7
Fix base fairseq dense models when using accelerate with a GPU
2022-07-01 20:16:39 -04:00
ebolam
3f8a7ab4bb
Allowing edit in userscripts
2022-06-30 19:41:11 -04:00
ebolam
813540fe9b
Added folder browser for softprompts and userscripts
2022-06-30 19:13:05 -04:00
ebolam
97e0df45d7
File Dialog complete
2022-06-30 15:57:27 -04:00
ebolam
58418c4aa5
Basic file browser with edit and delete functionality
Can be shown by going to /popup_test in a second tab.
2022-06-30 09:44:04 -04:00
vfbd
048bd0ff3b
Add support for setting the RNG seed and full determinism
2022-06-28 13:21:05 -04:00
ebolam
edd6dd7cd7
Fix for saved breakmodel settings on custom models
Fix for unit tests with new disk breakmodel
2022-06-27 10:12:54 -04:00
Henk
46678931b2
Better sentence spacing
2022-06-26 20:27:21 +02:00
vfbd
ebba79fed6
Remove trailing whitespace from submissions
(cherry picked from commit b99d1449c9)
2022-06-26 14:06:34 -04:00
vfbd
2a4d37ce60
Clean up whitespace at the end of actions when loading story
Specifically, we merge blank actions into the next action and we move
whitespace at the end of non-blank actions to the beginning of the next
action.
(cherry picked from commit 4b16600e49)
2022-06-26 14:04:36 -04:00
vfbd
b99d1449c9
Remove trailing whitespace from submissions
2022-06-26 13:15:55 -04:00
Henk
fa97d28cb3
Nerys V2 for United
2022-06-25 14:06:51 +02:00
Henk
9e7eb80db4
Nerys V2 part 2
2022-06-25 14:03:19 +02:00
Henk
ecc6ee9474
Nerys V2
2022-06-25 13:47:49 +02:00
henk717
10e85db89d
Merge pull request #162 from VE-FORBRYDERNE/whitespace-cleanup
Story whitespace cleanup
2022-06-25 13:36:03 +02:00
Henk
d3fce44095
Merge branch 'main' into united
2022-06-24 18:31:45 +02:00
Henk
8be0964427
AIDG Import Fix
2022-06-24 18:29:06 +02:00
vfbd
4b16600e49
Clean up whitespace at the end of actions when loading story
Specifically, we merge blank actions into the next action and we move
whitespace at the end of non-blank actions to the beginning of the next
action.
2022-06-24 12:03:35 -04:00
vfbd
3da885d408
GPT-NeoX HF model badwords fix
2022-06-23 15:02:43 -04:00
henk717
8098f4ec8f
Merge branch 'KoboldAI:main' into united
2022-06-23 17:20:48 +02:00
vfbd
0eb9f8a879
Account for lnheader in budget calculation
2022-06-22 19:16:24 -04:00
vfbd
53034ee533
Delete all torch tensors before loading model
2022-06-22 12:07:36 -04:00
vfbd
922394c68f
Don't blacklist </s> token in "s" newline mode
2022-06-22 11:23:03 -04:00
Gnome Ann
8c594c6869
Correct the padding token for GPT-NeoX
2022-06-21 19:37:43 -04:00
Gnome Ann
a7f667c34c
Use NeoX badwords when loading from HF GPT-NeoX model
2022-06-21 19:33:25 -04:00
Gnome Ann
8593bf339b
Another typo fix
2022-06-21 15:36:25 -04:00
Gnome Ann
7e0ded6b47
Typo fix
2022-06-21 15:12:55 -04:00
Gnome Ann
91643be10a
Change soft prompt implementation to a more universal one
2022-06-21 15:03:43 -04:00
Gnome Ann
0ea4fa9c87
Automatically calculate badwords and pad_token_id
2022-06-21 14:35:52 -04:00