Commit Graph

2292 Commits

Author SHA1 Message Date
ebolam
56d150abc9 Scroll Test 2022-07-14 19:45:11 -04:00
ebolam
454bab3863 Scroll Test 2022-07-14 19:34:14 -04:00
ebolam
cdb321ea72 Scroll Test 2022-07-14 19:31:22 -04:00
ebolam
33cb49e7d6 Auto-scroll on text chunk update 2022-07-14 19:08:28 -04:00
henk717
68110c5930 Merge branch 'KoboldAI:main' into united 2022-07-12 23:03:09 +02:00
henk717
025db3bd04 Merge pull request #138 from VE-FORBRYDERNE/lazy-loader
Fix for lazy loader in PyTorch 1.12
2022-07-12 23:02:58 +02:00
henk717
836759d826 Merge pull request #137 from VE-FORBRYDERNE/jaxlib
TPU Colab hotfix
2022-07-12 23:02:40 +02:00
vfbd
39d48495ce Fix for lazy loader in PyTorch 1.12
There is no `torch._StorageBase` in PyTorch 1.12, but otherwise it still
works.
2022-07-12 16:48:01 -04:00
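The compatibility issue described above can be handled with a defensive attribute lookup. A minimal Python sketch, assuming the code only needs a reference to the storage base class for isinstance-style checks; this is illustrative only and not the actual fix in this commit:

    import torch

    # torch._StorageBase exists in PyTorch < 1.12 but is gone in 1.12.
    # Fall back to the class of a real storage object, which resolves on both versions.
    StorageBase = getattr(torch, "_StorageBase", None)
    if StorageBase is None:
        StorageBase = type(torch.empty(0, dtype=torch.uint8).storage())

    def is_storage(obj):
        # Hypothetical helper: works whether StorageBase came from the old alias
        # or from the fallback above.
        return isinstance(obj, StorageBase)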
vfbd
70aa182671 Restrict jaxlib version in TPU Colabs 2022-07-12 16:30:26 -04:00
henk717
f900a17f3c Merge pull request #168 from VE-FORBRYDERNE/bloom
BLOOM support for TPU instances
2022-07-08 01:25:28 +02:00
vfbd
d9e7ca5b48 Upload map file for BLOOM 2022-07-07 17:48:00 -04:00
ebolam
8e0dd23810 Fix 2022-07-06 15:37:40 -04:00
ebolam
9825287ab9 Menu for right side
Fix for Chrome CSS
2022-07-06 12:39:48 -04:00
henk717
9e140e3ba9 Merge branch 'KoboldAI:main' into united 2022-07-05 21:35:53 +02:00
henk717
dd6da50e58 Merge pull request #136 from VE-FORBRYDERNE/opt
Fix base OPT-125M and finetuned OPT models in Colab TPU instances
2022-07-05 21:35:39 +02:00
vfbd
2a78b66932 Fix base OPT-125M and finetuned OPT models in Colab TPU instances 2022-07-05 15:28:58 -04:00
vfbd
c94f875608 Fix Z algorithm in basic phrase bias script 2022-07-05 14:43:58 -04:00
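For context, the Z algorithm mentioned here computes, for every position in a sequence, the length of the longest substring starting there that is also a prefix of the whole sequence, which is useful for detecting how much of a bias phrase has already been generated. A textbook Python sketch, illustrative only and not the script's actual code:

    def z_array(s):
        # z[i] = length of the longest substring of s starting at i
        # that is also a prefix of s; z[0] is defined as len(s).
        n = len(s)
        z = [0] * n
        if n == 0:
            return z
        z[0] = n
        l, r = 0, 0
        for i in range(1, n):
            if i < r:
                z[i] = min(r - i, z[i - l])
            while i + z[i] < n and s[z[i]] == s[i + z[i]]:
                z[i] += 1
            if i + z[i] > r:
                l, r = i, i + z[i]
        return z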
ebolam
796a1dcf16 CSS change to move options to below game text for mobile 2022-07-05 14:11:05 -04:00
ebolam
aedd7e966b Fix for edit files 2022-07-04 19:08:30 -04:00
ebolam
cc3ccb7f36 Text Token Length Fix 2022-07-04 15:05:20 -04:00
Henk
736a39b10b gitignore update 2022-07-04 20:12:11 +02:00
Henk
b76e82644a flask-session for conda 2022-07-04 20:07:11 +02:00
henk717
e8c39992a1 Merge pull request #166 from ebolam/united
Add file browser to soft prompts and user scripts
2022-07-04 19:52:05 +02:00
ebolam
3f28be16d4 Fix for token length 2022-07-03 16:57:41 -04:00
ebolam
8013bc2a98 Added background blur for popup file editor 2022-07-03 16:21:48 -04:00
ebolam
328c0a38d7 Removed breadcrumbs on file browser before the jail directory 2022-07-03 16:02:55 -04:00
henk717
fd44f0ded3 Merge branch 'KoboldAI:main' into united 2022-07-03 15:12:12 +02:00
Henk
d041ec0921 Safer defaults and more flexibility
There have been a lot of reports from newer users who experience AI breakdown because not all models properly handle 2048 max tokens. 1024 is the only value that all models support and was the original value KoboldAI used. This commit reverts the decision to increase this to 2048; any existing configurations are not affected. Users who wish to increase the max tokens can do so themselves. Most models handle up to 1900 well (the GPT-2 models are excluded), and for many you can go all the way. (It is currently not yet known why some finetunes cause a decrease in max token support.)

In addition, this commit addresses a request for more consistent slider behavior, allowing the sliders to be changed in 0.01 increments instead of some sliders being capped to 0.05.
2022-07-03 15:07:54 +02:00
ebolam
1141e39fbd Fix 2022-07-02 12:48:19 -04:00
ebolam
8280063621 fix 2022-07-02 12:36:32 -04:00
henk717
a99518d0a8 Merge branch 'KoboldAI:main' into united 2022-07-02 12:59:53 +02:00
Henk
e2f7fed99f Don't turn gamestarted off 2022-07-02 12:59:14 +02:00
henk717
74547b31d6 Merge pull request #167 from VE-FORBRYDERNE/accelerate
Fix base fairseq dense models when using accelerate with a GPU
2022-07-02 02:19:41 +02:00
vfbd
aeed9bd8f7 Fix base fairseq dense models when using accelerate with a GPU 2022-07-01 20:16:39 -04:00
ebolam
54a5e61ec2 fix 2022-07-01 20:15:17 -04:00
ebolam
b79ec8b1c5 Fix 2022-07-01 19:54:33 -04:00
ebolam
63f44f8204 Fix for select option 2022-07-01 19:40:34 -04:00
ebolam
516564ef6c Initial Load Story dialog 2022-07-01 19:24:20 -04:00
ebolam
1e815e7f52 Fix for action count on new version of save file 2022-07-01 18:08:45 -04:00
ebolam
0161966cea Env Fix 2022-07-01 17:24:06 -04:00
ebolam
6e841b87eb Fix for env 2022-07-01 16:53:51 -04:00
ebolam
73ad11c6d7 Fix for env variables 2022-07-01 16:49:47 -04:00
ebolam
18dd97de6b requirements update for micromamba 2022-07-01 16:18:08 -04:00
ebolam
ca07fdbe44 Added ability to use env variables instead of argparse (command argument = docker env variable) 2022-07-01 16:09:13 -04:00
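As a rough illustration of the idea (command argument = docker env variable), argparse defaults can be seeded from the environment before parsing, so passing a flag on the command line and setting the matching variable in the container behave the same. The flag and variable names below are hypothetical, not the ones this commit actually wires up:

    import argparse
    import os

    parser = argparse.ArgumentParser()
    parser.add_argument("--port", type=int, default=5000)
    parser.add_argument("--model", type=str, default=None)

    # For each defined flag, let an upper-cased environment variable of the same
    # name (PORT, MODEL, ...) override the default when the flag isn't passed.
    env_defaults = {}
    for action in parser._actions:
        key = action.dest.upper()
        if key in os.environ:
            raw = os.environ[key]
            env_defaults[action.dest] = action.type(raw) if action.type else raw
    parser.set_defaults(**env_defaults)

    args = parser.parse_args()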
ebolam
40bcf893d5 Preset Updates 2022-07-01 14:54:40 -04:00
henk717
5d957e33ae Merge branch 'KoboldAI:main' into united 2022-07-01 20:33:36 +02:00
henk717
90c5cebb6d Merge pull request #134 from VE-FORBRYDERNE/editor
Editor fixes
2022-07-01 20:32:18 +02:00
vfbd
c336a43544 Fix some remaining editor whitespace-fixing issues 2022-07-01 13:45:57 -04:00
vfbd
c3eade8046 Fix editor bug in iOS when adding newline at end of the last action
Not only does iOS have the same issue that Chromium-based browsers have, but it also has a different issue where it selects all the text in the last chunk of your story, so I added some code to deselect the text in that case.
2022-07-01 13:12:57 -04:00
ebolam
a56ef086e4 Estimated chunks going to generate 2022-07-01 11:27:43 -04:00