ebolam
12acb50ee0
Fix for getting "model download status" when downloading the config to determine layer counts
2022-07-25 18:29:14 -04:00
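For context, layer counts come from the model's config file, which is fetched before any weights; a minimal sketch of the idea (the model name and attribute fallbacks are assumptions, not KoboldAI's exact code):

```python
from transformers import AutoConfig

# The config download is small and separate from the weights, so the
# layer count can be read before committing to a full model download.
config = AutoConfig.from_pretrained("EleutherAI/gpt-neo-2.7B")

# The attribute name varies by architecture, hence the fallbacks.
layers = getattr(config, "num_layers", None) or getattr(config, "n_layer", None)
print(f"layer count: {layers}")
```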
henk717
46231519af
Merge pull request #172 from scott-ca/united
Added functionality to load all of the CLI arguments via a single JSON file
2022-07-26 00:14:46 +02:00
scott-ca
ce2efa0149
Update customsettings_template.json
2022-07-23 22:06:56 -06:00
scott-ca
9dc9966433
Added functionality to add any/all args via JSON
2022-07-23 22:02:03 -06:00
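As a rough illustration of the feature (the flag name and merge rule are assumptions based on the customsettings_template.json mentioned above, not the actual implementation), CLI arguments can be loaded from one JSON file by merging its values into the argparse defaults:

```python
import argparse
import json

parser = argparse.ArgumentParser()
parser.add_argument("--customsettings", help="path to a JSON file of arguments")
parser.add_argument("--model")
parser.add_argument("--port", type=int, default=5000)

args = parser.parse_args()
if args.customsettings:
    with open(args.customsettings) as f:
        settings = json.load(f)
    for key, value in settings.items():
        # Only apply JSON values the user did not already override on
        # the command line (i.e. the arg still holds its default).
        if getattr(args, key, None) == parser.get_default(key):
            setattr(args, key, value)
```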
ebolam
0ab3612e49
Merge branch 'henk717:united' into united
2022-07-22 13:58:58 -04:00
ebolam
907cf74b13
Added status bar for downloading models
2022-07-22 13:58:20 -04:00
henk717
e860eb161d
Merge branch 'KoboldAI:main' into united
2022-07-22 15:33:25 +02:00
henk717
92c5cfeb9b
Merge pull request #145 from LightSaveUs/patch-1
Default value fix
2022-07-22 15:32:09 +02:00
LightSaveUs
3865a2a77b
Default value fix
Fixing the default value
2022-07-22 16:27:45 +03:00
henk717
f1fd46fca6
Merge branch 'KoboldAI:main' into united
2022-07-22 15:19:19 +02:00
henk717
db35b20b49
Merge pull request #144 from LightSaveUs/main
Formatting fixes
2022-07-22 15:17:35 +02:00
LightSaveUs
9c2fde91e4
TPU Formatting Fix
Removed information about the irrelevant Chatbot style
2022-07-22 13:28:32 +03:00
LightSaveUs
4bd3b53c0a
GPU Formatting Fix
Removed information about the irrelevant Chatbot style
2022-07-22 13:27:57 +03:00
LightSaveUs
1eecef8c8b
GPU Formatting Fix #2
Deleted information about removed Convo and C1 chatbot models.
2022-07-22 13:15:08 +03:00
henk717
5f3783a294
Merge branch 'KoboldAI:main' into united
2022-07-21 20:02:45 +02:00
henk717
4bd9ac5a66
Merge pull request #141 from VE-FORBRYDERNE/lua
Correct typos in bridge.lua and API documentation
2022-07-21 20:02:35 +02:00
vfbd
cb037a2ae9
Correct typos in bridge.lua and API documentation
2022-07-21 13:59:06 -04:00
LightSaveUs
bd69cccefc
GPU Formatting Fix
Formatting fixes and unification
2022-07-21 03:10:14 +03:00
LightSaveUs
97ff194261
TPU Formatting Fix
Minor code cleanup, formatting fixes, and unification
2022-07-21 02:52:20 +03:00
ebolam
2b53598307
Fixes for file editor (#170)
Various fixes for the file editor by Ebolam
2022-07-20 00:50:03 +02:00
ebolam
a0475ba049
Moved emit action on file browser to button rather than icon for easier clicking
2022-07-19 18:16:01 -04:00
ebolam
f58064e72c
Revert "Fix for aidg.club website being taken read-only"
This reverts commit 23a031d852.
2022-07-19 16:54:32 -04:00
ebolam
c3fdee68a8
Revert "Revert "Fix for edit files""
This reverts commit 9c1fc5af8b.
2022-07-19 16:53:45 -04:00
ebolam
9c1fc5af8b
Revert "Fix for edit files"
This reverts commit aedd7e966b.
2022-07-19 14:02:27 -04:00
ebolam
23a031d852
Fix for aidg.club website being taken read-only
2022-07-19 13:40:55 -04:00
ebolam
68d143b80c
Merge branch 'united' of https://github.com/ebolam/KoboldAI into united
2022-07-15 12:30:18 -04:00
ebolam
d91ed3141d
Fix for non-ASCII files in edit mode
2022-07-15 12:30:02 -04:00
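A sketch of the kind of fix this implies (an assumption; the real change may differ): open edited files as UTF-8 with an explicit fallback instead of the platform's default codec, which fails on non-ASCII bytes on some systems:

```python
def read_text(path):
    # Try UTF-8 first; fall back to latin-1, which maps every byte,
    # so opening a file in the editor never fails outright.
    try:
        with open(path, encoding="utf-8") as f:
            return f.read()
    except UnicodeDecodeError:
        with open(path, encoding="latin-1") as f:
            return f.read()
```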
ebolam
9c136985a7
Merge branch 'henk717:united' into united
2022-07-15 12:29:00 -04:00
henk717
68110c5930
Merge branch 'KoboldAI:main' into united
2022-07-12 23:03:09 +02:00
henk717
025db3bd04
Merge pull request #138 from VE-FORBRYDERNE/lazy-loader
Fix for lazy loader in PyTorch 1.12
2022-07-12 23:02:58 +02:00
henk717
836759d826
Merge pull request #137 from VE-FORBRYDERNE/jaxlib
TPU Colab hotfix
2022-07-12 23:02:40 +02:00
vfbd
39d48495ce
Fix for lazy loader in PyTorch 1.12
PyTorch 1.12 no longer provides `torch._StorageBase`, but the lazy loader otherwise still works.
2022-07-12 16:48:01 -04:00
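A compatibility guard in this spirit might look like the following sketch (the duck-typed fallback is an illustrative assumption, not the repository's actual fix):

```python
import torch

# PyTorch 1.12 removed torch._StorageBase; detect the attribute rather
# than pinning a version so both old and new releases keep working.
STORAGE_BASE = getattr(torch, "_StorageBase", None)

def is_storage(obj):
    if STORAGE_BASE is not None:
        return isinstance(obj, STORAGE_BASE)
    # 1.12+: fall back to duck typing on the class name.
    return type(obj).__name__.endswith("Storage")
```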
vfbd
70aa182671
Restrict jaxlib version in TPU Colabs
2022-07-12 16:30:26 -04:00
henk717
f900a17f3c
Merge pull request #168 from VE-FORBRYDERNE/bloom
BLOOM support for TPU instances
2022-07-08 01:25:28 +02:00
vfbd
d9e7ca5b48
Upload map file for BLOOM
2022-07-07 17:48:00 -04:00
henk717
9e140e3ba9
Merge branch 'KoboldAI:main' into united
2022-07-05 21:35:53 +02:00
henk717
dd6da50e58
Merge pull request #136 from VE-FORBRYDERNE/opt
Fix base OPT-125M and finetuned OPT models in Colab TPU instances
2022-07-05 21:35:39 +02:00
vfbd
2a78b66932
Fix base OPT-125M and finetuned OPT models in Colab TPU instances
2022-07-05 15:28:58 -04:00
vfbd
c94f875608
Fix Z algorithm in basic phrase bias script
2022-07-05 14:43:58 -04:00
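For reference, the textbook Z algorithm (a generic implementation, not necessarily the script's exact code): z[i] is the length of the longest substring starting at i that is also a prefix of s.

```python
def z_array(s):
    # Classic O(n) Z algorithm maintaining the rightmost match window [l, r).
    n = len(s)
    z = [0] * n
    if n == 0:
        return z
    z[0] = n
    l = r = 0
    for i in range(1, n):
        if i < r:
            # Reuse the value from inside the window, clipped to its edge.
            z[i] = min(r - i, z[i - l])
        while i + z[i] < n and s[z[i]] == s[i + z[i]]:
            z[i] += 1
        if i + z[i] > r:
            l, r = i, i + z[i]
    return z
```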
ebolam
aedd7e966b
Fix for edit files
2022-07-04 19:08:30 -04:00
Henk
736a39b10b
gitignore update
2022-07-04 20:12:11 +02:00
Henk
b76e82644a
flask-session for conda
2022-07-04 20:07:11 +02:00
henk717
e8c39992a1
Merge pull request #166 from ebolam/united
Add file browser to soft prompts and user scripts
2022-07-04 19:52:05 +02:00
ebolam
8013bc2a98
Added background blur for popup file editor
2022-07-03 16:21:48 -04:00
ebolam
328c0a38d7
Removed breadcrumbs in the file browser above the jail directory
2022-07-03 16:02:55 -04:00
henk717
fd44f0ded3
Merge branch 'KoboldAI:main' into united
2022-07-03 15:12:12 +02:00
Henk
d041ec0921
Safer defaults and more flexibility
There have been many reports from newer users who experience AI breakdown because not all models properly handle 2048 max tokens. 1024 is the only value that all models support and was the original value KoboldAI used. This commit reverts the decision to increase it to 2048; existing configurations are not affected. Users who wish to increase the max tokens can do so themselves. Most models handle up to 1900 well (the GPT-2 models are excluded), and for many you can go all the way. (It is not yet known why some finetunes reduce max token support.)
In addition, this commit contains a request for more consistent slider behavior, allowing the sliders to be changed at 0.01 intervals instead of some sliders being capped at 0.05.
2022-07-03 15:07:54 +02:00
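A minimal sketch of the default-handling idea (names are assumptions, not KoboldAI's settings code): keep 1024 as the shipped default, leave any value a user already saved untouched, and step sliders uniformly at 0.01:

```python
SAFE_MAX_TOKENS = 1024  # the only max-token value every model supports
SLIDER_STEP = 0.01      # uniform step instead of capping some sliders at 0.05

def max_tokens(saved_value=None):
    # Existing configurations keep whatever the user chose; only fresh
    # installs get the safer default.
    return SAFE_MAX_TOKENS if saved_value is None else saved_value
```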
henk717
a99518d0a8
Merge branch 'KoboldAI:main' into united
2022-07-02 12:59:53 +02:00
Henk
e2f7fed99f
Don't turn gamestarted off
2022-07-02 12:59:14 +02:00
henk717
74547b31d6
Merge pull request #167 from VE-FORBRYDERNE/accelerate
Fix base fairseq dense models when using accelerate with a GPU
2022-07-02 02:19:41 +02:00