4822619192  ebolam  2023-05-22 20:47:14 -04:00
    Fix for model backends with no inputs failing to load in the UI
4c25d6fbbb  ebolam  2023-05-22 20:34:01 -04:00
    Fix for loading a model multiple times losing the GPU/CPU splits
9e53bcf676  ebolam  2023-05-22 20:24:57 -04:00
    Fix for breakmodel loading to CPU when set to GPU

f1a16f260f  ebolam  2023-05-22 16:10:41 -04:00
    Potential breakmodel fix

ca770844b0  ebolam  2023-05-22 15:07:59 -04:00
    Fix for breakmodel

dc20e6dde9  ebolam  2023-05-22 15:04:33 -04:00
    Fix for unloading models

925cad2e2f  ebolam  2023-05-22 14:50:13 -04:00
    Better compatibility with hf model backend

513b8575e7  ebolam  2023-05-20 11:01:49 -04:00
    Fix for missing import
    Fix for model name being a path which caused save issues

19559d5eef  ebolam  2023-05-19 19:15:25 -04:00
    Fix for colors in the classic UI

128c77e0fd  ebolam  2023-05-19 19:01:11 -04:00
    Default model backend to huggingface if not present when loading a model through the command line

a1ee6849dc  ebolam  2023-05-19 18:28:47 -04:00
    Custom paths from menu structure fixed

6df5fe4ad0  ebolam  2023-05-19 18:24:06 -04:00
    Partial load model from custom path in menu

309f1c432a  ebolam  2023-05-19 17:43:13 -04:00
    Added the ability to disable model backends in the model backend code

b21884fc31  ebolam  2023-05-19 17:34:15 -04:00
    Better error reporting

db30402c3b  ebolam  2023-05-19 17:30:36 -04:00
    Move RWKV to use Huggingface model backend

756a33c63e  ebolam  2023-05-19 17:28:39 -04:00
    Added try loop on model backend so it will continue with other models

c32932998d  0cc4m  2023-05-19 21:51:38 +02:00
    Update GPTQ module to 0.0.4

9df1f03b12  ebolam  2023-05-19 14:28:36 -04:00
    Fix for custom huggingface model menu entry

a1036465af  ebolam  2023-05-19 12:46:02 -04:00
    Add warning about command line changes and new modular backend

caef2edcfc  ebolam  2023-05-19 12:35:39 -04:00
    Migrated load dialog to UI1

d5eac13d9f  0cc4m  2023-05-19 18:22:26 +02:00
    Fix 2, 3 and 8-bit loading

7e0778c871  ebolam  2023-05-19 09:14:37 -04:00
    Remove extra debug stuff

36e679b366  ebolam  2023-05-19 09:11:22 -04:00
    Merge branch 'Model_Plugins' of https://github.com/ebolam/KoboldAI into Model_Plugins

99cffd4755  ebolam  2023-05-19 09:11:08 -04:00
    Colab GPU edition fixes

3db231562f  ebolam  2023-05-19 06:05:25 -04:00
    Merge pull request #382 from henk717/united
    Update to united

56d2705f4b  ebolam  2023-05-18 20:19:33 -04:00
    Removed breakmodel command line arguments (except nobreakmodel)

06f59a7b7b  ebolam  2023-05-18 20:14:33 -04:00
    Moved model backends to separate folders
    Added some model backend settings save/load

4040538d34  ebolam  2023-05-18 18:34:00 -04:00
    Model backends now defined in the menu

182ecff202  ebolam  2023-05-18 16:01:17 -04:00
    Added model backend to the command line arguments

2c18d9f2b5  0cc4m  2023-05-18 21:51:03 +02:00
    Update GPTQ module to 0.0.3

f027d8b6e5  ebolam  2023-05-17 21:15:31 -04:00
    Better working valid detection and named model backends for UI
b2501e4693  Henk  2023-05-16 22:15:59 +02:00
    4.29 was still too buggy
59c96b5b7a  Henk  2023-05-15 22:38:12 +02:00
    Unban fix

c5100b4eab  Henk  2023-05-15 22:21:22 +02:00
    Unban Tensor

56443bc7ea  Henk  2023-05-15 21:44:01 +02:00
    Unban torch._tensor._rebuild_tensor_v2

3d4d5df76b  0cc4m  2023-05-13 20:33:13 +02:00
    Remove rocm wheel, because it didn't work correctly

7f7b350741  0cc4m  2023-05-13 20:31:01 +02:00
    Catch further error during multigpu 4bit setup

205c64f1ea  Henk  2023-05-13 20:26:55 +02:00
    More universal pytorch folder detection

266c0574f6  0cc4m  2023-05-13 20:15:11 +02:00
    Fix 4bit pt loading, add traceback output to GPT2 fallback

c6b17889d0  ebolam  2023-05-12 07:53:27 -04:00
    Updated to latest united

67df9b917f  Henk  2023-05-12 09:08:07 +02:00
    Reintroduce 4.29 Transformers

db32aba74d  henk717  2023-05-12 08:40:00 +02:00
    Merge pull request #359 from one-some/gptj-fix
    GPT-J fix

aaa9133899  ebolam  2023-05-11 21:22:33 -04:00
    Disk Cache working
    UI valid marker broken for disk cache

a6f0e97ba0  ebolam  2023-05-11 20:40:05 -04:00
    Working(?) breakmodel

69d942c00c  ebolam  2023-05-11 20:22:30 -04:00
    Kind of working breakmodel

3065c1b40e  somebody  2023-05-11 17:10:43 -05:00
    Ignore missing keys in get_original_key

c16336f646  somebody  2023-05-11 17:10:19 -05:00
    Add traceback to debug log on fallback

6838563ea9  somebody  2023-05-11 16:32:25 -05:00
    Merge branch 'united' of https://github.com/henk717/KoboldAI into united

a9c785d0f0  ebolam  2023-05-11 14:20:14 -04:00
    Fix for Horde

e9c845dc2a  ebolam  2023-05-11 14:14:52 -04:00
    Fix for badwordIDs