Commit Graph

1595 Commits

Author   | SHA1       | Message                                                                                  | Date
somebody | 1546b9efaa | Hello its breaking breakmodel time                                                       | 2023-05-27 16:31:53 -05:00
henk717  | 97d2a78899 | Merge pull request #362 from ebolam/Model_Plugins: Implement modular model backends Phase 1 | 2023-05-27 15:33:20 +02:00
ebolam   | cce5c1932c | Fix for custom model names                                                               | 2023-05-26 21:40:39 -04:00
ebolam   | 9bc9021843 | Added better help message for model_parameters in command line arguments                 | 2023-05-26 21:16:54 -04:00
ebolam   | 9723154bed | Fix for --path                                                                           | 2023-05-26 20:10:11 -04:00
ebolam   | acf5b40cd8 | Bug fix                                                                                  | 2023-05-26 19:38:37 -04:00
ebolam   | d2c95bc60f | Fix for non-jailed menu path navigation                                                  | 2023-05-26 10:33:59 -04:00
ebolam   | adb77b8651 | Fix for horde and multi-selected models                                                  | 2023-05-25 18:43:56 -04:00
ebolam   | c9523a340e | TPU Fix                                                                                  | 2023-05-24 19:50:08 -04:00
ebolam   | b0ed7da9dd | more tpu debugging                                                                       | 2023-05-24 19:47:45 -04:00
ebolam   | 54221942ef | TPU Fix                                                                                  | 2023-05-24 19:43:32 -04:00
ebolam   | 6a62726575 | TPU Fix?                                                                                 | 2023-05-24 19:30:23 -04:00
ebolam   | b116e22bca | Fix for colab                                                                            | 2023-05-24 16:47:19 -04:00
ebolam   | 92f592ea20 | Fix for model name not showing correctly on load in UI1                                  | 2023-05-24 11:48:25 -04:00
ebolam   | c61e2b676a | More environmental variable feedback                                                     | 2023-05-24 09:05:21 -04:00
ebolam   | 9d708bc424 | Logging of environmental variables over-riding command line arguments                    | 2023-05-24 08:56:52 -04:00
Henk     | 37799af85c | Show IP for localtunnel                                                                  | 2023-05-23 23:44:52 +02:00
ebolam   | 7a8e4c39da | Fix for attention bias                                                                   | 2023-05-23 08:35:15 -04:00
ebolam   | 513b8575e7 | Fix for missing import; Fix for model name being a path which caused save issues         | 2023-05-20 11:01:49 -04:00
ebolam   | 128c77e0fd | Default model backend to huggingface if not present when loading a model through the command line | 2023-05-19 19:01:11 -04:00
ebolam   | a1ee6849dc | Custom Paths from Menu structure fixed                                                   | 2023-05-19 18:28:47 -04:00
ebolam   | 6df5fe4ad0 | partial load model from custom path in menu                                              | 2023-05-19 18:24:06 -04:00
ebolam   | 309f1c432a | Added the ability to disable model backends in the model backend code.                   | 2023-05-19 17:43:13 -04:00
ebolam   | b21884fc31 | Better error reporting                                                                   | 2023-05-19 17:34:15 -04:00
ebolam   | db30402c3b | Move RWKV to use Huggingface model backend                                               | 2023-05-19 17:30:36 -04:00
ebolam   | 756a33c63e | Added try loop on model backend so it will continue with other models.                   | 2023-05-19 17:28:39 -04:00
ebolam   | 9df1f03b12 | Fix for custom huggingface model menu entry                                              | 2023-05-19 14:28:36 -04:00
ebolam   | 7e0778c871 | Remove extra debug stuff                                                                 | 2023-05-19 09:14:37 -04:00
ebolam   | 99cffd4755 | Colab GPU edition fixes                                                                  | 2023-05-19 09:11:08 -04:00
ebolam   | 56d2705f4b | removed breakmodel command line arguments (except nobreakmodel)                          | 2023-05-18 20:19:33 -04:00
ebolam   | 06f59a7b7b | Moved model backends to separate folders; added some model backend settings save/load    | 2023-05-18 20:14:33 -04:00
ebolam   | 4040538d34 | Model Backends now defined in the menu                                                   | 2023-05-18 18:34:00 -04:00
ebolam   | 182ecff202 | Added in model backend to the command line arguments                                     | 2023-05-18 16:01:17 -04:00
ebolam   | f027d8b6e5 | Better working valid detection and named model backends for UI                           | 2023-05-17 21:15:31 -04:00
ebolam   | c6b17889d0 | Updated to latest united                                                                 | 2023-05-12 07:53:27 -04:00
Henk     | 67df9b917f | Reintroduce 4.29 Transformers                                                            | 2023-05-12 09:08:07 +02:00
ebolam   | aaa9133899 | Disk Cache working; UI valid marker broken for disk cache                                | 2023-05-11 21:22:33 -04:00
ebolam   | 69d942c00c | Kind of working breakmodel                                                               | 2023-05-11 20:22:30 -04:00
Henk     | 20b54eb9ff | Revert 4.29 due to unforseen consequences                                                | 2023-05-11 19:06:39 +02:00
ebolam   | 4605d10c37 | Next iteration. Model Loading is broken completely now :)                                | 2023-05-11 12:08:35 -04:00
ebolam   | 77dd5aa725 | Minor update                                                                             | 2023-05-11 09:09:09 -04:00
Henk     | e932364a1e | RWKV support                                                                             | 2023-05-11 14:56:12 +02:00
ebolam   | 71aee4dbd8 | First concept of model plugins with a conceptual UI. Completely breaks UI2 model loading. | 2023-05-10 16:30:46 -04:00
somebody | a9e342ca64 | Fix TPU API errors                                                                       | 2023-05-08 17:34:59 -05:00
somebody | f02ddab7c7 | Merge branch 'united' of https://github.com/henk717/KoboldAI into peft                   | 2023-05-06 10:47:14 -05:00
Henk     | 2730879c61 | Better warning until something more robust is in                                         | 2023-05-05 21:28:06 +02:00
Henk     | d508b4a319 | More max_context_length flexibility                                                      | 2023-05-05 19:50:56 +02:00
Henk     | 33969b5845 | Basic HF code execution support                                                          | 2023-05-05 17:23:01 +02:00
somebody | 35b56117e6 | Basic PEFT support                                                                       | 2023-05-03 18:51:01 -05:00
henk717  | 7f5242db17 | Merge pull request #344 from pi6am/fix/llama-tokens: Fix/llama tokens                    | 2023-05-03 19:07:47 +02:00