Commit Graph

88 Commits

Author SHA1 Message Date
Henk 565ab8a38f badwordids -> badwordsids typofix 2023-08-30 17:00:51 +02:00
Henk 49fa63052f Allow EOS unbanning 2023-08-29 20:51:09 +02:00
somebody 906d1f2522 Merge branch 'united' of https://github.com/henk717/KoboldAI into fixing-time 2023-08-07 16:22:04 -05:00
henk717 21d20854e4 Merge pull request #414 from one-some/submit-ctx-menu 2023-07-30 01:58:44 +02:00
    Submit context menu
Henk 889fe8d548 Fix Peft 2023-07-26 19:35:55 +02:00
somebody ba313883b6 Merge branch 'united' of https://github.com/henk717/KoboldAI into submit-ctx-menu 2023-07-24 10:09:38 -05:00
Henk 30495cf8d8 Fix GPT2 2023-07-24 02:05:07 +02:00
henk717 1facc73b66 Merge pull request #367 from 0cc4m/4bit-plugin 2023-07-23 22:32:20 +02:00
    GPTQ module
0cc4m 09bb1021dd Fallback to transformers if hf_bleeding_edge not available 2023-07-23 07:16:52 +02:00
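The fallback in 09bb1021dd is presumably an import-time guard; a minimal sketch of the pattern, assuming hf_bleeding_edge mirrors the transformers auto-class names (the function below is illustrative, not taken from the commit):

```python
# Sketch of an import-time fallback: prefer the hf_bleeding_edge wrappers,
# fall back to stock transformers if the package is not installed.
# (Assumes hf_bleeding_edge exposes the same auto-classes as transformers.)
try:
    from hf_bleeding_edge import AutoConfig, AutoModelForCausalLM
except ImportError:
    from transformers import AutoConfig, AutoModelForCausalLM


def load_causal_lm(model_name: str):
    """Load a causal LM with whichever backend the import above resolved to."""
    config = AutoConfig.from_pretrained(model_name)
    return AutoModelForCausalLM.from_pretrained(model_name, config=config)
```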
somebody 5f4216730e Make logit bias work correctly(?) when prob is -inf 2023-07-21 18:33:35 -05:00
    samplers'll do that to you
    though now i am curious: what kind of effect would running the bias before the samplers have? maybe a future option
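The -inf case arises because samplers mask filtered tokens by setting their logits to -inf, so adding a finite bias afterwards leaves the token unselectable. Below is a hedged sketch of one way to handle that, not the exact fix in 5f4216730e:

```python
import torch


def apply_logit_bias(scores: torch.Tensor, bias: dict) -> torch.Tensor:
    """Apply per-token logit biases after the samplers have run.

    scores: (batch, vocab_size) logits; samplers mark filtered tokens as -inf.
    bias:   {token_id: bias_value} mapping.
    Illustrative handling only: a positive bias overwrites a -inf entry so the
    token becomes selectable again, since -inf + bias would stay -inf.
    """
    for token_id, value in bias.items():
        current = scores[..., token_id]
        if value > 0:
            masked = torch.isinf(current) & (current < 0)
            scores[..., token_id] = torch.where(
                masked, torch.full_like(current, value), current + value
            )
        else:
            scores[..., token_id] = current + value
    return scores
```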
somebody e5d0a597a1 Generation Mode: UNTIL_EOS 2023-07-21 15:36:32 -05:00
    This mode enables the EOS token and will generate infinitely until hitting it.
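In Hugging Face terms, the UNTIL_EOS mode from e5d0a597a1 roughly amounts to leaving the EOS token unbanned and letting eos_token_id end generation rather than a short length cap; a hedged sketch of that idea (not the actual KoboldAI code path):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer


def generate_until_eos(model_name: str, prompt: str, hard_cap: int = 4096) -> str:
    """Generate until the model emits EOS; hard_cap is only a safety ceiling."""
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(
        **inputs,
        do_sample=True,
        max_new_tokens=hard_cap,              # "infinite" in spirit, bounded in practice
        eos_token_id=tokenizer.eos_token_id,  # stop only when EOS is actually produced
    )
    return tokenizer.decode(output[0], skip_special_tokens=True)
```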
somebody fa0a099943 Update comment 2023-07-21 10:38:17 -05:00
somebody fef42a6273 API: Fix loading 2023-07-19 11:52:39 -05:00
0cc4m e78361fc8f Pull upstream changes, fix conflicts 2023-07-15 23:01:52 +02:00
Henk 2c50d5d092 Don't ruin breakmodel 2023-07-15 14:14:06 +02:00
somebody 3928d86339 Fall back to unpatched HF 2023-07-08 14:36:45 -05:00
somebody c2ee30af32 Add --panic to raise when loading fails 2023-07-08 14:04:46 -05:00
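The --panic flag from c2ee30af32 presumably toggles whether a loading error is swallowed by the fallback path or re-raised; a minimal sketch of that control flow, with everything except the flag name assumed:

```python
import argparse
import logging
import traceback

logger = logging.getLogger(__name__)

parser = argparse.ArgumentParser()
parser.add_argument("--panic", action="store_true",
                    help="raise instead of falling back when model loading fails")
args = parser.parse_args()


def load_with_fallback(primary_loader, fallback_loader, model_path: str):
    """Try the primary loader; on failure either re-raise (--panic) or fall back."""
    try:
        return primary_loader(model_path)
    except Exception:
        logger.debug(traceback.format_exc())  # keep the real traceback in the debug log
        if args.panic:
            raise  # --panic: surface the original loading error
        return fallback_loader(model_path)
```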
Henk 16240878bc Restore --peft support 2023-07-04 20:42:29 +02:00
somebody bce1a907e5 Update aux device to depend on primary device 2023-07-03 19:36:31 -05:00
somebody 6f7e6422ef Actually get correct primary device 2023-07-03 19:04:48 -05:00
somebody 59c731f805 Fix static primary_device 2023-07-03 18:37:48 -05:00
    and some small cleanup
Henk 81e72329af CPU fixes 2023-07-02 21:50:23 +02:00
Henk 1da4580e8b Remove wrong usegpu behavior 2023-06-22 07:07:02 +02:00
somebody 5ee20bd7d6 Fix for CPU loading 2023-06-21 21:18:43 -05:00
somebody b81f61b820 Clean debug 2023-06-21 18:35:56 -05:00
somebody 947bcc58e4 Experiments 2023-06-21 17:33:14 -05:00
somebody c40649a74e Probably fix f32 2023-06-21 16:54:41 -05:00
somebody aca2b532d7 Remove debug 2023-06-21 14:15:38 -05:00
somebody 5f224e1366 Restore choice of lazyload or not 2023-06-21 14:13:14 -05:00
somebody 0052ad401a Basic breakmodel ui support 2023-06-21 13:57:32 -05:00
    Seems to work
somebody f326fc07e8 Seems to work 2023-05-31 14:42:05 -05:00
somebody 24b0b32829 Maybe works now...? 2023-05-31 14:31:08 -05:00
somebody ac4384ef75 Auto _no_split_modules 2023-05-31 10:55:46 -05:00
somebody 58ffad237b OPT hack 2023-05-29 13:34:11 -05:00
somebody ceaefa9f5e Not quite 2023-05-28 14:57:45 -05:00
0cc4m d71a63fa49 Merge ebolam's model-plugins branch 2023-05-28 09:26:13 +02:00
somebody 1546b9efaa Hello its breaking breakmodel time 2023-05-27 16:31:53 -05:00
ebolam 5561cc1f22 Fix for GPU generation 2023-05-23 08:33:19 -04:00
ebolam 4c25d6fbbb Fix for loading model multiple times losing the gpu/cpu splits 2023-05-22 20:34:01 -04:00
ebolam 9e53bcf676 Fix for breakmodel loading to CPU when set to GPU 2023-05-22 20:24:57 -04:00
ebolam f1a16f260f Potential breakmodel fix 2023-05-22 16:10:41 -04:00
ebolam ca770844b0 Fix for breakmodel 2023-05-22 15:07:59 -04:00
ebolam 3db231562f Merge pull request #382 from henk717/united 2023-05-19 06:05:25 -04:00
    Update to united
ebolam 56d2705f4b removed breakmodel command line arguments (except nobreakmodel) 2023-05-18 20:19:33 -04:00
ebolam 06f59a7b7b Moved model backends to separate folders 2023-05-18 20:14:33 -04:00
    added some model backend settings save/load
Henk 205c64f1ea More universal pytorch folder detection 2023-05-13 20:26:55 +02:00
0cc4m 266c0574f6 Fix 4bit pt loading, add traceback output to GPT2 fallback 2023-05-13 20:15:11 +02:00
somebody 3065c1b40e Ignore missing keys in get_original_key 2023-05-11 17:10:43 -05:00
somebody c16336f646 Add traceback to debug log on fallback 2023-05-11 17:10:19 -05:00
ebolam 71aee4dbd8 First concept of model plugins with a conceptual UI. 2023-05-10 16:30:46 -04:00
    Completely breaks UI2 model loading.