Commit Graph

701 Commits

SHA1 Message Date
a28e553412 Remove unused gettokenids 2022-03-09 11:59:33 +01:00
0943926f6a Fix for lazy loading 2022-03-07 19:52:44 -05:00
bfc07073e3 layer count fix 2022-03-07 19:33:24 -05:00
d8ab58892d saved layer value fix 2022-03-07 19:21:55 -05:00
da53d7edb3 Custom Path Load fix 2022-03-07 18:54:11 -05:00
d1a64e25da Custom Model Load Fix 2022-03-07 18:44:37 -05:00
70f1c2da9c Added stub for model name feedback 2022-03-07 14:20:25 -05:00
d0553779ab Bug Fix 2022-03-07 12:33:35 -05:00
c50fe77a7d Load Fix 2022-03-07 11:57:33 -05:00
49fc854e55 Added saving of breakmodel values so that it defaults to it on next load 2022-03-07 11:49:34 -05:00
2cf6b6e650 Merge branch 'henk717:united' into united 2022-03-07 11:31:14 -05:00
123cd45b0e Breakmodel working now with the web UI 2022-03-07 11:27:23 -05:00
7434c9221b Expand OAI Setting Compatibility 2022-03-07 08:56:47 +01:00
5e00f7daf0 Next evolution of web ui model selection. Custom Paths not working quite right. 2022-03-06 20:55:11 -05:00
2ddf45141b Initial UI-based model loading. Includes all parameters except breakmodel chunks, engine # for OAI, and the ngrok URL for Google Colab 2022-03-06 19:51:35 -05:00
f6c95f18fa Fix for Redo (#94)
* Corrected redo to skip blank steps (blank from "deleting" the chunk with the edit function)

* Removed debug code
2022-03-06 23:18:14 +01:00
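
(For illustration only: a minimal sketch of a redo that skips blank steps, assuming a hypothetical `redo_stack` of text chunks in which chunks "deleted" via the edit function are stored as empty strings. This is not the repository's actual code.)

```python
# Illustrative sketch only -- not KoboldAI's implementation.
def redo(actions, redo_stack):
    """Re-apply the most recent undone chunk, skipping blank entries."""
    while redo_stack:
        chunk = redo_stack.pop()
        if chunk.strip():          # skip blank steps left behind by chunk deletion
            actions.append(chunk)
            return chunk
    return None                    # nothing left to redo
```
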
f857696224 OAI ConfigName Bugfix 2022-03-06 20:18:42 +01:00
3ddc9647eb Basic GooseAI Support 2022-03-06 20:10:30 +01:00
daea4b8d15 Fix Breakmodel RAM Regression 2022-03-06 08:26:50 +01:00
105d3831b5 Lazy Load Float32 for CPU 2022-03-06 07:56:04 +01:00
373f7b9bd5 Don't convert tensors to float16 if using CPU-only mode 2022-03-05 14:30:26 -05:00
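
(For illustration only: a minimal sketch of keeping weights in float32 when running CPU-only and converting to float16 otherwise; the function name and flag are hypothetical and do not reflect the actual lazy-loader code.)

```python
import torch

# Illustrative sketch only -- not the repository's lazy loader.
def maybe_halve(tensor: torch.Tensor, cpu_only: bool) -> torch.Tensor:
    if cpu_only:
        return tensor.float()      # stay in float32: half precision is slow/unsupported on many CPUs
    return tensor.half()           # convert to float16 before moving to the GPU
```
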
579e85820c Resolve merge conflict 2022-03-05 14:13:56 -05:00
2e19ea1bb6 Auto detect if we're in a Colab TPU instance 2022-03-05 14:07:23 -05:00
4a8d7f5e0b Merge branch 'henk717:united' into united 2022-03-05 13:25:10 -05:00
0a258a6282 Support for loading HF models on TPU with --colab_tpu 2022-03-05 12:33:33 -05:00
86ac562b0c Lazy loader should convert model tensors to float16 before moving them 2022-03-05 11:31:34 -05:00
4dd119c38d Redo no longer goes through formatting function (thereby getting changed) 2022-03-05 11:15:33 -05:00
353817b4da Remove debug print statements 2022-03-05 10:35:06 -05:00
221f264fa7 Redo fix. Fix for actions structure to not error out when asking for next_id when the actions list is empty. 2022-03-05 10:31:28 -05:00
a00dede610 Put the XGLM embedding patch behind a version check 2022-03-04 19:10:15 -05:00
5674516f0c Merge branch 'united' into lazy-loader 2022-03-04 18:27:51 -05:00
5f92cbc231 Merge branch 'united' of https://github.com/ebolam/KoboldAI into united 2022-03-04 15:37:34 -05:00
321f45ccad Fix debug to never crash (would on some initialization steps) 2022-03-04 15:36:13 -05:00
ee883fc4da Merge branch 'henk717:united' into united 2022-03-04 14:15:16 -05:00
26b9268391 Redo bug fix 2022-03-04 14:14:44 -05:00
eb247d69c3 Merge branch 'KoboldAI:main' into united 2022-03-04 18:24:56 +01:00
a1fedca2c8 Use lazy loading automatically if a config file exists for the model 2022-03-04 11:11:33 -05:00
ae143e896c Fixed unnecessary spacing in chatmode
This makes it go from "john :" to "John:", as it's supposed to be. As simple as it is, it can easily throw a chatbot model for a loop.
2022-03-04 08:46:00 -06:00
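
(For illustration only: a minimal sketch of normalizing a chat prefix such as "john :" to "John:"; the function and its usage are hypothetical, not the actual chatmode code.)

```python
import re

# Illustrative sketch only -- not KoboldAI's chatmode implementation.
def normalize_chat_name(line: str, name: str) -> str:
    """Collapse spacing around the speaker's colon and capitalize the name."""
    pattern = re.compile(rf"^\s*{re.escape(name)}\s*:", re.IGNORECASE)
    return pattern.sub(f"{name.capitalize()}:", line)

print(normalize_chat_name("john : hello there", "john"))  # -> "John: hello there"
```
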
f0629958b1 Merge branch 'united' into lazy-loader 2022-03-04 00:37:25 -05:00
58a2c18821 Add lazy torch loading support to transformers backend 2022-03-04 00:33:10 -05:00
e033b04f87 Restore United 2022-03-02 11:40:50 +01:00
f9ac23ba4e Add Janeway and Shinen 2022-03-02 09:51:25 +01:00
3f73f84b69 bug fix 2022-02-28 19:04:12 -05:00
6003b2369b Debug and load story fix for actions_metadata variable 2022-02-28 10:39:36 -05:00
47d102635e Merge branch 'united' into united 2022-02-28 08:37:45 -05:00
7803fbb137 Fixed error in redo action when editing previous entries and/or editing right after a redo 2022-02-28 08:31:26 -05:00
13fe472264 Menu Polish 2022-02-28 02:47:15 +01:00
f628929401 Merge pull request #85 from VE-FORBRYDERNE/sp
Fix a bug with soft prompts when using transformers XGLM
2022-02-28 02:33:18 +01:00
4849a30d88 Merge pull request #84 from mrseeker/patch-3
Added KoboldAI/fairseq-dense-2.7B-Janeway
2022-02-28 02:33:07 +01:00
a466e13c00 Model List Support 2022-02-26 12:34:07 +01:00