Commit Graph

966 Commits

Author SHA1 Message Date
4625158d30 Fix typo in previous commit 2022-03-05 12:56:42 -05:00
0a258a6282 Support for loading HF models on TPU with --colab_tpu 2022-03-05 12:33:33 -05:00
86ac562b0c Lazy loader should convert model tensors to float16 before moving them 2022-03-05 11:31:34 -05:00
4dd119c38d Redo no longer goes through formatting function (thereby getting changed) 2022-03-05 11:15:33 -05:00
353817b4da Remove debug print statements 2022-03-05 10:35:06 -05:00
221f264fa7 Redo fix. Fix for actions structure to not error out when asking for next_id when the actions list is empty. 2022-03-05 10:31:28 -05:00
a00dede610 Put the XGLM embedding patch behind a version check 2022-03-04 19:10:15 -05:00
5674516f0c Merge branch 'united' into lazy-loader 2022-03-04 18:27:51 -05:00
8e12b7df61 Merge pull request #90 from ebolam/united: Redo Bug Fix 2022-03-04 22:10:49 +01:00
5f92cbc231 Merge branch 'united' of https://github.com/ebolam/KoboldAI into united 2022-03-04 15:37:34 -05:00
321f45ccad Fix debug to never crash (would on some initialization steps) 2022-03-04 15:36:13 -05:00
ee883fc4da Merge branch 'henk717:united' into united 2022-03-04 14:15:16 -05:00
26b9268391 Redo bug fix 2022-03-04 14:14:44 -05:00
eb247d69c3 Merge branch 'KoboldAI:main' into united 2022-03-04 18:24:56 +01:00
657de72ada Merge: Better name formatting for chatmode 2022-03-04 18:24:39 +01:00
4474607f88 Merge branch 'united' into lazy-loader 2022-03-04 11:12:29 -05:00
a1fedca2c8 Use lazy loading automatically if a config file exists for the model 2022-03-04 11:11:33 -05:00
ff1be78f72 Merge pull request #1 from MrReplikant/MrReplikant-patch-1: Fixed unnecessary spacing in chatmode 2022-03-04 08:46:43 -06:00
ae143e896c Fixed unnecessary spacing in chatmode (this makes it go from "john :" to "John:", as it's supposed to be; as simple as it is, it can easily throw a chatbot model for a loop) 2022-03-04 08:46:00 -06:00
addc7edd49 Merge branch 'KoboldAI:main' into united 2022-03-04 11:34:04 +01:00
749d4a1c48 Update Colab Descriptions (GPU) 2022-03-04 11:33:05 +01:00
fade5fdd60 Update model descriptions (TPU) 2022-03-04 11:31:03 +01:00
2936778dbc Merge branch 'KoboldAI:main' into united 2022-03-04 09:56:35 +01:00
2aeb2c6607 Add Janeway 6B and Shinen 6B 2022-03-04 09:53:34 +01:00
f0629958b1 Merge branch 'united' into lazy-loader 2022-03-04 00:37:25 -05:00
58a2c18821 Add lazy torch loading support to transformers backend 2022-03-04 00:33:10 -05:00
1515996fca Fix torch_lazy_loader seek offset calculation 2022-03-03 23:53:40 -05:00
24bc0f81ea Remove duplicate torch_load definition 2022-03-03 19:55:31 -05:00
8e6e04be5f (torch_lazy_loader.py) Add dematerialized modules setting 2022-03-03 11:17:59 -05:00
1ecc452dc8 (torch_lazy_loader.py) Add support for materializing from a ZipExtFile 2022-03-02 13:08:21 -05:00
e033b04f87 Restore United 2022-03-02 11:40:50 +01:00
f9ac23ba4e Add Janeway and Shinen 2022-03-02 09:51:25 +01:00
c8ece04b1d Merge pull request #99 from VE-FORBRYDERNE/mutation-observer: Re-enable the editor mutation observer 2022-03-02 09:39:03 +01:00
c338b52d68 (torch_lazy_loader.py) Handle checkpoints with merged storage blocks 2022-03-02 01:02:35 -05:00
4fa4dbac50 Clean up when error is thrown in use_lazy_torch_load 2022-03-01 19:30:22 -05:00
a0344b429c Upload torch_lazy_loader.py 2022-03-01 15:40:44 -05:00
3f73f84b69 bug fix 2022-02-28 19:04:12 -05:00
d8e99b12f1 Re-enable the editor mutation observer 2022-02-28 19:00:26 -05:00
50ad6864c9 Merge pull request #87 from ebolam/united: Debug and load story fix for actions_metadata variable 2022-02-28 16:58:49 +01:00
6003b2369b Debug and load story fix for actions_metadata variable 2022-02-28 10:39:36 -05:00
261981da45 Merge pull request #86 from ebolam/united: Fixed error in redo action 2022-02-28 14:43:53 +01:00
47d102635e Merge branch 'united' into united 2022-02-28 08:37:45 -05:00
7803fbb137 Fixed error in redo action when editing previous entries and/or editing right after a redo 2022-02-28 08:31:26 -05:00
13fe472264 Menu Polish 2022-02-28 02:47:15 +01:00
f628929401 Merge pull request #85 from VE-FORBRYDERNE/sp: Fix a bug with soft prompts when using transformers XGLM 2022-02-28 02:33:18 +01:00
4849a30d88 Merge pull request #84 from mrseeker/patch-3: Added KoboldAI/fairseq-dense-2.7B-Janeway 2022-02-28 02:33:07 +01:00
a466e13c00 Model List Support 2022-02-26 12:34:07 +01:00
a22d59e191 Fix a bug with soft prompts when using transformers XGLM 2022-02-25 12:35:23 -05:00
0a7376a711 Added KoboldAI/fairseq-dense-2.7B-Janeway (with pleasure I am introducing KoboldAI/fairseq-dense-2.7B-Janeway) 2022-02-24 09:00:56 +01:00
1fc173890e Merge pull request #83 from VE-FORBRYDERNE/loadsettings: Load settings earlier to avoid TPU badwords issues 2022-02-24 04:24:28 +01:00