Commit Graph

998 Commits

Author SHA1 Message Date
MrReplikant ae143e896c Fixed unnecessary spacing in chatmode 2022-03-04 08:46:00 -06:00
This changes the chat prefix from "john :" to "John:", as intended. As simple as it is, the stray space can easily throw a chatbot model for a loop.
henk717 addc7edd49 Merge branch 'KoboldAI:main' into united 2022-03-04 11:34:04 +01:00
henk717 749d4a1c48 Update Colab Descriptions (GPU) 2022-03-04 11:33:05 +01:00
henk717 fade5fdd60 Update model descriptions (TPU) 2022-03-04 11:31:03 +01:00
henk717 2936778dbc Merge branch 'KoboldAI:main' into united 2022-03-04 09:56:35 +01:00
henk717 2aeb2c6607 Add Janeway 6B and Shinen 6B 2022-03-04 09:53:34 +01:00
Gnome Ann f0629958b1 Merge branch 'united' into lazy-loader 2022-03-04 00:37:25 -05:00
Gnome Ann 58a2c18821 Add lazy torch loading support to transformers backend 2022-03-04 00:33:10 -05:00
Gnome Ann 1515996fca Fix torch_lazy_loader seek offset calculation 2022-03-03 23:53:40 -05:00
Gnome Ann 24bc0f81ea Remove duplicate `torch_load` definition 2022-03-03 19:55:31 -05:00
Gnome Ann 8e6e04be5f (torch_lazy_loader.py) Add dematerialized modules setting 2022-03-03 11:17:59 -05:00
Gnome Ann 1ecc452dc8 (torch_lazy_loader.py) Add support for materializing from a ZipExtFile 2022-03-02 13:08:21 -05:00
henk717 e033b04f87 Restore United 2022-03-02 11:40:50 +01:00
henk717 f9ac23ba4e Add Janeway and Shinen 2022-03-02 09:51:25 +01:00
henk717 c8ece04b1d Merge pull request #99 from VE-FORBRYDERNE/mutation-observer: Re-enable the editor mutation observer 2022-03-02 09:39:03 +01:00
Gnome Ann c338b52d68 (torch_lazy_loader.py) Handle checkpoints with merged storage blocks 2022-03-02 01:02:35 -05:00
Gnome Ann 4fa4dbac50 Clean up when error is thrown in `use_lazy_torch_load` 2022-03-01 19:30:22 -05:00
Gnome Ann a0344b429c Upload torch_lazy_loader.py 2022-03-01 15:40:44 -05:00
ebolam 3f73f84b69 bug fix 2022-02-28 19:04:12 -05:00
Gnome Ann d8e99b12f1 Re-enable the editor mutation observer 2022-02-28 19:00:26 -05:00
henk717 50ad6864c9 Merge pull request #87 from ebolam/united: Debug and load story fix for actions_metadata variable 2022-02-28 16:58:49 +01:00
ebolam 6003b2369b Debug and load story fix for actions_metadata variable 2022-02-28 10:39:36 -05:00
henk717 261981da45 Merge pull request #86 from ebolam/united: Fixed error in redo action 2022-02-28 14:43:53 +01:00
ebolam 47d102635e Merge branch 'united' into united 2022-02-28 08:37:45 -05:00
ebolam 7803fbb137 Fixed error in redo action when editing previous entries and/or editing right after a redo 2022-02-28 08:31:26 -05:00
henk717 13fe472264 Menu Polish 2022-02-28 02:47:15 +01:00
henk717 f628929401 Merge pull request #85 from VE-FORBRYDERNE/sp: Fix a bug with soft prompts when using transformers XGLM 2022-02-28 02:33:18 +01:00
henk717 4849a30d88 Merge pull request #84 from mrseeker/patch-3: Added KoboldAI/fairseq-dense-2.7B-Janeway 2022-02-28 02:33:07 +01:00
henk717 a466e13c00 Model List Support 2022-02-26 12:34:07 +01:00
Gnome Ann a22d59e191 Fix a bug with soft prompts when using transformers XGLM 2022-02-25 12:35:23 -05:00
Julius ter Pelkwijk 0a7376a711 Added KoboldAI/fairseq-dense-2.7B-Janeway 2022-02-24 09:00:56 +01:00
With pleasure I am introducing KoboldAI/fairseq-dense-2.7B-Janeway.
henk717 1fc173890e Merge pull request #83 from VE-FORBRYDERNE/loadsettings: Load settings earlier to avoid TPU badwords issues 2022-02-24 04:24:28 +01:00
Gnome Ann 072ca87977 Load soft prompt at the end instead of inside `loadsettings()` 2022-02-23 21:15:08 -05:00
Gnome Ann 8120e4dfa2 Need to set `vars.allowsp` to True before calling `loadsettings()` 2022-02-23 21:09:31 -05:00
Gnome Ann c45ba497c9 Load settings earlier to avoid TPU badwords issues 2022-02-23 20:39:11 -05:00
henk717 ac59e55d62 Smaller optimizations 2022-02-24 01:14:26 +01:00
henk717 8e9d9faa97 Merge pull request #82 from VE-FORBRYDERNE/tpu-config: Allow TPU models to specify settings/config in config.json 2022-02-24 00:53:40 +01:00
Gnome Ann ad10ac8871 Allow TPU models to specify settings/config in config.json 2022-02-23 18:22:18 -05:00
henk717 7de3311000 Fix sentencepiece model saving 2022-02-23 22:04:41 +01:00
henk717 6151d16df0 Merge pull request #81 from VE-FORBRYDERNE/dematerialized: Use dematerialized loading in TPU backend for lower device memory usage 2022-02-23 07:11:26 +01:00
Gnome Ann 7ec549c726 Use dematerialized loading in TPU backend for lower device memory usage 2022-02-22 19:43:13 -05:00
henk717 fd7ba9f70e Also check for Config in models/ 2022-02-22 19:22:08 +01:00
henk717 306d96a8eb Separate Drive Disconnect 2022-02-22 18:03:06 +01:00
henk717 a0518edc36 Temporary Transformers Git for XGLM 2022-02-22 02:42:04 +01:00
henk717 74012a24c9 Expose GDrive Models 2022-02-22 02:35:27 +01:00
henk717 9aeae94d0e Cleanup leakage (Didn't appear in my commit list) 2022-02-22 02:32:02 +01:00
henk717 cb6ccacd64 Dependencies required for newer models 2022-02-21 21:17:12 +01:00
henk717 4ace11f5b8 Merge pull request #80 from VE-FORBRYDERNE/xglm-position-ids: Temporary fix for XGLM positional embedding issues 2022-02-21 00:47:20 +01:00
henk717 300db651de Open models folder by default 2022-02-21 00:46:18 +01:00
Gnome Ann da10e2dc1d Don't crash if `XGLMSinusoidalPositionalEmbedding` doesn't exist 2022-02-20 17:41:00 -05:00