Commit Graph

608 Commits

Author SHA1 Message Date
Henk e98cc3cb16 OPT models 2022-05-12 23:55:21 +02:00
Henk 376e76f5da S mode for OPT 2022-05-12 02:18:14 +02:00
Gnome Ann 46cfa1367f Add `--no_aria2` command line flag 2022-05-11 00:44:56 -04:00
Gnome Ann f09959f9be Fix patching code of `PreTrainedModel.from_pretrained()` 2022-05-11 00:41:53 -04:00
Gnome Ann 4b49d1c464 Make sure `vars.revision` is defined 2022-05-10 22:51:36 -04:00
Gnome Ann b97b2a02d6 Add `--revision` command line flag 2022-05-10 22:14:56 -04:00
Gnome Ann 937d9ee06a Change default `model.save_pretrained` shard size to 500 MiB 2022-05-10 22:04:25 -04:00
Gnome Ann a388c63023 Use aria2 to download split checkpoints 2022-05-10 21:28:13 -04:00
subtlewave 9c83ef7fa9 Replaced Adventure 125M and added C1-1.3B 2022-04-28 22:35:04 +00:00
Gnome Ann ea82867e4d Merge branch 'united' into settings 2022-04-26 13:58:01 -04:00
Henk 11280a6e66 LocalTunnel Linux Fix 2022-04-19 14:41:21 +02:00
Henk b8e79afe5e LocalTunnel support 2022-04-19 13:47:44 +02:00
Gnome Ann c7b03398f6 Merge 'nolialsea/patch-1' into settings without Colab changes 2022-04-17 12:15:36 -04:00
henk717 372eb4c981 Merge pull request #119 from VE-FORBRYDERNE/scripting-sp: Allow userscripts to change the soft prompt 2022-04-14 21:33:20 +02:00
henk717 78d6ee491d Merge pull request #117 from mrseeker/patch-7: Shinen FSD 13B (NSFW) 2022-04-14 21:33:08 +02:00
henk717 e180db88aa Merge pull request #118 from VE-FORBRYDERNE/lazy-loader: Fix lazy loader in aiserver.py 2022-04-14 21:33:00 +02:00
Gnome Ann bd6f7798b9 Fix lazy loader in aiserver.py 2022-04-14 14:33:10 -04:00
Julius ter Pelkwijk ad94f6c01c Shinen FSD 13B (NSFW) 2022-04-14 08:23:50 +02:00
Julius ter Pelkwijk 945c34e822 Shinen FSD 6.7B (NSFW) 2022-04-13 14:47:22 +02:00
Henk eeff126df4 Memory Sizes 2022-04-13 12:41:21 +02:00
Gnome Ann a3a52dc9c3 Add support for changing soft prompt from userscripts 2022-04-12 15:59:05 -04:00
Henk 26909e6cf3 Model Categories 2022-04-10 20:53:15 +02:00
Julius ter Pelkwijk 6fcb0af488 Adding Janeway 13B 2022-04-10 15:03:39 +02:00
Gnome Ann 359a0a1c99 Copy Python 3.6 compatible lazy loader to aiserver.py 2022-04-08 19:40:12 -04:00
Julius ter Pelkwijk 1974761f70 Releasing Janeway 6.7B 2022-04-08 08:13:36 +02:00
Wes Brown 09fee52abd Add `num_seqs` support to GooseAI/OpenAI client handler. 2022-04-07 14:50:23 -04:00
Henky!! 5feda462fb OAI - Fixes last commit 2022-04-07 02:39:37 +02:00
Henky!! 34b6c907f0 OAI Max Token Slider 2022-04-07 02:26:15 +02:00
Henky!! b568e31381 OAI Path Support 2022-04-06 05:15:25 +02:00
Henky!! 699b3fc10b OAI Redo Fixes 2 2022-04-06 04:54:27 +02:00
Henky!! b5a633e69b OAI Redo Fix 2022-04-06 04:45:01 +02:00
henk717 ee682702ee Merge branch 'KoboldAI:main' into united 2022-04-05 01:35:22 +02:00
Henky!! 8153f21d5c Convo 6B 2022-04-05 01:33:51 +02:00
Henky!! e644963564 OpenAI Fixes 2022-03-28 02:02:37 +02:00
Gnome Ann 20e48b11d7 Typical sampling 2022-03-27 16:25:50 -04:00
Noli aa8de64aa4 fix default port 2022-03-25 23:26:27 +01:00
Noli 3e003d3b42 add port to the command options 2022-03-25 22:18:28 +01:00
Gnome Ann 0348970b19 Make sure AI is not busy when using retry to regenerate random story 2022-03-23 22:09:35 -04:00
Gnome Ann 4832dd6f37 Allow regenerating random story using Retry button (commit b55e5a8e0b removed this feature, so this commit adds it back) 2022-03-23 13:39:46 -04:00
henk717 cf99f02ca5 Merge branch 'main' into united 2022-03-20 19:22:53 +01:00
henk717 20eab085dd Fix AutoSave Toggle 2022-03-20 19:12:11 +01:00
henk717 5c795609e4 KML Fix 2022-03-20 13:10:56 +01:00
Gnome Ann b1125a6705 Add EOS and padding token to default NeoX badwords 2022-03-19 01:30:02 -04:00
Gnome Ann 85a4959efa Merge branch 'united' into neox 2022-03-18 11:19:03 -04:00
henk717 a3e5e052b3 Newer umamba + slope tweak 2022-03-16 18:34:02 +01:00
Gnome Ann 95c4251db9 Print two newlines before loading HF models 2022-03-15 13:58:53 -04:00
Gnome Ann 9dc48b15f0 Add custom badwords and pad token ID for GPT-NeoX 2022-03-14 23:31:49 -04:00
Gnome Ann 88f247d535 GPT-NeoX-20B support in Colab TPU instances 2022-03-14 23:14:20 -04:00
henk717 4892556059 Model saving for colab mode 2022-03-13 11:22:44 +01:00
Gnome Ann 2b8c46338e Change current working directory to KoboldAI folder 2022-03-13 01:22:11 -05:00
ebolam 8ae0a4a3e7 Online Services Working now (without a way to test as I don't have accounts) 2022-03-12 14:21:11 -05:00
ebolam b55e5a8e0b Retry Bug Fix 2022-03-12 10:32:27 -05:00
ebolam ae854bab3d Fix for retry causing issues for future redo actions 2022-03-11 11:40:55 -05:00
ebolam 772ae2eb80 Added model info to show model load progress in UI 2022-03-11 11:31:41 -05:00
henk717 b02d5e8696 Allows missing model_config again 2022-03-10 19:59:10 +01:00
henk717 172a548fa1 Fallback to generic GPT2 Tokenizer 2022-03-10 19:52:15 +01:00
henk717 9dee9b5c6d Ignore incorrect problems 2022-03-09 12:03:37 +01:00
henk717 a28e553412 Remove unused gettokenids 2022-03-09 11:59:33 +01:00
ebolam 0943926f6a Fix for lazy loading 2022-03-07 19:52:44 -05:00
ebolam bfc07073e3 layer count fix 2022-03-07 19:33:24 -05:00
ebolam d8ab58892d saved layer value fix 2022-03-07 19:21:55 -05:00
ebolam da53d7edb3 Custom Path Load fix 2022-03-07 18:54:11 -05:00
ebolam d1a64e25da Custom Model Load Fix 2022-03-07 18:44:37 -05:00
ebolam 70f1c2da9c Added stub for model name feedback 2022-03-07 14:20:25 -05:00
ebolam d0553779ab Bug Fix 2022-03-07 12:33:35 -05:00
ebolam c50fe77a7d Load Fix 2022-03-07 11:57:33 -05:00
ebolam 49fc854e55 Added saving of breakmodel values so that it defaults to it on next load 2022-03-07 11:49:34 -05:00
ebolam 2cf6b6e650 Merge branch 'henk717:united' into united 2022-03-07 11:31:14 -05:00
ebolam 123cd45b0e Breakmodel working now with the web UI 2022-03-07 11:27:23 -05:00
henk717 7434c9221b Expand OAI Setting Compatibility 2022-03-07 08:56:47 +01:00
ebolam 5e00f7daf0 Next evolution of web ui model selection. Custom Paths not working quite right. 2022-03-06 20:55:11 -05:00
ebolam 2ddf45141b Initial UI based model loading. Includes all parameters except breakmodel chunks, engine # for OAI, and url for ngrok url for google colab 2022-03-06 19:51:35 -05:00
ebolam f6c95f18fa Fix for Redo (#94): corrected redo to skip blank steps (blank from "deleting" the chunk with the edit function); removed debug code 2022-03-06 23:18:14 +01:00
henk717 f857696224 OAI ConfigName Bugfix 2022-03-06 20:18:42 +01:00
henk717 3ddc9647eb Basic GooseAI Support 2022-03-06 20:10:30 +01:00
henk717 daea4b8d15 Fix Breakmodel RAM Regression 2022-03-06 08:26:50 +01:00
henk717 105d3831b5 Lazy Load Float32 for CPU 2022-03-06 07:56:04 +01:00
Gnome Ann 373f7b9bd5 Don't convert tensors to float16 if using CPU-only mode 2022-03-05 14:30:26 -05:00
Gnome Ann 579e85820c Resolve merge conflict 2022-03-05 14:13:56 -05:00
Gnome Ann 2e19ea1bb6 Auto detect if we're in a Colab TPU instance 2022-03-05 14:07:23 -05:00
ebolam 4a8d7f5e0b Merge branch 'henk717:united' into united 2022-03-05 13:25:10 -05:00
Gnome Ann 0a258a6282 Support for loading HF models on TPU with `--colab_tpu` 2022-03-05 12:33:33 -05:00
Gnome Ann 86ac562b0c Lazy loader should convert model tensors to float16 before moving them 2022-03-05 11:31:34 -05:00
ebolam 4dd119c38d Redo no longer goes through formatting function (thereby getting changed) 2022-03-05 11:15:33 -05:00
ebolam 353817b4da Remove debug print statements 2022-03-05 10:35:06 -05:00
ebolam 221f264fa7 Redo fix. Fix for actions structure to not error out when asking for next_id when the actions list is empty. 2022-03-05 10:31:28 -05:00
Gnome Ann a00dede610 Put the XGLM embedding patch behind a version check 2022-03-04 19:10:15 -05:00
Gnome Ann 5674516f0c Merge branch 'united' into lazy-loader 2022-03-04 18:27:51 -05:00
ebolam 5f92cbc231 Merge branch 'united' of https://github.com/ebolam/KoboldAI into united 2022-03-04 15:37:34 -05:00
ebolam 321f45ccad Fix debug to never crash (would on some initialization steps) 2022-03-04 15:36:13 -05:00
ebolam ee883fc4da Merge branch 'henk717:united' into united 2022-03-04 14:15:16 -05:00
ebolam 26b9268391 Redo bug fix 2022-03-04 14:14:44 -05:00
henk717 eb247d69c3 Merge branch 'KoboldAI:main' into united 2022-03-04 18:24:56 +01:00
Gnome Ann a1fedca2c8 Use lazy loading automatically if a config file exists for the model 2022-03-04 11:11:33 -05:00
MrReplikant ae143e896c Fixed unnecessary spacing in chatmode (makes it go from "john :" to "John:", as it's supposed to be; simple as it is, it can easily throw a chatbot model for a loop) 2022-03-04 08:46:00 -06:00
Gnome Ann f0629958b1 Merge branch 'united' into lazy-loader 2022-03-04 00:37:25 -05:00
Gnome Ann 58a2c18821 Add lazy torch loading support to transformers backend 2022-03-04 00:33:10 -05:00
henk717 e033b04f87 Restore United 2022-03-02 11:40:50 +01:00
henk717 f9ac23ba4e Add Janeway and Shinen 2022-03-02 09:51:25 +01:00
ebolam 3f73f84b69 bug fix 2022-02-28 19:04:12 -05:00