Commit Graph

748 Commits

Author SHA1 Message Date
henk717 e8c39992a1
Merge pull request #166 from ebolam/united
Add file browser to soft prompts and user scripts
2022-07-04 19:52:05 +02:00
ebolam 328c0a38d7 Removed breadcrumbs on file browser before the jail directory 2022-07-03 16:02:55 -04:00
henk717 fd44f0ded3
Merge branch 'KoboldAI:main' into united 2022-07-03 15:12:12 +02:00
Henk d041ec0921 Safer defaults and more flexibility
There have been a lot of reports from newer users who experience AI breakdown because not all models properly handle 2048 max tokens. 1024 is the only value that all models support and was the original value KoboldAI used. This commit reverts the decision to increase this to 2048; any existing configurations are not affected. Users who wish to increase the max tokens can do so themselves. Most models handle up to 1900 well (the GPT-2 models are excluded), and for many you can go all the way. (It is currently not yet known why some finetunes cause a decrease in max token support.)

In addition, this commit implements a request for more consistent slider behavior, allowing the sliders to be changed in 0.01 intervals instead of some sliders being capped to 0.05 steps.
2022-07-03 15:07:54 +02:00
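As an illustration of the slider change described above, a generation-settings slider entry could look like the hypothetical dict below; the field names are assumptions modeled on a typical settings UI, not code from this commit. The relevant point is the 0.01 step instead of 0.05.

```python
# Hypothetical slider definition; field names are illustrative, not from the commit.
temperature_slider = {
    "uitype": "slider",
    "unit": "float",
    "label": "Temperature",
    "min": 0.1,
    "max": 2.0,
    "step": 0.01,  # previously some sliders were capped to 0.05 steps
    "default": 0.5,
}
```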
henk717 a99518d0a8
Merge branch 'KoboldAI:main' into united 2022-07-02 12:59:53 +02:00
Henk e2f7fed99f Don't turn gamestarted off 2022-07-02 12:59:14 +02:00
vfbd aeed9bd8f7 Fix base fairseq dense models when using accelerate with a GPU 2022-07-01 20:16:39 -04:00
ebolam 3f8a7ab4bb Allowing edit in userscripts 2022-06-30 19:41:11 -04:00
ebolam 813540fe9b Added folder browser for softprompts and userscripts 2022-06-30 19:13:05 -04:00
ebolam 97e0df45d7 File Dialog complete 2022-06-30 15:57:27 -04:00
ebolam 58418c4aa5 Basic file browser with edit and delete functionality
Can be shown by going to /popup_test in a second tab.
2022-06-30 09:44:04 -04:00
vfbd 048bd0ff3b Add support for setting the RNG seed and full determinism 2022-06-28 13:21:05 -04:00
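A minimal sketch of what setting the RNG seed and enabling full determinism typically involves in a PyTorch stack; the exact knobs used by this commit are not shown in the log, so treat the helper below as an assumption.

```python
import random

import numpy as np
import torch


def set_full_determinism(seed: int) -> None:
    """Seed every RNG in use and ask PyTorch for deterministic kernels (assumed helper)."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)  # also seeds the CUDA generators on current PyTorch
    torch.use_deterministic_algorithms(True)  # raise an error on nondeterministic ops


set_full_determinism(1337)
```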
ebolam edd6dd7cd7 Fix for saved breakmodel settings on custom models
Fix for unit tests with new disk breakmodel
2022-06-27 10:12:54 -04:00
Henk 46678931b2 Better sentence spacing 2022-06-26 20:27:21 +02:00
vfbd ebba79fed6 Remove trailing whitespace from submissions
(cherry picked from commit b99d1449c9)
2022-06-26 14:06:34 -04:00
vfbd 2a4d37ce60 Clean up whitespace at the end of actions when loading story
Specifically, we merge blank actions into the next action and we move
whitespace at the end of non-blank actions to the beginning of the next
action.

(cherry picked from commit 4b16600e49)
2022-06-26 14:04:36 -04:00
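The cleanup rule described in the commit message above can be sketched as a stand-alone helper; this is a simplified assumption of the behavior, not KoboldAI's actual implementation, which operates on the story's actions structure.

```python
def clean_actions(actions: list[str]) -> list[str]:
    """Merge blank actions into the next action and move trailing whitespace
    of non-blank actions to the beginning of the next action."""
    cleaned: list[str] = []
    carry = ""  # blank actions or trailing whitespace waiting to be prepended
    for action in actions:
        action = carry + action
        carry = ""
        if action.strip() == "":
            carry = action  # blank action: merge into the next one
            continue
        stripped = action.rstrip()
        carry = action[len(stripped):]  # push trailing whitespace forward
        cleaned.append(stripped)
    if carry:
        cleaned.append(carry)  # nothing left to attach it to
    return cleaned


print(clean_actions(["Once upon a time.  ", "", "The story continues."]))
# ['Once upon a time.', '  The story continues.']
```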
vfbd b99d1449c9 Remove trailing whitespace from submissions 2022-06-26 13:15:55 -04:00
Henk fa97d28cb3 Nerys V2 for United 2022-06-25 14:06:51 +02:00
Henk 9e7eb80db4 Nerys V2 part 2 2022-06-25 14:03:19 +02:00
Henk ecc6ee9474 Nerys V2 2022-06-25 13:47:49 +02:00
henk717 10e85db89d
Merge pull request #162 from VE-FORBRYDERNE/whitespace-cleanup
Story whitespace cleanup
2022-06-25 13:36:03 +02:00
Henk d3fce44095 Merge branch 'main' into united 2022-06-24 18:31:45 +02:00
Henk 8be0964427 AIDG Import Fix 2022-06-24 18:29:06 +02:00
vfbd 4b16600e49 Clean up whitespace at the end of actions when loading story
Specifically, we merge blank actions into the next action and we move
whitespace at the end of non-blank actions to the beginning of the next
action.
2022-06-24 12:03:35 -04:00
vfbd 3da885d408 GPT-NeoX HF model badwords fix 2022-06-23 15:02:43 -04:00
henk717 8098f4ec8f
Merge branch 'KoboldAI:main' into united 2022-06-23 17:20:48 +02:00
vfbd 0eb9f8a879 Account for lnheader in budget calculation 2022-06-22 19:16:24 -04:00
vfbd 53034ee533 Delete all torch tensors before loading model 2022-06-22 12:07:36 -04:00
vfbd 922394c68f Don't blacklist </s> token in "s" newline mode 2022-06-22 11:23:03 -04:00
Gnome Ann 8c594c6869 Correct the padding token for GPT-NeoX 2022-06-21 19:37:43 -04:00
Gnome Ann a7f667c34c Use NeoX badwords when loading from HF GPT-NeoX model 2022-06-21 19:33:25 -04:00
Gnome Ann 8593bf339b Another typo fix 2022-06-21 15:36:25 -04:00
Gnome Ann 7e0ded6b47 Typo fix 2022-06-21 15:12:55 -04:00
Gnome Ann 91643be10a Change soft prompt implementation to a more universal one 2022-06-21 15:03:43 -04:00
Gnome Ann 0ea4fa9c87 Automatically calculate badwords and pad_token_id 2022-06-21 14:35:52 -04:00
Gnome Ann 6b172306f6 move_model_to_devices no longer crashes if you don't have accelerate 2022-06-21 13:15:46 -04:00
Gnome Ann ff69e9fbfe Put layers_module_names, module_names and named_buffers in utils.py 2022-06-20 17:17:42 -04:00
Gnome Ann 1620ac4148 Lazy loader needs to cache named buffers of layers in the disk cache 2022-06-20 17:08:52 -04:00
Gnome Ann ab5ab79003 Set primary device to CPU if in CPU-only mode 2022-06-20 16:25:01 -04:00
Gnome Ann bd7d7b41a1 Don't enable accelerate if no layers are in disk cache or GPUs 2022-06-20 16:21:44 -04:00
Gnome Ann 90fd8b1845 Disk cache support in CPU-only mode 2022-06-20 16:06:09 -04:00
Gnome Ann af07d7a15f Disk cache support for computers with at least one GPU 2022-06-20 14:49:54 -04:00
Gnome Ann 47a58a36b8 Add disk cache slider 2022-06-19 22:53:30 -04:00
Gnome Ann 4dd59e0a9d Correct the type hint for lazy_load_callback 2022-06-19 17:17:41 -04:00
Gnome Ann 21de36c4b0 Lazy loader now moves all non-layer weights to primary device 2022-06-19 16:44:23 -04:00
Gnome Ann 26c319519e Lazy loader now attempts to pin layers if accelerate is enabled 2022-06-19 16:35:23 -04:00
Gnome Ann 042cf3e560 Automatically support soft prompts for all transformers models 2022-06-19 13:11:58 -04:00
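A universal soft-prompt mechanism of the kind this commit describes usually works by concatenating a learned embedding tensor in front of the token embeddings and feeding the result to the model via inputs_embeds, which transformers causal LMs accept. The helper below is a sketch under that assumption, not KoboldAI's actual code.

```python
import torch


def prepend_soft_prompt(model, input_ids: torch.Tensor, soft_prompt: torch.Tensor):
    """Return (inputs_embeds, attention_mask) with the soft prompt prepended.

    soft_prompt has shape (prompt_len, hidden_size); input_ids has shape
    (batch, seq_len). Only the model's input embedding layer is needed,
    which is what makes the approach model-agnostic.
    """
    token_embeds = model.get_input_embeddings()(input_ids)  # (batch, seq, hidden)
    batch_size = input_ids.shape[0]
    prompt_embeds = soft_prompt.unsqueeze(0).expand(batch_size, -1, -1)
    inputs_embeds = torch.cat([prompt_embeds, token_embeds], dim=1)
    attention_mask = torch.ones(inputs_embeds.shape[:2], dtype=torch.long,
                                device=inputs_embeds.device)
    return inputs_embeds, attention_mask
```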
Gnome Ann cc56718a7e Fix lazy loader putting too many layers on CPU 2022-06-19 00:29:35 -04:00
Gnome Ann 1380eb0bb0 Disable lazy loader when using GPT-2 2022-06-18 23:54:11 -04:00
Gnome Ann f9732eb143 Always enable breakmodel if accelerate is available 2022-06-18 23:46:09 -04:00
Gnome Ann 8b4efc5d0a Use `accelerate.dispatch_model()` instead of breakmodel if possible 2022-06-18 23:41:36 -04:00
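accelerate.dispatch_model() places a model's modules on devices according to a device map, which is the mechanism this commit prefers over KoboldAI's own breakmodel when accelerate is available. A rough sketch; the model name and memory limits are placeholders.

```python
# Sketch only: model name and memory limits are placeholders.
from accelerate import dispatch_model, infer_auto_device_map
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-1.3B")
device_map = infer_auto_device_map(model, max_memory={0: "6GiB", "cpu": "24GiB"})
model = dispatch_model(model, device_map=device_map)
```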
Gnome Ann f7ffdd7b6b Add more model querying utilities 2022-06-18 18:16:56 -04:00
Gnome Ann e143963161 Merge branch 'united' into accelerate 2022-06-18 13:47:38 -04:00
henk717 b209cf9868
NS mode as default
Experimental change that makes NS the default; more and more models seem to require this as Megatron-based models gain traction, and it does not seem to break the original models (with the exception that users can no longer use </s> in generated outputs; in the extremely rare case someone is affected by this, they can manually switch the mode by editing their settings file).

If this breaks nothing, ns will remain the default; however, the n mode should remain a choice for those who need it. In case the change does get reversed, I have also added the bloom model type to the ns list, since its models require this.
2022-06-18 19:46:16 +02:00
Gnome Ann 0eedc541c8 Merge branch 'main' into united-merge 2022-06-18 13:39:23 -04:00
Gnome Ann 5e71f7fe97 Use slow tokenizer if fast tokenizer is not available 2022-06-17 21:08:37 -04:00
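A fallback of the kind described here can be written as a try/except around AutoTokenizer; this is a sketch of the general pattern, not necessarily the commit's exact code.

```python
from transformers import AutoTokenizer


def load_tokenizer(model_name: str):
    """Prefer the fast (Rust) tokenizer, fall back to the slow Python one."""
    try:
        return AutoTokenizer.from_pretrained(model_name, use_fast=True)
    except Exception:
        # Some models ship without a working fast tokenizer.
        return AutoTokenizer.from_pretrained(model_name, use_fast=False)
```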
Gnome Ann f71bae254a Fix OPT tokenization problems 2022-06-17 13:29:42 -04:00
ebolam 2964175d8b Fix for flaskwebgui 2022-06-17 08:17:22 -04:00
Henk f112fc3493 Initial flaskwebgui support 2022-06-17 13:49:03 +02:00
Gnome Ann 8bdf17f598 Lazy loader can now use accelerate's `init_empty_weights()` 2022-06-16 18:56:16 -04:00
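accelerate's init_empty_weights() is a context manager that creates model parameters on the meta device, so the model skeleton can be built without allocating memory and a lazy loader can fill in the real weights afterwards. A minimal sketch; the config name is a placeholder.

```python
# Sketch: build an empty model skeleton, then let the lazy loader fill the weights.
from accelerate import init_empty_weights
from transformers import AutoConfig, AutoModelForCausalLM

config = AutoConfig.from_pretrained("EleutherAI/gpt-neo-2.7B")  # placeholder model
with init_empty_weights():
    model = AutoModelForCausalLM.from_config(config)  # parameters live on the meta device
```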
Gnome Ann 5253cdcb36 Lazy loader no longer requires map file except when loading to TPU 2022-06-16 18:45:11 -04:00
Gnome Ann 96d3d397ab Don't use fallback loading if we run out of memory during loading 2022-06-15 14:35:32 -04:00
Henk fb2b6f1026 Model Path Hardening 2022-06-15 13:29:10 +02:00
Henk 24d34647e0 Block navigation on all remote modes 2022-06-15 12:32:19 +02:00
Henk f39e24d87f Localtunnel fix, small polish 2022-06-15 12:22:00 +02:00
henk717 de07b1749f
Merge pull request #150 from ebolam/Web-UI
Delete model fixes and model info ui cleanup
2022-06-15 01:50:39 +02:00
ebolam 095cd2a19d Prevent server-side deletion of folders other than those inside the models folder of the executing directory
Removed delete icon for model folders outside the models directory
2022-06-14 19:39:11 -04:00
ebolam f444ad851f Potential catch for the case where a user somehow sends a delete-model request with a .. in the path. 2022-06-14 19:30:01 -04:00
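A server-side guard against '..' style paths, like the one this commit adds, typically resolves the requested path and verifies it still lies inside the allowed models directory. A hedged sketch; the function and directory names are illustrative.

```python
import os

MODELS_DIR = os.path.abspath("models")  # illustrative jail directory


def is_safe_model_path(requested: str) -> bool:
    """Reject paths that escape the models directory (e.g. via '..')."""
    resolved = os.path.abspath(os.path.join(MODELS_DIR, requested))
    return os.path.commonpath([resolved, MODELS_DIR]) == MODELS_DIR
```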
henk717 9add3b0761
Merge pull request #149 from ebolam/Web-UI
--remote jailed to model directory and delete of models from UI
2022-06-15 01:14:06 +02:00
ebolam 462206fa86 Added restriction so that --remote does not allow navigation outside of the model folder for custom models.
Added a delete custom models option (it will not delete models outside of the models directory, nor will it delete non-model directories).
2022-06-14 19:11:30 -04:00
Henk 01b3c9932a 1.18.1 version bump 2022-06-15 00:58:49 +02:00
Henk 661a2d2727 1.18.1 version bump
Since 1.18, Kobold has had a few smaller features added, specifically the ability to re-order sampling options and a new sampler. Since these are smaller additions with no breaking changes, a minor version bump was chosen.
2022-06-15 00:55:12 +02:00
Gnome Ann 107966fef8 Merge branch 'united' into overhaul-merge 2022-06-14 18:47:38 -04:00
Gnome Ann a61e06f876 Merge commit '4c7d6f42d99d557130511f5d185249b34f9db5a1' into overhaul-merge 2022-06-14 18:43:25 -04:00
Gnome Ann 979640bd2f Merge commit '2d3db7b4ba388f566aaec88a0e76678fe4fade8d' into overhaul-merge 2022-06-14 18:42:14 -04:00
Gnome Ann 130d530e7c Merge commit 'a273a5ebc49935bfafdcf1aaf4b98c9bf4bc33b1' into overhaul-merge 2022-06-14 18:38:25 -04:00
Gnome Ann 18218a99bc Merge commit '8a38b258f497281af06fcb0c2559f382b419b938' into overhaul-merge 2022-06-14 18:36:37 -04:00
Gnome Ann 6231106f95 Add Samplers menu 2022-06-13 20:18:09 -04:00
Gnome Ann 4c7d6f42d9 Add `sampler_order` to settings file 2022-06-13 19:14:38 -04:00
Gnome Ann 2d3db7b4ba Implement support for sampler order in the backend code 2022-06-13 19:12:23 -04:00
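The sampler_order value stores the order in which the samplers are applied. In the settings file it would appear as a list of sampler indices, roughly like the hypothetical snippet below; the keys and the number of samplers are assumptions, not taken from the commit.

```python
# Hypothetical settings-file content shown as a Python dict; keys are illustrative.
settings = {
    "sampler_order": [0, 1, 2, 3, 4, 5],  # sampler indices, applied in this order
    "temp": 0.5,
    "top_p": 0.9,
}
```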
ebolam 11ed55f34a Added custom text box for loading models from a specific path, or loading other models from Hugging Face. 2022-06-13 13:48:45 -04:00
Henk 66c0dda485 Hide (Broken) Chatbot Models
Removing this option because these models are currently unavailable. People who still have them can load them through the load from file option. Once they have been retrained and reuploaded, I will add the menu back.
2022-06-11 22:54:51 +02:00
Henk 5c81374a48 Top A for GooseAi 2022-06-11 22:04:37 +02:00
Gnome Ann fdb2a7fa4c Top-A sampling 2022-06-10 22:28:20 -04:00
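Top-A sampling keeps only tokens whose probability is at least a fraction of the squared probability of the most likely token. A minimal sketch of that filter; the parameter name and threshold form follow the common Top-A description and should be treated as assumptions.

```python
import torch


def top_a_filter(logits: torch.Tensor, top_a: float) -> torch.Tensor:
    """Mask out tokens whose probability falls below top_a * max_prob**2."""
    probs = torch.softmax(logits, dim=-1)
    threshold = top_a * probs.max(dim=-1, keepdim=True).values ** 2
    return logits.masked_fill(probs < threshold, float("-inf"))
```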
ebolam cfd1147d5a Bug fix for loading a model after a model was already loaded, which duplicated the settings menu until the website was refreshed
Fixed escaping warnings
Added back/redo unit test
2022-06-10 14:47:52 -04:00
ebolam ed428f2e73 Merge branch 'Web-UI' of https://github.com/ebolam/KoboldAI into Web-UI 2022-06-10 09:12:18 -04:00
ebolam 4a920724d9 fix for folder paths on linux 2022-06-10 09:12:04 -04:00
ebolam 6200908582
Merge pull request #10 from henk717/overhaul
Overhaul
2022-06-10 08:40:15 -04:00
ebolam 13f17d3eca Changed unit tests so that they run with a simple pytest command 2022-06-10 08:39:15 -04:00
Gnome Ann ce582f188f Merge branch 'united' into overhaul-merge 2022-06-09 23:48:28 -04:00
Gnome Ann fe619d4677 Update list of versions with broken OPT again
They released another version of transformers that still doesn't have
the OPT patch, so I decided it would be safer to just mark all 4.19
transformers versions as needing the OPT patch.
2022-06-09 17:42:46 -04:00
ebolam 663dee784d Unit Tests using pytest and Minor modifications to allow unit testing 2022-06-09 13:16:32 -04:00
ebolam 606c276f9d Potential fix for tokenizer using a fallback 2022-06-09 09:01:40 -04:00
ebolam db9a94ca2a Added GPU name to the UI when using breakmodel.
Added total layers to the UI
Added favicon
2022-06-09 08:42:35 -04:00
ebolam c565978fff Fix for multi-gpu not showing appropriately
Slight visual improvement for custom model load breadcrumbs
2022-06-08 19:39:04 -04:00
ebolam 4548dcf1b0 Fix for --model with custom paths 2022-06-08 18:53:56 -04:00
ebolam 001439be45
Merge pull request #9 from henk717/overhaul
Overhaul
2022-06-08 18:44:21 -04:00
ebolam 622a3fc8db Fix for model loading by moving monkey patching functions into a run-once function
Added folder navigation to custom model loading (Needs prettying)
2022-06-08 18:42:44 -04:00
Henk 1a46d97ad5 Send correct settings after load 2022-06-08 13:26:30 +02:00
Henk 461cd04932 Fix Essential Code + selectfolder fix
As part of the restructuring, essential code that handled the --path parameter correctly was removed. This has now been restored. Selectfolder was also updated to use its NeoCustom counterpart instead of specifying a model, so that the underlying code that corrects model names is hit again.
2022-06-08 11:30:00 +02:00