455 Commits

henk717
c9c370aa17
Merge branch 'KoboldAI:main' into united 2021-10-28 23:29:29 +02:00
Gnome Ann
bf4e7742ac Patch GPTJForCausalLM, if it exists, to support soft prompting 2021-10-28 17:18:28 -04:00
Gnome Ann
40b4631f6c Clamp input_ids in place
Apparently transformers maintains an internal reference to input_ids
(to use for repetition penalty) so we have to clamp the internal
version, too, because otherwise transformers will throw an out-of-bounds
error upon attempting to access token IDs that are not in the
vocabulary.
2021-10-28 16:52:39 -04:00
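A minimal sketch of the idea described in the commit above (values and names are illustrative, not KoboldAI's actual code): because transformers holds a reference to the same input_ids tensor for the repetition penalty, the clamp has to happen in place rather than rebinding the name to a new tensor.

```python
import torch

vocab_size = 50257                                  # assumed GPT-J/GPT-2-style vocabulary size
input_ids = torch.tensor([[50256, 50300, 12, 7]])   # 50300 stands in for an out-of-vocab soft-prompt placeholder ID

input_ids.clamp_(max=vocab_size - 1)                # in place: the internally held reference sees the clamped values
# input_ids = torch.clamp(input_ids, max=vocab_size - 1)  # would create a new tensor and leave the
#                                                         # internal reference unclamped
```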
Gnome Ann
24d5d63c9f Use the correct generation min and max when using soft prompt 2021-10-28 16:39:59 -04:00
Gnome Ann
511817132a Don't change the shape of transformer.wte 2021-10-28 15:39:59 -04:00
Gnome Ann
a1ae11630a Make sure to cast vars.sp to the correct dtype 2021-10-28 13:22:07 -04:00
Gnome Ann
1556bd32a5 Use torch.where to inject the soft prompt instead of torch.cat 2021-10-28 13:20:14 -04:00
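A hedged sketch of the technique named in the commit above (shapes and variable names are assumptions): torch.where swaps soft-prompt vectors into the placeholder positions without changing the sequence length, whereas torch.cat would have grown the tensor.

```python
import torch

embed_dim = 8
inputs_embeds = torch.randn(1, 5, embed_dim)    # ordinary token embeddings
soft_prompt = torch.randn(1, 5, embed_dim)      # learned soft-prompt rows, padded to the same length
is_placeholder = torch.tensor([[True, True, False, False, False]])  # positions reserved for the soft prompt

# Broadcast the mask over the embedding dimension and select row-wise.
inputs_embeds = torch.where(is_placeholder.unsqueeze(-1), soft_prompt, inputs_embeds)
```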
Gnome Ann
248e0bd24b Fix soft prompt loading code 2021-10-28 00:29:42 -04:00
Gnome Ann
4e3cc93020 Merge branch 'united' into sp 2021-10-23 11:45:03 -04:00
henk717
7b73d7cfdd Single Line Mode
Adds Single Line mode, optimized for things like chatbot testing and other cases where you want to have control over what happens after a paragraph.

This can also be used as a foundation for a chatbot-optimized interface mode.
2021-10-23 17:30:48 +02:00
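A hypothetical sketch of what single-line trimming could look like (function name and behaviour are assumptions, not the actual implementation): keep only the first line of each generated continuation so the user stays in control of what happens after the paragraph.

```python
def trim_to_single_line(generated: str) -> str:
    # Discard everything after the first newline in the generated text.
    return generated.split("\n", 1)[0]
```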
Gnome Ann
1f449a9dda Soft prompt support (6B Colabs not supported yet) 2021-10-22 14:18:10 -04:00
Gnome Ann
3501f03153 Create settings directory if it doesn't exist when using InferKit/OAI 2021-10-21 23:33:32 -04:00
henk717
fa0f8af1d6
Merge branch 'KoboldAI:main' into united 2021-10-15 08:23:06 +02:00
henk717
9513240dfb
Version bump
Since VE fixed important things in the editor, I want users to be able to see this more easily.
2021-10-15 08:22:32 +02:00
henk717
c854a62549 Clarified GPU Layers
Having both breakmodel_layers and layers was confusing, so the new flag is named breakmodel_gpulayers. The old flag should no longer be used, but since it works in reverse it is left in so existing scripts don't break.
2021-10-06 18:55:01 +02:00
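A hypothetical sketch of the flag rename (help text and exact semantics are assumptions): the new --breakmodel_gpulayers counts layers placed on the GPU, while the old --breakmodel_layers counted the other way and is kept only for backwards compatibility.

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--breakmodel_gpulayers", type=int,
                    help="number of transformer layers to place on the GPU")
parser.add_argument("--breakmodel_layers", type=int,
                    help="(deprecated) layer count expressed the old, reversed way")
args = parser.parse_args()
```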
henk717
bd063f7590
Merge pull request #19 from VE-FORBRYDERNE/multi-gpu
Multiple GPU support
2021-10-06 18:50:58 +02:00
henk717
82c7eaffb5
Merge branch 'KoboldAI:main' into united 2021-10-06 00:26:08 +02:00
henk717
8893916fef
Don't always submit prompt by default
Feedback from users is that it's better not to always submit the prompt; this is consistent with randomly generated stories. You can always toggle it on if you need it for coherency. This change does not override existing user settings.
2021-10-06 00:25:05 +02:00
Gnome Ann
aa59f8b4b2 Fix CPU layers not displaying correctly when using --layers 2021-10-05 11:29:47 -04:00
Gnome Ann
91352ea9f1 Change the command line flags for breakmodel 2021-10-05 11:22:09 -04:00
Gnome Ann
a1e4405aa6 Automatically use breakmodel instead of GPU-only where supported
There's really no reason to use GPU-only mode if breakmodel is supported
because breakmodel can run in GPU-only mode too.
2021-10-05 10:36:51 -04:00
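A hypothetical sketch of the decision described above (names are illustrative): when breakmodel is supported there is no need for a separate GPU-only path, because assigning every layer to the GPU makes breakmodel behave exactly like GPU-only mode.

```python
def pick_device_strategy(breakmodel_supported: bool, n_layers: int, gpu_layers: int) -> tuple:
    if breakmodel_supported:
        # gpu_layers == n_layers reproduces GPU-only behaviour under breakmodel.
        return ("breakmodel", min(gpu_layers, n_layers))
    return ("gpu_only", n_layers)
```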
Gnome Ann
fb90a7ed17 Change the help text for breakmodel to be more helpful 2021-10-05 10:31:28 -04:00
Gnome Ann
231621e7c2 Use AutoModelForCausalLM for custom models with a model_type 2021-10-05 09:45:12 -04:00
Gnome Ann
a283d34b27 Multiple GPU support 2021-10-05 09:38:57 -04:00
Gnome Ann
a42b580027 Merge branch 'united' into multi-gpu 2021-10-02 11:44:26 -04:00
henk717
dab58d8393
Merge branch 'KoboldAI:main' into united 2021-09-29 17:05:06 +02:00
Gnome Ann
a179bb2820 Bump version number to 1.16.2 2021-09-28 21:50:33 -04:00
Gnome Ann
e6cd28243e Scroll to the bottom of the gamescreen after retrying 2021-09-28 21:34:36 -04:00
Gnome Ann
bb323152d7 Disable vars.recentedit again 2021-09-28 21:24:08 -04:00
Gnome Ann
2b89bcb16e Fix random story generator 2021-09-28 21:04:26 -04:00
Gnome Ann
af93c96c0f Submit Action mode action in Story mode if action is empty 2021-09-28 19:50:00 -04:00
Gnome Ann
9ab1d182ac Guard against empty prompts 2021-09-28 19:48:43 -04:00
henk717
da55ed3b49
Merge branch 'KoboldAI:main' into united 2021-09-28 10:41:01 +02:00
Gnome Ann
03c1a3ebf9 Put vars.recentedit = True in deleterequest() for consistency 2021-09-28 01:10:20 -04:00
Gnome Ann
97e1760af5 Prevent retry from popping chunks after edit/delete 2021-09-28 01:07:11 -04:00
Gnome Ann
231290608d Do a better job of preventing editing of text when required 2021-09-28 00:48:37 -04:00
Gnome Ann
13b81c7523 Prevent the user from deleting the prompt 2021-09-27 22:21:14 -04:00
henk717
01b30b315f
Merge branch 'KoboldAI:main' into united 2021-09-28 02:31:20 +02:00
Gnome Ann
e29e7b11ec Bump version number to 1.16.1 2021-09-27 18:12:15 -04:00
Gnome Ann
a327eed2c3 Fix editor scrolling issues 2021-09-27 17:44:22 -04:00
henk717
01339f0b87
Merge branch 'KoboldAI:main' into united 2021-09-25 17:44:51 +02:00
Gnome Ann
5893e495b6 Change AutoModel to AutoModelForCausalLM
This fixes breakmodel mode for the official models from the model
selection menu.
2021-09-25 11:41:15 -04:00
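A minimal sketch of the difference, assuming a standard Hugging Face checkpoint name: AutoModelForCausalLM loads the model with its language-modelling head (the layout that generation and breakmodel's layer splitting expect), whereas AutoModel returns only the bare transformer.

```python
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-1.3B")
```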
henk717
c9290d02dc Update aiserver.py
Better way of checking for the model type
2021-09-25 16:50:24 +02:00
henk717
7d35f825c6 Huggingface GPT-J Support
Finetune's fork has unofficial GPT-J support, which we supported, but it is not compatible with models designed for the official version. In this update we let models decide which transformers backend to use, falling back to Neo if they don't choose one. We also add the 6B model to the menu and, for the time being, switch to the GitHub version of transformers so we don't have to wait for a release. (Hopefully we can switch back to the conda version before merging upstream.)
2021-09-25 16:26:17 +02:00
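A hedged sketch of the fallback logic described above (paths and function name are assumptions, not KoboldAI's actual code): read model_type from the checkpoint's config.json and let the Auto class pick the backend, defaulting to GPT-Neo when none is declared.

```python
import json
import os
from transformers import AutoModelForCausalLM, GPTNeoForCausalLM

def load_causal_model(model_dir: str):
    with open(os.path.join(model_dir, "config.json")) as f:
        model_type = json.load(f).get("model_type")
    if model_type:                                   # e.g. "gptj", "gpt_neo", "gpt2"
        return AutoModelForCausalLM.from_pretrained(model_dir)
    return GPTNeoForCausalLM.from_pretrained(model_dir)  # fall back to the Neo backend
```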
Gnome Ann
4d9eab3785 K80 test 2021-09-23 20:57:18 -04:00
henk717
6520cac75d
Support models that are formatted with CRLF
A new model was released that uses different line-ending formatting (CRLF), which caused too many blank lines in the UI. This change fixes the issue so that when this happens the UI still displays the content as you would expect, removing the formatting burden from model developers.
2021-09-22 00:34:05 +02:00
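A minimal sketch of the fix described above (function name is an assumption): normalize CRLF, and any stray CR, before text reaches the UI so a CRLF-formatted model does not produce doubled blank lines.

```python
def normalize_newlines(text: str) -> str:
    # Convert Windows-style and bare-CR line endings to plain LF.
    return text.replace("\r\n", "\n").replace("\r", "\n")
```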
henk717
30a7e945a1
Merge pull request #18 from VE-FORBRYDERNE/doc
Correct misindicated model VRAM requirements
2021-09-21 18:54:19 +02:00
henk717
dd1c3ab67e Allow models to set formatting defaults
Originally omitted when model settings were forced. Now that models can only define defaults for KoboldAI, it's a good idea to give model authors control over what formatting they think works best for their models.
2021-09-21 15:46:54 +02:00
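A hypothetical sketch of the precedence rule described above (the dictionaries and function name are assumptions, not KoboldAI's actual schema): formatting preferences bundled with a model act only as defaults, so existing user settings always win.

```python
def apply_formatting_defaults(model_defaults: dict, user_settings: dict) -> dict:
    # User settings override any defaults the model ships with.
    return {**model_defaults, **user_settings}
```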
Gnome Ann
bbf2bd4026 Correct misindicated model VRAM requirements 2021-09-20 18:49:17 -04:00
Gnome Ann
8df2ccae5b Update client-side story name when saving
If you save a story as a different name than it was loaded with, and
then try to download it as JSON/plaintext, the downloaded file's name
will now match the new story name.
2021-09-19 23:40:52 -04:00