Commit Graph

1129 Commits

Author SHA1 Message Date
Gnome Ann 7c099fe93c Allow remote mode to load from client-side story files 2021-11-04 19:33:17 -04:00
henk717 2829c45ed6
Merge pull request #21 from VE-FORBRYDERNE/united
Softprompting bug fixes
2021-11-04 15:38:21 +01:00
Gnome Ann 81bd058caf Make sure calcsubmitbudget uses the correct reference to vars.actions 2021-11-03 18:57:02 -04:00
Gnome Ann a2d7735a51 Dynamic WI scanner should ignore triggers that are already in context 2021-11-03 18:55:53 -04:00
Gnome Ann ecfbbdb4a9 Merge branch 'united' into scan-test 2021-11-03 18:23:22 -04:00
Gnome Ann 0fa47b1249 Fix budget calculation for stories with at least one non-prompt chunk 2021-11-03 18:22:31 -04:00
Gnome Ann c11dab894e Put placeholder variables into calcsubmitbudget 2021-11-03 18:02:19 -04:00
Gnome Ann 9b18068999 Shallow copy story chunks when generating 2021-11-03 17:53:38 -04:00
Gnome Ann b8c3d8c12e Fix generator output having the wrong length 2021-11-03 16:10:12 -04:00
Gnome Ann 5b3ce4510f Make sure that soft_tokens is on the correct device 2021-11-03 16:07:50 -04:00
Gnome Ann 90fd5a538a Merge branch 'united' into scan-test 2021-11-03 12:42:18 -04:00
Gnome Ann fe2987d894 Fix missing break statement in device_config 2021-11-03 12:42:04 -04:00
Gnome Ann bd76ab333c Set numseqs to 1 if using dynamic world info scan 2021-11-03 12:28:17 -04:00
Gnome Ann 0a91ea27b3 Make the dynamic world info scan toggleable 2021-11-03 12:18:48 -04:00
Gnome Ann de3664e73c Add an assertion for the value of already_generated 2021-11-03 12:01:45 -04:00
Gnome Ann ec8ec55256 Dynamic world info scan 2021-11-03 11:54:48 -04:00
henk717 aa998ba5e9
Merge pull request #20 from VE-FORBRYDERNE/sp
Soft prompt support for PyTorch models
2021-10-30 00:35:44 +02:00
Gnome Ann 206c01008e Fix budget calculation when using soft prompt 2021-10-29 11:44:51 -04:00
henk717 c9c370aa17
Merge branch 'KoboldAI:main' into united 2021-10-28 23:29:29 +02:00
henk717 c59673efde
Merge pull request #77 from VE-FORBRYDERNE/patch
Create settings directory if it doesn't exist when using InferKit/OAI
2021-10-28 23:29:17 +02:00
Gnome Ann bf4e7742ac Patch GPTJForCausalLM, if it exists, to support soft prompting 2021-10-28 17:18:28 -04:00
Gnome Ann 40b4631f6c Clamp input_ids in place
Apparently transformers maintains an internal reference to input_ids
(to use for repetition penalty) so we have to clamp the internal
version, too, because otherwise transformers will throw an out-of-bounds
error upon attempting to access token IDs that are not in the
vocabulary.
2021-10-28 16:52:39 -04:00
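The in-place clamp this commit describes can be sketched roughly as follows. This is a minimal illustration of the idea, not the repository's actual code; the function name, tensor names, and vocabulary size are assumptions:

```python
import torch

def clamp_input_ids_in_place(input_ids: torch.Tensor, vocab_size: int) -> None:
    # Clamp in place so that any internal reference transformers keeps to
    # this same tensor (e.g. for the repetition penalty) also sees valid IDs.
    # Building a new tensor with torch.clamp() would leave that internal
    # reference pointing at the original, unclamped IDs and trigger the
    # out-of-bounds error described above.
    input_ids.clamp_(max=vocab_size - 1)

ids = torch.tensor([[5, 50400, 99999]])
clamp_input_ids_in_place(ids, vocab_size=50400)
```

The key point is `clamp_` (with the trailing underscore), PyTorch's in-place variant, which mutates the tensor that other code may still hold a reference to.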
Gnome Ann 24d5d63c9f Use the correct generation min and max when using soft prompt 2021-10-28 16:39:59 -04:00
Gnome Ann 511817132a Don't change the shape of transformer.wte 2021-10-28 15:39:59 -04:00
Gnome Ann a1ae11630a Make sure to cast vars.sp to the correct dtype 2021-10-28 13:22:07 -04:00
Gnome Ann 1556bd32a5 Use torch.where to inject the soft prompt instead of torch.cat 2021-10-28 13:20:14 -04:00
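One way a `torch.where`-based injection like the one above could work is to reserve token IDs at or beyond the real vocabulary as placeholders for soft prompt positions, then swap in the soft embeddings elementwise instead of concatenating them. This is a hypothetical sketch under that assumption, not the repository's implementation:

```python
import torch

def inject_soft_prompt(inputs_embeds: torch.Tensor,
                       input_ids: torch.Tensor,
                       soft_embeds: torch.Tensor,
                       vocab_size: int) -> torch.Tensor:
    # IDs >= vocab_size stand in for soft prompt slots; map them to
    # indices into the soft prompt embedding matrix.
    sp_index = (input_ids - vocab_size).clamp(min=0)
    sp = soft_embeds[sp_index]                        # (batch, seq, dim)
    mask = (input_ids >= vocab_size).unsqueeze(-1)    # (batch, seq, 1)
    # Elementwise select: soft embedding where masked, token embedding elsewhere.
    return torch.where(mask, sp, inputs_embeds)
```

Unlike `torch.cat`, this keeps the sequence length (and therefore position indices and attention masks) unchanged, which is one plausible motivation for the switch.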
Gnome Ann 248e0bd24b Fix soft prompt loading code 2021-10-28 00:29:42 -04:00
Gnome Ann 4e3cc93020 Merge branch 'united' into sp 2021-10-23 11:45:03 -04:00
henk717 7b73d7cfdd Single Line Mode
Adds Single Line Mode, optimized for things like chatbot testing and other cases where you want control over what happens after a paragraph.

This can also be used as a foundation for a chatbot-optimized interface mode.
2021-10-23 17:30:48 +02:00
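A single-line mode like the one described amounts to cutting generated text at the first newline so the model stops after one line. A hypothetical illustration of that trimming step (not the repository's actual implementation):

```python
def apply_single_line_mode(text: str) -> str:
    # Keep only the text up to (and excluding) the first newline,
    # giving the user control over what happens after each line.
    return text.split("\n", 1)[0]

trimmed = apply_single_line_mode("Hello there!\nGeneral Kenobi.")
```

In practice a newline would typically also be registered as a stopping criterion for the generator itself, so tokens past the first newline are never sampled at all.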
Gnome Ann 9e82ce34a6 HTML-escape strings in the soft prompt selection menu 2021-10-22 14:25:25 -04:00
Gnome Ann 1f449a9dda Soft prompt support (6B Colabs not supported yet) 2021-10-22 14:18:10 -04:00
Gnome Ann 3501f03153 Create settings directory if it doesn't exist when using InferKit/OAI 2021-10-21 23:33:32 -04:00
henk717 0f38dbc0ed Using VE's fork for now
Switching from the official huggingface transformers to VE's fork for the time being, until some of these changes land upstream.
2021-10-19 11:33:49 +02:00
henk717 fa0f8af1d6
Merge branch 'KoboldAI:main' into united 2021-10-15 08:23:06 +02:00
henk717 9513240dfb
Version bump
Since VE fixed important things in the editor, I want users to be able to notice this more easily.
2021-10-15 08:22:32 +02:00
henk717 fd7f9b7edf
Merge pull request #76 from VE-FORBRYDERNE/editor
Fix enter key behaviour in the editor when not using Firefox
2021-10-15 08:21:35 +02:00
Gnome Ann fdbe730a1f Fix an incorrect document.queryCommandSupported call 2021-10-13 12:51:31 -04:00
Gnome Ann 99d26c44e0 Handle CRLF newlines properly when pasting 2021-10-13 12:43:45 -04:00
Gnome Ann 718af6f7fa Pasting fallback for browsers with no execCommand support 2021-10-13 12:41:43 -04:00
Gnome Ann 407d8f7419 Also apply the enter patch to pasted text 2021-10-13 12:05:15 -04:00
Gnome Ann 3f5a3102a9 Change application.js version to avoid caching issues 2021-10-13 00:48:21 -04:00
Gnome Ann aaa0c3374e Fix problems with stories that end in newlines
Today I learned that the editor only works properly when the last
<chunk> tag has a <br> inside it at the end.  This last <br> is
invisible and is automatically created by all major browsers when you
use the enter key to type a newline at the end of a story to "prevent
the element from collapsing".  When there's more than one <br> at the
end of the last <chunk>, only the last of those <br>s is invisible, so
if you have three <br>s, they are rendered as two newlines.  This only
applies to the last <chunk>, so if the second-to-last <chunk> has three
<br>s at the end, they are still rendered as three newlines.  Since
the browser is really insistent on doing this, this commit mostly deals
with dynamically creating and deleting <br> tags at the ends of <chunk>
tags as needed to provide a consistent experience, and making sure
that all <br> tags actually go inside of <chunk> tags to prevent
breaking the editor.  The latter behaviour was exhibited by Chrome and
caused a bug when you added a newline at the end of your story using
the editor.
2021-10-13 00:42:03 -04:00
Gnome Ann b3d33cc852 Fix enter key behaviour in the editor when not using Firefox 2021-10-12 00:09:02 -04:00
henk717 0b62ed0892
Merge branch 'KoboldAI:main' into united 2021-10-07 03:17:03 +02:00
henk717 436d492b42
Merge pull request #74 from AngryBeeSec/patch-1
Update docker-compose.yml
2021-10-07 03:16:10 +02:00
AngryBeeSec 18d30a1235
Update docker-compose.yml
Should fix GPU issues on Arch-based systems.
2021-10-06 20:50:50 -04:00
henk717 c854a62549 Clarified GPU Layers
Having both breakmodel_layers and layers is confusing, so the new method has been renamed to breakmodel_gpulayers. The old one should no longer be used, but since it works in reverse we leave it in so existing scripts don't break.
2021-10-06 18:55:01 +02:00
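The rename above can be pictured as an argparse setup that introduces the clearer flag while keeping the old spelling as a deprecated alias. This is a hypothetical sketch; the repository's actual flag handling may differ:

```python
import argparse

parser = argparse.ArgumentParser()
# New, clearer name: comma-separated number of layers to put on each GPU.
parser.add_argument("--breakmodel_gpulayers", type=str, default="")
# Old name kept so existing scripts don't break.
parser.add_argument("--breakmodel_layers", type=str, default="",
                    help="(deprecated) use --breakmodel_gpulayers instead")

args = parser.parse_args(["--breakmodel_gpulayers", "20,12"])
layers = [int(n) for n in args.breakmodel_gpulayers.split(",")]
```

Keeping the deprecated flag registered (rather than deleting it) is what preserves backwards compatibility for scripts that still pass the old name.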
henk717 bd063f7590
Merge pull request #19 from VE-FORBRYDERNE/multi-gpu
Multiple GPU support
2021-10-06 18:50:58 +02:00
Gnome Ann 3649ba9fa4 Breakmodel's CUDA stream should be on primary device 2021-10-06 12:04:56 -04:00
henk717 82c7eaffb5
Merge branch 'KoboldAI:main' into united 2021-10-06 00:26:08 +02:00