Commit Graph

1617 Commits

Author SHA1 Message Date
Divided by Zer0
2de9672b95 attempt1 2023-02-23 18:27:11 +01:00
Divided by Zer0
d459bdb1c0 adjust 2023-02-22 23:12:34 +01:00
jojorne
d3bedfcbda Include koboldai_vars.save_story(). 2023-02-22 15:42:56 -03:00
jojorne
d6c9f5f1f5 Save/Load Story with UTF-8 encoding. 2023-02-22 14:40:42 -03:00
Henk
9e6a5db745 UI1 Botname 2023-02-19 16:22:26 +01:00
henk717
93f313d6e3 Merge pull request #291 from pi6am/fix/save-as
Fix an exception using Save As from the classic UI
2023-02-19 13:21:10 +01:00
Llama
117f0659c3 Fix exception using Save As from the classic UI
The `saveas` method was modified to take a data dict but one of the
else blocks still referred to the previous `name` parameter. Assign
to `name` to fix the `NameError: name 'name' is not defined` exception.
2023-02-18 23:41:17 -08:00
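The bug pattern described in this commit (a refactored signature leaving a stale reference in one branch) can be sketched minimally; the names and branches below are hypothetical, not KoboldAI's actual code:

```python
def saveas(data):
    # The method now receives a data dict instead of a `name` argument.
    # Without this assignment, any branch that still references `name`
    # raises: NameError: name 'name' is not defined.
    name = data["name"]
    if data.get("pins"):
        target = name + "_pinned.json"
    else:
        target = name + ".json"  # this branch used the removed parameter
    return target
```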
Henk
cd566caf20 Revision Fixes (Removes the workaround) 2023-02-19 00:51:50 +01:00
Henk
aa6ce9088b Fix vscode artifacting 2023-02-18 18:47:15 +01:00
Henk
a9a724e38c Merge branch 'main' into united 2023-02-18 18:14:03 +01:00
ebolam
c8059ae07e Fix for UI1 upload bug 2023-02-13 14:35:55 -05:00
henk717
d7cce6473a Merge pull request #279 from ebolam/UI2
Chat mode stopper fix, Class Split Adjustments
2023-02-11 23:58:54 +01:00
--global
a680d2baa5 Fix boolean API schemas name clash 2023-02-11 15:11:35 +02:00
ebolam
d2ff58c2dd Chat mode stopper modified to only stop, not trim output (using Henky's regex for that) 2023-02-10 16:10:23 -05:00
ebolam
4439e1d637 Chat Mode Stopper Test 2023-02-09 11:40:00 -05:00
ebolam
0487b271b8 Merge branch 'UI2' of https://github.com/ebolam/KoboldAI into UI2 2023-02-09 10:54:48 -05:00
ebolam
b29d4cef29 Add basic authentication option for webUI image generation 2023-02-09 10:54:38 -05:00
ebolam
709bd634b3 Merge pull request #365 from ebolam/chatmodestoppertest
Fix for chat mode stopper
2023-02-09 10:52:34 -05:00
ebolam
ea6af45e95 Fix for chat mode stopper 2023-02-01 18:51:17 -05:00
henk717
2f09916b20 Merge pull request #272 from ebolam/UI2
Fixed for UI2
2023-02-02 00:03:09 +01:00
Henk
3921ed1216 Lock CORS behind --host 2023-02-01 23:52:35 +01:00
henk717
669b68748d Merge pull request #271 from LostRuins/united
Adding CORS support to allow cross origin requests.
2023-02-01 23:41:00 +01:00
Henk
0920085695 Experimental EOT Support 2023-01-31 21:00:17 +01:00
ebolam
26aabaa6fb Merge branch 'henk717:united' into UI2 2023-01-31 07:34:08 -05:00
Henk
739cccd8ed Revision Fixes 2023-01-31 04:48:46 +01:00
Henk
640bd64037 Revision Fixes (And Var Workaround) 2023-01-31 04:00:35 +01:00
Henk
b276c5ff15 EOS Hiding Workaround (For models that want EOS behavior) 2023-01-31 01:46:55 +01:00
Concedo
d9e4b74e37 Adding CORS support to allow cross origin requests. Added new required dependency: the flask-cors package. 2023-01-30 12:57:58 +08:00
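The effect of the flask-cors dependency boils down to adding response headers like the following; this is a dependency-free sketch of what the middleware does, not the project's code:

```python
def add_cors_headers(headers, origin="*"):
    # flask-cors automates header handling like this on every response,
    # which is what permits cross-origin requests from browser clients.
    headers["Access-Control-Allow-Origin"] = origin
    headers["Access-Control-Allow-Methods"] = "GET, POST, OPTIONS"
    headers["Access-Control-Allow-Headers"] = "Content-Type"
    return headers
```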
ebolam
5aca142034 Added blank image. Shows when action is selected and there is no action image for that action 2023-01-19 15:55:10 -05:00
ebolam
33f7478bf4 Merge branch 'UI2' of https://github.com/ebolam/KoboldAI into UI2 2023-01-19 08:06:46 -05:00
ebolam
bf665838e0 Potential desync fix 2023-01-19 08:06:40 -05:00
SammCheese
2e93b12aff add a info reference for the model selection 2023-01-15 20:50:00 +01:00
Henk
ed62d104ee --cacheonly 2023-01-14 21:14:39 +01:00
Henk
469fb8a5fe Transformers 4.25.1
This is a breaking change that allows 4.25.1 to work because they have also made breaking changes. If you do not make use of our automatic updater, please update the dependencies when updating to this build.
2023-01-13 19:11:21 +01:00
Henk
f1739dd184 Chatmode Regex 2023-01-13 13:04:54 +01:00
Henk
d6a941de61 Restore Chat Models Menu + Safetensors Workaround
This commit restores the chat models menu now that we finally have good chat models available again.

Unfortunately huggingface reports back pytorch_model.bin even if the model's weights file is named model.safetensors. I don't have a good way to combat this at the moment, so instead we now use a hack: if the model copy fails, it manually tries model.safetensors instead, hoping that it will work.

This fixes Pygmalion for now, if new issues arise from this in the future from other models we have to implement a cleaner method.
2023-01-11 22:24:12 +01:00
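The fallback hack described above can be sketched as a try-then-retry on the alternative filename; `copy_file` is a hypothetical stand-in for whatever routine fetches a single weights file:

```python
import os

def copy_model_weights(copy_file, model_dir):
    # Sketch of the workaround: try the filename the hub reports first.
    try:
        return copy_file(os.path.join(model_dir, "pytorch_model.bin"))
    except FileNotFoundError:
        # The hub may report pytorch_model.bin even when the repo actually
        # ships model.safetensors, so retry with that filename.
        return copy_file(os.path.join(model_dir, "model.safetensors"))
```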
Henk
271e4ed06b Chat Mode Improvements
This commit decouples single line mode; well-behaved models no longer need it since we stop at the `You:` marker.

There are scenarios, however, where this potentially breaks chat mode completely or makes models more frustrating to use. Users who experience this can enable Single Line mode in the formatting menu to restore the old behavior.

I have also allowed token streaming again, since the issues with it have already been resolved.
2023-01-11 21:33:25 +01:00
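The stop condition this commit relies on — halting when the model begins writing the user's next line — can be sketched with a small regex check; the `You:` marker is illustrative and would depend on the chat participants:

```python
import re

def should_stop(generated_text, user_name="You"):
    # Stop generation once a line begins with the user's chat marker,
    # e.g. "You:", so the model never speaks on the user's behalf.
    return bool(re.search(rf"(?m)^{re.escape(user_name)}:", generated_text))
```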
ebolam
489c1ffd80 Correction to UI2 to redirect to the API when --no_ui is present. 2023-01-10 16:06:21 -05:00
ebolam
5d8485273a Remove really old debug message for chat mode 2023-01-10 08:58:09 -05:00
ebolam
9b1138bafa Added in alternative rep pen calculation (log instead of linear application) as an option. 2023-01-10 08:45:55 -05:00
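One plausible reading of "log instead of linear application" for repetition penalty — my interpretation for illustration, not necessarily the repository's exact formula — contrasts the conventional sign-dependent scaling with a uniform log-space shift:

```python
import math

def rep_pen_linear(logit, penalty):
    # Conventional CTRL-style application: divide positive logits by the
    # penalty, multiply negative ones, so the effect depends on the sign.
    return logit / penalty if logit > 0 else logit * penalty

def rep_pen_log(logit, penalty):
    # Alternative reading: subtract log(penalty), a uniform shift in
    # log-probability space regardless of the logit's sign.
    return logit - math.log(penalty)
```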
ebolam
3b551ec7ab Added custom model path for model selection. Shows the custom model input box for the user to enter a path and/or a hugging face model 2023-01-09 13:02:57 -05:00
henk717
86f1694290 Merge pull request #254 from one-some/united
Fix image retry
2023-01-09 01:34:13 +01:00
somebody
ac5c001fc3 Fix image retry 2023-01-08 18:27:23 -06:00
henk717
535b9bdfdf Merge pull request #251 from one-some/bias-improvements
Bias improvements
2023-01-08 17:45:41 +01:00
somebody
f6c4bfb390 New bias token control (and lack thereof)
Allows square bracket syntax for using token ids, curly bracket syntax
for strict phrasing, and normal now biases alternative phrases with
space prefixes
2023-01-07 23:59:46 -06:00
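The three bias syntaxes this commit describes can be sketched as a small classifier; this is a hypothetical parser showing the distinction, not the project's implementation:

```python
def classify_bias_phrase(phrase):
    # Sketch of the three syntaxes described above:
    #   [1, 2, 3]  -> explicit token ids
    #   {exact}    -> strict verbatim phrasing
    #   plain text -> also bias the space-prefixed alternative phrase
    if phrase.startswith("[") and phrase.endswith("]"):
        ids = [int(tok) for tok in phrase[1:-1].split(",") if tok.strip()]
        return ("token_ids", ids)
    if phrase.startswith("{") and phrase.endswith("}"):
        return ("verbatim", phrase[1:-1])
    return ("with_alternatives", [phrase, " " + phrase])
```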
somebody
76c1398917 Small cleanup with imports
and unused threading code
2023-01-07 20:19:31 -06:00
somebody
d298915182 Disable load model button on colab/when prompted
Currently switching models makes the TPU cry and we do not want to upset
our dearest hardware friend.

In the future it'd probably be best to teach the TPU how to not be
afraid of switching models, provided we have some limitations that
prevent loading itty bitty or way too big models on the TPU.
2023-01-07 13:38:27 -06:00
henk717
08e72d0f4d Merge pull request #244 from pi6am/fix/load-non-started
Fix loading a non-started V1 story json in UI2
2023-01-06 16:29:52 +01:00
Llama
ae215b94d5 Fix loading a non-started V1 story json in UI2
There was an exception loading or importing a V1 story json into UI2 if
the story had zero actions. Due to incorrect indentation, the V1 load
process would attempt to use an uninitialized temp_story_class variable.

However, even after fixing this, it was impossible to continue playing
the saved story, because it was not possible to submit a blank action
when a story only had a prompt and zero actions. Modify the logic around
the first submission to handle a non-empty initial prompt. This allows
the player to begin playing the loaded game.
2023-01-05 23:42:24 -08:00
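The adjusted first-submission logic this commit describes — permitting a blank action only when a loaded story has a prompt but zero actions — can be sketched as follows; the function and parameter names are hypothetical:

```python
def submission_allowed(prompt, actions, text):
    # A non-empty submission is always allowed. An empty one is allowed
    # only on the first turn of a loaded story that has a prompt but no
    # actions yet, so the player can continue a freshly imported V1 save.
    if text.strip():
        return True
    return bool(prompt) and len(actions) == 0
```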
somebody
ab0277549c Merge branch 'united' of https://github.com/henk717/KoboldAI into screenshot 2023-01-02 12:14:38 -06:00