Commit Graph

455 Commits

Author SHA1 Message Date
Gnome Ann b0ab30cec4 Re-enable GPU-only generation option 2021-11-14 18:24:51 -05:00
henk717 3e38b462c6 Hidden Size fix for GPT2 Custom
Replaced the JS Hidden Size load with the newer function to fix these models
2021-11-14 16:40:04 +01:00
henk717 f227a876c0
Merge pull request #27 from VE-FORBRYDERNE/united
Merge branch 'main' into united
2021-11-14 03:59:26 +01:00
Gnome Ann 21b19b81dd Merge branch 'main' into united 2021-11-13 21:58:27 -05:00
henk717 7b47a8457a
Merge pull request #80 from VE-FORBRYDERNE/main
Improved Unix Support
2021-11-14 03:56:56 +01:00
henk717 ecea169553 Improved Unix Support
Changes the line endings to Unix format and sets KoboldAI to launch with Python 3 when executed directly.

(cherry picked from commit 5b0977ceb6807c0f80ce6717891ef5e23c8eeb77)
2021-11-13 21:54:32 -05:00
henk717 1596a238f7 Breakmodel automation
The only change is a small addition to the breakmodel section: GPU 0 is automatically chosen if the CLI options are used without specifying breakmodel. Line endings have been changed to Linux formatting for compatibility reasons.
2021-11-14 03:13:52 +01:00
henk717 8a916116e3
Remove device=0 because of incompatibility
device=0 breaks some of the PyTorch implementations; it has been removed to restore hardware compatibility to 0.16 levels.
2021-11-14 02:33:27 +01:00
henk717 4bcffc614e
Allow directly running KoboldAI from CLI in Linux
It's made for Python 3, so we assume python3 is installed in its usual location. If it isn't, you can always run it yourself with whatever command you used prior to this change.
2021-11-14 01:57:43 +01:00
henk717 21ae45e9ab
Merge branch 'KoboldAI:main' into united 2021-11-11 17:05:39 +01:00
henk717 8ad3863854
Merge pull request #26 from VE-FORBRYDERNE/sp-patch
More softprompting bug fixes
2021-11-11 17:05:32 +01:00
henk717 4ebece0a6f
Merge pull request #79 from VE-FORBRYDERNE/broadcast-patch
Don't broadcast emit calls inside do_connect()
2021-11-11 17:05:13 +01:00
Gnome Ann 1fadcbe1e3 Send allowsp command on connect instead of on startup 2021-11-11 00:18:46 -05:00
Gnome Ann 2fe815e092 Don't broadcast emit calls inside do_connect()
This prevents the "thinking" animation from appearing on top of the
submit button under certain circumstances:

* When someone connects to the KoboldAI server while the model is
  generating (occurs after generation finishes)
* Occasionally, the browser may suddenly disconnect from and reconnect to
  Flask-SocketIO during generation, which causes the same problem
2021-11-11 00:14:12 -05:00
Gnome Ann 11b0291bc4 Use model.transformer.embed_dim if model.transformer.hidden_size doesn't exist 2021-11-10 17:47:14 -05:00
Gnome Ann 752e19a2bb Fix vars.modeldim not always being set 2021-11-10 17:38:30 -05:00
henk717 e6599db78f
Merge pull request #25 from VE-FORBRYDERNE/united
Merge branch 'main' into united
2021-11-10 03:37:43 +01:00
Gnome Ann 2679df9664 Merge branch 'main' into united 2021-11-09 21:33:14 -05:00
henk717 c2371cf801
Merge pull request #23 from VE-FORBRYDERNE/scan-test
Dynamic world info scan
2021-11-10 03:31:42 +01:00
henk717 d5a26e8c20
Fixed root permissions
The Docker image has been changed to no longer run these commands as root; root permissions were added for the relevant commands to fix the Docker build.
2021-11-10 03:17:12 +01:00
henk717 4af0d9dabd
Merge pull request #78 from VE-FORBRYDERNE/patch
Allow remote mode to load from client-side story files
2021-11-06 16:58:05 +01:00
Gnome Ann 02a56945de Version bump 2021-11-06 11:50:56 -04:00
henk717 bc0f9c8032 Allow remote mode to load from client-side story files
(cherry picked from commit a1345263df)
2021-11-06 11:48:20 -04:00
Gnome Ann 7ea6f58b1a Resolve merge conflict 2021-11-05 11:03:50 -04:00
henk717 a1345263df
Merge pull request #22 from VE-FORBRYDERNE/filereader
Allow remote mode to load from client-side story files
2021-11-05 02:30:20 +01:00
Gnome Ann 7a0b0b0d2d Remove debug logging from application.js 2021-11-04 19:36:45 -04:00
Gnome Ann 7c099fe93c Allow remote mode to load from client-side story files 2021-11-04 19:33:17 -04:00
henk717 2829c45ed6
Merge pull request #21 from VE-FORBRYDERNE/united
Softprompting bug fixes
2021-11-04 15:38:21 +01:00
Gnome Ann 81bd058caf Make sure calcsubmitbudget uses the correct reference to vars.actions 2021-11-03 18:57:02 -04:00
Gnome Ann a2d7735a51 Dynamic WI scanner should ignore triggers that are already in context 2021-11-03 18:55:53 -04:00
Gnome Ann ecfbbdb4a9 Merge branch 'united' into scan-test 2021-11-03 18:23:22 -04:00
Gnome Ann 0fa47b1249 Fix budget calculation for stories with at least one non-prompt chunk 2021-11-03 18:22:31 -04:00
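The budget calculations referenced in these commits can be illustrated with a generic context-budget sketch. All names and the exact accounting here are hypothetical, not KoboldAI's actual calcsubmitbudget logic: reserve room for the prompt and the requested generation length, then fill the remaining budget with story chunks, newest first.

```python
def calc_submit_budget(max_context, gen_amount, prompt_tokens, chunk_tokens):
    # Hypothetical sketch of a context-budget calculation.
    budget = max_context - len(prompt_tokens) - gen_amount
    selected = []
    for chunk in reversed(chunk_tokens):  # newest story chunk is last
        if len(chunk) > budget:
            break
        selected.insert(0, chunk)
        budget -= len(chunk)
    return prompt_tokens + [t for c in selected for t in c]
```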
Gnome Ann c11dab894e Put placeholder variables into calcsubmitbudget 2021-11-03 18:02:19 -04:00
Gnome Ann 9b18068999 Shallow copy story chunks when generating 2021-11-03 17:53:38 -04:00
Gnome Ann b8c3d8c12e Fix generator output having the wrong length 2021-11-03 16:10:12 -04:00
Gnome Ann 5b3ce4510f Make sure that soft_tokens is on the correct device 2021-11-03 16:07:50 -04:00
Gnome Ann 90fd5a538a Merge branch 'united' into scan-test 2021-11-03 12:42:18 -04:00
Gnome Ann fe2987d894 Fix missing break statement in device_config 2021-11-03 12:42:04 -04:00
Gnome Ann bd76ab333c Set numseqs to 1 if using dynamic world info scan 2021-11-03 12:28:17 -04:00
Gnome Ann 0a91ea27b3 Make the dynamic world info scan toggleable 2021-11-03 12:18:48 -04:00
Gnome Ann de3664e73c Add an assertion for the value of already_generated 2021-11-03 12:01:45 -04:00
Gnome Ann ec8ec55256 Dynamic world info scan 2021-11-03 11:54:48 -04:00
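The idea behind this commit, together with the later fix that ignores triggers already in context, can be sketched as follows. This is an illustrative sketch with hypothetical names, not KoboldAI's actual scanner:

```python
def scan_world_info(generated_text, context_text, entries):
    # Trigger keys that appear in newly generated text activate their
    # entry, but triggers already present in the context are ignored
    # (their entry is already in the model's view).
    activated = []
    for trigger, content in entries:
        if trigger in generated_text and trigger not in context_text:
            activated.append(content)
    return activated
```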
henk717 aa998ba5e9
Merge pull request #20 from VE-FORBRYDERNE/sp
Soft prompt support for PyTorch models
2021-10-30 00:35:44 +02:00
Gnome Ann 206c01008e Fix budget calculation when using soft prompt 2021-10-29 11:44:51 -04:00
henk717 c9c370aa17
Merge branch 'KoboldAI:main' into united 2021-10-28 23:29:29 +02:00
henk717 c59673efde
Merge pull request #77 from VE-FORBRYDERNE/patch
Create settings directory if it doesn't exist when using InferKit/OAI
2021-10-28 23:29:17 +02:00
Gnome Ann bf4e7742ac Patch GPTJForCausalLM, if it exists, to support soft prompting 2021-10-28 17:18:28 -04:00
Gnome Ann 40b4631f6c Clamp input_ids in place
Apparently transformers maintains an internal reference to input_ids
(to use for repetition penalty) so we have to clamp the internal
version, too, because otherwise transformers will throw an out-of-bounds
error upon attempting to access token IDs that are not in the
vocabulary.
2021-10-28 16:52:39 -04:00
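Why clamping must happen in place can be shown with a plain-Python stand-in (the real code operates on tensors; this list-based helper is purely illustrative):

```python
def clamp_(token_ids, max_id):
    # In-place clamp: mutate the list object itself, so any other holder
    # of the same reference (here, a stand-in for transformers' internal
    # copy used for repetition penalty) sees the clamped values too.
    for i, t in enumerate(token_ids):
        if t > max_id:
            token_ids[i] = max_id
    return token_ids
```

In PyTorch the analogous in-place call is `input_ids.clamp_(max=vocab_size - 1)`; the trailing underscore marks an in-place operation, whereas `clamp` without it returns a new tensor and leaves the internal reference unclamped.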
Gnome Ann 24d5d63c9f Use the correct generation min and max when using soft prompt 2021-10-28 16:39:59 -04:00
Gnome Ann 511817132a Don't change the shape of transformer.wte 2021-10-28 15:39:59 -04:00