270 Commits

Author SHA1 Message Date
Gnome Ann
caef3b7460 Disable low_cpu_mem_usage when using GPT-2
Attempting to use transformers 4.11.0's experimental `low_cpu_mem_usage`
feature with GPT-2 models usually results in the output repeating a
token over and over or otherwise containing an incoherent response.
2021-12-20 19:54:19 -05:00
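The workaround described in this commit can be sketched as a version-and-model guard on the loader kwargs. This is a hypothetical helper for illustration, not the actual KoboldAI code; the function name and parameters are assumptions:

```python
def model_load_kwargs(model_type: str, transformers_version: str) -> dict:
    """Build from_pretrained() keyword arguments, enabling the
    experimental low_cpu_mem_usage flag (added in transformers
    4.11.0) only for model types where it is known to behave.
    GPT-2 is excluded because the flag produces incoherent output."""
    major, minor = (int(x) for x in transformers_version.split(".")[:2])
    kwargs = {}
    if (major, minor) >= (4, 11) and model_type != "gpt2":
        kwargs["low_cpu_mem_usage"] = True
    return kwargs
```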
henk717
7b56940ed7
Merge pull request #47 from VE-FORBRYDERNE/scripting
Lua API fixes
2021-12-20 04:32:25 +01:00
Gnome Ann
341b153360 Lua API fixes
* `print()` and `warn()` now work correctly with `nil` arguments
* Typo: `gpt-neo-1.3M` has been corrected to `gpt-neo-1.3B`
* Regeneration is no longer triggered when writing to `keysecondary` of
  a non-selective key
* Handle `genamt` changes in generation modifier properly
* Writing to `kobold.settings.numseqs` from a generation modifier no
  longer affects the current generation
* Formatting options in `kobold.settings` have been fixed
* Added aliases for setting names
* Fix behaviour of editing story chunks from a generation modifier
* Warnings are now yellow instead of red
* `kobold.logits` now contains the raw logits prior to filtering, as
  the documentation states, rather than the filtered logits
* Some erroneous comments and error messages have been corrected
* These parts of the API have now been implemented properly:
    * `compute_context()` methods
    * `kobold.authorsnote`
    * `kobold.restart_generation()`
2021-12-19 20:18:28 -05:00
Gnome Ann
6aba869fb7 Make sure uninitialized WI entries are given UIDs when loading saves 2021-12-18 18:00:06 -05:00
Gnome Ann
769333738d Fix behaviour of kobold.outputs with read-only and no prompt gen 2021-12-17 12:59:01 -05:00
henk717
6d9063fb8b No Prompt Gen
Allow people to enter a prompt without the AI generating anything. Combined with "always add prompt", this is a very useful feature that lets people write world information first and then take a specific action. It mimics the behavior previously seen in AI Dungeon forks, which prompt for world information and then ask for an action, and can be particularly useful for people who want the prompt to always be part of the generation.
2021-12-16 12:47:44 +01:00
henk717
f3b4ecabca
Merge pull request #44 from VE-FORBRYDERNE/patch
Fix an error that occurs when all layers are on second GPU
2021-12-16 01:43:03 +01:00
henk717
e3d9c2d690 New download mechanism
Automatically converts Huggingface cache models to full models on (down)load.
WARNING: This wipes the old cache/ dir inside the KoboldAI folder, so make a backup before you run these models if you are bandwidth constrained.
2021-12-16 01:40:04 +01:00
Gnome Ann
19d2356253 Fix an error that occurs when all layers are on second GPU 2021-12-15 19:03:49 -05:00
henk717
5e3e3f3578 Fix float16 models
Forcibly convert float16 models so they work on the CPU
2021-12-16 00:31:51 +01:00
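The fix above amounts to picking an inference dtype based on the target device. A minimal sketch, assuming a hypothetical `resolve_dtype` helper (the real code operates on the loaded torch model directly):

```python
def resolve_dtype(model_dtype: str, device: str) -> str:
    """Pick an inference dtype for the target device. float16
    kernels are widely unsupported on CPU, so weights stored as
    float16 are upcast to float32 when no GPU is in use."""
    if device == "cpu" and model_dtype == "float16":
        return "float32"
    return model_dtype
```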
Gnome Ann
9097aac4a8 Show full stack trace for generator errors to help in diagnosing errors 2021-12-15 02:03:08 -05:00
Gnome Ann
2687135e05 Fix a strange bug where max tokens was capped at 1024
This seems to be related to the model config files, because only certain
models have this problem, and replacing ALL configuration files of a
"bad" model with those of a "good" model of the same type would fix the
problem.

Shouldn't be required anymore.
2021-12-15 00:45:41 -05:00
Gnome Ann
1551c45ba4 Prevent dynamic scanning from generating too many tokens 2021-12-14 23:39:04 -05:00
Gnome Ann
629988ce13 Fix a problem with the Lua regeneration API
It was an egregious typo that caused tokens to be rearranged on
regeneration.
2021-12-14 23:04:03 -05:00
henk717
6670168a47 Update aiserver.py 2021-12-14 16:26:23 +01:00
Gnome Ann
6e6e0b2b4d Allow Lua to stop generation from input modifier 2021-12-13 19:32:01 -05:00
Gnome Ann
e9ed8602b2 Add a "corescript" setting 2021-12-13 19:28:33 -05:00
Gnome Ann
e5bb20cc8f Fix Lua regeneration system 2021-12-13 19:17:18 -05:00
Gnome Ann
462040ed6f Restore missing loadsettings() call 2021-12-13 18:39:33 -05:00
Gnome Ann
661cca63e8 Make sure stopping criteria still work with dynamic scan off 2021-12-13 18:10:51 -05:00
Gnome Ann
338d437ea3 Use eventlet instead of gevent-websocket 2021-12-13 17:19:04 -05:00
Gnome Ann
34c52a1a23 Remove escape characters from all error messages 2021-12-13 11:47:34 -05:00
Gnome Ann
11f9866dbe Enable more of the IO library in Lua sandbox
Also changes the Lua warning color to red.
2021-12-13 11:22:58 -05:00
Gnome Ann
28e86563b8 Change self.scores to scores in aiserver.py 2021-12-13 11:18:01 -05:00
Gnome Ann
82e149ee02 Catch Lua errors properly 2021-12-13 02:32:09 -05:00
Gnome Ann
5f06d20085 Format Lua printed messages and warnings 2021-12-13 01:59:53 -05:00
Gnome Ann
d2f5544468 Add Userscripts menu into GUI 2021-12-13 01:03:26 -05:00
Gnome Ann
5d13339a52 Allow the retry button to call the Lua scripts properly 2021-12-12 20:48:10 -05:00
Gnome Ann
39bfb0862a Allow user input to be modified from Lua
Also adds some handlers in the Lua code for when the game is not started
yet
2021-12-12 20:44:03 -05:00
Gnome Ann
fbf3e7615b Add API for generated tokens and output text 2021-12-12 19:27:20 -05:00
Gnome Ann
ceabd2ef7b Add Lua API for editing logits during generation
TPU backend not supported yet.
2021-12-12 16:18:45 -05:00
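The logits-editing API can be pictured as a per-step callback over the raw scores before sampling. This is an illustrative pure-Python sketch; the function names are assumptions, and the real interface is the Lua `kobold.logits` table:

```python
def apply_generation_modifier(logits, modifier):
    """Run a user-supplied modifier over each sequence's per-token
    logits before sampling; the modifier returns the edited row."""
    return [modifier(list(row)) for row in logits]

def suppress_first_token(row):
    # Example modifier: forbid token id 0 by driving its score to -inf.
    row[0] = float("-inf")
    return row
```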
Gnome Ann
e2c3ac041b Complete the Lua generation halting API 2021-12-12 12:52:03 -05:00
Gnome Ann
d76dd35791 Add Lua API for reading model information 2021-12-12 12:09:59 -05:00
Gnome Ann
00eb125ad0 Allow Lua API to toggle dynamic scan 2021-12-12 01:55:46 -05:00
Gnome Ann
5692a7dfe2 Add Lua API for reading the text the user submitted to the AI 2021-12-12 01:52:42 -05:00
Gnome Ann
03453c4e27 Change script directory tree
Userscripts have been moved from /scripts/userscripts to /userscripts.

Core scripts have been moved from /scripts/corescripts to /cores.
2021-12-11 23:46:30 -05:00
Gnome Ann
36209bfe69 Add Lua API for story chunks 2021-12-11 23:44:07 -05:00
Gnome Ann
8e6a62259e Fix the Lua tokenizer API 2021-12-11 21:24:34 -05:00
Gnome Ann
67974947b2 Fix numerous problems in the Lua world info API 2021-12-11 19:11:38 -05:00
Gnome Ann
3327f1b471 Fix Lua settings API 2021-12-11 17:01:41 -05:00
Gnome Ann
f8aa578f41 Enable generation modifiers for transformers backend only 2021-12-11 16:28:25 -05:00
Gnome Ann
e289a0d360 Connect bridge.lua to aiserver.py
Also enables the use of input modifiers and output modifiers, but not
generation modifiers.
2021-12-11 12:45:45 -05:00
Gnome Ann
35966b2007 Upload bridge.lua, default.lua and some Lua libs
base64
inspect
json.lua
Lua-hashings
Lua-nums
Moses
mt19937ar-lua
Penlight
Serpent
2021-12-10 19:45:57 -05:00
Gnome Ann
683bcb824f Merge branch 'united' into world-info 2021-12-05 13:06:32 -05:00
Gnome Ann
6d8517e224 Fix some minor coding errors 2021-12-05 11:39:59 -05:00
Gnome Ann
150ce033c9 TPU backend no longer needs to recompile after changing softprompt 2021-12-05 02:49:15 -05:00
Gnome Ann
b99ac92a52 WI folders and WI drag-and-drop 2021-12-04 23:59:28 -05:00
henk717
44d8068bab Ngrok Support
Not recommended for home users due to DDoS risks, but might make Colab tunnels more reliable.
2021-11-29 18:11:14 +01:00
Gnome Ann
9f51c42dd4 Allow bad words filter to ban <|endoftext|> token
The official transformers bad words filter doesn't allow this by
default, whereas Finetune's version does.
2021-11-27 11:42:06 -05:00
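The essence of a bad-words filter that can also ban the end-of-text token is masking banned ids in the score vector. A minimal sketch with an illustrative helper name (GPT-2's `<|endoftext|>` id is 50256):

```python
def ban_token_ids(logits, bad_ids):
    """Mask banned token ids, which may include the <|endoftext|>
    id, by setting their scores to -inf so they cannot be sampled."""
    banned = set(bad_ids)
    return [
        float("-inf") if i in banned else score
        for i, score in enumerate(logits)
    ]
```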
henk717
2bc93ba37a
Whitelist 6B in breakmodel
Now that we properly support it, allow the menu option to use breakmodel
2021-11-27 10:09:54 +01:00