6aba869fb7
Make sure uninitialized WI entries are given UIDs when loading saves
2021-12-18 18:00:06 -05:00
769333738d
Fix behaviour of kobold.outputs with read-only and no prompt gen
2021-12-17 12:59:01 -05:00
6d9063fb8b
No Prompt Gen
...
Allow people to enter a prompt without the AI generating anything. Combined with the "always add prompt" option, this is a very useful feature: it lets people write world information first and then take a specific action. This mimics the behavior previously seen in AI Dungeon forks, which prompt for world information and then ask for an action, and it can be particularly useful for people who want the prompt to always be part of the generation.
2021-12-16 12:47:44 +01:00
f3b4ecabca
Merge pull request #44 from VE-FORBRYDERNE/patch
...
Fix an error that occurs when all layers are on second GPU
2021-12-16 01:43:03 +01:00
e3d9c2d690
New download mechanism
...
Automatically converts Hugging Face cache models to full models on (down)load.
WARNING: This wipes the old cache/ dir inside the KoboldAI folder. Make a backup before you run these models if you are bandwidth constrained.
2021-12-16 01:40:04 +01:00
19d2356253
Fix an error that occurs when all layers are on second GPU
2021-12-15 19:03:49 -05:00
5e3e3f3578
Fix float16 models
...
Forcefully convert float16 models to work on the CPU
2021-12-16 00:31:51 +01:00
9097aac4a8
Show full stack trace for generator errors to help in diagnosing errors
2021-12-15 02:03:08 -05:00
2687135e05
Fix a strange bug where max tokens was capped at 1024
...
This seems to be related to the model config files, because only certain
models have this problem, and replacing ALL configuration files of a
"bad" model with those of a "good" model of the same type would fix the
problem.
Shouldn't be required anymore.
2021-12-15 00:45:41 -05:00
1551c45ba4
Prevent dynamic scanning from generating too many tokens
2021-12-14 23:39:04 -05:00
629988ce13
Fix a problem with the Lua regeneration API
...
It was an egregious typo that caused tokens to be rearranged on
regeneration.
2021-12-14 23:04:03 -05:00
6670168a47
Update aiserver.py
2021-12-14 16:26:23 +01:00
6e6e0b2b4d
Allow Lua to stop generation from input modifier
2021-12-13 19:32:01 -05:00
e9ed8602b2
Add a "corescript" setting
2021-12-13 19:28:33 -05:00
e5bb20cc8f
Fix Lua regeneration system
2021-12-13 19:17:18 -05:00
462040ed6f
Restore missing loadsettings() call
2021-12-13 18:39:33 -05:00
661cca63e8
Make sure stopping criteria still work with dynamic scan off
2021-12-13 18:10:51 -05:00
338d437ea3
Use eventlet instead of gevent-websocket
2021-12-13 17:19:04 -05:00
34c52a1a23
Remove escape characters from all error messages
2021-12-13 11:47:34 -05:00
11f9866dbe
Enable more of the IO library in Lua sandbox
...
Also changes the Lua warning color to red.
2021-12-13 11:22:58 -05:00
28e86563b8
Change self.scores to scores in aiserver.py
2021-12-13 11:18:01 -05:00
82e149ee02
Catch Lua errors properly
2021-12-13 02:32:09 -05:00
5f06d20085
Format Lua printed messages and warnings
2021-12-13 01:59:53 -05:00
d2f5544468
Add Userscripts menu into GUI
2021-12-13 01:03:26 -05:00
5d13339a52
Allow the retry button to call the Lua scripts properly
2021-12-12 20:48:10 -05:00
39bfb0862a
Allow user input to be modified from Lua
...
Also adds some handlers in the Lua code for when the game is not yet
started.
2021-12-12 20:44:03 -05:00
fbf3e7615b
Add API for generated tokens and output text
2021-12-12 19:27:20 -05:00
ceabd2ef7b
Add Lua API for editing logits during generation
...
TPU backend not supported yet.
2021-12-12 16:18:45 -05:00
e2c3ac041b
Complete the Lua generation halting API
2021-12-12 12:52:03 -05:00
d76dd35791
Add Lua API for reading model information
2021-12-12 12:09:59 -05:00
00eb125ad0
Allow Lua API to toggle dynamic scan
2021-12-12 01:55:46 -05:00
5692a7dfe2
Add Lua API for reading the text the user submitted to the AI
2021-12-12 01:52:42 -05:00
03453c4e27
Change script directory tree
...
Userscripts have been moved from /scripts/userscripts to /userscripts.
Core scripts have been moved from /scripts/corescripts to /cores.
2021-12-11 23:46:30 -05:00
36209bfe69
Add Lua API for story chunks
2021-12-11 23:44:07 -05:00
8e6a62259e
Fix the Lua tokenizer API
2021-12-11 21:24:34 -05:00
67974947b2
Fix numerous problems in the Lua world info API
2021-12-11 19:11:38 -05:00
3327f1b471
Fix Lua settings API
2021-12-11 17:01:41 -05:00
f8aa578f41
Enable generation modifiers for transformers backend only
2021-12-11 16:28:25 -05:00
e289a0d360
Connect bridge.lua to aiserver.py
...
Also enables the use of input modifiers and output modifiers, but not
generation modifiers.
2021-12-11 12:45:45 -05:00
35966b2007
Upload bridge.lua, default.lua and some Lua libs
...
base64
inspect
json.lua
Lua-hashings
Lua-nums
Moses
mt19937ar-lua
Penlight
Serpent
2021-12-10 19:45:57 -05:00
683bcb824f
Merge branch 'united' into world-info
2021-12-05 13:06:32 -05:00
6d8517e224
Fix some minor coding errors
2021-12-05 11:39:59 -05:00
150ce033c9
TPU backend no longer needs to recompile after changing softprompt
2021-12-05 02:49:15 -05:00
b99ac92a52
WI folders and WI drag-and-drop
2021-12-04 23:59:28 -05:00
44d8068bab
Ngrok Support
...
Not recommended for home users due to DDoS risks, but might make Colab tunnels more reliable.
2021-11-29 18:11:14 +01:00
9f51c42dd4
Allow bad words filter to ban <|endoftext|> token
...
The official transformers bad words filter doesn't allow this by
default; Finetune's version, however, does.
2021-11-27 11:42:06 -05:00
2bc93ba37a
Whitelist 6B in breakmodel
...
Now that we properly support it, allow the menu option to use breakmodel
2021-11-27 10:09:54 +01:00
b56ee07ffa
Fix for CPU mode
...
Recent optimizations caused the CPU version to load in an incompatible format; we now convert it back to the correct format after first loading it efficiently.
2021-11-27 05:34:29 +01:00
e5e2fb088a
Remember to actually import GPTJModel
2021-11-26 12:38:52 -05:00
871ed65570
Remove an unnecessary **maybe_low_cpu_mem_usage()
2021-11-26 11:42:04 -05:00