Commit Graph

453 Commits

Author SHA1 Message Date
Gnome Ann 4710ea3949 Restructure the `execute` API in bridge.lua 2021-12-11 02:42:40 -05:00
Gnome Ann 68c2cb3b98 Fix a few problems in bridge.lua
* Use `python.iter` instead of `pairs` to iterate through `_bridged` (see the sketch after this entry)
* Use `old_loadfile` instead of `safe_require_with_env` to load scripts
  in order to handle unusual file names
* Prevent modules imported by scripts from accessing bridge.lua's
  environment
* Fix behaviour of `KoboldWorldInfoEntry_mt._kobold_next(t, k)`
* New `next` implementation now has more safety checks
2021-12-11 01:21:18 -05:00
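
A minimal sketch of why `python.iter` is the right tool in the first bullet above, written against lupa (the Lua↔Python bridge that bridge.lua runs on); the `_bridged` dict and its keys are illustrative stand-ins, not the real bridge state:

```python
# Hedged sketch: iterate a bridged Python dict from Lua with lupa's `python.iter`
# instead of Lua's `pairs`. lupa hands Lua a userdata wrapper around the Python
# object, which `pairs` cannot walk, while `python.iter` yields its keys the way
# Python's own iter() would. `_bridged` here is illustrative only.
from lupa import LuaRuntime

lua = LuaRuntime(unpack_returned_tuples=True)
lua.globals()["_bridged"] = {"memory": "...", "authorsnote": "..."}  # stand-in dict

lua.execute("""
    -- pairs(_bridged) would not see the Python dict's contents; python.iter does.
    for key in python.iter(_bridged) do
        print(key)
    end
""")
```
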
Gnome Ann 35966b2007 Upload bridge.lua, default.lua and some Lua libs
base64
inspect
json.lua
Lua-hashings
Lua-nums
Moses
mt19937ar-lua
Penlight
Serpent
2021-12-10 19:45:57 -05:00
Gnome Ann cb384ce25b Merge branch 'united' into world-info 2021-12-10 16:09:43 -05:00
henk717 64ca337c5d Update colabkobold.sh 2021-12-10 21:34:41 +01:00
henk717 47f2544630 Location Fix
Fix the extraction location for zstd
2021-12-10 21:24:23 +01:00
henk717 bc1c0c4fa7 pip requirements
For Colab
2021-12-09 23:50:21 +01:00
henk717 d546cbd8c6 Update dependencies
Updates dependencies. play.sh didn't work properly, so it is removed for now; manually running aiserver.py is superior on Linux until I can get conda to init inside the script.
2021-12-09 23:49:35 +01:00
henk717 9054c71515 zstd path fix 2021-12-09 18:06:54 +01:00
Gnome Ann 8212fb701b Merge branch 'united' into world-info 2021-12-08 22:54:12 -05:00
henk717 d15b50f334 Conda based Play
For people who want to use Conda instead of Docker.
2021-12-08 18:44:13 +01:00
henk717 20e0b59fb9 Migrate to official transformers
No longer using VE's fork since it's obsolete, in line with what we already did for the CUDA variant.
2021-12-07 23:35:28 +01:00
Gnome Ann 683bcb824f Merge branch 'united' into world-info 2021-12-05 13:06:32 -05:00
henk717 c36bc376c0 Revert typo fix
Wasn't a typo (testing on git again since it's easier with the Colabs)
2021-12-05 18:36:54 +01:00
henk717 337941e356 Typo fix
Accidentally had a negative which should have been a positive; this should fix repo downloads.
2021-12-05 18:34:20 +01:00
henk717 a442a2a67e Merge pull request #41 from VE-FORBRYDERNE/jax21
TPU backend improvements
2021-12-05 18:10:52 +01:00
Gnome Ann 6d8517e224 Fix some minor coding errors 2021-12-05 11:39:59 -05:00
Gnome Ann 1393eac882 Add indentation to WI folder contents 2021-12-05 03:29:13 -05:00
Gnome Ann d46ef8550b Fix WI sortable handle events not being bound correctly
This fixes a problem where WI entries/folders can sometimes be dragged
into places they shouldn't be. Steps to reproduce:

1. Start a blank story
2. Refresh the browser
3. Open the W Info screen
4. Add a world info folder
5. Add a world info entry into that folder
6. Drag that world info entry
2021-12-05 03:04:45 -05:00
Gnome Ann 85aa180a90 Put safeguards on dragging and dropping into invalid positions 2021-12-05 02:50:42 -05:00
Gnome Ann 150ce033c9 TPU backend no longer needs to recompile after changing softprompt 2021-12-05 02:49:15 -05:00
Gnome Ann 3e0b1a9e63 Fix scrolling problems with WI entries with long names/comments 2021-12-05 01:39:42 -05:00
Gnome Ann 08992dec7e Use a green horizontal line as the drag-and-drop placeholder 2021-12-05 00:34:44 -05:00
Gnome Ann b99ac92a52 WI folders and WI drag-and-drop 2021-12-04 23:59:28 -05:00
Gnome Ann d2d338d314 Improve TPU backend compilation times with `numseqs > 1`
A Python `for` loop was replaced with a `jax.lax.scan` call so that JAX
only compiles the `transformer.generate_initial` function one time
instead of `numseqs` times. This is because JAX unrolls Python built-in
loops like `for`. The compilation times should now be about the same as
they were before the upgrade to JAX 0.2.21.
2021-11-30 19:22:40 -05:00
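
The compilation-time argument above can be pictured with a small, self-contained JAX sketch (an illustration only, not the actual tpu_mtj_backend code); `generate_one` and `numseqs` stand in for `transformer.generate_initial` and the requested number of sequences:

```python
# Hedged sketch: under jit, a Python `for` loop is unrolled, so its body is
# traced and compiled once per iteration, while jax.lax.scan traces the body
# a single time and loops at runtime.
import jax
import jax.numpy as jnp

numseqs = 4

def generate_one(carry, seed):
    out = jnp.sin(carry) + seed                  # stand-in per-sequence work
    return out, out                              # (new carry, per-step output)

@jax.jit
def unrolled(carry, seeds):
    outs = []
    for i in range(numseqs):                     # unrolled: traced numseqs times
        carry, out = generate_one(carry, seeds[i])
        outs.append(out)
    return carry, jnp.stack(outs)

@jax.jit
def scanned(carry, seeds):
    return jax.lax.scan(generate_one, carry, seeds)   # traced once

seeds = jnp.arange(numseqs, dtype=jnp.float32)
print(unrolled(jnp.zeros(()), seeds))
print(scanned(jnp.zeros(()), seeds))
```

Both functions return the same values; only the tracing and XLA compilation cost differs, which is where the `numseqs > 1` speedup comes from.
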
Gnome Ann c1e7c1643f Fix unbound axis error in tpu_mtj_backend.py when `numseqs > 1` 2021-11-30 14:06:46 -05:00
Gnome Ann 3c349e6aaf Modify TPU backend code to support JAX 0.2.21
The original one supported versions of JAX up to 0.2.12, and possibly also some
earlier versions. This new code supports exclusively JAX 0.2.21 and does not
work with any earlier or later versions of JAX. However, this new code benefits
from not needing to recompile when changing "Amount To Generate" and also from
supporting stopping generation early, which makes an implementation of Dynamic
World Info Scan finally possible.
2021-11-30 10:13:02 -05:00
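
The "stopping generation early" point can likewise be sketched (an assumption about the general JAX mechanism, not the actual MTJ code): a fixed-length loop baked into a jit-compiled function cannot cut itself short, but `jax.lax.while_loop` can, because its exit condition is evaluated at runtime.

```python
# Hedged sketch: a jit-compiled token loop that can exit early via lax.while_loop.
# The "stop at step 9" rule stands in for a real stopping criterion, e.g. a
# Dynamic World Info Scan trigger.
import jax
import jax.numpy as jnp

max_tokens = 32

def cond(state):
    step, stop = state
    return jnp.logical_and(step < max_tokens, jnp.logical_not(stop))

def body(state):
    step, stop = state
    stop = step >= 9                 # stand-in stopping criterion
    return step + 1, stop

@jax.jit
def generate():
    return jax.lax.while_loop(cond, body, (jnp.array(0), jnp.array(False)))

print(generate())                    # exits after 10 steps instead of 32
```
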
henk717 9e3318c696 Update colabkobold.sh
Bugfix
2021-11-29 18:42:40 +01:00
henk717 4244b588cb Ngrok requirements
Adds ngrok to the requirements.txt files
2021-11-29 18:13:30 +01:00
henk717 fd19e2bfd6 Allow Ngrok
Adds --ngrok to the Colab script
2021-11-29 18:12:45 +01:00
henk717 44d8068bab Ngrok Support
Not recommended for home users due to DDoS risks, but might make Colab tunnels more reliable.
2021-11-29 18:11:14 +01:00
henk717 eef675ce21 Further Streamlining Dependencies
Hopefully this will make Kaggle work
2021-11-29 16:43:45 +01:00
henk717 ff99e4c0e9 Merge branch 'KoboldAI:main' into united 2021-11-29 12:06:03 +01:00
henk717 d4a7ff5ccb Better Repetition Penalty Slider
Allow users more control since 6B is sensitive
2021-11-29 08:28:51 +01:00
henk717 f7993d5ef1 Merge pull request #84 from adcar/patch-1
Fixed a small typo
2021-11-28 17:58:40 +01:00
Alexander D. Cardosi 8d922d83a9 Fixed a small typo 2021-11-28 11:30:23 -05:00
henk717 6b9d744679 Torch needs to be newer
Uncapping the version to see the effect (testing on git since Colabs load from here)
2021-11-28 11:35:24 +01:00
henk717 b9c7e33410 Switch to official transformers
Official transformers is now superior, so we're switching over to allow Colabs to use it.
2021-11-28 05:13:05 +01:00
henk717 939719214d Merge pull request #40 from VE-FORBRYDERNE/patch
Allow bad words filter to ban <|endoftext|> token
2021-11-27 19:23:01 +01:00
Gnome Ann 9f51c42dd4 Allow bad words filter to ban <|endoftext|> token
The official transformers bad words filter doesn't allow this by
default. Finetune's version does allow this by default, however.
2021-11-27 11:42:06 -05:00
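
For reference, this is the knob involved, shown as a generic Hugging Face transformers sketch rather than KoboldAI's patched code (the prompt and model name are illustrative): `bad_words_ids` bans token sequences during generation, and by default the stock filter does not honour a ban on the `<|endoftext|>` token, which is what this commit changes.

```python
# Hedged sketch: ask generate()'s bad-words filter to ban <|endoftext|>.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")        # illustrative small model
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("The quick brown fox", return_tensors="pt")
eot_id = tokenizer.eos_token_id                          # <|endoftext|>, id 50256

output = model.generate(
    **inputs,
    max_new_tokens=20,
    do_sample=False,
    bad_words_ids=[[eot_id]],                            # request the ban
)
print(tokenizer.decode(output[0]))
```
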
henk717 2bc93ba37a Whitelist 6B in breakmodel
Now that we properly support it, allow the menu option to use breakmodel
2021-11-27 10:09:54 +01:00
henk717 b56ee07ffa Fix for CPU mode
Recent optimizations caused the CPU version to load in an incompatible format; we now convert it back to the correct format after loading it efficiently first.
2021-11-27 05:34:29 +01:00
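
A minimal sketch of the load-then-convert pattern described above, assuming the fix amounts to a dtype cast after the RAM-efficient load; the model name below is an illustrative stand-in, not what aiserver.py actually loads:

```python
# Hedged sketch: load the checkpoint memory-efficiently in half precision, then
# cast back to float32 so the CPU code path gets the format it expects.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-neo-125M",       # illustrative stand-in model
    low_cpu_mem_usage=True,          # the RAM-efficient loading path
    torch_dtype=torch.float16,
)
model = model.to(torch.float32)      # convert back for CPU mode
```
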
henk717 56c2e619f9 ColabKobold
A brand new launcher to power the Colabs. You can use https://henk.tech/ckds as a short URL that points to this GitHub repository.
2021-11-27 03:44:08 +01:00
henk717 3b976c9af7 Updated defaults
Transformers official by default, no more Git versions
2021-11-27 03:14:47 +01:00
henk717 6008d4f3a5 Merge pull request #39 from VE-FORBRYDERNE/breakmodel
Official transformers 6B breakmodel support and more RAM-efficient model loading
2021-11-27 01:11:48 +01:00
Gnome Ann e5e2fb088a Remember to actually import `GPTJModel` 2021-11-26 12:38:52 -05:00
Gnome Ann 871ed65570 Remove an unnecessary `**maybe_low_cpu_mem_usage()` 2021-11-26 11:42:04 -05:00
Gnome Ann a93a76eb01 Load model directly in fp16 if using GPU or breakmodel 2021-11-26 10:55:52 -05:00
Gnome Ann 95aff61781 Don't pin CPU layers after running out of pinned memory 2021-11-26 10:31:15 -05:00
Gnome Ann 32e1d4a7a8 Enable `low_cpu_mem_usage` 2021-11-25 18:09:25 -05:00