Commit Graph

701 Commits

SHA1 Message Date
a388c63023 Use aria2 to download split checkpoints 2022-05-10 21:28:13 -04:00
9c83ef7fa9 Replaced Adventure 125M and added C1-1.3B 2022-04-28 22:35:04 +00:00
ea82867e4d Merge branch 'united' into settings 2022-04-26 13:58:01 -04:00
11280a6e66 LocalTunnel Linux Fix 2022-04-19 14:41:21 +02:00
b8e79afe5e LocalTunnel support 2022-04-19 13:47:44 +02:00
c7b03398f6 Merge 'nolialsea/patch-1' into settings without Colab changes 2022-04-17 12:15:36 -04:00
372eb4c981 Merge pull request #119 from VE-FORBRYDERNE/scripting-sp
Allow userscripts to change the soft prompt
2022-04-14 21:33:20 +02:00
78d6ee491d Merge pull request #117 from mrseeker/patch-7
Shinen FSD 13B (NSFW)
2022-04-14 21:33:08 +02:00
e180db88aa Merge pull request #118 from VE-FORBRYDERNE/lazy-loader
Fix lazy loader in aiserver.py
2022-04-14 21:33:00 +02:00
bd6f7798b9 Fix lazy loader in aiserver.py 2022-04-14 14:33:10 -04:00
ad94f6c01c Shinen FSD 13B (NSFW) 2022-04-14 08:23:50 +02:00
945c34e822 Shinen FSD 6.7B (NSFW) 2022-04-13 14:47:22 +02:00
eeff126df4 Memory Sizes 2022-04-13 12:41:21 +02:00
a3a52dc9c3 Add support for changing soft prompt from userscripts 2022-04-12 15:59:05 -04:00
26909e6cf3 Model Categories 2022-04-10 20:53:15 +02:00
6fcb0af488 Adding Janeway 13B 2022-04-10 15:03:39 +02:00
359a0a1c99 Copy Python 3.6 compatible lazy loader to aiserver.py 2022-04-08 19:40:12 -04:00
1974761f70 Releasing Janeway 6.7B 2022-04-08 08:13:36 +02:00
09fee52abd Add num_seqs support to GooseAI/OpenAI client handler. 2022-04-07 14:50:23 -04:00
5feda462fb OAI - Fixes last commit 2022-04-07 02:39:37 +02:00
34b6c907f0 OAI Max Token Slider 2022-04-07 02:26:15 +02:00
b568e31381 OAI Path Support 2022-04-06 05:15:25 +02:00
699b3fc10b OAI Redo Fixes 2 2022-04-06 04:54:27 +02:00
b5a633e69b OAI Redo Fix 2022-04-06 04:45:01 +02:00
ee682702ee Merge branch 'KoboldAI:main' into united 2022-04-05 01:35:22 +02:00
8153f21d5c Convo 6B 2022-04-05 01:33:51 +02:00
e644963564 OpenAI Fixes 2022-03-28 02:02:37 +02:00
20e48b11d7 Typical sampling 2022-03-27 16:25:50 -04:00
aa8de64aa4 fix default port 2022-03-25 23:26:27 +01:00
3e003d3b42 add port to the command options 2022-03-25 22:18:28 +01:00
0348970b19 Make sure AI is not busy when using retry to regenerate random story 2022-03-23 22:09:35 -04:00
4832dd6f37 Allow regenerating random story using Retry button
Commit b55e5a8e0b removed this feature, so
this commit adds it back.
2022-03-23 13:39:46 -04:00
cf99f02ca5 Merge branch 'main' into united 2022-03-20 19:22:53 +01:00
20eab085dd Fix AutoSave Toggle 2022-03-20 19:12:11 +01:00
5c795609e4 KML Fix 2022-03-20 13:10:56 +01:00
b1125a6705 Add EOS and padding token to default NeoX badwords 2022-03-19 01:30:02 -04:00
85a4959efa Merge branch 'united' into neox 2022-03-18 11:19:03 -04:00
a3e5e052b3 Newer umamba + slope tweak 2022-03-16 18:34:02 +01:00
95c4251db9 Print two newlines before loading HF models 2022-03-15 13:58:53 -04:00
9dc48b15f0 Add custom badwords and pad token ID for GPT-NeoX 2022-03-14 23:31:49 -04:00
88f247d535 GPT-NeoX-20B support in Colab TPU instances 2022-03-14 23:14:20 -04:00
4892556059 Model saving for colab mode 2022-03-13 11:22:44 +01:00
2b8c46338e Change current working directory to KoboldAI folder 2022-03-13 01:22:11 -05:00
8ae0a4a3e7 Online Services Working now (without a way to test as I don't have accounts) 2022-03-12 14:21:11 -05:00
b55e5a8e0b Retry Bug Fix 2022-03-12 10:32:27 -05:00
ae854bab3d Fix for retry causing issues for future redo actions 2022-03-11 11:40:55 -05:00
772ae2eb80 Added model info to show model load progress in UI 2022-03-11 11:31:41 -05:00
b02d5e8696 Allows missing model_config again 2022-03-10 19:59:10 +01:00
172a548fa1 Fallback to generic GPT2 Tokenizer 2022-03-10 19:52:15 +01:00
9dee9b5c6d Ignore incorrect problems 2022-03-09 12:03:37 +01:00