Commit Graph

3709 Commits

Author SHA1 Message Date
somebody
3a128e76b4 Attempts at dynamic wi fixes 2023-03-07 16:33:23 -06:00
somebody
5d9bd96ad8 Model: Fix API support 2023-03-04 21:15:12 -06:00
somebody
beef23f5a1 Model: Add debug code for detecting faulty samplers 2023-03-04 19:02:20 -06:00
somebody
c7822464c7 oops 2023-03-04 19:02:20 -06:00
somebody
b02513df07 Model: Add singleline_stopper and fix stopper code
singleline_stopper adapted from MasterAibo in 0ba7ac9
2023-03-04 19:02:20 -06:00
Henk
6857d2f7e1 Remove old worker API 2023-03-04 19:02:19 -06:00
Henk
9ee6adb35c Enable BMS API again under a new name 2023-03-04 19:02:19 -06:00
LightSaveUs
1b911eab8f RTE (6B) 2023-03-04 19:02:15 -06:00
LightSaveUs
9a85d6c4e7 RTE (13B) 2023-03-04 19:02:15 -06:00
LightSaveUs
a0d12bb29c RTE (Custom) 2023-03-04 19:02:15 -06:00
Henk
55349d84e1 Remote no longer unblocks the port by default 2023-03-04 19:02:15 -06:00
somebody
70cddc46e2 Model: Small cleanup 2023-03-04 19:02:10 -06:00
somebody
27b7635c95 Model: Fix TPU 2023-03-04 19:02:00 -06:00
somebody
f2974d205e RWKV: Remove old gnarly (in a bad way) RWKV4 support
Better interface for it coming soon
2023-03-01 19:26:55 -06:00
somebody
54cecd4d5d Model: And another refactor 2023-03-01 19:16:35 -06:00
somebody
225dcf1a0a Model: Documentation part 1 2023-02-28 20:10:09 -06:00
somebody
ef1155291f Model: TPU Fixes 2023-02-28 18:05:34 -06:00
somebody
bd3bbdaad8 Model: More tpu fixes 2023-02-27 19:29:03 -06:00
somebody
45cd6b5de2 Model: Don't do breakmodel on colab 2023-02-27 19:24:41 -06:00
somebody
ed83362dee Model: TPU should be ready for testing 2023-02-27 19:08:44 -06:00
somebody
1839de1483 colab debug 2023-02-27 18:40:26 -06:00
somebody
2e3ca6f769 Model: Bugfixes/fix tokenizer hack 2023-02-27 18:30:13 -06:00
somebody
af73527be0 Samplers: Part 2 2023-02-26 17:22:54 -06:00
somebody
f882979c88 Model: Samplers pt. 1 2023-02-26 16:09:22 -06:00
somebody
f771ae38cf Model: Formatting fixes 2023-02-26 14:14:29 -06:00
somebody
10842e964b Get rid of yet another instance of this identical class 2023-02-26 14:02:15 -06:00
somebody
e8236bffbf Model: Fixup horde bug 2023-02-26 13:36:06 -06:00
somebody
15f957260f Model: Add rep pen for OAI API 2023-02-26 13:35:54 -06:00
somebody
35bbd78326 Model: Monkey release detection fix 2023-02-26 13:35:29 -06:00
somebody
b99c16f562 Merge branch 'united' of https://github.com/henk717/KoboldAI into model-structure-and-maybe-rwkv 2023-02-26 13:06:30 -06:00
somebody
56bcd32f6d Model: Fix ReadOnly
whoops
2023-02-26 12:40:20 -06:00
somebody
a73804ca1e Accelerate: Remove HAS_ACCELERATE
Accelerate has been a dependency for a while, and as such we probably
shouldn't be lugging around code that assumes it isn't present.
2023-02-26 12:18:06 -06:00
somebody
5e3b0062ee Model: Tokenizer fix 2023-02-26 12:17:49 -06:00
somebody
8d49b5cce1 Model: More TPU stuff 2023-02-26 12:09:26 -06:00
Henk
cdfc7326e6 Horde API Setting Consolidation 2023-02-26 17:59:36 +01:00
henk717
642d173579 Merge pull request #294 from db0/kaimerge
Changes to work with the merged hordes
2023-02-26 16:00:48 +01:00
somebody
d53d2bcc45 Model: Fix crash on full GPU load 2023-02-25 18:19:46 -06:00
somebody
465e22fa5c Model: Fix bugs and introduce hack for visualization
Hopefully I remove that atrocity before the PR
2023-02-25 18:12:49 -06:00
somebody
ffe4f25349 Model: Work on stoppers and stuff 2023-02-25 17:12:16 -06:00
somebody
6b4905de30 Model: Port rest of models over
Generation's still broke but it's a start
2023-02-25 16:05:56 -06:00
Henk
165c7219eb Frequency Penalty for OAI 2023-02-25 22:15:05 +01:00
Henk
526e8ab9b1 More Horde setting fixing 2023-02-25 16:20:49 +01:00
Henk
29ceac6d43 Horde parameter fixes 2023-02-25 16:01:11 +01:00
henk717
1e6e1aaa18 Merge branch 'KoboldAI:main' into united 2023-02-25 14:59:42 +01:00
Henk
b49070e3ed Updated bridge for the new horde 2023-02-25 14:58:18 +01:00
somebody
f8c4158ebc Model: Successful load implementation
The goal of this series of commits is to have an implementation-agnostic
interface for models, thus being less reliant on HF Transformers for model
support. A model object will have a method for generation, a list of callbacks
to be run on every token generation, a list of samplers that will modify
probabilities, etc. Basically anything HF can do should be easily
implementable with the new interface :^)

So far I've tested loading pre-downloaded models with breakmodel split
between GPUs and that works, though essentially no testing has been done
in the larger scheme of things. That is about the only supported
configuration right now, and generation isn't very functional.
2023-02-24 21:41:44 -06:00
Divided by Zer0
d459bdb1c0 adjust 2023-02-22 23:12:34 +01:00
henk717
0e7b2c1ba1 Merge pull request #293 from jojorne/jojorne-patch-save-load-story-with-UTF-8-encoding
Save/Load Story with UTF-8 encoding.
2023-02-22 19:59:33 +01:00
jojorne
d3bedfcbda Include koboldai_vars.save_story(). 2023-02-22 15:42:56 -03:00
jojorne
d6c9f5f1f5 Save/Load Story with UTF-8 encoding. 2023-02-22 14:40:42 -03:00
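
Commit f8c4158ebc above describes the goal of the model refactor: an implementation-agnostic model object with a generation method, per-token callbacks, stoppers, and samplers that adjust token probabilities, so backends other than HF Transformers can be plugged in. Below is a minimal illustrative sketch of how such an interface could be shaped; all class and method names (InferenceModel, Sampler, Stopper, next_token_logits, and so on) are hypothetical and are not KoboldAI's actual API.

```python
# Illustrative sketch only: hypothetical names and signatures, not KoboldAI's actual classes.
import math
import random
from abc import ABC, abstractmethod
from typing import Callable, List


class Sampler(ABC):
    """Adjusts next-token logits/probabilities before a token is chosen."""

    @abstractmethod
    def apply(self, logits: List[float]) -> List[float]:
        ...


class Temperature(Sampler):
    """Simple example sampler: scale logits by a temperature."""

    def __init__(self, temperature: float):
        self.temperature = temperature

    def apply(self, logits: List[float]) -> List[float]:
        return [logit / self.temperature for logit in logits]


class Stopper(ABC):
    """Inspects the tokens generated so far and decides whether to halt."""

    @abstractmethod
    def should_stop(self, token_ids: List[int]) -> bool:
        ...


class SingleLineStopper(Stopper):
    """Example stopper: halt as soon as a newline token is produced (hypothetical token id)."""

    def __init__(self, newline_token_id: int):
        self.newline_token_id = newline_token_id

    def should_stop(self, token_ids: List[int]) -> bool:
        return bool(token_ids) and token_ids[-1] == self.newline_token_id


class InferenceModel(ABC):
    """Backend-agnostic model; HF Transformers, a TPU runtime, or a remote API could implement it."""

    def __init__(self):
        self.samplers: List[Sampler] = []
        self.stoppers: List[Stopper] = []
        # Callbacks invoked after every generated token (e.g. for streaming/visualization).
        self.token_callbacks: List[Callable[[List[int]], None]] = []

    @abstractmethod
    def next_token_logits(self, context_ids: List[int]) -> List[float]:
        """Backend-specific: return raw logits over the vocabulary for the next token."""

    def generate(self, context_ids: List[int], max_new_tokens: int) -> List[int]:
        generated: List[int] = []
        for _ in range(max_new_tokens):
            logits = self.next_token_logits(context_ids + generated)
            for sampler in self.samplers:
                logits = sampler.apply(logits)
            # Softmax over the adjusted logits, then sample a token id.
            peak = max(logits)
            weights = [math.exp(logit - peak) for logit in logits]
            token = random.choices(range(len(weights)), weights=weights, k=1)[0]
            generated.append(token)
            for callback in self.token_callbacks:
                callback(generated)
            if any(stopper.should_stop(generated) for stopper in self.stoppers):
                break
        return generated
```

In a design along these lines, a concrete backend would subclass InferenceModel and implement only next_token_logits, while samplers, stoppers, and per-token callbacks remain backend-independent, which is the separation the commit message describes.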