Llama
4a01f345de
Add include_anote kwarg to lua_compute_context.
Add an optional keyword argument to lua_compute_context to control
whether the author's note should be included in the context. The
default value is true, so if the include_anote kwarg is not specified
then the author's note will be included, which was the default
behavior prior to this change.
Also update the Lua API documentation to describe this kwarg.
2022-10-12 23:18:19 -07:00
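For illustration, a minimal Python-side sketch of how such a kwarg could be honored; the simplified signature and body below are assumptions for illustration, not the actual lua_compute_context implementation in aiserver.py:

    def lua_compute_context(submission, kwargs=None):
        # Hypothetical, simplified stand-in for the real function.
        kwargs = kwargs or {}
        # Default to True so callers that omit include_anote keep the
        # previous behavior: the author's note is part of the context.
        include_anote = bool(kwargs.get("include_anote", True))
        parts = ["<story text>", submission]
        if include_anote:
            parts.insert(1, "[Author's note: <anote text>]")
        return "\n".join(parts)

    print(lua_compute_context("Continue."))                            # note included
    print(lua_compute_context("Continue.", {"include_anote": False}))  # note excluded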
Henk
d5143eeb80
LUA Error as Error
2022-10-12 01:23:00 +02:00
henk717
739cf0aae7
Merge pull request #227 from VE-FORBRYDERNE/pickle
Custom unpickler to avoid pickle's arbitrary code execution vulnerability
2022-10-07 02:12:53 +02:00
vfbd
323f593a96
Custom unpickler to avoid pickle's arbitrary code execution vulnerability
2022-10-06 20:08:08 -04:00
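The standard defense here, and the pattern documented for Python's pickle module, is an Unpickler subclass that overrides find_class so only an explicit allow-list of globals can be resolved; anything else raises instead of importing and executing attacker-chosen code. A minimal sketch (the allow-list is illustrative, not the one used in this repository):

    import io
    import pickle

    class RestrictedUnpickler(pickle.Unpickler):
        # Illustrative allow-list; a real checkpoint loader would permit the
        # globals that legitimate model files actually need.
        ALLOWED = {("collections", "OrderedDict")}

        def find_class(self, module, name):
            if (module, name) in self.ALLOWED:
                return super().find_class(module, name)
            raise pickle.UnpicklingError(f"global '{module}.{name}' is forbidden")

    def restricted_loads(data: bytes):
        return RestrictedUnpickler(io.BytesIO(data)).load()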
henk717
b85d74f22c
Merge branch 'KoboldAI:main' into united
2022-10-05 19:51:29 +02:00
henk717
9f18811ff9
Merge pull request #226 from VE-FORBRYDERNE/api-settings
Allow changing and reading sampler seed and sampler order from API
2022-10-04 20:30:25 +02:00
henk717
6af0e842f2
Switch to official
Switch to the official branch on KoboldAI now that it is compatible
2022-10-04 17:42:18 +02:00
henk717
cf3aebbd8f
Merge pull request #161 from henk717/united
Release 1.19
1.19.0
2022-10-04 15:57:47 +02:00
vfbd
bdfa6d86b7
Seed has to be a 64-bit unsigned int or PyTorch will throw an error
tpu_mtj_backend's seed can be an integer of arbitrary size, but we limit it to a 64-bit unsigned integer anyway for consistency.
2022-10-02 17:50:32 -04:00
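A one-line way to impose that constraint is to reduce the value modulo 2**64; the helper below is an illustration of the idea, not the exact code used in the backend:

    def clamp_seed(seed: int) -> int:
        # Map an arbitrarily large (or negative) integer into the
        # unsigned 64-bit range 0 .. 2**64 - 1.
        return seed % (2 ** 64)

    assert clamp_seed(2 ** 70 + 5) == 5
    assert clamp_seed(-1) == 2 ** 64 - 1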
vfbd
dd1c25241d
Allow sampler seed and full determinism to be read/written in /config
2022-10-02 17:43:54 -04:00
vfbd
1a59a4acea
Allow changing sampler seed and sampler order from API
2022-10-02 16:25:51 -04:00
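As a usage sketch, the new settings can be read and written over HTTP like any other /config value; the endpoint paths below are assumptions for illustration, so consult the server's API documentation for the exact names:

    import requests

    BASE = "http://localhost:5000/api/v1"  # assumes a locally running KoboldAI server

    # Hypothetical paths; check the API docs for the real ones.
    print(requests.get(f"{BASE}/config/sampler_order").json())
    requests.put(f"{BASE}/config/sampler_seed", json={"value": 1234})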
henk717
7bd3125f5a
Merge branch 'KoboldAI:main' into united
2022-10-01 16:59:46 +02:00
henk717
2f45b93119
GPU updates
2022-10-01 16:58:16 +02:00
henk717
e1606afc0d
GPU Descriptions
2022-10-01 15:43:45 +02:00
henk717
8313df8817
Localtunnel Default
2022-10-01 15:42:54 +02:00
henk717
9abad8bee9
Merge pull request #225 from scythe000/united
Update aiserver.py - typo fix
2022-09-30 19:30:46 +02:00
scythe000
a482ec16d8
Update aiserver.py - typo fix
Changed 'beakmodel' to 'breakmodel' in the example comment.
2022-09-30 10:29:32 -07:00
henk717
276c6f8e9e
Merge pull request #224 from db0/get_cluster_models_fix
fix endpoint for get_cluster_models
2022-09-30 00:30:18 +02:00
Divided by Zer0
90022d05c8
fix endpoint for get_cluster_models
2022-09-30 00:26:55 +02:00
henk717
3a094a049b
Merge pull request #223 from ebolam/united
Fix for GPT models downloading even when present in model folder
2022-09-28 19:15:33 +02:00
ebolam
e7973e13ac
Fix for GPT models downloading even when present in model folder
2022-09-28 12:47:50 -04:00
henk717
0f7ecb3257
Merge pull request #222 from ebolam/united
Fix for lazy loading on models after a non-lazy load model
2022-09-28 01:54:21 +02:00
ebolam
f0690373b3
Merge branch 'united' of https://github.com/ebolam/KoboldAI into united
2022-09-27 19:52:44 -04:00
ebolam
72fc68c6e4
Fix for lazy loading on models after a non-lazy load model
2022-09-27 19:52:35 -04:00
henk717
c935d8646a
Merge pull request #221 from ebolam/united
Fix for loading models that don't support breakmodel (GPU/CPU support in UI)
2022-09-28 01:32:08 +02:00
ebolam
4aa842eada
Merge commit 'refs/pull/180/head' of https://github.com/ebolam/KoboldAI into united
2022-09-27 19:29:05 -04:00
ebolam
be719a7e5e
Fix for loading models that don't support breakmodel (GPU/CPU support in UI)
2022-09-27 19:02:37 -04:00
Henk
52e120c706
Disable breakmodel if we error on the check
2022-09-28 01:00:06 +02:00
henk717
d2ff32be32
Merge pull request #220 from ebolam/united
Fix for loading models on CPU only that don't support breakmodel
2022-09-28 00:46:37 +02:00
Henk
057ddb4fb2
Better --cpu handling
2022-09-28 00:45:17 +02:00
ebolam
168ae8083c
Remove debug print
2022-09-27 18:30:20 -04:00
ebolam
0311cc215e
Fix for loading models on CPU only that don't support breakmodel
2022-09-27 18:29:32 -04:00
henk717
3906cc1bd0
Merge pull request #219 from ebolam/united
Fix for GPT2 breakmodel in the UI
2022-09-28 00:17:27 +02:00
ebolam
edd50fc809
Fix for GPT2 breakmodel in the UI
2022-09-27 17:58:51 -04:00
Henk
f1d63f61f3
Syntax fix
2022-09-27 22:43:09 +02:00
henk717
d772837ad0
Merge pull request #217 from ebolam/united
Fix for older model loading
2022-09-27 22:41:13 +02:00
ebolam
908dc8ea60
Fix for older model loading
2022-09-27 15:59:56 -04:00
Henk
62921c4896
getmodelname for configname
2022-09-27 21:11:31 +02:00
Henk
6c32bc18d7
GPT2Tokenizer for TPU
2022-09-27 18:33:31 +02:00
Henk
60d09899ea
Don't use Fast tokenizers when we don't have to
2022-09-27 18:26:13 +02:00
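For reference, Hugging Face transformers exposes this choice through the use_fast flag on AutoTokenizer; a minimal example of explicitly requesting the slow tokenizer:

    from transformers import AutoTokenizer

    # use_fast=False requests the slow, pure-Python tokenizer instead of the
    # Rust-backed "fast" implementation.
    tokenizer = AutoTokenizer.from_pretrained("gpt2", use_fast=False)
    print(tokenizer.encode("Hello world"))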
Henk
11455697ef
Tokenizer Fixes (Slow first to keep coherency)
2022-09-27 17:57:18 +02:00
Henk
07896867b2
Revert Tokenizer Change
2022-09-27 15:36:08 +02:00
Henk
82a250aa1b
Revert "Fix tokenizer selection code"
This reverts commit 7fba1fd28af0c50e7cea38ea0ee12ab48a3bebf7.
1.18.2
2022-09-27 15:33:08 +02:00
henk717
7f5ba8a678
Merge pull request #216 from VE-FORBRYDERNE/merge
Merge main into united
2022-09-26 22:11:44 +02:00
vfbd
79ae0f17ec
Merge branch 'main' into merge
2022-09-26 16:10:10 -04:00
henk717
685ec3237b
Merge pull request #158 from VE-FORBRYDERNE/tokenizer
Fix tokenizer selection code
2022-09-26 21:34:26 +02:00
henk717
39bd02a40e
Merge pull request #157 from VE-FORBRYDERNE/patch
Fix `|` character sometimes appearing in editor
2022-09-26 21:34:16 +02:00
vfbd
7fba1fd28a
Fix tokenizer selection code
2022-09-26 14:37:25 -04:00
vfbd
ddc9be00d6
Attempt to fix issue where | appears in editor after pressing enter
2022-09-26 13:57:44 -04:00
Henk
ce692c7ebf
Pinned info toggle fix
2022-09-26 16:36:50 +02:00