738 Commits

Author SHA1 Message Date
vfbd
4eff7bf3ba /api now redirects to /api/latest 2022-08-10 18:22:46 -04:00
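The redirect described in this commit can be sketched as a tiny WSGI handler. This is a hypothetical, dependency-free illustration of the behavior named in the message, not the project's actual code (which serves its API through a web framework):

```python
def app(environ, start_response):
    # Hypothetical sketch: requests to /api are forwarded to /api/latest,
    # mirroring the redirect behavior described in the commit message.
    path = environ.get("PATH_INFO", "")
    if path in ("/api", "/api/"):
        start_response("302 Found", [("Location", "/api/latest")])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ok"]
```

A redirect like this lets clients keep using the short `/api` path while the versioned path remains the canonical one.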
vfbd
d2c06182f2 Remove annotation from api_version 2022-08-10 18:05:04 -04:00
vfbd
2af57adff3 API v1.1.0 2022-08-10 14:48:01 -04:00
vfbd
becda8b842 Error 405 now sets Allow header 2022-08-09 22:32:24 -04:00
vfbd
5352c14c59 Fix typo in /config/soft_prompt documentation 2022-08-08 19:20:48 -04:00
vfbd
c04e3c5666 Fix /docs/ redirects 2022-08-08 18:21:46 -04:00
vfbd
55c4acad8f Disable probability viewer and output streaming when using API 2022-08-08 18:16:08 -04:00
vfbd
82ae749396 Merge branch 'united' into api 2022-08-08 18:14:50 -04:00
vfbd
aa01d1419d Add /story/end/delete and /story endpoints 2022-08-08 18:08:55 -04:00
vfbd
1f629ee254 Add more endpoints 2022-08-08 17:51:40 -04:00
vfbd
a93087aecd Fix api_format_docstring 2022-08-08 14:21:50 -04:00
vfbd
ddda981436 Improve /generate description 2022-08-08 14:19:43 -04:00
vfbd
dc0fa9bff1 Add redirects to /api/v1/docs/ 2022-08-08 14:16:38 -04:00
vfbd
ce064168e3 Additional validation for soft_prompt in API 2022-08-08 13:52:07 -04:00
vfbd
de1e8f266a ValidationErrorSchema now has minItems 1 for its arrays 2022-08-08 13:22:18 -04:00
vfbd
596f619999 Unknown values in API input are now ignored instead of causing error 2022-08-08 13:17:53 -04:00
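"Ignoring unknown values" in an API payload amounts to filtering the input down to the declared fields before validation instead of rejecting the request. A minimal sketch of that idea, with an entirely hypothetical field set (the project's real schemas and field names are not shown here):

```python
# Hypothetical set of recognized request fields.
ALLOWED_FIELDS = {"prompt", "max_length", "temperature"}

def sanitize(payload: dict) -> dict:
    # Keep only recognized keys rather than raising a validation
    # error when an unrecognized key is present.
    return {k: v for k, v in payload.items() if k in ALLOWED_FIELDS}
```

Schema libraries typically expose this as a policy switch (e.g. "exclude unknown fields") rather than a hand-written filter.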
vfbd
3b56859c12 vars.disable_input_formatting and vars.disable_output_formatting fix 2022-08-08 13:04:46 -04:00
vfbd
34c9535667 Upload basic API with /generate POST endpoint 2022-08-08 02:27:48 -04:00
Henk
77e2a7972c Fix incorrect max tokens 2022-08-06 17:28:55 +02:00
Henk
fe00581b83 Merge branch 'main' into united 2022-08-06 17:10:09 +02:00
Henk
c71fd0cc3f OPT Nerys V2 6B 2022-08-06 17:04:22 +02:00
Henk
610257b36e Output Streaming on by Default 2022-08-06 16:47:04 +02:00
henk717
8bcf4187ac Merge pull request #178 from one-some/token-prob: Add token probability visualizer 2022-08-05 14:27:46 +02:00
somebody
f6d046fe1b Add token probability visualizer 2022-08-04 13:49:37 -05:00
ebolam
71e119f0b7 Fix for secondary model loads leaking settings into secondary model's settings file. 2022-08-02 19:45:36 -04:00
henk717
050e195420 Merge pull request #173 from one-some/token-streaming: Add token streaming option 2022-07-30 18:32:51 +02:00
henk717
a63f7cfa5a Merge pull request #174 from ebolam/united: Fix for blank model info box when downloading model 2022-07-29 22:15:58 +02:00
ebolam
f97c10b794 Fix for blank model info box when downloading model 2022-07-28 19:40:27 -04:00
somebody
a4d81292f8 Add token streaming option 2022-07-27 22:13:08 -05:00
henk717
fe64e480ee Merge pull request #171 from ebolam/united: Add Download Model Status 2022-07-26 00:52:12 +02:00
henk717
7721b72184 Merge branch 'KoboldAI:main' into united 2022-07-26 00:42:35 +02:00
Henk
4d8a633351 Aetherroom instead of aidg.club 2022-07-26 00:41:51 +02:00
ebolam
12acb50ee0 Fix for getting "model download status" when downloading config to figure out layer counts 2022-07-25 18:29:14 -04:00
scott-ca
9dc9966433 Added functionality to add any/all args via json 2022-07-23 22:02:03 -06:00
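Supplying "any/all args via JSON", as in scott-ca's commit, generally means merging a JSON object into the parsed argument namespace so a config file can stand in for command-line flags. A hypothetical sketch of that pattern (the helper name and flags are illustrative, not the project's actual interface):

```python
import argparse
import json

def apply_json_args(parser: argparse.ArgumentParser, json_text: str) -> argparse.Namespace:
    # Parse with defaults first, then overwrite each attribute
    # found in the JSON document.
    args = parser.parse_args([])
    for key, value in json.loads(json_text).items():
        setattr(args, key, value)
    return args
```

One design caveat with this approach: keys in the JSON are not validated against the parser's declared options, so a typo in the file silently adds a new attribute instead of raising an error.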
ebolam
907cf74b13 Added status bar for downloading models 2022-07-22 13:58:20 -04:00
ebolam
2b53598307 Fixes for file editor (#170): Various fixes for the file editor by Ebolam 2022-07-20 00:50:03 +02:00
ebolam
f58064e72c Revert "Fix for aidg.club website being taken read-only" (reverts commit 23a031d852d6890e4fa6bde197a3aeb70ec7714b) 2022-07-19 16:54:32 -04:00
ebolam
23a031d852 Fix for aidg.club website being taken read-only 2022-07-19 13:40:55 -04:00
ebolam
68d143b80c Merge branch 'united' of https://github.com/ebolam/KoboldAI into united 2022-07-15 12:30:18 -04:00
ebolam
d91ed3141d Fix for non ascii files in edit mode 2022-07-15 12:30:02 -04:00
henk717
e8c39992a1 Merge pull request #166 from ebolam/united: Add file browser to soft prompts and user scripts 2022-07-04 19:52:05 +02:00
ebolam
328c0a38d7 Removed breadcrumbs on file browser before the jail directory 2022-07-03 16:02:55 -04:00
henk717
fd44f0ded3 Merge branch 'KoboldAI:main' into united 2022-07-03 15:12:12 +02:00
Henk
d041ec0921 Safer defaults and more flexibility
There have been many reports from newer users who experience AI breakdown because not all models properly handle 2048 max tokens. 1024 is the only value that all models support and was the original value KoboldAI used. This commit reverts the decision to increase it to 2048; existing configurations are not affected. Users who wish to increase the max tokens can do so themselves. Most models handle up to 1900 well (the GPT-2 models are excluded), and for many you can go all the way. (It is currently not yet known why some finetunes cause a decrease in max token support.)

In addition, this commit implements a request for more consistent slider behavior, allowing the sliders to be changed at 0.01 intervals instead of some sliders being capped to 0.05.
2022-07-03 15:07:54 +02:00
henk717
a99518d0a8 Merge branch 'KoboldAI:main' into united 2022-07-02 12:59:53 +02:00
Henk
e2f7fed99f Don't turn gamestarted off 2022-07-02 12:59:14 +02:00
vfbd
aeed9bd8f7 Fix base fairseq dense models when using accelerate with a GPU 2022-07-01 20:16:39 -04:00
ebolam
3f8a7ab4bb Allowing edit in userscripts 2022-06-30 19:41:11 -04:00
ebolam
813540fe9b Added folder browser for softprompts and userscripts 2022-06-30 19:13:05 -04:00
ebolam
97e0df45d7 File Dialog complete 2022-06-30 15:57:27 -04:00