Commit Graph

53 Commits

SHA1 Message Date
6ffaf43548 Repetition penalty is now sampler #6 in the sampler order 2022-08-23 15:10:21 -04:00
8ba68e05ec Aria2 Status Bar Download Fix 2022-08-19 12:13:46 -04:00
513f59791a Debug 2022-08-19 12:11:53 -04:00
046f9d8ace Fix for Colab download status bar 2022-08-19 12:08:00 -04:00
ec90b76064 Add print in console for model downloading when using Aria2 2022-08-19 10:51:56 -04:00
6acccbf7a4 Save null seed to settings 2022-08-14 02:09:53 +02:00
699c2353e7 Merge branch 'KoboldAI:main' into united 2022-07-27 18:06:53 +02:00
9c8b825b4a aria2 hook now also catches EntryNotFoundError 2022-07-27 11:45:07 -04:00
907cf74b13 Added status bar for downloading models 2022-07-22 13:58:20 -04:00
46678931b2 Better sentence spacing 2022-06-26 20:27:21 +02:00
6ba7429eea Don't add sentence spacing if submission is empty
When you retry, an empty submission is actually sent, so if the
"Add sentence spacing" option is on, retrying could add an extra action
containing a single space.

(cherry picked from commit 151407a001)
2022-06-26 14:06:18 -04:00
151407a001 Don't add sentence spacing if submission is empty
When you retry, an empty submission is actually sent, so if the
"Add sentence spacing" option is on, retrying could add an extra action
containing a single space.
2022-06-26 13:02:22 -04:00
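
A minimal sketch of the guard the two commits above describe; the function and option names are illustrative assumptions, not the repository's actual code:

    def apply_input_formatting(submission, add_sentence_spacing):
        # Only prepend the separating space when there is real text to
        # submit; a Retry sends an empty submission, and without this
        # check it would become an action containing a single space.
        if add_sentence_spacing and submission != "":
            submission = " " + submission
        return submission
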
ff69e9fbfe Put layers_module_names, module_names and named_buffers in utils.py 2022-06-20 17:17:42 -04:00
f7ffdd7b6b Add more model querying utilities 2022-06-18 18:16:56 -04:00
5253cdcb36 Lazy loader no longer requires map file except when loading to TPU 2022-06-16 18:45:11 -04:00
979640bd2f Merge commit '2d3db7b4ba388f566aaec88a0e76678fe4fade8d' into overhaul-merge 2022-06-14 18:42:14 -04:00
2d3db7b4ba Implement support for sampler order in the backend code 2022-06-13 19:12:23 -04:00
622a3fc8db Fix for model loading by moving monkey-patching functions into a run-once function
Added folder navigation to custom model loading (needs prettifying)
2022-06-08 18:42:44 -04:00
6e82f205b4 Aria2 bug fix for Windows users 2022-05-14 11:44:28 -04:00
0c5ca5261e Loading a sharded model will now display only one progress bar 2022-05-13 23:32:16 -04:00
91d3672446 Proper progress bar for aria2 downloads 2022-05-13 17:00:10 -04:00
7ea0c49c1a Merge pull request #128 from VE-FORBRYDERNE/opt
OPT breakmodel and TPU support
2022-05-13 18:07:02 +02:00
8376f12e21 Add NS mode
OPT supports newlines, but it also needs some of the behavior we use in S mode. NS mode is a more limited version of S mode that still handles the </s> token, but instead of replacing it with a newline we replace it with an empty string, and newlines are not converted.

In the future, if your Fairseq-style model has newline support, use NS mode; if it needs artificially inserted newlines, use S mode. This also means that people fine-tuning Fairseq models to include newlines might benefit from testing their models in NS mode.
2022-05-13 10:44:12 +02:00
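
A minimal sketch of the S/NS distinction described above; the function name and mode values are assumptions for illustration:

    def postprocess_fairseq_output(text, mode):
        if mode == "s":
            # S mode: the model has no native newline support, so </s>
            # stands in for a line break and is converted to one.
            return text.replace("</s>", "\n")
        if mode == "ns":
            # NS mode: the model already emits real newlines, so </s> is
            # stripped and existing newlines are left untouched.
            return text.replace("</s>", "")
        return text
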
defbb53b68 OPT breakmodel 2022-05-13 01:03:38 -04:00
b1d8797a54 Allow TPU Colab to load sharded HF models 2022-05-12 23:51:40 -04:00
580dd0b2a3 Handle aria2 properly when it exits with nonzero exit code 2022-05-11 16:23:24 -04:00
2ebba9488b Change force_download back to False
This is to prevent fully downloaded models from being re-downloaded in
Colab.
2022-05-11 15:51:48 -04:00
6d481ca57e Merge branch 'united' into aria2 2022-05-11 15:51:11 -04:00
c65272052a aria2 now downloads to different filename and renames afterwards
This is to match the behaviour of the original transformers downloader
in order to deal with the rare case of someone downloading a model using
aria2, cancelling before it finishes, and then attempting to resume the
download with the normal transformers downloader.
2022-05-11 15:45:38 -04:00
6d27084e8a Better Aria2 Defaults
Trunc prevents slow file allocation on Windows, and force_download=True has proven to be a more reliable default. Since models are converted to local formats, it does not impact local users, and because -c is used, the check that the downloaded model is correct is both desirable and cheap.
2022-05-11 21:38:33 +02:00
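
A hedged sketch of an aria2c invocation combining the behaviour of the two commits above (download to a temporary name and rename afterwards, --file-allocation=trunc, -c so an existing complete file is only verified); the helper name and path handling are illustrative assumptions:

    import os
    import subprocess

    def aria2_download(url, dest_path, user_agent):
        # Download to a temporary filename first, matching the original
        # transformers downloader, then rename once aria2c succeeds.
        dest_dir = os.path.dirname(dest_path) or "."
        tmp_name = os.path.basename(dest_path) + ".aria2.tmp"
        subprocess.run(
            [
                "aria2c",
                "-c",                       # resume/verify rather than redownload a complete file
                "--file-allocation=trunc",  # avoids slow preallocation on Windows
                "--user-agent=" + user_agent,
                "-d", dest_dir,
                "-o", tmp_name,
                url,
            ],
            check=True,                     # surface a nonzero aria2 exit code as an error
        )
        os.replace(os.path.join(dest_dir, tmp_name), dest_path)
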
7a3f865e3f Prevent aria2 from resuming cancelled downloads
Resumed downloads tend to be very slow.

The original transformers downloader didn't allow resuming downloads
either.
2022-05-11 15:14:37 -04:00
c81f3bd084 Use --file-allocation=trunc instead of --file-allocation=none 2022-05-11 14:51:43 -04:00
f96c878d83 Use aria2 even when all model files are already in cache
This allows aria2 to continue downloading a pytorch_model.bin after a
cancelled download.
2022-05-11 14:43:56 -04:00
f60c7d8492 Fix the behaviour of aria2_hook() when using force_download 2022-05-11 14:41:34 -04:00
5732a8f15a Don't use aria2_hook() if force_download=True is used 2022-05-11 14:40:31 -04:00
22b4f3c9df Bug fixes for aria2_hook() when running Windows 2022-05-11 00:14:00 -04:00
82205722af Fix logic of aria2_hook() 2022-05-10 23:46:29 -04:00
4b693b4858 Fix the logic of force_download in utils.py 2022-05-10 22:47:03 -04:00
c1ef20bcff Also enable aria2 downloading for non-sharded checkpoints 2022-05-10 22:43:41 -04:00
e115bb68e4 aria2 downloads in utils.py now use correct user agent 2022-05-10 22:22:46 -04:00
a388c63023 Use aria2 to download split checkpoints 2022-05-10 21:28:13 -04:00
f682c1229a Fix fairseq newline handling issues 2022-02-12 13:23:59 -05:00
8742453f95 Add safeguards for token budget and text formatting
* Error messages are now shown when memory, author's note, etc. exceeds
  the token budget by itself
* Formatting options no longer break if there are empty chunks in the
  story (although there shouldn't be any in the first place)
* The number of generated tokens is now tracked on the Python side
2021-12-26 18:29:54 -05:00
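
A minimal sketch of the first safeguard above (an error when memory or the author's note alone exceeds the token budget); the tokenizer interface and names are assumptions:

    def budget_errors(tokenizer, budget, memory, authors_note):
        # Report any single component that cannot fit in the context
        # window on its own.
        errors = []
        for label, text in (("Memory", memory), ("Author's note", authors_note)):
            if len(tokenizer.encode(text)) > budget:
                errors.append(label + " alone exceeds the token budget of "
                              + str(budget) + " tokens.")
        return errors
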
7b73d7cfdd Single Line Mode
Adds Single Line mode, optimized for things like chatbot testing and other cases where you want to have control over what happens after a paragraph.

This can also be used as a foundation for a chatbot-optimized interface mode.
2021-10-23 17:30:48 +02:00
c276220a35 Allow deleting and renaming stories in the browser 2021-08-31 18:22:30 -04:00
2f4f7ac92a Fix error when you use "Add sentence spacing" 2021-08-28 18:54:10 -04:00
00414d26e2 Integrated VE_FORBRYDERNE's Adventure Mode + Cleanup
Adventure Mode allows you to play this like AID, perfect for Choose Your Own Adventure models
2021-08-19 13:18:01 +02:00
f9bbb174a6 Added OpenAI API support
Added in-browser Save/Load/New Story controls
(Force a full refresh in your browser!)
Fixed adding InferKit API key if client.settings already exists
Added cmd calls to bat files so they'll stay open on error
Wait animation now hidden on start state/restart
2021-05-22 05:28:40 -04:00
4996e0ff46 Bugfixes:
Improvements to pruning context from text returned from the AI
Colab errors should no longer throw JSON decode errors in client
Improved logic for World Info scanning
Fix for index error in addsentencespacing
2021-05-18 17:59:59 -04:00
0e038b8727 Bugfix for Add Sentence Spacing format option 2021-05-14 02:24:05 -04:00