fef946c173
Possible Colab aria2 status fix
2022-09-21 19:48:17 +02:00
06f4d9addf
No Aria2 spam
2022-09-21 19:36:57 +02:00
8915ee7eb3
Fix for aria2 download status to UI
2022-09-21 13:09:27 -04:00
cca3ce3493
Aria2 Fixes
2022-09-21 18:57:09 +02:00
f62c740f7e
Revert "Aria2 Fixes"
...
This reverts commit 8d1c734df8.
2022-09-21 18:47:13 +02:00
8d1c734df8
Aria2 Fixes
2022-09-21 18:21:48 +02:00
943614b5e6
Merge branch 'main' into dependency-fix
2022-09-15 17:33:48 -04:00
463bf86bcc
aria2_hook now uses new cache format if you have transformers 4.22
2022-09-15 16:50:43 -04:00
551565c5ac
Fix error in aria2_hook when transformers version is at least 4.22.0
...
Some of the transformers.file_utils functions that were removed in
transformers v4.22.0 have equivalents in the huggingface_hub module.
2022-09-15 13:37:50 -04:00
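As a rough illustration of the kind of fallback this commit describes (the specific helper names below are assumptions, not necessarily the ones aria2_hook uses):

    # Sketch: prefer the pre-4.22 transformers.file_utils helper when it exists,
    # otherwise fall back to its huggingface_hub equivalent.
    try:
        from transformers.file_utils import hf_bucket_url as resolve_url  # removed in 4.22.0
    except ImportError:
        from huggingface_hub import hf_hub_url as resolve_url  # equivalent helper

    # Both helpers take a repo/model id and a filename and return a download URL.
    url = resolve_url("EleutherAI/gpt-neo-2.7B", "pytorch_model.bin")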
6faa27ef87
Merge pull request #187 from VE-FORBRYDERNE/offline
...
Fix the model selection GUI when there is no internet connection
2022-08-24 20:13:02 +02:00
6ffaf43548
Repetition penalty is now sampler #6 in the sampler order
2022-08-23 15:10:21 -04:00
55f45c4912
Fix the model selection GUI when there is no internet connection
2022-08-22 14:45:02 -04:00
8ba68e05ec
Aria2 Status Bar Download Fix
2022-08-19 12:13:46 -04:00
513f59791a
Debug
2022-08-19 12:11:53 -04:00
046f9d8ace
Fix for Colab download status bar
2022-08-19 12:08:00 -04:00
ec90b76064
Add print in console for model downloading when using Aria2
2022-08-19 10:51:56 -04:00
6acccbf7a4
Save null seed to settings
2022-08-14 02:09:53 +02:00
699c2353e7
Merge branch 'KoboldAI:main' into united
2022-07-27 18:06:53 +02:00
9c8b825b4a
aria2 hook now also catches EntryNotFoundError
2022-07-27 11:45:07 -04:00
907cf74b13
Added status bar for downloading models
2022-07-22 13:58:20 -04:00
46678931b2
Better sentence spacing
2022-06-26 20:27:21 +02:00
6ba7429eea
Don't add sentence spacing if submission is empty
...
When you retry, it actually sends an empty submission, so with the
"add sentence spacing" option on, retrying could add an extra action
containing a single space.
(cherry picked from commit 151407a001)
2022-06-26 14:06:18 -04:00
151407a001
Don't add sentence spacing if submission is empty
...
When you retry, it actually sends an empty submission, so with the
"add sentence spacing" option on, retrying could add an extra action
containing a single space.
2022-06-26 13:02:22 -04:00
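A tiny sketch of the guard these two commits describe; the function and option names are hypothetical, only the idea (skip spacing for an empty submission) comes from the message above:

    # Hypothetical helper: only add the separating space when there is real content,
    # so the empty submission sent by "retry" never becomes a single-space action.
    def apply_sentence_spacing(story_tail: str, submission: str, add_spacing: bool) -> str:
        if not submission:
            return submission  # retry sends "" -- leave it untouched
        if add_spacing and story_tail and not story_tail.endswith((" ", "\n")):
            return " " + submission
        return submission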
ff69e9fbfe
Put layers_module_names, module_names and named_buffers in utils.py
2022-06-20 17:17:42 -04:00
f7ffdd7b6b
Add more model querying utilities
2022-06-18 18:16:56 -04:00
5253cdcb36
Lazy loader no longer requires map file except when loading to TPU
2022-06-16 18:45:11 -04:00
979640bd2f
Merge commit '2d3db7b4ba388f566aaec88a0e76678fe4fade8d' into overhaul-merge
2022-06-14 18:42:14 -04:00
2d3db7b4ba
Implement support for sampler order in the backend code
2022-06-13 19:12:23 -04:00
622a3fc8db
Fix for model loading by moving monkey patching functions into a run-once function
...
Added folder navigation to custom model loading (needs prettifying)
2022-06-08 18:42:44 -04:00
6e82f205b4
Aria2 bug fix for Windows users
2022-05-14 11:44:28 -04:00
0c5ca5261e
Loading a sharded model will now display only one progress bar
2022-05-13 23:32:16 -04:00
91d3672446
Proper progress bar for aria2 downloads
2022-05-13 17:00:10 -04:00
7ea0c49c1a
Merge pull request #128 from VE-FORBRYDERNE/opt
...
OPT breakmodel and TPU support
2022-05-13 18:07:02 +02:00
8376f12e21
Add NS mode
...
OPT supports newlines, but it also needs some of the behavior we use in S mode. NS mode is a more limited version of S mode that still handles the </s> token, but instead of replacing it with a newline we replace it with nothing, and newlines are not converted.
In the future, use NS mode if your Fairseq-style model has newline support, and S mode if it needs artificially inserted newlines. This also means that people finetuning Fairseq models to include newlines might benefit from testing their models in NS mode.
2022-05-13 10:44:12 +02:00
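A hedged sketch of the S/NS difference described above; the helper name is made up, only the replacement behavior follows the message:

    # Illustrative only: how the </s> token could be handled per mode.
    def handle_eos(text: str, mode: str) -> str:
        if mode == "s":
            return text.replace("</s>", "\n")  # S mode: newlines come from </s>
        if mode == "ns":
            return text.replace("</s>", "")    # NS mode: drop </s>, keep real newlines
        return text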
defbb53b68
OPT breakmodel
2022-05-13 01:03:38 -04:00
b1d8797a54
Allow TPU Colab to load sharded HF models
2022-05-12 23:51:40 -04:00
580dd0b2a3
Handle aria2 properly when it exits with nonzero exit code
2022-05-11 16:23:24 -04:00
2ebba9488b
Change force_download back to False
...
This is to prevent fully downloaded models from being re-downloaded in
Colab.
2022-05-11 15:51:48 -04:00
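For context, force_download is an ordinary from_pretrained argument; with it back at False a fully cached model is reused instead of re-fetched (the model name below is just an example):

    from transformers import AutoModelForCausalLM

    # force_download=False (the library default) reuses an already-downloaded model
    # from the cache rather than re-downloading it on every Colab session.
    model = AutoModelForCausalLM.from_pretrained(
        "EleutherAI/gpt-neo-2.7B", force_download=False
    )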
6d481ca57e
Merge branch 'united' into aria2
2022-05-11 15:51:11 -04:00
c65272052a
aria2 now downloads to different filename and renames afterwards
...
This is to match the behaviour of the original transformers downloader
in order to deal with the rare case of someone downloading a model using
aria2, cancelling before it finishes, and then attempting to resume the
download with the normal transformers downloader.
2022-05-11 15:45:38 -04:00
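A minimal sketch of the download-then-rename pattern this commit describes; the temporary suffix and the exact aria2c invocation are assumptions:

    import os
    import subprocess

    def fetch_with_aria2(url: str, filename: str) -> None:
        tmp_name = filename + ".part"  # hypothetical temporary name
        # Download under the temporary name so a cancelled run never leaves a
        # half-finished file where the transformers downloader expects a real one.
        subprocess.run(["aria2c", "-o", tmp_name, url], check=True)
        os.replace(tmp_name, filename)  # move into place only after aria2 succeeds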
6d27084e8a
Better Aria2 Defaults
...
Trunc prevents slow file allocation on Windows, and force_download=True has proven to be a more reliable default. Since models are converted to local formats, it does not impact local users. And because -c is used, checking that the model is correct adds minimal overhead and is desirable.
2022-05-11 21:38:33 +02:00
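The defaults named here map roughly onto the following aria2c options; this is a hedged reconstruction, not the exact command the hook builds:

    # -c resumes/verifies a partially downloaded file instead of starting over;
    # --file-allocation=trunc avoids the slow preallocation seen on Windows.
    aria2_args = ["aria2c", "-c", "--file-allocation=trunc"]
    # force_download=True is the transformers-side default mentioned in the message,
    # separate from the aria2c flags themselves.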
7a3f865e3f
Prevent aria2 from resuming cancelled downloads
...
Resumed downloads tend to be very slow.
The original transformers downloader didn't allow resuming downloads
either.
2022-05-11 15:14:37 -04:00
c81f3bd084
Use --file-allocation=trunc instead of --file-allocation=none
2022-05-11 14:51:43 -04:00
f96c878d83
Use aria2 even when all model files are already in cache
...
This allows aria2 to continue downloading a pytorch_model.bin after a
cancelled download.
2022-05-11 14:43:56 -04:00
f60c7d8492
Fix the behaviour of aria2_hook() when using force_download
2022-05-11 14:41:34 -04:00
5732a8f15a
Don't use aria2_hook() if force_download=True is used
2022-05-11 14:40:31 -04:00
22b4f3c9df
Bug fixes for aria2_hook() when running Windows
2022-05-11 00:14:00 -04:00
82205722af
Fix logic of aria2_hook()
2022-05-10 23:46:29 -04:00
4b693b4858
Fix the logic of force_download in utils.py
2022-05-10 22:47:03 -04:00
c1ef20bcff
Also enable aria2 downloading for non-sharded checkpoints
2022-05-10 22:43:41 -04:00