Gnome Ann
d5ab3ef5b1
Fix `no attribute get_checkpoint_shard_files`
2022-05-14 11:49:04 -04:00
Gnome Ann
6e82f205b4
Aria2 bug fix for Windows users
2022-05-14 11:44:28 -04:00
henk717
9eaa76c72b
Add OPT 13B to the models
2022-05-14 07:55:47 +02:00
Gnome Ann
1476e76cfc
Copy fp16 model files instead of resaving them
2022-05-14 00:45:43 -04:00
Gnome Ann
0c5ca5261e
Loading a sharded model will now display only one progress bar
2022-05-13 23:32:16 -04:00
Gnome Ann
f9f1a5f3a9
Make sure tqdm progress bars display properly in Colab
2022-05-13 17:37:45 -04:00
Gnome Ann
91d3672446
Proper progress bar for aria2 downloads
2022-05-13 17:00:10 -04:00
henk717
7ea0c49c1a
Merge pull request #128 from VE-FORBRYDERNE/opt
OPT breakmodel and TPU support
2022-05-13 18:07:02 +02:00
Gnome Ann
a051bf4397
OPT breakmodel bug fix
2022-05-13 10:45:57 -04:00
Gnome Ann
1200173386
Custom badwords for OPT
Generated using:
```python
import transformers

# Use the slow tokenizer and collect, as single-token lists, the ids of
# every vocabulary entry containing an angle or square bracket.
tokenizer = transformers.AutoTokenizer.from_pretrained("facebook/opt-350m", use_fast=False)
badwordsids_opt = [[v] for k, v in tokenizer.vocab.items() if any(c in k for c in "<>[]")]
```
2022-05-13 10:45:28 -04:00
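The same filter can be exercised without downloading the tokenizer; a minimal sketch on a toy vocabulary mapping (the `toy_vocab` contents are made up for illustration):

```python
# Toy stand-in for tokenizer.vocab: maps token string -> token id.
toy_vocab = {"hello": 0, "<s>": 1, "</s>": 2, "[unk]": 3, "world": 4}

# Collect the ids of every token containing a bracket character, each
# wrapped in a single-element list as expected by `bad_words_ids`.
badwordsids = [[v] for k, v in toy_vocab.items() if any(c in k for c in "<>[]")]
```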
Henk
d5fa782483
NS Mode (comment fix)
2022-05-13 10:53:19 +02:00
Henk
8376f12e21
Add NS mode
OPT supports newlines, but it also needs some of the behavior we use in S mode. NS mode is a more limited version of S mode that still handles the </s> token, but instead of replacing it with a newline we replace it with nothing, and newlines are not converted.
In the future, if your Fairseq-style model has newline support, use NS mode; if it needs artificially inserted newlines, use S mode. This also means that people finetuning Fairseq models to include newlines might benefit from testing their models in NS mode.
2022-05-13 10:44:12 +02:00
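The difference between the two modes can be sketched as follows (the function name and structure are illustrative assumptions, not the project's actual implementation):

```python
def postprocess(text: str, mode: str) -> str:
    """Sketch of S-mode vs. NS-mode handling of the </s> token."""
    if mode == "s":
        # S mode: replace </s> with a newline, for models that need
        # newlines artificially inserted.
        return text.replace("</s>", "\n")
    if mode == "ns":
        # NS mode: strip </s> entirely; newlines are left untouched
        # because the model emits them natively.
        return text.replace("</s>", "")
    return text
```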
Gnome Ann
55079f672a
Fix typo in soft prompt patching code
2022-05-13 01:51:55 -04:00
Gnome Ann
29bb3f569b
Fix a bug in OPTForCausalLM where self.lm_head is the wrong size
2022-05-13 01:37:17 -04:00
Gnome Ann
defbb53b68
OPT breakmodel
2022-05-13 01:03:38 -04:00
Gnome Ann
b1d8797a54
Allow TPU Colab to load sharded HF models
2022-05-12 23:51:40 -04:00
Gnome Ann
4fa5f1cd6a
Add TPU support for OPT-350M
The 350M model seems to have a different structure than the other OPT models, for reasons that are not yet clear.
2022-05-12 22:21:15 -04:00
Gnome Ann
dfa2aa7314
Merge branch 'united' into opt
2022-05-12 20:11:53 -04:00
Henk
5c4a087970
Disable S mode for OPT
2022-05-13 01:47:59 +02:00
Gnome Ann
f5e689a725
Upload maps/opt.json and update requirements
2022-05-12 19:09:31 -04:00
Henk
e98cc3cb16
OPT models
2022-05-12 23:55:21 +02:00
Henk
376e76f5da
S mode for OPT
2022-05-12 02:18:14 +02:00
henk717
a1c7017ddc
Merge pull request #127 from VE-FORBRYDERNE/aria2
Handle aria2 properly when it exits with nonzero exit code
2022-05-11 22:57:45 +02:00
Gnome Ann
580dd0b2a3
Handle aria2 properly when it exits with nonzero exit code
2022-05-11 16:23:24 -04:00
henk717
05549de42d
Merge pull request #126 from VE-FORBRYDERNE/aria2
Aria2 downloader bug fixes
2022-05-11 21:58:31 +02:00
Gnome Ann
2ebba9488b
Change `force_download` back to False
This is to prevent fully downloaded models from being re-downloaded in
Colab.
2022-05-11 15:51:48 -04:00
Gnome Ann
6d481ca57e
Merge branch 'united' into aria2
2022-05-11 15:51:11 -04:00
Gnome Ann
c65272052a
aria2 now downloads to different filename and renames afterwards
This is to match the behaviour of the original transformers downloader
in order to deal with the rare case of someone downloading a model using
aria2, cancelling before it finishes, and then attempting to resume the
download with the normal transformers downloader.
2022-05-11 15:45:38 -04:00
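The download-to-a-temporary-name-then-rename pattern described above can be sketched as follows (helper names and the `.part` suffix are assumptions; the real code drives aria2):

```python
import os

def fetch(url: str, path: str) -> None:
    # Stand-in for the actual aria2 download; just writes placeholder bytes.
    with open(path, "wb") as f:
        f.write(b"model bytes for " + url.encode())

def download_then_rename(url: str, dest: str) -> None:
    tmp = dest + ".part"   # distinct name while the download is incomplete
    fetch(url, tmp)
    os.replace(tmp, dest)  # rename only once the download has finished
```

Because the final filename never exists in a half-written state, another downloader that later looks for `dest` either finds a complete file or nothing at all.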
Henk
6d27084e8a
Better Aria2 Defaults
`trunc` file allocation prevents slow preallocation on Windows, and `force_download=True` has proven to be a more reliable default. Since models are converted to local formats, it does not impact local users, and because `-c` is used, the cost of verifying that the downloaded model is correct is minimal and worthwhile.
2022-05-11 21:38:33 +02:00
Gnome Ann
7a3f865e3f
Prevent aria2 from resuming cancelled downloads
Resumed downloads tend to be very slow.
The original transformers downloader didn't allow resuming downloads
either.
2022-05-11 15:14:37 -04:00
Gnome Ann
c81f3bd084
Use `--file-allocation=trunc` instead of `--file-allocation=none`
2022-05-11 14:51:43 -04:00
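The aria2 flags discussed in the commits above can be sketched as a command builder (the exact argument set the project passes is an assumption; the flags themselves are real aria2c options):

```python
def build_aria2_cmd(url: str, out_dir: str, out_name: str) -> list:
    """Build an illustrative aria2c invocation."""
    return [
        "aria2c",
        "--file-allocation=trunc",  # truncate-based allocation: avoids slow
                                    # preallocation on Windows
        "-c",                       # continue/verify an existing partial file
        "-d", out_dir,              # download directory
        "-o", out_name,             # output filename
        url,
    ]
```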
Gnome Ann
f96c878d83
Use aria2 even when all model files are already in cache
This allows aria2 to continue downloading a pytorch_model.bin after a
cancelled download.
2022-05-11 14:43:56 -04:00
Gnome Ann
f60c7d8492
Fix the behaviour of `aria2_hook()` when using `force_download`
2022-05-11 14:41:34 -04:00
Gnome Ann
5732a8f15a
Don't use `aria2_hook()` if `force_download=True` is used
2022-05-11 14:40:31 -04:00
henk717
903d593ce4
Merge pull request #125 from VE-FORBRYDERNE/aria2
Use aria2 to improve HF model download speeds in Colab
2022-05-11 07:55:53 +02:00
Gnome Ann
46cfa1367f
Add `--no_aria2` command line flag
2022-05-11 00:44:56 -04:00
Gnome Ann
f09959f9be
Fix patching code of `PreTrainedModel.from_pretrained()`
2022-05-11 00:41:53 -04:00
Gnome Ann
22b4f3c9df
Bug fixes for `aria2_hook()` when running Windows
2022-05-11 00:14:00 -04:00
Gnome Ann
82205722af
Fix logic of `aria2_hook()`
2022-05-10 23:46:29 -04:00
Gnome Ann
4b49d1c464
Make sure `vars.revision` is defined
2022-05-10 22:51:36 -04:00
Gnome Ann
4b693b4858
Fix the logic of `force_download` in utils.py
2022-05-10 22:47:03 -04:00
Gnome Ann
c1ef20bcff
Also enable aria2 downloading for non-sharded checkpoints
2022-05-10 22:43:41 -04:00
Gnome Ann
e115bb68e4
aria2 downloads in utils.py now use correct user agent
2022-05-10 22:22:46 -04:00
Gnome Ann
b97b2a02d6
Add `--revision` command line flag
2022-05-10 22:14:56 -04:00
Gnome Ann
937d9ee06a
Change default `model.save_pretrained` shard size to 500 MiB
2022-05-10 22:04:25 -04:00
Gnome Ann
a388c63023
Use aria2 to download split checkpoints
2022-05-10 21:28:13 -04:00
Henk
01e15d03d6
Remove play.ipynb
Interactive Python doesn't work well on Jupyter; until Jupyter supports what Colab can do, this file is pointless.
2022-05-11 01:49:07 +02:00
Henk
7a9297adc3
Jupyter Git integration
2022-05-11 01:31:12 +02:00
Henk
f917d3438f
Updated models
2022-05-10 21:39:16 +02:00
henk717
7fcc1a9acb
Fix C1
2022-05-10 18:38:50 +02:00