Author | Commit | Message | Date
somebody | 111028642e | Fix tokenizer fallback for llama | 2023-05-01 19:42:52 -05:00
one-some | 455b8257a9 | Implement softprompt hack | 2023-04-28 10:26:59 -05:00
somebody | ace4364339 | Two more time | 2023-04-27 21:13:26 -05:00
somebody | 446f38ee9d | One more time | 2023-04-27 21:07:34 -05:00
somebody | 2eee535540 | Actually fix decoding with soft prompts (it really wants a tensor) | 2023-04-27 21:01:12 -05:00
somebody | ffa7b22734 | Experiment | 2023-04-27 20:28:04 -05:00
somebody | cd1eb97c2a | Debuuuug | 2023-04-27 20:12:29 -05:00
somebody | 4559112551 | Potential fix | 2023-04-27 19:51:10 -05:00
somebody | b256a8fbc7 | Debug | 2023-04-27 19:33:03 -05:00
somebody | 334c09606b | Fix for tokenizer stuff on pythia | 2023-04-09 18:23:58 -05:00
somebody | 91bb433b5f | GenericTokenizer: Fall back to defined tokenizer (Shouldn't be relied on for model-agnostic code, but for loading processes where you know the tokenizer class used it should be okie dokie) | 2023-03-19 19:03:20 -05:00
somebody | 8d0bc404a5 | Model: More Jax import fixes and formatting | 2023-03-17 15:36:44 -05:00
somebody | b10b201701 | Model: Add basic RWKV implementation | 2023-03-13 19:34:38 -05:00
somebody | bf8b60ac2d | Model: Add GenericTokenizer (Because Hugging Face doesnt have a consistant API across their own libraries) | 2023-03-13 17:36:58 -05:00