somebody
3928d86339
Fall back to unpatched HF
2023-07-08 14:36:45 -05:00
somebody
fd6f66a98d
Patch _rebuild_from_type_v2 to not try converting LazyTensors to Tensors
2023-07-08 13:57:05 -05:00
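The commit above reads like a monkey patch around PyTorch's _rebuild_from_type_v2 hook so that lazy placeholder tensors pass through unconverted. A minimal sketch under that assumption: the lazy_load_active flag is hypothetical, and the module path (torch._tensor, where recent PyTorch releases define the function) is an assumption rather than something stated in this log.

```python
import torch._tensor

# Keep a handle on the stock implementation so it can be restored later.
_original_rebuild_from_type_v2 = torch._tensor._rebuild_from_type_v2

# Hypothetical flag a lazy loader would flip while it intercepts loads.
lazy_load_active = True

def _patched_rebuild_from_type_v2(func, new_type, args, state):
    if lazy_load_active:
        # Return whatever the inner rebuild produced (e.g. a lazy
        # placeholder) without the as_subclass() step that would force
        # it into a real Tensor.
        return func(*args)
    return _original_rebuild_from_type_v2(func, new_type, args, state)

torch._tensor._rebuild_from_type_v2 = _patched_rebuild_from_type_v2
```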
somebody
802929f5f2
Patch safetensors again
2023-07-07 14:54:40 -05:00
somebody
35f3687667
Merge branch 'united' of https://github.com/henk717/KoboldAI into accelerate-offloading
2023-07-07 14:54:12 -05:00
somebody
cfe1f5b514
Stub seek_offset for cache sorting in load loop
The way Safetensors individual-weight loading is implemented doesn't
take full advantage of the cache-ordering system, so seek_offset can just
be left at zero for now (see the sketch after this entry).
2023-07-07 14:49:46 -05:00
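A minimal sketch of what stubbing seek_offset amounts to, with a hypothetical entry class standing in for the project's lazy safetensors weights (the names here are illustrative, not the actual API):

```python
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class LazySafetensorsEntry:
    """Hypothetical stand-in for one lazily loaded safetensors weight."""
    key: str
    # The load loop sorts pending weights by seek_offset so reads happen
    # roughly in file order. The safetensors path fetches each weight
    # individually, so ordering buys nothing and zero is a safe stub.
    seek_offset: int = 0

def order_for_loading(keys: Iterable[str]) -> List[LazySafetensorsEntry]:
    entries = [LazySafetensorsEntry(key=k) for k in keys]
    # With every offset stubbed to zero, this stable sort leaves the
    # original key order untouched.
    return sorted(entries, key=lambda e: e.seek_offset)
```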
Henk
76d21bb142
Universal ziproot for lazy_loader
2023-07-07 13:37:32 +02:00
somebody
6b83944e9b
Use VE's patched load_from_state_dict on TPU for loading empty weights
2023-07-05 18:36:57 -05:00
somebody
7f869a54d8
Just use accelerate on TPU
2023-07-03 17:18:48 -05:00
somebody
31a3046a18
Load empty modules without accelerate
2023-07-03 17:07:18 -05:00
somebody
c56214c275
Fix loading bar
2023-06-21 16:27:22 -05:00
somebody
ceaefa9f5e
Not quite
2023-05-28 14:57:45 -05:00
somebody
ed0728188a
More cleaning
2023-05-28 13:22:32 -05:00
somebody
14241fc156
Speed
2023-05-28 13:03:24 -05:00
somebody
6f93150e4d
Work on lazyload
2023-05-28 12:25:31 -05:00
somebody
1546b9efaa
Hello, it's breaking breakmodel time
2023-05-27 16:31:53 -05:00
Henk
59c96b5b7a
Unban fix
2023-05-15 22:38:12 +02:00
Henk
c5100b4eab
Unban Tensor
2023-05-15 22:21:22 +02:00
Henk
56443bc7ea
Unban torch._tensor._rebuild_tensor_v2
2023-05-15 21:44:01 +02:00
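The three "Unban ..." commits above read like additions to an allowlist in a restricted pickle loader. A minimal sketch of that general pattern, built on Python's standard pickle.Unpickler.find_class hook; the class name and the OrderedDict entry are assumptions, "Tensor" is taken to mean torch.Tensor, and the _rebuild_tensor_v2 entry mirrors the commit title as written:

```python
import pickle

# Globals a checkpoint is allowed to reference while unpickling.
_ALLOWED_GLOBALS = {
    ("torch", "Tensor"),
    ("torch._tensor", "_rebuild_tensor_v2"),
    ("collections", "OrderedDict"),  # assumed baseline for state dicts
}

class RestrictedUnpickler(pickle.Unpickler):
    """Allowlisting unpickler: any global not explicitly 'unbanned'
    raises instead of being imported."""

    def find_class(self, module, name):
        if (module, name) in _ALLOWED_GLOBALS:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(
            f"global {module}.{name} is not on the allowlist"
        )
```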
onesome
1db9d9ba61
Lazyload: Whoops
2023-04-25 18:46:54 -05:00
onesome
e28e268a2d
Use safetensors only when available
2023-04-25 18:32:37 -05:00
somebody
9d70646e4d
Lazyload: Safetensors
2023-04-02 15:40:34 -05:00