Patch safetensors again

Author: somebody
Date:   2023-07-07 14:54:40 -05:00
Parent: 35f3687667
Commit: 802929f5f2


@@ -520,8 +520,8 @@ def use_lazy_load(
         torch.load = torch_load
-        #if HAS_SAFETENSORS:
-        #    patch_safetensors(callback)
+        if HAS_SAFETENSORS:
+            patch_safetensors(callback)
         if dematerialized_modules:
             # Most devices can just use Accelerate's implementation, but the Transformers on