
Pytorch Global Pruning is not reducing the size of the model

I am trying to prune my deep learning model via global pruning. The original unpruned model is about 77.5 MB. However, after pruning, when I save the model, its size is the same as the original. Can anyone help me with this issue?

Below is the pruning code:

import torch
import torch.nn.utils.prune as prune

parameters_to_prune = (
    (model.encoder[0], 'weight'),
    (model.up_conv1[0], 'weight'),
    (model.up_conv2[0], 'weight'),
    (model.up_conv3[0], 'weight'),
)
print(parameters_to_prune)

prune.global_unstructured(
    parameters_to_prune,
    pruning_method=prune.L1Unstructured,
    amount=0.2,
)

print(
    "Sparsity in Encoder.weight: {:.2f}%".format(
        100. * float(torch.sum(model.encoder[0].weight == 0))
        / float(model.encoder[0].weight.nelement())
    )
)
print(
    "Sparsity in up_conv1.weight: {:.2f}%".format(
        100. * float(torch.sum(model.up_conv1[0].weight == 0))
        / float(model.up_conv1[0].weight.nelement())
    )
)
print(
    "Sparsity in up_conv2.weight: {:.2f}%".format(
        100. * float(torch.sum(model.up_conv2[0].weight == 0))
        / float(model.up_conv2[0].weight.nelement())
    )
)
print(
    "Sparsity in up_conv3.weight: {:.2f}%".format(
        100. * float(torch.sum(model.up_conv3[0].weight == 0))
        / float(model.up_conv3[0].weight.nelement())
    )
)

print(
    "Global sparsity: {:.2f}%".format(
        100. * float(
            torch.sum(model.encoder[0].weight == 0)
            + torch.sum(model.up_conv1[0].weight == 0)
            + torch.sum(model.up_conv2[0].weight == 0)
            + torch.sum(model.up_conv3[0].weight == 0)
        )
        / float(
            model.encoder[0].weight.nelement()
            + model.up_conv1[0].weight.nelement()
            + model.up_conv2[0].weight.nelement()
            + model.up_conv3[0].weight.nelement()
        )
    )
)

**Making the pruning permanent**

prune.remove(model.encoder[0], 'weight')
prune.remove(model.up_conv1[0], 'weight')
prune.remove(model.up_conv2[0], 'weight')
prune.remove(model.up_conv3[0], 'weight')

**Saving the model**

PATH = r"C:\PrunedNet.pt"  # raw string, so the backslash is not treated as an escape
torch.save(model.state_dict(), PATH)

Pruning won't change the model size if applied like this.

Say you have a tensor like this:

[1., 2., 3., 4., 5., 6., 7., 8.]

And you prune 50% of the values, for example:

[1., 2., 0., 4., 0., 6., 0., 0.]

You still have 8 float values, so the storage size is unchanged; the zeros occupy just as many bytes as the values they replaced.
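The tensor example above can be checked without PyTorch at all; the sketch below simply serializes both versions as raw float32 buffers and compares byte counts:

```python
import struct

dense = [1., 2., 3., 4., 5., 6., 7., 8.]
pruned = [1., 2., 0., 4., 0., 6., 0., 0.]  # 50% of values zeroed by "pruning"

# Pack each list as a raw float32 buffer (4 bytes per element).
size_dense = len(struct.pack("8f", *dense))
size_pruned = len(struct.pack("8f", *pruned))

print(size_dense, size_pruned)  # 32 32 -- identical: zeros still cost 4 bytes each
```

This is exactly why the saved `state_dict` above does not shrink: `prune.global_unstructured` only zeroes entries in dense tensors.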

When does pruning reduce model size?

  • When we save the weights in a sparse format, which only pays off at high sparsity (e.g. roughly 10% non-zero elements)
  • When we actually remove structure (like a kernel from Conv2d; it can be removed if its weights are zero or negligible)

Otherwise it's not going to work. Check out related projects that let you do this without coding it on your own, for example Torch-Pruning.
