
Tensorflow Prune Layer Not Supported

I am trying to prune a model in tensorflow but am coming across an error I don't know how to tackle. The error is:

ValueError: Please initialize "Prune" with a supported layer. Layers should either be a "PrunableLayer" instance, or should be supported by the PruneRegistry. You passed: <class 'base_transformer_tf.TransformerEncoder'>

The model is created using the following:

import tensorflow as tf
import tensorflow_addons as tfa
from base_transformer_tf import TransformerEncoder

def transformer_encoder(num_columns, num_labels, num_layers, d_model, num_heads, dff, window_size, dropout_rate, weight_decay, label_smoothing, learning_rate):
    
    inp = tf.keras.layers.Input(shape = (window_size, num_columns))
    x = tf.keras.layers.BatchNormalization()(inp)
    x = tf.keras.layers.Dense(d_model)(x)
    x = tf.keras.layers.BatchNormalization()(x)
    x = tf.keras.layers.Activation('swish')(x)
    x = tf.keras.layers.SpatialDropout1D(dropout_rate)(x)
    x = TransformerEncoder(num_layers, d_model, num_heads, dff, window_size, dropout_rate)(x)
    out = tf.keras.layers.Dense(num_labels, activation = 'sigmoid', dtype=tf.float32)(x[:, -1, :])
    
    model = tf.keras.models.Model(inputs = inp, outputs = out)
    model.compile(optimizer = tfa.optimizers.AdamW(weight_decay = weight_decay, learning_rate = learning_rate),
                  loss = tf.keras.losses.BinaryCrossentropy(label_smoothing = label_smoothing), 
                  metrics = tf.keras.metrics.AUC(name = 'AUC'), 
                  )
    
    return model

The pruning portion of the code is as follows:

import numpy as np
import tempfile
import tensorflow_model_optimization as tfmot

prune_low_magnitude = tfmot.sparsity.keras.prune_low_magnitude

pruning_params = {
      'pruning_schedule': tfmot.sparsity.keras.PolynomialDecay(initial_sparsity=0.00,
                                                               final_sparsity=0.50,
                                                               begin_step=0,
                                                               end_step=end_step)
}
model_for_pruning = prune_low_magnitude(model, **pruning_params)

# `prune_low_magnitude` requires a recompile.
model_for_pruning.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])

logdir = tempfile.mkdtemp()
callbacks = [
  tfmot.sparsity.keras.UpdatePruningStep(),
  tfmot.sparsity.keras.PruningSummaries(log_dir=logdir),
]
model_for_pruning.fit(np.concatenate((X_tr2, X_val)), np.concatenate((y_tr2, y_val)),
                  batch_size=batch_size, epochs=epochs, validation_split=validation_split,
                  callbacks=callbacks)

Any help would be appreciated.

Tensorflow does not know how to prune your custom TransformerEncoder Keras layer. You should specify which weights to sparsify, as in this example: Prune custom Keras layer or modify parts of layer to prune.

That would look like:

class TransformerEncoder(tf.keras.layers.Layer, tfmot.sparsity.keras.PrunableLayer):
  def get_prunable_weights(self):
    return [self.my_weight, ...]
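
As a more concrete illustration, below is a minimal sketch of a prunable transformer-style encoder block. Since the actual implementation of base_transformer_tf.TransformerEncoder is not shown, the internal structure here (a single attention sublayer plus a two-layer feed-forward network, named mha, ffn_1 and ffn_2) is an assumption; adapt get_prunable_weights() to the attributes your layer actually has. The feed-forward sublayers are built explicitly in build() so that their kernels already exist when the pruning wrapper calls get_prunable_weights().

import tensorflow as tf
import tensorflow_model_optimization as tfmot

class PrunableTransformerEncoder(tf.keras.layers.Layer,
                                 tfmot.sparsity.keras.PrunableLayer):
    """Illustrative stand-in for the custom encoder; sublayer names are assumptions."""

    def __init__(self, d_model, num_heads, dff, dropout_rate=0.1, **kwargs):
        super().__init__(**kwargs)
        self.dff = dff
        self.mha = tf.keras.layers.MultiHeadAttention(
            num_heads=num_heads, key_dim=d_model // num_heads)
        self.ffn_1 = tf.keras.layers.Dense(dff, activation='relu')
        self.ffn_2 = tf.keras.layers.Dense(d_model)
        self.norm_1 = tf.keras.layers.LayerNormalization(epsilon=1e-6)
        self.norm_2 = tf.keras.layers.LayerNormalization(epsilon=1e-6)
        self.dropout = tf.keras.layers.Dropout(dropout_rate)

    def build(self, input_shape):
        # Build the feed-forward sublayers up front so their kernels exist
        # by the time the pruning wrapper calls get_prunable_weights().
        input_shape = tf.TensorShape(input_shape)
        self.ffn_1.build(input_shape)
        self.ffn_2.build(input_shape[:-1].concatenate([self.dff]))
        super().build(input_shape)

    def call(self, x, training=False):
        # Standard post-norm encoder block: self-attention and feed-forward,
        # each with a residual connection.
        attn = self.mha(x, x, training=training)
        x = self.norm_1(x + self.dropout(attn, training=training))
        ffn = self.ffn_2(self.ffn_1(x))
        return self.norm_2(x + self.dropout(ffn, training=training))

    def get_prunable_weights(self):
        # Sparsify only the large feed-forward kernels; biases, LayerNorm
        # parameters and the attention weights are left dense in this sketch.
        return [self.ffn_1.kernel, self.ffn_2.kernel]

With the custom layer exposing its prunable weights this way, prune_low_magnitude(model, **pruning_params) can wrap the whole model instead of raising the ValueError above.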
