How do I prune the highest weights in a TensorFlow layer? tfmot.sparsity.keras.prune_low_magnitude
I want to prune the highest weight values in a TF layer. I was thinking of using tf.nn.top_k, but I'm not sure how to go about it.
Documentation: https://www.tensorflow.org/model_optimization/api_docs/python/tfmot/sparsity/keras/prune_low_magnitude
Code:
pruning_params = {
    'pruning_schedule': PolynomialDecay(initial_sparsity=0.2,
                                        final_sparsity=0.8,
                                        begin_step=1000,
                                        end_step=2000),
    'block_size': (2, 3),
    'block_pooling_type': 'MAX'
}

model = keras.Sequential([
    layers.Dense(10, activation='relu', input_shape=(100,)),
    prune_low_magnitude(layers.Dense(2, activation='tanh'), **pruning_params)
])
Assuming w is the weight matrix of the layer you want to prune and k is the fraction of weights that should be pruned, this should work for you:
# Convert k from a fraction to the number of weights to prune
k = tf.cast(tf.round(tf.size(w, out_type=tf.float32) * k), dtype=tf.int32)
# Flatten the weight matrix
w_reshaped = tf.reshape(w, [-1])
# Select the indices of the k largest weights
_, indices = tf.nn.top_k(w_reshaped, k, sorted=True)
# Build a mask that is 0 at those indices and 1 elsewhere.
# tf.tensor_scatter_nd_update replaces the deprecated tf.scatter_nd_update
# and avoids creating an extra Variable for the mask.
mask = tf.tensor_scatter_nd_update(tf.ones_like(w_reshaped),
                                   tf.reshape(indices, [-1, 1]),
                                   tf.zeros([k], w.dtype))
# Zero out the selected weights in w
w.assign(tf.reshape(w_reshaped * mask, tf.shape(w)))
This is based on this GitHub repo. Note that in that project I am pruning the smallest k weights instead.
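To sanity-check the approach, the steps above can be wrapped in a small helper and applied to a toy weight matrix. This is a minimal sketch assuming TF 2.x eager execution; the function name prune_top_k is my own, not from the original answer.

```python
import tensorflow as tf

def prune_top_k(w, frac):
    """Zero out the largest `frac` fraction of weights in variable `w`."""
    # Number of weights to prune
    k = tf.cast(tf.round(tf.size(w, out_type=tf.float32) * frac), tf.int32)
    # Flatten, then find the indices of the k largest entries
    w_flat = tf.reshape(w, [-1])
    _, indices = tf.nn.top_k(w_flat, k, sorted=True)
    # Mask is 0 at the selected indices, 1 elsewhere
    mask = tf.tensor_scatter_nd_update(tf.ones_like(w_flat),
                                       tf.reshape(indices, [-1, 1]),
                                       tf.zeros([k], w.dtype))
    # Write the masked weights back into w
    w.assign(tf.reshape(w_flat * mask, tf.shape(w)))
    return w

w = tf.Variable([[1.0, 5.0], [3.0, 2.0]])
prune_top_k(w, 0.5)   # zeros the two largest entries (5.0 and 3.0)
print(w.numpy())      # [[1. 0.] [0. 2.]]
```

Note that tf.nn.top_k ranks by signed value, which matches the question (pruning the highest weights); to prune by magnitude instead, pass tf.abs(w_flat) to top_k while still masking the original weights.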