
TensorFlow pruning layer not supported

How to resolve "TensorFlow pruning layer not supported"

I am trying to prune a model in TensorFlow, but I ran into an error I don't know how to resolve:

ValueError: Please initialize `Prune` with a supported layer. Layers should either be a `PrunableLayer` instance, or should be supported by the PruneRegistry. You passed: <class 'base_transformer_tf.TransformerEncoder'>

The model is created with the following function:

def transformer_encoder(num_columns, num_labels, num_layers, d_model, num_heads,
                        dff, window_size, dropout_rate, weight_decay,
                        label_smoothing, learning_rate):

    inp = tf.keras.layers.Input(shape=(window_size, num_columns))
    x = tf.keras.layers.BatchNormalization()(inp)
    x = tf.keras.layers.Dense(d_model)(x)
    x = tf.keras.layers.BatchNormalization()(x)
    x = tf.keras.layers.Activation('swish')(x)
    x = tf.keras.layers.SpatialDropout1D(dropout_rate)(x)
    x = TransformerEncoder(num_layers, dropout_rate)(x)
    out = tf.keras.layers.Dense(num_labels, activation='sigmoid',
                                dtype=tf.float32)(x[:, -1, :])

    model = tf.keras.models.Model(inputs=inp, outputs=out)
    model.compile(
        optimizer=tfa.optimizers.AdamW(weight_decay=weight_decay,
                                       learning_rate=learning_rate),
        loss=tf.keras.losses.BinaryCrossentropy(label_smoothing=label_smoothing),
        metrics=tf.keras.metrics.AUC(name='AUC'),
    )

    return model

The pruning part of the code looks like this:

pruning_params = {
    'pruning_schedule': tfmot.sparsity.keras.PolynomialDecay(
        initial_sparsity=0.00, final_sparsity=0.50,
        begin_step=0, end_step=end_step)
}
model_for_pruning = tfmot.sparsity.keras.prune_low_magnitude(model, **pruning_params)

# `prune_low_magnitude` requires a recompile.
model_for_pruning.compile(
    optimizer='adam',
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=['accuracy'])

logdir = tempfile.mkdtemp()
callbacks = [
    tfmot.sparsity.keras.UpdatePruningStep(),
    tfmot.sparsity.keras.PruningSummaries(log_dir=logdir),
]
model_for_pruning.fit(
    np.concatenate((X_tr2, X_val)), np.concatenate((y_tr2, y_val)),
    batch_size=batch_size, epochs=epochs,
    validation_split=validation_split, callbacks=callbacks)

Any help would be greatly appreciated.

Solution

TensorFlow does not know how to prune your custom TransformerEncoder Keras layer. You need to tell it which of the layer's weights to sparsify, as described in the guide: Prune custom Keras layer or modify parts of layer to prune

That would look like:

class TransformerEncoder(tf.keras.layers.Layer, tfmot.sparsity.keras.PrunableLayer):
  def get_prunable_weights(self):
    # Return every weight tensor that should be sparsified.
    return [self.my_weight, ..]
