How to fix a ValueError when using weight regularization with Keras
I am trying to apply weight regularization in Keras to reduce overfitting in my model.
I set an l2 regularizer on the dense layer that precedes the recurrent layers.
Here is my model:
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.regularizers import l2

def build_model():
    # Inputs to the model
    input_img = layers.Input(shape=(img_width, img_height, 1), name="image", dtype="float32")
    labels = layers.Input(name="label", shape=(None,), dtype="float32")
    x = layers.Conv2D(32, (3, 3), activation="relu", kernel_initializer="he_normal", padding="same", name="Conv1")(input_img)
    x = layers.MaxPooling2D((2, 2), name="pool1")(x)
    x = layers.Conv2D(64, (3, 3), activation="relu", kernel_initializer="he_normal", padding="same", name="Conv2")(x)
    x = layers.MaxPooling2D((2, 2), name="pool2")(x)
    new_shape = ((img_width // 4), (img_height // 4) * 64)
    x = layers.Reshape(target_shape=new_shape, name="reshape")(x)
    x = layers.Dense(256, kernel_regularizer=l2(0.01), bias_regularizer=l2(0.01), name="dense1")(x)  # Error
    x = layers.Dropout(0.5)(x)
    # RNNs
    lstm_1 = layers.LSTM(512, return_sequences=True, name="lstm_1")(x)
    lstm_1b = layers.LSTM(512, return_sequences=True, go_backwards=True, name="lstm_1b")(x)
    lstm_merged = layers.add([lstm_1, lstm_1b], name="merge")
    lstm_2 = layers.LSTM(512, return_sequences=True, name="lstm_2")(lstm_merged)
    lstm_2b = layers.LSTM(512, return_sequences=True, name="lstm_2b")(lstm_merged)
    x = layers.concatenate([lstm_2, lstm_2b], name="concatenate")
    # Output layer
    x = layers.Dense(len(characters) + 1, activation="softmax", name="dense2")(x)
    # Add CTC layer for calculating CTC loss at each step
    output = CTCLayer(name="ctc_loss")(labels, x)
    # Define the model
    model = keras.models.Model(inputs=[input_img, labels], outputs=output, name="ocr_model_v1")
    opt = keras.optimizers.RMSprop()
    model.compile(optimizer=opt)
    return model
When I run model.fit, I get this error:
ValueError: Shapes must be equal rank, but are 0 and 2
From merging shape 1 with other shapes. for '{{node AddN}} = AddN[N=3, T=DT_FLOAT](dense1/kernel/Regularizer/mul, dense1/bias/Regularizer/mul, ocr_model_v1/ctc_loss/ExpandDims)' with input shapes: [], [], [?,1].
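If I read the error correctly, the two empty shapes `[]` are the scalar kernel/bias regularizer losses from `dense1`, while `[?,1]` is a per-sample loss tensor coming out of my `CTCLayer`, and `tf.add_n` cannot sum tensors of different ranks. A minimal sketch of what I think is going on (standalone TensorFlow, hypothetical values, not my actual model):

```python
import tensorflow as tf

# Regularizer losses are rank-0 scalars; a loss kept per sample has shape
# (batch, 1), i.e. rank 2. tf.add_n, which Keras uses to sum all model
# losses, requires every input to have the same shape.
reg_loss = tf.constant(0.01)       # stands in for dense1/kernel/Regularizer/mul
per_sample_loss = tf.ones((4, 1))  # stands in for the per-sample CTC loss

try:
    tf.add_n([reg_loss, reg_loss, per_sample_loss])
except (ValueError, tf.errors.InvalidArgumentError) as err:
    print("add_n failed:", err)

# Reducing the per-sample loss to a scalar first makes the sum well defined.
total = tf.add_n([reg_loss, reg_loss, tf.reduce_mean(per_sample_loss)])
print(float(total))
```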
Can someone explain what the problem is and how to fix it?
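For reference, one workaround I have seen suggested is to reduce the CTC loss to a scalar inside the layer before registering it with add_loss, so that it has the same rank as the regularizer losses. Sketched below assuming a CTCLayer like the one in the Keras captcha OCR example (my actual CTCLayer may differ):

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers


class CTCLayer(layers.Layer):
    """Computes CTC loss and registers it as a scalar model loss."""

    def __init__(self, name=None):
        super().__init__(name=name)
        self.loss_fn = keras.backend.ctc_batch_cost

    def call(self, y_true, y_pred):
        batch_len = tf.cast(tf.shape(y_true)[0], dtype="int64")
        input_length = tf.cast(tf.shape(y_pred)[1], dtype="int64")
        label_length = tf.cast(tf.shape(y_true)[1], dtype="int64")
        input_length = input_length * tf.ones(shape=(batch_len, 1), dtype="int64")
        label_length = label_length * tf.ones(shape=(batch_len, 1), dtype="int64")

        # ctc_batch_cost returns a (batch, 1) tensor; reducing it to a
        # rank-0 scalar keeps it compatible with the scalar regularizer
        # losses when Keras sums all losses with tf.add_n.
        loss = tf.reduce_mean(self.loss_fn(y_true, y_pred, input_length, label_length))
        self.add_loss(loss)

        # At inference time the layer just passes the predictions through.
        return y_pred
```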