
How do I fix the following error in a Keras LSTM: if the RNN is stateful, specify `batch_input_shape`?


I get the following error when running an LSTM in Keras:

ValueError: If a RNN is stateful, it needs to know its batch size. Specify the batch size of your input tensors:
- If using a Sequential model, specify the batch size by passing a `batch_input_shape` argument to your first layer.
- If using the functional API, specify the batch size by passing a `batch_shape` argument to your Input layer.
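
For reference, here is a minimal sketch of the two placements the error message describes, using Keras 2-style tensorflow.keras imports and a hypothetical shape of (batch 32, 4 timesteps, 7 features); none of these numbers come from the question's data:

from tensorflow.keras.models import Sequential, Model
from tensorflow.keras.layers import Input, LSTM, Dense

# Option 1: Sequential API -- pass batch_input_shape to the first layer
seq_model = Sequential()
seq_model.add(LSTM(4, batch_input_shape=(32, 4, 7), stateful=True))
seq_model.add(Dense(1, activation='sigmoid'))

# Option 2: functional API -- pass batch_shape to the Input layer
inputs = Input(batch_shape=(32, 4, 7))
x = LSTM(4, stateful=True)(inputs)
outputs = Dense(1, activation='sigmoid')(x)
func_model = Model(inputs, outputs)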

I have a Pandas dataframe that is a time series indexed by the minute. The train and test sets come from X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=.2, shuffle=False); X_train.shape is (27932, 7) and X_test.shape is (6984, 7). I standardize X_train and X_test and reshape them into 3D form:

from sklearn.preprocessing import StandardScaler  # import assumed; omitted in the original snippet

# normalizing the data (fit on the training set only)
ss = StandardScaler()
X_train = ss.fit_transform(X_train)
X_test = ss.transform(X_test)

# reshaping to 3D: (samples, timesteps, features)
X_train = X_train.reshape(6983, 4, X_train.shape[1])
X_test = X_test.reshape(1746, 4, X_test.shape[1])
y_train = y_train.values.reshape(6983, 1)
y_test = y_test.values.reshape(1746, 1)

The logic behind reshaping X is that I want my LSTM to learn from 6983 samples of 4 time steps (i.e., 4 minutes) each. I also want to test a reshape of (X_train.shape[0], 1, X_train.shape[1]).
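
To make the window arithmetic concrete, here is a small sanity check of my own (dummy zero arrays stand in for the real data): both splits divide evenly by the 4-step window, giving 6983 and 1746 samples respectively.

import numpy as np

timesteps = 4                        # 4-minute windows, as described above
X_train_flat = np.zeros((27932, 7))  # shapes reported in the question
X_test_flat = np.zeros((6984, 7))

# 27932 / 4 = 6983 and 6984 / 4 = 1746, so both reshapes are valid
X_train_3d = X_train_flat.reshape(X_train_flat.shape[0] // timesteps, timesteps, 7)
X_test_3d = X_test_flat.reshape(X_test_flat.shape[0] // timesteps, timesteps, 7)

print(X_train_3d.shape)  # (6983, 4, 7)
print(X_test_3d.shape)   # (1746, 4, 7)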

My LSTM is as follows:

from tensorflow.keras.models import Sequential  # imports assumed; omitted in the original snippet
from tensorflow.keras.layers import Bidirectional, LSTM, Dropout, Dense
from tensorflow.keras.callbacks import EarlyStopping

# Creating our model's structure
model = Sequential()
model.add(Bidirectional(LSTM(4, batch_input_shape=(X_train.shape[0], X_train.shape[1], X_train.shape[2]),
                             return_sequences=True, stateful=True)))
model.add(Dropout(0.2))
model.add(Bidirectional(LSTM(4)))
model.add(Dense(1, activation='sigmoid'))
es = EarlyStopping(monitor='val_loss', patience=10)

# Compiling the model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['Recall'])

# Fitting the model (the original snippet omits y_train in the fit call)
history = model.fit(X_train, y_train, epochs=50, verbose=1, validation_data=(X_test, y_test), callbacks=[es])

Ironically, I still get the error above even though I explicitly declare batch_input_shape and stateful=True in the first LSTM layer. The LSTM runs fine when X_train is reshaped to the 3D shape (X_train.shape[0], 1, X_train.shape[1]) and X_test to (X_test.shape[0], 1, X_test.shape[1]).
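
For context, a stateful Keras RNN normally expects the batch size fixed in batch_input_shape to match the batch_size used in fit, with shuffle=False so batches arrive in temporal order. Here is a minimal sketch with a hypothetical batch size of 1 and random dummy data (not the model or data above):

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

batch_size, timesteps, n_features = 1, 4, 7          # hypothetical values
X_demo = np.random.rand(100, timesteps, n_features)  # dummy data
y_demo = np.random.randint(0, 2, size=(100, 1))

demo = Sequential()
demo.add(LSTM(4, batch_input_shape=(batch_size, timesteps, n_features), stateful=True))
demo.add(Dense(1, activation='sigmoid'))
demo.compile(loss='binary_crossentropy', optimizer='adam')

# The fit batch size must equal the one declared in batch_input_shape,
# and shuffle=False keeps batches in temporal order
demo.fit(X_demo, y_demo, epochs=2, batch_size=batch_size, shuffle=False, verbose=0)
demo.reset_states()  # hidden states carry across batches until reset explicitly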

Which part of my code is triggering the error?

As an aside, I have not been able to improve the LSTM's performance by using more hidden units than shown in the code above. Does that seem surprising?
