How to fix neural network ValueError: shapes (2115,13) and (1,) not aligned: 13 (dim 1) != 1 (dim 0)
I'm quite stuck right now. I've been searching for days, but I can't find a suitable answer. I've also seen similar questions asked, but I still can't find a solution to my dimension problem.
My goal is to predict cinema ticket prices. I split the data into training and test sets. Everything works fine until the "create neural network" part. I'm a beginner, please help!
ValueError: shapes (8457,13) and (1,) not aligned: 13 (dim 1) != 1 (dim 0)
import numpy as np
from sklearn.model_selection import train_test_split
data_perc = data_filtered.head(int(len(data_filtered) * (10 / 100)))
y = data_perc.ticket_price
x = data_perc.drop('ticket_price', axis=1)  # axis 1 = columns, axis 0 = rows
x_train,x_test,y_train,y_test=train_test_split(x,y,test_size=0.2)
x_train.head()
x_train.shape
inputs = x_train
outputs = y_train
class NeuralNetwork:
    # initialize variables in class
    def __init__(self, inputs, outputs):
        self.inputs = inputs
        self.outputs = outputs
        # initialize weights as .50 for simplicity
        self.weights = np.array([.50])
        self.error_history = []
        self.epoch_list = []

    # activation function ==> S(x) = 1 / (1 + e^(-x))
    def sigmoid(self, x, deriv=False):
        if deriv:
            return x * (1 - x)
        return 1 / (1 + np.exp(-x))

    # data will flow through the neural network
    def feed_forward(self):
        self.hidden = self.sigmoid(np.squeeze(np.dot(self.inputs, self.weights)))

    # going backwards through the network to update weights
    def backpropagation(self):
        self.error = self.outputs - self.hidden
        delta = self.error * self.sigmoid(self.hidden, deriv=True)
        self.weights += np.dot(self.inputs.T, delta)

    # train the neural net for 25,000 iterations
    def train(self, epochs=25000):
        for epoch in range(epochs):
            # flow forward and produce an output
            self.feed_forward()
            # go back through the network to make corrections based on the output
            self.backpropagation()
            # keep track of the error history over each epoch
            self.error_history.append(np.average(np.abs(self.error)))
            self.epoch_list.append(epoch)

    # function to predict output on new and unseen input data
    def predict(self, new_input):
        prediction = self.sigmoid(np.dot(new_input, self.weights))
        return prediction
# create neural network
NN = NeuralNetwork(inputs,outputs)
# train neural network
NN.train()
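As I understand it, the error comes from `np.dot(self.inputs, self.weights)` in `feed_forward`: `x_train` has 13 feature columns, but `self.weights = np.array([.50])` has shape `(1,)`, and matrix-vector multiplication needs one weight per feature. A minimal sketch of the shape problem and the fix, using random stand-in data instead of the original DataFrame (the names `X`, `w_bad`, `w_ok` are my own, not from the post):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((8, 13))   # stand-in for x_train: 8 samples, 13 features

w_bad = np.array([.50])   # shape (1,): reproduces the ValueError
w_ok = np.full(13, .50)   # shape (13,): one weight per input feature

# This raises the same error as in the question.
try:
    np.dot(X, w_bad)
except ValueError as e:
    print(e)              # shapes (8,13) and (1,) not aligned: 13 (dim 1) != 1 (dim 0)

# With a correctly shaped weight vector, the forward pass works:
hidden = 1 / (1 + np.exp(-np.dot(X, w_ok)))
print(hidden.shape)       # one sigmoid output per sample
```

So in the class, `self.weights = np.array([.50])` would become something like `self.weights = np.full(x_train.shape[1], .50)`. Note also that a sigmoid output lies in (0, 1), so for predicting raw ticket prices (a regression target) a linear output with a regression loss would likely fit better than the sigmoid used here.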