Does PyTorch's LSTM automatically scale data?

I have an LSTM producing strange results: the predictions seem to be on a different scale from the input data. The predicted values fall in the range (-0.5, 0.5), while the input data was scaled to the range (0, 1) with MinMaxScaler.

I removed the MinMaxScaler from the data and found that the LSTM still produces predictions in (-0.5, 0.5), even though the actual values are no longer scaled down. For example:

Input sequence: [176153.8125,170.0,1511.0,77.59058380126953,915.5689086914062]
Label: 10671.240234375
Predicted output: 0.30351510643959045

LSTM code:

class LSTM(nn.Module):
    def __init__(self, input_size=360, hidden_layer_size=50, output_size=batchSize):
        super().__init__()
        self.hidden_layer_size = hidden_layer_size

        self.lstm = nn.LSTM(input_size, hidden_layer_size)

        self.linear = nn.Linear(hidden_layer_size, output_size)

        # h_0 and c_0 must both have shape (num_layers, batch, hidden_size)
        self.hidden_cell = (torch.zeros(1, 1, self.hidden_layer_size),
                            torch.zeros(1, 1, self.hidden_layer_size))

    def init_hidden(self):
        # must return an (h_0, c_0) tuple, not a single tensor
        return (torch.zeros(1, 1, self.hidden_layer_size),
                torch.zeros(1, 1, self.hidden_layer_size))

    def forward(self, input_seq):
        # nn.LSTM expects input of shape (seq_len, batch, input_size)
        lstm_out, self.hidden_cell = self.lstm(
            input_seq.view(len(input_seq), 1, -1), self.hidden_cell)

        predictions = self.linear(lstm_out.view(len(input_seq), -1))

        return predictions[-1]
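The class above also had mismatched hidden-state shapes (one tensor was 3-D, the other 2-D, and init_hidden returned a single tensor instead of a tuple). A quick standalone check of the shapes nn.LSTM actually expects, using small illustrative sizes (seq_len=5, batch=1, input_size=5, hidden_size=50):

```python
import torch
import torch.nn as nn

# Sizes here are illustrative, chosen to match the 5-feature input sequence above.
lstm = nn.LSTM(input_size=5, hidden_size=50)

seq = torch.randn(5, 1, 5)   # (seq_len, batch, input_size)
h0 = torch.zeros(1, 1, 50)   # (num_layers, batch, hidden_size)
c0 = torch.zeros(1, 1, 50)   # same shape as h0

# The initial state is passed as an (h_0, c_0) tuple, and the new state
# comes back the same way.
out, (hn, cn) = lstm(seq, (h0, c0))
```

If the two state tensors have different shapes, PyTorch raises a RuntimeError rather than silently rescaling anything, so a shape check like this is a cheap way to rule out state-handling bugs.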

Training code:

for i in range(epochs):
    for seq, labels in train_loader:
        seq = seq.to(device)
        labels = labels.to(device)
        optimizer.zero_grad()
        # reset the hidden state before each sequence
        model.module.hidden_cell = model.module.init_hidden()

        y_pred = model(seq)

        single_loss = loss_function(y_pred, labels)
        single_loss.backward()
        optimizer.step()

Am I missing something, or is my model just producing nonsense?
