
Loss not reducing in Linear Regression with PyTorch

I'm working on a linear regression problem with PyTorch. The dataset I'm using is the Housing Prices dataset from Kaggle. While training the model I see that the loss is not decreasing; it shows an erratic pattern. This is the loss I'm getting over 100 epochs (printed every 10 epochs):

Epoch [10/100], Loss: 222273830912.0000
Epoch [20/100], Loss: 348813688832.0000
Epoch [30/100], Loss: 85658296320.0000
Epoch [40/100], Loss: 290305572864.0000
Epoch [50/100], Loss: 59399933952.0000
Epoch [60/100], Loss: 80360054784.0000
Epoch [70/100], Loss: 90352918528.0000
Epoch [80/100], Loss: 534457679872.0000
Epoch [90/100], Loss: 256064503808.0000
Epoch [100/100], Loss: 102400483328.0000

This is the code:

import torch
import numpy as np
from torch.utils.data import TensorDataset
import torch.nn as nn
from torch.utils.data import DataLoader
import torch.nn.functional as F

# normalized_X and train_y come from the earlier preprocessing of the Kaggle Housing Prices data
inputs = normalized_X
targets = np.array(train_y)

# Tensors
inputs = torch.from_numpy(inputs)
targets = torch.from_numpy(targets)
targets = targets.view(-1, 1)
train_ds = TensorDataset(inputs, targets.squeeze())
batch_size = 5
train_dl = DataLoader(train_ds, batch_size, shuffle=True)

model = nn.Linear(10, 1)
# Define Loss func
loss_fn = F.mse_loss
# Optimizer
opt = torch.optim.SGD(model.parameters(), lr = 1e-1)


num_epochs = 100
model.train()
for epoch in range(num_epochs):
    # Train with batches of data
    for xb, yb in train_dl:

        # 1. Generate predictions
        pred = model(xb.float())

        # 2. Calculate loss
        yb = yb.view(yb.size(0), -1)
        loss = loss_fn(pred, yb.float())
    
        # 3. Compute gradients
        loss.backward()

        # 4. Update parameters using gradients
        opt.step()

        # 5. Reset the gradients to zero
        opt.zero_grad()

    if (epoch+1) % 10 == 0:
        print('Epoch [{}/{}], Loss: {:.4f}'.format(epoch + 1, num_epochs, loss.item()))

I have run the code you gave and I get this warning:

    p.py:38: UserWarning: Using a target size (torch.Size([50])) that is different to the input size (torch.Size([50, 1])). This will likely lead to incorrect results due to broadcasting. Please ensure they have the same size.

Your problem is due to the difference in dimensions between pred and yb.
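To see what the warning means, here is a minimal sketch (my own illustration, using the [50] and [50, 1] shapes from the message): when the shapes differ, F.mse_loss broadcasts the two tensors to [50, 50], so every prediction is compared against every target instead of its own.

import torch
import torch.nn.functional as F

pred = torch.randn(50, 1)    # model output, shape [50, 1]
target = torch.randn(50)     # labels, shape [50]

# Mismatched shapes broadcast to [50, 50]: the mean is taken over 2500
# pairings instead of 50, which gives a meaningless loss value.
bad = F.mse_loss(pred, target)                  # triggers the UserWarning above
good = F.mse_loss(pred, target.view(-1, 1))     # shapes match: [50, 1] vs [50, 1]
print(bad.item(), good.item())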

The code below shows how to resolve it:

import torch
import numpy as np
from torch.utils.data import TensorDataset
import torch.nn as nn
from torch.utils.data import DataLoader
import torch.nn.functional as F

inputs = np.random.rand(50, 10)
targets = np.random.randint(0, 2, 50)

# Tensors
inputs = torch.from_numpy(inputs)
targets = torch.from_numpy(targets)
train_ds = TensorDataset(inputs, targets.squeeze())
batch_size = 100
train_dl = DataLoader(train_ds, batch_size, shuffle=True)

model = nn.Linear(10, 1)
# Define Loss func
loss_fn = F.mse_loss
# Optimizer
opt = torch.optim.SGD(model.parameters(), lr = 1e-1)


num_epochs = 100
model.train()
for epoch in range(num_epochs):
    # Train with batches of data
    for xb, yb in train_dl:

        # 1. Generate predictions
        pred = model(xb.float())

        # 2. Calculate loss
        # Reshape yb from [batch] to [batch, 1] so it matches pred
        yb = yb.view(yb.size(0), -1)
        loss = loss_fn(pred, yb.float())

        # 3. Compute gradients
        loss.backward()

        # 4. Update parameters using gradients
        opt.step()

        # 5. Reset the gradients to zero
        opt.zero_grad()

    if (epoch+1) % 10 == 0:
        print('Epoch [{}/{}], Loss: {:.4f}'.format(epoch + 1, num_epochs, loss.item()))

This discussion explains it in more detail: https://discuss.pytorch.org/t/target-size-torch-size-10-must-be-the-same-as-input-size-torch-size-2/72354/6
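As an alternative (just a sketch of the same fix, reusing the view(-1, 1) reshape that is already in your code), you can reshape the targets once before building the dataset, so the per-batch reshape inside the loop is no longer needed:

targets = targets.view(-1, 1)                            # shape [N, 1] once, up front
train_ds = TensorDataset(inputs.float(), targets.float())
train_dl = DataLoader(train_ds, batch_size, shuffle=True)
# inside the loop, yb already has shape [batch, 1], matching pred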

My previous comment was invalid, so I deleted it. Your sample code works as intended. You are trying to predict a random variable from an independent random variable. There is no pattern, and that's why it doesn't converge.
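To illustrate that last point, here is a minimal sketch (my own toy data, not the Kaggle set): when the targets actually depend on the inputs, the same nn.Linear model with a smaller learning rate converges as expected.

import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
X = torch.randn(200, 10)
true_w = torch.randn(10, 1)
y = X @ true_w + 0.1 * torch.randn(200, 1)   # targets are a linear function of the inputs plus noise

model = nn.Linear(10, 1)
opt = torch.optim.SGD(model.parameters(), lr=1e-2)

for epoch in range(100):
    pred = model(X)                  # shape [200, 1]
    loss = F.mse_loss(pred, y)       # shapes match, no broadcasting warning
    loss.backward()
    opt.step()
    opt.zero_grad()
    if (epoch + 1) % 20 == 0:
        print('Epoch [{}/{}], Loss: {:.4f}'.format(epoch + 1, 100, loss.item()))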
