PyTorch: Loss Functions

Posted by -柚子皮-



Loss Functions

nn.L1Loss

Creates a criterion that measures the mean absolute error (MAE) between each element in the input x and target y.
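A minimal sketch of nn.L1Loss on hand-made tensors (the values are illustrative, not from any dataset):

```python
import torch
import torch.nn as nn

loss_fn = nn.L1Loss()  # mean absolute error; reduction='mean' by default
pred = torch.tensor([1.0, 2.0, 3.0])
target = torch.tensor([1.5, 2.0, 5.0])

loss = loss_fn(pred, target)
# mean(|1-1.5|, |2-2|, |3-5|) = mean(0.5, 0.0, 2.0) ≈ 0.8333
```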

nn.MSELoss

Creates a criterion that measures the mean squared error (squared L2 norm) between each element in the input x and target y.
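The same pattern with nn.MSELoss, again on illustrative values:

```python
import torch
import torch.nn as nn

loss_fn = nn.MSELoss()  # mean of squared element-wise differences
pred = torch.tensor([1.0, 2.0])
target = torch.tensor([3.0, 2.0])

loss = loss_fn(pred, target)
# mean((1-3)^2, (2-2)^2) = mean(4.0, 0.0) = 2.0
```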

nn.CrossEntropyLoss

This criterion combines nn.LogSoftmax() and nn.NLLLoss() in one single class.
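The "combines LogSoftmax and NLLLoss" claim can be checked directly; the logits below are made up for the demonstration:

```python
import torch
import torch.nn as nn

logits = torch.tensor([[2.0, 0.5, -1.0],
                       [0.1, 1.2, 0.3]])   # raw scores, shape (N, C)
targets = torch.tensor([0, 1])             # class indices, shape (N,)

ce = nn.CrossEntropyLoss()(logits, targets)
# identical to applying LogSoftmax along the class dim, then NLLLoss
nll = nn.NLLLoss()(nn.LogSoftmax(dim=1)(logits), targets)
```

Note that CrossEntropyLoss takes raw logits, so the model should not apply a softmax itself.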

nn.CTCLoss

The Connectionist Temporal Classification loss.

nn.NLLLoss

The negative log likelihood loss.

nn.PoissonNLLLoss

Negative log likelihood loss with Poisson distribution of target.

nn.KLDivLoss

The Kullback-Leibler divergence loss.
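A common pitfall worth illustrating: nn.KLDivLoss takes the input as log-probabilities and the target as probabilities. A small sketch with two hand-made distributions:

```python
import torch
import torch.nn as nn

p = torch.tensor([[0.4, 0.6]])  # target distribution (probabilities)
q = torch.tensor([[0.5, 0.5]])  # model distribution (probabilities)

# the *input* must be log-probabilities, hence q.log()
loss = nn.KLDivLoss(reduction='batchmean')(q.log(), p)
# equals KL(p || q) = sum_i p_i * (log p_i - log q_i), averaged over the batch
```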

nn.BCELoss

Creates a criterion that measures the Binary Cross Entropy between the target and the output:
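A minimal sketch of nn.BCELoss; the inputs must already be probabilities in [0, 1]:

```python
import torch
import torch.nn as nn

probs = torch.tensor([0.9, 0.2, 0.6])    # e.g. outputs of a sigmoid
targets = torch.tensor([1.0, 0.0, 1.0])  # binary labels as floats

loss = nn.BCELoss()(probs, targets)
# -mean(log(0.9), log(1 - 0.2), log(0.6))
```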

nn.BCEWithLogitsLoss

This loss combines a Sigmoid layer and the BCELoss in one single class.
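The equivalence with sigmoid-then-BCELoss can be verified on illustrative logits; the fused version is preferred for numerical stability:

```python
import torch
import torch.nn as nn

logits = torch.tensor([1.5, -0.3])   # raw scores, no sigmoid applied
targets = torch.tensor([1.0, 0.0])

a = nn.BCEWithLogitsLoss()(logits, targets)
# mathematically the same, but less numerically stable for extreme logits:
b = nn.BCELoss()(torch.sigmoid(logits), targets)
```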

nn.MarginRankingLoss

Creates a criterion that measures the loss given inputs x1, x2, two 1D mini-batch Tensors, and a label 1D mini-batch tensor y (containing 1 or -1).

nn.HingeEmbeddingLoss

Measures the loss given an input tensor x and a labels tensor y (containing 1 or -1).

nn.MultiLabelMarginLoss

Creates a criterion that optimizes a multi-class multi-classification hinge loss (margin-based loss) between input x (a 2D mini-batch Tensor) and output y (which is a 2D Tensor of target class indices).

nn.SmoothL1Loss

Creates a criterion that uses a squared term if the absolute element-wise error falls below beta and an L1 term otherwise.
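The two regimes of nn.SmoothL1Loss can be seen by picking one error below beta and one above (values are illustrative):

```python
import torch
import torch.nn as nn

loss_fn = nn.SmoothL1Loss(beta=1.0)
pred = torch.tensor([0.0, 0.0])
target = torch.tensor([0.5, 3.0])

loss = loss_fn(pred, target)
# |err| < beta:  0.5 * err^2 / beta = 0.5 * 0.25 = 0.125   (quadratic region)
# |err| >= beta: |err| - 0.5 * beta = 3.0 - 0.5 = 2.5      (linear region)
# mean = (0.125 + 2.5) / 2 = 1.3125
```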

nn.SoftMarginLoss

Creates a criterion that optimizes a two-class classification logistic loss between input tensor x and target tensor y (containing 1 or -1).

nn.MultiLabelSoftMarginLoss

Creates a criterion that optimizes a multi-label one-versus-all loss based on max-entropy, between input x and target y of size (N, C).

nn.CosineEmbeddingLoss

Creates a criterion that measures the loss given input tensors x1, x2 and a Tensor label y with values 1 or -1.

nn.MultiMarginLoss

Creates a criterion that optimizes a multi-class classification hinge loss (margin-based loss) between input x (a 2D mini-batch Tensor) and output y (which is a 1D tensor of target class indices, 0 ≤ y ≤ x.size(1)−1):

nn.TripletMarginLoss

Creates a criterion that measures the triplet loss given input tensors x1, x2, x3 and a margin with a value greater than 0.
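A sketch of nn.TripletMarginLoss with one anchor/positive/negative triplet (hand-made 2D points); since the positive is already much closer than the negative, the hinge is inactive and the loss is zero:

```python
import torch
import torch.nn as nn

anchor   = torch.tensor([[1.0, 0.0]])
positive = torch.tensor([[1.1, 0.0]])   # close to the anchor
negative = torch.tensor([[-1.0, 0.0]])  # far from the anchor

loss_fn = nn.TripletMarginLoss(margin=1.0)  # default distance is the L2 norm
# max(d(a, p) - d(a, n) + margin, 0); here d(a,p) ≈ 0.1, d(a,n) ≈ 2.0
loss = loss_fn(anchor, positive, negative)
```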

nn.TripletMarginWithDistanceLoss

Creates a criterion that measures the triplet loss given input tensors a, p, and n (representing anchor, positive, and negative examples, respectively), and a nonnegative, real-valued function ("distance function") used to compute the relationship between the anchor and positive example ("positive distance") and the anchor and negative example ("negative distance").

Parameters

reduction (string, optional) – Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied; 'mean': the weighted mean of the output is taken; 'sum': the output will be summed. Note: size_average and reduce are in the process of being deprecated; in the meantime, specifying either of those two args will override reduction. Default: 'mean'
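The three reduction modes relate to each other in the obvious way, illustrated here with MSELoss on made-up values:

```python
import torch
import torch.nn as nn

pred = torch.tensor([1.0, 2.0, 3.0])
target = torch.tensor([1.0, 4.0, 6.0])

per_elem  = nn.MSELoss(reduction='none')(pred, target)  # tensor([0., 4., 9.])
mean_loss = nn.MSELoss(reduction='mean')(pred, target)  # per_elem.mean()
sum_loss  = nn.MSELoss(reduction='sum')(pred, target)   # per_elem.sum()
```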

[CROSSENTROPYLOSS]

[pytorch loss function summary]

Example:

import torch.nn.functional as F

# `batch` is assumed to be one item yielded by a DataLoader (a dict with a
# "label" key), and `outputs` the model's sigmoid outputs in [0, 1].
labels = batch["label"].float()
predictions = outputs.squeeze().contiguous()

# F.binary_cross_entropy expects probabilities, not raw logits;
# for logits use F.binary_cross_entropy_with_logits instead.
loss = F.binary_cross_entropy(predictions, labels, reduction='mean')


ref: [https://pytorch.org/docs/stable/nn.html#loss-functions]

 
