cross_entropy

paddle.fluid.layers.loss.cross_entropy(input, label, soft_label=False, ignore_index=-100) [source]
Alias: paddle.nn.functional.cross_entropy, paddle.nn.functional.loss.cross_entropy

Old API: paddle.fluid.layers.cross_entropy

This operator computes the cross entropy between input and label. It supports both hard-label and soft-label cross entropy computation.

  1. Hard-label cross entropy: if soft_label=False, label[i1,i2,...,ik] is the hard label of each sample.

    output[i1,i2,...,ik] = -log(input[i1,i2,...,ik,j]), where label[i1,i2,...,ik] = j and j != ignore_index
  2. Soft-label cross entropy: if soft_label=True, label[i1,i2,...,ik,j] is the soft label of each sample corresponding to the j-th class.

    output[i1,i2,...,ik] = -∑_j label[i1,i2,...,ik,j] * log(input[i1,i2,...,ik,j])
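The two formulas above can be sketched with NumPy to make the relationship concrete; this is an illustrative re-computation, not the Paddle operator itself, and the variable names are ours:

```python
import numpy as np

# Hard-label case: take -log of the probability at each sample's label index.
probs = np.array([[0.2, 0.5, 0.3],
                  [0.1, 0.1, 0.8]])          # "input", shape [N, D]
hard_label = np.array([1, 2])                # labels in [0, D)
hard_loss = -np.log(probs[np.arange(2), hard_label])

# Soft-label case: -sum over classes of label * log(prob), per sample.
soft_label = np.array([[0.0, 1.0, 0.0],
                       [0.0, 0.0, 1.0]])     # each row sums to 1
soft_loss = -(soft_label * np.log(probs)).sum(axis=1)

# With one-hot soft labels, the two definitions agree.
assert np.allclose(hard_loss, soft_loss)
```

When the soft label happens to be one-hot, the soft-label formula reduces to the hard-label one, which is why the two cases share a single operator.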
Parameters
  • input (Variable) – a multidimensional Tensor with shape [N1,N2,...,Nk,D], where the last dimension D is the class number. The data type should be float32 or float64.

  • label (Variable) – label value corresponding to input. If soft_label=False, the shape of label should be [N1,N2,...,Nk] or [N1,N2,...,Nk,1], its data type should be int64, and its values must lie in [0, D). If soft_label=True, the shape and data type of label should be the same as input, and the soft label values of each sample should sum to 1.

  • soft_label (bool) – indicates whether label is a soft label. If True, label is soft; otherwise it is hard. Default: False.

  • ignore_index (int) – specifies an ignorable label value. Samples with this label value are omitted from the computation. If it is a negative integer outside [0, D), no label is ignored. Only valid when soft_label=False. Default: -100.
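The masking behavior of ignore_index can be sketched outside Paddle; the helper below is a hypothetical re-implementation for illustration, not the actual operator:

```python
import numpy as np

IGNORE_INDEX = -100  # matches the default in the signature above

def hard_cross_entropy(probs, labels, ignore_index=IGNORE_INDEX):
    """Illustrative sketch: entries whose label equals ignore_index
    contribute zero loss instead of -log(prob)."""
    losses = np.zeros(len(labels), dtype=probs.dtype)
    keep = labels != ignore_index
    idx = np.arange(len(labels))[keep]
    losses[idx] = -np.log(probs[idx, labels[keep]])
    return losses

probs = np.array([[0.25, 0.75],
                  [0.90, 0.10]])
labels = np.array([1, IGNORE_INDEX])   # second sample is ignored
loss = hard_cross_entropy(probs, labels)
```

Here the second sample's loss is zero because its label matches ignore_index, while the first sample gets the usual -log(0.75).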

Returns

A Variable holding a Tensor that represents the cross entropy, with the same data type as input. If soft_label=False, the shape of the output is the same as label. If soft_label=True, the shape of the output is [N1,N2,...,Nk,1].

Examples

import paddle.fluid as fluid

class_num = 7
# Input features with shape [batch, 3, 10].
x = fluid.data(name='x', shape=[None, 3, 10], dtype='float32')
# Hard labels, one int64 class index per sample.
label = fluid.data(name='label', shape=[None, 1], dtype='int64')
# A softmax-activated fc layer produces the class probabilities.
predict = fluid.layers.fc(input=x, size=class_num, act='softmax')
cost = fluid.layers.cross_entropy(input=predict, label=label)