Activation Function

Activation functions introduce non-linearity into a neural network; without them, a stack of linear layers would collapse into a single linear transformation.
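A quick sketch of why this matters: composing purely linear functions always yields another linear function, while inserting even a simple activation such as ReLU breaks that collapse. The snippet below is plain Python for illustration only, not PaddlePaddle code:

```python
def lin1(x):
    return 2.0 * x + 1.0   # first "layer": a linear map

def lin2(x):
    return -0.5 * x + 3.0  # second "layer": another linear map

def relu(x):
    return max(0.0, x)     # ReLU activation

# Without an activation, lin2(lin1(x)) simplifies to -1.0*x + 2.5: still linear.
linear_stack = lambda x: lin2(lin1(x))

# With ReLU in between, the composition is no longer a single linear map.
nonlinear_stack = lambda x: lin2(relu(lin1(x)))

print(linear_stack(-1.0), nonlinear_stack(-1.0))  # 3.5 3.0 -- the two differ
```

For non-negative outputs of lin1 the two stacks agree, but wherever lin1 goes negative the ReLU version departs from the linear one, so the overall function is no longer linear.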

PaddlePaddle supports most of the commonly used activation functions, including ReLU, Sigmoid, Tanh, Softmax, LeakyReLU, and GELU.
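Several of these activations are simple element-wise formulas. As a reference point, here is a plain-Python sketch of ReLU, Sigmoid, and Tanh (illustrative only, not PaddlePaddle's implementation, which operates on tensors):

```python
import math

def relu(x):
    return max(0.0, x)                 # ReLU: max(0, x)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))  # Sigmoid: 1 / (1 + e^-x)

def tanh(x):
    return math.tanh(x)                # Tanh, via the standard library

print(relu(-2.0), sigmoid(0.0), tanh(0.0))  # 0.0 0.5 0.0
```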

Activation functions are applied in PaddlePaddle as follows:

PaddlePaddle provides a dedicated interface for each activation function in the paddle.nn.functional namespace, so users can invoke them explicitly as needed. Below is an example that applies the ReLU activation function after a convolution operation:

import paddle

conv2d = paddle.nn.functional.conv2d(x, weight, stride=1, padding=1)  # Convolution on input x with kernel weight
relu1 = paddle.nn.functional.relu(conv2d)  # Apply the ReLU activation function