swish

paddle.fluid.layers.nn.swish(x, beta=1.0, name=None) [source]
Alias: paddle.nn.functional.swish, paddle.nn.functional.activation.swish

Old API: paddle.fluid.layers.swish

Elementwise swish activation function. See the paper Searching for Activation Functions for more details.

Equation:

\[out = \frac{x}{1 + e^{-\beta x}}\]
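The equation can be checked numerically without Paddle. The following is a minimal NumPy sketch of the same formula (the function name swish_ref is ours, not part of any API):

```python
import numpy as np

def swish_ref(x, beta=1.0):
    # Reference swish: x * sigmoid(beta * x) = x / (1 + exp(-beta * x))
    return x / (1.0 + np.exp(-beta * x))

x = np.array([-1.0, 0.0, 1.0], dtype="float32")
print(swish_ref(x, beta=2.0))
# approximately [-0.1192, 0., 0.8808]
```

For large positive x the output approaches x itself, and swish(0) is exactly 0, which is a quick sanity check on any implementation.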
Parameters
  • x (Variable) – Tensor or LoDTensor, dtype: float32 or float64, the input of swish activation.

  • beta (float) – Constant beta of the swish operator. Default: 1.0.

  • name (str, optional) – The default value is None. Normally there is no need for users to set this property. For more information, please refer to Name.

Returns

Output of the swish activation, a Tensor or LoDTensor with the same dtype and shape as the input x.

Return type

Variable

Examples

# declarative mode
import numpy as np
from paddle import fluid

x = fluid.data(name="x", shape=(-1, 3), dtype="float32")
y = fluid.layers.swish(x)

place = fluid.CPUPlace()
exe = fluid.Executor(place)
start = fluid.default_startup_program()
main = fluid.default_main_program()

data = np.random.randn(2, 3).astype("float32")
exe.run(start)
y_np, = exe.run(main, feed={"x": data}, fetch_list=[y])

data
# array([[-1.1239197 ,  1.3391294 ,  0.03921051],
#        [ 1.1970421 ,  0.02440812,  1.2055548 ]], dtype=float32)
y_np
# array([[-0.2756806 ,  1.0610548 ,  0.01998957],
#        [ 0.9193261 ,  0.01235299,  0.9276883 ]], dtype=float32)

# imperative mode
import numpy as np
from paddle import fluid
import paddle.fluid.dygraph as dg

data = np.random.randn(2, 3).astype("float32")
place = fluid.CPUPlace()
with dg.guard(place):
    x = dg.to_variable(data)
    y = fluid.layers.swish(x)
    y_np = y.numpy()
data
# array([[-0.0816701 ,  1.1603649 , -0.88325626],
#        [ 0.7522361 ,  1.0978601 ,  0.12987892]], dtype=float32)
y_np
# array([[-0.03916847,  0.8835007 , -0.25835553],
#        [ 0.51126915,  0.82324016,  0.06915068]], dtype=float32)
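Because the imperative example uses the default beta=1.0, its printed values can be reproduced from the equation alone. The following NumPy cross-check is a sketch that is independent of Paddle and reuses the input printed above:

```python
import numpy as np

# the input array printed by the imperative example above
data = np.array([[-0.0816701 ,  1.1603649 , -0.88325626],
                 [ 0.7522361 ,  1.0978601 ,  0.12987892]], dtype="float32")

# swish with the default beta = 1.0: x * sigmoid(x)
ref = data / (1.0 + np.exp(-data))
print(ref)
# matches the y_np values printed above to float32 precision
```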