batch_fc

paddle.fluid.contrib.layers.nn.batch_fc ( input, param_size, param_attr, bias_size, bias_attr, act=None ) [source]

Batch FC layer. This op computes a batched fully connected (BatchFC) operation. It is similar to the matmul op, except that a bias is added and an activation (e.g. ReLU) can be applied to the result.

Note: this op currently supports GPU devices only. It lives in contrib, which means it is not part of the public API.

:param input: Tensor with data type float32 or float64.
:param param_size: The size of the weight w.
:param param_attr: Attribute initializer of w.
:param bias_size: The size of the bias.
:param bias_attr: Attribute initializer of the bias.
:param act: Activation to be applied to the output of this layer.

Returns

A Tensor with the same data type as input’s.

Return type

Variable

Examples


.. code-block:: python

   import paddle.fluid as fluid

   input = fluid.data(name="input", shape=[16, 2, 3], dtype="float32")
   out = fluid.contrib.layers.batch_fc(
       input=input,
       param_size=[16, 3, 10],
       param_attr=fluid.ParamAttr(
           learning_rate=1.0,
           name="w_0",
           initializer=fluid.initializer.Xavier(uniform=False)),
       bias_size=[16, 10],
       bias_attr=fluid.ParamAttr(
           learning_rate=1.0,
           name="b_0",
           initializer=fluid.initializer.Xavier(uniform=False)),
       act="relu")
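
The semantics can be sketched in plain NumPy. This is a hedged reference implementation, assuming (based on the shapes above: input ``[16, 2, 3]``, ``param_size=[16, 3, 10]``, ``bias_size=[16, 10]``) that batch_fc performs a per-batch matmul plus a broadcast bias, followed by the activation; ``batch_fc_ref`` is a hypothetical helper name, not part of the Paddle API.

.. code-block:: python

   import numpy as np

   def batch_fc_ref(x, w, b, act=None):
       # x: [batch, M, K], w: [batch, K, N], b: [batch, N]
       # Assumed semantics: out[i] = act(x[i] @ w[i] + b[i]) for each batch i.
       out = np.matmul(x, w) + b[:, np.newaxis, :]
       if act == "relu":
           out = np.maximum(out, 0.0)
       return out

   x = np.random.rand(16, 2, 3).astype("float32")
   w = np.random.rand(16, 3, 10).astype("float32")
   b = np.random.rand(16, 10).astype("float32")
   out = batch_fc_ref(x, w, b, act="relu")
   print(out.shape)  # (16, 2, 10)

Note that, unlike a plain FC layer, each batch slot here carries its own weight slice ``w[i]`` and bias ``b[i]``.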