PixelShuffle

class paddle.nn.PixelShuffle(upscale_factor, data_format='NCHW', name=None) [source]

PixelShuffle Layer

This operator rearranges elements in a tensor of shape [N, C, H, W] to a tensor of shape [N, C/upscale_factor**2, H*upscale_factor, W*upscale_factor], or from shape [N, H, W, C] to [N, H*upscale_factor, W*upscale_factor, C/upscale_factor**2]. This is useful for implementing efficient sub-pixel convolution with a stride of 1/upscale_factor. For more details, please refer to the paper Real-Time Single Image and Video Super-Resolution Using an Efficient Sub-Pixel Convolutional Neural Network by Shi et al. (2016).
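
The rearrangement itself amounts to a reshape followed by a transpose. The NumPy reference below is only an illustrative sketch of the NCHW case (pixel_shuffle_ref is a hypothetical helper, not part of the API): a channel block of size upscale_factor**2 is split off and interleaved into the spatial dimensions.

import numpy as np

def pixel_shuffle_ref(x, r):
    # x: [N, C, H, W], where C must be divisible by r * r.
    n, c, h, w = x.shape
    c_out = c // (r * r)
    x = x.reshape(n, c_out, r, r, h, w)       # split C into (C_out, r, r)
    x = x.transpose(0, 1, 4, 2, 5, 3)         # -> [N, C_out, H, r, W, r]
    return x.reshape(n, c_out, h * r, w * r)  # merge r into H and W

x = np.random.randn(2, 9, 4, 4).astype(np.float32)
print(pixel_shuffle_ref(x, 3).shape)
# (2, 1, 12, 12)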

Parameters
  • upscale_factor (int) – Factor by which to increase the spatial resolution.

  • data_format (str, optional) – The data format of the input and output data. An optional string from: “NCHW”, “NHWC”. The default is “NCHW”. When it is “NCHW”, the data is stored in the order of [batch_size, input_channels, input_height, input_width]; when it is “NHWC”, the order is [batch_size, input_height, input_width, input_channels]. A short NHWC sketch follows this parameter list.

  • name (str, optional) – Name for the operation (optional, default is None). For more information, please refer to Name.
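
As referenced in the data_format description above, a minimal sketch of the NHWC layout (illustrative shapes; the channel axis is last and must still be divisible by upscale_factor**2):

import paddle
import paddle.nn as nn

x = paddle.randn([2, 4, 4, 9], dtype='float32')         # NHWC: channels last
pixel_shuffle = nn.PixelShuffle(3, data_format='NHWC')
out = pixel_shuffle(x)
print(out.shape)
# [2, 12, 12, 1]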

Shape:
  • x: 4-D tensor with shape: (N, C, H, W) or (N, H, W, C).

  • out: 4-D tensor with shape: (N, C/upscale_factor**2, H*upscale_factor, W*upscale_factor) or (N, H*upscale_factor, W*upscale_factor, C/upscale_factor**2).

Examples

import paddle
import paddle.nn as nn
import numpy as np

# 9 input channels = upscale_factor**2 * 1, so the output has 1 channel.
x = np.random.randn(2, 9, 4, 4).astype(np.float32)
x_var = paddle.to_tensor(x)
pixel_shuffle = nn.PixelShuffle(3)
out_var = pixel_shuffle(x_var)
out = out_var.numpy()
print(out.shape)
# (2, 1, 12, 12)
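
For one-off use without constructing a layer, recent Paddle releases also expose a functional counterpart, paddle.nn.functional.pixel_shuffle, which takes the same arguments; a brief sketch:

import paddle
import paddle.nn.functional as F

x = paddle.randn([2, 9, 4, 4], dtype='float32')
out = F.pixel_shuffle(x, 3)
print(out.shape)
# [2, 1, 12, 12]
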
forward(x)

Defines the computation performed at every call; it should be overridden by all subclasses. A short usage sketch follows the parameter list below.

Parameters
  • *inputs (tuple) – unpacked tuple arguments

  • **kwargs (dict) – unpacked dict arguments
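
As referenced above, a minimal sketch of how forward is typically invoked: calling the layer instance dispatches to forward, so there is usually no need to call it directly.

import paddle
import paddle.nn as nn

pixel_shuffle = nn.PixelShuffle(3)
x = paddle.randn([2, 9, 4, 4])
out = pixel_shuffle(x)  # calling the layer dispatches to forward(x)
print(out.shape)
# [2, 1, 12, 12]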

extra_repr()

Extra representation of this layer. You can implement a custom extra_repr for your own layer.
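
A minimal sketch of where the extra representation shows up: it is included when the layer is printed (the exact string may vary across Paddle versions).

import paddle.nn as nn

pixel_shuffle = nn.PixelShuffle(3)
print(pixel_shuffle)
# Prints something like: PixelShuffle(upscale_factor=3, data_format=NCHW)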