prune_model

paddle.incubate.asp.prune_model(model, n=2, m=4, mask_algo='mask_1d', with_mask=True) [source]

Prune the parameters of supported layers in model using the mask generation function specified by mask_algo. This function supports both training and inference, controlled by with_mask. If with_mask is True, it also prunes the ASP mask Variables related to the parameters; otherwise it only prunes the parameters.

Note: (Static graph mode) If this function is called with with_mask=True, OptimizerWithSparsityGuarantee.minimize and initialization (exe.run(startup_program)) must be called beforehand so that the mask Variables can be obtained successfully. Typically set with_mask to True for training (after OptimizerWithSparsityGuarantee.minimize has been called) and to False for inference only. To obtain an OptimizerWithSparsityGuarantee, please see paddle.incubate.asp.decorate().

Parameters
  • model (Program|nn.Layer) – A Program containing the model definition and its parameters, or an object of paddle.nn.Layer.

  • n (int, optional) – n of n:m sparse pattern. Default is 2.

  • m (int, optional) – m of n:m sparse pattern. Default is 4. A toy illustration of the n:m pattern follows this parameter list.

  • mask_algo (string, optional) – The name of the function used to generate the sparse mask. Default is mask_1d. The valid inputs are 'mask_1d', 'mask_2d_greedy' and 'mask_2d_best'.

  • with_mask (bool, optional) – Whether to prune the mask Variables related to the parameters as well. True means the mask Variables are pruned too; False means only the parameters are pruned. Default is True.
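
A toy, hand-written illustration of the default 2:4 pattern (for intuition only; this is not ASP's internal implementation): in every group of m=4 consecutive values, only the n=2 largest-magnitude values are kept and the rest are zeroed.

    import numpy as np

    # Toy illustration of the 2:4 pattern (not ASP's internal code): keep the
    # n=2 largest-magnitude values in every group of m=4 and zero the rest.
    n, m = 2, 4
    row = np.array([0.9, -0.1, 0.4, 0.05, -0.7, 0.2, 0.0, 0.6])
    mask = np.zeros_like(row)
    for g in range(len(row) // m):
        group = row[g * m:(g + 1) * m]
        keep = np.argsort(np.abs(group))[-n:]  # indices of the n largest magnitudes
        mask[g * m + keep] = 1.0
    # Keeps 0.9, 0.4, -0.7 and 0.6; all other entries become 0.
    print(row * mask)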

Returns

A dictionary whose keys are parameter names (string) and whose values are their corresponding mask Variables.

Return type

dictionary
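
For illustration, a hedged sketch of how this returned mapping might be inspected in dynamic graph mode (it assumes the same decorate-then-prune workflow as the examples below; the parameter names and the set of pruned layers depend on the model):

    import paddle

    layer = paddle.nn.Linear(64, 32)
    optimizer = paddle.optimizer.SGD(
        learning_rate=0.01, parameters=layer.parameters())
    # decorate() creates the ASP mask variables in dynamic graph mode.
    optimizer = paddle.incubate.asp.decorate(optimizer)

    masks = paddle.incubate.asp.prune_model(layer, n=2, m=4)
    for param_name, mask in masks.items():
        # With the default 2:4 pattern, roughly half of the entries are kept.
        density = float(paddle.mean(paddle.cast(mask, 'float32')))
        print(param_name, density)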

Examples

  1. Usage of Dynamic Graph

    import paddle
    import numpy as np
    
    class MyLayer(paddle.nn.Layer):
        def __init__(self):
            super().__init__()
            self.conv1 = paddle.nn.Conv2D(
                in_channels=3, out_channels=4, kernel_size=3, padding=2)
            self.linear1 = paddle.nn.Linear(4624, 32)
            self.linear2 = paddle.nn.Linear(32, 32)
            self.linear3 = paddle.nn.Linear(32, 10)
    
        def forward(self, img):
            hidden = self.conv1(img)
            hidden = paddle.flatten(hidden, start_axis=1)
            hidden = self.linear1(hidden)
            hidden = self.linear2(hidden)
            prediction = self.linear3(hidden)
            return prediction
    
    my_layer = MyLayer()
    loss_fn = paddle.nn.MSELoss(reduction='mean')
    
    optimizer = paddle.optimizer.SGD(
        learning_rate=0.01, parameters=my_layer.parameters())
    
    # Calling paddle.incubate.asp.decorate() to wrap step() in optimizer, which
    # will apply necessary masking operations for ASP workflow.
    # In dynamic graph mode, ASP would create related mask variables during decoration.
    optimizer = paddle.incubate.asp.decorate(optimizer)
    
    # Must call paddle.incubate.asp.decorate() first before calling paddle.incubate.asp.prune_model()
    paddle.incubate.asp.prune_model(my_layer, mask_algo='mask_2d_best')
    
    for i in range(10):
        imgs = paddle.to_tensor(
            np.random.randn(64, 3, 32, 32),
            dtype='float32', stop_gradient=False)
        labels = paddle.to_tensor(
            np.random.randint(10, size=(64, 1)),
            dtype='float32', stop_gradient=False)
        output = my_layer(imgs)
        loss = loss_fn(output, labels)
        loss.backward()
        optimizer.step()
        optimizer.clear_grad()
    
  2. Usage of Static Graph

    import paddle
    import numpy as np
    
    paddle.enable_static()
    
    class MyLayer(paddle.nn.Layer):
        def __init__(self):
            super().__init__()
            self.conv1 = paddle.nn.Conv2D(
                in_channels=3, out_channels=4, kernel_size=3, padding=2)
            self.linear1 = paddle.nn.Linear(4624, 32)
            self.linear2 = paddle.nn.Linear(32, 32)
            self.linear3 = paddle.nn.Linear(32, 10)
    
        def forward(self, img):
            hidden = self.conv1(img)
            hidden = paddle.flatten(hidden, start_axis=1)
            hidden = self.linear1(hidden)
            hidden = self.linear2(hidden)
            prediction = self.linear3(hidden)
            return prediction
    
    main_program = paddle.static.Program()
    startup_program = paddle.static.Program()
    
    with paddle.static.program_guard(main_program, startup_program):
        input_data = paddle.static.data(name='data', shape=[None, 3, 32, 32])
        label = paddle.static.data(name='label', shape=[None, 1])
        my_layer = MyLayer()
        prob = my_layer(input_data)
        loss = paddle.mean(paddle.nn.functional.square_error_cost(prob, label))
    
        optimizer = paddle.optimizer.SGD(learning_rate=0.1)
        # Calling paddle.incubate.asp.decorate() to wrap minimize() in optimizer, which
        # will insert necessary masking operations for ASP workflow.
        # In static graph mode, ASP creates related mask variables
        # during minimize().
        optimizer = paddle.incubate.asp.decorate(optimizer)
        optimizer.minimize(loss, startup_program)
    
    device = paddle.device.get_device()
    place = paddle.set_device(device)
    
    exe = paddle.static.Executor(place)
    exe.run(startup_program)
    
    # Must call exe.run(startup_program) before calling paddle.incubate.asp.prune_model()
    paddle.incubate.asp.prune_model(my_layer, mask_algo='mask_2d_best')
    # It is also acceptable to call
    # paddle.incubate.asp.prune_model(main_program, mask_algo='mask_2d_best')
    
    for i in range(10):
        imgs = np.random.randn(64, 3, 32, 32).astype('float32')
        labels = np.random.randint(10, size=(64, 1)).astype('float32')
        exe.run(main_program, feed={'data':imgs, 'label':labels})
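
  3. Inference-only pruning (a hedged sketch, not part of the original examples)

    When with_mask is False, prune_model() only zeroes out the parameters of supported layers, so it can be applied to an already-trained model for inference without decorating an optimizer first. The sketch below assumes dynamic graph mode and a hypothetical checkpoint path.

    import paddle

    # Hedged inference-only sketch: with_mask=False prunes parameters only,
    # so no prior paddle.incubate.asp.decorate() call is required.
    layer = paddle.nn.Linear(64, 32)
    # layer.set_state_dict(paddle.load('trained.pdparams'))  # hypothetical checkpoint
    paddle.incubate.asp.prune_model(layer, n=2, m=4, with_mask=False)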