
Incorrect pruning when using AdaptiveAvgPool2d into Linear layers. #511

@joesumargo

Description


Hi, I am running into a consistent issue where, when an AdaptiveAvgPool2d layer feeds a Linear layer through a Flatten, the pruner fails to shrink the Linear layer to the correct size. More precisely, before pruning I have an AdaptiveAvgPool2d layer that fixes the output size to (5, 5) with a channel count of 256, giving a corresponding Linear in_features of 6400. After pruning, the channel count drops to 200, but the Linear layer's in_features only drops to 6344, which of course causes a shape-mismatch error. Here are some extra details:

BEFORE:
(14): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(15): AdaptiveAvgPool2d(output_size=(5, 5))
(16): Flatten(start_dim=1, end_dim=-1)
(17): Linear(in_features=6400, out_features=512, bias=True)
(18): ReLU()

AFTER:
(14): BatchNorm2d(200, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(15): AdaptiveAvgPool2d(output_size=(5, 5))
(16): Flatten(start_dim=1, end_dim=-1)
(17): Linear(in_features=6344, out_features=512, bias=True) <--- should be 5000
(18): ReLU()

From my understanding, this stems from a wrong len(idxs):
[24] prune_out_channels on _ElementWiseOp_308(AdaptiveAvgPool2DBackward0) => prune_out_channels on _Reshape_306(), len(idxs)=56
[25] prune_out_channels on _Reshape_306() => prune_in_channels on _stage1_model._net.17 (Linear(in_features=6400, out_features=512, bias=True)), len(idxs)=56 <--- should be multiplied by 5*5.
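To illustrate the expansion step that seems to be missing: with output_size=(5, 5), each pruned channel should map to 5*5 = 25 consecutive flattened features feeding the Linear layer. A minimal sketch of that arithmetic (a hypothetical helper, not torch-pruning's actual code):

```python
# With AdaptiveAvgPool2d(output_size=(5, 5)), Flatten gives each channel
# a contiguous block of 5*5 = 25 features in the Linear layer's input.
H, W = 5, 5
num_channels = 256
pruned_channels = list(range(56))  # 56 channels pruned (256 -> 200)

def expand_through_flatten(channel_idxs, h, w):
    """Map pruned channel indices to the flattened feature indices they cover."""
    stride = h * w
    return [c * stride + offset for c in channel_idxs for offset in range(stride)]

flat_idxs = expand_through_flatten(pruned_channels, H, W)
removed = len(flat_idxs)                        # 56 * 25 = 1400 features removed
remaining = num_channels * H * W - removed      # 6400 - 1400 = 5000 remaining
print(removed, remaining)
```

So passing len(idxs)=56 straight to prune_in_channels removes only 56 of the 1400 features that should go, leaving in_features=6344 instead of 5000.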

I am using GroupMagnitudeImportance with p=2 and BasePruner with a pruning ratio of 0.2.
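For reference, the shapes the pruned tail should end up with can be checked in plain PyTorch, without torch-pruning. The sizes below come from the AFTER printout, with the Linear corrected to the expected 5000 (200 channels * 5 * 5):

```python
import torch
import torch.nn as nn

# Expected post-pruning tail: 200 channels * 5 * 5 = 5000 Linear inputs.
tail = nn.Sequential(
    nn.BatchNorm2d(200),
    nn.AdaptiveAvgPool2d(output_size=(5, 5)),
    nn.Flatten(start_dim=1, end_dim=-1),
    nn.Linear(in_features=5000, out_features=512, bias=True),
    nn.ReLU(),
)
tail.eval()

x = torch.randn(1, 200, 13, 13)  # arbitrary spatial size; pooling fixes it to 5x5
out = tail(x)
print(out.shape)  # torch.Size([1, 512])
```

With in_features=6344 as currently produced by the pruner, the same forward pass raises a matrix-multiplication shape error.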
