BatchNormConfig num_features and num_channels #2590
Replies: 2 comments 1 reply
-
This made me think about another case. For a linear layer, are the following two methods different?

method1:

```rust
let tensor = Tensor::<MyBackend, 3>::from_data([[[1, 2, 3], [4, 5, 6]]], &device);
let fc: nn::Linear<MyBackend> = nn::LinearConfig::new(3, 4).with_bias(true).init(&device);
let output = fc.forward(tensor.clone());
```

method2:

```rust
let tensor = Tensor::<MyBackend, 3>::from_data([[[1, 2, 3, 4, 5, 6]]], &device);
let fc: nn::Linear<MyBackend> = nn::LinearConfig::new(6, 4).with_bias(true).init(&device);
let output = fc.forward(tensor.clone());
```

method1 will run full connection for …
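The two methods do differ in shape and parameter count. Below is a minimal plain-Rust sketch (deliberately not using Burn, so it runs standalone) assuming the standard dense-layer semantics where `Linear(d_input, d_output)` is applied independently to each slice along the last dimension. The `linear_forward` helper is hypothetical, introduced only for illustration:

```rust
// Apply a dense layer (weights: [in, out], bias: [out]) to each row,
// i.e. to the last dimension of the input.
fn linear_forward(rows: &[Vec<f32>], w: &[Vec<f32>], b: &[f32]) -> Vec<Vec<f32>> {
    rows.iter()
        .map(|row| {
            (0..b.len())
                .map(|o| {
                    b[o] + row
                        .iter()
                        .enumerate()
                        .map(|(i, x)| x * w[i][o])
                        .sum::<f32>()
                })
                .collect()
        })
        .collect()
}

fn main() {
    let b4 = vec![0.0; 4];

    // method1: inner shape [2, 3]; Linear(3, 4) is applied to each of the
    // 2 rows with the SAME 3x4 weight matrix (3*4 + 4 = 16 parameters).
    let w3x4 = vec![vec![0.1; 4]; 3];
    let m1 = linear_forward(&[vec![1., 2., 3.], vec![4., 5., 6.]], &w3x4, &b4);
    assert_eq!((m1.len(), m1[0].len()), (2, 4)); // output shape [2, 4]

    // method2: the same six values flattened into one row of [1, 6];
    // Linear(6, 4) mixes all six inputs (6*4 + 4 = 28 parameters).
    let w6x4 = vec![vec![0.1; 4]; 6];
    let m2 = linear_forward(&[vec![1., 2., 3., 4., 5., 6.]], &w6x4, &b4);
    assert_eq!((m2.len(), m2[0].len()), (1, 4)); // output shape [1, 4]

    println!("ok");
}
```

So method1 reuses one small weight matrix across the leading dimensions, while method2 lets every output see all six inputs at once; they are different models, not two views of the same one.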
-
You're right that it should be …
-
Hi,

I always thought that `num_features` in `BatchNormConfig::new()` meant the number of features, but in fact it means the number of channels? And all the features in a channel share the same gamma and beta? For example, here is an image:

[image not shown: a tensor with 2 channels of 3 elements each]

I thought `num_features` should be 3 because there are 3 elements (features) in a channel, but in fact it's 2 in `BatchNorm` because there are two channels. So `BatchNormConfig::new(2)` is right and `BatchNormConfig::new(3)` is wrong, is it? And so the input tensor of `BatchNorm` must have at least 3 dimensions, right?
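The channel-wise interpretation described above can be checked with a minimal plain-Rust sketch (not Burn's actual implementation, just the textbook batch-norm formula under the assumption of an `[N, C, L]` layout): statistics are computed per channel over the batch and spatial dimensions, and gamma/beta have one entry per channel, so `num_features` counts channels.

```rust
// Batch norm over input of shape [N, C, L]: per-CHANNEL mean/variance over
// the N and L dims; gamma/beta are indexed by channel only.
fn batch_norm(x: &[Vec<Vec<f32>>], gamma: &[f32], beta: &[f32], eps: f32) -> Vec<Vec<Vec<f32>>> {
    let c = x[0].len();
    assert_eq!(gamma.len(), c); // num_features == number of channels C
    let mut out = x.to_vec();
    for ch in 0..c {
        // Gather all N*L values belonging to this channel.
        let vals: Vec<f32> = x.iter().flat_map(|sample| sample[ch].iter().copied()).collect();
        let mean = vals.iter().sum::<f32>() / vals.len() as f32;
        let var = vals.iter().map(|v| (v - mean).powi(2)).sum::<f32>() / vals.len() as f32;
        let inv_std = 1.0 / (var + eps).sqrt();
        for sample in out.iter_mut() {
            for v in sample[ch].iter_mut() {
                // Every feature in channel `ch` shares the same gamma/beta.
                *v = gamma[ch] * (*v - mean) * inv_std + beta[ch];
            }
        }
    }
    out
}

fn main() {
    // [N=1, C=2, L=3]: two channels with three features each,
    // so the config value would be 2, not 3.
    let x = vec![vec![vec![1., 2., 3.], vec![4., 5., 6.]]];
    let y = batch_norm(&x, &[1.0, 1.0], &[0.0, 0.0], 1e-5);

    // Each channel is normalized to (approximately) zero mean.
    let mean_ch0: f32 = y[0][0].iter().sum::<f32>() / 3.0;
    assert!(mean_ch0.abs() < 1e-3);
    println!("ok");
}
```

This also illustrates why the input needs a channel dimension in the first place: with fewer than 3 dims there is no `[N, C, ...]` layout for the per-channel statistics to range over.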