Hello, I have a question about `y_bpp` and normalization.
In the implementation of the Gaussian entropy model in CompressAI, `y_bpp` is computed by estimating the likelihood after normalizing the input:
```python
half = float(0.5)
if means is not None:
    values = inputs - means
else:
    values = inputs

scales = self.lower_bound_scale(scales)

values = torch.abs(values)
upper = self._standardized_cumulative((half - values) / scales)
lower = self._standardized_cumulative((-half - values) / scales)
likelihood = upper - lower
```

Does this mean that during actual training it is not `(y - means) / scales` but rather `torch.abs(y - means) / self.lower_bound_scale(scales)` that is fitted to the standard normal distribution?
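As far as I can tell, the `torch.abs` appears to be only a symmetric rewriting of the usual discretized Gaussian likelihood: since the standard normal CDF satisfies Φ(-x) = 1 - Φ(x), the expression above should equal Φ((v + 0.5) / σ) - Φ((v - 0.5) / σ) with the signed residual v = y - means. Here is a minimal self-contained check (the `std_cdf` helper and the toy tensors are my own stand-ins, not CompressAI code):

```python
import torch

# Standard normal CDF via erfc, standing in for self._standardized_cumulative.
def std_cdf(x):
    return 0.5 * torch.erfc(-(2 ** -0.5) * x)

torch.manual_seed(0)
y = torch.randn(1000) * 3.0        # toy latent values
means = torch.randn(1000)          # toy means
scales = torch.rand(1000) + 0.11   # toy scales, kept above a small lower bound

v = y - means

# Formulation from the snippet above: fold the sign into abs().
a = torch.abs(v)
lik_abs = std_cdf((0.5 - a) / scales) - std_cdf((-0.5 - a) / scales)

# Plain discretized-Gaussian likelihood on the signed residual.
lik_signed = std_cdf((v + 0.5) / scales) - std_cdf((v - 0.5) / scales)

print(torch.allclose(lik_abs, lik_signed, atol=1e-6))  # True: abs() does not change the likelihood
```

If that holds, the distribution being fitted is still the discretized Gaussian on the signed residual `(y - means)` with the lower-bounded `scales`, and the `abs` is only a numerical reformulation. Is that the intended reading?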
I need to normalize the latent variable `y` to obtain a standard spherical normal vector for computing the spatial correlation.
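For the normalization itself, this is a minimal sketch of what I have in mind (the names `y`, `means`, `scales`, the helper `normalize_latent`, and the 0.11 bound are placeholders of my own, not CompressAI API):

```python
import torch

SCALES_MIN = 0.11  # placeholder lower bound; use the same value the entropy model applies

def normalize_latent(y: torch.Tensor, means: torch.Tensor, scales: torch.Tensor) -> torch.Tensor:
    """Return (y - means) / max(scales, SCALES_MIN).

    A plain clamp stands in for the model's lower_bound_scale here
    (same forward value; its custom gradient is ignored for this analysis).
    """
    scales = torch.clamp(scales, min=SCALES_MIN)
    return (y - means) / scales

# usage sketch: y, means, scales come from the (hyper)prior at inference time
# y_norm = normalize_latent(y, means, scales)   # ideally ~ N(0, I) elementwise
```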