This code uses `tensor.sign()` to binarize the activations and weights: https://github.com/itayhubara/BinaryNet.pytorch/blob/f5c3672dede608f568e073a583cadd7a8a88fa9d/models/binarized_modules.py#L13

The intended behavior is to map every value to -1 or +1, but `sign()` returns 0 for inputs that are exactly 0. Batch normalization makes exact zeros unlikely, yet they can still occur, so the code should probably force every output to be either -1 or +1.
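One possible fix (a sketch, not the repository's actual code) is to replace the `sign()` call with a comparison against zero, so that zeros are assigned to +1 explicitly; the function name `binarize` here is hypothetical:

```python
import torch

def binarize(x: torch.Tensor) -> torch.Tensor:
    # Map every element to -1 or +1. Unlike torch.sign(),
    # which sends exact zeros to 0, this sends them to +1.
    return torch.where(x >= 0, torch.ones_like(x), -torch.ones_like(x))

x = torch.tensor([-2.0, 0.0, 3.0])
out = binarize(x)  # every element is -1 or +1; the zero becomes +1
```

An equivalent one-liner is `x.sign() + (x == 0).to(x.dtype)`, which only touches the elements where `sign()` produced 0. Assigning zeros to +1 (rather than -1) is an arbitrary choice; what matters for a binarized network is that the output alphabet is exactly {-1, +1}.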