Commit 86f495d

Added relu6 as a supported activation for DefaultNBitActivationQuantizeConfig

PiperOrigin-RevId: 464974679
Parent: 10ff67f

1 file changed (+1, -1 lines)

tensorflow_model_optimization/python/core/quantization/keras/experimental/default_n_bit/default_n_bit_quantize_registry.py

Lines changed: 1 addition & 1 deletion
@@ -533,7 +533,7 @@ def get_output_quantizers(self, layer):
                        'DefaultNBitActivationQuantizeConfig.'.format(
                            layer.activation))
 
-    if layer.activation.__name__ in ['relu', 'swish']:
+    if layer.activation.__name__ in ['relu', 'relu6', 'swish']:
       # 'relu' should generally get fused into the previous layer.
       return [quantizers.MovingAverageQuantizer(
           num_bits=self._num_bits_activation, per_axis=False,
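
With this change, a layer whose activation function's __name__ is 'relu6' (for example a Keras Activation layer wrapping tf.nn.relu6) gets a MovingAverageQuantizer on its output instead of raising an error. The sketch below shows one way the experimental default_n_bit scheme might be applied to such a model; the model architecture and the 4-bit widths are illustrative assumptions, not part of this commit.

    import tensorflow as tf
    import tensorflow_model_optimization as tfmot
    from tensorflow_model_optimization.python.core.quantization.keras.experimental.default_n_bit import (
        default_n_bit_quantize_scheme)

    # Illustrative model: tf.nn.relu6.__name__ == 'relu6', so this Activation
    # layer's output is now handled by DefaultNBitActivationQuantizeConfig.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, input_shape=(8,)),
        tf.keras.layers.Activation(tf.nn.relu6),
        tf.keras.layers.Dense(4),
    ])

    # Annotate every layer, then apply the n-bit scheme (bit widths assumed).
    annotated = tfmot.quantization.keras.quantize_annotate_model(model)
    quantized = tfmot.quantization.keras.quantize_apply(
        annotated,
        scheme=default_n_bit_quantize_scheme.DefaultNBitQuantizeScheme(
            num_bits_weight=4, num_bits_activation=4))
    quantized.summary()

Before this commit, the same call would raise a ValueError for the relu6 activation, since only 'relu' and 'swish' (plus 'linear' and 'softmax') were accepted.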
