This repository was archived by the owner on Jan 21, 2025. It is now read-only.

Commit 739fb09

David So authored and Mesh TensorFlow Team committed
Squared ReLU from Primer paper.
PiperOrigin-RevId: 395918244
1 parent: 4758510 · commit: 739fb09

File tree

1 file changed (+5 −0)


mesh_tensorflow/ops.py

Lines changed: 5 additions & 0 deletions
@@ -1921,6 +1921,11 @@ def relu(x, name="relu"):
   return cwise(tf.nn.relu, [x], name=name, grad_function=_relu_grad)
 
 
+def squared_relu(x, name="squared_relu"):
+  """Squared ReLU from Primer paper (TODO(davidso):Link when released)."""
+  return cwise(lambda t: square(relu(t)), [x], name=name)
+
+
 def leaky_relu(x, alpha=0.2, name="leaky_relu"):
   def forward_function(x):
     return tf.nn.leaky_relu(x, alpha)
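
For context, the added op squares the output of a ReLU element-wise, i.e. f(x) = max(0, x)**2; the commit expresses this through Mesh TensorFlow's cwise so it applies component-wise to mesh tensors. Below is a minimal sketch of the same computation in plain TensorFlow; the function name squared_relu_reference and the sample values are illustrative, not part of the commit.

    import tensorflow as tf

    def squared_relu_reference(x):
      """Reference Squared ReLU: square of the element-wise ReLU of x."""
      return tf.math.square(tf.nn.relu(x))

    # Negative inputs map to 0; positive inputs are squared.
    x = tf.constant([-2.0, -0.5, 0.0, 1.0, 3.0])
    print(squared_relu_reference(x).numpy())  # [0. 0. 0. 1. 9.]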

0 commit comments
