* C++ : https://github.com/tensorflow/addons/blob/r0.10/tensorflow_addons/custom_ops/activations/cc/kernels/gelu_op.h
* Does this include custom-op kernels?
* Yes, but currently proposing to just migrate the Python composite op. This may
  change with discussion in the RFC.
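For reference, the Python composite op under discussion expresses GELU in terms of existing primitives rather than a custom kernel. The sketch below is an illustration in plain Python (not the actual Addons or core source); the `approximate` parameter mirrors the common tanh-based variant alongside the exact erf form.

```python
import math

def gelu(x, approximate=True):
    """Sketch of a GELU composite op on a scalar input.

    The tanh form approximates the exact definition
    0.5 * x * (1 + erf(x / sqrt(2))).
    """
    if approximate:
        return 0.5 * x * (1.0 + math.tanh(
            math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))
```

Because it is built from elementwise primitives, this form needs no custom C++ kernel, which is why migrating only the composite op is feasible.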
* The activation would land in [nn_ops.py](https://github.com/tensorflow/tensorflow/blob/r2.2/tensorflow/python/ops/nn_ops.py) as well as in [keras advanced_activations](https://github.com/tensorflow/tensorflow/blob/r2.2/tensorflow/python/keras/layers/advanced_activations.py).
* No planned changes to the parameter signatures at this time.
* Addons would deprecate our activation and delegate to the core implementation.
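A deprecation shim of this kind typically warns and then forwards to the new home of the function. The sketch below is hypothetical (the names `addons_gelu` and `_core_gelu` are placeholders, not the real module paths):

```python
import math
import warnings

def _core_gelu(x):
    # Stand-in for the future core implementation (hypothetical name).
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def addons_gelu(x):
    """Hypothetical Addons shim: warn once per call, then delegate to core."""
    warnings.warn(
        "the Addons gelu activation is deprecated; "
        "use the core implementation instead.",
        DeprecationWarning,
        stacklevel=2)
    return _core_gelu(x)
```

Keeping the shim in place for a release cycle lets existing Addons users migrate without an immediate breaking change.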