I think not all gradient operators correspond to a single macro-op. In some cases the gradient is implemented as a combination of individual ops. E.g., for matmul there's no MatMulGrad; it's just lowered to matmul + transpose.
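As a minimal sketch of why no dedicated gradient op is needed (plain NumPy here; the variable names and shapes are illustrative, not from any particular framework): for `C = A @ B`, the backward pass is itself just two matmuls on transposed operands.

```python
import numpy as np

# Forward: C = A @ B
A = np.random.randn(3, 4)
B = np.random.randn(4, 5)
C = A @ B

# Backward: given an upstream gradient dC = dL/dC, the gradients of a
# matmul are expressible with matmul + transpose alone, so a compiler
# can lower "the gradient of matmul" to existing primitive ops.
dC = np.ones_like(C)   # stand-in upstream gradient for illustration
dA = dC @ B.T          # dL/dA = dC @ B^T  -- matmul + transpose
dB = A.T @ dC          # dL/dB = A^T @ dC  -- matmul + transpose

assert dA.shape == A.shape and dB.shape == B.shape
```

This is the same decomposition autodiff systems generally use when they don't define a fused gradient kernel for an op.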
