Yes, activation functions are not part of our nn.conv layers. There are two main reasons for this:

  • It makes it easier to try out different non-linearities
  • It aligns well with operators from PyTorch, e.g., Conv2d, which also do not apply a non-linearity
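To make the second point concrete, here is a minimal PyTorch sketch (layer sizes are arbitrary, chosen only for illustration): the Conv2d layer returns raw pre-activations, and the non-linearity is applied as a separate, easily swappable step.

```python
import torch
import torch.nn as nn

# A conv layer by itself applies no non-linearity.
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1)

x = torch.randn(1, 3, 8, 8)
y = conv(x)            # raw pre-activations; values may be negative
z = torch.relu(y)      # non-linearity applied separately

# Because the activation is decoupled, trying a different one
# is a one-line change, e.g.:
z_alt = torch.nn.functional.gelu(y)
```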

Answer selected by errhernandez