
Commit 8e29360

document object in layer_additive_attention()

1 parent 9c209e1

2 files changed: +7 additions, −0 deletions

R/layer-attention.R (2 additions, 0 deletions)

```diff
@@ -149,6 +149,8 @@ layer_multi_head_attention <- function(
 #' shape `[batch_size, Tq, dim]`:
 #' `return tf$matmul(distribution, value)`.
 #'
+#' @inheritParams layer_dense
+#'
 #' @param use_scale If `TRUE`, will create a variable to scale the attention scores.
 #'
 #' @param causal Boolean. Set to `TRUE` for decoder self-attention. Adds a mask such
```

man/layer_additive_attention.Rd (5 additions, 0 deletions)

Generated file; diff not rendered by default.
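The change above relies on roxygen2's `@inheritParams` tag, which copies the `@param` documentation for any shared parameter names from another function's docs instead of duplicating the text. A minimal sketch of how the tag behaves (the function bodies and parameter wording here are hypothetical placeholders, not taken from this commit):

```r
# Hypothetical sketch: @inheritParams pulls matching @param docs
# from layer_dense() into layer_additive_attention()'s man page.

#' Dense layer
#'
#' @param object A model or layer object to compose with.
#' @param units Number of output units.
layer_dense <- function(object, units) NULL

#' Additive attention layer
#'
#' @inheritParams layer_dense
#' @param use_scale If `TRUE`, scale the attention scores.
layer_additive_attention <- function(object, use_scale = TRUE) NULL

# When roxygen2 regenerates the .Rd files, the @param entry for
# `object` is inherited from layer_dense(), so the wording lives in
# one place. Only parameters present in both signatures are copied;
# `units` is ignored because layer_additive_attention() lacks it.
```

This is why the commit touches two files: the one-tag change in R/layer-attention.R regenerates the `object` parameter entry in man/layer_additive_attention.Rd.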

0 commit comments