Commit 7a8ce05

committed: added layer attention documentation
1 parent dada200 · commit 7a8ce05

2 files changed: 59 additions, 0 deletions


R/layer-attention.R

Lines changed: 1 addition & 0 deletions
@@ -4,6 +4,7 @@
 #' Dot-product attention layer, a.k.a. Luong-style attention.
 #'
 #' @inheritParams layer_dense
+#'
 #' @param inputs a list of inputs first should be the query tensor, the second the value tensor
 #' @param use_scale If True, will create a scalar variable to scale the attention scores.
 #' @param causal Boolean. Set to True for decoder self-attention. Adds a mask such that position i cannot attend to positions j > i.
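
For context, a minimal usage sketch of the layer_attention() function these roxygen comments document, using the keras R package API; the input shapes and the surrounding model wiring are illustrative assumptions, not part of this commit:

library(keras)

# Query and value inputs; shapes (timesteps, features) are illustrative.
query <- layer_input(shape = c(8, 16))
value <- layer_input(shape = c(4, 16))

# Per the documentation above: the first element of `inputs` is the query
# tensor, the second the value tensor.
attended <- layer_attention(
  inputs = list(query, value),
  use_scale = TRUE,   # create a scalar variable that scales the attention scores
  causal = FALSE      # TRUE adds a mask so position i cannot attend to j > i
)

model <- keras_model(inputs = list(query, value), outputs = attended)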

man/layer_attention.Rd

Lines changed: 58 additions & 0 deletions
Some generated files are not rendered by default.
