
Commit 53ed773

Author: Alexander Ororbia (committed)
tweak to atten probe
1 parent: 012395b

File tree: 1 file changed (+1, -1 lines)


ngclearn/utils/analysis/attentive_probe.py

Lines changed: 1 addition & 1 deletion
@@ -123,7 +123,7 @@ def run_attention_probe(
     skip = features
     if use_LN:
         features = layer_normalize(features, Wlnattn_mu, Wlnattn_scale)
-    features = cross_attention(self_attn_params, features, features, None, n_heads, dropout)
+    features = cross_attention(dkey, self_attn_params, features, features, None, n_heads, dropout)
     features = features + skip
     features = features[:, 0] # (B, 1, dim) => (B, dim)
     # MLP
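
The patched line threads a PRNG key (dkey) into the self-attention call, the usual JAX pattern for sampling dropout masks reproducibly inside cross_attention. Below is a minimal sketch of how such a key could be created and split before the call; everything not taken from the diff itself (the key creation, the literal seed, the names dkey/subkey) is an illustrative assumption rather than ngclearn's actual set-up.

    # Minimal sketch (assumption: cross_attention uses the key to draw its dropout
    # mask; the set-up around the call is illustrative, not ngclearn's own code).
    import jax

    dkey = jax.random.PRNGKey(1234)        # base PRNG key for the probe
    dkey, subkey = jax.random.split(dkey)  # fresh subkey per stochastic call

    # Mirrors the patched call: the key now comes first, ahead of the attention
    # parameters, queries, keys/values, mask, head count, and dropout rate.
    # features = cross_attention(subkey, self_attn_params, features, features,
    #                            None, n_heads, dropout)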

0 commit comments