
Commit 27e31c5 (1 parent: c55bc6a)

inception

1 file changed: +2 −0 lines changed

README.md

Lines changed: 2 additions & 0 deletions
@@ -8,6 +8,8 @@ The selling point of this paper is extremely low extra parameters per added conc
It seems they successfully applied the Rank-1 editing technique from a <a href="https://arxiv.org/abs/2202.05262">memory editing paper for LLMs</a>, with a few improvements. They also identified that the keys determine the "where" of the new concept, while the values determine the "what", and propose local / global key-locking to a superclass concept (while learning the values).

+ For researchers out there: if this paper checks out, the tools in this repository should work for any other text-to-`<insert modality>` network using cross attention conditioning. Just a thought.
+
## Appreciation

- <a href="https://stability.ai/">StabilityAI</a> for the generous sponsorship, as well as my other sponsors out there
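
The Rank-1 edit mentioned in the diff above follows the ROME-style closed-form update, with the twist that the key projection's target is locked to a superclass key while the value target is learned. Below is a minimal sketch under that reading, assuming a PyTorch setting; the function name, shapes, and the offline-estimated covariance `C` are illustrative assumptions, not this repository's actual API.

```python
# Hypothetical sketch of a ROME-style rank-1 edit to one cross-attention projection.
# Names and the offline-estimated input covariance C are assumptions for illustration.
import torch

def rank_one_edit(W, concept_emb, target_out, C):
    """Return W' such that W' @ concept_emb == target_out, changing W by a rank-1 term.

    W           : (out_dim, in_dim) key or value projection of a cross-attention layer
    concept_emb : (in_dim,)  text-encoder embedding of the new concept token
    target_out  : (out_dim,) desired output -- the superclass key when "key-locking",
                  or a learned value vector when editing the value projection
    C           : (in_dim, in_dim) uncentered covariance of typical inputs, so the
                  update direction C^-1 @ concept_emb minimally disturbs other tokens
    """
    c_inv_emb = torch.linalg.solve(C, concept_emb)        # C^-1 i*
    residual  = target_out - W @ concept_emb              # what the edit must add at i*
    scale     = concept_emb @ c_inv_emb                   # i*^T C^-1 i*
    return W + torch.outer(residual, c_inv_emb) / scale   # rank-1 update

# Key projection: lock the new concept's key to its superclass key ("where" it attends).
# Value projection: plug in a learned target vector ("what" gets rendered).
```

In this reading, key-locking means calling the edit on the key projection with `target_out` set to the superclass token's existing key, so only the value side actually needs training, which is consistent with the very small per-concept parameter count the README highlights.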
