Commit e81c777

committed: add missing link
1 parent 312f022 commit e81c777

File tree

1 file changed (+3, -1)

src/diffusers/hooks/first_block_cache.py

Lines changed: 3 additions & 1 deletion
```diff
@@ -193,7 +193,9 @@ def new_forward(self, module: torch.nn.Module, *args, **kwargs):
 
 def apply_first_block_cache(module: torch.nn.Module, config: FirstBlockCacheConfig) -> None:
     """
-    Applies [First Block Cache]() to a given module.
+    Applies [First Block
+    Cache](https://github.com/chengzeyi/ParaAttention/blob/4de137c5b96416489f06e43e19f2c14a772e28fd/README.md#first-block-cache-our-dynamic-caching)
+    to a given module.
 
     First Block Cache builds on the ideas of [TeaCache](). It is much simpler to implement generically for a wide range
     of models and has been integrated first for experimental purposes.
```
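The idea the linked README describes can be sketched in plain Python: run only the first transformer block each denoising step, and if its output barely changed relative to the previous step, reuse the cached output of the remaining blocks instead of recomputing them. This is a minimal illustration, not the diffusers implementation; the `FirstBlockCache` class, `rel_l1` helper, and the `0.05` threshold are all hypothetical stand-ins.

```python
def rel_l1(a, b):
    """Relative L1 distance between two equal-length float lists (hypothetical metric)."""
    num = sum(abs(x - y) for x, y in zip(a, b))
    den = sum(abs(y) for y in b) or 1.0
    return num / den


class FirstBlockCache:
    """Sketch of first-block caching: skip blocks[1:] when block 0's output is stable."""

    def __init__(self, blocks, threshold=0.05):
        self.blocks = blocks          # list of callables: hidden -> hidden
        self.threshold = threshold    # assumed similarity cutoff
        self.prev_first = None        # first-block output from the previous step
        self.cached_rest = None       # cached output of the remaining blocks

    def forward(self, hidden):
        first = self.blocks[0](hidden)
        if (self.prev_first is not None
                and self.cached_rest is not None
                and rel_l1(first, self.prev_first) < self.threshold):
            self.prev_first = first
            return self.cached_rest   # cache hit: remaining blocks are skipped
        # Cache miss: run the remaining blocks and refresh the cache.
        out = first
        for block in self.blocks[1:]:
            out = block(out)
        self.prev_first = first
        self.cached_rest = out
        return out
```

A toy usage: with blocks `h + 1` followed by `h * 2`, repeating the same input hits the cache, while a changed input forces a full recompute.

```python
fbc = FirstBlockCache([lambda h: [x + 1 for x in h], lambda h: [x * 2 for x in h]])
print(fbc.forward([1.0]))  # full pass
print(fbc.forward([1.0]))  # cache hit, blocks[1:] skipped
print(fbc.forward([2.0]))  # recompute
```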
