
Commit 75136e6

fixes
1 parent b653eaa commit 75136e6

File tree

1 file changed: +4, -102 lines

docs/source/en/api/attnprocessor.md

Lines changed: 4 additions & 102 deletions
@@ -20,29 +20,11 @@ An attention processor is a class for applying different types of attention mechanisms.

 [[autodoc]] models.attention_processor.AttnProcessor2_0

-<<<<<<< HEAD
-## FusedAttnProcessor2_0
-[[autodoc]] models.attention_processor.FusedAttnProcessor2_0
-
-## XFormersAttnProcessor
-[[autodoc]] models.attention_processor.XFormersAttnProcessor
-
-## AttnAddedKVProcessor
-=======
->>>>>>> main
 [[autodoc]] models.attention_processor.AttnAddedKVProcessor

 [[autodoc]] models.attention_processor.AttnAddedKVProcessor2_0

-<<<<<<< HEAD
-## XFormersAttnAddedKVProcessor
-[[autodoc]] models.attention_processor.XFormersAttnAddedKVProcessor
-
-## CrossFrameAttnProcessor
-[[autodoc]] pipelines.text_to_video_synthesis.pipeline_text_to_video_zero.CrossFrameAttnProcessor
-=======
 [[autodoc]] models.attention_processor.AttnProcessorNPU
->>>>>>> main

 [[autodoc]] models.attention_processor.FusedAttnProcessor2_0

@@ -154,90 +136,6 @@ An attention processor is a class for applying different types of attention mechanisms.

 [[autodoc]] models.attention_processor.SlicedAttnAddedKVProcessor

-## IPAdapterAttnProcessor
-[[autodoc]] models.attention_processor.IPAdapterAttnProcessor
-
-## IPAdapterAttnProcessor2_0
-[[autodoc]] models.attention_processor.IPAdapterAttnProcessor2_0
-
-## AttnProcessorNPU
-[[autodoc]] models.attention_processor.AttnProcessorNPU
-
-## JointAttnProcessor2_0
-[[autodoc]] models.attention_processor.JointAttnProcessor2_0
-
-## JointAttnProcessor2_0
-[[autodoc]] models.attention_processor.PAGJointAttnProcessor2_0
-
-## PAGCFGJointAttnProcessor2_0
-[[autodoc]] models.attention_processor.PAGCFGJointAttnProcessor2_0
-
-
-## FusedJointAttnProcessor2_0
-[[autodoc]] models.attention_processor.FusedJointAttnProcessor2_0
-
-## AllegroAttnProcessor2_0
-[[autodoc]] models.attention_processor.AllegroAttnProcessor2_0
-
-## AuraFlowAttnProcessor2_0
-[[autodoc]] models.attention_processor.AuraFlowAttnProcessor2_0
-
-## MochiVaeAttnProcessor2_0
-[[autodoc]] models.attention_processor.MochiVaeAttnProcessor2_0
-
-## PAGCFGIdentitySelfAttnProcessor2_0
-[[autodoc]] models.attention_processor.PAGCFGIdentitySelfAttnProcessor2_0
-
-## FusedAuraFlowAttnProcessor2_0
-[[autodoc]] models.attention_processor.FusedAuraFlowAttnProcessor2_0
-
-## FusedFluxAttnProcessor2_0
-[[autodoc]] models.attention_processor.FusedFluxAttnProcessor2_0
-
-## SanaMultiscaleAttnProcessor2_0
-[[autodoc]] models.attention_processor.SanaMultiscaleAttnProcessor2_0
-
-## PAGHunyuanAttnProcessor2_0
-[[autodoc]] models.attention_processor.PAGHunyuanAttnProcessor2_0
-
-## HunyuanAttnProcessor2_0
-[[autodoc]] models.attention_processor.HunyuanAttnProcessor2_0
-
-## FluxAttnProcessor2_0
-[[autodoc]] models.attention_processor.FluxAttnProcessor2_0
-
-## PAGIdentitySelfAttnProcessor2_0
-[[autodoc]] models.attention_processor.PAGIdentitySelfAttnProcessor2_0
-
-## FusedCogVideoXAttnProcessor2_0
-[[autodoc]] models.attention_processor.FusedCogVideoXAttnProcessor2_0
-
-## MochiAttnProcessor2_0
-[[autodoc]] models.attention_processor.MochiAttnProcessor2_0
-
-## StableAudioAttnProcessor2_0
-[[autodoc]] models.attention_processor.StableAudioAttnProcessor2_0
-
-## XLAFlashAttnProcessor2_0
-[[autodoc]] models.attention_processor.XLAFlashAttnProcessor2_0
-
-## FusedHunyuanAttnProcessor2_0
-[[autodoc]] models.attention_processor.FusedHunyuanAttnProcessor2_0
-
-## IPAdapterXFormersAttnProcessor
-[[autodoc]] models.attention_processor.IPAdapterXFormersAttnProcessor
-
-## LuminaAttnProcessor2_0
-[[autodoc]] models.attention_processor.LuminaAttnProcessor2_0
-
-## PAGCFGHunyuanAttnProcessor2_0
-[[autodoc]] models.attention_processor.PAGCFGHunyuanAttnProcessor2_0
-
-## FluxSingleAttnProcessor2_0
-[[autodoc]] models.attention_processor.FluxSingleAttnProcessor2_0
-
-## CogVideoXAttnProcessor2_0
-[[autodoc]] models.attention_processor.CogVideoXAttnProcessor2_0
 ## XFormersAttnProcessor

 [[autodoc]] models.attention_processor.XFormersAttnProcessor
@@ -251,3 +149,7 @@ An attention processor is a class for applying different types of attention mechanisms.
 ## XFormersJointAttnProcessor

 [[autodoc]] models.attention_processor.XFormersJointAttnProcessor
+
+## IPAdapterXFormersAttnProcessor
+
+[[autodoc]] models.attention_processor.IPAdapterXFormersAttnProcessor
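For context on what this file documents: the classes referenced by these `[[autodoc]]` directives live in `diffusers.models.attention_processor` and can be swapped onto a model at runtime. A minimal sketch, assuming the `diffusers` library is installed; the checkpoint id below is only illustrative, not something this commit references:

```python
from diffusers import UNet2DConditionModel
from diffusers.models.attention_processor import AttnProcessor2_0

# Load only the UNet from an example Stable Diffusion checkpoint
# (illustrative model id; any SD-style repo with a "unet" subfolder works).
unet = UNet2DConditionModel.from_pretrained(
    "stable-diffusion-v1-5/stable-diffusion-v1-5", subfolder="unet"
)

# Swap every attention layer's processor for the PyTorch 2.0
# scaled-dot-product attention implementation documented above.
unet.set_attn_processor(AttnProcessor2_0())

# Inspect which processor classes are now attached.
print({type(p).__name__ for p in unet.attn_processors.values()})
```

Passing a single processor instance to `set_attn_processor` applies it to every attention layer; passing a dict keyed by layer name lets different layers use different processors.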

0 commit comments
