It/s 1024 flux lora - 3060 #1917
alexgilseg started this conversation in General

I'm training a Flux LoRA on my 3060 and I finally got it working, but I'm getting 235 s/image at 1024x1024 with rank/dim 4 and alpha 2. Is this reasonable, or is something wrong with my setup?
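(For scale, with an illustrative step count that isn't from the thread: at 235 s/image, a 1,500-step run works out to 1500 × 235 s ≈ 98 hours.)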
Replies: 2 comments
Using a 3060, I would recommend 512x512.
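If the trainer is kohya-ss sd-scripts (an assumption on my part; the thread doesn't name it), the resolution change is a single flag, sketched here as an excerpt:

```sh
# Excerpt only, assuming kohya-ss sd-scripts flag names; --resolution
# takes "width,height". Keep the rest of your existing command unchanged.
# Halving the edge length (1024 -> 512) quarters the pixels per step.
accelerate launch flux_train_network.py \
  --resolution "512,512"
```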
Make sure it's not using shared memory. On 8 GB you would need to enable all caching, `fp8_base`, gradient checkpointing, and block swapping near the max.
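Assuming this is kohya-ss sd-scripts' Flux branch (not confirmed in the thread), a minimal sketch of that flag set; all paths, step counts, and the swap value below are placeholders:

```sh
# Memory-saving sketch for flux_train_network.py (kohya-ss sd-scripts,
# assumed). File paths and numeric values are illustrative only.
accelerate launch flux_train_network.py \
  --pretrained_model_name_or_path flux1-dev.safetensors \
  --clip_l clip_l.safetensors \
  --t5xxl t5xxl_fp16.safetensors \
  --ae ae.safetensors \
  --train_data_dir ./train_images \
  --output_dir ./output --output_name my_lora \
  --network_module networks.lora_flux \
  --network_dim 4 --network_alpha 2 \
  --network_train_unet_only \
  --mixed_precision bf16 --sdpa \
  --fp8_base \
  --gradient_checkpointing \
  --cache_latents_to_disk \
  --cache_text_encoder_outputs --cache_text_encoder_outputs_to_disk \
  --blocks_to_swap 30 \
  --max_train_steps 1000
# --fp8_base: run the frozen base model in fp8 to cut its VRAM footprint
# --gradient_checkpointing: recompute activations instead of storing them
# --cache_*: precompute latents and text-encoder outputs once, up front
# --blocks_to_swap: offload transformer blocks to CPU RAM; raise it toward
#   the maximum if VRAM still spills into shared memory
```

To spot the shared-memory spill itself: on Windows, Task Manager's GPU view graphs dedicated vs. shared memory separately, and a climbing shared-memory curve during training is the usual sign of the slowdown described here.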