- From the code, it seems both the positive and the negative prompt go through the same conditioning call:

  ```python
  # modules/processing.py
  with devices.autocast():
      uc = prompt_parser.get_learned_conditioning(shared.sd_model, len(prompts) * [p.negative_prompt], p.steps)
      c = prompt_parser.get_multicond_learned_conditioning(shared.sd_model, prompts, p.steps)
  ```

  So even though there is no token counter for the negative prompt, it should still have a token limit? If that is true, maybe I can add a token counter for the negative prompt...
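If both prompts flow through the same parser, a negative-prompt counter could presumably reuse whatever tokenizer the existing counter uses. A minimal sketch of the shape such a counter might take (the `tokenize` callable here is a stand-in, not the webui's actual API):

```python
# Sketch of a prompt token counter. The `tokenize` argument is a stand-in
# for whatever tokenizer the webui exposes (an assumption, not its real API).

def count_tokens(prompt: str, tokenize) -> int:
    """Count tokens in a prompt using the supplied tokenizer callable."""
    return len(tokenize(prompt))

# Example with a trivial whitespace "tokenizer", just to show the shape;
# the real counter would use the CLIP tokenizer instead.
naive_tokenize = str.split
print(count_tokens("a photo of a cat, high quality", naive_tokenize))  # 7
```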
- Currently, if you go over 77 tokens, the token limit automatically increases. I assume this also applies to the negative prompt.
- Mentioning the PR, #2572: I'm working on this discussion.
> Currently, if you go over 77 tokens, the token limit automatically increases. I assume this also applies to the negative prompt.

Note about automatic prompt extension: some tradeoffs may have been made to allow for this, though I'm not sure yet what they are specifically. If someone could provide these details, it would be nice to add them to the wiki.
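For reference, a minimal sketch of how such chunk-based extension could work. This is an illustration, not the webui's actual implementation; the 75-token chunk size and the BOS/EOS accounting are assumptions based on CLIP's 77-position context window:

```python
# Sketch: extending a fixed 77-token context by splitting into chunks.
# CLIP's text encoder takes 77 positions: BOS + 75 content tokens + EOS.
# Assumption: longer prompts are split into 75-token chunks, each encoded
# separately, so the effective limit grows in steps of 75.

CHUNK_SIZE = 75  # content tokens per chunk (77 minus BOS and EOS)

def split_into_chunks(token_ids):
    """Split a flat list of token ids into chunks of at most CHUNK_SIZE."""
    if not token_ids:
        return [[]]
    return [token_ids[i:i + CHUNK_SIZE]
            for i in range(0, len(token_ids), CHUNK_SIZE)]

def effective_limit(token_count):
    """Smallest multiple of CHUNK_SIZE that can hold token_count tokens."""
    chunks = max(1, -(-token_count // CHUNK_SIZE))  # ceiling division
    return chunks * CHUNK_SIZE

# A 100-token prompt needs two chunks, so the limit "increases" to 150.
```

One plausible tradeoff with this scheme is that each chunk is encoded independently, so attention cannot cross chunk boundaries; whether that matches the webui's behavior would need confirming for the wiki.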