Using the new Magic Prompt Batch Size #349
Replies: 1 comment 5 replies
-
I think that I should move

```python
# benchmark.py
from datetime import datetime

from dynamicprompts.generators import RandomPromptGenerator
from dynamicprompts.generators.magicprompt import MagicPromptGenerator

device = 0

generator = MagicPromptGenerator(RandomPromptGenerator(), device=device)
t1 = datetime.now()
prompts = generator.generate("A {red|blue|green} {car|boat|plane|truck}", 32)
t2 = datetime.now()
print(f"Time taken with batch_size=1: {t2-t1}")

generator = MagicPromptGenerator(RandomPromptGenerator(), batch_size=32, device=device)
t1 = datetime.now()
prompts = generator.generate("A {red|blue|green} {car|boat|plane|truck}", 32)
t2 = datetime.now()
print(f"Time taken with batch_size=32: {t2-t1}")
print(t2-t1)
```

```
$ python benchmark.py
First load of MagicPrompt may take a while.
Time taken with batch_size=1: 0:00:10.962923
Time taken with batch_size=32: 0:00:00.583754
```

With a batch size of 1, it took around 11 seconds to produce 32 prompts. With a batch size of 32, it took just over half a second.
No, but in any case, I don't think it makes sense to run both Magic Prompt and I'm Feeling Lucky at the same time.
I wasn't aware that the API lets you run the alwayson scripts separately. If so, then yes, the ultimate output of DP is
The prompts are all generated before a single image is generated, so Batch Size and Magic Prompt Batch Size don't interact at all.
That's weird. What happens if you use the same settings in the UI itself (instead of the API)? As an aside, you might be interested in using the dynamicprompts library directly rather than going through auto1111's API, since I don't think you get additional value and it's one more layer that could cause problems. Have a look here: https://github.com/adieyal/dynamicprompts/
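For reference, a minimal sketch of what "using the library directly" can look like, based on the generator API shown in the benchmark above. The `render_image` call is a hypothetical placeholder for whatever image pipeline you use, not part of dynamicprompts; the point is that all prompts come back as plain strings before any image work happens.

```python
# Minimal sketch: expand the template with dynamicprompts directly, then hand
# each prompt to an image step. render_image is a hypothetical placeholder.
from dynamicprompts.generators import RandomPromptGenerator
from dynamicprompts.generators.magicprompt import MagicPromptGenerator

generator = MagicPromptGenerator(RandomPromptGenerator(), batch_size=8)

# All 8 prompts are produced here, before a single image is generated.
prompts = generator.generate("A {red|blue|green} {car|boat|plane|truck}", 8)

for prompt in prompts:
    render_image(prompt)  # hypothetical; the WebUI's Batch Size only matters at this stage
```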
-
Perhaps I'm misusing this feature, or something isn't quite functional in my installation, but I don't think this is working the way I understand it should.

If, for instance, I set both the Batch Size and Batch Count generation parameters to 1 and set Magic Prompt Batch Size to 8, shouldn't that generate 8 prompts for a single image? Does enabling I'm Feeling Lucky as well interfere somehow with the output? I've set up my WebUI with the mentioned settings, and the CSV output is a single generated prompt for a single image.

I'm particularly interested in consuming the script via the API's `alwayson_scripts` arg. One small caveat, though, is that images aren't saved after inference as they are when using the WebUI; we must handle that using the returned object containing the relevant info and base64 image data. However, the object does contain the `all_prompts`/`all_negative_prompts` properties populated after initializing the `prompt_writer` here, and I'm assuming that's the information written into the CSV, right? If so, we should be able to replicate the default script's behavior. This means that, even if we have the setting enabled for saving to CSV, this callback is never triggered, because I believe the API doesn't call that at the moment. Either way, as I said previously, the returned object info and the saved CSV seem to be in parity, and the problem must lie elsewhere. If it helps, I can also provide an example payload and response, as well as a regular CSV and the WebUI settings for a regular generation.
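For context, here is a hedged sketch of the kind of API call being described. The `/sdapi/v1/txt2img` endpoint and the `images`/`info` fields in the response are standard auto1111 behaviour, but the exact script title and `args` list under `alwayson_scripts` are assumptions here and depend on the installed Dynamic Prompts version (the `/sdapi/v1/scripts` endpoint lists the titles on a given install).

```python
# Sketch of calling txt2img with alwayson_scripts and recovering the data that
# would otherwise go into the CSV. The script title and args are assumptions;
# check /sdapi/v1/scripts for the exact title on your install.
import base64
import json
import requests

payload = {
    "prompt": "A {red|blue|green} {car|boat|plane|truck}",
    "batch_size": 4,
    "alwayson_scripts": {
        "dynamic prompts v2.x.x": {  # assumed title
            "args": [True],          # assumed args: first flag enables the script
        }
    },
}

resp = requests.post("http://127.0.0.1:7860/sdapi/v1/txt2img", json=payload).json()

# Images are not saved server-side when using the API, so decode them ourselves.
for i, img_b64 in enumerate(resp["images"]):
    with open(f"out_{i}.png", "wb") as f:
        f.write(base64.b64decode(img_b64))

# "info" is a JSON string containing all_prompts, all_negative_prompts, etc.
info = json.loads(resp["info"])
print(info["all_prompts"])
```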
EDIT: OK, I think I understand it now. The Magic Prompt Batch Size is relevant if we have more than 1 generation; for instance, if we set a Batch Size of 8 and a Magic Prompt Batch Size of 4, it will generate 8 prompts in 2 batches of 4, but we still need to set the generation count using the Batch Size/Batch Count parameters.
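If that reading is right, the split is just ceiling division over the total prompt count; a toy illustration (not the extension's actual code):

```python
# Toy illustration of the batching described above, not the extension's code.
import math

num_prompts = 8    # Batch Size x Batch Count
magic_batch = 4    # Magic Prompt Batch Size
batches = math.ceil(num_prompts / magic_batch)
print(batches)     # 2 -> two MagicPrompt calls of 4 prompts each
```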
Unfortunately, after a quick test, I don't think the returned object from the API is populated properly. Doing a batch of 4 images and disabling image generation, the `all_prompts` list contains only 1 string, presumably the one for the single generated image. It's odd, because `all_negative_prompts` (Magic Prompt disabled on negative prompts as well), `all_seeds` and `all_subseeds` contain 4 elements; it's just the prompts that come back as a single one.