Commit d202c2b

execution: Allow subgraph nodes to execute multiple times (#10499)
With --cache-none, lazy and subgraph execution can cause any node to run multiple times per workflow. If a re-run node is itself a subgraph generator, execution crashes for two reasons. First, pending_subgraph_results[] does not clean up entries after use. So when a pending_subgraph_result is consumed, remove it from the list; then, if the corresponding node is fully re-executed, the lookup misses and execution falls through to run the node as it should. Second, there is an explicit check against duplicates when subgraph nodes are added as ephemerals to the dynprompt. Remove this enforcement, as the use case is now valid.
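The consume-once pattern described above can be sketched as follows. This is a minimal illustration, not the real execution.py code: the names pending_subgraph_results, execute, and run_node are simplified stand-ins for the actual structures in the executor.

```python
# Hypothetical sketch of the fix: consume a pending subgraph result exactly
# once, so a later full re-execution of the same node (e.g. under
# --cache-none) misses the lookup and falls through to run the node again.
pending_subgraph_results = {}

def run_node(unique_id):
    # Stand-in for actually executing the node.
    return f"executed {unique_id}"

def execute(unique_id):
    pending = pending_subgraph_results.get(unique_id)
    if pending is not None:
        # Delete the entry as it is consumed (the one-line change in the
        # diff below is the analogous `del pending_subgraph_results[...]`).
        del pending_subgraph_results[unique_id]
        return pending
    # Lookup missed: fall through and execute the node normally.
    return run_node(unique_id)
```

Without the `del`, a second execution of the same node would keep returning the stale pending result instead of re-running the node.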
1 parent 8817f8f commit d202c2b

File tree

1 file changed

+1
-4
lines changed


execution.py

Lines changed: 1 addition & 4 deletions
@@ -445,6 +445,7 @@ async def execute(server, dynprompt, caches, current_item, extra_data, executed,
                 resolved_outputs.append(tuple(resolved_output))
             output_data = merge_result_data(resolved_outputs, class_def)
             output_ui = []
+            del pending_subgraph_results[unique_id]
             has_subgraph = False
         else:
             get_progress_state().start_progress(unique_id)
@@ -527,10 +528,6 @@ async def await_completion():
                 if new_graph is None:
                     cached_outputs.append((False, node_outputs))
                 else:
-                    # Check for conflicts
-                    for node_id in new_graph.keys():
-                        if dynprompt.has_node(node_id):
-                            raise DuplicateNodeError(f"Attempt to add duplicate node {node_id}. Ensure node ids are unique and deterministic or use graph_utils.GraphBuilder.")
                     for node_id, node_info in new_graph.items():
                         new_node_ids.append(node_id)
                         display_id = node_info.get("override_display_id", unique_id)

0 commit comments