
fix: Workflows are not terminated if errors are detected #163

Closed

asterisk-ragavan wants to merge 2 commits into OpenMS:main from asterisk-ragavan:fix-bug-#145

Conversation

@asterisk-ragavan
Copy link
Contributor

@asterisk-ragavan asterisk-ragavan commented Mar 7, 2025

  • Added error raising on line 108 to halt workflow execution when an error occurs.

  • Implemented level 2 logging for the initial error.

  • Captured and logged the first line of the error at level 0 for user visibility.

  • Ensures workflow termination and proper error reporting.

  • This will resolve issue #154 (Workflows are not terminated if errors are detected)
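A minimal sketch of the behavior described in the bullets above (the logger class and function name here are stand-ins for illustration, not the actual CommandExecutor API):

```python
import threading

class Logger:
    """Stand-in logger for illustration; the real project logger differs."""
    def __init__(self):
        self.messages = []

    def log(self, message, level=0):
        self.messages.append((level, message))

def run_command_sketch(error_message, stop_event, logger):
    """Hypothetical sketch: log at two levels, signal other threads, and halt."""
    if error_message:
        # Full error at level 2 for the detailed log
        logger.log(f"ERRORS OCCURRED:\n{error_message}", 2)
        # First line only at level 0, for user visibility
        logger.log(error_message.splitlines()[0], 0)
        # Signal sibling threads, then terminate the workflow
        stop_event.set()
        raise RuntimeError(f"Process failed: {error_message}")
```

Raising after setting the event is the key ordering: siblings see the flag even if the exception itself never leaves the thread.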

Summary by CodeRabbit

  • New Features
    • Enhanced how background processes can be halted, leading to smoother and more responsive command executions.
    • Improved error feedback by distinguishing less critical alerts from serious issues, ensuring users receive clearer information during operation.

@coderabbitai

coderabbitai bot commented Mar 7, 2025

Walkthrough

The changes update the CommandExecutor class to include a new stop_event parameter in the run_command method. This update allows the command execution to be halted early if the event is set, improving control in a multi-threaded context. Additionally, error handling within run_command now distinguishes between non-critical warnings and critical errors, with the latter triggering the stop_event and raising a RuntimeError.

Changes

File Change Summary
src/.../CommandExecutor.py Added a new stop_event parameter to run_command and updated run_multiple_commands to pass this parameter. Introduced an early exit check for stop_event.is_set() and refined error handling to log warnings or log errors, set the stop event, and raise a RuntimeError.
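The pattern the summary describes — one shared Event handed to every worker thread, checked before work starts and set on failure — can be sketched in isolation (the function names mimic the summary but are stand-ins, not the real OpenMS code):

```python
import threading

def run_command(command, stop_event, results):
    """Stand-in worker; the real method runs a subprocess."""
    if stop_event.is_set():
        results[command] = "skipped"   # early exit: another command failed
        return
    if command.startswith("bad"):
        stop_event.set()               # signal sibling threads to stop
        results[command] = "failed"
        return
    results[command] = "ok"

def run_multiple_commands(commands):
    """One shared Event is passed to every worker thread."""
    stop_event = threading.Event()
    results = {}
    threads = [threading.Thread(target=run_command,
                                args=(cmd, stop_event, results))
               for cmd in commands]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results, stop_event.is_set()
```

Note that commands already past the early-exit check still run to completion; the event only prevents work that has not started yet.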

Sequence Diagram(s)

sequenceDiagram
    participant Caller as Caller
    participant Executor as CommandExecutor
    participant Thread as Command Thread

    Caller ->> Executor: run_multiple_commands()
    Executor ->> Thread: run_command(command, stop_event)
    Note over Thread: Check if stop_event.is_set()
    alt stop_event is set
        Thread -->> Executor: Exit command early
    else stop_event not set
        Thread ->> Executor: Execute command
        alt Warning error occurs
            Executor -->> Thread: Log warning
        else Critical error occurs
            Executor ->> Thread: Log error, set stop_event
            Thread -->> Executor: Raise RuntimeError
        end
    end

Poem

I'm a rabbit with a curious gaze,
Hopping through codes in endless arrays.
Stop events guide my every leap,
In threads where commands no longer creep.
With a twitch of my nose, I see clear skies—
Code's magic unfolds before my eyes! 🐇

Tip

⚡🧪 Multi-step agentic review comment chat (experimental)
  • We're introducing multi-step agentic chat in review comments. This experimental feature enhances review discussions with the CodeRabbit agentic chat by enabling advanced interactions, including the ability to create pull requests directly from comments.
    - To enable this feature, set early_access to true in the settings.
✨ Finishing Touches
  • 📝 Generate Docstrings

🪧 Tips

Chat

There are 3 ways to chat with CodeRabbit:

  • Review comments: Directly reply to a review comment made by CodeRabbit. Example:
    • I pushed a fix in commit <commit_id>, please review it.
    • Generate unit testing code for this file.
    • Open a follow-up GitHub issue for this discussion.
  • Files and specific lines of code (under the "Files changed" tab): Tag @coderabbitai in a new review comment at the desired location with your query. Examples:
    • @coderabbitai generate unit testing code for this file.
    • @coderabbitai modularize this function.
  • PR comments: Tag @coderabbitai in a new PR comment to ask questions about the PR branch. For the best results, please provide a very specific query, as very limited context is provided in this mode. Examples:
    • @coderabbitai gather interesting stats about this repository and render them as a table. Additionally, render a pie chart showing the language distribution in the codebase.
    • @coderabbitai read src/utils.ts and generate unit testing code.
    • @coderabbitai read the files in the src/scheduler package and generate a class diagram using mermaid and a README in the markdown format.
    • @coderabbitai help me debug CodeRabbit configuration file.

Note: Be mindful of the bot's finite context window. It's strongly recommended to break down tasks such as reading entire modules into smaller chunks. For a focused discussion, use review comments to chat about specific files and their changes, instead of using the PR comments.

CodeRabbit Commands (Invoked using PR comments)

  • @coderabbitai pause to pause the reviews on a PR.
  • @coderabbitai resume to resume the paused reviews.
  • @coderabbitai review to trigger an incremental review. This is useful when automatic reviews are disabled for the repository.
  • @coderabbitai full review to do a full review from scratch and review all the files again.
  • @coderabbitai summary to regenerate the summary of the PR.
  • @coderabbitai generate docstrings to generate docstrings for this PR.
  • @coderabbitai resolve resolve all the CodeRabbit review comments.
  • @coderabbitai configuration to show the current CodeRabbit configuration for the repository.
  • @coderabbitai help to get help.

Other keywords and placeholders

  • Add @coderabbitai ignore anywhere in the PR description to prevent this PR from being reviewed.
  • Add @coderabbitai summary to generate the high-level summary at a specific location in the PR description.
  • Add @coderabbitai anywhere in the PR title to generate the title automatically.

CodeRabbit Configuration File (.coderabbit.yaml)

  • You can programmatically configure CodeRabbit by adding a .coderabbit.yaml file to the root of your repository.
  • Please see the configuration documentation for more information.
  • If your editor has YAML language server enabled, you can add the path at the top of this file to enable auto-completion and validation: # yaml-language-server: $schema=https://coderabbit.ai/integrations/schema.v2.json

Documentation and Community

  • Visit our Documentation for detailed information on how to use CodeRabbit.
  • Join our Discord Community to get help, request features, and share feedback.
  • Follow us on X/Twitter for updates and announcements.

@t0mdavid-m t0mdavid-m linked an issue Mar 11, 2025 that may be closed by this pull request
@t0mdavid-m t0mdavid-m self-requested a review March 11, 2025 14:38
Member

@t0mdavid-m t0mdavid-m left a comment


Thanks for your contribution!

Interestingly the error detection does not seem to detect the first error but another error further downstream. Below is the log for the problematic workflow discussed in #154.

STARTING WORKFLOW

Number of input mzML files: 4

Detecting features...

Running 4 commands in parallel...

Running command:
FeatureFinderMetabo -in ../workspaces-streamlit-template/default/topp-workflow/input-files/mzML-files/Control.mzML -out ../workspaces-streamlit-template/default/topp-workflow/results/feature-detection/Control.featureXML -algorithm:epd:masstrace_snr_filtering false -ini ../workspaces-streamlit-template/default/topp-workflow/ini/FeatureFinderMetabo.ini
Waiting for command to finish...

Running command:
FeatureFinderMetabo -in ../workspaces-streamlit-template/default/topp-workflow/input-files/mzML-files/Blank.mzML -out ../workspaces-streamlit-template/default/topp-workflow/results/feature-detection/Blank.featureXML -algorithm:epd:masstrace_snr_filtering false -ini ../workspaces-streamlit-template/default/topp-workflow/ini/FeatureFinderMetabo.ini
Waiting for command to finish...

Running command:
FeatureFinderMetabo -in ../workspaces-streamlit-template/default/topp-workflow/input-files/mzML-files/Pool.mzML -out ../workspaces-streamlit-template/default/topp-workflow/results/feature-detection/Pool.featureXML -algorithm:epd:masstrace_snr_filtering false -ini ../workspaces-streamlit-template/default/topp-workflow/ini/FeatureFinderMetabo.ini
Waiting for command to finish...

Running command:
FeatureFinderMetabo -in ../workspaces-streamlit-template/default/topp-workflow/input-files/mzML-files/Treatment.mzML -out ../workspaces-streamlit-template/default/topp-workflow/results/feature-detection/Treatment.featureXML -algorithm:epd:masstrace_snr_filtering false -ini ../workspaces-streamlit-template/default/topp-workflow/ini/FeatureFinderMetabo.ini
Waiting for command to finish...

Process finished:
FeatureFinderMetabo -in ../workspaces-streamlit-template/default/topp-workflow/input-files/mzML-files/Control.mzML -out ../workspaces-streamlit-template/default/topp-workflow/results/feature-detection/Control.featureXML -algorithm:epd:masstrace_snr_filtering false -ini ../workspaces-streamlit-template/default/topp-workflow/ini/FeatureFinderMetabo.ini
Total time to run command: 0.01 seconds

ERRORS OCCURRED:
Trailing text argument(s) '[false]' given. Aborting!

FeatureFinderMetabo -- Assembles metabolite features from centroided (LC-)MS data using the mass trace approach.
Full documentation: http://www.openms.de/doxygen/nightly/html/TOPP_FeatureFinderMetabo.html
Version: 3.1.0-pre-FDdevelop-2024-12-10 Dec 11 2024, 11:51:33, Revision: d27c9b5
To cite OpenMS:

  • Pfeuffer, J., Bielow, C., Wein, S. et al.. OpenMS 3 enables reproducible analysis of large-scale mass spectrometry data. Nat Methods (2024). doi:10.1038/s41592-024-02197-7.

Usage:
FeatureFinderMetabo

This tool has algorithm parameters that are not shown here! Please check the ini file for a detailed description or use the --helphelp option

Options (mandatory options marked with '*'):
-in * Centroided mzML file (valid formats: 'mzML')
-out * FeatureXML file with metabolite features (valid formats: 'featureXML')
-out_chrom Optional mzML file with chromatograms (valid formats: 'mzML')

Common TOPP options:
-ini Use the given TOPP INI file
-threads Sets the number of threads allowed to be used by the TOPP tool (default: '1')
-write_ini Writes the default configuration file
--help Shows options
--helphelp Shows all options (including advanced)

The following configuration subsections are valid:

  • algorithm Algorithm parameters section

You can write an example INI file using the '-write_ini' option.
Documentation of subsection parameters can be found in the doxygen documentation or the INIFileEditor.
For more information, please consult the online documentation for this tool:

Process finished:
FeatureFinderMetabo -in ../workspaces-streamlit-template/default/topp-workflow/input-files/mzML-files/Blank.mzML -out ../workspaces-streamlit-template/default/topp-workflow/results/feature-detection/Blank.featureXML -algorithm:epd:masstrace_snr_filtering false -ini ../workspaces-streamlit-template/default/topp-workflow/ini/FeatureFinderMetabo.ini
Total time to run command: 0.01 seconds

ERRORS OCCURRED:
Trailing text argument(s) '[false]' given. Aborting!

FeatureFinderMetabo -- Assembles metabolite features from centroided (LC-)MS data using the mass trace approach.
Full documentation: http://www.openms.de/doxygen/nightly/html/TOPP_FeatureFinderMetabo.html
Version: 3.1.0-pre-FDdevelop-2024-12-10 Dec 11 2024, 11:51:33, Revision: d27c9b5
To cite OpenMS:

  • Pfeuffer, J., Bielow, C., Wein, S. et al.. OpenMS 3 enables reproducible analysis of large-scale mass spectrometry data. Nat Methods (2024). doi:10.1038/s41592-024-02197-7.

Usage:
FeatureFinderMetabo

This tool has algorithm parameters that are not shown here! Please check the ini file for a detailed description or use the --helphelp option

Options (mandatory options marked with '*'):
-in * Centroided mzML file (valid formats: 'mzML')
-out * FeatureXML file with metabolite features (valid formats: 'featureXML')
-out_chrom Optional mzML file with chromatograms (valid formats: 'mzML')

Common TOPP options:
-ini Use the given TOPP INI file
-threads Sets the number of threads allowed to be used by the TOPP tool (default: '1')
-write_ini Writes the default configuration file
--help Shows options
--helphelp Shows all options (including advanced)

The following configuration subsections are valid:

  • algorithm Algorithm parameters section

You can write an example INI file using the '-write_ini' option.
Documentation of subsection parameters can be found in the doxygen documentation or the INIFileEditor.
For more information, please consult the online documentation for this tool:

Process finished:
FeatureFinderMetabo -in ../workspaces-streamlit-template/default/topp-workflow/input-files/mzML-files/Treatment.mzML -out ../workspaces-streamlit-template/default/topp-workflow/results/feature-detection/Treatment.featureXML -algorithm:epd:masstrace_snr_filtering false -ini ../workspaces-streamlit-template/default/topp-workflow/ini/FeatureFinderMetabo.ini
Total time to run command: 0.01 seconds

ERRORS OCCURRED:
Trailing text argument(s) '[false]' given. Aborting!

FeatureFinderMetabo -- Assembles metabolite features from centroided (LC-)MS data using the mass trace approach.
Full documentation: http://www.openms.de/doxygen/nightly/html/TOPP_FeatureFinderMetabo.html
Version: 3.1.0-pre-FDdevelop-2024-12-10 Dec 11 2024, 11:51:33, Revision: d27c9b5
To cite OpenMS:

  • Pfeuffer, J., Bielow, C., Wein, S. et al.. OpenMS 3 enables reproducible analysis of large-scale mass spectrometry data. Nat Methods (2024). doi:10.1038/s41592-024-02197-7.

Usage:
FeatureFinderMetabo

This tool has algorithm parameters that are not shown here! Please check the ini file for a detailed description or use the --helphelp option

Options (mandatory options marked with '*'):
-in * Centroided mzML file (valid formats: 'mzML')
-out * FeatureXML file with metabolite features (valid formats: 'featureXML')
-out_chrom Optional mzML file with chromatograms (valid formats: 'mzML')

Common TOPP options:
-ini Use the given TOPP INI file
-threads Sets the number of threads allowed to be used by the TOPP tool (default: '1')
-write_ini Writes the default configuration file
--help Shows options
--helphelp Shows all options (including advanced)

The following configuration subsections are valid:

  • algorithm Algorithm parameters section

You can write an example INI file using the '-write_ini' option.
Documentation of subsection parameters can be found in the doxygen documentation or the INIFileEditor.
For more information, please consult the online documentation for this tool:

Process finished:
FeatureFinderMetabo -in ../workspaces-streamlit-template/default/topp-workflow/input-files/mzML-files/Pool.mzML -out ../workspaces-streamlit-template/default/topp-workflow/results/feature-detection/Pool.featureXML -algorithm:epd:masstrace_snr_filtering false -ini ../workspaces-streamlit-template/default/topp-workflow/ini/FeatureFinderMetabo.ini
Total time to run command: 0.01 seconds

ERRORS OCCURRED:
Trailing text argument(s) '[false]' given. Aborting!

FeatureFinderMetabo -- Assembles metabolite features from centroided (LC-)MS data using the mass trace approach.
Full documentation: http://www.openms.de/doxygen/nightly/html/TOPP_FeatureFinderMetabo.html
Version: 3.1.0-pre-FDdevelop-2024-12-10 Dec 11 2024, 11:51:33, Revision: d27c9b5
To cite OpenMS:

  • Pfeuffer, J., Bielow, C., Wein, S. et al.. OpenMS 3 enables reproducible analysis of large-scale mass spectrometry data. Nat Methods (2024). doi:10.1038/s41592-024-02197-7.

Usage:
FeatureFinderMetabo

This tool has algorithm parameters that are not shown here! Please check the ini file for a detailed description or use the --helphelp option

Options (mandatory options marked with '*'):
-in * Centroided mzML file (valid formats: 'mzML')
-out * FeatureXML file with metabolite features (valid formats: 'featureXML')
-out_chrom Optional mzML file with chromatograms (valid formats: 'mzML')

Common TOPP options:
-ini Use the given TOPP INI file
-threads Sets the number of threads allowed to be used by the TOPP tool (default: '1')
-write_ini Writes the default configuration file
--help Shows options
--helphelp Shows all options (including advanced)

The following configuration subsections are valid:

  • algorithm Algorithm parameters section

You can write an example INI file using the '-write_ini' option.
Documentation of subsection parameters can be found in the doxygen documentation or the INIFileEditor.
For more information, please consult the online documentation for this tool:

Total time to run 4 commands: 0.01 seconds

Linking features...

Running command:
FeatureLinkerUnlabeledKD -in ../workspaces-streamlit-template/default/topp-workflow/results/feature-detection/Control.featureXML ../workspaces-streamlit-template/default/topp-workflow/results/feature-detection/Blank.featureXML ../workspaces-streamlit-template/default/topp-workflow/results/feature-detection/Treatment.featureXML ../workspaces-streamlit-template/default/topp-workflow/results/feature-detection/Pool.featureXML -out ../workspaces-streamlit-template/default/topp-workflow/results/feature-linking/feature_matrix.consensusXML -ini ../workspaces-streamlit-template/default/topp-workflow/ini/FeatureLinkerUnlabeledKD.ini
Waiting for command to finish...

Process finished:
FeatureLinkerUnlabeledKD -in ../workspaces-streamlit-template/default/topp-workflow/results/feature-detection/Control.featureXML ../workspaces-streamlit-template/default/topp-workflow/results/feature-detection/Blank.featureXML ../workspaces-streamlit-template/default/topp-workflow/results/feature-detection/Treatment.featureXML ../workspaces-streamlit-template/default/topp-workflow/results/feature-detection/Pool.featureXML -out ../workspaces-streamlit-template/default/topp-workflow/results/feature-linking/feature_matrix.consensusXML -ini ../workspaces-streamlit-template/default/topp-workflow/ini/FeatureLinkerUnlabeledKD.ini
Total time to run command: 0.01 seconds

ERRORS OCCURRED:
Cannot read input file given from parameter '-in'!
Error: File not found (the file '../workspaces-streamlit-template/default/topp-workflow/results/feature-detection/Control.featureXML' could not be found)

ERROR: Process failed: Cannot read input file given from parameter '-in'!

The error

ERROR: Process failed: Cannot read input file given from parameter '-in'!

fails the process but is the last tool in the workflow to raise an error. For some reason all previous errors seem to be ignored.

I suspect it might have to do with the execution occurring in a separate thread (so exceptions might not behave as you would expect). Perhaps you could try to kill the thread once an error is detected instead of raising an exception?

@asterisk-ragavan
Contributor Author

Okay, I saw that when executing multiple commands at the same time, it creates a thread for each command and runs them in parallel. But I thought raising the exception from the main thread would kill all the threads.

I'll look into it and try to kill all the other threads when an error occurs.

@asterisk-ragavan
Contributor Author

@t0mdavid-m added a stop signal to a thread handler to stop all threads when an error occurred, and also I have reverted back to the old log statement (like the main code)

Note: still didn't fix the command line arguments passed as a string, which is addressed in #90


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🧹 Nitpick comments (2)
src/workflow/CommandExecutor.py (2)

72-74: Update exception documentation

The docstring's "Raises" section should be updated to specifically mention the RuntimeError that's now explicitly raised for critical errors.

 Raises:
-    Exception: If the command execution results in any errors.
+    RuntimeError: If the command execution results in critical errors.
+    Exception: If other errors occur during command execution.

203-203: Consider integrating the stop method with stop_event

The stop method kills processes but isn't integrated with the new stop_event mechanism. Consider calling stop() when setting the stop_event to ensure immediate termination.

In run_command, you could add:

 if "UserWarning" in error_message:
     self.logger.log(f"WARNING:\n{error_message}", 2)
 else:
     self.logger.log(f"ERRORS OCCURRED:\n{error_message}", 2)
     stop_event.set()
+    self.stop()  # Immediately kill all processes
     raise RuntimeError(f"Process failed: {error_message}")

Alternatively, consider modifying the stop method to check and respect the stop_event:

 def stop(self) -> None:
     """
     Terminates all processes initiated by this executor by killing them based on stored PIDs.
     """
+    # If any thread has set the stop_event, log an error message
+    if hasattr(self, 'stop_event') and self.stop_event.is_set():
+        self.logger.log("Stopping due to errors in workflow execution")
     
     self.logger.log("Stopping all running processes...")
     # Rest of the method...
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between d1b92be and af7e264.

📒 Files selected for processing (1)
  • src/workflow/CommandExecutor.py (6 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (3)
  • GitHub Check: build-simple-app
  • GitHub Check: build-openms
  • GitHub Check: build-full-app
🔇 Additional comments (6)
src/workflow/CommandExecutor.py (6)

48-48: Good implementation of thread coordination with stop_event

Adding a thread-safe event object to coordinate stopping threads is the right approach to handle error propagation in a multi-threaded environment.


52-52: LGTM: Properly passing stop_event to each thread

The implementation correctly passes the stop_event to each thread, ensuring they can all be signaled to stop if an error occurs.


64-70: Correctly updated method signature and documentation

The method signature and documentation have been properly updated to include the new stop_event parameter, maintaining code clarity.


75-77: Good early termination check

This early check prevents unnecessary work if the stop event has been set, improving efficiency in error scenarios.


111-116: Effective error handling implementation

This implementation correctly distinguishes between non-critical warnings and critical errors, sets the stop event for critical errors, and raises an exception with the error message.


276-276: Inconsistent stop_event usage (duplicate)

Same issue as in line 261 - creating a new threading.Event() for each command means these won't coordinate with other threads for stopping.

 self.logger.log(f"WARNING: No DEFAULTS found in {path.name}")
 # run command without params
-self.run_command(["python", str(path)])
+self.run_command(["python", str(path)], threading.Event())

🛠️ Refactor suggestion

Inconsistent stop_event usage

Each call to run_command in run_python creates a new threading.Event(), which won't coordinate with other threads. If an error occurs in another thread, these commands won't stop.

Consider refactoring to maintain a consistent stop_event approach:

 def run_python(self, script_file: str, input_output: dict = {}) -> None:
+    stop_event = threading.Event()
     # ...
-    self.run_command(["python", str(path)], threading.Event())
+    self.run_command(["python", str(path)], stop_event)
     # ...
-    self.run_command(["python", str(path), str(tmp_params_file)], threading.Event())
+    self.run_command(["python", str(path), str(tmp_params_file)], stop_event)

Or consider passing a class-level stop_event that's initialized in __init__ and used throughout all methods.
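A hedged sketch of that class-level alternative (hypothetical names; the failing command is simulated rather than run as a subprocess):

```python
import threading

class CommandExecutor:
    """Illustrative sketch only; not the actual OpenMS implementation."""

    def __init__(self):
        # One shared event for the executor's whole lifetime, so every
        # run_command / run_python call coordinates on the same flag.
        self.stop_event = threading.Event()

    def run_command(self, command):
        if self.stop_event.is_set():
            return None  # a previous command failed; skip this one
        if command == "bad":  # stand-in for a failing subprocess
            self.stop_event.set()
            raise RuntimeError(f"Process failed: {command}")
        return "ok"

    def run_python(self, script):
        # No per-call threading.Event() anymore; reuses the shared flag.
        return self.run_command(script)
```

The trade-off is that a single executor instance cannot be reused after a failure without explicitly clearing the event.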

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
 def run_python(self, script_file: str, input_output: dict = {}) -> None:
+    stop_event = threading.Event()
     # ...
-    self.run_command(["python", str(path)], threading.Event())
+    self.run_command(["python", str(path)], stop_event)
     # ...
-    self.run_command(["python", str(path), str(tmp_params_file)], threading.Event())
+    self.run_command(["python", str(path), str(tmp_params_file)], stop_event)

Comment on lines +199 to +201

 stop_event = threading.Event()
 if len(commands) == 1:
-    self.run_command(commands[0])
+    self.run_command(commands[0], stop_event)

⚠️ Potential issue

Fix needed: Stop event not used in run_multiple_commands call

The stop_event created in line 199 is used for direct run_command calls but not passed to run_multiple_commands in line 203, creating an inconsistency in error handling.

Apply this fix:

 # Run command(s)
 stop_event = threading.Event()
 if len(commands) == 1:
     self.run_command(commands[0], stop_event)
 elif len(commands) > 1:
-    self.run_multiple_commands(commands)
+    self.run_multiple_commands(commands)  # This uses its own stop_event internally
 else:
     raise Exception("No commands to execute.")

Consider refactoring to use the same stop_event for all command executions:

 # Run command(s)
 stop_event = threading.Event()
 if len(commands) == 1:
     self.run_command(commands[0], stop_event)
 elif len(commands) > 1:
-    self.run_multiple_commands(commands)
+    # Either modify run_multiple_commands to accept a stop_event
+    # self.run_multiple_commands(commands, stop_event)
+    # Or pass each command individually
+    threads = []
+    for cmd in commands:
+        thread = threading.Thread(target=self.run_command, args=(cmd, stop_event))
+        thread.start()
+        threads.append(thread)
+    for thread in threads:
+        thread.join()
 else:
     raise Exception("No commands to execute.")
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
 # Run command(s)
 stop_event = threading.Event()
 if len(commands) == 1:
     self.run_command(commands[0], stop_event)
 elif len(commands) > 1:
     # Either modify run_multiple_commands to accept a stop_event
     # self.run_multiple_commands(commands, stop_event)
     # Or pass each command individually
     threads = []
     for cmd in commands:
         thread = threading.Thread(target=self.run_command, args=(cmd, stop_event))
         thread.start()
         threads.append(thread)
     for thread in threads:
         thread.join()
 else:
     raise Exception("No commands to execute.")

Member

@t0mdavid-m t0mdavid-m left a comment


Unfortunately, this does not seem to have any effect; the process still fails as in the previous iteration. Just double-checking: can you replicate the issue on your end?

@asterisk-ragavan
Contributor Author

I would like to close this and work from scratch; I found a possible solution on Stack Overflow.

I am not able to recreate the original error from my branch, so I'd rather close this and start fresh.

@t0mdavid-m
Member

Why does this necessitate closing the issue? I am still getting the error when running the workflows from your branch. What output do you see when you run the workflows?

@t0mdavid-m
Member

@asterisk-ragavan I just wanted to check-in if you are still working on this? Were you able to recreate the failing workflow?


Development

Successfully merging this pull request may close these issues.

Workflows are not terminated if errors are detected

2 participants