
feat: updated version of pretext workflow developed by Delphine that takes HiFi and HiC as input. #584

Merged
mvdbeek merged 40 commits into galaxyproject:main from Smeds:update-pretext
Mar 28, 2025

Conversation

@Smeds
Collaborator

@Smeds Smeds commented Oct 28, 2024

updated version of pretext workflow developed by Delphine that takes HiFi and HiC as input.

@Smeds Smeds marked this pull request as draft October 28, 2024 14:03
@Smeds Smeds changed the title feat: updated version of pretext workflow developed by Delphine that … feat: updated version of pretext workflow developed by Delphine that takes HiFi and HiC as input. Oct 28, 2024
@mvdbeek
Member

mvdbeek commented Oct 30, 2024

Nice! Can you take a look at https://github.com/galaxyproject/iwc/blob/main/workflows/README.md#adding-workflows ?

@Smeds
Collaborator Author

Smeds commented Oct 30, 2024

Thanks! I will read through the readme and update the workflow.

@Delphine-L
Contributor

We need to solve galaxyproject/galaxy#19143 as soon as possible to settle on a way to deal with compressed haplotype inputs.

@Delphine-L Delphine-L marked this pull request as ready for review November 26, 2024 03:37
@Delphine-L
Contributor

I am not sure why the tests are not running. Could someone take a look?

@mvdbeek
Member

mvdbeek commented Nov 26, 2024

[Screenshot attached: 2024-11-26 at 09:51:41]

I assume that's the reason?

@github-actions

Test Results (powered by Planemo)

Test Summary

Test State Count
Total 0
Passed 0
Error 0
Failure 0
Skipped 0

@Delphine-L Delphine-L added and then removed the autoupdate 'artefact' label (autoupdate generated a new version, but it is not worth releasing) Feb 4, 2025
@github-actions

Test Results (powered by Planemo)

Test Summary

Test State Count
Total 1
Passed 0
Error 1
Failure 0
Skipped 0
Errored Tests
  • ❌ hi-c-map-for-assembly-manual-curation.ga_0

    Execution Problem:

    • Unexpected HTTP status code: 400: {"err_msg":"Workflow cannot be run because input step '49' (Haplotype 2) is not optional and no input provided.","err_code":0}
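      The error means the test invocation supplied nothing for the "Haplotype 2" input while that step was still marked as required. In a Galaxy `.ga` workflow file this is governed by the `optional` flag inside the data-input step's `tool_state`; a minimal sketch of what a permissive input step could look like (the step id, label, and surrounding fields here are illustrative, not copied from this PR's workflow file):

      ```json
      {
        "49": {
          "label": "Haplotype 2",
          "type": "data_input",
          "tool_state": "{\"optional\": true, \"tag\": null}"
        }
      }
      ```

      With `optional` set to `true`, an invocation that omits this input can still schedule.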
      

@Delphine-L
Contributor

@Smeds @mvdbeek I'd be grateful for feedback on the latest version of the workflow.

I modified the workflow to:

  1. use bwa-mem2 on a paired collection instead of the filter and merge tool
  2. take either 1 or 2 haplotypes
  3. generate pretext maps with coverage and gap tracks even if the telomeres track is empty (with all 3 tracks if it is not). I am not sure how often it happens that no telomeres are detected, but the fallback is there just in case.

The tests are:

  1. one haplotype with telomeres found
  2. one haplotype without telomeres found

I am not testing 2 haplotypes right now because it takes too much memory for the CI to run; I am working on using smaller scaffolds to test that case.
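The fallback in point 3 leans on the pick_value tool with its "first" pick style, which yields the first non-empty candidate from a list. A minimal Python sketch of that selection logic (the function name and example values are illustrative, not part of the workflow itself):

```python
def pick_first(candidates):
    """Return the first candidate that is not None/empty, mirroring
    pick_value's 'first' pick style used for the optional telomere track."""
    for value in candidates:
        if value:  # skips None and empty tracks
            return value
    return None


# e.g. fall back to a coverage+gaps-only map when no telomeres are found
telomere_track = None
default_tracks = ["coverage", "gaps"]
chosen = pick_first([telomere_track, default_tracks])
```

Listing the optional track first and a guaranteed default second means the workflow always has something to hand downstream.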

@Delphine-L Delphine-L enabled auto-merge March 27, 2025 15:55
@mvdbeek mvdbeek disabled auto-merge March 27, 2025 16:02
Delphine-L and others added 2 commits March 27, 2025 12:09
…-curation/hi-c-map-for-assembly-manual-curation.ga

Co-authored-by: Marius van den Beek <m.vandenbeek@gmail.com>
…-curation/hi-c-map-for-assembly-manual-curation.ga

Co-authored-by: Marius van den Beek <m.vandenbeek@gmail.com>
@mvdbeek mvdbeek enabled auto-merge March 27, 2025 16:11
@github-actions

Test Results (powered by Planemo)

Test Summary

Test State Count
Total 2
Passed 2
Error 0
Failure 0
Skipped 0
Passed Tests
  • ✅ hi-c-map-for-assembly-manual-curation.ga_0

    Workflow invocation details

    • Invocation Messages

    • Steps
      • Step 1: Haplotype 1:

        • step_state: scheduled
      • Step 2: Will you use a second haplotype?:

        • step_state: scheduled
      • Step 11: Hap2 not provided:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              input_param_type {"__current_case__": 3, "input_param": false, "mappings": [{"__index__": 0, "from": false, "to": "true"}, {"__index__": 1, "from": true, "to": "false"}], "type": "boolean"}
              output_param_type "boolean"
              unmapped {"__current_case__": 2, "default_value": "false", "on_unmapped": "default"}
      • Step 12: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Hap1:

            • step_state: scheduled
          • Step 2: Hap2:

            • step_state: scheduled
          • Step 11: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "aafebfa90b2811f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 10, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 2, "src": "hda"}]}}]}}
          • Step 12: toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/tp_cat/9.3+galaxy1:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "aafebfa90b2811f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  queries [{"__index__": 0, "inputs2": {"values": [{"id": 12, "src": "hda"}]}}]
          • Step 3: Do you want to add suffixes to the scaffold names?:

            • step_state: scheduled
          • Step 4: Hap1 suffix:

            • step_state: scheduled
          • Step 5: Hap2 suffix:

            • step_state: scheduled
          • Step 6: Expression for hap1 suffixing:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "aafebfa90b2811f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "&.", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "H1", "select_param_type": "text"}}]
                  dbkey "?"
          • Step 7: Expression for hap2 suffixing:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "aafebfa90b2811f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "&.", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "H2", "select_param_type": "text"}}]
                  dbkey "?"
          • Step 8: add hap1 suffix:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "aafebfa90b2811f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "find_pattern": ">.+$", "replace_pattern": null}]
          • Step 9: add hap2 suffix:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "aafebfa90b2811f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "find_pattern": ">.+$", "replace_pattern": null}]
          • Step 10: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "aafebfa90b2811f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 9, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 1, "src": "hda"}]}}]}}
      • Step 13: concatenate HiFi:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cat '/tmp/tmpjw6cynca/files/f/a/3/dataset_fa361fb2-928f-4ed7-b3ce-3e94e0a2fc19.dat' >> '/tmp/tmpjw6cynca/job_working_directory/000/14/outputs/dataset_55626df1-c303-448c-a660-03c17d08c884.dat' && exit 0

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              queries []
      • Step 14: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Hap1:

            • step_state: scheduled
          • Step 2: Do you want to add suffixes to the scaffold names?:

            • step_state: scheduled
          • Step 3: Hap1 suffix:

            • step_state: scheduled
          • Step 4: Expression for hap1 suffixing:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "aafebfaa0b2811f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "&.", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "H1", "select_param_type": "text"}}]
                  dbkey "?"
          • Step 5: add hap1 suffix:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • sed -r --sandbox -e 's/>.+$/&.H1/g' '/tmp/tmpjw6cynca/files/b/b/d/dataset_bbd8ca79-5158-4561-9ce2-b97edd1d5ee9.dat' > '/tmp/tmpjw6cynca/job_working_directory/000/16/outputs/dataset_45ad0f19-a975-4dd9-8a9c-399b876a7110.dat'

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "aafebfaa0b2811f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "find_pattern": ">.+$", "replace_pattern": "&.H1"}]
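            The sed call in this step appends the haplotype suffix to every FASTA header line; in sed replacement syntax, `&` stands for the whole matched text. A quick illustration on a made-up header (the scaffold name is hypothetical):

            ```shell
            # '&' expands to the matched header line, so '.H1' is appended to it
            echo '>scaffold_1' | sed -r 's/>.+$/&.H1/g'
            # prints: >scaffold_1.H1
            ```

            Sequence lines are untouched because the pattern requires a leading '>'.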
          • Step 6: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "aafebfaa0b2811f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 16, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 1, "src": "hda"}]}}]}}
      • Step 15: Pick Assembly:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 13, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 17, "src": "hda"}]}}]}}
      • Step 16: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Reference:

            • step_state: scheduled
          • Step 2: Hi-C reads:

            • step_state: scheduled
          • Step 3: Do you want to trim the Hi-C data?:

            • step_state: scheduled
          • Step 4: Trim Hi-C reads 2:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • ln -f -s '/tmp/tmpjw6cynca/files/3/a/a/dataset_3aab135c-81d0-452a-9b2c-d51e82ff9523.dat' 'Hi-C reads_1.fq.gz' && ln -f -s '/tmp/tmpjw6cynca/files/4/b/3/dataset_4b3834d9-9d1e-4668-92f7-8e961459e7ba.dat' 'Hi-C reads_2.fq.gz' &&  cutadapt  -j=${GALAXY_SLOTS:-4}     --error-rate=0.1 --times=1 --overlap=3    --action=trim   --cut=5 -U 5       --minimum-length=1      -o 'out1.fq.gz' -p 'out2.fq.gz'  'Hi-C reads_1.fq.gz' 'Hi-C reads_2.fq.gz'  > report.txt

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "aafebfab0b2811f0aeed000d3a3012a8"
                  adapter_options {"action": "trim", "error_rate": "0.1", "match_read_wildcards": false, "no_indels": false, "no_match_adapter_wildcards": true, "overlap": "3", "revcomp": false, "times": "1"}
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  filter_options {"discard_casava": false, "discard_trimmed": false, "discard_untrimmed": false, "max_average_error_rate": null, "max_expected_errors": null, "max_n": null, "maximum_length": null, "maximum_length2": null, "minimum_length": "1", "minimum_length2": null, "pair_filter": "any"}
                  library {"__current_case__": 2, "input_1": {"values": [{"id": 1, "src": "dce"}]}, "pair_adapters": false, "r1": {"adapters": [], "anywhere_adapters": [], "front_adapters": []}, "r2": {"adapters2": [], "anywhere_adapters2": [], "front_adapters2": []}, "type": "paired_collection"}
                  other_trimming_options {"cut": "5", "cut2": "5", "nextseq_trim": "0", "poly_a": false, "quality_cutoff": "0", "quality_cutoff2": "", "shorten_options": {"__current_case__": 1, "shorten_values": "False"}, "shorten_options_r2": {"__current_case__": 1, "shorten_values_r2": "False"}, "trim_n": false}
                  output_selector ["report"]
                  read_mod_options {"length_tag": null, "rename": null, "strip_suffix": null, "zero_cap": false}
          • Step 5: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "aafebfab0b2811f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 7, "src": "dce"}]}}, {"__index__": 1, "value": {"values": [{"id": 2, "src": "dce"}]}}]}}
              • Job 2:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "aafebfab0b2811f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 8, "src": "dce"}]}}, {"__index__": 1, "value": {"values": [{"id": 3, "src": "dce"}]}}]}}
          • Step 6: toolshed.g2.bx.psu.edu/repos/iuc/bwa_mem2/bwa_mem2/2.2.1+galaxy1:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • set -o | grep -q pipefail && set -o pipefail;  ln -s '/tmp/tmpjw6cynca/files/4/5/a/dataset_45ad0f19-a975-4dd9-8a9c-399b876a7110.dat' 'localref.fa' && bwa-mem2 index 'localref.fa' &&    bwa-mem2 mem -t "${GALAXY_SLOTS:-1}" -v 1   -k '19' -w '100' -d '100' -r '1.5' -y '20' -c '500' -D '0.5' -W '0' -m '50' -S -P    -T '20' -h '5' -a                      'localref.fa' '/tmp/tmpjw6cynca/files/9/9/8/dataset_9982f7a6-75ce-4729-a553-84eef307e54f.dat' '/tmp/tmpjw6cynca/files/3/0/8/dataset_308fc953-fa3c-4aa2-8158-585160c45907.dat'  | samtools sort -@${GALAXY_SLOTS:-2} -T "${TMPDIR:-.}" -O bam -o '/tmp/tmpjw6cynca/job_working_directory/000/22/outputs/dataset_201e0aaa-48a3-4dcb-9eec-6e19e90917f7.dat'

                Exit Code:

                • 0

                Standard Error:

                • Looking to launch executable "/usr/local/bin/bwa-mem2.avx2", simd = .avx2
                  Launching executable "/usr/local/bin/bwa-mem2.avx2"
                  [bwa_index] Pack FASTA... 0.62 sec
                  * Entering FMI_search
                  init ticks = 10203051845
                  ref seq len = 279307354
                  binary seq ticks = 5180157794
                  build suffix-array ticks = 91861858347
                  pos: 34913420, ref_seq_len__: 34913419
                  build fm-index ticks = 21767177838
                  Total time taken: 53.4820
                  Looking to launch executable "/usr/local/bin/bwa-mem2.avx2", simd = .avx2
                  Launching executable "/usr/local/bin/bwa-mem2.avx2"
                  -----------------------------
                  Executing in AVX2 mode!!
                  -----------------------------
                  * SA compression enabled with xfactor: 8
                  * Ref file: localref.fa
                  * Entering FMI_search
                  * Index file found. Loading index from localref.fa.bwt.2bit.64
                  * Reference seq len for bi-index = 279307355
                  * sentinel-index: 13446364
                  * Count:
                  0,	1
                  1,	84204314
                  2,	139653678
                  3,	195103042
                  4,	279307355
                  
                  * Reading other elements of the index from files localref.fa
                  * Index prefix: localref.fa
                  * Read 0 ALT contigs
                  * Done reading Index!!
                  * Reading reference genome..
                  * Binary seq file = localref.fa.0123
                  * Reference genome size: 279307354 bp
                  * Done reading reference genome !!
                  
                  ------------------------------------------
                  1. Memory pre-allocation for Chaining: 139.3584 MB
                  2. Memory pre-allocation for BSW: 239.6170 MB
                  3. Memory pre-allocation for BWT: 77.3142 MB
                  ------------------------------------------
                  * Threads used (compute): 1
                  * No. of pipeline threads: 2
                  
                  [0000] read_chunk: 10000000, work_chunk_size: 10000124, nseq: 68494
                  	[0000][ M::kt_pipeline] read 68494 sequences (10000124 bp)...
                  [0000] Reallocating initial memory allocations!!
                  [0000] Calling mem_process_seqs.., task: 0
                  [0000] 1. Calling kt_for - worker_bwt
                  [0000] read_chunk: 10000000, work_chunk_size: 10000124, nseq: 68494
                  	[0000][ M::kt_pipeline] read 68494 sequences (10000124 bp)...
                  [0000] 2. Calling kt_for - worker_aln
                  [0000] Inferring insert size distribution of PE reads from data, l_pac: 139653677, n: 68494
                  [0000][PE] analyzing insert size distribution for orientation FF...
                  [0000][PE] (25, 50, 75) percentile: (1649, 4140, 6027)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 14783)
                  [0000][PE] mean and std.dev: (4063.15, 2569.67)
                  [0000][PE] low and high boundaries for proper pairs: (1, 19161)
                  [0000][PE] analyzing insert size distribution for orientation FR...
                  [0000][PE] (25, 50, 75) percentile: (157, 248, 382)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 832)
                  [0000][PE] mean and std.dev: (254.60, 148.66)
                  [0000][PE] low and high boundaries for proper pairs: (1, 1057)
                  [0000][PE] analyzing insert size distribution for orientation RF...
                  [0000][PE] (25, 50, 75) percentile: (2814, 4404, 5708)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 11496)
                  [0000][PE] mean and std.dev: (4321.60, 2279.57)
                  [0000][PE] low and high boundaries for proper pairs: (1, 14390)
                  [0000][PE] analyzing insert size distribution for orientation RR...
                  [0000][PE] (25, 50, 75) percentile: (1378, 2846, 5532)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 13840)
                  [0000][PE] mean and std.dev: (3534.90, 2567.86)
                  [0000][PE] low and high boundaries for proper pairs: (1, 17994)
                  [0000] 3. Calling kt_for - worker_sam
                  	[0000][ M::mem_process_seqs] Processed 68494 reads in 33.921 CPU sec, 34.042 real sec
                  [0000] Calling mem_process_seqs.., task: 1
                  [0000] 1. Calling kt_for - worker_bwt
                  [0000] read_chunk: 10000000, work_chunk_size: 10000124, nseq: 68494
                  	[0000][ M::kt_pipeline] read 68494 sequences (10000124 bp)...
                  [0000] 2. Calling kt_for - worker_aln
                  [0000] Inferring insert size distribution of PE reads from data, l_pac: 139653677, n: 68494
                  [0000][PE] analyzing insert size distribution for orientation FF...
                  [0000][PE] (25, 50, 75) percentile: (1352, 2928, 5414)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 13538)
                  [0000][PE] mean and std.dev: (3677.61, 2704.12)
                  [0000][PE] low and high boundaries for proper pairs: (1, 17600)
                  [0000][PE] analyzing insert size distribution for orientation FR...
                  [0000][PE] (25, 50, 75) percentile: (155, 240, 368)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 794)
                  [0000][PE] mean and std.dev: (241.09, 138.59)
                  [0000][PE] low and high boundaries for proper pairs: (1, 1007)
                  [0000][PE] analyzing insert size distribution for orientation RF...
                  [0000][PE] (25, 50, 75) percentile: (2050, 3954, 6826)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 16378)
                  [0000][PE] mean and std.dev: (4268.57, 2763.35)
                  [0000][PE] low and high boundaries for proper pairs: (1, 21154)
                  [0000][PE] analyzing insert size distribution for orientation RR...
                  [0000][PE] (25, 50, 75) percentile: (1927, 3814, 7217)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 17797)
                  [0000][PE] mean and std.dev: (4449.25, 2980.39)
                  [0000][PE] low and high boundaries for proper pairs: (1, 23087)
                  [0000] 3. Calling kt_for - worker_sam
                  	[0000][ M::mem_process_seqs] Processed 68494 reads in 32.958 CPU sec, 33.871 real sec
                  [0000] Calling mem_process_seqs.., task: 2
                  [0000] 1. Calling kt_for - worker_bwt
                  [0000] read_chunk: 10000000, work_chunk_size: 10000124, nseq: 68494
                  	[0000][ M::kt_pipeline] read 68494 sequences (10000124 bp)...
                  [0000] 2. Calling kt_for - worker_aln
                  [0000] Inferring insert size distribution of PE reads from data, l_pac: 139653677, n: 68494
                  [0000][PE] analyzing insert size distribution for orientation FF...
                  [0000][PE] (25, 50, 75) percentile: (1701, 3918, 5911)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 14331)
                  [0000][PE] mean and std.dev: (4026.81, 2594.52)
                  [0000][PE] low and high boundaries for proper pairs: (1, 18541)
                  [0000][PE] analyzing insert size distribution for orientation FR...
                  [0000][PE] (25, 50, 75) percentile: (154, 239, 432)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 988)
                  [0000][PE] mean and std.dev: (258.53, 170.79)
                  [0000][PE] low and high boundaries for proper pairs: (1, 1266)
                  [0000][PE] analyzing insert size distribution for orientation RF...
                  [0000][PE] (25, 50, 75) percentile: (1694, 2863, 6136)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 15020)
                  [0000][PE] mean and std.dev: (3923.51, 2842.81)
                  [0000][PE] low and high boundaries for proper pairs: (1, 19462)
                  [0000][PE] analyzing insert size distribution for orientation RR...
                  [0000][PE] (25, 50, 75) percentile: (2159, 4591, 5807)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 13103)
                  [0000][PE] mean and std.dev: (4308.35, 2518.58)
                  [0000][PE] low and high boundaries for proper pairs: (1, 16751)
                  [0000] 3. Calling kt_for - worker_sam
                  	[0000][ M::mem_process_seqs] Processed 68494 reads in 32.799 CPU sec, 33.724 real sec
                  [0000] Calling mem_process_seqs.., task: 3
                  [0000] 1. Calling kt_for - worker_bwt
                  [0000] 2. Calling kt_for - worker_aln
                  [0000] read_chunk: 10000000, work_chunk_size: 10000124, nseq: 68494
                  	[0000][ M::kt_pipeline] read 68494 sequences (10000124 bp)...
                  [0000] Inferring insert size distribution of PE reads from data, l_pac: 139653677, n: 68494
                  [0000][PE] analyzing insert size distribution for orientation FF...
                  [0000][PE] (25, 50, 75) percentile: (1288, 3079, 5247)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 13165)
                  [0000][PE] mean and std.dev: (3554.44, 2757.92)
                  [0000][PE] low and high boundaries for proper pairs: (1, 17124)
                  [0000][PE] analyzing insert size distribution for orientation FR...
                  [0000][PE] (25, 50, 75) percentile: (164, 272, 454)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 1034)
                  [0000][PE] mean and std.dev: (283.81, 181.48)
                  [0000][PE] low and high boundaries for proper pairs: (1, 1324)
                  [0000][PE] analyzing insert size distribution for orientation RF...
                  [0000][PE] (25, 50, 75) percentile: (1107, 3976, 6440)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 17106)
                  [0000][PE] mean and std.dev: (3918.84, 2949.26)
                  [0000][PE] low and high boundaries for proper pairs: (1, 22439)
                  [0000][PE] analyzing insert size distribution for orientation RR...
                  [0000][PE] (25, 50, 75) percentile: (1780, 4065, 6647)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 16381)
                  [0000][PE] mean and std.dev: (4110.26, 2635.19)
                  [0000][PE] low and high boundaries for proper pairs: (1, 21248)
                  [0000] 3. Calling kt_for - worker_sam
                  	[0000][ M::mem_process_seqs] Processed 68494 reads in 32.706 CPU sec, 40.735 real sec
                  [0000] Calling mem_process_seqs.., task: 4
                  [0000] 1. Calling kt_for - worker_bwt
                  [0000] read_chunk: 10000000, work_chunk_size: 3899368, nseq: 26708
                  	[0000][ M::kt_pipeline] read 26708 sequences (3899368 bp)...
                  [0000] 2. Calling kt_for - worker_aln
                  [0000] Inferring insert size distribution of PE reads from data, l_pac: 139653677, n: 68494
                  [0000][PE] analyzing insert size distribution for orientation FF...
                  [0000][PE] (25, 50, 75) percentile: (1322, 3322, 5913)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 15095)
                  [0000][PE] mean and std.dev: (3738.54, 2615.93)
                  [0000][PE] low and high boundaries for proper pairs: (1, 19686)
                  [0000][PE] analyzing insert size distribution for orientation FR...
                  [0000][PE] (25, 50, 75) percentile: (159, 244, 399)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 879)
                  [0000][PE] mean and std.dev: (247.93, 143.14)
                  [0000][PE] low and high boundaries for proper pairs: (1, 1119)
                  [0000][PE] analyzing insert size distribution for orientation RF...
                  [0000][PE] (25, 50, 75) percentile: (1392, 2569, 5634)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 14118)
                  [0000][PE] mean and std.dev: (3364.92, 2668.75)
                  [0000][PE] low and high boundaries for proper pairs: (1, 18360)
                  [0000][PE] analyzing insert size distribution for orientation RR...
                  [0000][PE] (25, 50, 75) percentile: (1915, 3629, 5977)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 14101)
                  [0000][PE] mean and std.dev: (4139.37, 2620.32)
                  [0000][PE] low and high boundaries for proper pairs: (1, 18163)
                  [0000] 3. Calling kt_for - worker_sam
                  	[0000][ M::mem_process_seqs] Processed 68494 reads in 29.939 CPU sec, 30.803 real sec
                  [0000] Calling mem_process_seqs.., task: 5
                  [0000] 1. Calling kt_for - worker_bwt
                  [0000] read_chunk: 10000000, work_chunk_size: 0, nseq: 0
                  [0000] 2. Calling kt_for - worker_aln
                  [0000] Inferring insert size distribution of PE reads from data, l_pac: 139653677, n: 26708
                  [0000][PE] analyzing insert size distribution for orientation FF...
                  [0000][PE] (25, 50, 75) percentile: (1905, 3648, 6371)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 15303)
                  [0000][PE] mean and std.dev: (3698.70, 2635.91)
                  [0000][PE] low and high boundaries for proper pairs: (1, 19769)
                  [0000][PE] analyzing insert size distribution for orientation FR...
                  [0000][PE] (25, 50, 75) percentile: (154, 254, 370)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 802)
                  [0000][PE] mean and std.dev: (249.25, 145.76)
                  [0000][PE] low and high boundaries for proper pairs: (1, 1018)
                  [0000][PE] analyzing insert size distribution for orientation RF...
                  [0000][PE] (25, 50, 75) percentile: (1977, 3673, 6050)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 14196)
                  [0000][PE] mean and std.dev: (4176.14, 2655.14)
                  [0000][PE] low and high boundaries for proper pairs: (1, 18269)
                  [0000][PE] analyzing insert size distribution for orientation RR...
                  [0000][PE] (25, 50, 75) percentile: (1826, 3080, 5221)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 12011)
                  [0000][PE] mean and std.dev: (3522.50, 2457.06)
                  [0000][PE] low and high boundaries for proper pairs: (1, 15406)
                  [0000] 3. Calling kt_for - worker_sam
                  	[0000][ M::mem_process_seqs] Processed 26708 reads in 11.809 CPU sec, 12.556 real sec
                  [0000] read_chunk: 10000000, work_chunk_size: 0, nseq: 0
                  [0000] Computation ends..
                  No. of OMP threads: 1
                  Processor is running @2445.580347 MHz
                  Runtime profile:
                  
                  	Time taken for main_mem function: 186.54 sec
                  
                  	IO times (sec) :
                  	Reading IO time (reads) avg: 1.58, (1.58, 1.58)
                  	Writing IO time (SAM) avg: 25.10, (25.10, 25.10)
                  	Reading IO time (Reference Genome) avg: 0.05, (0.05, 0.05)
                  	Index read time avg: 0.25, (0.25, 0.25)
                  
                  	Overall time (sec) (Excluding Index reading time):
                  	PROCESS() (Total compute time + (read + SAM) IO time) : 186.24
                  	MEM_PROCESS_SEQ() (Total compute time (Kernel + SAM)), avg: 185.72, (185.72, 185.72)
                  
                  	 SAM Processing time (sec):
                  	--WORKER_SAM avg: 94.55, (94.55, 94.55)
                  
                  	Kernels' compute time (sec):
                  	Total kernel (smem+sal+bsw) time avg: 91.14, (91.14, 91.14)
                  		SMEM compute avg: 36.54, (36.54, 36.54)
                  		SAL compute avg: 8.36, (8.36, 8.36)
                  				MEM_SA avg: 4.24, (4.24, 4.24)
                  
                  		BSW time, avg: 36.81, (36.81, 36.81)
                  
                  Important parameter settings: 
                  	BATCH_SIZE: 512
                  	MAX_SEQ_LEN_REF: 256
                  	MAX_SEQ_LEN_QER: 128
                  	MAX_SEQ_LEN8: 128
                  	SEEDS_PER_READ: 500
                  	SIMD_WIDTH8 X: 32
                  	SIMD_WIDTH16 X: 16
                  	AVG_SEEDS_PER_READ: 64
                  [bam_sort_core] merging from 1 files and 1 in-memory blocks...
                  

                Standard Output:

                • ref_seq_len = 279307354
                  count = 0, 84204313, 139653677, 195103041, 279307354
                  BWT[13446364] = 4
                  CP_SHIFT = 6, CP_MASK = 63
                  sizeof CP_OCC = 64
                  max_occ_ind = 4364177
                  

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "aafebfab0b2811f0aeed000d3a3012a8"
                  analysis_type {"__current_case__": 4, "algorithmic_options": {"D": "0.5", "P": true, "S": true, "W": "0", "__current_case__": 0, "algorithmic_options_selector": "set", "c": "500", "d": "100", "e": false, "k": "19", "m": "50", "r": "1.5", "w": "100", "y": "20"}, "analysis_type_selector": "full", "io_options": {"C": false, "K": null, "M": false, "T": "20", "V": false, "Y": false, "__current_case__": 0, "a": true, "five": false, "h": "5", "io_options_selector": "set", "q": false}, "scoring_options": {"__current_case__": 1, "scoring_options_selector": "do_not_set"}}
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  fastq_input {"__current_case__": 2, "fastq_input1": {"values": [{"id": 9, "src": "dce"}]}, "fastq_input_selector": "paired_collection", "iset_stats": null}
                  output_sort "coordinate"
                  reference_source {"__current_case__": 1, "ref_file": {"values": [{"id": 18, "src": "hda"}]}, "reference_source_selector": "history"}
                  rg {"__current_case__": 3, "rg_selector": "do_not_set"}
          • Step 7: toolshed.g2.bx.psu.edu/repos/devteam/picard/picard_MergeSamFiles/3.1.1.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • _JAVA_OPTIONS=${_JAVA_OPTIONS:-"-Xmx2048m -Xms256m -Djava.io.tmpdir=${TMPDIR:-${_GALAXY_JOB_TMPDIR}}"} && export _JAVA_OPTIONS &&  picard MergeSamFiles  --INPUT '/tmp/tmpjw6cynca/files/2/0/1/dataset_201e0aaa-48a3-4dcb-9eec-6e19e90917f7.dat'  --OUTPUT '/tmp/tmpjw6cynca/job_working_directory/000/23/outputs/dataset_f7147647-0ee9-4451-9b72-f48958de3a2d.dat' --MERGE_SEQUENCE_DICTIONARIES 'false'  --ASSUME_SORTED 'true'  --USE_THREADING true --SORT_ORDER coordinate --VALIDATION_STRINGENCY 'LENIENT' --QUIET true --VERBOSITY ERROR

                Exit Code:

                • 0

                Standard Error:

                • /usr/local/bin/picard: line 5: warning: setlocale: LC_ALL: cannot change locale (en_US.UTF-8): No such file or directory
                  Picked up _JAVA_OPTIONS: -Xmx2048m -Xms256m -Djava.io.tmpdir=/tmp/tmpjw6cynca/tmp
                  Mar 27, 2025 4:36:45 PM com.intel.gkl.NativeLibraryLoader load
                  INFO: Loading libgkl_compression.so from jar:file:/usr/local/share/picard-3.1.1-0/picard.jar!/com/intel/gkl/native/libgkl_compression.so
                  

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "aafebfab0b2811f0aeed000d3a3012a8"
                  assume_sorted true
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  comments []
                  dbkey "?"
                  merge_sequence_dictionaries false
                  validation_stringency "LENIENT"
      • Step 17: toolshed.g2.bx.psu.edu/repos/bgruening/gfastats/gfastats/1.3.9+galaxy2:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • gfastats '/tmp/tmpjw6cynca/files/4/5/a/dataset_45ad0f19-a975-4dd9-8a9c-399b876a7110.dat' --out-coord g   --tabular > '/tmp/tmpjw6cynca/job_working_directory/000/24/outputs/dataset_3e729f81-0bc8-412f-9c25-eb5a0a70eaef.dat' --threads ${GALAXY_SLOTS:-8}

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              mode_condition {"__current_case__": 1, "discover_paths": false, "locale": false, "selector": "statistics", "statistics_condition": {"__current_case__": 1, "out_coord": "g", "selector": "coordinates"}, "tabular": true}
              target_condition {"__current_case__": 0, "target_option": "false"}
      • Step 18: toolshed.g2.bx.psu.edu/repos/iuc/seqtk/seqtk_telo/1.4+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • seqtk telo -m 'CCCTAA' -p '1' -d '2000' -s '300'  '/tmp/tmpjw6cynca/files/4/5/a/dataset_45ad0f19-a975-4dd9-8a9c-399b876a7110.dat' > '/tmp/tmpjw6cynca/job_working_directory/000/25/outputs/dataset_af1611cf-13ed-421f-aae1-dc003e14c3a8.dat'

            Exit Code:

            • 0

            Standard Error:

            • 11012	139653677
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              P false
              __input_ext "input"
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              d "2000"
              dbkey "?"
              m "CCCTAA"
              p "1"
              s "300"
      • Step 19: toolshed.g2.bx.psu.edu/repos/iuc/minimap2/minimap2/2.28+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • ln -f -s '/tmp/tmpjw6cynca/files/4/5/a/dataset_45ad0f19-a975-4dd9-8a9c-399b876a7110.dat' reference.fa && minimap2 -x map-hifi    --q-occ-frac 0.01       -t ${GALAXY_SLOTS:-4} reference.fa '/tmp/tmpjw6cynca/files/5/5/6/dataset_55626df1-c303-448c-a660-03c17d08c884.dat' -a | samtools view --no-PG -hT reference.fa | samtools sort -@${GALAXY_SLOTS:-2} -T "${TMPDIR:-.}" -O BAM -o '/tmp/tmpjw6cynca/job_working_directory/000/26/outputs/dataset_c224f6b3-3caf-4c02-ae03-47a2efbb6173.dat'

            Exit Code:

            • 0

            Standard Error:

            • [M::mm_idx_gen::4.467*0.78] collected minimizers
              [M::mm_idx_gen::5.764*0.81] sorted minimizers
              [M::main::5.764*0.81] loaded/built the index for 1 target sequence(s)
              [M::mm_mapopt_update::5.894*0.81] mid_occ = 131
              [M::mm_idx_stat] kmer size: 19; skip: 19; is_hpc: 0; #seq: 1
              [M::mm_idx_stat::5.992*0.82] distinct minimizers: 11315845 (97.08% are singletons); average occurrences: 1.241; average spacing: 9.941; total length: 139653677
              [M::worker_pipeline::148.131*0.98] mapped 5426 sequences
              [M::main] Version: 2.28-r1209
              [M::main] CMD: minimap2 -x map-hifi --q-occ-frac 0.01 -t 1 -a reference.fa /tmp/tmpjw6cynca/files/5/5/6/dataset_55626df1-c303-448c-a660-03c17d08c884.dat
              [M::main] Real time: 148.153 sec; CPU: 144.794 sec; Peak RSS: 1.376 GB
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              alignment_options {"A": null, "B": null, "E": null, "E2": null, "O": null, "O2": null, "no_end_flt": true, "s": null, "splicing": {"__current_case__": 0, "splice_mode": "preset"}, "z": null, "z2": null}
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              fastq_input {"__current_case__": 0, "analysis_type_selector": "map-hifi", "fastq_input1": {"values": [{"id": 14, "src": "hda"}]}, "fastq_input_selector": "single"}
              indexing_options {"H": false, "I": null, "k": null, "w": null}
              io_options {"K": null, "L": false, "Q": false, "Y": false, "c": false, "cs": null, "eqx": false, "output_format": "BAM"}
              mapping_options {"F": null, "N": null, "X": false, "f": null, "g": null, "kmer_ocurrence_interval": {"__current_case__": 1, "interval": ""}, "m": null, "mask_len": null, "max_chain_iter": null, "max_chain_skip": null, "min_occ_floor": null, "n": null, "p": null, "q_occ_frac": "0.01", "r": null}
              reference_source {"__current_case__": 1, "ref_file": {"values": [{"id": 18, "src": "hda"}]}, "reference_source_selector": "history"}
      • Step 20: toolshed.g2.bx.psu.edu/repos/iuc/pretext_map/pretext_map/0.1.9+galaxy1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • samtools view -h '/tmp/tmpjw6cynca/files/f/7/1/dataset_f7147647-0ee9-4451-9b72-f48958de3a2d.dat' | PretextMap --sortby length --sortorder descend --mapq 0 -o output.pretext

            Exit Code:

            • 0

            Standard Output:

            • [PretextMap status] :: Mapping to 1 sequences, sorted by length, descending. Filtering by minimum mapping quality 0
              [PretextMap status] :: 10.0 k reads processed, 3  read-pairs mapped
              [PretextMap status] :: 20.0 k reads processed, 206  read-pairs mapped
              [PretextMap status] :: 30.0 k reads processed, 337  read-pairs mapped
              [PretextMap status] :: 40.0 k reads processed, 455  read-pairs mapped
              [PretextMap status] :: 50.0 k reads processed, 537  read-pairs mapped
              [PretextMap status] :: 60.0 k reads processed, 605  read-pairs mapped
              [PretextMap status] :: 70.0 k reads processed, 681  read-pairs mapped
              [PretextMap status] :: 80.0 k reads processed, 747  read-pairs mapped
              [PretextMap status] :: 90.0 k reads processed, 815  read-pairs mapped
              [PretextMap status] :: 100.0 k reads processed, 942  read-pairs mapped
              [PretextMap status] :: 110.0 k reads processed, 1.1 k read-pairs mapped
              [PretextMap status] :: 120.0 k reads processed, 1.3 k read-pairs mapped
              [PretextMap status] :: 130.0 k reads processed, 1.4 k read-pairs mapped
              [PretextMap status] :: 140.0 k reads processed, 1.6 k read-pairs mapped
              [PretextMap status] :: 150.0 k reads processed, 1.6 k read-pairs mapped
              [PretextMap status] :: 160.0 k reads processed, 1.8 k read-pairs mapped
              [PretextMap status] :: 170.0 k reads processed, 1.9 k read-pairs mapped
              [PretextMap status] :: 180.0 k reads processed, 1.9 k read-pairs mapped
              [PretextMap status] :: 190.0 k reads processed, 2.0 k read-pairs mapped
              [PretextMap status] :: 200.0 k reads processed, 2.2 k read-pairs mapped
              [PretextMap status] :: 210.0 k reads processed, 2.3 k read-pairs mapped
              [PretextMap status] :: 220.0 k reads processed, 2.4 k read-pairs mapped
              [PretextMap status] :: 230.0 k reads processed, 2.6 k read-pairs mapped
              [PretextMap status] :: 240.0 k reads processed, 2.7 k read-pairs mapped
              [PretextMap status] :: 250.0 k reads processed, 2.8 k read-pairs mapped
              [PretextMap status] :: 260.0 k reads processed, 2.9 k read-pairs mapped
              [PretextMap status] :: 270.0 k reads processed, 3.0 k read-pairs mapped
              [PretextMap status] :: 280.0 k reads processed, 3.1 k read-pairs mapped
              [PretextMap status] :: 290.0 k reads processed, 3.2 k read-pairs mapped
              [PretextMap status] :: 300.0 k reads processed, 3.3 k read-pairs mapped
              [PretextMap status] :: 310.0 k reads processed, 3.4 k read-pairs mapped
              [PretextMap status] :: 320.0 k reads processed, 3.5 k read-pairs mapped
              [PretextMap status] :: 330.0 k reads processed, 3.7 k read-pairs mapped
              [PretextMap status] :: 340.0 k reads processed, 3.8 k read-pairs mapped
              [PretextMap status] :: 350.0 k reads processed, 3.8 k read-pairs mapped
              [PretextMap status] :: 360.0 k reads processed, 3.9 k read-pairs mapped
              [PretextMap status] :: 370.0 k reads processed, 4.0 k read-pairs mapped
              [PretextMap status] :: 380.0 k reads processed, 4.1 k read-pairs mapped
              [PretextMap status] :: 390.0 k reads processed, 4.2 k read-pairs mapped
              [PretextMap status] :: 400.0 k reads processed, 4.3 k read-pairs mapped
              [PretextMap status] :: 410.0 k reads processed, 4.5 k read-pairs mapped
              [PretextMap status] :: 420.0 k reads processed, 4.5 k read-pairs mapped
              [PretextMap status] :: 430.0 k reads processed, 4.6 k read-pairs mapped
              [PretextMap status] :: 440.0 k reads processed, 4.7 k read-pairs mapped
              [PretextMap status] :: 450.0 k reads processed, 4.8 k read-pairs mapped
              [PretextMap status] :: 460.0 k reads processed, 4.9 k read-pairs mapped
              [PretextMap status] :: 470.0 k reads processed, 5.0 k read-pairs mapped
              [PretextMap status] :: 480.0 k reads processed, 5.0 k read-pairs mapped
              [PretextMap status] :: 490.0 k reads processed, 5.1 k read-pairs mapped
              [PretextMap status] :: 500.0 k reads processed, 5.2 k read-pairs mapped
              [PretextMap status] :: 510.0 k reads processed, 5.3 k read-pairs mapped
              [PretextMap status] :: 520.0 k reads processed, 5.5 k read-pairs mapped
              [PretextMap status] :: 530.0 k reads processed, 5.6 k read-pairs mapped
              [PretextMap status] :: 540.0 k reads processed, 5.6 k read-pairs mapped
              [PretextMap status] :: 550.0 k reads processed, 5.7 k read-pairs mapped
              [PretextMap status] :: 560.0 k reads processed, 5.8 k read-pairs mapped
              [PretextMap status] :: 570.0 k reads processed, 5.9 k read-pairs mapped
              [PretextMap status] :: 580.0 k reads processed, 6.0 k read-pairs mapped
              [PretextMap status] :: 590.0 k reads processed, 6.2 k read-pairs mapped
              [PretextMap status] :: 600.0 k reads processed, 6.3 k read-pairs mapped
              [PretextMap status] :: 610.0 k reads processed, 6.4 k read-pairs mapped
              [PretextMap status] :: 620.0 k reads processed, 6.5 k read-pairs mapped
              [PretextMap status] :: 630.0 k reads processed, 6.6 k read-pairs mapped
              [PretextMap status] :: 640.0 k reads processed, 6.7 k read-pairs mapped
              [PretextMap status] :: 650.0 k reads processed, 6.8 k read-pairs mapped
              [PretextMap status] :: 660.0 k reads processed, 6.9 k read-pairs mapped
              [PretextMap status] :: 670.0 k reads processed, 7.0 k read-pairs mapped
              [PretextMap status] :: 680.0 k reads processed, 7.2 k read-pairs mapped
              [PretextMap status] :: 690.0 k reads processed, 7.3 k read-pairs mapped
              [PretextMap status] :: 700.0 k reads processed, 7.4 k read-pairs mapped
              [PretextMap status] :: 710.0 k reads processed, 7.5 k read-pairs mapped
              [PretextMap status] :: 720.0 k reads processed, 7.6 k read-pairs mapped
              [PretextMap status] :: 730.0 k reads processed, 7.7 k read-pairs mapped
              [PretextMap status] :: 740.0 k reads processed, 7.8 k read-pairs mapped
              [PretextMap status] :: 750.0 k reads processed, 7.9 k read-pairs mapped
              [PretextMap status] :: 760.0 k reads processed, 8.0 k read-pairs mapped
              [PretextMap status] :: 770.0 k reads processed, 8.1 k read-pairs mapped
              [PretextMap status] :: 780.0 k reads processed, 8.2 k read-pairs mapped
              [PretextMap status] :: 790.0 k reads processed, 8.3 k read-pairs mapped
              [PretextMap status] :: 800.0 k reads processed, 8.4 k read-pairs mapped
              [PretextMap status] :: 810.0 k reads processed, 8.5 k read-pairs mapped
              [PretextMap status] :: 820.0 k reads processed, 8.6 k read-pairs mapped
              [PretextMap status] :: 830.0 k reads processed, 8.7 k read-pairs mapped
              [PretextMap status] :: 840.0 k reads processed, 8.8 k read-pairs mapped
              [PretextMap status] :: 850.0 k reads processed, 8.9 k read-pairs mapped
              [PretextMap status] :: 860.0 k reads processed, 9.1 k read-pairs mapped
              [PretextMap status] :: 870.0 k reads processed, 9.2 k read-pairs mapped
              [PretextMap status] :: 880.0 k reads processed, 9.3 k read-pairs mapped
              [PretextMap status] :: 890.0 k reads processed, 9.4 k read-pairs mapped
              [PretextMap status] :: 900.0 k reads processed, 9.6 k read-pairs mapped
              [PretextMap status] :: 910.0 k reads processed, 9.7 k read-pairs mapped
              [PretextMap status] :: 920.0 k reads processed, 9.8 k read-pairs mapped
              [PretextMap status] :: 930.0 k reads processed, 9.9 k read-pairs mapped
              [PretextMap status] :: 940.0 k reads processed, 10.1 k read-pairs mapped
              [PretextMap status] :: 950.0 k reads processed, 10.2 k read-pairs mapped
              [PretextMap status] :: 960.0 k reads processed, 10.3 k read-pairs mapped
              [PretextMap status] :: 970.0 k reads processed, 10.4 k read-pairs mapped
              [PretextMap status] :: 980.0 k reads processed, 10.5 k read-pairs mapped
              [PretextMap status] :: 990.0 k reads processed, 10.6 k read-pairs mapped
              [PretextMap status] :: 1.0 M reads processed, 10.6 k read-pairs mapped
              [PretextMap status] :: 1.0 M reads processed, 10.7 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 1.0 M reads processed, 10.8 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 1.0 M reads processed, 10.9 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 1.0 M reads processed, 11.0 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 1.1 M reads processed, 11.1 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 1.1 M reads processed, 11.3 k read-pairs mapped
                                                                                   
              ..
              [PretextMap status] :: 298/528 (56.44%) texture blocks written to disk
              ..
              [PretextMap status] :: 528/528 (100.00%) texture blocks written to disk
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              filter {"__current_case__": 0, "filter_type": ""}
              map_qual "0"
              sorting {"__current_case__": 1, "sortby": "length", "sortorder": "descend"}
      • Step 3: Haplotype 2:

        • step_state: scheduled
      • Step 21: toolshed.g2.bx.psu.edu/repos/devteam/column_maker/Add_a_column1/2.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/column_maker/aff5135563c6/column_maker/column_maker.py' --column-types str,int,int  --file '/tmp/tmpjw6cynca/job_working_directory/000/28/configs/tmpzrpb6lx_' --fail-on-non-existent-columns --fail-on-non-computable '/tmp/tmpjw6cynca/files/3/e/7/dataset_3e729f81-0bc8-412f-9c25-eb5a0a70eaef.dat' '/tmp/tmpjw6cynca/job_working_directory/000/28/outputs/dataset_f00fd1b1-1133-4406-8967-88aa20e646f6.dat'

            Exit Code:

            • 0

            Standard Output:

            • abs(int(c3)-int(c2))
              Computing 1 new columns with instructions ['abs(int(c3)-int(c2));;']
              Computed new column values for 100.00% of 4 lines written.
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              avoid_scientific_notation false
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              error_handling {"auto_col_types": true, "fail_on_non_existent_columns": true, "non_computable": {"__current_case__": 0, "action": "--fail-on-non-computable"}}
              ops {"__current_case__": 0, "expressions": [{"__index__": 0, "add_column": {"__current_case__": 0, "mode": "", "pos": ""}, "cond": "abs(int(c3)-int(c2))"}], "header_lines_select": "no"}
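        The `Add_a_column` expression in this step, `abs(int(c3)-int(c2))`, derives the interval length from the start/end columns of each BED-style row. A minimal Python sketch of the same computation (the sample rows are illustrative, not taken from the test data):

        ```python
        # Mirror of the column_maker expression abs(int(c3)-int(c2)),
        # applied to (name, start, end) tuples from a BED-like table.
        def interval_length(c2: int, c3: int) -> int:
            """Absolute distance between the start (c2) and end (c3) columns."""
            return abs(int(c3) - int(c2))

        # Hypothetical rows, for illustration only.
        rows = [
            ("scaffold_1", 0, 11012),
            ("scaffold_2", 500, 250),
        ]
        lengths = [interval_length(c2, c3) for _name, c2, c3 in rows]
        ```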
      • Step 22: Cut1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • perl '/tmp/tmpjw6cynca/galaxy-dev/tools/filters/cutWrapper.pl' '/tmp/tmpjw6cynca/files/a/f/1/dataset_af1611cf-13ed-421f-aae1-dc003e14c3a8.dat' 'c1,c2,c3' T '/tmp/tmpjw6cynca/job_working_directory/000/29/outputs/dataset_21c2db6c-ac4d-47c2-bb57-5add1aef8676.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "bed"
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              columnList "c1,c2,c3"
              dbkey "?"
              delimiter "T"
      • Step 23: toolshed.g2.bx.psu.edu/repos/bgruening/deeptools_bam_coverage/deeptools_bam_coverage/3.5.4+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • ln -s '/tmp/tmpjw6cynca/files/c/2/2/dataset_c224f6b3-3caf-4c02-ae03-47a2efbb6173.dat' one.bam && ln -s '/tmp/tmpjw6cynca/files/_metadata_files/0/b/c/metadata_0bcfa329-5226-4934-901c-ccca1b539490.dat' one.bam.bai &&  bamCoverage --numberOfProcessors "${GALAXY_SLOTS:-4}"  --bam one.bam --outFileName '/tmp/tmpjw6cynca/job_working_directory/000/30/outputs/dataset_78dcf67b-5870-4d0f-a2b7-1f069f4ffac2.dat' --outFileFormat 'bigwig'  --binSize 100

            Exit Code:

            • 0

            Standard Error:

            • bamFilesList: ['one.bam']
              binLength: 100
              numberOfSamples: None
              blackListFileName: None
              skipZeroOverZero: False
              bed_and_bin: False
              genomeChunkSize: None
              defaultFragmentLength: read length
              numberOfProcessors: 1
              verbose: False
              region: None
              bedFile: None
              minMappingQuality: None
              ignoreDuplicates: False
              chrsToSkip: []
              stepSize: 100
              center_read: False
              samFlag_include: None
              samFlag_exclude: None
              minFragmentLength: 0
              maxFragmentLength: 0
              zerosToNans: False
              smoothLength: None
              save_data: False
              out_file_for_raw_data: None
              maxPairedFragmentLength: 1000
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              advancedOpt {"__current_case__": 0, "showAdvancedOpt": "no"}
              binSize "100"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              exactScaling false
              outFileFormat "bigwig"
              region ""
              scaling {"__current_case__": 3, "type": "no"}
      • Step 24: toolshed.g2.bx.psu.edu/repos/devteam/column_maker/Add_a_column1/2.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/column_maker/aff5135563c6/column_maker/column_maker.py' --column-types str,int,int  --file '/tmp/tmpjw6cynca/job_working_directory/000/31/configs/tmpgm6v07de' --fail-on-non-existent-columns --fail-on-non-computable '/tmp/tmpjw6cynca/files/2/1/c/dataset_21c2db6c-ac4d-47c2-bb57-5add1aef8676.dat' '/tmp/tmpjw6cynca/job_working_directory/000/31/outputs/dataset_a5724c45-94fc-4349-8fd5-b721397772b7.dat'

            Exit Code:

            • 0

            Standard Output:

            • abs(int(c3)-int(c2))
              Computing 1 new columns with instructions ['abs(int(c3)-int(c2));;']
              Computed new column values for 100.00% of 1 lines written.
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              avoid_scientific_notation false
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              error_handling {"auto_col_types": true, "fail_on_non_existent_columns": true, "non_computable": {"__current_case__": 0, "action": "--fail-on-non-computable"}}
              ops {"__current_case__": 0, "expressions": [{"__index__": 0, "add_column": {"__current_case__": 0, "mode": "", "pos": ""}, "cond": "abs(int(c3)-int(c2))"}], "header_lines_select": "no"}
      • Step 25: Add Coverage Track:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cp '/tmp/tmpjw6cynca/files/a/3/4/dataset_a3452744-f28b-4c9d-8269-4d586b345456.dat' input.pretext && ln -s '/tmp/tmpjw6cynca/files/7/8/d/dataset_78dcf67b-5870-4d0f-a2b7-1f069f4ffac2.dat' input.bigwig && bigWigToBedGraph input.bigwig /dev/stdout | PretextGraph -i input.pretext -n 'coverage' -o output.pretext

            Exit Code:

            • 0

            Standard Output:

            • [PretextGraph status] :: Release mode, number of thread: 4
              
              [PretextGraph status] :: Pretext file: input.pretext
              [PretextGraph status] :: Graph name: 'coverage'
              [PretextGraph status] :: The extension of the graph name is default, set the data_type into 0 (default).
              [PretextGraph status] :: Pretext file: output.pretext
              [PretextGraph status] :: Reading file...
              [PretextGraph status] :: File read
              
                                                                                              
              [PretextGraph status] :: 16.4 k bedgraph lines read
              [PretextGraph status] :: Transfer f32 to s32...
              [PretextGraph status] :: Saving graph...
              [PretextGraph status] :: Done
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              name "coverage"
      • Step 26: Test if telomere track is empty:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "bedgraph"
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              param_type "text"
              remove_newlines true
      • Step 27: False if telomere track is empty:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              input_param_type {"__current_case__": 0, "input_param": "scaffold_10.H1\t0\t11012\t11012", "mappings": [{"__index__": 0, "from": "", "to": "false"}], "type": "text"}
              output_param_type "boolean"
              unmapped {"__current_case__": 2, "default_value": "true", "on_unmapped": "default"}
      • Step 28: Add telomere track:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cp '/tmp/tmpjw6cynca/files/b/9/2/dataset_b92d64c6-4413-4ff4-b384-2cbc6ccfa3f5.dat' input.pretext && cat '/tmp/tmpjw6cynca/files/a/5/7/dataset_a5724c45-94fc-4349-8fd5-b721397772b7.dat' | PretextGraph -i input.pretext -n 'telomeres_gap_format' -o output.pretext

            Exit Code:

            • 0

            Standard Output:

            • [PretextGraph status] :: Release mode, number of thread: 4
              
              [PretextGraph status] :: Pretext file: input.pretext
              [PretextGraph status] :: Graph name: 'telomeres_gap_format'
              [PretextGraph status] :: The extension of the graph name is gap, set the data_type to 2 (gap).
              [PretextGraph status] :: Pretext file: output.pretext
              [PretextGraph status] :: Reading file...
              [PretextGraph status] :: File read
              
              [PretextGraph status] :: Transfer f32 to s32...
              [PretextGraph status] :: Saving graph...
              [PretextGraph status] :: Done
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              name "telomeres_gap_format"
      • Step 29: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 37, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 34, "src": "hda"}]}}]}}
      • Step 30: Add gaps track:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cp '/tmp/tmpjw6cynca/files/5/a/9/dataset_5a9fa82f-ee8a-4417-9649-074eb90ae3ac.dat' input.pretext && cat '/tmp/tmpjw6cynca/files/f/0/0/dataset_f00fd1b1-1133-4406-8967-88aa20e646f6.dat' | PretextGraph -i input.pretext -n 'gaps' -o output.pretext

            Exit Code:

            • 0

            Standard Output:

            • [PretextGraph status] :: Release mode, number of thread: 4
              
              [PretextGraph status] :: Pretext file: input.pretext
              [PretextGraph status] :: Graph name: 'gaps'
              [PretextGraph status] :: The extension of the graph name is gap, set the data_type to 2 (gap).
              [PretextGraph status] :: Pretext file: output.pretext
              [PretextGraph status] :: Reading file...
              [PretextGraph status] :: File read
              
              [PretextGraph status] :: Transfer f32 to s32...
              [PretextGraph status] :: Saving graph...
              [PretextGraph status] :: Done
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              name "gaps"
      • Step 4: Do you want to add suffixes to the scaffold names?:

        • step_state: scheduled
      • Step 31: toolshed.g2.bx.psu.edu/repos/iuc/pretext_snapshot/pretext_snapshot/0.0.4+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • PretextSnapshot -m /tmp/tmpjw6cynca/files/a/3/0/dataset_a30298b0-a876-49c9-a1a3-27625a6707d9.dat -f png -r 2000 -c 5 --sequences '=full' --minTexels 64 --gridSize 1 --gridColour black '' -o output --prefix pretext_snapshot

            Exit Code:

            • 0

            Standard Output:

            • [PretextSnapshot status] :: Input map: '/tmp/tmpjw6cynca/files/a/3/0/dataset_a30298b0-a876-49c9-a1a3-27625a6707d9.dat'
              [PretextSnapshot status] :: Output format: png
              [PretextSnapshot status] :: Output resolution: 2000 pixels
              [PretextSnapshot status] :: Colour map: Three Wave Blue-Green-Yellow
              [PretextSnapshot status] :: Minimum texels per image: 64
              [PretextSnapshot status] :: Grid size: 1 pixel
              [PretextSnapshot status] :: Grid colour: 000000ff
              [PretextSnapshot status] :: Output path: 'output/pretext_snapshot'
              [PretextSnapshot status] :: Indexing file...
              [PretextSnapshot status] :: File indexed
              [PretextSnapshot status] :: 'output/pretext_snapshotFullMap.png' written to disk
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              colormap "5"
              dbkey "?"
              formats {"__current_case__": 0, "outformat": "png"}
              grid {"__current_case__": 0, "gridcolor": "black", "gridsize": "1", "showGrid": "yes"}
              mintexels "64"
              resolution "2000"
              sequencenames false
              sequences "=full"
      • Step 32: __EXTRACT_DATASET__:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __workflow_invocation_uuid__ "aafebfa80b2811f0aeed000d3a3012a8"
              input {"values": [{"id": 7, "src": "hdca"}]}
              which {"__current_case__": 0, "which_dataset": "first"}
      • Step 5: First Haplotype suffix:

        • step_state: scheduled
      • Step 6: Second Haplotype suffix:

        • step_state: scheduled
      • Step 7: Hi-C reads:

        • step_state: scheduled
      • Step 8: Do you want to trim the Hi-C data?:

        • step_state: scheduled
      • Step 9: Telomere repeat to suit species:

        • step_state: scheduled
      • Step 10: PacBio reads:

        • step_state: scheduled
    • Other invocation details
      • history_id

        • a807273f93b894f8
      • history_state

        • ok
      • invocation_id

        • a807273f93b894f8
      • invocation_state

        • scheduled
      • workflow_id

        • 6939a158f632cb4b
  • ✅ hi-c-map-for-assembly-manual-curation.ga_1

    Workflow invocation details

    • Invocation Messages

    • Steps
      • Step 1: Haplotype 1:

        • step_state: scheduled
      • Step 2: Will you use a second haplotype?:

        • step_state: scheduled
      • Step 11: Hap2 not provided:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              input_param_type {"__current_case__": 3, "input_param": false, "mappings": [{"__index__": 0, "from": false, "to": "true"}, {"__index__": 1, "from": true, "to": "false"}], "type": "boolean"}
              output_param_type "boolean"
              unmapped {"__current_case__": 2, "default_value": "false", "on_unmapped": "default"}
      • Step 12: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Hap1:

            • step_state: scheduled
          • Step 2: Hap2:

            • step_state: scheduled
          • Step 11: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2d2d7dd0b2a11f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 51, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 43, "src": "hda"}]}}]}}
          • Step 12: toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/tp_cat/9.3+galaxy1:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2d2d7dd0b2a11f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  queries [{"__index__": 0, "inputs2": {"values": [{"id": 53, "src": "hda"}]}}]
          • Step 3: Do you want to add suffixes to the scaffold names?:

            • step_state: scheduled
          • Step 4: Hap1 suffix:

            • step_state: scheduled
          • Step 5: Hap2 suffix:

            • step_state: scheduled
          • Step 6: Expression for hap1 suffixing:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2d2d7dd0b2a11f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "&.", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "H1", "select_param_type": "text"}}]
                  dbkey "?"
          • Step 7: Expression for hap2 suffixing:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2d2d7dd0b2a11f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "&.", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "H2", "select_param_type": "text"}}]
                  dbkey "?"
          • Step 8: add hap1 suffix:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2d2d7dd0b2a11f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "find_pattern": ">.+$", "replace_pattern": null}]
          • Step 9: add hap2 suffix:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2d2d7dd0b2a11f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "find_pattern": ">.+$", "replace_pattern": null}]
          • Step 10: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2d2d7dd0b2a11f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 50, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 42, "src": "hda"}]}}]}}
      • Step 13: concatenate HiFi:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cat '/tmp/tmpjw6cynca/files/d/e/b/dataset_debe1cf0-9700-4541-88e7-cb4a46ecf254.dat' >> '/tmp/tmpjw6cynca/job_working_directory/000/53/outputs/dataset_56452c9a-629e-4d6e-9d61-6e2d04e5eebf.dat' && exit 0

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              queries []
      • Step 14: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Hap1:

            • step_state: scheduled
          • Step 2: Do you want to add suffixes to the scaffold names?:

            • step_state: scheduled
          • Step 3: Hap1 suffix:

            • step_state: scheduled
          • Step 4: Expression for hap1 suffixing:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2d2d7de0b2a11f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "&.", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "H1", "select_param_type": "text"}}]
                  dbkey "?"
          • Step 5: add hap1 suffix:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • sed -r --sandbox -e 's/>.+$/&.H1/g' '/tmp/tmpjw6cynca/files/e/4/2/dataset_e42d62a1-2ebd-4547-ba8d-52e18bb7e813.dat' > '/tmp/tmpjw6cynca/job_working_directory/000/55/outputs/dataset_2a924c18-ec8a-47b0-a8de-8c8fa1d2c560.dat'

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2d2d7de0b2a11f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "find_pattern": ">.+$", "replace_pattern": "&.H1"}]
          • Step 6: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2d2d7de0b2a11f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 57, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 42, "src": "hda"}]}}]}}
      • Step 15: Pick Assembly:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 54, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 58, "src": "hda"}]}}]}}
      • Step 16: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Reference:

            • step_state: scheduled
          • Step 2: Hi-C reads:

            • step_state: scheduled
          • Step 3: Do you want to trim the Hi-C data?:

            • step_state: scheduled
          • Step 4: Trim Hi-C reads 2:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • ln -f -s '/tmp/tmpjw6cynca/files/1/f/6/dataset_1f641452-af7e-4f5c-890f-680ef4e97b47.dat' 'Hi-C reads_1.fq.gz' && ln -f -s '/tmp/tmpjw6cynca/files/3/b/6/dataset_3b6ecedd-64d9-4f7c-a352-aa905ac939f6.dat' 'Hi-C reads_2.fq.gz' &&  cutadapt  -j=${GALAXY_SLOTS:-4}     --error-rate=0.1 --times=1 --overlap=3    --action=trim   --cut=5 -U 5       --minimum-length=1      -o 'out1.fq.gz' -p 'out2.fq.gz'  'Hi-C reads_1.fq.gz' 'Hi-C reads_2.fq.gz'  > report.txt

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2d2d7df0b2a11f0aeed000d3a3012a8"
                  adapter_options {"action": "trim", "error_rate": "0.1", "match_read_wildcards": false, "no_indels": false, "no_match_adapter_wildcards": true, "overlap": "3", "revcomp": false, "times": "1"}
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  filter_options {"discard_casava": false, "discard_trimmed": false, "discard_untrimmed": false, "max_average_error_rate": null, "max_expected_errors": null, "max_n": null, "maximum_length": null, "maximum_length2": null, "minimum_length": "1", "minimum_length2": null, "pair_filter": "any"}
                  library {"__current_case__": 2, "input_1": {"values": [{"id": 14, "src": "dce"}]}, "pair_adapters": false, "r1": {"adapters": [], "anywhere_adapters": [], "front_adapters": []}, "r2": {"adapters2": [], "anywhere_adapters2": [], "front_adapters2": []}, "type": "paired_collection"}
                  other_trimming_options {"cut": "5", "cut2": "5", "nextseq_trim": "0", "poly_a": false, "quality_cutoff": "0", "quality_cutoff2": "", "shorten_options": {"__current_case__": 1, "shorten_values": "False"}, "shorten_options_r2": {"__current_case__": 1, "shorten_values_r2": "False"}, "trim_n": false}
                  output_selector ["report"]
                  read_mod_options {"length_tag": null, "rename": null, "strip_suffix": null, "zero_cap": false}
          • Step 5: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2d2d7df0b2a11f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 20, "src": "dce"}]}}, {"__index__": 1, "value": {"values": [{"id": 15, "src": "dce"}]}}]}}
              • Job 2:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2d2d7df0b2a11f0aeed000d3a3012a8"
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 21, "src": "dce"}]}}, {"__index__": 1, "value": {"values": [{"id": 16, "src": "dce"}]}}]}}
          • Step 6: toolshed.g2.bx.psu.edu/repos/iuc/bwa_mem2/bwa_mem2/2.2.1+galaxy1:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • set -o | grep -q pipefail && set -o pipefail;  ln -s '/tmp/tmpjw6cynca/files/2/a/9/dataset_2a924c18-ec8a-47b0-a8de-8c8fa1d2c560.dat' 'localref.fa' && bwa-mem2 index 'localref.fa' &&    bwa-mem2 mem -t "${GALAXY_SLOTS:-1}" -v 1   -k '19' -w '100' -d '100' -r '1.5' -y '20' -c '500' -D '0.5' -W '0' -m '50' -S -P    -T '20' -h '5' -a                      'localref.fa' '/tmp/tmpjw6cynca/files/6/2/4/dataset_6249012f-cde2-4f53-b1c2-b1ea20099ae1.dat' '/tmp/tmpjw6cynca/files/7/5/a/dataset_75a8b988-ca3d-4f9d-bfed-0602f6bcb871.dat'  | samtools sort -@${GALAXY_SLOTS:-2} -T "${TMPDIR:-.}" -O bam -o '/tmp/tmpjw6cynca/job_working_directory/000/61/outputs/dataset_0935e66b-9f1f-45a7-bf8d-64ff6cba22f7.dat'

                Exit Code:

                • 0

                Standard Error:

                • Looking to launch executable "/usr/local/bin/bwa-mem2.avx2", simd = .avx2
                  Launching executable "/usr/local/bin/bwa-mem2.avx2"
                  [bwa_index] Pack FASTA... 0.69 sec
                  * Entering FMI_search
                  init ticks = 12243835893
                  ref seq len = 275747614
                  binary seq ticks = 5904747156
                  build suffix-array ticks = 102461454974
                  pos: 34468452, ref_seq_len__: 34468451
                  build fm-index ticks = 22400054162
                  Total time taken: 59.5645
                  Looking to launch executable "/usr/local/bin/bwa-mem2.avx2", simd = .avx2
                  Launching executable "/usr/local/bin/bwa-mem2.avx2"
                  -----------------------------
                  Executing in AVX2 mode!!
                  -----------------------------
                  * SA compression enabled with xfactor: 8
                  * Ref file: localref.fa
                  * Entering FMI_search
                  * Index file found. Loading index from localref.fa.bwt.2bit.64
                  * Reference seq len for bi-index = 275747615
                  * sentinel-index: 224703773
                  * Count:
                  0,	1
                  1,	80527259
                  2,	137873808
                  3,	195220357
                  4,	275747615
                  
                  * Reading other elements of the index from files localref.fa
                  * Index prefix: localref.fa
                  * Read 0 ALT contigs
                  * Done reading Index!!
                  * Reading reference genome..
                  * Binary seq file = localref.fa.0123
                  * Reference genome size: 275747614 bp
                  * Done reading reference genome !!
                  
                  ------------------------------------------
                  1. Memory pre-allocation for Chaining: 139.3584 MB
                  2. Memory pre-allocation for BSW: 239.6170 MB
                  3. Memory pre-allocation for BWT: 77.3142 MB
                  ------------------------------------------
                  * Threads used (compute): 1
                  * No. of pipeline threads: 2
                  
                  [0000] read_chunk: 10000000, work_chunk_size: 10000124, nseq: 68494
                  	[0000][ M::kt_pipeline] read 68494 sequences (10000124 bp)...
                  [0000] Reallocating initial memory allocations!!
                  [0000] Calling mem_process_seqs.., task: 0
                  [0000] 1. Calling kt_for - worker_bwt
                  [0000] read_chunk: 10000000, work_chunk_size: 10000124, nseq: 68494
                  	[0000][ M::kt_pipeline] read 68494 sequences (10000124 bp)...
                  [0000] 2. Calling kt_for - worker_aln
                  [0000] Inferring insert size distribution of PE reads from data, l_pac: 137873807, n: 68494
                  [0000][PE] analyzing insert size distribution for orientation FF...
                  [0000][PE] (25, 50, 75) percentile: (2021, 3042, 5240)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 11678)
                  [0000][PE] mean and std.dev: (3782.74, 2698.62)
                  [0000][PE] low and high boundaries for proper pairs: (1, 14897)
                  [0000][PE] analyzing insert size distribution for orientation FR...
                  [0000][PE] (25, 50, 75) percentile: (151, 230, 368)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 802)
                  [0000][PE] mean and std.dev: (244.21, 150.93)
                  [0000][PE] low and high boundaries for proper pairs: (1, 1019)
                  [0000][PE] analyzing insert size distribution for orientation RF...
                  [0000][PE] (25, 50, 75) percentile: (2708, 3331, 6114)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 12926)
                  [0000][PE] mean and std.dev: (4077.29, 2787.30)
                  [0000][PE] low and high boundaries for proper pairs: (1, 16332)
                  [0000][PE] analyzing insert size distribution for orientation RR...
                  [0000][PE] (25, 50, 75) percentile: (1608, 2936, 5772)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 14100)
                  [0000][PE] mean and std.dev: (3700.00, 2741.58)
                  [0000][PE] low and high boundaries for proper pairs: (1, 18264)
                  [0000] 3. Calling kt_for - worker_sam
                  	[0000][ M::mem_process_seqs] Processed 68494 reads in 30.876 CPU sec, 31.117 real sec
                  [0000] Calling mem_process_seqs.., task: 1
                  [0000] 1. Calling kt_for - worker_bwt
                  [0000] read_chunk: 10000000, work_chunk_size: 10000124, nseq: 68494
                  	[0000][ M::kt_pipeline] read 68494 sequences (10000124 bp)...
                  [0000] 2. Calling kt_for - worker_aln
                  [0000] Inferring insert size distribution of PE reads from data, l_pac: 137873807, n: 68494
                  [0000][PE] analyzing insert size distribution for orientation FF...
                  [0000][PE] (25, 50, 75) percentile: (1977, 3478, 6066)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 14244)
                  [0000][PE] mean and std.dev: (3961.96, 2537.86)
                  [0000][PE] low and high boundaries for proper pairs: (1, 18333)
                  [0000][PE] analyzing insert size distribution for orientation FR...
                  [0000][PE] (25, 50, 75) percentile: (146, 241, 386)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 866)
                  [0000][PE] mean and std.dev: (246.95, 151.23)
                  [0000][PE] low and high boundaries for proper pairs: (1, 1106)
                  [0000][PE] analyzing insert size distribution for orientation RF...
                  [0000][PE] (25, 50, 75) percentile: (1463, 3158, 6983)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 18023)
                  [0000][PE] mean and std.dev: (3938.16, 3007.49)
                  [0000][PE] low and high boundaries for proper pairs: (1, 23543)
                  [0000][PE] analyzing insert size distribution for orientation RR...
                  [0000][PE] (25, 50, 75) percentile: (1764, 3269, 5104)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 11784)
                  [0000][PE] mean and std.dev: (3596.50, 2353.59)
                  [0000][PE] low and high boundaries for proper pairs: (1, 15124)
                  [0000] 3. Calling kt_for - worker_sam
                  	[0000][ M::mem_process_seqs] Processed 68494 reads in 30.919 CPU sec, 31.775 real sec
                  [0000] Calling mem_process_seqs.., task: 2
                  [0000] 1. Calling kt_for - worker_bwt
                  [0000] read_chunk: 10000000, work_chunk_size: 10000124, nseq: 68494
                  	[0000][ M::kt_pipeline] read 68494 sequences (10000124 bp)...
                  [0000] 2. Calling kt_for - worker_aln
                  [0000] Inferring insert size distribution of PE reads from data, l_pac: 137873807, n: 68494
                  [0000][PE] analyzing insert size distribution for orientation FF...
                  [0000][PE] (25, 50, 75) percentile: (1600, 3333, 5222)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 12466)
                  [0000][PE] mean and std.dev: (3541.11, 2405.55)
                  [0000][PE] low and high boundaries for proper pairs: (1, 16088)
                  [0000][PE] analyzing insert size distribution for orientation FR...
                  [0000][PE] (25, 50, 75) percentile: (164, 263, 457)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 1043)
                  [0000][PE] mean and std.dev: (279.92, 172.80)
                  [0000][PE] low and high boundaries for proper pairs: (1, 1336)
                  [0000][PE] analyzing insert size distribution for orientation RF...
                  [0000][PE] (25, 50, 75) percentile: (1761, 2418, 4426)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 9756)
                  [0000][PE] mean and std.dev: (3069.66, 2241.85)
                  [0000][PE] low and high boundaries for proper pairs: (1, 12421)
                  [0000][PE] analyzing insert size distribution for orientation RR...
                  [0000][PE] (25, 50, 75) percentile: (2055, 4260, 7105)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 17205)
                  [0000][PE] mean and std.dev: (4350.33, 2963.76)
                  [0000][PE] low and high boundaries for proper pairs: (1, 22255)
                  [0000] 3. Calling kt_for - worker_sam
                  	[0000][ M::mem_process_seqs] Processed 68494 reads in 29.930 CPU sec, 30.808 real sec
                  [0000] Calling mem_process_seqs.., task: 3
                  [0000] 1. Calling kt_for - worker_bwt
                  [0000] 2. Calling kt_for - worker_aln
                  [0000] read_chunk: 10000000, work_chunk_size: 10000124, nseq: 68494
                  	[0000][ M::kt_pipeline] read 68494 sequences (10000124 bp)...
                  [0000] Inferring insert size distribution of PE reads from data, l_pac: 137873807, n: 68494
                  [0000][PE] analyzing insert size distribution for orientation FF...
                  [0000][PE] (25, 50, 75) percentile: (1364, 3003, 5283)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 13121)
                  [0000][PE] mean and std.dev: (3348.47, 2513.77)
                  [0000][PE] low and high boundaries for proper pairs: (1, 17040)
                  [0000][PE] analyzing insert size distribution for orientation FR...
                  [0000][PE] (25, 50, 75) percentile: (164, 272, 463)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 1061)
                  [0000][PE] mean and std.dev: (271.60, 174.49)
                  [0000][PE] low and high boundaries for proper pairs: (1, 1360)
                  [0000][PE] analyzing insert size distribution for orientation RF...
                  [0000][PE] (25, 50, 75) percentile: (2196, 3194, 6130)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 13998)
                  [0000][PE] mean and std.dev: (4047.59, 2612.96)
                  [0000][PE] low and high boundaries for proper pairs: (1, 17932)
                  [0000][PE] analyzing insert size distribution for orientation RR...
                  [0000][PE] (25, 50, 75) percentile: (1777, 4213, 5514)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 12988)
                  [0000][PE] mean and std.dev: (4154.35, 2778.07)
                  [0000][PE] low and high boundaries for proper pairs: (1, 16725)
                  [0000] 3. Calling kt_for - worker_sam
                  	[0000][ M::mem_process_seqs] Processed 68494 reads in 29.311 CPU sec, 36.966 real sec
                  [0000] Calling mem_process_seqs.., task: 4
                  [0000] 1. Calling kt_for - worker_bwt
                  [0000] read_chunk: 10000000, work_chunk_size: 3899368, nseq: 26708
                  	[0000][ M::kt_pipeline] read 26708 sequences (3899368 bp)...
                  [0000] 2. Calling kt_for - worker_aln
                  [0000] Inferring insert size distribution of PE reads from data, l_pac: 137873807, n: 68494
                  [0000][PE] analyzing insert size distribution for orientation FF...
                  [0000][PE] (25, 50, 75) percentile: (1040, 2626, 5380)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 14060)
                  [0000][PE] mean and std.dev: (3261.88, 2607.72)
                  [0000][PE] low and high boundaries for proper pairs: (1, 18400)
                  [0000][PE] analyzing insert size distribution for orientation FR...
                  [0000][PE] (25, 50, 75) percentile: (172, 281, 464)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 1048)
                  [0000][PE] mean and std.dev: (296.82, 192.41)
                  [0000][PE] low and high boundaries for proper pairs: (1, 1340)
                  [0000][PE] analyzing insert size distribution for orientation RF...
                  [0000][PE] (25, 50, 75) percentile: (700, 2548, 5583)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 15349)
                  [0000][PE] mean and std.dev: (3176.85, 2680.05)
                  [0000][PE] low and high boundaries for proper pairs: (1, 20232)
                  [0000][PE] analyzing insert size distribution for orientation RR...
                  [0000][PE] (25, 50, 75) percentile: (1957, 4138, 6358)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 15160)
                  [0000][PE] mean and std.dev: (4190.45, 2530.34)
                  [0000][PE] low and high boundaries for proper pairs: (1, 19561)
                  [0000] 3. Calling kt_for - worker_sam
                  	[0000][ M::mem_process_seqs] Processed 68494 reads in 28.148 CPU sec, 28.882 real sec
                  [0000] Calling mem_process_seqs.., task: 5
                  [0000] 1. Calling kt_for - worker_bwt
                  [0000] read_chunk: 10000000, work_chunk_size: 0, nseq: 0
                  [0000] 2. Calling kt_for - worker_aln
                  [0000] Inferring insert size distribution of PE reads from data, l_pac: 137873807, n: 26708
                  [0000][PE] analyzing insert size distribution for orientation FF...
                  [0000][PE] (25, 50, 75) percentile: (1531, 3089, 5527)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 13519)
                  [0000][PE] mean and std.dev: (3502.83, 2580.78)
                  [0000][PE] low and high boundaries for proper pairs: (1, 17515)
                  [0000][PE] analyzing insert size distribution for orientation FR...
                  [0000][PE] (25, 50, 75) percentile: (146, 223, 370)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 818)
                  [0000][PE] mean and std.dev: (233.56, 144.74)
                  [0000][PE] low and high boundaries for proper pairs: (1, 1042)
                  [0000][PE] analyzing insert size distribution for orientation RF...
                  [0000][PE] (25, 50, 75) percentile: (1436, 3889, 6170)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 15638)
                  [0000][PE] mean and std.dev: (3972.06, 2638.61)
                  [0000][PE] low and high boundaries for proper pairs: (1, 20372)
                  [0000][PE] analyzing insert size distribution for orientation RR...
                  [0000][PE] (25, 50, 75) percentile: (1130, 3715, 6897)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 18431)
                  [0000][PE] mean and std.dev: (4249.90, 3057.36)
                  [0000][PE] low and high boundaries for proper pairs: (1, 24198)
                  [0000] 3. Calling kt_for - worker_sam
                  	[0000][ M::mem_process_seqs] Processed 26708 reads in 10.897 CPU sec, 11.550 real sec
                  [0000] read_chunk: 10000000, work_chunk_size: 0, nseq: 0
                  [0000] Computation ends..
                  No. of OMP threads: 1
                  Processor is running @2445.568832 MHz
                  Runtime profile:
                  
                  	Time taken for main_mem function: 171.97 sec
                  
                  	IO times (sec) :
                  	Reading IO time (reads) avg: 1.57, (1.57, 1.57)
                  	Writing IO time (SAM) avg: 23.64, (23.64, 23.64)
                  	Reading IO time (Reference Genome) avg: 0.07, (0.07, 0.07)
                  	Index read time avg: 0.29, (0.29, 0.29)
                  
                  	Overall time (sec) (Excluding Index reading time):
                  	PROCESS() (Total compute time + (read + SAM) IO time) : 171.60
                  	MEM_PROCESS_SEQ() (Total compute time (Kernel + SAM)), avg: 171.08, (171.08, 171.08)
                  
                  	 SAM Processing time (sec):
                  	--WORKER_SAM avg: 83.51, (83.51, 83.51)
                  
                  	Kernels' compute time (sec):
                  	Total kernel (smem+sal+bsw) time avg: 87.55, (87.55, 87.55)
                  		SMEM compute avg: 36.79, (36.79, 36.79)
                  		SAL compute avg: 8.05, (8.05, 8.05)
                  				MEM_SA avg: 3.84, (3.84, 3.84)
                  
                  		BSW time, avg: 34.00, (34.00, 34.00)
                  
                  Important parameter settings: 
                  	BATCH_SIZE: 512
                  	MAX_SEQ_LEN_REF: 256
                  	MAX_SEQ_LEN_QER: 128
                  	MAX_SEQ_LEN8: 128
                  	SEEDS_PER_READ: 500
                  	SIMD_WIDTH8 X: 32
                  	SIMD_WIDTH16 X: 16
                  	AVG_SEEDS_PER_READ: 64
                  [bam_sort_core] merging from 1 files and 1 in-memory blocks...
                  

                Standard Output:

                • ref_seq_len = 275747614
                  count = 0, 80527258, 137873807, 195220356, 275747614
                  BWT[224703773] = 4
                  CP_SHIFT = 6, CP_MASK = 63
                  sizeof CP_OCC = 64
                  max_occ_ind = 4308556
                  

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2d2d7df0b2a11f0aeed000d3a3012a8"
                  analysis_type {"__current_case__": 4, "algorithmic_options": {"D": "0.5", "P": true, "S": true, "W": "0", "__current_case__": 0, "algorithmic_options_selector": "set", "c": "500", "d": "100", "e": false, "k": "19", "m": "50", "r": "1.5", "w": "100", "y": "20"}, "analysis_type_selector": "full", "io_options": {"C": false, "K": null, "M": false, "T": "20", "V": false, "Y": false, "__current_case__": 0, "a": true, "five": false, "h": "5", "io_options_selector": "set", "q": false}, "scoring_options": {"__current_case__": 1, "scoring_options_selector": "do_not_set"}}
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  fastq_input {"__current_case__": 2, "fastq_input1": {"values": [{"id": 22, "src": "dce"}]}, "fastq_input_selector": "paired_collection", "iset_stats": null}
                  output_sort "coordinate"
                  reference_source {"__current_case__": 1, "ref_file": {"values": [{"id": 59, "src": "hda"}]}, "reference_source_selector": "history"}
                  rg {"__current_case__": 3, "rg_selector": "do_not_set"}
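                  As a side note, the "low and high boundaries" lines that bwa-mem2 prints while inferring the insert-size distribution follow bwa's interquartile-range rule: the window used for the mean/std.dev is the 25th/75th percentile stretched by 2×IQR, and the proper-pair window by 3×IQR, both clamped below at 1. The multipliers here are bwa's defaults as I understand them (not stated in this log), but they reproduce the logged values exactly; a minimal sketch:

                  ```python
                  def insert_size_bounds(p25, p75, outlier_mult=2, mapping_mult=3):
                      """Reproduce the boundary lines bwa-mem(2) logs when inferring
                      the insert-size distribution: percentiles stretched by a
                      multiple of the IQR, clamped to a minimum of 1. The two
                      multipliers are assumed to match bwa's defaults."""
                      iqr = p75 - p25
                      stat_low = max(1, p25 - outlier_mult * iqr)    # "for computing mean and std.dev"
                      stat_high = p75 + outlier_mult * iqr
                      pair_low = max(1, p25 - mapping_mult * iqr)    # "for proper pairs"
                      pair_high = p75 + mapping_mult * iqr
                      return (stat_low, stat_high), (pair_low, pair_high)

                  # FF orientation from one block in the log above:
                  # percentiles (1977, 3478, 6066) -> bounds (1, 14244) and (1, 18333)
                  stats, pairs = insert_size_bounds(1977, 6066)
                  ```

                  With the logged FF percentiles this yields (1, 14244) and (1, 18333), matching the stderr above.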
          • Step 7: toolshed.g2.bx.psu.edu/repos/devteam/picard/picard_MergeSamFiles/3.1.1.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • _JAVA_OPTIONS=${_JAVA_OPTIONS:-"-Xmx2048m -Xms256m -Djava.io.tmpdir=${TMPDIR:-${_GALAXY_JOB_TMPDIR}}"} && export _JAVA_OPTIONS &&  picard MergeSamFiles  --INPUT '/tmp/tmpjw6cynca/files/0/9/3/dataset_0935e66b-9f1f-45a7-bf8d-64ff6cba22f7.dat'  --OUTPUT '/tmp/tmpjw6cynca/job_working_directory/000/62/outputs/dataset_703d773a-37db-4420-9d07-fd62b2cff94c.dat' --MERGE_SEQUENCE_DICTIONARIES 'false'  --ASSUME_SORTED 'true'  --USE_THREADING true --SORT_ORDER coordinate --VALIDATION_STRINGENCY 'LENIENT' --QUIET true --VERBOSITY ERROR

                Exit Code:

                • 0

                Standard Error:

                • /usr/local/bin/picard: line 5: warning: setlocale: LC_ALL: cannot change locale (en_US.UTF-8): No such file or directory
                  Picked up _JAVA_OPTIONS: -Xmx2048m -Xms256m -Djava.io.tmpdir=/tmp/tmpjw6cynca/tmp
                  Mar 27, 2025 4:51:05 PM com.intel.gkl.NativeLibraryLoader load
                  INFO: Loading libgkl_compression.so from jar:file:/usr/local/share/picard-3.1.1-0/picard.jar!/com/intel/gkl/native/libgkl_compression.so
                  

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2d2d7df0b2a11f0aeed000d3a3012a8"
                  assume_sorted true
                  chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  comments []
                  dbkey "?"
                  merge_sequence_dictionaries false
                  validation_stringency "LENIENT"
      • Step 17: toolshed.g2.bx.psu.edu/repos/bgruening/gfastats/gfastats/1.3.9+galaxy2:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • gfastats '/tmp/tmpjw6cynca/files/2/a/9/dataset_2a924c18-ec8a-47b0-a8de-8c8fa1d2c560.dat' --out-coord g   --tabular > '/tmp/tmpjw6cynca/job_working_directory/000/63/outputs/dataset_9a7f18e5-fbbe-4167-9f10-7dfb02c4c31a.dat' --threads ${GALAXY_SLOTS:-8}

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              mode_condition {"__current_case__": 1, "discover_paths": false, "locale": false, "selector": "statistics", "statistics_condition": {"__current_case__": 1, "out_coord": "g", "selector": "coordinates"}, "tabular": true}
              target_condition {"__current_case__": 0, "target_option": "false"}
      • Step 18: toolshed.g2.bx.psu.edu/repos/iuc/seqtk/seqtk_telo/1.4+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • seqtk telo -m 'CCCTAA' -p '1' -d '2000' -s '300'  '/tmp/tmpjw6cynca/files/2/a/9/dataset_2a924c18-ec8a-47b0-a8de-8c8fa1d2c560.dat' > '/tmp/tmpjw6cynca/job_working_directory/000/64/outputs/dataset_3caa6820-5fb7-4260-acb4-ab5801408326.dat'

            Exit Code:

            • 0

            Standard Error:

            • 0	137873807
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              P false
              __input_ext "input"
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              d "2000"
              dbkey "?"
              m "CCCTAA"
              p "1"
              s "300"
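              For readers unfamiliar with the parameters above: `seqtk telo` scans each sequence end for runs of the telomeric motif (`-m CCCTAA`), penalizing non-motif bases (`-p`), stopping once the running score drops `-d` below its peak, and reporting the region only if the peak score reaches `-s`. The following is a simplified illustration of that scoring idea, not seqtk's exact implementation:

              ```python
              def telo_prefix(seq, motif="CCCTAA", penalty=1, max_drop=2000, min_score=300):
                  # Simplified sketch of the scan `seqtk telo` performs from a
                  # sequence end: +len(motif) per motif hit, -penalty per other
                  # base; stop once the score falls max_drop below its peak, and
                  # report the prefix up to the peak if the peak clears min_score.
                  score = best = best_end = 0
                  i = 0
                  while i < len(seq):
                      if seq.startswith(motif, i):
                          score += len(motif)
                          i += len(motif)
                      else:
                          score -= penalty
                          i += 1
                      if score > best:
                          best, best_end = score, i
                      if best - score > max_drop:
                          break
                  return best_end if best >= min_score else 0
              ```

              On this workflow's test data the scan found no telomeric ends (the stderr above reports only the total length, 137873807, with 0 hits), which is expected for a small subset assembly.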
      • Step 19: toolshed.g2.bx.psu.edu/repos/iuc/minimap2/minimap2/2.28+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • ln -f -s '/tmp/tmpjw6cynca/files/2/a/9/dataset_2a924c18-ec8a-47b0-a8de-8c8fa1d2c560.dat' reference.fa && minimap2 -x map-hifi    --q-occ-frac 0.01       -t ${GALAXY_SLOTS:-4} reference.fa '/tmp/tmpjw6cynca/files/5/6/4/dataset_56452c9a-629e-4d6e-9d61-6e2d04e5eebf.dat' -a | samtools view --no-PG -hT reference.fa | samtools sort -@${GALAXY_SLOTS:-2} -T "${TMPDIR:-.}" -O BAM -o '/tmp/tmpjw6cynca/job_working_directory/000/65/outputs/dataset_4302f78e-444f-46b2-a31b-d8fd4444548b.dat'

            Exit Code:

            • 0

            Standard Error:

            • [M::mm_idx_gen::4.964*0.75] collected minimizers
              [M::mm_idx_gen::6.346*0.78] sorted minimizers
              [M::main::6.346*0.78] loaded/built the index for 1 target sequence(s)
              [M::mm_mapopt_update::6.516*0.78] mid_occ = 113
              [M::mm_idx_stat] kmer size: 19; skip: 19; is_hpc: 0; #seq: 1
              [M::mm_idx_stat::6.629*0.79] distinct minimizers: 11276538 (95.62% are singletons); average occurrences: 1.229; average spacing: 9.951; total length: 137873807
              [M::worker_pipeline::146.365*0.98] mapped 5426 sequences
              [M::main] Version: 2.28-r1209
              [M::main] CMD: minimap2 -x map-hifi --q-occ-frac 0.01 -t 1 -a reference.fa /tmp/tmpjw6cynca/files/5/6/4/dataset_56452c9a-629e-4d6e-9d61-6e2d04e5eebf.dat
              [M::main] Real time: 146.444 sec; CPU: 142.957 sec; Peak RSS: 1.253 GB
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              alignment_options {"A": null, "B": null, "E": null, "E2": null, "O": null, "O2": null, "no_end_flt": true, "s": null, "splicing": {"__current_case__": 0, "splice_mode": "preset"}, "z": null, "z2": null}
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              fastq_input {"__current_case__": 0, "analysis_type_selector": "map-hifi", "fastq_input1": {"values": [{"id": 55, "src": "hda"}]}, "fastq_input_selector": "single"}
              indexing_options {"H": false, "I": null, "k": null, "w": null}
              io_options {"K": null, "L": false, "Q": false, "Y": false, "c": false, "cs": null, "eqx": false, "output_format": "BAM"}
              mapping_options {"F": null, "N": null, "X": false, "f": null, "g": null, "kmer_ocurrence_interval": {"__current_case__": 1, "interval": ""}, "m": null, "mask_len": null, "max_chain_iter": null, "max_chain_skip": null, "min_occ_floor": null, "n": null, "p": null, "q_occ_frac": "0.01", "r": null}
              reference_source {"__current_case__": 1, "ref_file": {"values": [{"id": 59, "src": "hda"}]}, "reference_source_selector": "history"}
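          The "distinct minimizers" line in the minimap2 stderr above refers to its (k,w)-minimizer index (k=19, window 19 here): for every window of w consecutive k-mers, only the smallest one is indexed, which is why far fewer seeds than k-mer positions are stored. A toy version, assuming plain lexicographic ordering on the forward strand (real minimap2 hashes k-mers and canonicalizes strands, which this sketch skips):

          ```python
          def window_minimizers(seq, k=3, w=2):
              # Toy (k,w)-minimizer selection: for each window of w consecutive
              # k-mers, keep the lexicographically smallest (leftmost on ties).
              kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
              picked = set()
              for s in range(len(kmers) - w + 1):
                  j = min(range(s, s + w), key=lambda x: kmers[x])
                  picked.add((j, kmers[j]))
              return picked

          mins = window_minimizers("AAACAA", k=3, w=2)
          ```

          The overlap between adjacent windows is what keeps the selected set small relative to the total number of k-mers, while still guaranteeing a shared seed in any sufficiently long exact match.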
      • Step 20: toolshed.g2.bx.psu.edu/repos/iuc/pretext_map/pretext_map/0.1.9+galaxy1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • samtools view -h '/tmp/tmpjw6cynca/files/7/0/3/dataset_703d773a-37db-4420-9d07-fd62b2cff94c.dat' | PretextMap --sortby length --sortorder descend --mapq 0 -o output.pretext

            Exit Code:

            • 0

            Standard Output:

            • [PretextMap status] :: Mapping to 1 sequences, sorted by length, descending. Filtering by minimum mapping quality 0
              
                                                                                              
              [PretextMap status] :: 10.0 k reads processed, 147  read-pairs mapped
                                                                                              
              [PretextMap status] :: 20.0 k reads processed, 280  read-pairs mapped
                                                                                              
              [PretextMap status] :: 30.0 k reads processed, 389  read-pairs mapped
                                                                                              
              [PretextMap status] :: 40.0 k reads processed, 541  read-pairs mapped
                                                                                              
              [PretextMap status] :: 50.0 k reads processed, 654  read-pairs mapped
                                                                                              
              [PretextMap status] :: 60.0 k reads processed, 797  read-pairs mapped
                                                                                              
              [PretextMap status] :: 70.0 k reads processed, 890  read-pairs mapped
                                                                                              
              [PretextMap status] :: 80.0 k reads processed, 1 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 90.0 k reads processed, 1.2 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 100.0 k reads processed, 1.4 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 110.0 k reads processed, 1.5 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 120.0 k reads processed, 1.6 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 130.0 k reads processed, 1.8 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 140.0 k reads processed, 1.9 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 150.0 k reads processed, 2.0 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 160.0 k reads processed, 2.1 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 170.0 k reads processed, 2.2 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 180.0 k reads processed, 2.3 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 190.0 k reads processed, 2.5 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 200.0 k reads processed, 2.6 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 210.0 k reads processed, 2.7 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 220.0 k reads processed, 2.8 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 230.0 k reads processed, 2.9 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 240.0 k reads processed, 3.0 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 250.0 k reads processed, 3.1 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 260.0 k reads processed, 3.3 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 270.0 k reads processed, 3.4 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 280.0 k reads processed, 3.6 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 290.0 k reads processed, 3.6 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 300.0 k reads processed, 3.7 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 310.0 k reads processed, 3.8 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 320.0 k reads processed, 3.9 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 330.0 k reads processed, 4.0 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 340.0 k reads processed, 4.1 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 350.0 k reads processed, 4.2 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 360.0 k reads processed, 4.4 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 370.0 k reads processed, 4.5 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 380.0 k reads processed, 4.6 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 390.0 k reads processed, 4.7 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 400.0 k reads processed, 4.8 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 410.0 k reads processed, 4.9 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 420.0 k reads processed, 4.9 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 430.0 k reads processed, 5.1 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 440.0 k reads processed, 5.2 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 450.0 k reads processed, 5.3 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 460.0 k reads processed, 5.4 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 470.0 k reads processed, 5.5 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 480.0 k reads processed, 5.7 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 490.0 k reads processed, 5.8 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 500.0 k reads processed, 5.9 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 510.0 k reads processed, 6.1 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 520.0 k reads processed, 6.2 k read-pairs mapped
                                                                                              
              [PretextMap status] :: 530.0 k reads processed, 6.3 k read-pairs mapped
              ...
              [PretextMap status] :: 1.1 M reads processed, 12.1 k read-pairs mapped
                                                                     
              ..
              :: 298/528 (56.44%) texture blocks written to disk
              ...
              [PretextMap status] :: 528/528 (100.00%) texture blocks written to disk
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              filter {"__current_case__": 0, "filter_type": ""}
              map_qual "0"
              sorting {"__current_case__": 1, "sortby": "length", "sortorder": "descend"}
      • Step 3: Haplotype 2:

        • step_state: scheduled
      • Step 21: toolshed.g2.bx.psu.edu/repos/devteam/column_maker/Add_a_column1/2.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/column_maker/aff5135563c6/column_maker/column_maker.py' --column-types str,int,int  --file '/tmp/tmpjw6cynca/job_working_directory/000/67/configs/tmpeht25n09' --fail-on-non-existent-columns --fail-on-non-computable '/tmp/tmpjw6cynca/files/9/a/7/dataset_9a7f18e5-fbbe-4167-9f10-7dfb02c4c31a.dat' '/tmp/tmpjw6cynca/job_working_directory/000/67/outputs/dataset_12d228d7-477a-41a5-8b2f-4f4f219b1122.dat'

            Exit Code:

            • 0

            Standard Output:

            • abs(int(c3)-int(c2))
              Computing 1 new columns with instructions ['abs(int(c3)-int(c2));;']
              Computed new column values for 100.00% of 8 lines written.
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              avoid_scientific_notation false
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              error_handling {"auto_col_types": true, "fail_on_non_existent_columns": true, "non_computable": {"__current_case__": 0, "action": "--fail-on-non-computable"}}
              ops {"__current_case__": 0, "expressions": [{"__index__": 0, "add_column": {"__current_case__": 0, "mode": "", "pos": ""}, "cond": "abs(int(c3)-int(c2))"}], "header_lines_select": "no"}
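        The column_maker job above appends `abs(int(c3)-int(c2))` — the interval length — as a new column of a tab-separated (chrom, start, end) table. A minimal Python sketch of the same computation (the rows below are illustrative, not data from this run):

        ```python
        # Append abs(c3 - c2), the interval length, to each row of a
        # tab-separated (chrom, start, end) table, mirroring the
        # column_maker expression abs(int(c3)-int(c2)).
        rows = [
            ("contig_1", "100", "2500"),
            ("contig_2", "0", "1800"),
        ]

        for c1, c2, c3 in rows:
            length = abs(int(c3) - int(c2))
            print("\t".join([c1, c2, c3, str(length)]))
        ```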
      • Step 22: Cut1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • perl '/tmp/tmpjw6cynca/galaxy-dev/tools/filters/cutWrapper.pl' '/tmp/tmpjw6cynca/files/3/c/a/dataset_3caa6820-5fb7-4260-acb4-ab5801408326.dat' 'c1,c2,c3' T '/tmp/tmpjw6cynca/job_working_directory/000/68/outputs/dataset_9ca98651-b162-459d-85bf-3f2f9e7f9314.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "bed"
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              columnList "c1,c2,c3"
              dbkey "?"
              delimiter "T"
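        The Cut1 job above keeps only columns `c1,c2,c3` of a tab-delimited file. A minimal sketch of that column selection (column numbers are 1-based, as in the tool; the sample line is hypothetical):

        ```python
        # Keep only the requested 1-based columns of a tab-delimited line,
        # as the Cut1 step does with 'c1,c2,c3'.
        def cut_columns(line: str, cols=(1, 2, 3)) -> str:
            fields = line.rstrip("\n").split("\t")
            return "\t".join(fields[c - 1] for c in cols)

        print(cut_columns("chr1\t100\t2500\tfeature_x\t0\t+"))  # -> chr1	100	2500
        ```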
      • Step 23: toolshed.g2.bx.psu.edu/repos/bgruening/deeptools_bam_coverage/deeptools_bam_coverage/3.5.4+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • ln -s '/tmp/tmpjw6cynca/files/4/3/0/dataset_4302f78e-444f-46b2-a31b-d8fd4444548b.dat' one.bam && ln -s '/tmp/tmpjw6cynca/files/_metadata_files/d/f/3/metadata_df3b600f-c351-4b09-bc60-12872e537c89.dat' one.bam.bai &&  bamCoverage --numberOfProcessors "${GALAXY_SLOTS:-4}"  --bam one.bam --outFileName '/tmp/tmpjw6cynca/job_working_directory/000/69/outputs/dataset_2451f336-5538-4514-b277-5dad48f1cbe0.dat' --outFileFormat 'bigwig'  --binSize 100

            Exit Code:

            • 0

            Standard Error:

            • bamFilesList: ['one.bam']
              binLength: 100
              numberOfSamples: None
              blackListFileName: None
              skipZeroOverZero: False
              bed_and_bin: False
              genomeChunkSize: None
              defaultFragmentLength: read length
              numberOfProcessors: 1
              verbose: False
              region: None
              bedFile: None
              minMappingQuality: None
              ignoreDuplicates: False
              chrsToSkip: []
              stepSize: 100
              center_read: False
              samFlag_include: None
              samFlag_exclude: None
              minFragmentLength: 0
              maxFragmentLength: 0
              zerosToNans: False
              smoothLength: None
              save_data: False
              out_file_for_raw_data: None
              maxPairedFragmentLength: 1000
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              advancedOpt {"__current_case__": 0, "showAdvancedOpt": "no"}
              binSize "100"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              exactScaling false
              outFileFormat "bigwig"
              region ""
              scaling {"__current_case__": 3, "type": "no"}
      • Step 24: toolshed.g2.bx.psu.edu/repos/devteam/column_maker/Add_a_column1/2.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/column_maker/aff5135563c6/column_maker/column_maker.py' --column-types  --file '/tmp/tmpjw6cynca/job_working_directory/000/70/configs/tmpdu5zb0oz' --fail-on-non-existent-columns --fail-on-non-computable '/tmp/tmpjw6cynca/files/9/c/a/dataset_9ca98651-b162-459d-85bf-3f2f9e7f9314.dat' '/tmp/tmpjw6cynca/job_working_directory/000/70/outputs/dataset_fcfc5b1f-d91e-4d9d-b975-4e8c04ca91bd.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              avoid_scientific_notation false
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              error_handling {"auto_col_types": true, "fail_on_non_existent_columns": true, "non_computable": {"__current_case__": 0, "action": "--fail-on-non-computable"}}
              ops {"__current_case__": 0, "expressions": [{"__index__": 0, "add_column": {"__current_case__": 0, "mode": "", "pos": ""}, "cond": "abs(int(c3)-int(c2))"}], "header_lines_select": "no"}
      • Step 25: Add Coverage Track:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cp '/tmp/tmpjw6cynca/files/3/f/c/dataset_3fc2587e-f204-4a6c-8818-cf4d45aa0b1e.dat' input.pretext && ln -s '/tmp/tmpjw6cynca/files/2/4/5/dataset_2451f336-5538-4514-b277-5dad48f1cbe0.dat' input.bigwig && bigWigToBedGraph input.bigwig /dev/stdout | PretextGraph -i input.pretext -n 'coverage' -o output.pretext

            Exit Code:

            • 0

            Standard Output:

            • [PretextGraph status] :: Release mode, number of thread: 4
              
              [PretextGraph status] :: Pretext file: input.pretext
              [PretextGraph status] :: Graph name: 'coverage'
              [PretextGraph status] :: The extension of the graph name is default, set the data_type into 0 (default).
              [PretextGraph status] :: Pretext file: output.pretext
              [PretextGraph status] :: Reading file...
              [PretextGraph status] :: File read
              
                                                                                              
              [PretextGraph status] :: 16.4 k bedgraph lines read
              [PretextGraph status] :: Transfer f32 to s32...
              [PretextGraph status] :: Saving graph...
              [PretextGraph status] :: Done
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              name "coverage"
      • Step 26: Test if telomere track is empty:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "bedgraph"
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              param_type "text"
              remove_newlines true
      • Step 27: False if telomere track is empty:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              input_param_type {"__current_case__": 0, "input_param": "", "mappings": [{"__index__": 0, "from": "", "to": "false"}], "type": "text"}
              output_param_type "boolean"
              unmapped {"__current_case__": 2, "default_value": "true", "on_unmapped": "default"}
      • Step 28: Add telomere track:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is skipped

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              name "telomeres_gap_format"
      • Step 29: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 78, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 75, "src": "hda"}]}}]}}
      • Step 30: Add gaps track:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cp '/tmp/tmpjw6cynca/files/4/d/5/dataset_4d5fd6d7-b139-4137-83f6-2f0fec200e31.dat' input.pretext && cat '/tmp/tmpjw6cynca/files/1/2/d/dataset_12d228d7-477a-41a5-8b2f-4f4f219b1122.dat' | PretextGraph -i input.pretext -n 'gaps' -o output.pretext

            Exit Code:

            • 0

            Standard Output:

            • [PretextGraph status] :: Release mode, number of thread: 4
              
              [PretextGraph status] :: Pretext file: input.pretext
              [PretextGraph status] :: Graph name: 'gaps'
              [PretextGraph status] :: The extension of the graph name is gap, set the data_type to 2 (gap).
              [PretextGraph status] :: Pretext file: output.pretext
              [PretextGraph status] :: Reading file...
              [PretextGraph status] :: File read
              
              [PretextGraph status] :: Transfer f32 to s32...
              [PretextGraph status] :: Saving graph...
              [PretextGraph status] :: Done
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              name "gaps"
      • Step 4: Do you want to add suffixes to the scaffold names?:

        • step_state: scheduled
      • Step 31: toolshed.g2.bx.psu.edu/repos/iuc/pretext_snapshot/pretext_snapshot/0.0.4+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • PretextSnapshot -m /tmp/tmpjw6cynca/files/8/3/3/dataset_833d2fd3-488b-4be9-b131-68b875454ab7.dat -f png -r 2000 -c 5 --sequences '=full' --minTexels 64 --gridSize 1 --gridColour black '' -o output --prefix pretext_snapshot

            Exit Code:

            • 0

            Standard Output:

            • [PretextSnapshot status] :: Input map: '/tmp/tmpjw6cynca/files/8/3/3/dataset_833d2fd3-488b-4be9-b131-68b875454ab7.dat'
              [PretextSnapshot status] :: Output format: png
              [PretextSnapshot status] :: Output resolution: 2000 pixels
              [PretextSnapshot status] :: Colour map: Three Wave Blue-Green-Yellow
              [PretextSnapshot status] :: Minimum texels per image: 64
              [PretextSnapshot status] :: Grid size: 1 pixel
              [PretextSnapshot status] :: Grid colour: 000000ff
              [PretextSnapshot status] :: Output path: 'output/pretext_snapshot'
              [PretextSnapshot status] :: Indexing file...
              [PretextSnapshot status] :: File indexed
              [PretextSnapshot status] :: 'output/pretext_snapshotFullMap.png' written to disk
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              chromInfo "/tmp/tmpjw6cynca/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              colormap "5"
              dbkey "?"
              formats {"__current_case__": 0, "outformat": "png"}
              grid {"__current_case__": 0, "gridcolor": "black", "gridsize": "1", "showGrid": "yes"}
              mintexels "64"
              resolution "2000"
              sequencenames false
              sequences "=full"
      • Step 32: __EXTRACT_DATASET__:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __workflow_invocation_uuid__ "e2d2d7dc0b2a11f0aeed000d3a3012a8"
              input {"values": [{"id": 14, "src": "hdca"}]}
              which {"__current_case__": 0, "which_dataset": "first"}
      • Step 5: First Haplotype suffix:

        • step_state: scheduled
      • Step 6: Second Haplotype suffix:

        • step_state: scheduled
      • Step 7: Hi-C reads:

        • step_state: scheduled
      • Step 8: Do you want to trim the Hi-C data?:

        • step_state: scheduled
      • Step 9: Telomere repeat to suit species:

        • step_state: scheduled
      • Step 10: PacBio reads:

        • step_state: scheduled
    • Other invocation details
      • history_id

        • 496fdcad65465798
      • history_state

        • ok
      • invocation_id

        • 74d2c40334dbdfb1
      • invocation_state

        • scheduled
      • workflow_id

        • 6939a158f632cb4b

@github-actions

Test Results (powered by Planemo)

Test Summary

Test State Count
Total 2
Passed 0
Error 2
Failure 0
Skipped 0
Errored Tests
  • ❌ hi-c-map-for-assembly-manual-curation.ga_0

    Execution Problem:

    • Final state of invocation 6b6f885a347c1737 is [failed]. Failed to run workflow, at least one job is in [paused] state.
      

    Workflow invocation details

    • Invocation Messages

      • Invocation scheduling failed because step 28 requires a dataset, but dataset entered a failed state.
    • Steps
      • Step 1: Haplotype 1:

        • step_state: scheduled
      • Step 2: Will you use a second haplotype?:

        • step_state: scheduled
      • Step 11: Hap2 not provided:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "f6aba3d60b2c11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              input_param_type {"__current_case__": 3, "input_param": false, "mappings": [{"__index__": 0, "from": false, "to": "true"}, {"__index__": 1, "from": true, "to": "false"}], "type": "boolean"}
              output_param_type "boolean"
              unmapped {"__current_case__": 2, "default_value": "false", "on_unmapped": "default"}
      • Step 12: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Hap1:

            • step_state: scheduled
          • Step 2: Hap2:

            • step_state: scheduled
          • Step 11: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "f6aba3d70b2c11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 10, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 2, "src": "hda"}]}}]}}
          • Step 12: toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/tp_cat/9.3+galaxy1:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "f6aba3d70b2c11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  queries [{"__index__": 0, "inputs2": {"values": [{"id": 12, "src": "hda"}]}}]
          • Step 3: Do you want to add suffixes to the scaffold names?:

            • step_state: scheduled
          • Step 4: Hap1 suffix:

            • step_state: scheduled
          • Step 5: Hap2 suffix:

            • step_state: scheduled
          • Step 6: Expression for hap1 suffixing:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "f6aba3d70b2c11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "&.", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "H1", "select_param_type": "text"}}]
                  dbkey "?"
          • Step 7: Expression for hap2 suffixing:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "f6aba3d70b2c11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "&.", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "H2", "select_param_type": "text"}}]
                  dbkey "?"
          • Step 8: add hap1 suffix:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "f6aba3d70b2c11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "find_pattern": ">.+$", "replace_pattern": null}]
          • Step 9: add hap2 suffix:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "f6aba3d70b2c11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "find_pattern": ">.+$", "replace_pattern": null}]
          • Step 10: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "f6aba3d70b2c11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 9, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 1, "src": "hda"}]}}]}}
      • Step 13: concatenate HiFi:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cat '/tmp/tmp94ch8nif/files/b/5/5/dataset_b55e5453-0ab1-48ba-a8f4-42917ce4e1de.dat' >> '/tmp/tmp94ch8nif/job_working_directory/000/14/outputs/dataset_234accb8-bbf8-4f8d-a670-b9726f819999.dat' && exit 0

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "f6aba3d60b2c11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              queries []
      • Step 14: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Hap1:

            • step_state: scheduled
          • Step 2: Do you want to add suffixes to the scaffold names?:

            • step_state: scheduled
          • Step 3: Hap1 suffix:

            • step_state: scheduled
          • Step 4: Expression for hap1 suffixing:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "f6aba3d80b2c11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "&.", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "H1", "select_param_type": "text"}}]
                  dbkey "?"
          • Step 5: add hap1 suffix:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • sed -r --sandbox -e 's/>.+$/&.H1/g' '/tmp/tmp94ch8nif/files/0/7/c/dataset_07c18519-71f5-43c7-bee5-1f2ee6661491.dat' > '/tmp/tmp94ch8nif/job_working_directory/000/16/outputs/dataset_c1b44820-ed47-49a4-8ff0-bf5c6c919a85.dat'

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "f6aba3d80b2c11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "find_pattern": ">.+$", "replace_pattern": "&.H1"}]
          • Step 6: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "f6aba3d80b2c11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 16, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 1, "src": "hda"}]}}]}}
      • Step 15: Pick Assembly:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "f6aba3d60b2c11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 13, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 17, "src": "hda"}]}}]}}
      • Step 16: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Reference:

            • step_state: scheduled
          • Step 2: Hi-C reads:

            • step_state: scheduled
          • Step 3: Do you want to trim the Hi-C data?:

            • step_state: scheduled
          • Step 4: Trim Hi-C reads 2:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • ln -f -s '/tmp/tmp94ch8nif/files/2/b/6/dataset_2b67fe9a-77cf-4b89-8dc4-86c38b532bb2.dat' 'Hi-C reads_1.fq.gz' && ln -f -s '/tmp/tmp94ch8nif/files/9/7/a/dataset_97aabc98-c477-4b13-af10-b97e1777d9f4.dat' 'Hi-C reads_2.fq.gz' &&  cutadapt  -j=${GALAXY_SLOTS:-4}     --error-rate=0.1 --times=1 --overlap=3    --action=trim   --cut=5 -U 5       --minimum-length=1      -o 'out1.fq.gz' -p 'out2.fq.gz'  'Hi-C reads_1.fq.gz' 'Hi-C reads_2.fq.gz'  > report.txt

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "f6aba3d90b2c11f08a576045bd086594"
                  adapter_options {"action": "trim", "error_rate": "0.1", "match_read_wildcards": false, "no_indels": false, "no_match_adapter_wildcards": true, "overlap": "3", "revcomp": false, "times": "1"}
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  filter_options {"discard_casava": false, "discard_trimmed": false, "discard_untrimmed": false, "max_average_error_rate": null, "max_expected_errors": null, "max_n": null, "maximum_length": null, "maximum_length2": null, "minimum_length": "1", "minimum_length2": null, "pair_filter": "any"}
                  library {"__current_case__": 2, "input_1": {"values": [{"id": 1, "src": "dce"}]}, "pair_adapters": false, "r1": {"adapters": [], "anywhere_adapters": [], "front_adapters": []}, "r2": {"adapters2": [], "anywhere_adapters2": [], "front_adapters2": []}, "type": "paired_collection"}
                  other_trimming_options {"cut": "5", "cut2": "5", "nextseq_trim": "0", "poly_a": false, "quality_cutoff": "0", "quality_cutoff2": "", "shorten_options": {"__current_case__": 1, "shorten_values": "False"}, "shorten_options_r2": {"__current_case__": 1, "shorten_values_r2": "False"}, "trim_n": false}
                  output_selector ["report"]
                  read_mod_options {"length_tag": null, "rename": null, "strip_suffix": null, "zero_cap": false}
          • Step 5: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "f6aba3d90b2c11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 7, "src": "dce"}]}}, {"__index__": 1, "value": {"values": [{"id": 2, "src": "dce"}]}}]}}
              • Job 2:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "f6aba3d90b2c11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 8, "src": "dce"}]}}, {"__index__": 1, "value": {"values": [{"id": 3, "src": "dce"}]}}]}}
          • Step 6: toolshed.g2.bx.psu.edu/repos/iuc/bwa_mem2/bwa_mem2/2.2.1+galaxy1:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is error

                Command Line:

                • set -o | grep -q pipefail && set -o pipefail;  ln -s '/tmp/tmp94ch8nif/files/c/1/b/dataset_c1b44820-ed47-49a4-8ff0-bf5c6c919a85.dat' 'localref.fasta' && bwa-mem2 index 'localref.fasta' &&    bwa-mem2 mem -t "${GALAXY_SLOTS:-1}" -v 1   -k '19' -w '100' -d '100' -r '1.5' -y '20' -c '500' -D '0.5' -W '0' -m '50' -S -P    -T '20' -h '5' -a                      'localref.fasta' '/tmp/tmp94ch8nif/files/b/7/6/dataset_b76235dd-eadb-427c-a44f-d09e8fca5fed.dat' '/tmp/tmp94ch8nif/files/4/6/2/dataset_46280c71-12da-4c80-8eb3-b2b92e47f9c2.dat'  | samtools sort -@${GALAXY_SLOTS:-2} -T "${TMPDIR:-.}" -O bam -o '/tmp/tmp94ch8nif/job_working_directory/000/22/outputs/dataset_31837295-affd-4fae-a4b4-7e6de13fe606.dat'

                Exit Code:

                • 127

                Standard Error:

                • Looking to launch executable "/usr/local/bin/bwa-mem2.avx2", simd = .avx2
                  Launching executable "/usr/local/bin/bwa-mem2.avx2"
                  [bwa_index] Pack FASTA... 0.65 sec
                  * Entering FMI_search
                  init ticks = 13841455736
                  ref seq len = 279307354
                  binary seq ticks = 5258441738
                  build suffix-array ticks = 91854353683
                  pos: 34913420, ref_seq_len__: 34913419
                  build fm-index ticks = 22451527139
                  Total time taken: 55.2677
                  /tmp/tmp94ch8nif/job_working_directory/000/22/tool_script.sh: line 23: samtools: command not found
                  Looking to launch executable "/usr/local/bin/bwa-mem2.avx2", simd = .avx2
                  Launching executable "/usr/local/bin/bwa-mem2.avx2"
                  -----------------------------
                  Executing in AVX2 mode!!
                  -----------------------------
                  * SA compression enabled with xfactor: 8
                  * Ref file: localref.fasta
                  * Entering FMI_search
                  * Index file found. Loading index from localref.fasta.bwt.2bit.64
                  * Reference seq len for bi-index = 279307355
                  * sentinel-index: 13446364
                  * Count:
                  0,	1
                  1,	84204314
                  2,	139653678
                  3,	195103042
                  4,	279307355
                  
                  * Reading other elements of the index from files localref.fasta
                  * Index prefix: localref.fasta
                  * Read 0 ALT contigs
                  * Done reading Index!!
                  * Reading reference genome..
                  * Binary seq file = localref.fasta.0123
                  * Reference genome size: 279307354 bp
                  * Done reading reference genome !!
                  
                  ------------------------------------------
                  1. Memory pre-allocation for Chaining: 139.3584 MB
                  2. Memory pre-allocation for BSW: 239.6170 MB
                  3. Memory pre-allocation for BWT: 77.3142 MB
                  ------------------------------------------
                  * Threads used (compute): 1
                  * No. of pipeline threads: 2
                  
                  [0000] read_chunk: 10000000, work_chunk_size: 10000124, nseq: 68494
                  	[0000][ M::kt_pipeline] read 68494 sequences (10000124 bp)...
                  [0000] Reallocating initial memory allocations!!
                  [0000] Calling mem_process_seqs.., task: 0
                  [0000] 1. Calling kt_for - worker_bwt
                  [0000] read_chunk: 10000000, work_chunk_size: 10000124, nseq: 68494
                  	[0000][ M::kt_pipeline] read 68494 sequences (10000124 bp)...
                  [0000] 2. Calling kt_for - worker_aln
                  [0000] Inferring insert size distribution of PE reads from data, l_pac: 139653677, n: 68494
                  [0000][PE] analyzing insert size distribution for orientation FF...
                  [0000][PE] (25, 50, 75) percentile: (1649, 4140, 6027)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 14783)
                  [0000][PE] mean and std.dev: (4063.15, 2569.67)
                  [0000][PE] low and high boundaries for proper pairs: (1, 19161)
                  [0000][PE] analyzing insert size distribution for orientation FR...
                  [0000][PE] (25, 50, 75) percentile: (157, 248, 382)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 832)
                  [0000][PE] mean and std.dev: (254.60, 148.66)
                  [0000][PE] low and high boundaries for proper pairs: (1, 1057)
                  [0000][PE] analyzing insert size distribution for orientation RF...
                  [0000][PE] (25, 50, 75) percentile: (2814, 4404, 5708)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 11496)
                  [0000][PE] mean and std.dev: (4321.60, 2279.57)
                  [0000][PE] low and high boundaries for proper pairs: (1, 14390)
                  [0000][PE] analyzing insert size distribution for orientation RR...
                  [0000][PE] (25, 50, 75) percentile: (1378, 2846, 5532)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 13840)
                  [0000][PE] mean and std.dev: (3534.90, 2567.86)
                  [0000][PE] low and high boundaries for proper pairs: (1, 17994)
                  [0000] 3. Calling kt_for - worker_sam
                  	[0000][ M::mem_process_seqs] Processed 68494 reads in 92.671 CPU sec, 92.889 real sec
                  

                Standard Output:

                • ref_seq_len = 279307354
                  count = 0, 84204313, 139653677, 195103041, 279307354
                  BWT[13446364] = 4
                  CP_SHIFT = 6, CP_MASK = 63
                  sizeof CP_OCC = 64
                  max_occ_ind = 4364177
                  

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "f6aba3d90b2c11f08a576045bd086594"
                  analysis_type {"__current_case__": 4, "algorithmic_options": {"D": "0.5", "P": true, "S": true, "W": "0", "__current_case__": 0, "algorithmic_options_selector": "set", "c": "500", "d": "100", "e": false, "k": "19", "m": "50", "r": "1.5", "w": "100", "y": "20"}, "analysis_type_selector": "full", "io_options": {"C": false, "K": null, "M": false, "T": "20", "V": false, "Y": false, "__current_case__": 0, "a": true, "five": false, "h": "5", "io_options_selector": "set", "q": false}, "scoring_options": {"__current_case__": 1, "scoring_options_selector": "do_not_set"}}
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  fastq_input {"__current_case__": 2, "fastq_input1": {"values": [{"id": 9, "src": "dce"}]}, "fastq_input_selector": "paired_collection", "iset_stats": null}
                  output_sort "coordinate"
                  reference_source {"__current_case__": 1, "ref_file": {"values": [{"id": 18, "src": "hda"}]}, "reference_source_selector": "history"}
                  rg {"__current_case__": 3, "rg_selector": "do_not_set"}
          • Step 7: toolshed.g2.bx.psu.edu/repos/iuc/samtools_merge/samtools_merge/1.20+galaxy2:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is paused

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "f6aba3d90b2c11f08a576045bd086594"
                  bed_file None
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  headerbam None
                  idpg false
                  idrg false
                  region None
                  seed "1"
      • Step 17: toolshed.g2.bx.psu.edu/repos/bgruening/gfastats/gfastats/1.3.9+galaxy2:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • gfastats '/tmp/tmp94ch8nif/files/c/1/b/dataset_c1b44820-ed47-49a4-8ff0-bf5c6c919a85.dat' --out-coord g   --tabular > '/tmp/tmp94ch8nif/job_working_directory/000/24/outputs/dataset_4e81dfd3-c328-45a5-90d7-00640ed80f7e.dat' --threads ${GALAXY_SLOTS:-8}

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "f6aba3d60b2c11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              mode_condition {"__current_case__": 1, "discover_paths": false, "locale": false, "selector": "statistics", "statistics_condition": {"__current_case__": 1, "out_coord": "g", "selector": "coordinates"}, "tabular": true}
              target_condition {"__current_case__": 0, "target_option": "false"}
      • Step 18: toolshed.g2.bx.psu.edu/repos/iuc/seqtk/seqtk_telo/1.4+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • seqtk telo -m 'CCCTAA' -p '1' -d '2000' -s '300'  '/tmp/tmp94ch8nif/files/c/1/b/dataset_c1b44820-ed47-49a4-8ff0-bf5c6c919a85.dat' > '/tmp/tmp94ch8nif/job_working_directory/000/25/outputs/dataset_0f6ad7af-63e8-4547-b519-1636b8eb7e8e.dat'

            Exit Code:

            • 0

            Standard Error:

            • 11012	139653677
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              P false
              __input_ext "input"
              __workflow_invocation_uuid__ "f6aba3d60b2c11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              d "2000"
              dbkey "?"
              m "CCCTAA"
              p "1"
              s "300"
      • Step 19: toolshed.g2.bx.psu.edu/repos/iuc/minimap2/minimap2/2.28+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • ln -f -s '/tmp/tmp94ch8nif/files/c/1/b/dataset_c1b44820-ed47-49a4-8ff0-bf5c6c919a85.dat' reference.fa && minimap2 -x map-hifi    --q-occ-frac 0.01       -t ${GALAXY_SLOTS:-4} reference.fa '/tmp/tmp94ch8nif/files/2/3/4/dataset_234accb8-bbf8-4f8d-a670-b9726f819999.dat' -a | samtools view --no-PG -hT reference.fa | samtools sort -@${GALAXY_SLOTS:-2} -T "${TMPDIR:-.}" -O BAM -o '/tmp/tmp94ch8nif/job_working_directory/000/26/outputs/dataset_19ccb6f0-c197-432d-b1f2-b41297632f06.dat'

            Exit Code:

            • 0

            Standard Error:

            • [M::mm_idx_gen::4.322*0.80] collected minimizers
              [M::mm_idx_gen::5.658*0.82] sorted minimizers
              [M::main::5.659*0.82] loaded/built the index for 1 target sequence(s)
              [M::mm_mapopt_update::5.784*0.83] mid_occ = 131
              [M::mm_idx_stat] kmer size: 19; skip: 19; is_hpc: 0; #seq: 1
              [M::mm_idx_stat::5.874*0.83] distinct minimizers: 11315845 (97.08% are singletons); average occurrences: 1.241; average spacing: 9.941; total length: 139653677
              [M::worker_pipeline::146.117*0.98] mapped 5426 sequences
              [M::main] Version: 2.28-r1209
              [M::main] CMD: minimap2 -x map-hifi --q-occ-frac 0.01 -t 1 -a reference.fa /tmp/tmp94ch8nif/files/2/3/4/dataset_234accb8-bbf8-4f8d-a670-b9726f819999.dat
              [M::main] Real time: 146.132 sec; CPU: 143.204 sec; Peak RSS: 1.375 GB
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "f6aba3d60b2c11f08a576045bd086594"
              alignment_options {"A": null, "B": null, "E": null, "E2": null, "O": null, "O2": null, "no_end_flt": true, "s": null, "splicing": {"__current_case__": 0, "splice_mode": "preset"}, "z": null, "z2": null}
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              fastq_input {"__current_case__": 0, "analysis_type_selector": "map-hifi", "fastq_input1": {"values": [{"id": 14, "src": "hda"}]}, "fastq_input_selector": "single"}
              indexing_options {"H": false, "I": null, "k": null, "w": null}
              io_options {"K": null, "L": false, "Q": false, "Y": false, "c": false, "cs": null, "eqx": false, "output_format": "BAM"}
              mapping_options {"F": null, "N": null, "X": false, "f": null, "g": null, "kmer_ocurrence_interval": {"__current_case__": 1, "interval": ""}, "m": null, "mask_len": null, "max_chain_iter": null, "max_chain_skip": null, "min_occ_floor": null, "n": null, "p": null, "q_occ_frac": "0.01", "r": null}
              reference_source {"__current_case__": 1, "ref_file": {"values": [{"id": 18, "src": "hda"}]}, "reference_source_selector": "history"}
      • Step 20: toolshed.g2.bx.psu.edu/repos/iuc/pretext_map/pretext_map/0.1.9+galaxy1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "f6aba3d60b2c11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              filter {"__current_case__": 0, "filter_type": ""}
              map_qual "0"
              sorting {"__current_case__": 1, "sortby": "length", "sortorder": "descend"}
      • Step 3: Haplotype 2:

        • step_state: scheduled
      • Step 21: toolshed.g2.bx.psu.edu/repos/devteam/column_maker/Add_a_column1/2.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/column_maker/aff5135563c6/column_maker/column_maker.py' --column-types str,int,int  --file '/tmp/tmp94ch8nif/job_working_directory/000/28/configs/tmpk2c9hj33' --fail-on-non-existent-columns --fail-on-non-computable '/tmp/tmp94ch8nif/files/4/e/8/dataset_4e81dfd3-c328-45a5-90d7-00640ed80f7e.dat' '/tmp/tmp94ch8nif/job_working_directory/000/28/outputs/dataset_60aa9889-5f0c-425d-b50f-fdd3fc603770.dat'

            Exit Code:

            • 0

            Standard Output:

            • abs(int(c3)-int(c2))
              Computing 1 new columns with instructions ['abs(int(c3)-int(c2));;']
              Computed new column values for 100.00% of 4 lines written.
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "f6aba3d60b2c11f08a576045bd086594"
              avoid_scientific_notation false
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              error_handling {"auto_col_types": true, "fail_on_non_existent_columns": true, "non_computable": {"__current_case__": 0, "action": "--fail-on-non-computable"}}
              ops {"__current_case__": 0, "expressions": [{"__index__": 0, "add_column": {"__current_case__": 0, "mode": "", "pos": ""}, "cond": "abs(int(c3)-int(c2))"}], "header_lines_select": "no"}
      • Step 22: Cut1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • perl '/tmp/tmp94ch8nif/galaxy-dev/tools/filters/cutWrapper.pl' '/tmp/tmp94ch8nif/files/0/f/6/dataset_0f6ad7af-63e8-4547-b519-1636b8eb7e8e.dat' 'c1,c2,c3' T '/tmp/tmp94ch8nif/job_working_directory/000/29/outputs/dataset_b4c9e5ff-27b5-412f-b75c-a0b7342f46de.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "bed"
              __workflow_invocation_uuid__ "f6aba3d60b2c11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              columnList "c1,c2,c3"
              dbkey "?"
              delimiter "T"
      • Step 23: toolshed.g2.bx.psu.edu/repos/bgruening/deeptools_bam_coverage/deeptools_bam_coverage/3.5.4+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • ln -s '/tmp/tmp94ch8nif/files/1/9/c/dataset_19ccb6f0-c197-432d-b1f2-b41297632f06.dat' one.bam && ln -s '/tmp/tmp94ch8nif/files/_metadata_files/1/4/e/metadata_14e632ef-42c3-4a1a-978d-859dd76c6650.dat' one.bam.bai &&  bamCoverage --numberOfProcessors "${GALAXY_SLOTS:-4}"  --bam one.bam --outFileName '/tmp/tmp94ch8nif/job_working_directory/000/30/outputs/dataset_1218cdec-ee4f-4a1b-95a1-2bd97063da48.dat' --outFileFormat 'bigwig'  --binSize 100

            Exit Code:

            • 0

            Standard Error:

            • bamFilesList: ['one.bam']
              binLength: 100
              numberOfSamples: None
              blackListFileName: None
              skipZeroOverZero: False
              bed_and_bin: False
              genomeChunkSize: None
              defaultFragmentLength: read length
              numberOfProcessors: 1
              verbose: False
              region: None
              bedFile: None
              minMappingQuality: None
              ignoreDuplicates: False
              chrsToSkip: []
              stepSize: 100
              center_read: False
              samFlag_include: None
              samFlag_exclude: None
              minFragmentLength: 0
              maxFragmentLength: 0
              zerosToNans: False
              smoothLength: None
              save_data: False
              out_file_for_raw_data: None
              maxPairedFragmentLength: 1000
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "f6aba3d60b2c11f08a576045bd086594"
              advancedOpt {"__current_case__": 0, "showAdvancedOpt": "no"}
              binSize "100"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              exactScaling false
              outFileFormat "bigwig"
              region ""
              scaling {"__current_case__": 3, "type": "no"}
      • Step 24: toolshed.g2.bx.psu.edu/repos/devteam/column_maker/Add_a_column1/2.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/column_maker/aff5135563c6/column_maker/column_maker.py' --column-types str,int,int  --file '/tmp/tmp94ch8nif/job_working_directory/000/31/configs/tmp3i3b9rag' --fail-on-non-existent-columns --fail-on-non-computable '/tmp/tmp94ch8nif/files/b/4/c/dataset_b4c9e5ff-27b5-412f-b75c-a0b7342f46de.dat' '/tmp/tmp94ch8nif/job_working_directory/000/31/outputs/dataset_8a0dd52c-e3e8-4638-85c8-5b652c016ab3.dat'

            Exit Code:

            • 0

            Standard Output:

            • abs(int(c3)-int(c2))
              Computing 1 new columns with instructions ['abs(int(c3)-int(c2));;']
              Computed new column values for 100.00% of 1 lines written.
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "f6aba3d60b2c11f08a576045bd086594"
              avoid_scientific_notation false
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              error_handling {"auto_col_types": true, "fail_on_non_existent_columns": true, "non_computable": {"__current_case__": 0, "action": "--fail-on-non-computable"}}
              ops {"__current_case__": 0, "expressions": [{"__index__": 0, "add_column": {"__current_case__": 0, "mode": "", "pos": ""}, "cond": "abs(int(c3)-int(c2))"}], "header_lines_select": "no"}
      • Step 25: Add Coverage Track:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "f6aba3d60b2c11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              name "coverage"
      • Step 26: Test if telomere track is empty:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "bedgraph"
              __workflow_invocation_uuid__ "f6aba3d60b2c11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              param_type "text"
              remove_newlines true
      • Step 27: False if telomere track is empty:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "f6aba3d60b2c11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              input_param_type {"__current_case__": 0, "input_param": "scaffold_10.H1\t0\t11012\t11012", "mappings": [{"__index__": 0, "from": "", "to": "false"}], "type": "text"}
              output_param_type "boolean"
              unmapped {"__current_case__": 2, "default_value": "true", "on_unmapped": "default"}
      • Step 28: Add telomere track:

        • step_state: new
      • Step 29: Unlabelled step:

        • step_state: new
      • Step 30: Add gaps track:

        • step_state: new
      • Step 4: Do you want to add suffixes to the scaffold names?:

        • step_state: scheduled
      • Step 31: Unlabelled step:

        • step_state: new
      • Step 32: Unlabelled step:

        • step_state: new
      • Step 5: First Haplotype suffix:

        • step_state: scheduled
      • Step 6: Second Haplotype suffix:

        • step_state: scheduled
      • Step 7: Hi-C reads:

        • step_state: scheduled
      • Step 8: Do you want to trim the Hi-C data?:

        • step_state: scheduled
      • Step 9: Telomere repeat to suit species:

        • step_state: scheduled
      • Step 10: PacBio reads:

        • step_state: scheduled
    • Other invocation details
      • error_message

        • Final state of invocation 6b6f885a347c1737 is [failed]. Failed to run workflow, at least one job is in [paused] state.
      • history_id

        • 6b6f885a347c1737
      • history_state

        • paused
      • invocation_id

        • 6b6f885a347c1737
      • invocation_state

        • failed
      • messages

        • [{'dependent_workflow_step_id': None, 'hda_id': '00507017c18859da', 'reason': 'dataset_failed', 'workflow_step_id': 27}]
      • workflow_id

        • 5c0594841f9dcc30
  • ❌ hi-c-map-for-assembly-manual-curation.ga_1

    Execution Problem:

    • Final state of invocation f44384aa1da712c8 is [failed]. Failed to run workflow, at least one job is in [paused] state.
      

    Workflow invocation details

    • Invocation Messages

      • Invocation scheduling failed because step 28 requires a dataset, but dataset entered a failed state.
    • Steps
      • Step 1: Haplotype 1:

        • step_state: scheduled
      • Step 2: Will you use a second haplotype?:

        • step_state: scheduled
      • Step 11: Hap2 not provided:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2dc92a60b2d11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              input_param_type {"__current_case__": 3, "input_param": false, "mappings": [{"__index__": 0, "from": false, "to": "true"}, {"__index__": 1, "from": true, "to": "false"}], "type": "boolean"}
              output_param_type "boolean"
              unmapped {"__current_case__": 2, "default_value": "false", "on_unmapped": "default"}
      • Step 12: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Hap1:

            • step_state: scheduled
          • Step 2: Hap2:

            • step_state: scheduled
          • Step 11: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2dc92a70b2d11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 46, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 38, "src": "hda"}]}}]}}
          • Step 12: toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/tp_cat/9.3+galaxy1:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2dc92a70b2d11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  queries [{"__index__": 0, "inputs2": {"values": [{"id": 48, "src": "hda"}]}}]
          • Step 3: Do you want to add suffixes to the scaffold names?:

            • step_state: scheduled
          • Step 4: Hap1 suffix:

            • step_state: scheduled
          • Step 5: Hap2 suffix:

            • step_state: scheduled
          • Step 6: Expression for hap1 suffixing:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2dc92a70b2d11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "&.", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "H1", "select_param_type": "text"}}]
                  dbkey "?"
          • Step 7: Expression for hap2 suffixing:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2dc92a70b2d11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "&.", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "H2", "select_param_type": "text"}}]
                  dbkey "?"
          • Step 8: add hap1 suffix:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2dc92a70b2d11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "find_pattern": ">.+$", "replace_pattern": null}]
          • Step 9: add hap2 suffix:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2dc92a70b2d11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "find_pattern": ">.+$", "replace_pattern": null}]
          • Step 10: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2dc92a70b2d11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 45, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 37, "src": "hda"}]}}]}}
      • Step 13: concatenate HiFi:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cat '/tmp/tmp94ch8nif/files/9/5/b/dataset_95be8062-98b6-41d8-93b2-1e965b5ff5d8.dat' >> '/tmp/tmp94ch8nif/job_working_directory/000/48/outputs/dataset_1dcb7377-156e-420e-ad94-e73047a76ea8.dat' && exit 0

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2dc92a60b2d11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              queries []
      • Step 14: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Hap1:

            • step_state: scheduled
          • Step 2: Do you want to add suffixes to the scaffold names?:

            • step_state: scheduled
          • Step 3: Hap1 suffix:

            • step_state: scheduled
          • Step 4: Expression for hap1 suffixing:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2dc92a80b2d11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "&.", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "H1", "select_param_type": "text"}}]
                  dbkey "?"
          • Step 5: add hap1 suffix:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • sed -r --sandbox -e 's/>.+$/&.H1/g' '/tmp/tmp94ch8nif/files/8/f/a/dataset_8fa9524f-a180-4b7b-9914-7f85a4b7d35e.dat' > '/tmp/tmp94ch8nif/job_working_directory/000/50/outputs/dataset_ad381e9f-6e2a-49de-b7d3-e9e0e088c622.dat'

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2dc92a80b2d11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "find_pattern": ">.+$", "replace_pattern": "&.H1"}]
          • Step 6: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2dc92a80b2d11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 52, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 37, "src": "hda"}]}}]}}
      • Step 15: Pick Assembly:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2dc92a60b2d11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 49, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 53, "src": "hda"}]}}]}}
      • Step 16: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Reference:

            • step_state: scheduled
          • Step 2: Hi-C reads:

            • step_state: scheduled
          • Step 3: Do you want to trim the Hi-C data?:

            • step_state: scheduled
          • Step 4: Trim Hi-C reads 2:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • ln -f -s '/tmp/tmp94ch8nif/files/4/0/4/dataset_4047d1b7-efa4-4130-bd7c-d016b594ba7c.dat' 'Hi-C reads_1.fq.gz' && ln -f -s '/tmp/tmp94ch8nif/files/c/0/3/dataset_c034b934-a19f-4b55-8a84-0d609f9766fe.dat' 'Hi-C reads_2.fq.gz' &&  cutadapt  -j=${GALAXY_SLOTS:-4}     --error-rate=0.1 --times=1 --overlap=3    --action=trim   --cut=5 -U 5       --minimum-length=1      -o 'out1.fq.gz' -p 'out2.fq.gz'  'Hi-C reads_1.fq.gz' 'Hi-C reads_2.fq.gz'  > report.txt

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2dc92a90b2d11f08a576045bd086594"
                  adapter_options {"action": "trim", "error_rate": "0.1", "match_read_wildcards": false, "no_indels": false, "no_match_adapter_wildcards": true, "overlap": "3", "revcomp": false, "times": "1"}
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  filter_options {"discard_casava": false, "discard_trimmed": false, "discard_untrimmed": false, "max_average_error_rate": null, "max_expected_errors": null, "max_n": null, "maximum_length": null, "maximum_length2": null, "minimum_length": "1", "minimum_length2": null, "pair_filter": "any"}
                  library {"__current_case__": 2, "input_1": {"values": [{"id": 13, "src": "dce"}]}, "pair_adapters": false, "r1": {"adapters": [], "anywhere_adapters": [], "front_adapters": []}, "r2": {"adapters2": [], "anywhere_adapters2": [], "front_adapters2": []}, "type": "paired_collection"}
                  other_trimming_options {"cut": "5", "cut2": "5", "nextseq_trim": "0", "poly_a": false, "quality_cutoff": "0", "quality_cutoff2": "", "shorten_options": {"__current_case__": 1, "shorten_values": "False"}, "shorten_options_r2": {"__current_case__": 1, "shorten_values_r2": "False"}, "trim_n": false}
                  output_selector ["report"]
                  read_mod_options {"length_tag": null, "rename": null, "strip_suffix": null, "zero_cap": false}
          • Step 5: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2dc92a90b2d11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 19, "src": "dce"}]}}, {"__index__": 1, "value": {"values": [{"id": 14, "src": "dce"}]}}]}}
              • Job 2:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2dc92a90b2d11f08a576045bd086594"
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 20, "src": "dce"}]}}, {"__index__": 1, "value": {"values": [{"id": 15, "src": "dce"}]}}]}}
          • Step 6: toolshed.g2.bx.psu.edu/repos/iuc/bwa_mem2/bwa_mem2/2.2.1+galaxy1:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is error

                Command Line:

                • set -o | grep -q pipefail && set -o pipefail;  ln -s '/tmp/tmp94ch8nif/files/a/d/3/dataset_ad381e9f-6e2a-49de-b7d3-e9e0e088c622.dat' 'localref.fasta' && bwa-mem2 index 'localref.fasta' &&    bwa-mem2 mem -t "${GALAXY_SLOTS:-1}" -v 1   -k '19' -w '100' -d '100' -r '1.5' -y '20' -c '500' -D '0.5' -W '0' -m '50' -S -P    -T '20' -h '5' -a                      'localref.fasta' '/tmp/tmp94ch8nif/files/1/a/4/dataset_1a44c31d-51c6-4d78-8d4a-e896a84303fb.dat' '/tmp/tmp94ch8nif/files/a/a/0/dataset_aa0b4fdc-e780-45e4-80fb-8662f0df9de3.dat'  | samtools sort -@${GALAXY_SLOTS:-2} -T "${TMPDIR:-.}" -O bam -o '/tmp/tmp94ch8nif/job_working_directory/000/56/outputs/dataset_d9f62870-ba76-4a4d-884e-cfecab515c21.dat'

                Exit Code:

                • 127

                Standard Error:

                • Looking to launch executable "/usr/local/bin/bwa-mem2.avx2", simd = .avx2
                  Launching executable "/usr/local/bin/bwa-mem2.avx2"
                  [bwa_index] Pack FASTA... 0.65 sec
                  * Entering FMI_search
                  init ticks = 13114870874
                  ref seq len = 275747614
                  binary seq ticks = 5681283856
                  build suffix-array ticks = 100589467818
                  pos: 34468452, ref_seq_len__: 34468451
                  build fm-index ticks = 21257192819
                  Total time taken: 58.2091
                  /tmp/tmp94ch8nif/job_working_directory/000/56/tool_script.sh: line 23: samtools: command not found
                  Looking to launch executable "/usr/local/bin/bwa-mem2.avx2", simd = .avx2
                  Launching executable "/usr/local/bin/bwa-mem2.avx2"
                  -----------------------------
                  Executing in AVX2 mode!!
                  -----------------------------
                  * SA compression enabled with xfactor: 8
                  * Ref file: localref.fasta
                  * Entering FMI_search
                  * Index file found. Loading index from localref.fasta.bwt.2bit.64
                  * Reference seq len for bi-index = 275747615
                  * sentinel-index: 224703773
                  * Count:
                  0,	1
                  1,	80527259
                  2,	137873808
                  3,	195220357
                  4,	275747615
                  
                  * Reading other elements of the index from files localref.fasta
                  * Index prefix: localref.fasta
                  * Read 0 ALT contigs
                  * Done reading Index!!
                  * Reading reference genome..
                  * Binary seq file = localref.fasta.0123
                  * Reference genome size: 275747614 bp
                  * Done reading reference genome !!
                  
                  ------------------------------------------
                  1. Memory pre-allocation for Chaining: 139.3584 MB
                  2. Memory pre-allocation for BSW: 239.6170 MB
                  3. Memory pre-allocation for BWT: 77.3142 MB
                  ------------------------------------------
                  * Threads used (compute): 1
                  * No. of pipeline threads: 2
                  
                  [0000] read_chunk: 10000000, work_chunk_size: 10000124, nseq: 68494
                  	[0000][ M::kt_pipeline] read 68494 sequences (10000124 bp)...
                  [0000] Reallocating initial memory allocations!!
                  [0000] Calling mem_process_seqs.., task: 0
                  [0000] 1. Calling kt_for - worker_bwt
                  [0000] read_chunk: 10000000, work_chunk_size: 10000124, nseq: 68494
                  	[0000][ M::kt_pipeline] read 68494 sequences (10000124 bp)...
                  [0000] 2. Calling kt_for - worker_aln
                  [0000] Inferring insert size distribution of PE reads from data, l_pac: 137873807, n: 68494
                  [0000][PE] analyzing insert size distribution for orientation FF...
                  [0000][PE] (25, 50, 75) percentile: (2021, 3042, 5240)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 11678)
                  [0000][PE] mean and std.dev: (3782.74, 2698.62)
                  [0000][PE] low and high boundaries for proper pairs: (1, 14897)
                  [0000][PE] analyzing insert size distribution for orientation FR...
                  [0000][PE] (25, 50, 75) percentile: (151, 230, 368)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 802)
                  [0000][PE] mean and std.dev: (244.21, 150.93)
                  [0000][PE] low and high boundaries for proper pairs: (1, 1019)
                  [0000][PE] analyzing insert size distribution for orientation RF...
                  [0000][PE] (25, 50, 75) percentile: (2708, 3331, 6114)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 12926)
                  [0000][PE] mean and std.dev: (4077.29, 2787.30)
                  [0000][PE] low and high boundaries for proper pairs: (1, 16332)
                  [0000][PE] analyzing insert size distribution for orientation RR...
                  [0000][PE] (25, 50, 75) percentile: (1608, 2936, 5772)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 14100)
                  [0000][PE] mean and std.dev: (3700.00, 2741.58)
                  [0000][PE] low and high boundaries for proper pairs: (1, 18264)
                  [0000] 3. Calling kt_for - worker_sam
                  	[0000][ M::mem_process_seqs] Processed 68494 reads in 81.560 CPU sec, 81.608 real sec
                  [0000] Calling mem_process_seqs.., task: 1
                  

                Standard Output:

                • ref_seq_len = 275747614
                  count = 0, 80527258, 137873807, 195220356, 275747614
                  BWT[224703773] = 4
                  CP_SHIFT = 6, CP_MASK = 63
                  sizeof CP_OCC = 64
                  max_occ_ind = 4308556
                  

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2dc92a90b2d11f08a576045bd086594"
                  analysis_type {"__current_case__": 4, "algorithmic_options": {"D": "0.5", "P": true, "S": true, "W": "0", "__current_case__": 0, "algorithmic_options_selector": "set", "c": "500", "d": "100", "e": false, "k": "19", "m": "50", "r": "1.5", "w": "100", "y": "20"}, "analysis_type_selector": "full", "io_options": {"C": false, "K": null, "M": false, "T": "20", "V": false, "Y": false, "__current_case__": 0, "a": true, "five": false, "h": "5", "io_options_selector": "set", "q": false}, "scoring_options": {"__current_case__": 1, "scoring_options_selector": "do_not_set"}}
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  fastq_input {"__current_case__": 2, "fastq_input1": {"values": [{"id": 21, "src": "dce"}]}, "fastq_input_selector": "paired_collection", "iset_stats": null}
                  output_sort "coordinate"
                  reference_source {"__current_case__": 1, "ref_file": {"values": [{"id": 54, "src": "hda"}]}, "reference_source_selector": "history"}
                  rg {"__current_case__": 3, "rg_selector": "do_not_set"}
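The exit code 127 in this job comes from the piped `samtools sort` stage (`samtools: command not found` in the stderr), not from bwa-mem2 itself: the tool script starts with `set -o pipefail`, so the missing command's status becomes the exit status of the whole pipeline. A minimal sketch of that behavior, with a hypothetical missing command standing in for `samtools`:

```python
import subprocess

# With `set -o pipefail`, a "command not found" (exit 127) in any pipe
# stage becomes the exit status of the whole pipeline, which is why the
# bwa-mem2 | samtools sort job above failed with 127 even though
# bwa-mem2 itself ran fine.
proc = subprocess.run(
    ["bash", "-c",
     "set -o pipefail; echo dummy | some_missing_tool 2>/dev/null | cat"],
)
print(proc.returncode)  # 127
```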
          • Step 7: toolshed.g2.bx.psu.edu/repos/iuc/samtools_merge/samtools_merge/1.20+galaxy2:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is paused

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "e2dc92a90b2d11f08a576045bd086594"
                  bed_file None
                  chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  headerbam None
                  idpg false
                  idrg false
                  region None
                  seed "1"
      • Step 17: toolshed.g2.bx.psu.edu/repos/bgruening/gfastats/gfastats/1.3.9+galaxy2:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • gfastats '/tmp/tmp94ch8nif/files/a/d/3/dataset_ad381e9f-6e2a-49de-b7d3-e9e0e088c622.dat' --out-coord g   --tabular > '/tmp/tmp94ch8nif/job_working_directory/000/58/outputs/dataset_31de9602-8f42-4bc4-aaf8-c5b537972b0f.dat' --threads ${GALAXY_SLOTS:-8}

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2dc92a60b2d11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              mode_condition {"__current_case__": 1, "discover_paths": false, "locale": false, "selector": "statistics", "statistics_condition": {"__current_case__": 1, "out_coord": "g", "selector": "coordinates"}, "tabular": true}
              target_condition {"__current_case__": 0, "target_option": "false"}
      • Step 18: toolshed.g2.bx.psu.edu/repos/iuc/seqtk/seqtk_telo/1.4+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • seqtk telo -m 'CCCTAA' -p '1' -d '2000' -s '300'  '/tmp/tmp94ch8nif/files/a/d/3/dataset_ad381e9f-6e2a-49de-b7d3-e9e0e088c622.dat' > '/tmp/tmp94ch8nif/job_working_directory/000/59/outputs/dataset_32b6643b-fdb7-4109-bec8-3ed3a7ecb4bf.dat'

            Exit Code:

            • 0

            Standard Error:

            • 0	137873807
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              P false
              __input_ext "input"
              __workflow_invocation_uuid__ "e2dc92a60b2d11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              d "2000"
              dbkey "?"
              m "CCCTAA"
              p "1"
              s "300"
      • Step 19: toolshed.g2.bx.psu.edu/repos/iuc/minimap2/minimap2/2.28+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • ln -f -s '/tmp/tmp94ch8nif/files/a/d/3/dataset_ad381e9f-6e2a-49de-b7d3-e9e0e088c622.dat' reference.fa && minimap2 -x map-hifi    --q-occ-frac 0.01       -t ${GALAXY_SLOTS:-4} reference.fa '/tmp/tmp94ch8nif/files/1/d/c/dataset_1dcb7377-156e-420e-ad94-e73047a76ea8.dat' -a | samtools view --no-PG -hT reference.fa | samtools sort -@${GALAXY_SLOTS:-2} -T "${TMPDIR:-.}" -O BAM -o '/tmp/tmp94ch8nif/job_working_directory/000/60/outputs/dataset_88399275-d760-4d15-acf5-3848322acef5.dat'

            Exit Code:

            • 0

            Standard Error:

            • [M::mm_idx_gen::4.470*0.77] collected minimizers
              [M::mm_idx_gen::5.825*0.81] sorted minimizers
              [M::main::5.826*0.81] loaded/built the index for 1 target sequence(s)
              [M::mm_mapopt_update::5.956*0.81] mid_occ = 113
              [M::mm_idx_stat] kmer size: 19; skip: 19; is_hpc: 0; #seq: 1
              [M::mm_idx_stat::6.056*0.82] distinct minimizers: 11276538 (95.62% are singletons); average occurrences: 1.229; average spacing: 9.951; total length: 137873807
              [M::worker_pipeline::143.520*0.98] mapped 5426 sequences
              [M::main] Version: 2.28-r1209
              [M::main] CMD: minimap2 -x map-hifi --q-occ-frac 0.01 -t 1 -a reference.fa /tmp/tmp94ch8nif/files/1/d/c/dataset_1dcb7377-156e-420e-ad94-e73047a76ea8.dat
              [M::main] Real time: 143.535 sec; CPU: 140.513 sec; Peak RSS: 1.257 GB
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2dc92a60b2d11f08a576045bd086594"
              alignment_options {"A": null, "B": null, "E": null, "E2": null, "O": null, "O2": null, "no_end_flt": true, "s": null, "splicing": {"__current_case__": 0, "splice_mode": "preset"}, "z": null, "z2": null}
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              fastq_input {"__current_case__": 0, "analysis_type_selector": "map-hifi", "fastq_input1": {"values": [{"id": 50, "src": "hda"}]}, "fastq_input_selector": "single"}
              indexing_options {"H": false, "I": null, "k": null, "w": null}
              io_options {"K": null, "L": false, "Q": false, "Y": false, "c": false, "cs": null, "eqx": false, "output_format": "BAM"}
              mapping_options {"F": null, "N": null, "X": false, "f": null, "g": null, "kmer_ocurrence_interval": {"__current_case__": 1, "interval": ""}, "m": null, "mask_len": null, "max_chain_iter": null, "max_chain_skip": null, "min_occ_floor": null, "n": null, "p": null, "q_occ_frac": "0.01", "r": null}
              reference_source {"__current_case__": 1, "ref_file": {"values": [{"id": 54, "src": "hda"}]}, "reference_source_selector": "history"}
      • Step 20: toolshed.g2.bx.psu.edu/repos/iuc/pretext_map/pretext_map/0.1.9+galaxy1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2dc92a60b2d11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              filter {"__current_case__": 0, "filter_type": ""}
              map_qual "0"
              sorting {"__current_case__": 1, "sortby": "length", "sortorder": "descend"}
      • Step 3: Haplotype 2:

        • step_state: scheduled
      • Step 21: toolshed.g2.bx.psu.edu/repos/devteam/column_maker/Add_a_column1/2.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/column_maker/aff5135563c6/column_maker/column_maker.py' --column-types str,int,int  --file '/tmp/tmp94ch8nif/job_working_directory/000/62/configs/tmpg6_hv1n3' --fail-on-non-existent-columns --fail-on-non-computable '/tmp/tmp94ch8nif/files/3/1/d/dataset_31de9602-8f42-4bc4-aaf8-c5b537972b0f.dat' '/tmp/tmp94ch8nif/job_working_directory/000/62/outputs/dataset_c9a4bd44-f832-4a87-8f68-5e861cf89113.dat'

            Exit Code:

            • 0

            Standard Output:

            • abs(int(c3)-int(c2))
              Computing 1 new columns with instructions ['abs(int(c3)-int(c2));;']
              Computed new column values for 100.00% of 8 lines written.
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2dc92a60b2d11f08a576045bd086594"
              avoid_scientific_notation false
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              error_handling {"auto_col_types": true, "fail_on_non_existent_columns": true, "non_computable": {"__current_case__": 0, "action": "--fail-on-non-computable"}}
              ops {"__current_case__": 0, "expressions": [{"__index__": 0, "add_column": {"__current_case__": 0, "mode": "", "pos": ""}, "cond": "abs(int(c3)-int(c2))"}], "header_lines_select": "no"}
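The `cond` expression in this step, `abs(int(c3)-int(c2))` (also visible in the job's stdout), appends a column holding the span of each gfastats coordinate record. A minimal Python sketch of the per-line computation, using the c2/c3 start/end column convention from the log:

```python
# Sketch of the Add_a_column expression abs(int(c3) - int(c2)):
# given a record's start column (c2) and end column (c3), the new
# column is the absolute length of the interval, regardless of
# column order.
def new_column(c2: str, c3: str) -> int:
    return abs(int(c3) - int(c2))

print(new_column("100", "250"))  # 150
print(new_column("250", "100"))  # 150
```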
      • Step 22: Cut1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • perl '/tmp/tmp94ch8nif/galaxy-dev/tools/filters/cutWrapper.pl' '/tmp/tmp94ch8nif/files/3/2/b/dataset_32b6643b-fdb7-4109-bec8-3ed3a7ecb4bf.dat' 'c1,c2,c3' T '/tmp/tmp94ch8nif/job_working_directory/000/63/outputs/dataset_6f4d8408-c90c-4538-b820-220c36659c15.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "bed"
              __workflow_invocation_uuid__ "e2dc92a60b2d11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              columnList "c1,c2,c3"
              dbkey "?"
              delimiter "T"
      • Step 23: toolshed.g2.bx.psu.edu/repos/bgruening/deeptools_bam_coverage/deeptools_bam_coverage/3.5.4+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • ln -s '/tmp/tmp94ch8nif/files/8/8/3/dataset_88399275-d760-4d15-acf5-3848322acef5.dat' one.bam && ln -s '/tmp/tmp94ch8nif/files/_metadata_files/1/b/4/metadata_1b4ae637-1394-4169-94fc-91813f3e1b04.dat' one.bam.bai &&  bamCoverage --numberOfProcessors "${GALAXY_SLOTS:-4}"  --bam one.bam --outFileName '/tmp/tmp94ch8nif/job_working_directory/000/64/outputs/dataset_c9f4f5ee-d298-40a3-8d96-673b8dc3a639.dat' --outFileFormat 'bigwig'  --binSize 100

            Exit Code:

            • 0

            Standard Error:

            • bamFilesList: ['one.bam']
              binLength: 100
              numberOfSamples: None
              blackListFileName: None
              skipZeroOverZero: False
              bed_and_bin: False
              genomeChunkSize: None
              defaultFragmentLength: read length
              numberOfProcessors: 1
              verbose: False
              region: None
              bedFile: None
              minMappingQuality: None
              ignoreDuplicates: False
              chrsToSkip: []
              stepSize: 100
              center_read: False
              samFlag_include: None
              samFlag_exclude: None
              minFragmentLength: 0
              maxFragmentLength: 0
              zerosToNans: False
              smoothLength: None
              save_data: False
              out_file_for_raw_data: None
              maxPairedFragmentLength: 1000
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2dc92a60b2d11f08a576045bd086594"
              advancedOpt {"__current_case__": 0, "showAdvancedOpt": "no"}
              binSize "100"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              exactScaling false
              outFileFormat "bigwig"
              region ""
              scaling {"__current_case__": 3, "type": "no"}
      • Step 24: toolshed.g2.bx.psu.edu/repos/devteam/column_maker/Add_a_column1/2.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/column_maker/aff5135563c6/column_maker/column_maker.py' --column-types  --file '/tmp/tmp94ch8nif/job_working_directory/000/65/configs/tmp6d21jbrj' --fail-on-non-existent-columns --fail-on-non-computable '/tmp/tmp94ch8nif/files/6/f/4/dataset_6f4d8408-c90c-4538-b820-220c36659c15.dat' '/tmp/tmp94ch8nif/job_working_directory/000/65/outputs/dataset_8dcd5910-6423-4b52-a222-379d87353d5c.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2dc92a60b2d11f08a576045bd086594"
              avoid_scientific_notation false
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              error_handling {"auto_col_types": true, "fail_on_non_existent_columns": true, "non_computable": {"__current_case__": 0, "action": "--fail-on-non-computable"}}
              ops {"__current_case__": 0, "expressions": [{"__index__": 0, "add_column": {"__current_case__": 0, "mode": "", "pos": ""}, "cond": "abs(int(c3)-int(c2))"}], "header_lines_select": "no"}
      • Step 25: Add Coverage Track:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2dc92a60b2d11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              name "coverage"
      • Step 26: Test if telomere track is empty:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "bedgraph"
              __workflow_invocation_uuid__ "e2dc92a60b2d11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              param_type "text"
              remove_newlines true
      • Step 27: False if telomere track is empty:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "e2dc92a60b2d11f08a576045bd086594"
              chromInfo "/tmp/tmp94ch8nif/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              input_param_type {"__current_case__": 0, "input_param": "", "mappings": [{"__index__": 0, "from": "", "to": "false"}], "type": "text"}
              output_param_type "boolean"
              unmapped {"__current_case__": 2, "default_value": "true", "on_unmapped": "default"}
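Taken together, steps 26 and 27 implement "skip the telomere track when the bedgraph is empty": the newline-stripped track text is mapped so that an empty string yields `false`, and any other content falls through to the default `true`. A hedged Python sketch of that mapping (the function name is illustrative, not a Galaxy API):

```python
# Mimics the "False if telomere track is empty" map_param_value step:
# mappings = [{"from": "", "to": "false"}], default on unmapped = "true".
def add_telomere_track(track_text: str) -> bool:
    mappings = {"": False}               # empty track -> false
    key = track_text.replace("\n", "")   # remove_newlines: true
    return mappings.get(key, True)       # unmapped -> default "true"

print(add_telomere_track(""))                # False
print(add_telomere_track("chr1\t0\t3000"))   # True
```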
      • Step 28: Add telomere track:

        • step_state: new
      • Step 29: Unlabelled step:

        • step_state: new
      • Step 30: Add gaps track:

        • step_state: new
      • Step 4: Do you want to add suffixes to the scaffold names?:

        • step_state: scheduled
      • Step 31: Unlabelled step:

        • step_state: new
      • Step 32: Unlabelled step:

        • step_state: new
      • Step 5: First Haplotype suffix:

        • step_state: scheduled
      • Step 6: Second Haplotype suffix:

        • step_state: scheduled
      • Step 7: Hi-C reads:

        • step_state: scheduled
      • Step 8: Do you want to trim the Hi-C data?:

        • step_state: scheduled
      • Step 9: Telomere repeat to suit species:

        • step_state: scheduled
      • Step 10: PacBio reads:

        • step_state: scheduled
    • Other invocation details
      • error_message

        • Final state of invocation f44384aa1da712c8 is [failed]. Failed to run workflow, at least one job is in [paused] state.
      • history_id

        • ccf04a673afd229c
      • history_state

        • paused
      • invocation_id

        • f44384aa1da712c8
      • invocation_state

        • failed
      • messages

        • [{'dependent_workflow_step_id': None, 'hda_id': '35e1bbbab2d37aa0', 'reason': 'dataset_failed', 'workflow_step_id': 27}]
      • workflow_id

        • 5c0594841f9dcc30

@github-actions

Test Results (powered by Planemo)

Test Summary

Test State Count
Total 2
Passed 0
Error 2
Failure 0
Skipped 0
Errored Tests
  • ❌ hi-c-map-for-assembly-manual-curation.ga_0

    Execution Problem:

    • Final state of invocation 426c98cdc83d7e58 is [failed]. Failed to run workflow, at least one job is in [paused] state.
      

    Workflow invocation details

    • Invocation Messages

      • Invocation scheduling failed because step 28 requires a dataset, but dataset entered a failed state.
    • Steps
      • Step 1: Haplotype 1:

        • step_state: scheduled
      • Step 2: Will you use a second haplotype?:

        • step_state: scheduled
      • Step 11: Hap2 not provided:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "ef47f53c0b3011f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              input_param_type {"__current_case__": 3, "input_param": false, "mappings": [{"__index__": 0, "from": false, "to": "true"}, {"__index__": 1, "from": true, "to": "false"}], "type": "boolean"}
              output_param_type "boolean"
              unmapped {"__current_case__": 2, "default_value": "false", "on_unmapped": "default"}
      • Step 12: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Hap1:

            • step_state: scheduled
          • Step 2: Hap2:

            • step_state: scheduled
          • Step 11: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "ef47f53d0b3011f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 10, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 2, "src": "hda"}]}}]}}
          • Step 12: toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/tp_cat/9.3+galaxy1:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "ef47f53d0b3011f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  queries [{"__index__": 0, "inputs2": {"values": [{"id": 12, "src": "hda"}]}}]
          • Step 3: Do you want to add suffixes to the scaffold names?:

            • step_state: scheduled
          • Step 4: Hap1 suffix:

            • step_state: scheduled
          • Step 5: Hap2 suffix:

            • step_state: scheduled
          • Step 6: Expression for hap1 suffixing:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "ef47f53d0b3011f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "&.", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "H1", "select_param_type": "text"}}]
                  dbkey "?"
          • Step 7: Expression for hap2 suffixing:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "ef47f53d0b3011f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "&.", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "H2", "select_param_type": "text"}}]
                  dbkey "?"
          • Step 8: add hap1 suffix:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "ef47f53d0b3011f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "find_pattern": ">.+$", "replace_pattern": null}]
          • Step 9: add hap2 suffix:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "ef47f53d0b3011f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "find_pattern": ">.+$", "replace_pattern": null}]
          • Step 10: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "ef47f53d0b3011f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 9, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 1, "src": "hda"}]}}]}}
      • Step 13: concatenate HiFi:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cat '/tmp/tmpkp3ixf17/files/9/3/d/dataset_93d2fae1-7f12-43c2-a620-8edc84a5e9aa.dat' >> '/tmp/tmpkp3ixf17/job_working_directory/000/14/outputs/dataset_5e607b56-db95-44e4-be2b-43b2b4653fe7.dat' && exit 0

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "ef47f53c0b3011f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              queries []
      • Step 14: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Hap1:

            • step_state: scheduled
          • Step 2: Do you want to add suffixes to the scaffold names?:

            • step_state: scheduled
          • Step 3: Hap1 suffix:

            • step_state: scheduled
          • Step 4: Expression for hap1 suffixing:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "ef47f53e0b3011f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "&.", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "H1", "select_param_type": "text"}}]
                  dbkey "?"
          • Step 5: add hap1 suffix:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • sed -r --sandbox -e 's/>.+$/&.H1/g' '/tmp/tmpkp3ixf17/files/4/3/1/dataset_4318a073-31ad-46b7-aa34-8d07b97ede11.dat' > '/tmp/tmpkp3ixf17/job_working_directory/000/16/outputs/dataset_1a673082-9227-49d5-9725-d3fd3b2aafa2.dat'

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "ef47f53e0b3011f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "find_pattern": ">.+$", "replace_pattern": "&.H1"}]
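Reviewer note: the `sed -r --sandbox -e 's/>.+$/&.H1/g'` command above appends the haplotype suffix to every FASTA header line, since `&` expands to the whole matched header. A minimal toy reproduction (file name and scaffold names are made up for illustration):

```shell
# Create a tiny two-record FASTA and apply the same suffixing pattern.
printf '>scaffold_1\nACGT\n>scaffold_2\nTTGA\n' > demo.fa
# '&' in the replacement is the entire matched '>...' header line.
sed -r -e 's/>.+$/&.H1/g' demo.fa
# Headers become '>scaffold_1.H1' and '>scaffold_2.H1'; sequence lines
# are untouched because they do not match the '>.+$' pattern.
```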
          • Step 6: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "ef47f53e0b3011f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 16, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 1, "src": "hda"}]}}]}}
      • Step 15: Pick Assembly:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "ef47f53c0b3011f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 13, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 17, "src": "hda"}]}}]}}
      • Step 16: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Reference:

            • step_state: scheduled
          • Step 2: Hi-C reads:

            • step_state: scheduled
          • Step 3: Do you want to trim the Hi-C data?:

            • step_state: scheduled
          • Step 4: Trim Hi-C reads 2:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • ln -f -s '/tmp/tmpkp3ixf17/files/6/5/4/dataset_6545ecab-df9d-4a0c-abd7-1140a62d9f69.dat' 'Hi-C reads_1.fq.gz' && ln -f -s '/tmp/tmpkp3ixf17/files/f/0/1/dataset_f01beb80-c6e6-4f81-bd8d-768f5090963b.dat' 'Hi-C reads_2.fq.gz' &&  cutadapt  -j=${GALAXY_SLOTS:-4}     --error-rate=0.1 --times=1 --overlap=3    --action=trim   --cut=5 -U 5       --minimum-length=1      -o 'out1.fq.gz' -p 'out2.fq.gz'  'Hi-C reads_1.fq.gz' 'Hi-C reads_2.fq.gz'  > report.txt

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "ef47f53f0b3011f08a576045bdd71c8b"
                  adapter_options {"action": "trim", "error_rate": "0.1", "match_read_wildcards": false, "no_indels": false, "no_match_adapter_wildcards": true, "overlap": "3", "revcomp": false, "times": "1"}
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  filter_options {"discard_casava": false, "discard_trimmed": false, "discard_untrimmed": false, "max_average_error_rate": null, "max_expected_errors": null, "max_n": null, "maximum_length": null, "maximum_length2": null, "minimum_length": "1", "minimum_length2": null, "pair_filter": "any"}
                  library {"__current_case__": 2, "input_1": {"values": [{"id": 1, "src": "dce"}]}, "pair_adapters": false, "r1": {"adapters": [], "anywhere_adapters": [], "front_adapters": []}, "r2": {"adapters2": [], "anywhere_adapters2": [], "front_adapters2": []}, "type": "paired_collection"}
                  other_trimming_options {"cut": "5", "cut2": "5", "nextseq_trim": "0", "poly_a": false, "quality_cutoff": "0", "quality_cutoff2": "", "shorten_options": {"__current_case__": 1, "shorten_values": "False"}, "shorten_options_r2": {"__current_case__": 1, "shorten_values_r2": "False"}, "trim_n": false}
                  output_selector ["report"]
                  read_mod_options {"length_tag": null, "rename": null, "strip_suffix": null, "zero_cap": false}
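Reviewer note: in the cutadapt invocation above, `--cut=5` and `-U 5` remove the first five bases from R1 and R2 respectively, before any adapter matching. A toy sketch of that effect using bash substring expansion (the read sequences here are invented):

```shell
# Simulate dropping the first 5 bases of each mate, as cutadapt's
# --cut 5 (R1) and -U 5 (R2) options do.
r1=ACGTACGTACGT
r2=TTTTTGGGGCCC
echo "${r1:5} ${r2:5}"   # prints: CGTACGT GGGGCCC
```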
          • Step 5: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "ef47f53f0b3011f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 7, "src": "dce"}]}}, {"__index__": 1, "value": {"values": [{"id": 2, "src": "dce"}]}}]}}
              • Job 2:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "ef47f53f0b3011f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 8, "src": "dce"}]}}, {"__index__": 1, "value": {"values": [{"id": 3, "src": "dce"}]}}]}}
          • Step 6: toolshed.g2.bx.psu.edu/repos/iuc/bwa_mem2/bwa_mem2/2.2.1+galaxy1:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is error

                Command Line:

                • set -o | grep -q pipefail && set -o pipefail;  ln -s '/tmp/tmpkp3ixf17/files/1/a/6/dataset_1a673082-9227-49d5-9725-d3fd3b2aafa2.dat' 'localref.fasta' && bwa-mem2 index 'localref.fasta' &&    bwa-mem2 mem -t "${GALAXY_SLOTS:-1}" -v 1   -k '19' -w '100' -d '100' -r '1.5' -y '20' -c '500' -D '0.5' -W '0' -m '50' -S -P    -T '20' -h '5' -a                      'localref.fasta' '/tmp/tmpkp3ixf17/files/d/4/1/dataset_d41d1393-b065-4360-ae8b-5413875feca6.dat' '/tmp/tmpkp3ixf17/files/2/8/e/dataset_28e14152-ff7c-4d3a-82d6-9b138838fbc3.dat'  | samtools sort -@${GALAXY_SLOTS:-2} -T "${TMPDIR:-.}" -O bam -o '/tmp/tmpkp3ixf17/job_working_directory/000/22/outputs/dataset_9479d8dc-aee0-4d0b-a017-e49e82bd3104.dat'

                Exit Code:

                • 127

                Standard Error:

                • Looking to launch executable "/usr/local/bin/bwa-mem2.avx2", simd = .avx2
                  Launching executable "/usr/local/bin/bwa-mem2.avx2"
                  [bwa_index] Pack FASTA... 0.60 sec
                  * Entering FMI_search
                  init ticks = 13443405854
                  ref seq len = 279307354
                  binary seq ticks = 5386094896
                  build suffix-array ticks = 88242136442
                  pos: 34913420, ref_seq_len__: 34913419
                  build fm-index ticks = 20970795659
                  Total time taken: 53.0134
                  /tmp/tmpkp3ixf17/job_working_directory/000/22/tool_script.sh: line 23: samtools: command not found
                  Looking to launch executable "/usr/local/bin/bwa-mem2.avx2", simd = .avx2
                  Launching executable "/usr/local/bin/bwa-mem2.avx2"
                  -----------------------------
                  Executing in AVX2 mode!!
                  -----------------------------
                  * SA compression enabled with xfactor: 8
                  * Ref file: localref.fasta
                  * Entering FMI_search
                  * Index file found. Loading index from localref.fasta.bwt.2bit.64
                  * Reference seq len for bi-index = 279307355
                  * sentinel-index: 13446364
                  * Count:
                  0,	1
                  1,	84204314
                  2,	139653678
                  3,	195103042
                  4,	279307355
                  
                  * Reading other elements of the index from files localref.fasta
                  * Index prefix: localref.fasta
                  * Read 0 ALT contigs
                  * Done reading Index!!
                  * Reading reference genome..
                  * Binary seq file = localref.fasta.0123
                  * Reference genome size: 279307354 bp
                  * Done reading reference genome !!
                  
                  ------------------------------------------
                  1. Memory pre-allocation for Chaining: 139.3584 MB
                  2. Memory pre-allocation for BSW: 239.6170 MB
                  3. Memory pre-allocation for BWT: 77.3142 MB
                  ------------------------------------------
                  * Threads used (compute): 1
                  * No. of pipeline threads: 2
                  
                  [0000] read_chunk: 10000000, work_chunk_size: 10000124, nseq: 68494
                  	[0000][ M::kt_pipeline] read 68494 sequences (10000124 bp)...
                  [0000] Reallocating initial memory allocations!!
                  [0000] Calling mem_process_seqs.., task: 0
                  [0000] 1. Calling kt_for - worker_bwt
                  [0000] read_chunk: 10000000, work_chunk_size: 10000124, nseq: 68494
                  	[0000][ M::kt_pipeline] read 68494 sequences (10000124 bp)...
                  [0000] 2. Calling kt_for - worker_aln
                  [0000] Inferring insert size distribution of PE reads from data, l_pac: 139653677, n: 68494
                  [0000][PE] analyzing insert size distribution for orientation FF...
                  [0000][PE] (25, 50, 75) percentile: (1649, 4140, 6027)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 14783)
                  [0000][PE] mean and std.dev: (4063.15, 2569.67)
                  [0000][PE] low and high boundaries for proper pairs: (1, 19161)
                  [0000][PE] analyzing insert size distribution for orientation FR...
                  [0000][PE] (25, 50, 75) percentile: (157, 248, 382)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 832)
                  [0000][PE] mean and std.dev: (254.60, 148.66)
                  [0000][PE] low and high boundaries for proper pairs: (1, 1057)
                  [0000][PE] analyzing insert size distribution for orientation RF...
                  [0000][PE] (25, 50, 75) percentile: (2814, 4404, 5708)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 11496)
                  [0000][PE] mean and std.dev: (4321.60, 2279.57)
                  [0000][PE] low and high boundaries for proper pairs: (1, 14390)
                  [0000][PE] analyzing insert size distribution for orientation RR...
                  [0000][PE] (25, 50, 75) percentile: (1378, 2846, 5532)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 13840)
                  [0000][PE] mean and std.dev: (3534.90, 2567.86)
                  [0000][PE] low and high boundaries for proper pairs: (1, 17994)
                  [0000] 3. Calling kt_for - worker_sam
                  	[0000][ M::mem_process_seqs] Processed 68494 reads in 93.442 CPU sec, 93.703 real sec
                  

                Standard Output:

                • ref_seq_len = 279307354
                  count = 0, 84204313, 139653677, 195103041, 279307354
                  BWT[13446364] = 4
                  CP_SHIFT = 6, CP_MASK = 63
                  sizeof CP_OCC = 64
                  max_occ_ind = 4364177
                  

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "ef47f53f0b3011f08a576045bdd71c8b"
                  analysis_type {"__current_case__": 4, "algorithmic_options": {"D": "0.5", "P": true, "S": true, "W": "0", "__current_case__": 0, "algorithmic_options_selector": "set", "c": "500", "d": "100", "e": false, "k": "19", "m": "50", "r": "1.5", "w": "100", "y": "20"}, "analysis_type_selector": "full", "io_options": {"C": false, "K": null, "M": false, "T": "20", "V": false, "Y": false, "__current_case__": 0, "a": true, "five": false, "h": "5", "io_options_selector": "set", "q": false}, "scoring_options": {"__current_case__": 1, "scoring_options_selector": "do_not_set"}}
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  fastq_input {"__current_case__": 2, "fastq_input1": {"values": [{"id": 9, "src": "dce"}]}, "fastq_input_selector": "paired_collection", "iset_stats": null}
                  output_sort "coordinate"
                  reference_source {"__current_case__": 1, "ref_file": {"values": [{"id": 18, "src": "hda"}]}, "reference_source_selector": "history"}
                  rg {"__current_case__": 3, "rg_selector": "do_not_set"}
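Reviewer note on the exit code 127 above: the bwa-mem2 alignment itself completed, but `samtools` was missing from the job environment, and because the script starts with `set -o pipefail`, the failure of the downstream `samtools sort` stage is what the whole pipeline reports. A minimal reproduction of that failure mode (using a deliberately nonexistent command name):

```shell
# With pipefail, a pipeline's exit status is the rightmost non-zero
# status; 'command not found' yields 127 even though 'cat' succeeds.
set -o pipefail
echo data | no_such_cmd_zz 2>/dev/null | cat >/dev/null
echo "pipeline exit: $?"   # prints: pipeline exit: 127
```

This suggests the tool container or dependency resolution for the bwa_mem2 wrapper is missing samtools, rather than a problem with the workflow inputs.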
          • Step 7: toolshed.g2.bx.psu.edu/repos/iuc/samtools_merge/samtools_merge/1.20+galaxy2:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is paused

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "ef47f53f0b3011f08a576045bdd71c8b"
                  bed_file None
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  headerbam None
                  idpg false
                  idrg false
                  region None
                  seed "1"
      • Step 17: toolshed.g2.bx.psu.edu/repos/bgruening/gfastats/gfastats/1.3.9+galaxy2:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • gfastats '/tmp/tmpkp3ixf17/files/1/a/6/dataset_1a673082-9227-49d5-9725-d3fd3b2aafa2.dat' --out-coord g   --tabular > '/tmp/tmpkp3ixf17/job_working_directory/000/24/outputs/dataset_b3ac5b46-e92f-4422-8a54-f26e223298ad.dat' --threads ${GALAXY_SLOTS:-8}

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "ef47f53c0b3011f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              mode_condition {"__current_case__": 1, "discover_paths": false, "locale": false, "selector": "statistics", "statistics_condition": {"__current_case__": 1, "out_coord": "g", "selector": "coordinates"}, "tabular": true}
              target_condition {"__current_case__": 0, "target_option": "false"}
      • Step 18: toolshed.g2.bx.psu.edu/repos/iuc/seqtk/seqtk_telo/1.4+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • seqtk telo -m 'CCCTAA' -p '1' -d '2000' -s '300'  '/tmp/tmpkp3ixf17/files/1/a/6/dataset_1a673082-9227-49d5-9725-d3fd3b2aafa2.dat' > '/tmp/tmpkp3ixf17/job_working_directory/000/25/outputs/dataset_006a287f-de60-486b-b08b-acf15c23b6b1.dat'

            Exit Code:

            • 0

            Standard Error:

            • 11012	139653677
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              P false
              __input_ext "input"
              __workflow_invocation_uuid__ "ef47f53c0b3011f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              d "2000"
              dbkey "?"
              m "CCCTAA"
              p "1"
              s "300"
      • Step 19: toolshed.g2.bx.psu.edu/repos/iuc/minimap2/minimap2/2.28+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • ln -f -s '/tmp/tmpkp3ixf17/files/1/a/6/dataset_1a673082-9227-49d5-9725-d3fd3b2aafa2.dat' reference.fa && minimap2 -x map-hifi    --q-occ-frac 0.01       -t ${GALAXY_SLOTS:-4} reference.fa '/tmp/tmpkp3ixf17/files/5/e/6/dataset_5e607b56-db95-44e4-be2b-43b2b4653fe7.dat' -a | samtools view --no-PG -hT reference.fa | samtools sort -@${GALAXY_SLOTS:-2} -T "${TMPDIR:-.}" -O BAM -o '/tmp/tmpkp3ixf17/job_working_directory/000/26/outputs/dataset_c7be03cd-7de3-4a54-8959-efb87252b58a.dat'

            Exit Code:

            • 0

            Standard Error:

            • [M::mm_idx_gen::4.515*0.77] collected minimizers
              [M::mm_idx_gen::5.954*0.80] sorted minimizers
              [M::main::5.954*0.80] loaded/built the index for 1 target sequence(s)
              [M::mm_mapopt_update::6.082*0.81] mid_occ = 131
              [M::mm_idx_stat] kmer size: 19; skip: 19; is_hpc: 0; #seq: 1
              [M::mm_idx_stat::6.183*0.81] distinct minimizers: 11315845 (97.08% are singletons); average occurrences: 1.241; average spacing: 9.941; total length: 139653677
              [M::worker_pipeline::146.477*0.98] mapped 5426 sequences
              [M::main] Version: 2.28-r1209
              [M::main] CMD: minimap2 -x map-hifi --q-occ-frac 0.01 -t 1 -a reference.fa /tmp/tmpkp3ixf17/files/5/e/6/dataset_5e607b56-db95-44e4-be2b-43b2b4653fe7.dat
              [M::main] Real time: 146.492 sec; CPU: 143.732 sec; Peak RSS: 1.376 GB
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "ef47f53c0b3011f08a576045bdd71c8b"
              alignment_options {"A": null, "B": null, "E": null, "E2": null, "O": null, "O2": null, "no_end_flt": true, "s": null, "splicing": {"__current_case__": 0, "splice_mode": "preset"}, "z": null, "z2": null}
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              fastq_input {"__current_case__": 0, "analysis_type_selector": "map-hifi", "fastq_input1": {"values": [{"id": 14, "src": "hda"}]}, "fastq_input_selector": "single"}
              indexing_options {"H": false, "I": null, "k": null, "w": null}
              io_options {"K": null, "L": false, "Q": false, "Y": false, "c": false, "cs": null, "eqx": false, "output_format": "BAM"}
              mapping_options {"F": null, "N": null, "X": false, "f": null, "g": null, "kmer_ocurrence_interval": {"__current_case__": 1, "interval": ""}, "m": null, "mask_len": null, "max_chain_iter": null, "max_chain_skip": null, "min_occ_floor": null, "n": null, "p": null, "q_occ_frac": "0.01", "r": null}
              reference_source {"__current_case__": 1, "ref_file": {"values": [{"id": 18, "src": "hda"}]}, "reference_source_selector": "history"}
      • Step 20: toolshed.g2.bx.psu.edu/repos/iuc/pretext_map/pretext_map/0.1.9+galaxy1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "ef47f53c0b3011f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              filter {"__current_case__": 0, "filter_type": ""}
              map_qual "0"
              sorting {"__current_case__": 1, "sortby": "length", "sortorder": "descend"}
      • Step 3: Haplotype 2:

        • step_state: scheduled
      • Step 21: toolshed.g2.bx.psu.edu/repos/devteam/column_maker/Add_a_column1/2.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/column_maker/aff5135563c6/column_maker/column_maker.py' --column-types str,int,int  --file '/tmp/tmpkp3ixf17/job_working_directory/000/28/configs/tmpa9l0hkk5' --fail-on-non-existent-columns --fail-on-non-computable '/tmp/tmpkp3ixf17/files/b/3/a/dataset_b3ac5b46-e92f-4422-8a54-f26e223298ad.dat' '/tmp/tmpkp3ixf17/job_working_directory/000/28/outputs/dataset_9d66fc6f-055f-4132-ad25-de220e4080ba.dat'

            Exit Code:

            • 0

            Standard Output:

            • abs(int(c3)-int(c2))
              Computing 1 new columns with instructions ['abs(int(c3)-int(c2));;']
              Computed new column values for 100.00% of 4 lines written.
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "ef47f53c0b3011f08a576045bdd71c8b"
              avoid_scientific_notation false
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              error_handling {"auto_col_types": true, "fail_on_non_existent_columns": true, "non_computable": {"__current_case__": 0, "action": "--fail-on-non-computable"}}
              ops {"__current_case__": 0, "expressions": [{"__index__": 0, "add_column": {"__current_case__": 0, "mode": "", "pos": ""}, "cond": "abs(int(c3)-int(c2))"}], "header_lines_select": "no"}
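Reviewer note: the `abs(int(c3)-int(c2))` expression above adds a column holding the length of each coordinate interval (end minus start, in the tool's 1-based column naming). An equivalent awk sketch on a made-up single-row input:

```shell
# For rows of (name, start, end), append abs(end - start) as a new column,
# mirroring the Add_a_column expression abs(int(c3)-int(c2)).
printf 'scaffold_1\t0\t139653677\n' \
  | awk -v OFS='\t' '{d = $3 - $2; if (d < 0) d = -d; print $0, d}'
# prints: scaffold_1  0  139653677  139653677
```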
      • Step 22: Cut1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • perl '/tmp/tmpkp3ixf17/galaxy-dev/tools/filters/cutWrapper.pl' '/tmp/tmpkp3ixf17/files/0/0/6/dataset_006a287f-de60-486b-b08b-acf15c23b6b1.dat' 'c1,c2,c3' T '/tmp/tmpkp3ixf17/job_working_directory/000/29/outputs/dataset_3c0b3f62-2684-4179-803b-fc0140bf1404.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "bed"
              __workflow_invocation_uuid__ "ef47f53c0b3011f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              columnList "c1,c2,c3"
              dbkey "?"
              delimiter "T"
      • Step 23: toolshed.g2.bx.psu.edu/repos/bgruening/deeptools_bam_coverage/deeptools_bam_coverage/3.5.4+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • ln -s '/tmp/tmpkp3ixf17/files/c/7/b/dataset_c7be03cd-7de3-4a54-8959-efb87252b58a.dat' one.bam && ln -s '/tmp/tmpkp3ixf17/files/_metadata_files/5/2/7/metadata_527aa75f-5481-4ef0-b410-72487b7b9009.dat' one.bam.bai &&  bamCoverage --numberOfProcessors "${GALAXY_SLOTS:-4}"  --bam one.bam --outFileName '/tmp/tmpkp3ixf17/job_working_directory/000/30/outputs/dataset_849824e7-db94-4400-aeff-c3b0e1af3c47.dat' --outFileFormat 'bigwig'  --binSize 100

            Exit Code:

            • 0

            Standard Error:

            • bamFilesList: ['one.bam']
              binLength: 100
              numberOfSamples: None
              blackListFileName: None
              skipZeroOverZero: False
              bed_and_bin: False
              genomeChunkSize: None
              defaultFragmentLength: read length
              numberOfProcessors: 1
              verbose: False
              region: None
              bedFile: None
              minMappingQuality: None
              ignoreDuplicates: False
              chrsToSkip: []
              stepSize: 100
              center_read: False
              samFlag_include: None
              samFlag_exclude: None
              minFragmentLength: 0
              maxFragmentLength: 0
              zerosToNans: False
              smoothLength: None
              save_data: False
              out_file_for_raw_data: None
              maxPairedFragmentLength: 1000
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "ef47f53c0b3011f08a576045bdd71c8b"
              advancedOpt {"__current_case__": 0, "showAdvancedOpt": "no"}
              binSize "100"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              exactScaling false
              outFileFormat "bigwig"
              region ""
              scaling {"__current_case__": 3, "type": "no"}
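The `bamCoverage --binSize 100` job above reduces a BAM to per-100-bp-bin coverage written as bigwig. A minimal sketch of that binning idea (toy `(start, end)` read intervals, not actual BAM parsing or deepTools internals):

```python
# Sketch of fixed-bin coverage (as in bamCoverage --binSize): count how
# many reads overlap each bin. Toy intervals, not BAM parsing.

def binned_coverage(reads, chrom_len, bin_size=100):
    n_bins = (chrom_len + bin_size - 1) // bin_size
    cov = [0] * n_bins
    for start, end in reads:  # half-open [start, end)
        for b in range(start // bin_size, (end - 1) // bin_size + 1):
            cov[b] += 1
    return cov

print(binned_coverage([(0, 150), (90, 210)], chrom_len=300))
```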
      • Step 24: toolshed.g2.bx.psu.edu/repos/devteam/column_maker/Add_a_column1/2.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/column_maker/aff5135563c6/column_maker/column_maker.py' --column-types str,int,int  --file '/tmp/tmpkp3ixf17/job_working_directory/000/31/configs/tmpna_v2lbp' --fail-on-non-existent-columns --fail-on-non-computable '/tmp/tmpkp3ixf17/files/3/c/0/dataset_3c0b3f62-2684-4179-803b-fc0140bf1404.dat' '/tmp/tmpkp3ixf17/job_working_directory/000/31/outputs/dataset_04753e13-318d-45f4-94fd-a08e72bb6f29.dat'

            Exit Code:

            • 0

            Standard Output:

            • abs(int(c3)-int(c2))
              Computing 1 new columns with instructions ['abs(int(c3)-int(c2));;']
              Computed new column values for 100.00% of 1 lines written.
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "ef47f53c0b3011f08a576045bdd71c8b"
              avoid_scientific_notation false
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              error_handling {"auto_col_types": true, "fail_on_non_existent_columns": true, "non_computable": {"__current_case__": 0, "action": "--fail-on-non-computable"}}
              ops {"__current_case__": 0, "expressions": [{"__index__": 0, "add_column": {"__current_case__": 0, "mode": "", "pos": ""}, "cond": "abs(int(c3)-int(c2))"}], "header_lines_select": "no"}
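The `column_maker` expression above (`abs(int(c3)-int(c2))`) appends an interval-length column to each row. A minimal Python sketch of the same computation, assuming a 3-column tab-separated input like the telomere track seen later in this run:

```python
# Sketch of the column_maker step: append abs(c3 - c2) as a new column
# to a 3-column tab-separated interval file (name, start, end).

def add_length_column(lines):
    out = []
    for line in lines:
        c1, c2, c3 = line.rstrip("\n").split("\t")[:3]
        out.append(f"{c1}\t{c2}\t{c3}\t{abs(int(c3) - int(c2))}")
    return out

print(add_length_column(["scaffold_10.H1\t0\t11012"]))
```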
      • Step 25: Add Coverage Track:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "ef47f53c0b3011f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              name "coverage"
      • Step 26: Test if telomere track is empty:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "bedgraph"
              __workflow_invocation_uuid__ "ef47f53c0b3011f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              param_type "text"
              remove_newlines true
      • Step 27: False if telomere track is empty:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "ef47f53c0b3011f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              input_param_type {"__current_case__": 0, "input_param": "scaffold_10.H1\t0\t11012\t11012", "mappings": [{"__index__": 0, "from": "", "to": "false"}], "type": "text"}
              output_param_type "boolean"
              unmapped {"__current_case__": 2, "default_value": "true", "on_unmapped": "default"}
      • Step 28: Add telomere track:

        • step_state: new
      • Step 29: Unlabelled step:

        • step_state: new
      • Step 30: Add gaps track:

        • step_state: new
      • Step 4: Do you want to add suffixes to the scaffold names?:

        • step_state: scheduled
      • Step 31: Unlabelled step:

        • step_state: new
      • Step 32: Unlabelled step:

        • step_state: new
      • Step 5: First Haplotype suffix:

        • step_state: scheduled
      • Step 6: Second Haplotype suffix:

        • step_state: scheduled
      • Step 7: Hi-C reads:

        • step_state: scheduled
      • Step 8: Do you want to trim the Hi-C data?:

        • step_state: scheduled
      • Step 9: Telomere repeat to suit species:

        • step_state: scheduled
      • Step 10: PacBio reads:

        • step_state: scheduled
    • Other invocation details
      • error_message

        • Final state of invocation 426c98cdc83d7e58 is [failed]. Failed to run workflow, at least one job is in [paused] state.
      • history_id

        • 426c98cdc83d7e58
      • history_state

        • paused
      • invocation_id

        • 426c98cdc83d7e58
      • invocation_state

        • failed
      • messages

        • [{'dependent_workflow_step_id': None, 'hda_id': '27149de321e48428', 'reason': 'dataset_failed', 'workflow_step_id': 27}]
      • workflow_id

        • a7fef001aff9b2a8
  • ❌ hi-c-map-for-assembly-manual-curation.ga_1

    Execution Problem:

    • Final state of invocation fb7f479106dc95b8 is [failed]. Failed to run workflow, at least one job is in [paused] state.
      

    Workflow invocation details

    • Invocation Messages

      • Invocation scheduling failed because step 28 requires a dataset, but dataset entered a failed state.
    • Steps
      • Step 1: Haplotype 1:

        • step_state: scheduled
      • Step 2: Will you use a second haplotype?:

        • step_state: scheduled
      • Step 11: Hap2 not provided:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "d1e134d00b3111f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              input_param_type {"__current_case__": 3, "input_param": false, "mappings": [{"__index__": 0, "from": false, "to": "true"}, {"__index__": 1, "from": true, "to": "false"}], "type": "boolean"}
              output_param_type "boolean"
              unmapped {"__current_case__": 2, "default_value": "false", "on_unmapped": "default"}
      • Step 12: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Hap1:

            • step_state: scheduled
          • Step 2: Hap2:

            • step_state: scheduled
          • Step 11: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "d1e134d10b3111f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 46, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 38, "src": "hda"}]}}]}}
          • Step 12: toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/tp_cat/9.3+galaxy1:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "d1e134d10b3111f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  queries [{"__index__": 0, "inputs2": {"values": [{"id": 48, "src": "hda"}]}}]
          • Step 3: Do you want to add suffixes to the scaffold names?:

            • step_state: scheduled
          • Step 4: Hap1 suffix:

            • step_state: scheduled
          • Step 5: Hap2 suffix:

            • step_state: scheduled
          • Step 6: Expression for hap1 suffixing:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "d1e134d10b3111f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "&.", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "H1", "select_param_type": "text"}}]
                  dbkey "?"
          • Step 7: Expression for hap2 suffixing:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "d1e134d10b3111f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "&.", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "H2", "select_param_type": "text"}}]
                  dbkey "?"
          • Step 8: add hap1 suffix:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "d1e134d10b3111f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "find_pattern": ">.+$", "replace_pattern": null}]
          • Step 9: add hap2 suffix:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "d1e134d10b3111f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "find_pattern": ">.+$", "replace_pattern": null}]
          • Step 10: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is skipped

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "d1e134d10b3111f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 45, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 37, "src": "hda"}]}}]}}
      • Step 13: concatenate HiFi:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cat '/tmp/tmpkp3ixf17/files/8/9/c/dataset_89cb1f46-316a-4da1-af87-e7e37384ce6e.dat' >> '/tmp/tmpkp3ixf17/job_working_directory/000/48/outputs/dataset_2b0c450e-61ee-411c-ad19-acfd711002f5.dat' && exit 0

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "d1e134d00b3111f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              queries []
      • Step 14: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Hap1:

            • step_state: scheduled
          • Step 2: Do you want to add suffixes to the scaffold names?:

            • step_state: scheduled
          • Step 3: Hap1 suffix:

            • step_state: scheduled
          • Step 4: Expression for hap1 suffixing:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "d1e134d20b3111f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "&.", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "H1", "select_param_type": "text"}}]
                  dbkey "?"
          • Step 5: add hap1 suffix:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • sed -r --sandbox -e 's/>.+$/&.H1/g' '/tmp/tmpkp3ixf17/files/6/b/2/dataset_6b24f0a0-f773-463f-b1f7-3eb8a96cbed2.dat' > '/tmp/tmpkp3ixf17/job_working_directory/000/50/outputs/dataset_34cad202-e811-43a9-97f1-dd2a90a3fcd2.dat'

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "d1e134d20b3111f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "find_pattern": ">.+$", "replace_pattern": "&.H1"}]
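The `sed -r --sandbox -e 's/>.+$/&.H1/g'` job above appends `.H1` to every FASTA header (`&` is the whole match in sed). An equivalent sketch in Python, assuming headers are lines starting with `>`:

```python
import re

# Sketch of the "add hap1 suffix" step: append ".H1" to each FASTA
# header line, mirroring sed 's/>.+$/&.H1/g'.

def suffix_headers(fasta_text, suffix=".H1"):
    return re.sub(r">.+$", lambda m: m.group(0) + suffix,
                  fasta_text, flags=re.M)

print(suffix_headers(">scaffold_10\nACGT\n"))
```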
          • Step 6: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "d1e134d20b3111f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 52, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 37, "src": "hda"}]}}]}}
      • Step 15: Pick Assembly:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "d1e134d00b3111f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 49, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 53, "src": "hda"}]}}]}}
      • Step 16: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Reference:

            • step_state: scheduled
          • Step 2: Hi-C reads:

            • step_state: scheduled
          • Step 3: Do you want to trim the Hi-C data?:

            • step_state: scheduled
          • Step 4: Trim Hi-C reads 2:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • ln -f -s '/tmp/tmpkp3ixf17/files/9/3/c/dataset_93caf481-bec4-4f0b-b4e5-7e9878ef7c9a.dat' 'Hi-C reads_1.fq.gz' && ln -f -s '/tmp/tmpkp3ixf17/files/0/a/9/dataset_0a9a4408-75bb-4f60-9a68-96c37ba70ddc.dat' 'Hi-C reads_2.fq.gz' &&  cutadapt  -j=${GALAXY_SLOTS:-4}     --error-rate=0.1 --times=1 --overlap=3    --action=trim   --cut=5 -U 5       --minimum-length=1      -o 'out1.fq.gz' -p 'out2.fq.gz'  'Hi-C reads_1.fq.gz' 'Hi-C reads_2.fq.gz'  > report.txt

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "d1e134d30b3111f08a576045bdd71c8b"
                  adapter_options {"action": "trim", "error_rate": "0.1", "match_read_wildcards": false, "no_indels": false, "no_match_adapter_wildcards": true, "overlap": "3", "revcomp": false, "times": "1"}
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  filter_options {"discard_casava": false, "discard_trimmed": false, "discard_untrimmed": false, "max_average_error_rate": null, "max_expected_errors": null, "max_n": null, "maximum_length": null, "maximum_length2": null, "minimum_length": "1", "minimum_length2": null, "pair_filter": "any"}
                  library {"__current_case__": 2, "input_1": {"values": [{"id": 13, "src": "dce"}]}, "pair_adapters": false, "r1": {"adapters": [], "anywhere_adapters": [], "front_adapters": []}, "r2": {"adapters2": [], "anywhere_adapters2": [], "front_adapters2": []}, "type": "paired_collection"}
                  other_trimming_options {"cut": "5", "cut2": "5", "nextseq_trim": "0", "poly_a": false, "quality_cutoff": "0", "quality_cutoff2": "", "shorten_options": {"__current_case__": 1, "shorten_values": "False"}, "shorten_options_r2": {"__current_case__": 1, "shorten_values_r2": "False"}, "trim_n": false}
                  output_selector ["report"]
                  read_mod_options {"length_tag": null, "rename": null, "strip_suffix": null, "zero_cap": false}
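The cutadapt invocation above hard-clips 5 bases from the 5' end of both mates (`--cut=5` for read 1, `-U 5` for read 2). A minimal sketch of that clip on plain read strings (adapter matching and quality handling omitted):

```python
# Sketch of cutadapt's --cut=5 / -U 5 behaviour: drop the first 5 bases
# of read 1 and read 2 (hard clip only, no adapter trimming shown).

def hard_clip_pair(r1, r2, n=5):
    return r1[n:], r2[n:]

print(hard_clip_pair("AAAAACGTACGT", "TTTTTGCATGCA"))
```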
          • Step 5: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "d1e134d30b3111f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 19, "src": "dce"}]}}, {"__index__": 1, "value": {"values": [{"id": 14, "src": "dce"}]}}]}}
              • Job 2:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "d1e134d30b3111f08a576045bdd71c8b"
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 20, "src": "dce"}]}}, {"__index__": 1, "value": {"values": [{"id": 15, "src": "dce"}]}}]}}
          • Step 6: toolshed.g2.bx.psu.edu/repos/iuc/bwa_mem2/bwa_mem2/2.2.1+galaxy1:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is error

                Command Line:

                • set -o | grep -q pipefail && set -o pipefail;  ln -s '/tmp/tmpkp3ixf17/files/3/4/c/dataset_34cad202-e811-43a9-97f1-dd2a90a3fcd2.dat' 'localref.fasta' && bwa-mem2 index 'localref.fasta' &&    bwa-mem2 mem -t "${GALAXY_SLOTS:-1}" -v 1   -k '19' -w '100' -d '100' -r '1.5' -y '20' -c '500' -D '0.5' -W '0' -m '50' -S -P    -T '20' -h '5' -a                      'localref.fasta' '/tmp/tmpkp3ixf17/files/8/a/9/dataset_8a9cfcc2-5f71-4269-bdfa-fddf0b5c2075.dat' '/tmp/tmpkp3ixf17/files/3/a/c/dataset_3ac6184b-1061-411a-9979-bddcaaaef6a0.dat'  | samtools sort -@${GALAXY_SLOTS:-2} -T "${TMPDIR:-.}" -O bam -o '/tmp/tmpkp3ixf17/job_working_directory/000/56/outputs/dataset_e4b427c0-2f03-440b-9ed9-474a56ad0e6b.dat'

                Exit Code:

                • 127

                Standard Error:

                • Looking to launch executable "/usr/local/bin/bwa-mem2.avx2", simd = .avx2
                  Launching executable "/usr/local/bin/bwa-mem2.avx2"
                  [bwa_index] Pack FASTA... 0.64 sec
                  * Entering FMI_search
                  init ticks = 13841978616
                  ref seq len = 275747614
                  binary seq ticks = 5425065674
                  build suffix-array ticks = 91645903764
                  pos: 34468452, ref_seq_len__: 34468451
                  build fm-index ticks = 22116170997
                  Total time taken: 55.1316
                  /tmp/tmpkp3ixf17/job_working_directory/000/56/tool_script.sh: line 23: samtools: command not found
                  Looking to launch executable "/usr/local/bin/bwa-mem2.avx2", simd = .avx2
                  Launching executable "/usr/local/bin/bwa-mem2.avx2"
                  -----------------------------
                  Executing in AVX2 mode!!
                  -----------------------------
                  * SA compression enabled with xfactor: 8
                  * Ref file: localref.fasta
                  * Entering FMI_search
                  * Index file found. Loading index from localref.fasta.bwt.2bit.64
                  * Reference seq len for bi-index = 275747615
                  * sentinel-index: 224703773
                  * Count:
                  0,	1
                  1,	80527259
                  2,	137873808
                  3,	195220357
                  4,	275747615
                  
                  * Reading other elements of the index from files localref.fasta
                  * Index prefix: localref.fasta
                  * Read 0 ALT contigs
                  * Done reading Index!!
                  * Reading reference genome..
                  * Binary seq file = localref.fasta.0123
                  * Reference genome size: 275747614 bp
                  * Done reading reference genome !!
                  
                  ------------------------------------------
                  1. Memory pre-allocation for Chaining: 139.3584 MB
                  2. Memory pre-allocation for BSW: 239.6170 MB
                  3. Memory pre-allocation for BWT: 77.3142 MB
                  ------------------------------------------
                  * Threads used (compute): 1
                  * No. of pipeline threads: 2
                  
                  [0000] read_chunk: 10000000, work_chunk_size: 10000124, nseq: 68494
                  	[0000][ M::kt_pipeline] read 68494 sequences (10000124 bp)...
                  [0000] Reallocating initial memory allocations!!
                  [0000] Calling mem_process_seqs.., task: 0
                  [0000] 1. Calling kt_for - worker_bwt
                  [0000] read_chunk: 10000000, work_chunk_size: 10000124, nseq: 68494
                  	[0000][ M::kt_pipeline] read 68494 sequences (10000124 bp)...
                  [0000] 2. Calling kt_for - worker_aln
                  [0000] Inferring insert size distribution of PE reads from data, l_pac: 137873807, n: 68494
                  [0000][PE] analyzing insert size distribution for orientation FF...
                  [0000][PE] (25, 50, 75) percentile: (2021, 3042, 5240)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 11678)
                  [0000][PE] mean and std.dev: (3782.74, 2698.62)
                  [0000][PE] low and high boundaries for proper pairs: (1, 14897)
                  [0000][PE] analyzing insert size distribution for orientation FR...
                  [0000][PE] (25, 50, 75) percentile: (151, 230, 368)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 802)
                  [0000][PE] mean and std.dev: (244.21, 150.93)
                  [0000][PE] low and high boundaries for proper pairs: (1, 1019)
                  [0000][PE] analyzing insert size distribution for orientation RF...
                  [0000][PE] (25, 50, 75) percentile: (2708, 3331, 6114)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 12926)
                  [0000][PE] mean and std.dev: (4077.29, 2787.30)
                  [0000][PE] low and high boundaries for proper pairs: (1, 16332)
                  [0000][PE] analyzing insert size distribution for orientation RR...
                  [0000][PE] (25, 50, 75) percentile: (1608, 2936, 5772)
                  [0000][PE] low and high boundaries for computing mean and std.dev: (1, 14100)
                  [0000][PE] mean and std.dev: (3700.00, 2741.58)
                  [0000][PE] low and high boundaries for proper pairs: (1, 18264)
                  [0000] 3. Calling kt_for - worker_sam
                  	[0000][ M::mem_process_seqs] Processed 68494 reads in 82.758 CPU sec, 83.273 real sec
                  

                Standard Output:

                • ref_seq_len = 275747614
                  count = 0, 80527258, 137873807, 195220356, 275747614
                  BWT[224703773] = 4
                  CP_SHIFT = 6, CP_MASK = 63
                  sizeof CP_OCC = 64
                  max_occ_ind = 4308556
                  

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "d1e134d30b3111f08a576045bdd71c8b"
                  analysis_type {"__current_case__": 4, "algorithmic_options": {"D": "0.5", "P": true, "S": true, "W": "0", "__current_case__": 0, "algorithmic_options_selector": "set", "c": "500", "d": "100", "e": false, "k": "19", "m": "50", "r": "1.5", "w": "100", "y": "20"}, "analysis_type_selector": "full", "io_options": {"C": false, "K": null, "M": false, "T": "20", "V": false, "Y": false, "__current_case__": 0, "a": true, "five": false, "h": "5", "io_options_selector": "set", "q": false}, "scoring_options": {"__current_case__": 1, "scoring_options_selector": "do_not_set"}}
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  fastq_input {"__current_case__": 2, "fastq_input1": {"values": [{"id": 21, "src": "dce"}]}, "fastq_input_selector": "paired_collection", "iset_stats": null}
                  output_sort "coordinate"
                  reference_source {"__current_case__": 1, "ref_file": {"values": [{"id": 54, "src": "hda"}]}, "reference_source_selector": "history"}
                  rg {"__current_case__": 3, "rg_selector": "do_not_set"}
          • Step 7: toolshed.g2.bx.psu.edu/repos/iuc/samtools_merge/samtools_merge/1.20+galaxy2:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is paused

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "d1e134d30b3111f08a576045bdd71c8b"
                  bed_file None
                  chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  headerbam None
                  idpg false
                  idrg false
                  region None
                  seed "1"
      • Step 17: toolshed.g2.bx.psu.edu/repos/bgruening/gfastats/gfastats/1.3.9+galaxy2:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • gfastats '/tmp/tmpkp3ixf17/files/3/4/c/dataset_34cad202-e811-43a9-97f1-dd2a90a3fcd2.dat' --out-coord g   --tabular > '/tmp/tmpkp3ixf17/job_working_directory/000/58/outputs/dataset_aa60f11b-fb46-4cc2-8d6e-a73922745bff.dat' --threads ${GALAXY_SLOTS:-8}

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "d1e134d00b3111f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              mode_condition {"__current_case__": 1, "discover_paths": false, "locale": false, "selector": "statistics", "statistics_condition": {"__current_case__": 1, "out_coord": "g", "selector": "coordinates"}, "tabular": true}
              target_condition {"__current_case__": 0, "target_option": "false"}
      • Step 18: toolshed.g2.bx.psu.edu/repos/iuc/seqtk/seqtk_telo/1.4+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • seqtk telo -m 'CCCTAA' -p '1' -d '2000' -s '300'  '/tmp/tmpkp3ixf17/files/3/4/c/dataset_34cad202-e811-43a9-97f1-dd2a90a3fcd2.dat' > '/tmp/tmpkp3ixf17/job_working_directory/000/59/outputs/dataset_b7b75e32-b416-4512-9b7e-4ec47b4eaad9.dat'

            Exit Code:

            • 0

            Standard Error:

            • 0	137873807
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              P false
              __input_ext "input"
              __workflow_invocation_uuid__ "d1e134d00b3111f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              d "2000"
              dbkey "?"
              m "CCCTAA"
              p "1"
              s "300"
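Step 18 runs `seqtk telo -m CCCTAA` to locate telomeric repeats in the assembly. As a rough illustration of the motif being searched for — seqtk itself uses a penalty-scored sliding window over the real assembly, and the sequence below is made up — a plain `grep` can tally non-overlapping motif copies:

```shell
# Count non-overlapping copies of the telomeric repeat in a toy sequence.
# (seqtk telo actually scans with a penalty-scored sliding window; this
# only illustrates the CCCTAA motif passed via -m.)
seq='CCCTAACCCTAACCCTAAGGGATT'   # made-up sequence
printf '%s\n' "$seq" | grep -o 'CCCTAA' | wc -l | tr -d ' '   # prints 3
```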
      • Step 19: toolshed.g2.bx.psu.edu/repos/iuc/minimap2/minimap2/2.28+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • ln -f -s '/tmp/tmpkp3ixf17/files/3/4/c/dataset_34cad202-e811-43a9-97f1-dd2a90a3fcd2.dat' reference.fa && minimap2 -x map-hifi    --q-occ-frac 0.01       -t ${GALAXY_SLOTS:-4} reference.fa '/tmp/tmpkp3ixf17/files/2/b/0/dataset_2b0c450e-61ee-411c-ad19-acfd711002f5.dat' -a | samtools view --no-PG -hT reference.fa | samtools sort -@${GALAXY_SLOTS:-2} -T "${TMPDIR:-.}" -O BAM -o '/tmp/tmpkp3ixf17/job_working_directory/000/60/outputs/dataset_87fd14b6-cd94-4287-88bd-0708e477e07f.dat'

            Exit Code:

            • 0

            Standard Error:

            • [M::mm_idx_gen::4.540*0.77] collected minimizers
              [M::mm_idx_gen::5.865*0.80] sorted minimizers
              [M::main::5.865*0.80] loaded/built the index for 1 target sequence(s)
              [M::mm_mapopt_update::5.997*0.81] mid_occ = 113
              [M::mm_idx_stat] kmer size: 19; skip: 19; is_hpc: 0; #seq: 1
              [M::mm_idx_stat::6.095*0.81] distinct minimizers: 11276538 (95.62% are singletons); average occurrences: 1.229; average spacing: 9.951; total length: 137873807
              [M::worker_pipeline::144.110*0.98] mapped 5426 sequences
              [M::main] Version: 2.28-r1209
              [M::main] CMD: minimap2 -x map-hifi --q-occ-frac 0.01 -t 1 -a reference.fa /tmp/tmpkp3ixf17/files/2/b/0/dataset_2b0c450e-61ee-411c-ad19-acfd711002f5.dat
              [M::main] Real time: 144.186 sec; CPU: 141.192 sec; Peak RSS: 1.257 GB
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "d1e134d00b3111f08a576045bdd71c8b"
              alignment_options {"A": null, "B": null, "E": null, "E2": null, "O": null, "O2": null, "no_end_flt": true, "s": null, "splicing": {"__current_case__": 0, "splice_mode": "preset"}, "z": null, "z2": null}
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              fastq_input {"__current_case__": 0, "analysis_type_selector": "map-hifi", "fastq_input1": {"values": [{"id": 50, "src": "hda"}]}, "fastq_input_selector": "single"}
              indexing_options {"H": false, "I": null, "k": null, "w": null}
              io_options {"K": null, "L": false, "Q": false, "Y": false, "c": false, "cs": null, "eqx": false, "output_format": "BAM"}
              mapping_options {"F": null, "N": null, "X": false, "f": null, "g": null, "kmer_ocurrence_interval": {"__current_case__": 1, "interval": ""}, "m": null, "mask_len": null, "max_chain_iter": null, "max_chain_skip": null, "min_occ_floor": null, "n": null, "p": null, "q_occ_frac": "0.01", "r": null}
              reference_source {"__current_case__": 1, "ref_file": {"values": [{"id": 54, "src": "hda"}]}, "reference_source_selector": "history"}
      • Step 20: toolshed.g2.bx.psu.edu/repos/iuc/pretext_map/pretext_map/0.1.9+galaxy1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "d1e134d00b3111f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              filter {"__current_case__": 0, "filter_type": ""}
              map_qual "0"
              sorting {"__current_case__": 1, "sortby": "length", "sortorder": "descend"}
      • Step 3: Haplotype 2:

        • step_state: scheduled
      • Step 21: toolshed.g2.bx.psu.edu/repos/devteam/column_maker/Add_a_column1/2.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/column_maker/aff5135563c6/column_maker/column_maker.py' --column-types str,int,int  --file '/tmp/tmpkp3ixf17/job_working_directory/000/62/configs/tmpsydjm92o' --fail-on-non-existent-columns --fail-on-non-computable '/tmp/tmpkp3ixf17/files/a/a/6/dataset_aa60f11b-fb46-4cc2-8d6e-a73922745bff.dat' '/tmp/tmpkp3ixf17/job_working_directory/000/62/outputs/dataset_d9eb1354-e3d2-4ba9-8e82-49879e1c0900.dat'

            Exit Code:

            • 0

            Standard Output:

            • abs(int(c3)-int(c2))
              Computing 1 new columns with instructions ['abs(int(c3)-int(c2));;']
              Computed new column values for 100.00% of 8 lines written.
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "d1e134d00b3111f08a576045bdd71c8b"
              avoid_scientific_notation false
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              error_handling {"auto_col_types": true, "fail_on_non_existent_columns": true, "non_computable": {"__current_case__": 0, "action": "--fail-on-non-computable"}}
              ops {"__current_case__": 0, "expressions": [{"__index__": 0, "add_column": {"__current_case__": 0, "mode": "", "pos": ""}, "cond": "abs(int(c3)-int(c2))"}], "header_lines_select": "no"}
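The Add_a_column expression in step 21, `abs(int(c3)-int(c2))`, appends the absolute difference of columns 3 and 2 (an interval length from the gfastats coordinates) to each row. The same transform as a one-liner, with illustrative input rows:

```shell
# Equivalent of the Add_a_column expression abs(int(c3)-int(c2)):
# append |c3 - c2| as a new tab-separated column on every row.
printf 'chr1\t100\t250\nchr2\t500\t380\n' |
awk 'BEGIN{OFS="\t"} {d = $3 - $2; if (d < 0) d = -d; print $0, d}'
# chr1  100  250  150
# chr2  500  380  120
```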
      • Step 22: Cut1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • perl '/tmp/tmpkp3ixf17/galaxy-dev/tools/filters/cutWrapper.pl' '/tmp/tmpkp3ixf17/files/b/7/b/dataset_b7b75e32-b416-4512-9b7e-4ec47b4eaad9.dat' 'c1,c2,c3' T '/tmp/tmpkp3ixf17/job_working_directory/000/63/outputs/dataset_dbe7ecfd-e155-44f9-b183-7bbd58d9f890.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "bed"
              __workflow_invocation_uuid__ "d1e134d00b3111f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              columnList "c1,c2,c3"
              dbkey "?"
              delimiter "T"
      • Step 23: toolshed.g2.bx.psu.edu/repos/bgruening/deeptools_bam_coverage/deeptools_bam_coverage/3.5.4+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • ln -s '/tmp/tmpkp3ixf17/files/8/7/f/dataset_87fd14b6-cd94-4287-88bd-0708e477e07f.dat' one.bam && ln -s '/tmp/tmpkp3ixf17/files/_metadata_files/b/0/f/metadata_b0fe352b-dbb4-423c-ae33-2def3e812c05.dat' one.bam.bai &&  bamCoverage --numberOfProcessors "${GALAXY_SLOTS:-4}"  --bam one.bam --outFileName '/tmp/tmpkp3ixf17/job_working_directory/000/64/outputs/dataset_de145cf4-f094-446e-8d7a-b18cf108f9d1.dat' --outFileFormat 'bigwig'  --binSize 100

            Exit Code:

            • 0

            Standard Error:

            • bamFilesList: ['one.bam']
              binLength: 100
              numberOfSamples: None
              blackListFileName: None
              skipZeroOverZero: False
              bed_and_bin: False
              genomeChunkSize: None
              defaultFragmentLength: read length
              numberOfProcessors: 1
              verbose: False
              region: None
              bedFile: None
              minMappingQuality: None
              ignoreDuplicates: False
              chrsToSkip: []
              stepSize: 100
              center_read: False
              samFlag_include: None
              samFlag_exclude: None
              minFragmentLength: 0
              maxFragmentLength: 0
              zerosToNans: False
              smoothLength: None
              save_data: False
              out_file_for_raw_data: None
              maxPairedFragmentLength: 1000
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "d1e134d00b3111f08a576045bdd71c8b"
              advancedOpt {"__current_case__": 0, "showAdvancedOpt": "no"}
              binSize "100"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              exactScaling false
              outFileFormat "bigwig"
              region ""
              scaling {"__current_case__": 3, "type": "no"}
      • Step 24: toolshed.g2.bx.psu.edu/repos/devteam/column_maker/Add_a_column1/2.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/column_maker/aff5135563c6/column_maker/column_maker.py' --column-types  --file '/tmp/tmpkp3ixf17/job_working_directory/000/65/configs/tmp2s7jutw3' --fail-on-non-existent-columns --fail-on-non-computable '/tmp/tmpkp3ixf17/files/d/b/e/dataset_dbe7ecfd-e155-44f9-b183-7bbd58d9f890.dat' '/tmp/tmpkp3ixf17/job_working_directory/000/65/outputs/dataset_0ab8c90b-81c0-48de-adfe-dc4c553a8a39.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "d1e134d00b3111f08a576045bdd71c8b"
              avoid_scientific_notation false
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              error_handling {"auto_col_types": true, "fail_on_non_existent_columns": true, "non_computable": {"__current_case__": 0, "action": "--fail-on-non-computable"}}
              ops {"__current_case__": 0, "expressions": [{"__index__": 0, "add_column": {"__current_case__": 0, "mode": "", "pos": ""}, "cond": "abs(int(c3)-int(c2))"}], "header_lines_select": "no"}
      • Step 25: Add Coverage Track:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "d1e134d00b3111f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              name "coverage"
      • Step 26: Test if telomere track is empty:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "bedgraph"
              __workflow_invocation_uuid__ "d1e134d00b3111f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              param_type "text"
              remove_newlines true
      • Step 27: False if telomere track is empty:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "d1e134d00b3111f08a576045bdd71c8b"
              chromInfo "/tmp/tmpkp3ixf17/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              input_param_type {"__current_case__": 0, "input_param": "", "mappings": [{"__index__": 0, "from": "", "to": "false"}], "type": "text"}
              output_param_type "boolean"
              unmapped {"__current_case__": 2, "default_value": "true", "on_unmapped": "default"}
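Steps 26 and 27 together implement "skip the telomere track if the bedgraph is empty": step 26 reads the track as text, and step 27 maps an empty string to `false` and anything else to `true` (the `unmapped` default). Reduced to plain shell, the check is just a file-size test; the file name and contents below are hypothetical:

```shell
# Emptiness check behind steps 26-27: emit "true" if the (hypothetical)
# telomere bedgraph has any content, "false" otherwise.
track=$(mktemp)
printf 'scaffold_1\t0\t300\n' > "$track"      # pretend telomere hit
if [ -s "$track" ]; then echo true; else echo false; fi   # prints true
rm -f "$track"
```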
      • Step 28: Add telomere track:

        • step_state: new
      • Step 29: Unlabelled step:

        • step_state: new
      • Step 30: Add gaps track:

        • step_state: new
      • Step 4: Do you want to add suffixes to the scaffold names?:

        • step_state: scheduled
      • Step 31: Unlabelled step:

        • step_state: new
      • Step 32: Unlabelled step:

        • step_state: new
      • Step 5: First Haplotype suffix:

        • step_state: scheduled
      • Step 6: Second Haplotype suffix:

        • step_state: scheduled
      • Step 7: Hi-C reads:

        • step_state: scheduled
      • Step 8: Do you want to trim the Hi-C data?:

        • step_state: scheduled
      • Step 9: Telomere repeat to suit species:

        • step_state: scheduled
      • Step 10: PacBio reads:

        • step_state: scheduled
    • Other invocation details
      • error_message

        • Final state of invocation fb7f479106dc95b8 is [failed]. Failed to run workflow, at least one job is in [paused] state.
      • history_id

        • 938e59a159cb21e6
      • history_state

        • paused
      • invocation_id

        • fb7f479106dc95b8
      • invocation_state

        • failed
      • messages

        • [{'dependent_workflow_step_id': None, 'hda_id': '14027bef5e10794d', 'reason': 'dataset_failed', 'workflow_step_id': 27}]
      • workflow_id

        • a7fef001aff9b2a8

@mvdbeek mvdbeek merged commit f7a004d into galaxyproject:main Mar 28, 2025
7 checks passed