
---

.pull-left[
<p align="center">
<img src="../card-nipype.svg" width="100%" />
</p>
<br />

```python
from nipype.interfaces.fsl import BET
brain_extract = BET(
    in_file="/data/coolproject/sub-01/ses-01/anat/sub-01_ses-01_T1w.nii",
    out_file="/out/sub-01/ses-01/anat/sub-01_ses-01_desc-brain_T1w.nii"
)
brain_extract.run()
```
]

.pull-right[
<p align="center">
<img src="https://nipype.readthedocs.io/en/latest/_images/nipype_architecture_overview2.png" width="65%" />
</p>
]

---

# WDL Example - Translation into Nipype

```python
import nipype.interfaces.fsl as fsl
import nipype.pipeline.engine as pe

input_image = 'input.nii.gz'
output_dir = 'results'

preprocess = pe.Node(fsl.BET(), name='preprocess')
preprocess.inputs.in_file = input_image
preprocess.inputs.out_file = f'{output_dir}/brain.nii.gz'

stats = pe.Node(fsl.ImageStats(), name='stats')
stats.inputs.op_string = '-m -p 50'

workflow = pe.Workflow(name='my_workflow')
# The connection feeds BET's out_file into ImageStats' in_file,
# so in_file must not also be set manually on the stats node.
workflow.connect([(preprocess, stats, [('out_file', 'in_file')])])
workflow.base_dir = 'working_dir'
workflow.run()
```
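For context on what `workflow.connect` buys you: it declares a directed acyclic graph that the engine then runs in dependency order. A nipype-free toy sketch of that execution model (the `run_workflow` helper and the toy nodes are hypothetical, not Nipype's actual engine):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

def run_workflow(nodes, connections):
    """Execute callables in dependency order, passing outputs along edges.

    nodes: dict of name -> callable(**inputs) returning a dict of outputs
    connections: list of (src, dst, [(src_output, dst_input), ...]),
        mirroring the shape Nipype's workflow.connect() accepts.
    """
    predecessors = {name: set() for name in nodes}
    for src, dst, _ in connections:
        predecessors[dst].add(src)
    results = {}
    inputs = {name: {} for name in nodes}
    for name in TopologicalSorter(predecessors).static_order():
        results[name] = nodes[name](**inputs[name])
        for src, dst, links in connections:
            if src == name:
                for out_name, in_name in links:
                    inputs[dst][in_name] = results[src][out_name]
    return results

# Toy stand-ins for the BET and ImageStats nodes above:
nodes = {
    'preprocess': lambda: {'out_file': 'brain.nii.gz'},
    'stats': lambda in_file: {'mean': f'mean({in_file})'},
}
results = run_workflow(nodes, [('preprocess', 'stats', [('out_file', 'in_file')])])
print(results['stats']['mean'])
```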


---

# Facets to standardize

.boxed-content[
.distribute.large[
* **Outer interface** (inputs and outputs). Also consider an internal data structure format for high throughput.
* **Implementation** (WDL vs. programmed): code styling, best practices, BIDS-Apps.
* **Modularize** (see *NiPreps* and *TemplateFlow*).
* Use **containers** to ensure reliable software delivery and reproducibility.
* Implement testing and **continuous integration** to catch errors early and streamline development.
* **Version** the workflow and its components to ensure compatibility and track changes; issue **LTS** releases.
* Promote **community** and social standardization through collaboration, documentation, telemetry, and open-source practices.
* **Visual reporting**.
]]
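What BIDS-Apps standardize first is exactly that outer interface: every app accepts the same three positional arguments (`bids_dir`, `output_dir`, `analysis_level`). A minimal sketch with `argparse`; the optional flag shown is illustrative of common practice rather than mandated here:

```python
import argparse

def build_parser():
    """Command-line interface shared by BIDS-Apps."""
    parser = argparse.ArgumentParser(description='Example BIDS-App')
    parser.add_argument('bids_dir', help='root of the input BIDS dataset')
    parser.add_argument('output_dir', help='where derivatives are written')
    parser.add_argument('analysis_level', choices=['participant', 'group'],
                        help='participant = one subject at a time; group = aggregate')
    parser.add_argument('--participant-label', nargs='+', default=None,
                        help='subset of subjects to process (without the "sub-" prefix)')
    return parser

args = build_parser().parse_args(
    ['/data/ds000001', '/out', 'participant', '--participant-label', '01', '02']
)
print(args.analysis_level, args.participant_label)
```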

---

# BIDS-Apps: subject-wise parallelization

<p align="center">
<a href="https://doi.org/10.1371/journal.pcbi.1005209">
<img src="../journal.pcbi.1005209.g002.png" width="90%" /><br /><br />
(Gorgolewski et al., 2017)
</a>
</p>
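Because the participant level treats subjects as independent, runs can be fanned out one per subject. A minimal sketch of that pattern, assuming a containerized app; the `example/bids-app:latest` image name and the mount points are hypothetical:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def participant_cmd(label, bids_dir='/data/ds000001', out_dir='/out',
                    image='example/bids-app:latest'):
    """Build one containerized participant-level invocation (image name is hypothetical)."""
    return ['docker', 'run', '--rm',
            '-v', f'{bids_dir}:/bids:ro', '-v', f'{out_dir}:/out',
            image, '/bids', '/out', 'participant',
            '--participant-label', label]

def run_participants(labels, max_workers=4):
    """Each subject is independent, so dispatch the runs concurrently."""
    commands = [participant_cmd(label) for label in labels]
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(subprocess.run, commands))

commands = [participant_cmd(label) for label in ['01', '02', '03']]
print(commands[0])
```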

---

## The individual report

<p align="center">
<video controls="controls" width="70%"
       src="../fmriprep-report.mov"></video>
</p>

---

# Why standardize?

.boxed-content[
.distribute.large[
* Pushing the *truck factor* above 1.0.
* Engaging users.
]]

---

# "Analysis-grade" data

.larger[
The *NeuroImaging PREProcessing toolS* (*[NiPreps.org](https://nipreps.org)*) augment scanners to produce *analysis-grade* data (= **directly consumable by analyses**).
]

<br />
.pull-left[

***Analysis-grade* data** is an analogy to the concept of "*sushi-grade (or [sashimi-grade](https://en.wikipedia.org/wiki/Sashimi)) fish*", in that both are:

.large[**minimally preprocessed**,]

and

.large[**safe to consume** directly.]
]

.pull-right[
<img align="right" style='margin-right: 50px' src="https://1.bp.blogspot.com/-Osh4H4WXka0/WlMJmVgkZTI/AAAAAAAAEMY/GynUzSomJ-EBiyqv2m-maiOyKSM7SOmNACLcBGAs/s400/yellowfin%2Btuna%2Bsteaks%2Bnutrition.jpg" />
]

---

<p align="center">
<img src="../nipreps-chart.png" width="63%" /><br />
<em>NiPreps</em> (<a href="https://doi.org/10.31219/osf.io/ujxp6">Esteban et al., 2020</a>)
</p>

---

template: title
layout: false
.middle[
<p align="center">
<img src="https://github.com/oesteban/fmriprep/raw/f4c7a9804be26c912b24ef4dccba54bdd72fa1fd/docs/_static/fmriprep-21.0.0.svg" width="95%" />
</p>
]

---

# *TemplateFlow*

<p align="center">
<img src="https://www.templateflow.org/assets/templateflow_fig-birdsview.png" width="60%" /><br />
(<a href="https://doi.org/10.1038/s41592-022-01681-2">Ciric et al., 2022</a>)
</p>

---