Commit 842bf9a

Edit data import instructions
- Add new images
1 parent 4cde469 commit 842bf9a

File tree

4 files changed: +30 additions, −20 deletions
[3 new image files: 279 KB, 378 KB, 330 KB]

topics/microbiome/tutorials/metagenomics-assembly/tutorial.md

Lines changed: 30 additions & 20 deletions
@@ -5,10 +5,10 @@ title: "Assembly of metagenomic sequencing data"
 zenodo_link: "https://zenodo.org/record/7818827"
 level: Introductory
 questions:
-- "Why metagenomic data should be assembled?"
+- "Why should metagenomic data be assembled?"
 - "What is the difference between co-assembly and individual assembly?"
 - "What is the difference between reads, contigs and scaffolds?"
-- "How tools based on De Bruijn graph work?"
+- "How tools based on de Bruijn graph work?"
 - "How to assess the quality of metagenomic data assembly?"
 objectives:
 - "Describe what an assembly is"
@@ -23,7 +23,7 @@ objectives:
 time_estimation: "2H"
 key_points:
 - "Assembly groups reads into contigs and scafolds."
-- "De Brujin Graphs use k-mers to assembly reads"
+- "de Brujin Graphs use k-mers to assembly reads"
 - "MetaSPAdes and MEGAHIT are assemblers"
 - "Quast is the tool to assess the assembly quality"
 edam_ontology:
edam_ontology:
@@ -33,6 +33,7 @@ contributions:
 authorship:
 - plushz
 - bebatut
+- vinisalazar
 funding:
 - gallantries
 subtopic: metagenomics
@@ -124,30 +125,39 @@ In case of a not very large dataset it's more convenient to upload data directly
 
 > <hands-on-title>Upload data into Galaxy</hands-on-title>
 >
-> 2. Import the sequence read raw data (\*.fastqsanger.gz) from [Zenodo]({{ page.zenodo_link }}) or a data library:
+> 2. The information we need to import the samples for this tutorial (Sample ID, and link to the FASTQ file (URL) are in the grey box below.
 >
 > ```text
-> {{ page.zenodo_link }}/files/ERR2231567_1.fastqsanger.gz
-> {{ page.zenodo_link }}/files/ERR2231567_2.fastqsanger.gz
-> {{ page.zenodo_link }}/files/ERR2231568_1.fastqsanger.gz
-> {{ page.zenodo_link }}/files/ERR2231568_2.fastqsanger.gz
-> {{ page.zenodo_link }}/files/ERR2231569_1.fastqsanger.gz
-> {{ page.zenodo_link }}/files/ERR2231569_2.fastqsanger.gz
-> {{ page.zenodo_link }}/files/ERR2231570_1.fastqsanger.gz
-> {{ page.zenodo_link }}/files/ERR2231570_2.fastqsanger.gz
-> {{ page.zenodo_link }}/files/ERR2231571_1.fastqsanger.gz
-> {{ page.zenodo_link }}/files/ERR2231571_2.fastqsanger.gz
-> {{ page.zenodo_link }}/files/ERR2231572_1.fastqsanger.gz
-> {{ page.zenodo_link }}/files/ERR2231572_2.fastqsanger.gz
+> SampleID URL
+> ERR2231567_1 {{ page.zenodo_link }}/files/ERR2231567_1.fastqsanger.gz
+> ERR2231567_2 {{ page.zenodo_link }}/files/ERR2231567_2.fastqsanger.gz
+> ERR2231568_1 {{ page.zenodo_link }}/files/ERR2231568_1.fastqsanger.gz
+> ERR2231568_2 {{ page.zenodo_link }}/files/ERR2231568_2.fastqsanger.gz
+> ERR2231569_1 {{ page.zenodo_link }}/files/ERR2231569_1.fastqsanger.gz
+> ERR2231569_2 {{ page.zenodo_link }}/files/ERR2231569_2.fastqsanger.gz
+> ERR2231570_1 {{ page.zenodo_link }}/files/ERR2231570_1.fastqsanger.gz
+> ERR2231570_2 {{ page.zenodo_link }}/files/ERR2231570_2.fastqsanger.gz
+> ERR2231571_1 {{ page.zenodo_link }}/files/ERR2231571_1.fastqsanger.gz
+> ERR2231571_2 {{ page.zenodo_link }}/files/ERR2231571_2.fastqsanger.gz
+> ERR2231572_1 {{ page.zenodo_link }}/files/ERR2231572_1.fastqsanger.gz
+> ERR2231572_2 {{ page.zenodo_link }}/files/ERR2231572_2.fastqsanger.gz
 > ```
 >
 > {% snippet faqs/galaxy/datasets_import_via_link.md %}
 >
-> {% snippet faqs/galaxy/datasets_import_from_data_library.md %}
+> Import the files from Zenodo using Galaxy's Rule-based Uploader.
+> - Open the Galaxy Upload Manager
+> - Click the tab **Rule-based**
+> - Paste the table.
+> - Remove the first line.
+> - Click **Build**
+> ![Rule-based Uploader](./images/data-import-1.png "Rule-based Uploader")
+> - On the "Rules" pane on the left, click on "Click here" to set column definitions.
+> ![Column definitions](./images/data-import-2.png "Column definitions")
+> - Set the first column as "Name" and the second column as "URL".
+> ![Upload the data](./images/data-import-3.png "Upload the data")
+> - Click "Apply" and then "Upload".
 >
-> > <comment-title></comment-title>
-> > In case of large dataset, we can use FTP server or the [Galaxy Rule-based Uploader]({% link topics/galaxy-interface/tutorials/upload-rules/tutorial.md %}).
-> {: .comment}
 >
 > 3. Create a paired collection named `Raw reads`, rename your pairs with the sample name
 >
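The SampleID/URL table that this commit adds to the grey box follows a regular pattern (six ERR accessions, two mates each, all hosted under the tutorial's `zenodo_link` record). As a side note for editors checking the table, it can be regenerated programmatically; the following is a minimal sketch, assuming the resolved Zenodo URL `https://zenodo.org/record/7818827` in place of the `{{ page.zenodo_link }}` template variable:

```python
# Rebuild the SampleID / URL table pasted into Galaxy's Rule-based Uploader.
# Assumption: zenodo_link resolves to the record URL below, and files follow
# the paired-end naming <accession>_1 / <accession>_2 (.fastqsanger.gz).
ZENODO = "https://zenodo.org/record/7818827"

accessions = [f"ERR22315{n}" for n in range(67, 73)]  # ERR2231567..ERR2231572

rows = [
    (f"{acc}_{mate}", f"{ZENODO}/files/{acc}_{mate}.fastqsanger.gz")
    for acc in accessions
    for mate in (1, 2)
]

# One tab-separated row per file; a "SampleID<TAB>URL" header line would be
# prepended before pasting, then removed in the uploader ("Remove the first line").
table = "\n".join(f"{sample}\t{url}" for sample, url in rows)
print(table)
```

Comparing this output against the grey box is a quick way to catch a mistyped accession or mate suffix.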
