
Commit c9b4acf

Proofreading and duplication
1 parent ac84976 commit c9b4acf

File tree

15 files changed: +134 −46 lines changed


pages/platform/ai/notebook_tuto_06_marine_mammal_sounds_classification/guide.de-de.md

Lines changed: 15 additions & 4 deletions

@@ -6,10 +6,10 @@ section: AI Notebooks - Tutorials
 order: 06
 routes:
 canonical: 'https://docs.ovh.com/gb/en/publiccloud/ai/notebooks/tuto-marine-mammal-sounds-classification/'
-updated: 2022-09-01
+updated: 2023-03-31
 ---
 
-**Last updated 1st September, 2022.**
+**Last updated 31st March, 2023.**
 
 ## Objective
 
@@ -59,10 +59,11 @@ ovhai data upload <region> <container> <paths>
 
 You need to attach a volume if your data is in your OVHcloud Object Storage and you want to use it during your experiment. For more information on data, volumes and permissions, see [our guide on data](https://docs.ovh.com/de/publiccloud/ai/cli/access-object-storage-data/).
 
-To be able to use the source code below in this article you have to create 2 Object Storage containers mounted as follows:
+To be able to use the source code below in this article you have to create 2 Object Storage containers and a git repository mounted as follows:
 
 - mount point name: `/workspace/data`, permissions: `read & write`
 - mount point name: `/workspace/saved_model`, permissions: `read & write`
+- mount point name: `/workspace/ai-training-examples`, permissions: `read & write`, Git URL: `https://github.com/ovh/ai-training-examples.git`
 
 `Choose the same region as your object container` > `"One image to rule them all" framework` > `Attach Object Storage containers (the one that contains your dataset)`
 
@@ -71,10 +72,20 @@ If you want to launch it with the CLI, choose the [volume](https://docs.ovh.com/
 ```bash
 ovhai notebook run one-for-all jupyterlab \
 	--name <notebook-name> \
-	--gpu <nb-gpus>
+	--gpu <nb-gpus> \
 	--volume <container@region/prefix:mount_path:permission>
 ```
 
+For example:
+
+```bash
+ovhai notebook run one-for-all jupyterlab \
+	--name marine-mammal-sounds-classification \
+	--gpu 1 \
+	--volume marine-mammal-sounds@GRA/:/workspace/data:RW:cache \
+	--volume marine-mammal-model@GRA/:/workspace/saved_model:RW:cache \
+	--volume https://github.com/ovh/ai-training-examples.git:/workspace/ai-training-examples:RW
+```
+
 You can then reach your notebook’s URL once the notebook is running.
 
 Find the notebook by following this path: `ai-training-examples` > `notebooks` > `audio` > `audio-classification` > `notebook-marine-sound-classification.ipynb`.

pages/platform/ai/notebook_tuto_06_marine_mammal_sounds_classification/guide.en-asia.md

Lines changed: 2 additions & 2 deletions

@@ -4,10 +4,10 @@ slug: notebooks/tuto-marine-mammal-sounds-classification
 excerpt: How to classify sounds with AI
 section: AI Notebooks - Tutorials
 order: 06
-updated: 2022-09-01
+updated: 2023-03-31
 ---
 
-**Last updated 1st September, 2022.**
+**Last updated 31st March, 2023.**
 
 ## Objective

pages/platform/ai/notebook_tuto_06_marine_mammal_sounds_classification/guide.en-au.md

Lines changed: 2 additions & 2 deletions

@@ -4,10 +4,10 @@ slug: notebooks/tuto-marine-mammal-sounds-classification
 excerpt: How to classify sounds with AI
 section: AI Notebooks - Tutorials
 order: 06
-updated: 2022-09-01
+updated: 2023-03-31
 ---
 
-**Last updated 1st September, 2022.**
+**Last updated 31st March, 2023.**
 
 ## Objective

pages/platform/ai/notebook_tuto_06_marine_mammal_sounds_classification/guide.en-ca.md

Lines changed: 2 additions & 2 deletions

@@ -4,10 +4,10 @@ slug: notebooks/tuto-marine-mammal-sounds-classification
 excerpt: How to classify sounds with AI
 section: AI Notebooks - Tutorials
 order: 06
-updated: 2022-09-01
+updated: 2023-03-31
 ---
 
-**Last updated 1st September, 2022.**
+**Last updated 31st March, 2023.**
 
 ## Objective

pages/platform/ai/notebook_tuto_06_marine_mammal_sounds_classification/guide.en-gb.md

Lines changed: 2 additions & 2 deletions

@@ -4,10 +4,10 @@ slug: notebooks/tuto-marine-mammal-sounds-classification
 excerpt: How to classify sounds with AI
 section: AI Notebooks - Tutorials
 order: 06
-updated: 2022-09-01
+updated: 2023-03-31
 ---
 
-**Last updated 1st September, 2022.**
+**Last updated 31st March, 2023.**
 
 ## Objective

pages/platform/ai/notebook_tuto_06_marine_mammal_sounds_classification/guide.en-ie.md

Lines changed: 2 additions & 2 deletions

@@ -4,10 +4,10 @@ slug: notebooks/tuto-marine-mammal-sounds-classification
 excerpt: How to classify sounds with AI
 section: AI Notebooks - Tutorials
 order: 06
-updated: 2022-09-01
+updated: 2023-03-31
 ---
 
-**Last updated 1st September, 2022.**
+**Last updated 31st March, 2023.**
 
 ## Objective

pages/platform/ai/notebook_tuto_06_marine_mammal_sounds_classification/guide.en-sg.md

Lines changed: 2 additions & 2 deletions

@@ -4,10 +4,10 @@ slug: notebooks/tuto-marine-mammal-sounds-classification
 excerpt: How to classify sounds with AI
 section: AI Notebooks - Tutorials
 order: 06
-updated: 2022-09-01
+updated: 2023-03-31
 ---
 
-**Last updated 1st September, 2022.**
+**Last updated 31st March, 2023.**
 
 ## Objective

pages/platform/ai/notebook_tuto_06_marine_mammal_sounds_classification/guide.en-us.md

Lines changed: 2 additions & 2 deletions

@@ -4,10 +4,10 @@ slug: notebooks/tuto-marine-mammal-sounds-classification
 excerpt: How to classify sounds with AI
 section: AI Notebooks - Tutorials
 order: 06
-updated: 2022-09-01
+updated: 2023-03-31
 ---
 
-**Last updated 1st September, 2022.**
+**Last updated 31st March, 2023.**
 
 ## Objective

pages/platform/ai/notebook_tuto_06_marine_mammal_sounds_classification/guide.es-es.md

Lines changed: 15 additions & 4 deletions

@@ -6,10 +6,10 @@ section: AI Notebooks - Tutorials
 order: 06
 routes:
 canonical: 'https://docs.ovh.com/gb/en/publiccloud/ai/notebooks/tuto-marine-mammal-sounds-classification/'
-updated: 2022-09-01
+updated: 2023-03-31
 ---
 
-**Last updated 1st September, 2022.**
+**Last updated 31st March, 2023.**
 
 ## Objective
 
@@ -59,10 +59,11 @@ ovhai data upload <region> <container> <paths>
 
 You need to attach a volume if your data is in your OVHcloud Object Storage and you want to use it during your experiment. For more information on data, volumes and permissions, see [our guide on data](https://docs.ovh.com/es/publiccloud/ai/cli/access-object-storage-data/).
 
-To be able to use the source code below in this article you have to create 2 Object Storage containers mounted as follows:
+To be able to use the source code below in this article you have to create 2 Object Storage containers and a git repository mounted as follows:
 
 - mount point name: `/workspace/data`, permissions: `read & write`
 - mount point name: `/workspace/saved_model`, permissions: `read & write`
+- mount point name: `/workspace/ai-training-examples`, permissions: `read & write`, Git URL: `https://github.com/ovh/ai-training-examples.git`
 
 `Choose the same region as your object container` > `"One image to rule them all" framework` > `Attach Object Storage containers (the one that contains your dataset)`
 
@@ -71,10 +72,20 @@ If you want to launch it with the CLI, choose the [volume](https://docs.ovh.com/
 ```bash
 ovhai notebook run one-for-all jupyterlab \
 	--name <notebook-name> \
-	--gpu <nb-gpus>
+	--gpu <nb-gpus> \
 	--volume <container@region/prefix:mount_path:permission>
 ```
 
+For example:
+
+```bash
+ovhai notebook run one-for-all jupyterlab \
+	--name marine-mammal-sounds-classification \
+	--gpu 1 \
+	--volume marine-mammal-sounds@GRA/:/workspace/data:RW:cache \
+	--volume marine-mammal-model@GRA/:/workspace/saved_model:RW:cache \
+	--volume https://github.com/ovh/ai-training-examples.git:/workspace/ai-training-examples:RW
+```
+
 You can then reach your notebook’s URL once the notebook is running.
 
 Find the notebook by following this path: `ai-training-examples` > `notebooks` > `audio` > `audio-classification` > `notebook-marine-sound-classification.ipynb`.

pages/platform/ai/notebook_tuto_06_marine_mammal_sounds_classification/guide.es-us.md

Lines changed: 15 additions & 4 deletions

@@ -6,10 +6,10 @@ section: AI Notebooks - Tutorials
 order: 06
 routes:
 canonical: 'https://docs.ovh.com/gb/en/publiccloud/ai/notebooks/tuto-marine-mammal-sounds-classification/'
-updated: 2022-09-01
+updated: 2023-03-31
 ---
 
-**Last updated 1st September, 2022.**
+**Last updated 31st March, 2023.**
 
 ## Objective
 
@@ -59,10 +59,11 @@ ovhai data upload <region> <container> <paths>
 
 You need to attach a volume if your data is in your OVHcloud Object Storage and you want to use it during your experiment. For more information on data, volumes and permissions, see [our guide on data](https://docs.ovh.com/us/es/publiccloud/ai/cli/access-object-storage-data/).
 
-To be able to use the source code below in this article you have to create 2 Object Storage containers mounted as follows:
+To be able to use the source code below in this article you have to create 2 Object Storage containers and a git repository mounted as follows:
 
 - mount point name: `/workspace/data`, permissions: `read & write`
 - mount point name: `/workspace/saved_model`, permissions: `read & write`
+- mount point name: `/workspace/ai-training-examples`, permissions: `read & write`, Git URL: `https://github.com/ovh/ai-training-examples.git`
 
 `Choose the same region as your object container` > `"One image to rule them all" framework` > `Attach Object Storage containers (the one that contains your dataset)`
 
@@ -71,10 +72,20 @@ If you want to launch it with the CLI, choose the [volume](https://docs.ovh.com/
 ```bash
 ovhai notebook run one-for-all jupyterlab \
 	--name <notebook-name> \
-	--gpu <nb-gpus>
+	--gpu <nb-gpus> \
 	--volume <container@region/prefix:mount_path:permission>
 ```
 
+For example:
+
+```bash
+ovhai notebook run one-for-all jupyterlab \
+	--name marine-mammal-sounds-classification \
+	--gpu 1 \
+	--volume marine-mammal-sounds@GRA/:/workspace/data:RW:cache \
+	--volume marine-mammal-model@GRA/:/workspace/saved_model:RW:cache \
+	--volume https://github.com/ovh/ai-training-examples.git:/workspace/ai-training-examples:RW
+```
+
 You can then reach your notebook’s URL once the notebook is running.
 
 Find the notebook by following this path: `ai-training-examples` > `notebooks` > `audio` > `audio-classification` > `notebook-marine-sound-classification.ipynb`.
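For readers decoding the `--volume` flag used in the example commands above, a minimal sketch of how its `container@region/prefix:mount_path:permission[:cache]` fields compose. The values are taken from the diff content itself and are purely illustrative:

```shell
# Assemble a --volume argument for `ovhai notebook run`, field by field
# (container@region/prefix:mount_path:permission[:cache]); all values
# come from the example in the diff above.
container="marine-mammal-sounds"
region="GRA"
prefix=""                      # empty prefix mounts the whole container
mount_path="/workspace/data"
permission="RW"                # read & write, as in the tutorial
volume="${container}@${region}/${prefix}:${mount_path}:${permission}:cache"
echo "$volume"                 # prints marine-mammal-sounds@GRA/:/workspace/data:RW:cache
```

The git-repository volume in the same commands follows the analogous `url:mount_path:permission` shape, with the repository URL taking the place of the `container@region/prefix` part.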
