diff --git a/DocForParticipants.md b/DocForParticipants.md
index 2f64e51..2a643c1 100644
--- a/DocForParticipants.md
+++ b/DocForParticipants.md
@@ -78,6 +78,7 @@ installation of all dependencies and downloading the example data.
 [![CPU Codespace](https://img.shields.io/badge/Codespaces-CPU-blue?logo=github)](https://codespaces.new/SyneRBI/SIRF-Exercises)
 [![GPU Codespace](https://img.shields.io/badge/Codespaces-GPU-green?logo=github)](https://codespaces.new/SyneRBI/SIRF-Exercises?devcontainer_path=.devcontainer%2Fgpu%2Fdevcontainer.json&geo=UsEast&machine=standardLinuxNcv3)
 Some notes:
+- Please select a GPU node to run the GPU Codespace (you might have to look in different "regions", as availability varies).
 - You will have to select a Python kernel for each notebook (top-right). Please use the existing `conda` python kernel (**not** `/usr/bin/python3`) listed in "Python environments". Alternatively, you can access the jupyter server running in the codespace via [port forwarding](https://docs.github.com/en/codespaces/developing-in-a-codespace/forwarding-ports-in-your-codespace).
 - You might want to conserve some resources by manually [stopping a code space](https://docs.github.com/en/codespaces/developing-in-codespaces/stopping-and-starting-a-codespace),
@@ -96,6 +97,8 @@ Follow instructions given elsewhere.
 ### Using the VM
 
+See our [VM instructions](https://github.com/SyneRBI/SIRF-SuperBuild/blob/master/VirtualBox/README.md) for more information.
+
 1. start the VM from VirtualBox (user `sirfuser`, password `virtual`)
 2. Open terminal (either via `Activities` or pressing `ctrl-alt-T`) and type
@@ -163,7 +166,7 @@ cells for running the script, but you can also do this from the command line (se
 cd /wherever/you/installed/it/SIRF-Exercises
 scripts/download_data.sh -m -p
 ```
- On the VM and Azure, the exercises are installed in `~/devel`, in docker in `/devel`, and in the STFC Cloud in `~`. (Apologies for that!). 
+ On the VM and Azure, the exercises are installed in `~/devel`, in docker in `~/work/devel`, and in the STFC Cloud in `~`. (Apologies for that!) This will be a ~3 GB download.
diff --git a/notebooks/Introductory/introduction.ipynb b/notebooks/Introductory/introduction.ipynb
index 594d523..c144965 100644
--- a/notebooks/Introductory/introduction.ipynb
+++ b/notebooks/Introductory/introduction.ipynb
@@ -220,7 +220,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"We will download and use Brainweb data, which is made more convenient by using the Python brainweb module. We will use a FDG image for PET. MR usually provides qualitative images with an image contrast proportional to difference in T1, T2 or T2* depending on the sequence parameters. Nevertheless, we will make our life easy, by directly using the T1 map provided by the brainweb for MR."
+"We will use Brainweb data, a dataset containing MRI scans as well as anatomical models of the normal brain (see https://brainweb.bic.mni.mcgill.ca/brainweb/anatomic_normal.html). Working with this data is made more convenient by the Python brainweb module. We will use an FDG image for PET. MR usually provides qualitative images with an image contrast proportional to differences in T1, T2 or T2* depending on the sequence parameters. Nevertheless, we will make our life easy by directly using the T1 map provided by brainweb for MR."
 ]
 },
 {
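Note for reviewers: since the patch above documents that the exercises live in a different directory on each platform (`~/devel` on the VM and Azure, `~/work/devel` in docker, `~` on the STFC Cloud), a small helper like the following can locate the checkout regardless of platform. This is only an illustrative sketch; `find_exercises` and the candidate list are hypothetical conveniences, not part of SIRF-Exercises.

```python
import os

# Install locations named in the docs (platform-dependent):
# VM and Azure: ~/devel, docker: ~/work/devel, STFC Cloud: ~
CANDIDATES = [
    "~/devel/SIRF-Exercises",
    "~/work/devel/SIRF-Exercises",
    "~/SIRF-Exercises",
]


def find_exercises(candidates=CANDIDATES):
    """Return the first candidate path that exists as a directory, or None."""
    for candidate in candidates:
        path = os.path.expanduser(candidate)  # expand the leading "~"
        if os.path.isdir(path):
            return path
    return None
```

A notebook could then `os.chdir(find_exercises())` before calling `scripts/download_data.sh`, instead of hard-coding one platform's path.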