
Adaptive Local Neighborhood-based Neural Networks for MR Image Reconstruction (LONDN-MRI)

IEEE Xplore · License: MIT · Python 3.8+

Official Code Implementation for the paper "Adaptive Local Neighborhood-based Neural Networks for MR Image Reconstruction from Undersampled Data" (IEEE TCI 2024).

📖 Overview

This repository contains the code for testing and reproducing results for the LONDN-MRI project.

The method focuses on Adaptive Local Neighborhood-based Neural Networks for MRI reconstruction. The codebase supports various experimental setups, including single-coil and multi-coil studies, and leverages both global and local modeling strategies.

This implementation is based on and extends the MODL and BLIPS (Blind Primed Supervised Learning) frameworks.
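The local modeling idea can be sketched as follows: given an initial reconstruction of a test scan, select the training samples closest to it in image space and adapt a network on that neighborhood. Below is a minimal, hypothetical sketch of the neighborhood search; the function name and Euclidean distance metric are illustrative assumptions, not the repository's exact implementation.

```python
import torch

def find_local_neighborhood(test_recon, train_recons, k=30):
    """Return indices of the k training images closest to an initial
    test reconstruction (Euclidean distance in image space).

    Hypothetical helper illustrating the local-neighborhood idea;
    the paper's actual distance metric and selection may differ.
    """
    diffs = train_recons - test_recon.unsqueeze(0)  # (N, H, W)
    dists = diffs.flatten(1).norm(dim=1)            # (N,)
    return torch.topk(dists, k, largest=False).indices
```

A local model is then trained (or fine-tuned) only on the returned neighborhood before reconstructing the test scan.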

Key Components

The code is organized into two main study components (for benchmark comparison, we refer to the original implementation of MODL):

  1. Multi-Coil Global MODL
  2. Multi-Coil LONDN

📂 Directory Overview

The core implementation is located under the multi_coil_LONDN/ directory.

📦 LONDN_MRI
 ┣ 📂 data                           # Data directory
 ┣ 📂 models-new                     # Neural network architectures
 ┃ ┣ 📜 networks.py                  # Common network utilities
 ┃ ┗ 📜 Unet_model_fast_mri.py       # Standard UNet for FastMRI
 ┗ 📂 multi_coil_LONDN-new           # Main Local LONDN implementation
   ┣ 📂 prepare_trained_model        # Saved model checkpoints (e.g., index101)
   ┣ 📂 generated_dataset            # Generated image-space datasets and masks
   ┃ ┣ 📂 4acceleration_mask_random3 # Random masks for training (4x accel)
   ┃ ┣ 📂 4acceleration_mask_test2   # Fixed masks for testing (4x accel)
   ┃ ┣ 📂 four_fold_image_shape      # Processed 2-channel complex images
   ┃ ┗ 📂 test_four_fold             # Test set specific data
   ┣ 📜 prepare_data_from_kspace.py  # Script to generate image-space dataset and masks from k-space
   ┣ 📜 local_network_dataset.py     # Data loader for noisy local neighborhood case
   ┗ 📜 train_local_unet.py          # Training/Testing script using UNet for local reconstruction
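The mask directories above store undersampling patterns for 4x acceleration. As an illustration, a random 1D Cartesian mask (fully sampled center plus random phase-encode lines) can be sketched as below; the `center_fraction` value and line-selection scheme are assumptions, not the repository's exact settings.

```python
import numpy as np

def random_cartesian_mask(h, w, accel=4, center_fraction=0.08, seed=0):
    """Sketch of a random Cartesian undersampling mask: keep a fully
    sampled center band plus random phase-encode columns so that
    roughly 1/accel of the columns are sampled."""
    rng = np.random.default_rng(seed)
    n_keep = w // accel
    n_center = int(round(w * center_fraction))
    mask_1d = np.zeros(w, dtype=bool)
    c = w // 2
    mask_1d[c - n_center // 2 : c + (n_center + 1) // 2] = True
    remaining = n_keep - int(mask_1d.sum())
    candidates = np.flatnonzero(~mask_1d)
    if remaining > 0:
        mask_1d[rng.choice(candidates, size=remaining, replace=False)] = True
    # Replicate the 1D line pattern along the readout dimension.
    return np.broadcast_to(mask_1d, (h, w)).copy()
```

The random-mask folder would hold many such draws (different seeds) for training, while the test folder fixes one mask per test case.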

🚀 Getting Started

1. Prerequisites

  • Python 3.8+
  • PyTorch ≥ 1.7.0
  • BART: Used for generating the initial dataset.

2. Environment Setup

We provide an environment.yml file to easily configure the Conda environment with all dependencies (including PyTorch, Visdom, Dominate, etc.).

# 1. Create the environment
conda env create -f environment.yml

# 2. Activate the environment
conda activate londn-mri

3. Data Preparation

To reproduce the results, please download the specific k-space datasets used in our experiments.

  1. Datasets: fastMRI and Stanford 2D FSE.

  2. Setup:
    We recommend downloading the fastMRI dataset first, as it is the primary dataset used to generate the results in our_notebook.ipynb.

    Download instructions:

    • fastMRI: Visit the official website to accept the license agreement, then download the data.
    • Stanford 2D FSE: The full dataset is available on the official website. We also provide a partial subset via Google Drive for quick testing.

    ⚠️ Once downloaded, unzip the files and place them into the project directory (e.g., inside a folder named data, or as specified in the notebook).

    Step 1: Configure the path. Unzip the downloaded data to your local storage (e.g., /mnt/DataA/NEW_KSPACE). Open multi_coil_LONDN/prepare_data_from_kspace.py and update the source k-space path variable:

    # Inside prepare_data_from_kspace.py
    SOURCE_KSPACE_DIR = '/mnt/DataA/NEW_KSPACE'  # <--- Change this to your path

    Step 2: Generate the dataset. Run the script to create the image-space data from the k-space inputs:

    cd multi_coil_LONDN
    python prepare_data_from_kspace.py

    The results will be saved to multi_coil_LONDN/generated_dataset/.
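For intuition, the core of this conversion is an inverse FFT per coil followed by coil combination. The generic NumPy sketch below uses root-sum-of-squares combination as an assumption; the actual script may rely on BART and a different combination scheme.

```python
import numpy as np

def kspace_to_rss_image(kspace):
    """Convert multi-coil k-space (C, H, W, complex) to a magnitude
    image: per-coil inverse 2D FFT, then root-sum-of-squares over
    coils. A generic sketch, not the repository's exact pipeline."""
    coil_imgs = np.fft.fftshift(
        np.fft.ifft2(np.fft.ifftshift(kspace, axes=(-2, -1)), axes=(-2, -1)),
        axes=(-2, -1),
    )
    return np.sqrt((np.abs(coil_imgs) ** 2).sum(axis=0))
```

The generated 2-channel complex images (real and imaginary parts) stored under four_fold_image_shape follow the same image-space representation.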

🏃 Usage

Local Model Training -- LONDN-MRI (UNet)

To train the local model and test reconstruction from undersampled multi-coil k-space measurements:

python train_local_unet.py

Pre-trained Checkpoints

We provide pre-trained model weights in the checkpoints/ directory (e.g., best_net_for_index101, last_net_for_index101, etc.). These checkpoints can be used to:

  • Reproduce results reported in the paper.
  • Fine-tune the model on your own datasets.
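Loading a checkpoint follows the standard PyTorch pattern. The sketch below uses a small placeholder network, since the real UNet lives in models-new/Unet_model_fast_mri.py; the checkpoint path is illustrative and commented out so the snippet runs standalone.

```python
import torch
import torch.nn as nn

# Placeholder stand-in for the repo's UNet; substitute the real class
# from models-new/Unet_model_fast_mri.py when running inside the repo.
model = nn.Sequential(
    nn.Conv2d(2, 16, 3, padding=1),   # 2-channel complex input
    nn.ReLU(),
    nn.Conv2d(16, 2, 3, padding=1),   # 2-channel complex output
)

# Illustrative checkpoint path (see the checkpoints/ directory):
# state = torch.load("checkpoints/best_net_for_index101", map_location="cpu")
# model.load_state_dict(state)

model.eval()
with torch.no_grad():
    recon = model(torch.randn(1, 2, 64, 64))  # batch of 2-channel images
```

For fine-tuning, skip `model.eval()` and resume training from the loaded weights with a reduced learning rate.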

Results

The results will be saved to multi_coil_LONDN/checkpoints/.

(Note: Ensure you are in the multi_coil_LONDN directory when running these scripts.)

📝 Citation

If you find this code useful, please cite our paper:

@ARTICLE{10510040,
  author={Liang, Shijun and Lahiri, Anish and Ravishankar, Saiprasad},
  journal={IEEE Transactions on Computational Imaging}, 
  title={Adaptive Local Neighborhood-Based Neural Networks for MR Image Reconstruction From Undersampled Data}, 
  year={2024},
  volume={10},
  number={},
  pages={1235-1249},
  keywords={Image reconstruction;Magnetic resonance imaging;Training;Deep learning;Adaptation models;Time measurement;Optimization;Compressed sensing;deep learning;machine learning;magnetic resonance imaging;unrolling},
  doi={10.1109/TCI.2024.3394770}}

📬 Correspondence

For questions regarding the paper or code, please contact:

  • Shijun Liang: liangs16@msu.edu
  • Haijie Yuan: yuanhai1@msu.edu
  • Prof. Saiprasad Ravishankar: ravisha3@msu.edu
