[PROJECT] Julia implementation of neural network estimators  #52

@Tinggong

Description

Introduction
Microstructure.jl is a Julia toolbox (development version) aimed at fast and probabilistic microstructure imaging. It features flexible biophysical modelling with MRI data. For estimating microstructure parameters from these models, it includes generic estimators such as Markov Chain Monte Carlo (MCMC) sampling and Monte Carlo dropout with neural networks.

Goal
Use Flux.jl and Microstructure.jl to implement different types of neural network estimators. The current neural network estimator in Microstructure.jl uses a multi-layer perceptron for supervised training, with training samples generated from the forward models in Microstructure.jl, e.g. MRI measurements as inputs and microstructure parameters as outputs. Beyond that, one example we can try is a self-supervised method that uses the forward models in Microstructure.jl as a decoder; a sketch of both setups follows below.
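A minimal sketch of the supervised setup, assuming Flux.jl. The dimensions `n_meas` and `n_params`, the layer sizes, and the random training data are illustrative assumptions only; in practice the signals would be simulated with a Microstructure.jl forward model rather than drawn at random.

```julia
using Flux

# Illustrative sizes: n_meas MRI measurements in, n_params tissue parameters out.
n_meas, n_params = 48, 4

# Multi-layer perceptron with dropout layers, so Monte Carlo dropout can be
# used at test time for uncertainty estimates.
mlp = Chain(
    Dense(n_meas => 128, relu), Dropout(0.1),
    Dense(128 => 64, relu),     Dropout(0.1),
    Dense(64 => n_params, sigmoid),   # assumes parameters scaled to [0, 1]
)

# Placeholder training set; in practice, sample parameters from their priors
# and simulate the matching measurements with a Microstructure.jl forward model.
params  = rand(Float32, n_params, 10_000)   # "ground-truth" parameters
signals = rand(Float32, n_meas, 10_000)     # simulated measurements

loader = Flux.DataLoader((signals, params); batchsize = 128, shuffle = true)
state  = Flux.setup(Adam(1e-3), mlp)

for epoch in 1:10
    for (x, y) in loader
        loss, grads = Flux.withgradient(m -> Flux.mse(m(x), y), mlp)
        Flux.update!(state, mlp, grads[1])
    end
end
```

Keeping the dropout layers active at test time (`Flux.trainmode!(mlp)`) and averaging repeated forward passes gives the Monte Carlo dropout estimates mentioned in the introduction. For the self-supervised variant, the network again maps measurements to parameters, but the forward model acts as a fixed decoder, so the loss is computed on reconstructed signals rather than on ground-truth parameters. Here `forward_signals` is a hypothetical placeholder for a differentiable call into the Microstructure.jl forward model:

```julia
# Sketch only: `forward_signals` stands in for a differentiable forward-model
# call that maps predicted parameters back to MRI signals.
selfsup_loss(m, x) = Flux.mse(forward_signals(m(x)), x)
```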

Resources

  1. Tutorials/demos on how to use Microstructure.jl will be available soon on the documentation website
  2. For neural network examples using Flux, there are various models you can reference in the Flux model zoo
  3. Python implementation example for the network models

Julia is a programming language designed for high performance. If you are interested in Julia or have experience in related areas using other languages, join me in hacking towards the goal!
