# Changelog

All notable changes to MONAI are documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/) and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## 0.8.0 - 2021-11-25

### Added

- Overview of new features in v0.8
- Network modules for differentiable neural network topology search (DiNTS)
- Multiple Instance Learning transforms and models for digital pathology WSI analysis
- Vision transformers for self-supervised representation learning
- Contrastive loss for self-supervised learning
- Finalized major improvements of 200+ components in `monai.transforms` to support input and backend in PyTorch and NumPy
- Initial registration module benchmarking with `GlobalMutualInformationLoss` as an example
- `monai.transforms` documentation with visual examples and the utility functions
- Event handler for `MLFlow` integration
- Enhanced data visualization functions including `blend_images` and `matshow3d`
- `RandGridDistortion` and `SmoothField` in `monai.transforms`
- Support of randomized shuffle buffer in iterable datasets
- Performance review and enhancements for data type casting
- Cumulative averaging API with distributed environment support
- Module utility functions including `require_pkg` and `pytorch_after`
- Various usability enhancements such as `allow_smaller` when sampling ROI and `wrap_sequence` when casting object types
- `tifffile` support in `WSIReader`
- Regression tests for the fast training workflows
- Various tutorials and demos including educational contents at MONAI Bootcamp 2021
### Changed

- Base Docker image upgraded to `nvcr.io/nvidia/pytorch:21.10-py3` from `nvcr.io/nvidia/pytorch:21.08-py3`
- Decoupled `TraceKeys` and `TraceableTransform` APIs from `InvertibleTransform`
- Skipping affine-based resampling when `resample=False` in `NiftiSaver`
- Deprecated `threshold_values: bool` and `num_classes: int` in `AsDiscrete`
- Enhanced `apply_filter` for spatially 1D, 2D and 3D inputs with non-separable kernels
- Logging with `logging` in downloading and model archives in `monai.apps`
- API documentation site now defaults to `stable` instead of `latest`
- `skip-magic-trailing-comma` in coding style enforcements
- Pre-merge CI pipelines now include unit tests with Nvidia Ampere architecture
### Removed

- Support for PyTorch 1.5
- The deprecated `DynUnetV1` and the related network blocks
- GitHub self-hosted CI/CD pipelines for package releases
### Fixed

- Support of path-like objects as file path inputs in most modules
- Issue of `decollate_batch` for dictionary of empty lists
- Typos in documentation and code examples in various modules
- Issue of no available keys when `allow_missing_keys=True` for the `MapTransform`
- Issue of redundant computation when normalization factors are 0.0 and 1.0 in `ScaleIntensity`
- Incorrect reports of registered readers in `ImageReader`
- Wrong numbering of iterations in `StatsHandler`
- Naming conflicts in network modules and aliases
- Incorrect output shape when `reduction="none"` in `FocalLoss`
- Various usability issues reported by users
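The `pytorch_after` utility listed above lets components gate behaviour on the installed PyTorch version. The following is a minimal sketch of that idea in plain Python, not MONAI's implementation; it assumes simple dotted version strings and ignores the pre-release suffixes (e.g. `1.10.0a0+git...`) that a real implementation must handle:

```python
def version_after(current: str, major: int, minor: int, patch: int = 0) -> bool:
    """Return True if `current` (e.g. "1.10.2") is at least major.minor.patch.

    A simplified sketch of a pytorch_after-style check; pre-release
    suffixes such as "1.10.0a0" are only crudely handled here.
    """
    parts = []
    for token in current.split("+")[0].split(".")[:3]:
        digits = "".join(ch for ch in token if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    while len(parts) < 3:
        parts.append(0)  # treat "1.8" as "1.8.0"
    return tuple(parts) >= (major, minor, patch)

print(version_after("1.10.2", 1, 8))  # True: 1.10.2 >= 1.8.0
print(version_after("1.6.0", 1, 8))   # False: 1.6 predates 1.8
```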
## 0.7.0 - 2021-09-24

### Added

- Overview of new features in v0.7
- Initial phase of major usability improvements in `monai.transforms` to support input and backend in PyTorch and NumPy
- Performance enhancements, with profiling and tuning guides for typical use cases
- Reproducing training modules and workflows of state-of-the-art Kaggle competition solutions
- 24 new transforms, including
  - `OneOf` meta transform
  - DeepEdit guidance signal transforms for interactive segmentation
  - Transforms for self-supervised pre-training
  - Integration of NVIDIA Tools Extension (NVTX)
  - Integration of cuCIM
  - Stain normalization and contextual grid for digital pathology
- `Transchex` network for vision-language transformers for chest X-ray analysis
- `DatasetSummary` utility in `monai.data`
- `WarmupCosineSchedule`
- Deprecation warnings and documentation support for better backwards compatibility
- Padding with additional `kwargs` and different backend API
- Additional options such as `dropout` and `norm` in various networks and their submodules
### Changed

- Base Docker image upgraded to `nvcr.io/nvidia/pytorch:21.08-py3` from `nvcr.io/nvidia/pytorch:21.06-py3`
- Deprecated input argument `n_classes`, in favor of `num_classes`
- Deprecated input argument `dimensions` and `ndims`, in favor of `spatial_dims`
- Updated the Sphinx-based documentation theme for better readability
- `NdarrayTensor` type is replaced by `NdarrayOrTensor` for simpler annotations
- Self-attention-based network blocks now support both 2D and 3D inputs
### Removed

- The deprecated `TransformInverter`, in favor of `monai.transforms.InvertD`
- GitHub self-hosted CI/CD pipelines for nightly and post-merge tests
- `monai.handlers.utils.evenly_divisible_all_gather`
- `monai.handlers.utils.string_list_all_gather`
### Fixed

- A multi-thread cache writing issue in `LMDBDataset`
- Output shape convention inconsistencies of the image readers
- Output directory and file name flexibility issue for `NiftiSaver`, `PNGSaver`
- Requirement of the `label` field in test-time augmentation
- Input argument flexibility issues for `ThreadDataLoader`
- Decoupled `Dice` and `CrossEntropy` intermediate results in `DiceCELoss`
- Improved documentation, code examples, and warning messages in various modules
- Various usability issues reported by users
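The `OneOf` meta transform added in this release composes several transforms and randomly applies exactly one of them per call. Below is a minimal pure-Python sketch of the concept, not the MONAI API (which integrates with `Compose`, weighting, and inversion):

```python
import random

def one_of(transforms, weights=None, rng=random):
    """Return a callable that, on each call, picks a single transform
    (optionally weighted) and applies only that one to the input.

    A sketch of the OneOf meta-transform idea, not MONAI's implementation.
    """
    def apply(data):
        chosen = rng.choices(transforms, weights=weights, k=1)[0]
        return chosen(data)
    return apply

flip = lambda x: x[::-1]            # reverse the sequence
double = lambda x: [v * 2 for v in x]  # scale every element
augment = one_of([flip, double])
print(augment([1, 2, 3]))  # either [3, 2, 1] or [2, 4, 6]
```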
## 0.6.0 - 2021-07-08

### Added

- 10 new transforms, a masked loss wrapper, and a `NetAdapter` for transfer learning
- APIs to load networks and pre-trained weights from Clara Train Medical Model ARchives (MMARs)
- Base metric and cumulative metric APIs, 4 new regression metrics
- Initial CSV dataset support
- Decollating mini-batch as the default first postprocessing step; the "Migrating your v0.5 code to v0.6" wiki page shows how to adapt to the breaking changes
- Initial backward compatibility support via `monai.utils.deprecated`
- Attention-based vision modules and `UNETR` for segmentation
- Generic module loaders and Gaussian mixture models using the PyTorch JIT compilation
- Inverse of image patch sampling transforms
- Network block utilities `get_[norm, act, dropout, pool]_layer`
- `unpack_items` mode for `apply_transform` and `Compose`
- New event `INNER_ITERATION_STARTED` in the deepgrow interactive workflow
- `set_data` API for cache-based datasets to dynamically update the dataset content
- Fully compatible with PyTorch 1.9
- `--disttests` and `--min` options for `runtests.sh`
- Initial support of pre-merge tests with Nvidia Blossom system
### Changed

- Base Docker image upgraded to `nvcr.io/nvidia/pytorch:21.06-py3` from `nvcr.io/nvidia/pytorch:21.04-py3`
- Optionally depend on PyTorch-Ignite v0.4.5 instead of v0.4.4
- Unified the demo, tutorial, testing data to the project shared drive, and `Project-MONAI/MONAI-extra-test-data`
- Unified the terms: `post_transform` is renamed to `postprocessing`, `pre_transform` is renamed to `preprocessing`
- Unified the postprocessing transforms and event handlers to accept the "channel-first" data format
- `evenly_divisible_all_gather` and `string_list_all_gather` moved to `monai.utils.dist`
### Removed

- Support of 'batched' input for postprocessing transforms and event handlers
- `TorchVisionFullyConvModel`
- `set_visible_devices` utility function
- `SegmentationSaver` and `TransformsInverter` handlers
### Fixed

- Issue of handling big-endian image headers
- Multi-thread issue for non-random transforms in the cache-based datasets
- Persistent dataset issue when multiple processes sharing a non-exist cache location
- Typing issue with Numpy 1.21.0
- Loading checkpoint with both `model` and `optimizer` using `CheckpointLoader` when `strict_shape=False`
- `SplitChannel` has different behaviour depending on numpy/torch inputs
- Transform pickling issue caused by the Lambda functions
- Issue of filtering by name in `generate_param_groups`
- Inconsistencies in the return value types of `class_activation_maps`
- Various docstring typos
- Various usability enhancements in `monai.transforms`
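Several 0.6.0 changes revolve around decollating: turning one batched element into a list of per-sample elements so that postprocessing can run per item. Below is a simplified sketch of the idea for plain dict-of-list batches only; MONAI's `decollate_batch` additionally handles tensors, nested structures, and metadata:

```python
def decollate_batch(batch: dict) -> list:
    """Split a batched dict such as {"img": [i1, i2], "label": [l1, l2]}
    into per-sample dicts [{"img": i1, "label": l1}, {"img": i2, "label": l2}].

    A pure-Python sketch of the decollating idea for list-valued batches.
    """
    if not batch:
        return []
    sizes = {len(v) for v in batch.values()}
    if len(sizes) != 1:
        raise ValueError(f"inconsistent batch sizes: {sizes}")
    n = sizes.pop()
    return [{k: v[i] for k, v in batch.items()} for i in range(n)]

batch = {"img": ["i0", "i1"], "label": [0, 1]}
print(decollate_batch(batch))
# [{'img': 'i0', 'label': 0}, {'img': 'i1', 'label': 1}]
```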
## 0.5.3 - 2021-05-28

### Changed

- Project default branch renamed to `dev` from `master`
- Base Docker image upgraded to `nvcr.io/nvidia/pytorch:21.04-py3` from `nvcr.io/nvidia/pytorch:21.02-py3`
- Enhanced type checks for the `iteration_metric` handler
- Enhanced `PersistentDataset` to use `tempfile` during caching computation
- Enhanced various info/error messages
- Enhanced performance of `RandAffine`
- Enhanced performance of `SmartCacheDataset`
- Optionally requires `cucim` when the platform is `Linux`
- Default `device` of `TestTimeAugmentation` changed to `cpu`
### Fixed

- Download utilities now provide better default parameters
- Duplicated `key_transforms` in the patch-based transforms
- A multi-GPU issue in `ClassificationSaver`
- A default `meta_data` issue in `SpacingD`
- Dataset caching issue with the persistent data loader workers
- A memory issue in `permutohedral_cuda`
- Dictionary key issue in `CopyItemsd`
- `box_start` and `box_end` parameters for deepgrow `SpatialCropForegroundd`
- Tissue mask array transpose issue in `MaskedInferenceWSIDataset`
- Various type hint errors
- Various docstring typos
### Added

- Support of `to_tensor` and `device` arguments for `TransformInverter`
- Slicing options with SpatialCrop
- Class name alias for the networks for backward compatibility
- `k_divisible` option for CropForeground
- `map_items` option for `Compose`
- Warnings of `inf` and `nan` for surface distance computation
- A `print_log` flag to the image savers
- Basic testing pipelines for Python 3.9
## 0.5.0 - 2021-04-09

### Added

- Overview document for feature highlights in v0.5.0
- Invertible spatial transforms
  - `InvertibleTransform` base APIs
  - Batch inverse and decollating APIs
  - Inverse of `Compose`
  - Batch inverse event handling
  - Test-time augmentation as an application
- Initial support of learning-based image registration:
- Bending energy, LNCC, and global mutual information loss
- Fully convolutional architectures
- Dense displacement field, dense velocity field computation
- Warping with high-order interpolation with C++/CUDA implementations
- Deepgrow modules for interactive segmentation:
- Workflows with simulations of clicks
- Distance-based transforms for guidance signals
- Digital pathology support:
- Efficient whole slide imaging IO and sampling with Nvidia cuCIM and SmartCache
- FROC measurements for lesion
- Probabilistic post-processing for lesion detection
- TorchVision classification model adaptor for fully convolutional analysis
- 12 new transforms, grid patch dataset, `ThreadDataLoader`, EfficientNets B0-B7
- 4 iteration events for the engine for finer control of workflows
- New C++/CUDA extensions:
- Conditional random field
- Fast bilateral filtering using the permutohedral lattice
- Metrics summary reporting and saving APIs
- DiceCELoss, DiceFocalLoss, a multi-scale wrapper for segmentation loss computation
- Data loading utilities:
  - `decollate_batch`
  - `PadListDataCollate` with inverse support
- Support of slicing syntax for `Dataset`
- Learning rate finder
- Allow for missing keys in the dictionary-based transforms
- Support of checkpoint loading for transfer learning
- Various summary and plotting utilities for Jupyter notebooks
- Contributor Covenant Code of Conduct
- Major CI/CD enhancements covering the tutorial repository
- Fully compatible with PyTorch 1.8
- Initial nightly CI/CD pipelines using Nvidia Blossom Infrastructure
### Changed

- Enhanced `list_data_collate` error handling
- Unified iteration metric APIs
- `densenet*` extensions are renamed to `DenseNet*`
- `se_res*` network extensions are renamed to `SERes*`
- Transform base APIs are rearranged into `compose`, `inverse`, and `transform`
- `_do_transform` flag for the random augmentations is unified via `RandomizableTransform`
- Decoupled post-processing steps, e.g. `softmax`, `to_onehot_y`, from the metrics computations
- Moved the distributed samplers to `monai.data.samplers` from `monai.data.utils`
- Engine's data loaders now accept generic iterables as input
- Workflows now accept additional custom events and state properties
- Various type hints according to Numpy 1.20
- Refactored testing utility `runtests.sh` to have `--unittest` and `--net` (integration tests) options
- Base Docker image upgraded to `nvcr.io/nvidia/pytorch:21.02-py3` from `nvcr.io/nvidia/pytorch:20.10-py3`
- Docker images are now built with self-hosted environments
- Primary contact email updated to `monai.contact@gmail.com`
- Now using GitHub Discussions as the primary communication forum
### Removed

- Compatibility tests for PyTorch 1.5.x
- Format specific loaders, e.g. `LoadNifti`, `NiftiDataset`
- Assert statements from non-test files
- `from module import *` statements, addressed flake8 F403
### Fixed

- Uses American English spelling for code, as per PyTorch
- Code coverage now takes multiprocessing runs into account
- SmartCache with initial shuffling
- `ConvertToMultiChannelBasedOnBratsClasses` now supports channel-first inputs
- Checkpoint handler to save with non-root permissions
- Fixed an issue for exiting the distributed unit tests
- Unified `DynUNet` to have single tensor output w/o deep supervision
- `SegmentationSaver` now supports user-specified data types and a `squeeze_end_dims` flag
- Fixed `*Saver` event handlers output filenames with a `data_root_dir` option
- Load image functions now ensure little-endian
- Fixed the test runner to support regex-based test case matching
- Usability issues in the event handlers
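One of the 0.5.0 additions is slicing syntax for `Dataset`. The toy sequence-style dataset below illustrates the general pattern, not MONAI's class: `__getitem__` returns a new dataset when given a slice, and a transformed item otherwise.

```python
class ToyDataset:
    """A minimal sequence-style dataset sketch supporting slicing syntax.

    Illustrative only: integer indexing applies the transform to one item,
    while slice indexing returns a new dataset over the selected items.
    """
    def __init__(self, data, transform=None):
        self.data = list(data)
        self.transform = transform

    def __len__(self):
        return len(self.data)

    def __getitem__(self, index):
        if isinstance(index, slice):
            # slicing yields a sub-dataset sharing the same transform
            return ToyDataset(self.data[index], self.transform)
        item = self.data[index]
        return self.transform(item) if self.transform else item

ds = ToyDataset(range(5), transform=lambda x: x * 10)
print(ds[2])     # 20
sub = ds[1:3]    # a new ToyDataset with 2 items
print(len(sub), sub[0])  # 2 10
```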
## 0.4.0 - 2020-12-15

### Added

- Overview document for feature highlights in v0.4.0
- Torchscript support for the net modules
- New networks and layers:
- Discrete Gaussian kernels
- Hilbert transform and envelope detection
- Swish and mish activation
- Acti-norm-dropout block
- Upsampling layer
- Autoencoder, Variational autoencoder
- FCNet
- Support of initialisation from pretrained weights for densenet, senet, multichannel AHNet
- Layer-wise learning rate API
- New model metrics and event handlers based on occlusion sensitivity, confusion matrix, surface distance
- CAM/GradCAM/GradCAM++
- File format-agnostic image loader APIs with Nibabel, ITK readers
- Enhancements for dataset partition, cross-validation APIs
- New data APIs:
- LMDB-based caching dataset
- Cache-N-transforms dataset
- Iterable dataset
- Patch dataset
- Weekly PyPI release
- Fully compatible with PyTorch 1.7
- CI/CD enhancements:
- Skipping, speed up, fail fast, timed, quick tests
- Distributed training tests
- Performance profiling utilities
- New tutorials and demos:
- Autoencoder, VAE tutorial
- Cross-validation demo
- Model interpretability tutorial
- COVID-19 Lung CT segmentation challenge open-source baseline
- Threadbuffer demo
- Dataset partitioning tutorial
- Layer-wise learning rate demo
- MONAI Bootcamp 2020
### Changed

- Base Docker image upgraded to `nvcr.io/nvidia/pytorch:20.10-py3` from `nvcr.io/nvidia/pytorch:20.08-py3`
- `monai.apps.CVDecathlonDataset` is extended to a generic `monai.apps.CrossValidation` with a `dataset_cls` option
- Cache dataset now requires a `monai.transforms.Compose` instance as the transform argument
- Model checkpoint file name extensions changed from `.pth` to `.pt`
- Readers' `get_spatial_shape` returns a numpy array instead of list
- Decoupled postprocessing steps such as `sigmoid`, `to_onehot_y`, `mutually_exclusive`, `logit_thresh` from metrics and event handlers; the postprocessing steps should be used before calling the metrics methods
- `ConfusionMatrixMetric` and `DiceMetric` computation now returns an additional `not_nans` flag to indicate valid results
- `UpSample` optional `mode` now supports `"deconv"`, `"nontrainable"`, `"pixelshuffle"`; `interp_mode` is only used when `mode` is `"nontrainable"`
- `SegResNet` optional `upsample_mode` now supports `"deconv"`, `"nontrainable"`, `"pixelshuffle"`
- `monai.transforms.Compose` class inherits `monai.transforms.Transform`
- In `Rotate`, `Rotated`, `RandRotate`, `RandRotated` transforms, the `angle`-related parameters are interpreted as angles in radians instead of degrees
- `SplitChannel` and `SplitChanneld` moved from `transforms.post` to `transforms.utility`
### Removed

- Support of PyTorch 1.4
### Fixed

- Enhanced loss functions for stability and flexibility
- Sliding window inference memory and device issues
- Revised transforms:
- Normalize intensity datatype and normalizer types
- Padding modes for zoom
- Crop returns coordinates
- Select items transform
- Weighted patch sampling
- Option to keep aspect ratio for zoom
- Various CI/CD issues
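The breaking change above for the rotate transforms (angle parameters now interpreted in radians rather than degrees) means degree-based call sites need an explicit conversion, for example:

```python
import math

# Code written against the old degree-based API can convert explicitly
# at the call site before passing the angle to a rotate transform.
angle_deg = 90.0
angle_rad = math.radians(angle_deg)
print(angle_rad)                # 1.5707963267948966 (i.e. pi / 2)
print(math.degrees(angle_rad))  # 90.0
```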
## 0.3.0 - 2020-10-02

### Added

- Overview document for feature highlights in v0.3.0
- Automatic mixed precision support
- Multi-node, multi-GPU data parallel model training support
- 3 new evaluation metric functions
- 11 new network layers and blocks
- 6 new network architectures
- 14 new transforms, including an I/O adaptor
- Cross validation module for `DecathlonDataset`
- Smart Cache module in dataset
- `monai.optimizers` module
- `monai.csrc` module
- Experimental feature of ImageReader using ITK, Nibabel, Numpy, Pillow (PIL Fork)
- Experimental feature of differentiable image resampling in C++/CUDA
- Ensemble evaluator module
- GAN trainer module
- Initial cross-platform CI environment for C++/CUDA code
- Code style enforcement now includes isort and clang-format
- Progress bar with tqdm
### Changed

- Now fully compatible with PyTorch 1.6
- Base Docker image upgraded to `nvcr.io/nvidia/pytorch:20.08-py3` from `nvcr.io/nvidia/pytorch:20.03-py3`
- Code contributions now require signing off on the Developer Certificate of Origin (DCO)
- Major work in type hinting finished
- Remote datasets migrated to Open Data on AWS
- Optionally depend on PyTorch-Ignite v0.4.2 instead of v0.3.0
- Optionally depend on torchvision, ITK
- Enhanced CI tests with 8 new testing environments
### Removed

- `MONAI/examples` folder (relocated into `Project-MONAI/tutorials`)
- `MONAI/research` folder (relocated to `Project-MONAI/research-contributions`)
### Fixed

- `dense_patch_slices` incorrect indexing
- Data type issue in `GeneralizedWassersteinDiceLoss`
- `ZipDataset` return value inconsistencies
- `sliding_window_inference` indexing and `device` issues
- Importing monai modules may cause namespace pollution
- Random data splits issue in `DecathlonDataset`
- Issue of randomising a `Compose` transform
- Various issues in function type hints
- Typos in docstring and documentation
- `PersistentDataset` issue with existing file folder
- Filename issue in the output writers
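The `dense_patch_slices` indexing fix above concerns sliding-window patch coverage. As a rough 1D illustration of the underlying indexing problem (a sketch, unrelated to MONAI's actual implementation), the start of the final window must be clamped so the last patch stays inside the image:

```python
def patch_starts(image_len: int, patch_len: int, step: int):
    """1D start indices for sliding-window patches covering [0, image_len).

    A simplified sketch: regular strides, plus a clamped final start so
    the last patch never runs past the end of the image.
    """
    if patch_len >= image_len:
        return [0]
    starts = list(range(0, image_len - patch_len + 1, step))
    if starts[-1] != image_len - patch_len:
        starts.append(image_len - patch_len)  # clamp the last window
    return starts

print(patch_starts(10, 4, 3))  # [0, 3, 6]
print(patch_starts(10, 4, 4))  # [0, 4, 6] -- last start clamped to 6
```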
## 0.2.0 - 2020-07-02

### Added

- Overview document for feature highlights in v0.2.0
- Type hints and static type analysis support
- `MONAI/research` folder
- `monai.engine.workflow` APIs for supervised training
- `monai.inferers` APIs for validation and inference
- 7 new tutorials and examples
- 3 new loss functions
- 4 new event handlers
- 8 new layers, blocks, and networks
- 12 new transforms, including post-processing transforms
- `monai.apps.datasets` APIs, including `MedNISTDataset` and `DecathlonDataset`
- Persistent caching, `ZipDataset`, and `ArrayDataset` in `monai.data`
- Cross-platform CI tests supporting multiple Python versions
- Optional import mechanism
- Experimental features for third-party transforms integration
For more details please visit the project wiki
### Changed

- Core modules now require numpy >= 1.17
- Categorized `monai.transforms` modules into crop and pad, intensity, IO, post-processing, spatial, and utility
- Most transforms are now implemented with PyTorch native APIs
- Code style enforcement and automated formatting workflows now use autopep8 and black
- Base Docker image upgraded to `nvcr.io/nvidia/pytorch:20.03-py3` from `nvcr.io/nvidia/pytorch:19.10-py3`
- Enhanced local testing tools
- Documentation website domain changed to https://docs.monai.io
### Removed

- Support of Python < 3.6
- Automatic installation of optional dependencies including pytorch-ignite, nibabel, tensorboard, pillow, scipy, scikit-image
### Fixed

- Various issues in type and argument names consistency
- Various issues in docstring and documentation site
- Various issues in unit and integration tests
- Various issues in examples and notebooks
## 0.1.0 - 2020-04-17

### Added

- Public alpha source code release under the Apache 2.0 license (highlights)
- Various tutorials and examples
- Medical image classification and segmentation workflows
- Spacing/orientation-aware preprocessing with CPU/GPU and caching
- Flexible workflows with PyTorch Ignite and Lightning
- Various GitHub Actions
- CI/CD pipelines via self-hosted runners
- Documentation publishing via readthedocs.org
- PyPI package publishing
- Contributing guidelines
- A project logo and badges