MONAI Tutorials

This repository hosts the MONAI tutorials.

1. Requirements

Most of the examples and tutorials require matplotlib and Jupyter Notebook.

These can be installed with:

python -m pip install -U pip
python -m pip install -U matplotlib
python -m pip install -U notebook

Some of the examples may require optional dependencies. In case of any optional import errors, please install the relevant packages according to MONAI's installation guide. Or install all optional requirements with:

pip install -r https://raw.githubusercontent.com/Project-MONAI/MONAI/master/requirements-dev.txt

Run the notebooks from Colab

Most of the Jupyter Notebooks have an "Open in Colab" button. Please right-click on the button, and select "Open Link in New Tab" to start a Colab page with the corresponding notebook content.

To use GPU resources through Colab, please remember to change the runtime type to GPU:

  1. From the Runtime menu select Change runtime type
  2. Choose GPU from the drop-down menu
  3. Click SAVE

This will reset the notebook and may ask you if you are a robot (these instructions assume you are not).

Running:

!nvidia-smi

in a cell will verify this has worked and show you what kind of hardware you have access to.
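The same check can be done from Python with plain PyTorch (a minimal sketch; it only assumes torch is installed, which all tutorials require):

import torch

# True if Colab (or your local machine) exposes a CUDA-capable GPU
print(torch.cuda.is_available())
if torch.cuda.is_available():
    # Name of the first visible device, e.g. a Tesla T4 on a Colab GPU runtime
    print(torch.cuda.get_device_name(0))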

Data

Some notebooks will require additional data. They can be downloaded by running the runexamples.sh script.
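If you prefer to fetch a single archive from Python instead, MONAI's download_and_extract utility can be used; note that the URL and paths below are placeholders for illustration, not the tutorials' actual data sources:

from monai.apps import download_and_extract

# Hypothetical archive URL and destination paths, shown only to illustrate the call
resource_url = "https://example.com/some_dataset.tar.gz"
download_and_extract(
    url=resource_url,
    filepath="some_dataset.tar.gz",  # where the downloaded archive is stored
    output_dir="./data",             # where its contents are extracted
)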

2. Questions and bugs

  • For questions relating to the use of MONAI, please use our Discussions tab on the main repository of MONAI.
  • For bugs relating to MONAI functionality, please create an issue on the main repository.
  • For bugs relating to the running of a tutorial, please create an issue in this repository.

3. Note to developers

During integration testing, we run these notebooks. To save time, we modify variables to avoid unnecessary for-loop iterations. Hence, during training please use the variables max_epochs and val_interval for the number of training epochs and validation interval, respectively.
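The expected structure is roughly the following sketch; the model, optimizer, loss and data below are toy stand-ins, not part of any particular notebook:

import torch
from torch import nn, optim

# Toy stand-ins so the loop structure is runnable; real notebooks define these properly
model = nn.Linear(10, 2)
optimizer = optim.Adam(model.parameters(), lr=1e-3)
loss_function = nn.CrossEntropyLoss()
train_loader = [(torch.randn(4, 10), torch.randint(0, 2, (4,))) for _ in range(3)]

max_epochs = 5    # the integration-test runner rewrites this to a small value
val_interval = 2  # validate every val_interval epochs

for epoch in range(max_epochs):
    model.train()
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = loss_function(model(images), labels)
        loss.backward()
        optimizer.step()

    if (epoch + 1) % val_interval == 0:
        model.eval()
        # ... run validation here and report metrics ...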

If your notebook doesn't use the idea of epochs, then please add it to the variable doesnt_contain_max_epochs in runner.sh. This lets the runner know that it's not a problem if it doesn't find max_epochs.

If you have any other variables that would benefit from being set to 1 during testing, add them to strings_to_replace in runner.sh.

4. List of notebooks and examples

2D classification

mednist_tutorial

This notebook shows how to easily integrate MONAI features into existing PyTorch programs. It's based on the MedNIST dataset, which is very suitable for beginners as a tutorial. This tutorial also makes use of MONAI's in-built occlusion sensitivity functionality.

2D segmentation

torch examples

Training and evaluation examples of 2D segmentation based on UNet and synthetic dataset. The examples are standard PyTorch programs and have both dictionary-based and array-based versions.
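As a rough idea of the kind of network these examples build, here is a minimal 2D UNet sketch; the channel/stride values are illustrative and the spatial_dims argument assumes a reasonably recent MONAI release:

import torch
from monai.networks.nets import UNet

# 2D UNet for single-channel input and single-channel segmentation output
net = UNet(
    spatial_dims=2,
    in_channels=1,
    out_channels=1,
    channels=(16, 32, 64, 128),
    strides=(2, 2, 2),
    num_res_units=2,
)
# Forward pass on a synthetic 96x96 image batch
out = net(torch.randn(1, 1, 96, 96))
print(out.shape)  # torch.Size([1, 1, 96, 96])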

3D classification

ignite examples

Training and evaluation examples of 3D classification based on DenseNet3D and IXI dataset. The examples are PyTorch Ignite programs and have both dictionary-based and array-based transformation versions.

torch examples

Training and evaluation examples of 3D classification based on DenseNet3D and IXI dataset. The examples are standard PyTorch programs and have both dictionary-based and array-based transformation versions.

3D segmentation

ignite examples

Training and evaluation examples of 3D segmentation based on UNet3D and synthetic dataset. The examples are PyTorch Ignite programs and have both dictionary-based and array-based transformations.

torch examples

Training and evaluation examples of 3D segmentation based on UNet3D and synthetic dataset. The examples are standard PyTorch programs and have both dictionary-based and array-based versions.

bratssegmentation3d

This tutorial shows how to construct a training workflow for a multi-label segmentation task based on the MSD Brain Tumor dataset.

spleensegmentation3d_lightning

This notebook shows how MONAI may be used in conjunction with the PyTorch Lightning framework.

spleensegmentation3d

This notebook is an end-to-end training and evaluation example of 3D segmentation based on the MSD Spleen dataset. The example shows the flexibility of MONAI modules in a PyTorch-based program:
  • Transforms for dictionary-based training data structure.
  • Load NIfTI images with metadata.
  • Scale medical image intensity with expected range.
  • Crop out a batch of balanced image patch samples based on positive / negative label ratio.
  • Cache IO and transforms to accelerate training and validation.
  • 3D UNet, Dice loss function, Mean Dice metric for 3D segmentation task.
  • Sliding window inference (see the sketch after this list).
  • Deterministic training for reproducibility.
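For example, the sliding window inference step looks roughly like the sketch below; the network configuration, roi_size and the dummy volume are illustrative, not the notebook's exact settings:

import torch
from monai.inferers import sliding_window_inference
from monai.networks.nets import UNet

# Illustrative 3D UNet; the notebook configures its own network and trained weights
net = UNet(spatial_dims=3, in_channels=1, out_channels=2,
           channels=(16, 32, 64, 128), strides=(2, 2, 2))
net.eval()

volume = torch.randn(1, 1, 160, 160, 96)  # dummy CT volume in NCHWD layout
with torch.no_grad():
    # Run the network over overlapping 96x96x96 patches and stitch the results back together
    prediction = sliding_window_inference(
        inputs=volume, roi_size=(96, 96, 96), sw_batch_size=4, predictor=net
    )
print(prediction.shape)  # torch.Size([1, 2, 160, 160, 96])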

unetsegmentation3d_catalyst

This notebook shows how MONAI may be used in conjunction with the Catalyst framework.

unetsegmentation3d_ignite

This notebook is an end-to-end training & evaluation example of 3D segmentation based on synthetic dataset. The example is a PyTorch Ignite program and shows several key features of MONAI, especially with medical domain specific transforms and event handlers.

COVID 19-20 challenge baseline

This folder provides a simple baseline method for training, validation, and inference for COVID-19 LUNG CT LESION SEGMENTATION CHALLENGE - 2020 (a MICCAI Endorsed Event).

federated learning

Substra

The example shows how to execute the 3D segmentation torch tutorial on a federated learning platform, Substra.

acceleration

distributed_training

The examples show how to execute distributed training and evaluation based on 3 different frameworks:
  • PyTorch native DistributedDataParallel module with torch.distributed.launch.
  • Horovod APIs with horovodrun.
  • PyTorch Ignite and MONAI workflows.

They can run on several distributed nodes with multiple GPU devices on every node.
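A bare-bones sketch of the PyTorch-native variant, assuming the script is launched with torchrun or torch.distributed.launch --use_env so that LOCAL_RANK is set in the environment (the linear model is a placeholder):

import os
import torch
import torch.distributed as dist
from torch import nn
from torch.nn.parallel import DistributedDataParallel

def main():
    # The launch utility provides rank/world size via environment variables
    dist.init_process_group(backend="nccl", init_method="env://")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Wrap any model so gradients are averaged across processes
    model = nn.Linear(10, 2).cuda(local_rank)
    model = DistributedDataParallel(model, device_ids=[local_rank])

    # ... build a DistributedSampler-based DataLoader and train as usual ...

    dist.destroy_process_group()

if __name__ == "__main__":
    main()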

automaticmixedprecision

This notebook compares the training speed and memory usage with and without automatic mixed precision (AMP).
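The core native-PyTorch AMP pattern looks roughly like this (model, data and optimizer are toy placeholders, and a CUDA device is assumed):

import torch
from torch import nn, optim

model = nn.Linear(10, 2).cuda()
optimizer = optim.Adam(model.parameters(), lr=1e-3)
loss_function = nn.CrossEntropyLoss()
scaler = torch.cuda.amp.GradScaler()

images = torch.randn(4, 10, device="cuda")
labels = torch.randint(0, 2, (4,), device="cuda")

optimizer.zero_grad()
with torch.cuda.amp.autocast():  # run the forward pass in mixed precision
    loss = loss_function(model(images), labels)
scaler.scale(loss).backward()    # scale the loss to avoid fp16 gradient underflow
scaler.step(optimizer)
scaler.update()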

datasettypeperformance

This notebook compares the performance of Dataset, CacheDataset and PersistentDataset. These classes differ in how data is stored (in memory or on disk), and at which moment transforms are applied.
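A minimal sketch of how the three classes are constructed; the in-memory samples and the single transform are illustrative only:

import numpy as np
from monai.data import CacheDataset, Dataset, PersistentDataset
from monai.transforms import Compose, ScaleIntensityd

# Synthetic in-memory samples so the sketch runs without downloading anything
data = [{"image": np.random.rand(64, 64).astype("float32")} for _ in range(10)]
transform = Compose([ScaleIntensityd(keys="image")])

plain_ds = Dataset(data=data, transform=transform)  # transforms run on every access
cache_ds = CacheDataset(data=data, transform=transform, cache_rate=1.0)  # deterministic transforms cached in RAM
persistent_ds = PersistentDataset(data=data, transform=transform, cache_dir="./persistent_cache")  # results cached on disk
print(plain_ds[0]["image"].shape)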

fasttrainingtutorial

This tutorial compares the training performance of a pure PyTorch program and an optimized program in MONAI on an NVIDIA GPU with the latest CUDA library. The optimization methods mainly include AMP, CacheDataset and Novograd.

multigputest

This notebook is a quick demo of running the Ignite trainer engine on CPU, GPU and multiple GPUs.

threadbuffer_performance

Demonstrates the use of the ThreadBuffer class, which generates data batches in a separate thread during training.
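A rough usage sketch, wrapping an ordinary DataLoader (the toy tensors stand in for real image batches):

import torch
from torch.utils.data import DataLoader, TensorDataset
from monai.data import ThreadBuffer

# Toy data source; in the notebook this is a MONAI DataLoader over real images
loader = DataLoader(TensorDataset(torch.randn(32, 1, 64, 64)), batch_size=8)

# Batches are prepared in a background thread while the main thread keeps training
buffered_loader = ThreadBuffer(src=loader)
for (batch,) in buffered_loader:
    print(batch.shape)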

transform_speed

Illustrates reading NIfTI files and tests the speed of different transforms on different devices.

modules

engines

Training and evaluation examples of 3D segmentation based on UNet3D and a synthetic dataset with MONAI workflows, which contain engines, event handlers, and post-transforms. Also included is a GAN training and evaluation example for a medical image generative adversarial network: an easy-to-run training script uses GanTrainer to train a 2D CT scan reconstruction network, and an evaluation script generates random samples from a trained network.

The examples are built with MONAI workflows and mainly contain trainer/evaluator, handlers, post_transforms, etc.

3dimagetransforms

This notebook demonstrates the transformations on volumetric images.

autoencoder_mednist

This tutorial uses the MedNIST hand CT scan dataset to demonstrate MONAI's autoencoder class. The autoencoder is used with an identity encode/decode (i.e., what you put in is what you should get back), and its usage for de-blurring and de-noising is also demonstrated.

dynunet_tutorial

This tutorial shows how to train 3D segmentation tasks on all the 10 decathlon datasets with the reimplementation of dynUNet in MONAI.

integrate3rdparty_transforms

This tutorial shows how to integrate third-party transforms into a MONAI program, mainly showing transforms from BatchGenerator, TorchIO, Rising and ITK.

layer wise learning rate

This notebook demonstrates how to select or filter out expected network layers and set customized learning rate values.
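The underlying mechanism is PyTorch optimizer parameter groups; the sketch below shows that mechanism with a toy model (the notebook itself may use MONAI helpers for the layer selection):

import torch
from torch import nn, optim

# Toy two-block network standing in for a real backbone + classification head
model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 2))

# Different learning rates per parameter group: small for the backbone, larger for the head
optimizer = optim.Adam([
    {"params": model[0].parameters(), "lr": 1e-4},
    {"params": model[2].parameters(), "lr": 1e-3},
])
for group in optimizer.param_groups:
    print(group["lr"])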

loadmedicalimagesl

This notebook introduces how to easily load different formats of medical images in MONAI and execute many additional operations.
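For instance, a single file can be read with LoadImage; the path below is a placeholder, and whether metadata is returned separately depends on the MONAI version and the image_only flag:

from monai.transforms import LoadImage

# Placeholder path; any format supported by the installed readers (NIfTI, DICOM, PNG, ...) works
loader = LoadImage(image_only=True)
image = loader("./data/example_volume.nii.gz")
print(image.shape)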

mednistGANtutorial

This notebook illustrates the use of MONAI for training a network to generate images from a random input tensor. A simple GAN is employed, with separate Generator and Discriminator networks.

mednistGANworkflow_dict

This notebook shows the GanTrainer, a MONAI workflow engine for modularized adversarial learning. Train a medical image reconstruction network using the MedNIST hand CT scan dataset. Dictionary version.

mednistGANworkflow_array

This notebook shows the GanTrainer, a MONAI workflow engine for modularized adversarial learning. Train a medical image reconstruction network using the MedNIST hand CT scan dataset. Array version.

models_ensemble

This tutorial shows how to leverage the EnsembleEvaluator, MeanEnsemble and VoteEnsemble modules in MONAI to set up an ensemble program.
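As a small illustration of the ensembling transforms, the sketch below averages fake per-model outputs; the shapes and values are made up:

import torch
from monai.transforms import MeanEnsemble

# Fake probability outputs from three models for a batch of 2 samples and 4 classes
preds = [torch.softmax(torch.randn(2, 4), dim=1) for _ in range(3)]

# MeanEnsemble averages the per-model outputs (optionally with per-model weights);
# VoteEnsemble instead performs majority voting on discretized predictions.
mean_pred = MeanEnsemble()(preds)
print(mean_pred.shape)  # torch.Size([2, 4])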

niftireadexample

Illustrates reading NIfTI files and iterating over image patches of the volumes loaded from them.

post_transforms

This notebook shows the usage of several post transforms based on the model output of the spleen segmentation task.
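For example, converting raw logits into a discrete label map can be sketched as follows, assuming a recent MONAI where post transforms operate on channel-first arrays without a batch dimension; the tensor is a dummy stand-in for real network output:

import torch
from monai.transforms import Activations, AsDiscrete

logits = torch.randn(2, 64, 64, 32)  # dummy 2-class, channel-first spleen logits

probabilities = Activations(softmax=True)(logits)      # logits -> per-class probabilities
segmentation = AsDiscrete(argmax=True)(probabilities)  # probabilities -> single-channel label map
print(segmentation.shape)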

public_datasets

This notebook shows how to quickly set up a training workflow based on MedNISTDataset and DecathlonDataset, and how to create a new dataset.
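For instance (root_dir is a placeholder, download=True fetches the data on first use, and the transform chain is deliberately minimal):

from monai.apps import DecathlonDataset, MedNISTDataset
from monai.transforms import Compose, LoadImaged, ScaleIntensityd

transform = Compose([LoadImaged(keys="image"), ScaleIntensityd(keys="image")])

# Downloads and caches MedNIST under ./workspace on first use
mednist_train = MedNISTDataset(root_dir="./workspace", section="training",
                               transform=transform, download=True)

# One of the ten Medical Segmentation Decathlon tasks
spleen_train = DecathlonDataset(root_dir="./workspace", task="Task09_Spleen",
                                section="training", transform=transform, download=True)
print(len(mednist_train), len(spleen_train))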

transformsdemo2d

This notebook demonstrates the image transformations on histology images using the GlaS Contest dataset.

varautoencoder_mednist

This tutorial uses the MedNIST scan (or alternatively the MNIST) dataset to demonstrate MONAI's variational autoencoder class.

interpretability

Tutorials in this folder demonstrate model visualisation and interpretability features of MONAI. Currently, it consists of class activation mapping and occlusion sensitivity for 3D classification model visualisations and analysis.
