
Commit 1c05d9e

Authored by dongyang0122, dongy, pre-commit-ci[bot], and heyufan1995
Update DiNTS Tutorials (#469)
* initialize dints tutorials Signed-off-by: dongy <dongy@nvidia.com>
* [pre-commit.ci] auto fixes from pre-commit.com hooks, for more information see https://pre-commit.ci
* update scripts Signed-off-by: dongy <dongy@nvidia.com>
* [pre-commit.ci] auto fixes from pre-commit.com hooks, for more information see https://pre-commit.ci
* update scripts Signed-off-by: dongy <dongy@nvidia.com>
* update scripts Signed-off-by: dongy <dongy@nvidia.com>
* Update README.md
* Update README.md
* update scripts Signed-off-by: dongy <dongy@nvidia.com>
* Change Dints interface
* [pre-commit.ci] auto fixes from pre-commit.com hooks, for more information see https://pre-commit.ci
* Modify scripts for new Dints interface
* update scripts Signed-off-by: dongy <dongy@nvidia.com>
* update scripts Signed-off-by: dongy <dongy@nvidia.com>
* update scripts Signed-off-by: dongy <dongy@nvidia.com>
* update scripts Signed-off-by: dongy <dongy@nvidia.com>
* Test push
* update scripts Signed-off-by: dongy <dongy@nvidia.com>
* update scripts Signed-off-by: dongy <dongy@nvidia.com>
* update scripts Signed-off-by: dongy <dongy@nvidia.com>
* update readme Signed-off-by: Dong Yang <dongy@dongy-mlt.client.nvidia.com>
* Add readme
* Update readme
* Change lr in search
* update readme Signed-off-by: Dong Yang <dongy@dongy-mlt.client.nvidia.com>
* Enable single GPU
* update readme Signed-off-by: Dong Yang <dongy@dongy-mlt.client.nvidia.com>
* Add visualization tutorial transform image (#448)
* [DLMED] add visualization tutorial Signed-off-by: Nic Ma <nma@nvidia.com>
* [DLMED] add image Signed-off-by: Nic Ma <nma@nvidia.com>
* [DLMED] update notebook Signed-off-by: Nic Ma <nma@nvidia.com>
* [DLMED] update notebook Signed-off-by: Nic Ma <nma@nvidia.com>
* [DLMED] add reference image Signed-off-by: Nic Ma <nma@nvidia.com>
* [DLMED] fix PEP error Signed-off-by: Nic Ma <nma@nvidia.com>
* Update spleen_segmentation_3d.ipynb (#455) The `pip install` statement is missing pytorch-ignite. Changing: `!python -c "import monai" || pip install -q "monai-weekly[gdown, nibabel, tqdm]"` to `!python -c "import monai" || pip install -q "monai-weekly[gdown, nibabel, tqdm, ignite]"`
* Figures added, pretrained weights link added, minor fixes (#456)
* Figures added, pretrained weights link added, minor fixes Signed-off-by: vnath <vnath@nvidia.com>
* [pre-commit.ci] auto fixes from pre-commit.com hooks, for more information see https://pre-commit.ci Co-authored-by: vnath <vnath@nvidia.com> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
* Add itkwidgets example in notebook (#454)
* [DLMED] add itkwidgets Signed-off-by: Nic Ma <nma@nvidia.com>
* [DLMED] add reference screenshot Signed-off-by: Nic Ma <nma@nvidia.com>
* [DLMED] update to PLS orientation Signed-off-by: Nic Ma <nma@nvidia.com>
* MIL example (#431)
* MIL example Signed-off-by: myron <amyronenko@nvidia.com>
* [pre-commit.ci] auto fixes from pre-commit.com hooks, for more information see https://pre-commit.ci
* mil tutorial update Signed-off-by: myron <amyronenko@nvidia.com>
* [pre-commit.ci] auto fixes from pre-commit.com hooks, for more information see https://pre-commit.ci
* mil tutorial update Signed-off-by: myron <amyronenko@nvidia.com>
* [pre-commit.ci] auto fixes from pre-commit.com hooks, for more information see https://pre-commit.ci
* small updates Signed-off-by: myron <amyronenko@nvidia.com>
* [pre-commit.ci] auto fixes from pre-commit.com hooks, for more information see https://pre-commit.ci
* updated images Signed-off-by: myron <amyronenko@nvidia.com>
* gdown for json Signed-off-by: myron <amyronenko@nvidia.com>
* [pre-commit.ci] auto fixes from pre-commit.com hooks, for more information see https://pre-commit.ci
* [pre-commit.ci] auto fixes from pre-commit.com hooks, for more information see https://pre-commit.ci
* Update README Signed-off-by: Behrooz <3968947+drbeh@users.noreply.github.com>
* Fix formatting and typos Signed-off-by: Behrooz <3968947+drbeh@users.noreply.github.com>
* small fixes Signed-off-by: myron <amyronenko@nvidia.com>
* stats Signed-off-by: myron <amyronenko@nvidia.com>
* pip install Signed-off-by: myron <amyronenko@nvidia.com>
* README fixes Signed-off-by: myron <amyronenko@nvidia.com> Co-authored-by: am <am> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com> Co-authored-by: Behrooz <3968947+drbeh@users.noreply.github.com> Co-authored-by: Nic Ma <nma@nvidia.com>
* 450 update AsDiscrete (#451)
* update asdiscrete Signed-off-by: Yiheng Wang <vennw@nvidia.com>
* update postprocessing figures Signed-off-by: Yiheng Wang <vennw@nvidia.com>
* fix version error of mutual info Signed-off-by: Yiheng Wang <vennw@nvidia.com>
* update to use include in torchin Signed-off-by: Yiheng Wang <vennw@nvidia.com>
* 459 update nvidia flare 2.0 example (#460)
* update nvidia flare 2.0 example Signed-off-by: Yiheng Wang <vennw@nvidia.com>
* [pre-commit.ci] auto fixes from pre-commit.com hooks, for more information see https://pre-commit.ci
* enhance code format Signed-off-by: Yiheng Wang <vennw@nvidia.com>
* update folder and readme Signed-off-by: Yiheng Wang <vennw@nvidia.com> Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
* Weights Link Updated (#465) Signed-off-by: vnath <vnath@nvidia.com> Co-authored-by: vnath <vnath@nvidia.com>
* [pre-commit.ci] auto fixes from pre-commit.com hooks, for more information see https://pre-commit.ci
* fixes readme typos Signed-off-by: Wenqi Li <wenqil@nvidia.com>
* update readmes Signed-off-by: Wenqi Li <wenqil@nvidia.com>
* update readme Signed-off-by: Wenqi Li <wenqil@nvidia.com>
* qa commit Signed-off-by: Wenqi Li <wenqil@nvidia.com>
* link Signed-off-by: Wenqi Li <wenqil@nvidia.com>
* Add plot arch_code utils
* Fix bugs in search and update readme
* Fix combination weights bug
* Small typo update
* Fix minor bug in train_dints
* update scripts Signed-off-by: dongy <dongy@nvidia.com>
* [pre-commit.ci] auto fixes from pre-commit.com hooks, for more information see https://pre-commit.ci
* update scripts Signed-off-by: Dong Yang <dongy@dongy-mlt.client.nvidia.com>
* update scripts Signed-off-by: Dong Yang <dongy@dongy-mlt.client.nvidia.com>
* update scripts Signed-off-by: dongy <dongy@nvidia.com>
* update scripts Signed-off-by: Dong Yang <dongy@dongy-mlt.client.nvidia.com>
* update scripts Signed-off-by: dongy <dongy@nvidia.com>
* Update Readme
* [pre-commit.ci] auto fixes from pre-commit.com hooks, for more information see https://pre-commit.ci

Co-authored-by: dongy <dongy@nvidia.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: heyufan1995 <heyufan1995@gmail.com>
1 parent ec78d2f commit 1c05d9e

File tree

9 files changed (+248 −96 lines)

automl/DiNTS/Figures/search_0.2.png (64.3 KB)

automl/DiNTS/Figures/search_0.8.png (77 KB)

automl/DiNTS/README.md

Lines changed: 41 additions & 11 deletions
@@ -1,4 +1,4 @@
-# Examples of DiNTS: Differentiable neural network topology search
+# Examples of DiNTS: Differentiable Neural Network Topology Search
 
 In this tutorial, we present a novel neural architecture search algorithm for 3D medical image segmentation. The datasets used in this tutorial are Task07 Pancreas (CT images) and Task09 Spleen (CT images) from [Medical Segmentation Decathlon](http://medicaldecathlon.com/). The implementation is based on:
 
@@ -7,33 +7,47 @@ Yufan He, Dong Yang, Holger Roth, Can Zhao, Daguang Xu: "[DiNTS: Differentiable
 ![0.8](./Figures/arch_ram-cost-0.8.png)
 ![space](./Figures/search_space.png)
 
-## Requirements
-The script is tested with:
-- `Ubuntu 20.04` and `CUDA 11`
-- The searching and training stage requires at least two 16GB GPUs.
 
 ## Dependencies and installation
-### Download and install Nvidia PyTorch Docker
+The script is tested with `Ubuntu 20.04` and `CUDA 11`.
+
+You can use the NVIDIA Docker image or a conda environment to install the dependencies.
+- ### Using the Docker image
+1. #### Download and install the NVIDIA PyTorch Docker image
 ```bash
 docker pull nvcr.io/nvidia/pytorch:21.10-py3
 ```
-### Download the repository
+2. #### Download the repository
 ```bash
 git clone https://github.com/Project-MONAI/tutorials.git
 ```
-### Run into Docker
+3. #### Run the Docker container
 ```
 sudo docker run -it --gpus all --pid=host --shm-size 16G -v /location/to/tutorials/automl/DiNTS/:/workspace/DiNTS/ nvcr.io/nvidia/pytorch:21.10-py3
 ```
-### Install MONAI and dependencies
+4. #### Install the required packages inside the container
+```bash
+bash install.sh
+```
+
+- ### Using conda
+1. #### Install PyTorch >= 1.6
+```bash
+conda install pytorch torchvision torchaudio cudatoolkit=11.3 -c pytorch
+```
+2. #### Install MONAI and dependencies
 ```bash
 bash install.sh
 ```
+- ### Install [Graphviz](https://graphviz.org/download/) for visualization (needed by decode_plot.py)
 
 ## Data
 [Spleen CT dataset](https://drive.google.com/drive/folders/1HqEgzS8BV2c7xYNrZdEAnrHk7osJJ--2) and [Pancreas MRI dataset](https://drive.google.com/drive/folders/1HqEgzS8BV2c7xYNrZdEAnrHk7osJJ--2)
-from [Medical Segmentation Decathlon](http://medicaldecathlon.com/) is used. You can manually download it and save it to args.root. Otherwise, the script will automatic
-download the dataset.
+from [Medical Segmentation Decathlon](http://medicaldecathlon.com/) are used for this tutorial. You can manually download them and save them to args.root, or use the script `download_msd_datasets.py` to download any of the 10 MSD segmentation tasks.
+```bash
+python download_msd_datasets.py --msd_task "Task07_Pancreas" \
+    --root "/workspace/data_msd"
+```
 
 ## Examples
 The tutorial contains two stages: searching stage and training stage. An architecture is searched and saved into a `.pth` file using `search_dints.py`.
@@ -53,6 +67,10 @@ python train_dints.py -h
 ```
 - Change ``NUM_GPUS_PER_NODE`` to your number of GPUs.
 - Run `bash search_dints.sh`
+- Call the function in `decode_plot.py` to visualize the searched model as a vector image (Graphviz needs to be installed).
+The searched architectures with RAM cost 0.2 and 0.8 are shown below:
+![0.2 search](./Figures/search_0.2.png)
+![0.8 search](./Figures/search_0.8.png)
 
 ### Training
 - Add the following script to the commands of running into docker (Optional)
@@ -69,6 +87,18 @@ Training loss and validation metric curves are shown as follows. The experiments
 
 ![validation_metric](./Figures/validation_metric.png)
 
+## Citation
+If you use this code in your work, please cite:
+```
+@inproceedings{he2021dints,
+  title={DiNTS: Differentiable Neural Network Topology Search for 3D Medical Image Segmentation},
+  author={He, Yufan and Yang, Dong and Roth, Holger and Zhao, Can and Xu, Daguang},
+  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
+  pages={5841--5850},
+  year={2021}
+}
+```
+
 ## Questions and bugs
 
 - For questions relating to the use of MONAI, please use our [Discussions tab](https://github.com/Project-MONAI/MONAI/discussions) on the main repository of MONAI.
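
Taken together, the README changes describe a download → search → visualize → train workflow. The Python sketch below ties the first three steps together for reference; it is not part of the commit, it only wraps commands shown above, and the checkpoint path handed to `decode_plot.py` is a placeholder for whatever `.pth` file the search stage actually writes.

```python
# Rough workflow driver based on the README above (paths and filenames are placeholders).
import subprocess

# 1. Download one MSD task into the data root.
subprocess.run(
    ["python", "download_msd_datasets.py",
     "--msd_task", "Task07_Pancreas", "--root", "/workspace/data_msd"],
    check=True,
)

# 2. Run the architecture search (edit NUM_GPUS_PER_NODE inside search_dints.sh first).
subprocess.run(["bash", "search_dints.sh"], check=True)

# 3. Visualize the searched architecture (requires Graphviz).
subprocess.run(
    ["python", "decode_plot.py",
     "--checkpoint", "./searched_arch.pth",   # placeholder: the .pth saved by the search stage
     "--filename", "searched_arch", "--directory", "./Figures"],
    check=True,
)
```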

automl/DiNTS/decode_plot.py

Lines changed: 93 additions & 0 deletions
```python
# Copyright 2020 - 2021 MONAI Consortium
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#     http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import argparse

import torch
from graphviz import Digraph

parser = argparse.ArgumentParser(description="plot the searched DiNTS architecture")
parser.add_argument(
    "--checkpoint",
    type=str,
    default=None,
    help="checkpoint full path",
)
parser.add_argument(
    "--directory",
    type=str,
    default="./",
    help="directory to save",
)
parser.add_argument(
    "--filename",
    type=str,
    default="graph",
    help="filename to save",
)


def plot_graph(
    codepath,
    filename="graph",
    directory="./",
    code2in=[0, 1, 0, 1, 2, 1, 2, 3, 2, 3],
    code2out=[0, 0, 1, 1, 1, 2, 2, 2, 3, 3],
):
    """Plot the final searched model.

    Args:
        codepath: path to the saved .pth file generated by the searching script;
            it must contain the decoded ``arch_code_a`` and ``arch_code_c`` entries.
        filename: filename to save the graph.
        directory: directory to save the graph.
        code2in, code2out: edge-to-node lookup tables, see monai.networks.nets.dints.

    Returns:
        graphviz graph.
    """
    code = torch.load(codepath)
    arch_code_a = code["arch_code_a"]  # which candidate edges are activated
    arch_code_c = code["arch_code_c"]  # which cell operation is used on each activated edge
    ga = Digraph("G", filename=filename, engine="neato")
    depth = (len(code2in) + 2) // 3

    # build the initial (input) block
    inputs = ["(in," + str(d) + ")" for d in range(depth)]

    with ga.subgraph(name="cluster_all") as g:
        with g.subgraph(name="cluster_init") as c:
            for idx, node in enumerate(inputs):
                c.node(node, pos="0," + str(depth - idx) + "!")
        for blk_idx in range(arch_code_a.shape[0]):
            with g.subgraph(name="cluster" + str(blk_idx)) as c:
                outputs = [str((blk_idx, d)) for d in range(depth)]
                for idx, node in enumerate(outputs):
                    c.node(node, pos=str(2 + 2 * blk_idx) + "," + str(depth - idx) + "!")
                for res_idx, activation in enumerate(arch_code_a[blk_idx]):
                    if activation:
                        c.edge(
                            inputs[code2in[res_idx]],
                            outputs[code2out[res_idx]],
                            label=str(arch_code_c[blk_idx][res_idx]),
                        )
            inputs = outputs
    ga.render(filename=filename, directory=directory, cleanup=True, format="png")
    return ga


if __name__ == "__main__":
    args = parser.parse_args()
    plot_graph(
        codepath=args.checkpoint,
        filename=args.filename,
        directory=args.directory,
    )
```
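
The script can also be driven from Python rather than the command line. The snippet below is a minimal usage sketch; the checkpoint path and output names are placeholders, not names fixed by the tutorial.

```python
# Hypothetical usage of plot_graph(); adjust the paths to your own search output.
from decode_plot import plot_graph

graph = plot_graph(
    codepath="./searched_arch.pth",  # .pth produced by the search stage (assumed name)
    filename="searched_arch",        # output basename
    directory="./Figures",           # where the rendered .png is written
)
print(graph.source)  # inspect the generated Graphviz DOT source
```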

automl/DiNTS/download_msd_datasets.py

Lines changed: 40 additions & 0 deletions
```python
# Copyright 2020 - 2021 MONAI Consortium
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#     http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import argparse
import os

from monai.apps import download_and_extract


def main():
    parser = argparse.ArgumentParser(description="download MSD datasets")
    parser.add_argument(
        "--msd_task",
        action="store",
        default="Task07_Pancreas",
        help="MSD task name, e.g. Task07_Pancreas",
    )
    parser.add_argument(
        "--root",
        action="store",
        default="./data_msd",
        help="root directory to store the downloaded data",
    )
    args = parser.parse_args()

    resource = "https://msd-for-monai.s3-us-west-2.amazonaws.com/" + args.msd_task + ".tar"
    compressed_file = os.path.join(args.root, args.msd_task + ".tar")
    # skip the download if the root directory already exists
    if not os.path.exists(args.root):
        download_and_extract(resource, compressed_file, args.root)


if __name__ == "__main__":
    main()
```
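
The same download can be scripted without argparse. The sketch below mirrors the logic above for a different MSD task; the task name and root directory are only examples, and, like the script, it skips the download when the root folder already exists.

```python
# Sketch: download and extract one MSD task directly with MONAI utilities.
import os

from monai.apps import download_and_extract

msd_task = "Task09_Spleen"  # any of the 10 MSD task names
root = "./data_msd"         # example destination directory

resource = "https://msd-for-monai.s3-us-west-2.amazonaws.com/" + msd_task + ".tar"
compressed_file = os.path.join(root, msd_task + ".tar")
if not os.path.exists(root):
    download_and_extract(resource, compressed_file, root)
```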
