4 | 4 | "cell_type": "markdown",
5 | 5 | "metadata": {},
6 | 6 | "source": [
7 |   | - "# MONAI Auto3Dseg Reference Python APIs\n",
  | 7 | + "# MONAI Auto3DSeg Reference Python APIs\n",
8 | 8 | "\n",
9 |   | - "In this notebook, we will break down the Auto3Dseg by the modules in the pipeline and introduce the API calls in Python and CLI commands. Particularly, if you have used the AutoRunner class, we will map the AutoRunner commands and configurations to each of the Auto3Dseg module APIs\n",
  | 9 | + "In this notebook, we will break down Auto3DSeg into the modules of its pipeline and introduce the corresponding Python API calls and CLI commands. In particular, if you have used the `AutoRunner` class, we will map the `AutoRunner` commands and configurations to each of the Auto3DSeg module APIs.\n",
10 | 10 | "\n",
11 | 11 | "\n",
12 | 12 | "\n",

150 | 150 | "runner.run() \n",
151 | 151 | "```\n",
152 | 152 | "\n",
153 |     | - "The two lines cover the typical settings in Auto3Dseg and now we are going through the internal APIs calls inside these two lines\n",
    | 153 | + "These two lines cover the typical settings in Auto3DSeg; next, we walk through the internal API calls behind them.\n",
154 | 154 | "\n",
155 | 155 | "### 2.1 Data Analysis\n",
156 | 156 | "\n",

186 | 186 | ],
187 | 187 | "source": [
188 | 188 | "datastats_file = os.path.join(work_dir, 'data_stats.yaml')\n",
189 |     | - "analyser = DataAnalyzer(datalist_file, dataroot, output_path=datastats_file)\n",
    | 189 | + "analyser = DataAnalyzer(datalist_file, dataroot, output_path=datastats_file, device=\"cpu\")\n",
190 | 190 | "datastat = analyser.get_all_case_stats()"
191 | 191 | ]
192 | 192 | },

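The `DataAnalyzer` above writes its report to `data_stats.yaml`, which downstream modules (and users) read back as plain YAML. The sketch below shows that round trip with a hypothetical stand-in file: the `stats_summary`/`image_stats` key names follow the Auto3DSeg report layout, but the numeric values here are placeholders, not real dataset statistics.

```python
import yaml  # pyyaml; MONAI's DataAnalyzer writes its report as YAML

# Hypothetical stand-in for the data_stats.yaml produced by DataAnalyzer.
# The key names mirror the Auto3DSeg report; the values are placeholders.
sample_stats = {
    "stats_summary": {
        "image_stats": {"spacing": {"median": [1.0, 1.0, 1.0]}},
        "n_cases": 2,
    }
}
with open("data_stats_demo.yaml", "w") as f:
    yaml.safe_dump(sample_stats, f)

# Load the report back and pull out a summary field, as one would with
# the real data_stats.yaml in work_dir.
with open("data_stats_demo.yaml") as f:
    stats = yaml.safe_load(f)
print(stats["stats_summary"]["image_stats"]["spacing"]["median"])
```

Inspecting the summary section this way is a quick sanity check on spacing and case counts before algorithm generation starts.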
297 | 297 | "source": [
298 | 298 | "### 2.2.1 Getting and Saving the history to hard drive\n",
299 | 299 | "\n",
300 |     | - "If the users continue to train the algorithms on local system, The history of the algorithm generation can be fetched via `get_history` method of the `BundleGen` object. There also are scenarios that users need to stop the Python process after the `algo_gen`. For example, the users may need to transfer the files to a remote cluster to start the training. `Auto3Dseg` offers a utility function `export_bundle_algo_history` to dump the history to hard drive and recall it by `import_bundle_algo_history`. \n",
    | 300 | + "If the users continue to train the algorithms on the local system, the history of algorithm generation can be fetched via the `get_history` method of the `BundleGen` object. There are also scenarios where users need to stop the Python process after `algo_gen`, for example, to transfer the files to a remote cluster before starting training. `Auto3DSeg` offers a utility function `export_bundle_algo_history` to dump the history to the hard drive, and `import_bundle_algo_history` to recall it.\n",
301 | 301 | "\n",
302 | 302 | "If the files are copied to a remote system, please make sure the algorithm templates are also copied there. Some functions require the path to instantiate the algorithm class properly."
303 | 303 | ]

397 | 397 | "source": [
398 | 398 | "#### 2.3.3 Train with Hyper-parameter Optimization (HPO)\n",
399 | 399 | "\n",
400 |     | - "Another method to handle the neural network training is to perform HPO (e.g. training & searching). This is made possible by NNI or Optuna packages which are installed in the MONAI development environment. `AutoRunner` uses NNI as backend via the `NNIGen`, but Optuna HPO can also be chosen via the `OptunaGen` method in the Auto3Dseg pipeline\n",
    | 400 | + "Another way to handle neural network training is to perform HPO (i.e. training & searching). This is made possible by the NNI and Optuna packages, which are installed in the MONAI development environment. `AutoRunner` uses NNI as the backend via `NNIGen`, but Optuna HPO can also be chosen via `OptunaGen` in the Auto3DSeg pipeline.\n",
401 | 401 | "\n",
402 | 402 | "To start an NNI experiment, the users need to prepare a config file `nni_config.yaml` and run the command in bash:\n",
403 | 403 | "\n",
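For orientation, a `nni_config.yaml` for a local NNI experiment typically looks like the fragment below. This is a hedged sketch following the NNI v2 experiment config schema: the experiment name, trial numbers, and especially the `trialCommand` are placeholders you would replace with the trial script that `NNIGen` prepares for your algorithm.

```yaml
experimentName: auto3dseg_hpo          # any label you like (placeholder)
searchSpaceFile: search_space.json     # hyper-parameters to search over
trialCommand: python run_trial.py      # placeholder; point at the NNIGen
                                       # trial entry for your algorithm
trialGpuNumber: 1
trialConcurrency: 1
maxTrialNumber: 10
tuner:
  name: TPE
  classArgs:
    optimize_mode: maximize
trainingService:
  platform: local
  useActiveGpu: true
```

The experiment is then launched from bash with NNI's CLI, e.g. `nnictl create --config nni_config.yaml`, and monitored in the NNI web UI.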
438 | 438 | "source": [
439 | 439 | "### 2.4 Ensemble\n",
440 | 440 | "\n",
441 |     | - "Finally, after the neural networks are trained, `AutoRunner` will apply the ensemble methods in Auto3Dseg to improve the overall performance. \n",
    | 441 | + "Finally, after the neural networks are trained, `AutoRunner` will apply the ensemble methods in Auto3DSeg to improve the overall performance.\n",
442 | 442 | "\n",
443 | 443 | "Here we use the utility function `import_bundle_algo_history` to load the trained `Algo` objects into the ensemble. With the history loaded, we build an ensemble method and use it to perform inference on all testing data. By default, `AutoRunner` uses `AlgoEnsembleBestN` to find the best N models and ensembles the prediction maps by taking the mean of the feature maps.\n",
444 | 444 | "\n",
