The runner executable is available as the `pytorch3d_implicitron_runner` shell command.
See [Running](#running) section below for examples of training and evaluation commands.
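A quick way to check that the installed package exposed this entry point is to look it up on your `PATH` (a sanity check only, not an official installation step):

```shell
# should print the location of the installed entry point if installation succeeded
which pytorch3d_implicitron_runner
```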
## [Option 2] Supporting custom implementations
To plug in custom implementations, for example of renderer or implicit-function protocols, you need to create your own runner script and import the plug-in implementations there.
You are still encouraged to implement custom plugins as described above where possible, as it makes reusing the code easier.
The executable is located in `pytorch3d/projects/implicitron_trainer`.
> **_NOTE:_** Both the `pytorch3d_implicitron_runner` and `pytorch3d_implicitron_visualizer`
> executables (mentioned below) are not available when using a local clone.
> Instead, users should use the python scripts `experiment.py` and `visualize_reconstruction.py` (see the [Running](#running) section below).
# Running
This section assumes that you use the executable provided by the installed package
(Option 1 / Option 2 in [Installation](#installation) above),
i.e. `pytorch3d_implicitron_runner` and `pytorch3d_implicitron_visualizer` are available.
> **_NOTE:_** If the executables are not available (e.g. when using a local clone - Option 3 in [Installation](#installation)),
> users should directly use the `experiment.py` and `visualize_reconstruction.py` python scripts,
> which correspond to the executables as follows:
>
> - `pytorch3d_implicitron_runner` corresponds to `<pytorch3d_root>/projects/implicitron_trainer/experiment.py`
> - `pytorch3d_implicitron_visualizer` corresponds to `<pytorch3d_root>/projects/implicitron_trainer/visualize_reconstruction.py`

For instance, in order to directly execute training with the python script, users can call:

```shell
cd <pytorch3d_root>/projects/
python -m implicitron_trainer.experiment <args>
```
If you have a custom `experiment.py` or `visualize_reconstruction.py` script
(as in Option 2 [above](#installation)), replace the executable with the path to your script.
## Training
Other configuration parameters can be overridden in the same way.
See [Configuration system](#configuration-system) section for more information on this.
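As an illustration, overrides are passed as dotted `key=value` pairs appended to the command line. The following is a sketch that assumes the runner behaves as a standard Hydra application; `<CONFIG_PATH>`, `<CONFIG_NAME>` and the second override are placeholders, while `seed` is a real top-level field described in the [Configuration system](#configuration-system) section:

```shell
# sketch: dotted key=value overrides appended to the training command
# `seed` is a top-level field; the second override is only a placeholder pattern
pytorch3d_implicitron_runner --config-path <CONFIG_PATH> --config-name <CONFIG_NAME> \
    seed=1 \
    <some_node>_args.<some_parameter>=<value>
```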
### Visdom logging
Note that the training script logs its progress to Visdom. Make sure to start a Visdom server before the training commences:

```shell
python -m visdom.server
```
> If a Visdom server is not started, the console will get flooded with `requests.exceptions.ConnectionError` errors signalling that a Visdom server is not available. Note that these errors <b>will NOT interrupt</b> the program and the training will still continue without issues.

The argument `n_eval_cameras` sets the number of rendering viewpoints sampled on a trajectory, which defaults to a circular fly-around;
`render_size` sets the size of a render passed to the model, which can be resized to `video_size` before writing.
Rendered videos of images, masks, and depth maps will be saved to `<CHECKPOINT_DIR>/video`.
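Putting these together, a visualization call might look roughly like the sketch below; `exp_dir` as the name of the checkpoint-directory argument is an assumption here, and the values are only examples:

```shell
# sketch: sample 40 rendering viewpoints and write 256x256 videos from a trained checkpoint
pytorch3d_implicitron_visualizer exp_dir=<CHECKPOINT_DIR> \
    n_eval_cameras=40 render_size="[64,64]" video_size="[256,256]"
```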
# Configuration system
Configurables can form hierarchies.
For example, `GenericModel` has a field `raysampler: RaySampler`, which is also Configurable.
In the config, inner parameters can be propagated using the `_args` postfix, e.g. to change `raysampler.n_pts_per_ray_training` (the number of sampled points per ray), the node `raysampler_args.n_pts_per_ray_training` should be specified.
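Combined with the dispatch nodes described in the next subsection, such a nested parameter can also be overridden from the command line. This is only a sketch; the exact chain of `_args` node names depends on your config and Implicitron version:

```shell
# sketch: overriding the nested raysampler parameter through the chained *_args nodes
pytorch3d_implicitron_runner --config-path <CONFIG_PATH> --config-name <CONFIG_NAME> \
    model_factory_ImplicitronModelFactory_args.model_GenericModel_args.raysampler_args.n_pts_per_ray_training=128
```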
### Top-level configuration class: `Experiment`
<b>The root of the hierarchy is defined by `Experiment` Configurable in `<pytorch3d_root>/projects/implicitron_trainer/experiment.py`.</b>
It has top-level fields like `seed`, which seeds the random number generator.
Additionally, it has non-leaf nodes like `model_factory_ImplicitronModelFactory_args.model_GenericModel_args`, which dispatches the config parameters to `GenericModel`.
Thus, changing the model parameters may be achieved in two ways: either by editing the config file, e.g.