
Commit fdaaa29

bottler authored and facebook-github-bot committed

remove stray "generic_model_args" references

Summary: generic_model_args no longer exists. Update some references to it, mostly in docs. This fixes the testing of all the yaml files in test_forward_pass.

Reviewed By: shapovalov

Differential Revision: D38789202

fbshipit-source-id: f11417efe772d7f86368b3598aa66c52b1309dbf
1 parent d42e0d3 commit fdaaa29
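The rename this commit chases can be pictured with a minimal sketch: plain dicts stand in for the OmegaConf config tree, and `get_node` is a hypothetical helper, not part of PyTorch3D.

```python
from functools import reduce

# After this commit, GenericModel's parameters live under
# model_factory_ImplicitronModelFactory_args.model_GenericModel_args;
# the old top-level generic_model_args node no longer exists.
config = {
    "model_factory_ImplicitronModelFactory_args": {
        "model_GenericModel_args": {
            "render_image_width": 800,
        },
    },
}

def get_node(cfg, dotted_path):
    """Walk a dotted path such as 'a.b.c' through nested dicts."""
    return reduce(lambda node, key: node[key], dotted_path.split("."), cfg)

model_args = get_node(
    config, "model_factory_ImplicitronModelFactory_args.model_GenericModel_args"
)
print(model_args["render_image_width"])  # prints 800
```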

File tree

4 files changed

+44 -28 lines changed

projects/implicitron_trainer/README.md

Lines changed: 32 additions & 20 deletions
@@ -67,7 +67,9 @@ To run training, pass a yaml config file, followed by a list of overridden argum
 For example, to train NeRF on the first skateboard sequence from CO3D dataset, you can run:
 ```shell
 dataset_args=data_source_args.dataset_map_provider_JsonIndexDatasetMapProvider_args
-pytorch3d_implicitron_runner --config-path ./configs/ --config-name repro_singleseq_nerf $dataset_args.dataset_root=<DATASET_ROOT> $dataset_args.category='skateboard' $dataset_args.test_restrict_sequence_id=0 test_when_finished=True exp_dir=<CHECKPOINT_DIR>
+pytorch3d_implicitron_runner --config-path ./configs/ --config-name repro_singleseq_nerf \
+    $dataset_args.dataset_root=<DATASET_ROOT> $dataset_args.category='skateboard' \
+    $dataset_args.test_restrict_sequence_id=0 test_when_finished=True exp_dir=<CHECKPOINT_DIR>
 ```
 
 Here, `--config-path` points to the config path relative to `pytorch3d_implicitron_runner` location;
@@ -86,7 +88,9 @@ To run evaluation on the latest checkpoint after (or during) training, simply ad
 E.g. for executing the evaluation on the NeRF skateboard sequence, you can run:
 ```shell
 dataset_args=data_source_args.dataset_map_provider_JsonIndexDatasetMapProvider_args
-pytorch3d_implicitron_runner --config-path ./configs/ --config-name repro_singleseq_nerf $dataset_args.dataset_root=<CO3D_DATASET_ROOT> $dataset_args.category='skateboard' $dataset_args.test_restrict_sequence_id=0 exp_dir=<CHECKPOINT_DIR> eval_only=True
+pytorch3d_implicitron_runner --config-path ./configs/ --config-name repro_singleseq_nerf \
+    $dataset_args.dataset_root=<CO3D_DATASET_ROOT> $dataset_args.category='skateboard' \
+    $dataset_args.test_restrict_sequence_id=0 exp_dir=<CHECKPOINT_DIR> eval_only=True
 ```
 Evaluation prints the metrics to `stdout` and dumps them to a json file in `exp_dir`.
 
@@ -101,7 +105,8 @@ conda install ffmpeg
 
 Here is an example of calling the script:
 ```shell
-projects/implicitron_trainer/visualize_reconstruction.py exp_dir=<CHECKPOINT_DIR> visdom_show_preds=True n_eval_cameras=40 render_size="[64,64]" video_size="[256,256]"
+projects/implicitron_trainer/visualize_reconstruction.py exp_dir=<CHECKPOINT_DIR> \
+    visdom_show_preds=True n_eval_cameras=40 render_size="[64,64]" video_size="[256,256]"
 ```
 
 The argument `n_eval_cameras` sets the number of renderring viewpoints sampled on a trajectory, which defaults to a circular fly-around;
@@ -124,18 +129,21 @@ In the config, inner parameters can be propagated using `_args` postfix, e.g. to
 
 The root of the hierarchy is defined by `ExperimentConfig` dataclass.
 It has top-level fields like `eval_only` which was used above for running evaluation by adding a CLI override.
-Additionally, it has non-leaf nodes like `generic_model_args`, which dispatches the config parameters to `GenericModel`. Thus, changing the model parameters may be achieved in two ways: either by editing the config file, e.g.
+Additionally, it has non-leaf nodes like `model_factory_ImplicitronModelFactory_args.model_GenericModel_args`, which dispatches the config parameters to `GenericModel`.
+Thus, changing the model parameters may be achieved in two ways: either by editing the config file, e.g.
 ```yaml
-generic_model_args:
-    render_image_width: 800
-    raysampler_args:
-        n_pts_per_ray_training: 128
+model_factory_ImplicitronModelFactory_args:
+    model_GenericModel_args:
+        render_image_width: 800
+        raysampler_args:
+            n_pts_per_ray_training: 128
 ```
 
 or, equivalently, by adding the following to `pytorch3d_implicitron_runner` arguments:
 
 ```shell
-generic_model_args.render_image_width=800 generic_model_args.raysampler_args.n_pts_per_ray_training=128
+model_args=model_factory_ImplicitronModelFactory_args.model_GenericModel_args
+$model_args.render_image_width=800 $model_args.raysampler_args.n_pts_per_ray_training=128
 ```
 
 See the documentation in `pytorch3d/implicitron/tools/config.py` for more details.
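The `$model_args` shorthand in that diff is plain shell variable expansion: the prefix is spliced into each dotted override key before Hydra parses it. A rough Python sketch of the resulting override strings (key names taken from the diff; the expansion logic itself is illustrative, not part of the runner):

```python
# The shell variable is just a prefix spliced into each dotted override key.
MODEL_ARGS = "model_factory_ImplicitronModelFactory_args.model_GenericModel_args"

overrides = [
    f"{MODEL_ARGS}.render_image_width=800",
    f"{MODEL_ARGS}.raysampler_args.n_pts_per_ray_training=128",
]

# Each override splits into a dotted key path and a value string.
for override in overrides:
    key, _, value = override.partition("=")
    print(key.split("."), "=", value)
```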
@@ -149,11 +157,12 @@ This means that other Configurables can refer to them using the base type, while
 In that case, `_args` node name has to include the implementation type.
 More specifically, to change renderer settings, the config will look like this:
 ```yaml
-generic_model_args:
-    renderer_class_type: LSTMRenderer
-    renderer_LSTMRenderer_args:
-        num_raymarch_steps: 10
-        hidden_size: 16
+model_factory_ImplicitronModelFactory_args:
+    model_GenericModel_args:
+        renderer_class_type: LSTMRenderer
+        renderer_LSTMRenderer_args:
+            num_raymarch_steps: 10
+            hidden_size: 16
 ```
 
 See the documentation in `pytorch3d/implicitron/tools/config.py` for more details on the configuration system.
@@ -188,15 +197,17 @@ class XRayRenderer(BaseRenderer, torch.nn.Module):
 ```
 
 Please note `@registry.register` decorator that registers the plug-in as an implementation of `Renderer`.
-IMPORTANT: In order for it to run, the class (or its enclosing module) has to be imported in your launch script. Additionally, this has to be done before parsing the root configuration class `ExperimentConfig`.
+IMPORTANT: In order for it to run, the class (or its enclosing module) has to be imported in your launch script.
+Additionally, this has to be done before parsing the root configuration class `ExperimentConfig`.
 Simply add `import .x_ray_renderer` in the beginning of `experiment.py`.
 
 After that, you should be able to change the config with:
 ```yaml
-generic_model_args:
-    renderer_class_type: XRayRenderer
-    renderer_XRayRenderer_args:
-        n_pts_per_ray: 128
+model_factory_ImplicitronModelFactory_args:
+    model_GenericModel_args:
+        renderer_class_type: XRayRenderer
+        renderer_XRayRenderer_args:
+            n_pts_per_ray: 128
 ```
 
 to replace the implementation and potentially override the parameters.
@@ -252,7 +263,8 @@ model_GenericModel_args: GenericModel
     ╘== ReductionFeatureAggregator
 ```
 
-Please look at the annotations of the respective classes or functions for the lists of hyperparameters. `tests/experiment.yaml` shows every possible option if you have no user-defined classes.
+Please look at the annotations of the respective classes or functions for the lists of hyperparameters.
+`tests/experiment.yaml` shows every possible option if you have no user-defined classes.
 
 # Reproducing CO3D experiments
 

projects/implicitron_trainer/visualize_reconstruction.py

Lines changed: 4 additions & 2 deletions
@@ -333,8 +333,10 @@ def export_scenes(
     )
     dataset_args.test_on_train = False
     # Set the rendering image size
-    config.generic_model_args.render_image_width = render_size[0]
-    config.generic_model_args.render_image_height = render_size[1]
+    model_factory_args = config.model_factory_ImplicitronModelFactory_args
+    model_args = model_factory_args.model_GenericModel_args
+    model_args.render_image_width = render_size[0]
+    model_args.render_image_height = render_size[1]
     if restrict_sequence_name is not None:
         dataset_args.restrict_sequence_name = restrict_sequence_name
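The change above only introduces local aliases for the two nested config nodes before writing the render size. With plain dicts standing in for the config object (a sketch; `set_render_size` is a hypothetical name, not a function in the repo), the same logic reads:

```python
def set_render_size(config, render_size):
    # Alias the nested nodes, then set width/height, mirroring the diff above.
    model_factory_args = config["model_factory_ImplicitronModelFactory_args"]
    model_args = model_factory_args["model_GenericModel_args"]
    model_args["render_image_width"] = render_size[0]
    model_args["render_image_height"] = render_size[1]

config = {"model_factory_ImplicitronModelFactory_args": {"model_GenericModel_args": {}}}
set_render_size(config, (64, 64))
print(config["model_factory_ImplicitronModelFactory_args"]["model_GenericModel_args"])
```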

pytorch3d/implicitron/models/generic_model.py

Lines changed: 3 additions & 2 deletions
@@ -131,8 +131,9 @@ class GenericModel(ImplicitronModelBase):  # pyre-ignore: 13
     for more details on how to create and register a custom component.
 
     In the config .yaml files for experiments, the parameters below are
-    contained in the `generic_model_args` node. As GenericModel
-    derives from Configurable, the input arguments are
+    contained in the
+    `model_factory_ImplicitronModelFactory_args.model_GenericModel_args`
+    node. As GenericModel derives from ReplaceableBase, the input arguments are
     parsed by the run_auto_creation function to initialize the
     necessary member modules. Please see implicitron_trainer/README.md
     for more details on this process.

tests/implicitron/test_forward_pass.py

Lines changed: 5 additions & 4 deletions
@@ -207,10 +207,11 @@ def _load_model_config_from_yaml(config_path, strict=True) -> DictConfig:
 
 def _load_model_config_from_yaml_rec(cfg: DictConfig, config_path: str) -> DictConfig:
     cfg_loaded = OmegaConf.load(config_path)
-    if "generic_model_args" in cfg_loaded:
-        cfg_model_loaded = cfg_loaded.generic_model_args
-    else:
-        cfg_model_loaded = None
+    cfg_model_loaded = None
+    if "model_factory_ImplicitronModelFactory_args" in cfg_loaded:
+        factory_args = cfg_loaded.model_factory_ImplicitronModelFactory_args
+        if "model_GenericModel_args" in factory_args:
+            cfg_model_loaded = factory_args.model_GenericModel_args
     defaults = cfg_loaded.pop("defaults", None)
     if defaults is not None:
         for default_name in defaults:
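The new lookup is deliberately defensive: it only descends when both nodes are present, and otherwise leaves the model config as None. The same logic can be exercised in isolation with plain dicts instead of OmegaConf objects (a sketch; `find_model_config` is a hypothetical name):

```python
def find_model_config(cfg_loaded):
    # Mirror the guarded two-level lookup from the diff above.
    cfg_model_loaded = None
    if "model_factory_ImplicitronModelFactory_args" in cfg_loaded:
        factory_args = cfg_loaded["model_factory_ImplicitronModelFactory_args"]
        if "model_GenericModel_args" in factory_args:
            cfg_model_loaded = factory_args["model_GenericModel_args"]
    return cfg_model_loaded

print(find_model_config({}))  # prints None
```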
