Commit 5add065

davnov134 authored and facebook-github-bot committed

Readme updates

Summary:
Running:
- clearly points users to experiment.py/visualize_reconstruction.py

Reproducing:
- Adds NeRF training on Blender
- Adds CO3Dv2 configs

Reviewed By: bottler
Differential Revision: D41534315
fbshipit-source-id: e85f5f1eafed8c35c9e91d748a04f238509cf8ec

1 parent a2c6af9 · commit 5add065

File tree

3 files changed: +127 −15 lines changed

projects/implicitron_trainer/README.md

Lines changed: 126 additions & 10 deletions
@@ -33,6 +33,7 @@ pip install "hydra-core>=1.1" visdom lpips matplotlib accelerate

The runner executable is available as the `pytorch3d_implicitron_runner` shell command.
See the [Running](#running) section below for examples of training and evaluation commands.

## [Option 2] Supporting custom implementations

To plug in custom implementations, for example of renderer or implicit-function protocols, you need to create your own runner script and import the plug-in implementations there.
@@ -55,11 +56,31 @@ pip install "hydra-core>=1.1" visdom lpips matplotlib accelerate

You are still encouraged to implement custom plugins as above where possible, as it makes reusing the code easier.
The executable is located in `pytorch3d/projects/implicitron_trainer`.

> **_NOTE:_** Both the `pytorch3d_implicitron_runner` and `pytorch3d_implicitron_visualizer`
executables (mentioned below) are unavailable when using a local clone.
Users should instead use the python scripts `experiment.py` and `visualize_reconstruction.py` (see the [Running](#running) section below).

# Running

This section assumes that you use the executables provided by the installed package
(Option 1 / Option 2 in the [Installation](#installation) section above),
i.e. that `pytorch3d_implicitron_runner` and `pytorch3d_implicitron_visualizer` are available.

> **_NOTE:_** If the executables are not available (e.g. when using a local clone - Option 3 in [Installation](#installation)),
use the python scripts `experiment.py` and `visualize_reconstruction.py` directly.
They correspond to the executables as follows:
- `pytorch3d_implicitron_runner` corresponds to `<pytorch3d_root>/projects/implicitron_trainer/experiment.py`
- `pytorch3d_implicitron_visualizer` corresponds to `<pytorch3d_root>/projects/implicitron_trainer/visualize_reconstruction.py`

For instance, to execute training directly with the python script, run:
```shell
cd <pytorch3d_root>/projects/
python -m implicitron_trainer.experiment <args>
```

If you have a custom `experiment.py` or `visualize_reconstruction.py` script
(as in Option 2 [above](#installation)), replace the executable with the path to your script.

## Training
@@ -80,6 +101,13 @@ and `<CHECKPOINT_DIR>` with a directory where checkpoints will be dumped during

Other configuration parameters can be overridden in the same way.
See the [Configuration system](#configuration-system) section for more information on this.

### Visdom logging

Note that the training script logs its progress to Visdom. Make sure to start a Visdom server before training commences:
```shell
python -m visdom.server
```
> If a Visdom server is not running, the console will be flooded with `requests.exceptions.ConnectionError` errors signalling that a Visdom server is not available. Note that these errors <b>will NOT interrupt</b> the program and the training will still continue without issues.
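If it is unclear whether a server is already up, its port can be probed before launching training. This is a generic sketch, not part of Implicitron; it assumes the server runs on Visdom's default port, 8097:

```python
import socket

def visdom_reachable(host: str = "localhost", port: int = 8097) -> bool:
    # Return True if something is listening on the given host/port
    # (Visdom serves on port 8097 by default).
    try:
        with socket.create_connection((host, port), timeout=1.0):
            return True
    except OSError:
        return False

if not visdom_reachable():
    print("No Visdom server detected; start one with `python -m visdom.server`.")
```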

## Evaluation

@@ -105,14 +133,14 @@ conda install ffmpeg

Here is an example of calling the script:
```shell
pytorch3d_implicitron_visualizer exp_dir=<CHECKPOINT_DIR> \
visdom_show_preds=True n_eval_cameras=40 render_size="[64,64]" video_size="[256,256]"
```

The argument `n_eval_cameras` sets the number of rendering viewpoints sampled on a trajectory, which defaults to a circular fly-around;
`render_size` sets the size of a render passed to the model, which can be resized to `video_size` before writing.

Rendered videos of images, masks, and depth maps will be saved to `<CHECKPOINT_DIR>/video`.
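For convenience, the produced files can also be enumerated programmatically. The helper below is a hypothetical sketch (not part of the trainer); it only assumes the `<CHECKPOINT_DIR>/video` output location stated above and that the videos are `.mp4` files:

```python
from pathlib import Path

def list_rendered_videos(checkpoint_dir: str) -> list:
    # Videos are written under <CHECKPOINT_DIR>/video; collect them sorted by name.
    video_dir = Path(checkpoint_dir) / "video"
    return sorted(str(p) for p in video_dir.glob("*.mp4"))
```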

# Configuration system
@@ -127,8 +155,11 @@ Configurables can form hierarchies.

For example, `GenericModel` has a field `raysampler: RaySampler`, which is also a Configurable.
In the config, inner parameters can be propagated using the `_args` postfix, e.g. to change `raysampler.n_pts_per_ray_training` (the number of sampled points per ray), the node `raysampler_args.n_pts_per_ray_training` should be specified.
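The dotted-path convention can be illustrated with a toy snippet (plain Python, not Implicitron or Hydra code) that resolves an override like `raysampler_args.n_pts_per_ray_training=128` against a nested config:

```python
def apply_override(cfg: dict, dotted_key: str, value) -> dict:
    # Walk the nested dicts along the dotted path and set the leaf value,
    # mirroring how a `raysampler_args.n_pts_per_ray_training=128` override
    # reaches the inner node of the config hierarchy.
    *parents, leaf = dotted_key.split(".")
    node = cfg
    for key in parents:
        node = node.setdefault(key, {})
    node[leaf] = value
    return cfg

cfg = {"raysampler_args": {"n_pts_per_ray_training": 64}}
apply_override(cfg, "raysampler_args.n_pts_per_ray_training", 128)
```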

### Top-level configuration class: `Experiment`

<b>The root of the hierarchy is defined by the `Experiment` Configurable in `<pytorch3d_root>/projects/implicitron_trainer/experiment.py`.</b>

It has top-level fields like `seed`, which seeds the random number generator.
Additionally, it has non-leaf nodes like `model_factory_ImplicitronModelFactory_args.model_GenericModel_args`, which dispatches the config parameters to `GenericModel`.
Thus, changing the model parameters may be achieved in two ways: either by editing the config file or, equivalently, by adding a CLI override.
@@ -266,14 +297,66 @@ model_GenericModel_args: GenericModel

Please look at the annotations of the respective classes or functions for the lists of hyperparameters.
`tests/experiment.yaml` shows every possible option if you have no user-defined classes.
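For illustration, a user config overriding the ray-sampler setting mentioned earlier might look like the following sketch (node names as used in this README; check `tests/experiment.yaml` for the exact schema):

```yaml
model_factory_ImplicitronModelFactory_args:
  model_GenericModel_args:
    raysampler_args:
      n_pts_per_ray_training: 128
```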
268299

269-
# Reproducing CO3D experiments
300+
301+
# Implementations of existing methods
302+
303+
We provide configuration files that implement several existing works.
304+
305+
<b>The configuration files live in `pytorch3d/projects/implicitron_trainer/configs`.</b>
306+
307+
308+
## NeRF
309+
310+
The following config file corresponds to training of a vanilla NeRF on Blender Synthetic dataset
311+
(see https://arxiv.org/abs/2003.08934 for details of the method):
312+
313+
`./configs/repro_singleseq_nerf_blender.yaml`
314+
315+
316+
### Downloading Blender-Synthetic
317+
Training requires the Blender Synthetic dataset.
318+
To download the dataset, visit the [gdrive bucket](https://drive.google.com/file/d/18JxhpWD-4ZmuFKLzKlAw-w5PpzZxXOcG/view?usp=share_link)
319+
and click Download.
320+
Then unpack the downloaded .zip file to a folder which we call `<BLENDER_DATASET_ROOT_FOLDER>`.
321+
322+
323+
### Launching NeRF training
324+
In order to train NeRF on the "drums" scene, execute the following command:
325+
```shell
326+
export BLENDER_DATASET_ROOT="<BLENDER_DATASET_ROOT_FOLDER>" \
327+
export BLENDER_SINGLESEQ_CLASS="drums" \
328+
pytorch3d_implicitron_runner --config-path ./configs/ --config-name repro_singleseq_nerf_blender
329+
```
330+
331+
Note that the training scene is selected by setting the environment variable `BLENDER_SINGLESEQ_CLASS`
332+
appropriately (one of `"chair"`, `"drums"`, `"ficus"`, `"hotdog"`, `"lego"`, `"materials"`, `"mic"`, `"ship"`).
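To train all eight scenes sequentially, the same pattern can be wrapped in a loop. This is a dry-run sketch: the trainer invocation is replaced by `echo` so the loop structure can be checked without the dataset; substitute the real command (and set `BLENDER_DATASET_ROOT`) for actual training:

```shell
for scene in chair drums ficus hotdog lego materials mic ship; do
  export BLENDER_SINGLESEQ_CLASS="$scene"
  # For real training, replace `echo` with:
  # pytorch3d_implicitron_runner --config-path ./configs/ --config-name repro_singleseq_nerf_blender
  echo "would train scene: $BLENDER_SINGLESEQ_CLASS"
done
```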

By default, the training outputs will be stored to `"./data/nerf_blender_repro/$BLENDER_SINGLESEQ_CLASS/"`.

### Visualizing trained NeRF
```shell
pytorch3d_implicitron_visualizer exp_dir=<CHECKPOINT_DIR> \
visdom_show_preds=True n_eval_cameras=40 render_size="[64,64]" video_size="[256,256]"
```
where `<CHECKPOINT_DIR>` corresponds to the directory with the training outputs (defaults to `"./data/nerf_blender_repro/$BLENDER_SINGLESEQ_CLASS/"`).

The script will output a rendered video of the learned radiance field to `"./data/nerf_blender_repro/$BLENDER_SINGLESEQ_CLASS/"` (requires `ffmpeg`).

> **_NOTE:_** Recall that, if `pytorch3d_implicitron_runner`/`pytorch3d_implicitron_visualizer` are not available, replace the calls
with `cd <pytorch3d_root>/projects/; python -m implicitron_trainer.[experiment|visualize_reconstruction]`.

## CO3D experiments

Common Objects in 3D (CO3D) is a large-scale dataset of videos of rigid objects grouped into 50 common categories.
Implicitron provides implementations and config files to reproduce the results from [the paper](https://arxiv.org/abs/2109.00512).
Please follow [the link](https://github.com/facebookresearch/co3d#automatic-batch-download) for instructions to download the dataset.
In training and evaluation scripts, use the download location as `<DATASET_ROOT>`.
It is also possible to define the environment variable `CO3D_DATASET_ROOT` instead of specifying it explicitly.
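For example, the environment-variable route looks like this (the dataset path is a placeholder; the commented-out invocation uses a config name from the multi-sequence table below):

```shell
# Point Implicitron at the CO3D download once, instead of passing the
# location to every training/evaluation command.
export CO3D_DATASET_ROOT="<DATASET_ROOT>"
# then launch, e.g.:
# pytorch3d_implicitron_runner --config-path ./configs/ --config-name repro_multiseq_nerformer
echo "CO3D dataset root: $CO3D_DATASET_ROOT"
```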

To reproduce the experiments from the paper, use the following configs.

For single-sequence experiments:

| Method | config file |
|-----------------|-------------------------------------|
@@ -286,19 +369,52 @@ To reproduce the experiments from the paper, use the following configs. For sing
| SRN + WCE | repro_singleseq_srn_wce_noharm.yaml |
| SRN + WCE + γ | repro_singleseq_srn_wce.yaml |

For multi-sequence autodecoder experiments (without generalization to new sequences):

| Method | config file |
|-----------------|--------------------------------------------|
| NeRF + AD | repro_multiseq_nerf_ad.yaml |
| SRN + AD | repro_multiseq_srn_ad_hypernet_noharm.yaml |
| SRN + γ + AD | repro_multiseq_srn_ad_hypernet.yaml |

For multi-sequence experiments (with generalization to new sequences):

| Method | config file |
|-----------------|--------------------------------------|
| NeRF + WCE | repro_multiseq_nerf_wce.yaml |
| NerFormer | repro_multiseq_nerformer.yaml |
| SRN + WCE | repro_multiseq_srn_wce_noharm.yaml |
| SRN + WCE + γ | repro_multiseq_srn_wce.yaml |

## CO3Dv2 experiments

The following config files implement training on the second version of CO3D, `CO3Dv2`.

In order to launch training, set the `CO3DV2_DATASET_ROOT` environment variable
to the root folder of the dataset (note that the name of the environment variable differs from the CO3Dv1 version).

Single-sequence experiments:

| Method | config file |
|-----------------|-------------------------------------|
| NeRF | repro_singleseq_v2_nerf.yaml |
| NerFormer | repro_singleseq_v2_nerformer.yaml |
| IDR | repro_singleseq_v2_idr.yaml |
| SRN | repro_singleseq_v2_srn_noharm.yaml |

Multi-sequence autodecoder experiments (without generalization to new sequences):

| Method | config file |
|-----------------|--------------------------------------------|
| NeRF + AD | repro_multiseq_v2_nerf_ad.yaml |
| SRN + γ + AD | repro_multiseq_v2_srn_ad_hypernet.yaml |

Multi-sequence experiments (with generalization to new sequences):

| Method | config file |
|-----------------|----------------------------------------|
| NeRF + WCE | repro_multiseq_v2_nerf_wce.yaml |
| NerFormer | repro_multiseq_v2_nerformer.yaml |
| SRN + WCE | repro_multiseq_v2_srn_wce_noharm.yaml |
| SRN + WCE + γ | repro_multiseq_v2_srn_wce.yaml |

projects/implicitron_trainer/configs/repro_singleseq_nerf_blender.yaml

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ data_source_ImplicitronDataSource_args:
  dataset_length_train: 100
  dataset_map_provider_class_type: BlenderDatasetMapProvider
  dataset_map_provider_BlenderDatasetMapProvider_args:
    base_dir: ${oc.env:BLENDER_DATASET_ROOT}/${oc.env:BLENDER_SINGLESEQ_CLASS}
    n_known_frames_for_test: null
    object_name: ${oc.env:BLENDER_SINGLESEQ_CLASS}
    path_manager_factory_class_type: PathManagerFactory

projects/implicitron_trainer/experiment.py

Lines changed: 0 additions & 4 deletions
@@ -19,10 +19,6 @@
./experiment.py --config-name base_config.yaml override.param.one=42 override.param.two=84
```

Main functions
---------------
- The Experiment class defines `run` which creates the model, optimizer, and other
