Description
🚀 Describe the improvement or the new tutorial
Proposal: Add a Tutorial/Documentation Example on Differentiable Decision Forests
Overview
This is a proposal to add a well-documented example or tutorial demonstrating a Differentiable Decision Forest model in PyTorch, inspired by the Deep Neural Decision Forests paper (Kontschieder et al., ICCV 2015).
The goal is not to introduce a new `torch.nn` module, but rather to show how such a model can be implemented using native PyTorch operations in a transparent and educational way.
Why This?
- Combines the interpretability of decision trees with the feature learning power of neural networks.
- Uses soft routing (sigmoid decisions) and learnable leaf distributions (softmax) to allow end-to-end backpropagation (a minimal sketch of a soft decision node follows this list).
- Offers an alternative to traditional ensembles or black-box classifiers, especially for tabular and hybrid domains.
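To make the soft-routing point concrete, here is a minimal sketch of a single soft decision node. All variable names are illustrative (not from the paper or any reference code); the idea is simply that replacing a hard threshold with a sigmoid makes the routing decision differentiable:

```python
import torch

# Hard split: go_left = (x[:, k] < t) is a step function with zero
# gradient almost everywhere, so it cannot be trained by backprop.
# Soft split: a sigmoid over a learnable linear function of the input
# yields a *probability* of routing left, which is differentiable.

x = torch.randn(8, 16)                   # batch of 8 samples, 16 features
w = torch.randn(16, requires_grad=True)  # learnable split direction
b = torch.zeros(1, requires_grad=True)   # learnable split offset

p_left = torch.sigmoid(x @ w + b)        # shape (8,), values in (0, 1)
p_right = 1.0 - p_left                   # left/right masses sum to 1

# Any loss defined downstream of p_left / p_right backpropagates
# into w and b, which is what makes the tree end-to-end trainable.
```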
What the Tutorial Would Include
- Overview of the model structure (CNN → decision trees)
- How to implement soft decisions and routing probabilities (μ) with PyTorch ops like `sigmoid`, `softmax`, `einsum`, `gather`, etc. (see the code sketch after this list)
- Joint optimization of routing and leaf distributions
- Training on MNIST or tabular datasets
- Emphasis on "Simple over Easy": no custom abstractions
Reference
Kontschieder et al., "Deep Neural Decision Forests", ICCV 2015.
Final Note
This is not a request to add the model as a built-in PyTorch module; in fact, that might go against PyTorch's Simple over Easy philosophy.
Instead, this would be best suited as a community-contributed tutorial or example in the official PyTorch Tutorials repository or documentation site.
Extended Note
I'm currently in the middle of university exams and may not be able to actively contribute for a few weeks, but I'd be very interested in helping develop the tutorial afterwards.
Existing tutorials on this topic
No response
Additional context
No response