
Description
The NUTS sampler, when auto-assigned, requires only first derivatives (i.e. it calls the `grad()` method only for the operation itself).
For example, this code with an `Interpolated` distribution works:

```python
import numpy as np
import pymc3 as pm

with pm.Model():
    uniform = pm.Interpolated('uniform', np.linspace(0, 1, 100), np.ones(100))
    pm.sample(1000)
```

However, when I try to specify a `NUTS` or `HamiltonianMC` step explicitly, it fails, because the `Interpolated` distribution doesn't provide second-order derivatives (the variable returned by `grad()` doesn't itself have a `grad()` method):

```python
import numpy as np
import pymc3 as pm

with pm.Model():
    uniform = pm.Interpolated('uniform', np.linspace(0, 1, 100), np.ones(100))
    step = pm.NUTS()
    pm.sample(1000, step=step)  # fails
```
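
To check that I understand the failure mode, here is a self-contained Theano sketch of what I think is going on (the Op names are made up; this is not PyMC3's actual implementation): the Op that evaluates the spline defines `grad()`, but the derivative Op it returns from `grad()` has no `grad()` of its own, so differentiating a second time fails.

```python
import numpy as np
import theano
import theano.tensor as tt
from scipy.interpolate import InterpolatedUnivariateSpline

# same spline as in the Interpolated example above (k=1, i.e. piecewise linear)
spline = InterpolatedUnivariateSpline(np.linspace(0, 1, 100), np.ones(100), k=1)


class SplineDerivativeOp(theano.Op):
    """Evaluates the spline's first derivative; defines no grad() itself."""
    __props__ = ()

    def make_node(self, x):
        x = tt.as_tensor_variable(x)
        return theano.Apply(self, [x], [x.type()])

    def perform(self, node, inputs, output_storage):
        output_storage[0][0] = np.asarray(spline(inputs[0], nu=1))


class SplineOp(theano.Op):
    """Evaluates the spline; its grad() delegates to SplineDerivativeOp."""
    __props__ = ()

    def make_node(self, x):
        x = tt.as_tensor_variable(x)
        return theano.Apply(self, [x], [x.type()])

    def perform(self, node, inputs, output_storage):
        output_storage[0][0] = np.asarray(spline(inputs[0]))

    def grad(self, inputs, output_grads):
        x, = inputs
        g, = output_grads
        return [g * SplineDerivativeOp()(x)]


x = tt.dscalar('x')
y = SplineOp()(x)
dy = tt.grad(y, x)    # first derivative: works
d2y = tt.grad(dy, x)  # fails: SplineDerivativeOp defines no grad()
```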

According to the NUTS paper, it should require only first-order derivatives, but maybe I'm missing something.
So my question is: is this a problem, or is it the expected behavior for some reason? In the latter case I can add a second-order gradient implementation to the `Interpolated` distribution that, for example, always returns zeros, but I don't understand why it is needed in the first place.
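
For concreteness, building on the toy Ops above, the workaround I have in mind would look roughly like this (again hypothetical names, just a sketch rather than a patch against the real `Interpolated` code):

```python
class SplineDerivativeWithZeroGrad(SplineDerivativeOp):
    def grad(self, inputs, output_grads):
        x, = inputs
        # Pretend the second derivative is identically zero (not the true second
        # derivative of the spline, but it makes grad-of-grad buildable).
        return [tt.zeros_like(x)]


class SplineOpWithSecondOrderStub(SplineOp):
    def grad(self, inputs, output_grads):
        x, = inputs
        g, = output_grads
        return [g * SplineDerivativeWithZeroGrad()(x)]


x = tt.dscalar('x')
y = SplineOpWithSecondOrderStub()(x)
dy = tt.grad(y, x)
d2y = tt.grad(dy, x)  # no longer raises; evaluates to zero everywhere
```

With the real `Interpolated` distribution, the analogous change would presumably go into whatever Op wraps the scipy spline internally, but I'd rather understand why second-order derivatives are requested at all before adding such a stub.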