Description
```python
import numpy as np
import pytensor.tensor as pt

x = pt.vector("x", shape=(None,))
out = pt.alloc(x, 3, 5)
print(out.eval({x: np.zeros(1)}).shape)  # (3, 5)

grad_out = pt.grad(out.sum(), wrt=x)
print(grad_out.eval({x: np.zeros(1)}).shape)  # (5,), even though x has runtime shape (1,)
```
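For reference, my understanding is that the gradient of a broadcast should reduce back to the input's runtime shape by summing over the broadcast dimensions, so for `x` with runtime shape `(1,)` the expected gradient shape here would be `(1,)`, not `(5,)`. A minimal NumPy sketch of that expectation (not PyTensor's actual gradient code):

```python
import numpy as np

x = np.zeros(1)
# Broadcasting x (shape (1,)) to (3, 5): every output element reads x[0],
# so d(out.sum())/dx sums the ones over all broadcast positions and
# reduces back to x's shape.
out = np.broadcast_to(x, (3, 5))
grad = np.ones_like(out).sum(axis=(0, 1), keepdims=True).reshape(x.shape)
print(grad.shape)  # (1,)
print(grad)        # [15.]
```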
Interestingly, if we do `out = pt.alloc(x, 5)`, the rewrites consider that alloc useless and `out` is replaced by `specify_shape(x, 5)`. AFAICT this behavior was always present in Theano and was not changed in Aesara.