Finish HSGP #61
Conversation
In case it is useful: https://colab.research.google.com/drive/10FcpmOm6awyVcR-1DK-xzHTqneD3xfS6
yes thank you @junpenglao! Birthdays would make a great example
omega, phi, m_star = self._eigendecomposition(X, self.L, self.M, self.D)
psd = self.cov_func.psd(omega)
self.beta = pm.Normal(f"{name}_coeffs_", size=m_star)
self.f = pm.Deterministic(
Is it possible to pass dims for the variable?
yes! thanks for catching this
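For intuition, the prior assembled in the excerpt above can be sketched in plain numpy. This is a hypothetical 1D stand-in, not the PR's implementation: `phi`, `psd`, and `m_star` here play the roles of the values the real `_eigendecomposition` and `cov_func.psd` calls would return.

```python
import numpy as np

# Hypothetical 1D stand-ins for the eigendecomposition outputs; shapes
# mirror the excerpt above, values assume an ExpQuad kernel with unit
# lengthscale on the expanded domain [-L, L].
rng = np.random.default_rng(42)
L, m_star = 5.0, 30
x = np.linspace(-1, 1, 100)
j = np.arange(1, m_star + 1)
omega = j * np.pi / (2 * L)                              # sqrt-eigenvalues
phi = np.sqrt(1 / L) * np.sin(omega * (x[:, None] + L))  # (100, m_star) basis
psd = np.sqrt(2 * np.pi) * np.exp(-0.5 * omega**2)       # ExpQuad spectral density

beta = rng.standard_normal(m_star)  # mirrors pm.Normal(f"{name}_coeffs_", size=m_star)
f = phi @ (beta * np.sqrt(psd))     # mirrors the Deterministic node for f
```

Because the basis is fixed, only the `m_star` coefficients `beta` are random; the kernel hyperparameters enter solely through the spectral density weights.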
pymc_experimental/gp/hsgp.py (Outdated)
def conditional(self, name, Xnew):
    fnew = self._build_conditional(name, Xnew)
    return pm.Deterministic(name, fnew)
Is it possible to pass dims to the variable?
I think I'll move HSGP over to pymc-devs/pymc#6036. HSGP needs that spectral covariance stuff to work, and having a prerequisite PR in pymc for something in pymc-experimental is just weird anyway. In hindsight I think this split was a bad idea. I don't know how to make the reverse option work (moving the spectral covariance support into experimental), since you'd need to mess with the Covariance classes quite a bit. And I think the HSGP method is on firm footing with the main GP stuff: it has a couple of papers behind it, and having used it some, it really does work super well.
"dimension." | ||
) | ||
try: | ||
if len(m) != cov_func.D: |
Do covariance functions have a D attribute?
Is this supposed to be len(active_dims)?
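A sketch of the validation the reviewer is suggesting: compare the number of basis sizes `m` against the kernel's active dimensions rather than a possibly nonexistent `D` attribute. The function name here is hypothetical, not the PR's API.

```python
# Hypothetical helper: validate that one basis size is given per active
# input dimension of the covariance function.
def check_basis_shape(m, active_dims):
    if len(m) != len(active_dims):
        raise ValueError(
            "Must provide one basis size per active input dimension."
        )
    return tuple(m)

check_basis_shape([30, 30], active_dims=[0, 1])  # valid: 2 sizes, 2 dims
```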
"""A helper function which gives the approximate kernel or covariance matrix K. This can be | ||
helpful when trying to see how well an approximation may work. | ||
""" | ||
X, _ = self.cov_func._slice(X) |
_slice takes two positional arguments.
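In the spirit of the helper excerpted above, here is a rough numpy check of how well the basis expansion approximates an exact ExpQuad kernel in 1D. Names and parameter values are illustrative, not the PR's implementation.

```python
import numpy as np

# Laplacian eigenfunctions on [-L, L]; with generous L and m, the weighted
# basis outer product should closely match the exact ExpQuad kernel.
def hsgp_basis(x, L=5.0, m=50):
    j = np.arange(1, m + 1)
    sqrt_lam = j * np.pi / (2 * L)
    phi = np.sqrt(1 / L) * np.sin(sqrt_lam * (x[:, None] + L))
    return phi, sqrt_lam

x = np.linspace(-1, 1, 25)
phi, sqrt_lam = hsgp_basis(x)
psd = np.sqrt(2 * np.pi) * np.exp(-0.5 * sqrt_lam**2)  # ExpQuad PSD, ls = 1
K_approx = (phi * psd) @ phi.T                         # approximate kernel matrix
K_exact = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)
err = np.abs(K_approx - K_exact).max()  # small when L and m are generous
```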
@bwengals I think you can ignore my comments -- looks like this depends on your PyMC PR being merged as well!
The goal of this PR is to finish the implementation of the HSGP and make it generally usable, so tests and docs and all that. This is a function-space GP approximation that is interesting because it uses a fixed set of basis vectors but allows the kernel hyperparameters to be learned. The restriction is that the kernel must be stationary and you need its power spectral density.
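To illustrate the "stationary kernel with a power spectral density" requirement, here is a minimal sketch of what such a kernel could look like. The class is hypothetical; only the `psd` method name follows the code excerpted in this PR.

```python
import numpy as np

# Hypothetical 1D ExpQuad kernel exposing its power spectral density,
# i.e. the Fourier transform of the stationary covariance function.
class ExpQuad1D:
    def __init__(self, ls=1.0, sigma=1.0):
        self.ls, self.sigma = ls, sigma

    def full(self, X, Xs):
        # Exact covariance matrix between two sets of 1D inputs
        r = X[:, None] - Xs[None, :]
        return self.sigma**2 * np.exp(-0.5 * (r / self.ls) ** 2)

    def psd(self, omega):
        # Spectral density S(omega) = sigma^2 * sqrt(2*pi) * ls * exp(-ls^2 omega^2 / 2)
        return (
            self.sigma**2
            * np.sqrt(2 * np.pi)
            * self.ls
            * np.exp(-0.5 * (self.ls * omega) ** 2)
        )

k = ExpQuad1D()
K = k.full(np.array([0.0, 1.0]), np.array([0.0, 1.0]))
```

The HSGP approximation evaluates `psd` at the square roots of fixed Laplacian eigenvalues, so the learnable hyperparameters (`ls`, `sigma`) only ever enter through this function.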
TODO list:
- prior method
- conditional method
- make it work like gp.Latent and gp.Marginal do