Finish HSGP #61


Closed
wants to merge 12 commits

Conversation

bwengals
Contributor

@bwengals bwengals commented Aug 1, 2022

The goal of this PR is to finish the implementation of the HSGP and make it generally usable: tests, docs, and all that. This is a function-space GP approximation that is interesting because it uses a fixed set of basis vectors -- but still allows the kernel hyperparameters to be learned. The restriction is that the kernel must be stationary and you need its power spectral density.
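For readers unfamiliar with the method, here is a rough NumPy sketch of the idea (the Solin & Särkkä Hilbert-space approximation), not the PR's actual API: on a box [-L, L] the kernel is approximated with a fixed sine basis, and the hyperparameters enter only through the power spectral density evaluated at the basis frequencies. All names below (`hsgp_approx_kernel`, `ell`, `m`, `L`) are illustrative.

```python
import numpy as np

def hsgp_approx_kernel(X, ell=1.0, sigma=1.0, L=5.0, m=50):
    """Approximate a 1D ExpQuad kernel with the Hilbert-space method.

    Illustrative sketch only -- the basis (phi) is fixed; the kernel
    hyperparameters (ell, sigma) only enter through the PSD weights S.
    """
    j = np.arange(1, m + 1)
    sqrt_lam = j * np.pi / (2 * L)                # basis frequencies
    # Dirichlet-Laplacian eigenfunctions on [-L, L], shape (n, m)
    phi = np.sqrt(1.0 / L) * np.sin(sqrt_lam * (X[:, None] + L))
    # Power spectral density of the 1D ExpQuad (RBF) kernel
    S = sigma**2 * ell * np.sqrt(2 * np.pi) * np.exp(-0.5 * (ell * sqrt_lam) ** 2)
    return (phi * S) @ phi.T                      # K ~= Phi diag(S) Phi^T

X = np.linspace(-2.0, 2.0, 25)
K_approx = hsgp_approx_kernel(X)
K_exact = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2)   # ExpQuad, ell = sigma = 1
print(np.max(np.abs(K_approx - K_exact)))                 # small when L and m are big enough
```

Because the basis is fixed, the GP prior reduces to a linear model in the coefficients, which is what makes the approximation cheap.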

TODO list:

  • prior method
  • conditional method
  • 1D inputs
  • Multidimensional inputs (existing implementation here was 1D only)
  • Multi-dimensional power spectral densities for common kernels
    • ExpQuad
    • Matern52
    • Matern32
  • Support additivity, like gp.Latent and gp.Marginal do
  • Tests

@junpenglao
Member

In case it is useful: https://colab.research.google.com/drive/10FcpmOm6awyVcR-1DK-xzHTqneD3xfS6

@bwengals
Contributor Author

bwengals commented Aug 2, 2022

Yes, thank you @junpenglao! Birthdays would make a great example.

omega, phi, m_star = self._eigendecomposition(X, self.L, self.M, self.D)
psd = self.cov_func.psd(omega)
self.beta = pm.Normal(f"{name}_coeffs_", size=m_star)
self.f = pm.Deterministic(
Member


Is it possible to pass dims for the variable?

Contributor Author


yes! thanks for catching this


def conditional(self, name, Xnew):
fnew = self._build_conditional(name, Xnew)
return pm.Deterministic(name, fnew)
Member


Is it possible to pass dims to the variable?
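An aside on what `conditional` buys you here, as a NumPy sketch (illustrative names, not the PR's API): because the approximation uses a fixed basis, "conditioning" on new inputs just re-evaluates the basis at `Xnew` and reuses the same learned coefficients `beta` -- no new matrix solves are needed.

```python
import numpy as np

def basis(X, L=5.0, m=50):
    """Fixed sine basis and its frequencies on [-L, L] (illustrative)."""
    j = np.arange(1, m + 1)
    sqrt_lam = j * np.pi / (2 * L)
    phi = np.sqrt(1.0 / L) * np.sin(sqrt_lam * (X[:, None] + L))
    return phi, sqrt_lam

rng = np.random.default_rng(0)
ell, L, m = 1.0, 5.0, 50
phi, sqrt_lam = basis(np.linspace(-2.0, 2.0, 10), L, m)
S = ell * np.sqrt(2 * np.pi) * np.exp(-0.5 * (ell * sqrt_lam) ** 2)  # ExpQuad PSD
beta = rng.normal(size=m)               # stand-in for the pm.Normal coefficients
f = phi @ (np.sqrt(S) * beta)           # prior evaluation at the training inputs
phi_new, _ = basis(np.array([0.5]), L, m)
f_new = phi_new @ (np.sqrt(S) * beta)   # "conditional" at Xnew: same beta, new basis
```

This is also why passing `dims` through to the `Deterministic` is cheap to support: the output is just a deterministic transform of `beta`.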

@bwengals
Contributor Author

I think I'll move HSGP over to pymc-devs/pymc#6036. HSGP needs that spectral covariance stuff to work, and having a prerequisite PR in pymc for something in pymc-experimental is just weird anyway. In hindsight, this split was a bad idea.

I don't know how to make the reverse option work -- moving the spectral covariance support into experimental -- since you need to mess with the Covariance classes quite a bit. And I think the HSGP method is on firm enough footing to live with the main GP stuff: it has a couple of papers behind it, and having used it some, it really does work super well.

"dimension."
)
try:
if len(m) != cov_func.D:
Member


Do covariance functions have a D attribute?

Member


Is this supposed to be len(active_dims)?

"""A helper function which gives the approximate kernel or covariance matrix K. This can be
helpful when trying to see how well an approximation may work.
"""
X, _ = self.cov_func._slice(X)
Member


_slice takes two positional arguments.

@fonnesbeck
Member

@bwengals I think you can ignore my comments -- looks like this depends on your PyMC PR being merged as well!

@bwengals bwengals closed this Apr 22, 2023
@bwengals bwengals deleted the hsgp branch April 22, 2023 02:51