Add HSGPs, add support for power spectral densities to covariance function objects #6036
Conversation
Other than tests and pre-commit, I'm putting this up in draft mode because I need to figure out how to test my translations of the power spectral densities. I got them from here, pg. 4.
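One way to sanity-check a translated spectral density is to numerically invert the transform and compare against the kernel itself. Here's a standalone NumPy sketch for the ExpQuad case, using the angular-frequency convention; `expquad_psd` and `kernel_from_psd` are hypothetical helpers for illustration, not this PR's `psd` method:

```python
import numpy as np

def expquad_psd(omega, ls=1.0):
    # 1-D spectral density of the unit-variance ExpQuad (RBF) kernel,
    # angular-frequency convention: S(w) = sqrt(2*pi) * ls * exp(-0.5 * ls^2 * w^2)
    return np.sqrt(2.0 * np.pi) * ls * np.exp(-0.5 * (ls * omega) ** 2)

def kernel_from_psd(psd, r, omega_max=50.0, n=20001):
    # Inverse transform: k(r) = (1 / 2pi) * integral of S(w) * cos(w * r) dw,
    # approximated with a Riemann sum on a fine grid
    omega = np.linspace(-omega_max, omega_max, n)
    d = omega[1] - omega[0]
    return np.sum(psd(omega) * np.cos(omega * r)) * d / (2.0 * np.pi)

# The recovered values should match k(r) = exp(-0.5 * r^2) at a few lags
for r in [0.0, 0.5, 2.0]:
    k_exact = np.exp(-0.5 * r**2)
    assert abs(kernel_from_psd(expquad_psd, r) - k_exact) < 1e-6
```

The same check works for any stationary kernel with a known closed form, which makes it a reasonable basis for unit tests of the PSD translations.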
Codecov Report

```
@@            Coverage Diff             @@
##             main    #6036      +/-   ##
==========================================
- Coverage   94.15%   93.98%   -0.17%
==========================================
  Files         132      133       +1
  Lines       26726    26921     +195
==========================================
+ Hits        25163    25301     +138
- Misses       1563     1620      +57
```
…ance doesn't need to know `input_dims` and `active_dims`, but `Covariance` does. This allows `Constant` and `WhiteNoise` to work without a specified `input_dim` and `active_dims`.
Is this ready for review now?
Sorry I missed your comment @fonnesbeck. Almost, I still need to add tests for the `HSGP` class.
I also still need to make a notebook for pymc-examples. Here's a quick usage example though that should run off this branch. Additive covariances work, as well as higher dimensional X.

```python
import numpy as np
import pymc as pm
import matplotlib.pyplot as plt
import pymc.sampling_jax

# Simulate data from a GP with an additive covariance
t = np.linspace(0.0, 10.0, 500)
cov = 4**2 * pm.gp.cov.ExpQuad(1, ls=3) + 1**2 * pm.gp.cov.Matern52(1, ls=0.5)
f_true = pm.draw(pm.MvNormal.dist(mu=np.zeros(len(t)), cov=cov(t[:, None])), 1).flatten()
sigma_true = 1.0
y = f_true + sigma_true * np.random.randn(len(f_true))

plt.plot(t, f_true)
plt.plot(t, y, '.')

with pm.Model() as model:
    ell1 = pm.InverseGamma("ell1", mu=2, sigma=4)
    ell2 = pm.InverseGamma("ell2", mu=1, sigma=2)
    cov = 4**2 * pm.gp.cov.ExpQuad(1, ls=ell1) + 1**2 * pm.gp.cov.Matern52(1, ls=ell2)
    gp = pm.gp.HSGP(m=[200], c=5.0, cov_func=cov)
    f = gp.prior("f", X=t[:, None])
    sigma = pm.HalfNormal("sigma", sigma=1)
    pm.Normal("y", mu=f, sigma=sigma, observed=y)
    idata = pm.sampling_jax.sample_numpyro_nuts()
```
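Additive covariances work here because the Fourier transform is linear: the spectral density of a sum of kernels is the weighted sum of the component densities. A standalone NumPy sketch of that property, using two ExpQuad terms for simplicity (the Matern-5/2 density is additive in exactly the same way; `expquad_psd` and `additive_psd` are hypothetical helpers, not this PR's API):

```python
import numpy as np

def expquad_psd(omega, ls, sigma=1.0):
    # 1-D spectral density of sigma^2 * ExpQuad(ls), angular-frequency convention
    return sigma**2 * np.sqrt(2.0 * np.pi) * ls * np.exp(-0.5 * (ls * omega) ** 2)

def additive_psd(omega):
    # PSD of 4^2 * ExpQuad(ls=3) + 1^2 * ExpQuad(ls=0.5) is the sum of the parts
    return expquad_psd(omega, ls=3.0, sigma=4.0) + expquad_psd(omega, ls=0.5, sigma=1.0)

# Inverting the summed PSD at lag 0 recovers the total variance 4^2 + 1^2 = 17
omega = np.linspace(-60.0, 60.0, 24001)
d = omega[1] - omega[0]
k0 = np.sum(additive_psd(omega)) * d / (2.0 * np.pi)
assert abs(k0 - 17.0) < 1e-6
```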
Is this ready to be reviewed for merging? Happy to create an example notebook if that's what is holding things up.
Well, maybe two things are left. One is that I need to add tests for the HSGP class. The other is maybe handling the periodic covariance; I was going to punt on that since it's structured a bit differently (it doesn't use the PSD) and save it for another PR, but it might not be too hard to include. Sure! Do you have an example in mind? Whatever the example is, I do want to make sure to include a widget of some sort to help with choosing the m, c, or L parameters, and I'm happy to contribute that part. I think that'd be really handy.
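For intuition about what m and L control, here is a standalone NumPy sketch of the Hilbert-space approximation itself: Dirichlet-Laplacian eigenfunctions on [-L, L] weighted by the kernel's spectral density. This is an illustration under those assumptions, not the PR's `HSGP` implementation, and `expquad_psd`/`hsgp_cov` are hypothetical names:

```python
import numpy as np

def expquad_psd(omega, ls=1.0):
    # 1-D spectral density of the unit-variance ExpQuad kernel
    return np.sqrt(2.0 * np.pi) * ls * np.exp(-0.5 * (ls * omega) ** 2)

def hsgp_cov(x, m=100, L=5.0, ls=1.0):
    # Eigenfunctions phi_j(x) = sqrt(1/L) * sin(sqrt(lam_j) * (x + L)) with
    # sqrt-eigenvalues sqrt(lam_j) = j * pi / (2L) on the box [-L, L]
    j = np.arange(1, m + 1)
    sqrt_lam = j * np.pi / (2.0 * L)
    phi = np.sqrt(1.0 / L) * np.sin(sqrt_lam * (x[:, None] + L))  # (n, m) basis
    # k(x, x') ~= sum_j S(sqrt(lam_j)) * phi_j(x) * phi_j(x')
    return phi @ (expquad_psd(sqrt_lam, ls)[:, None] * phi.T)

x = np.linspace(-1.0, 1.0, 50)
K_approx = hsgp_cov(x)                                   # m-term approximation
K_exact = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)  # exact ExpQuad
assert np.max(np.abs(K_approx - K_exact)) < 1e-6
```

Increasing m refines the frequency grid the PSD is evaluated on, while L must comfortably contain the inputs to keep boundary effects small; a widget that shows the approximation error as these change would make the trade-off concrete.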
I always have plenty of baseball data (along the lines of what @danhphan has been using).
Hi, the spin dataset will be on this link in
Closing in favor of #6458.
What is this PR about?
This PR adds support for power spectral density functions to covariance function objects. Any stationary kernel may define a `psd` method that takes `omega` as an input. This will be useful for kernel approximation algorithms like the HSGP approximation and random Fourier features, and it should also make it easier to work with other things like spectral mixture kernels.

This also adds the ability for composite kernels to be constructed and their `psd` to be calculated. It should now be OK for users to define a custom kernel that subclasses `Stationary` but doesn't define `.full` or `.diag` methods, just `.psd`, which is all some approximations would need.

Checklist
Major / Breaking Changes
Bugfixes / New features
`.psd` for `pm.gp.cov.ExpQuad`, `pm.gp.cov.Matern52` and `pm.gp.cov.Matern32`. The `.psd` method should work for any dimensional `omega`, meaning 1 or 2+ dimensional processes.

Docs / Maintenance