examples/time_series/Time_Series_Generative_Graph.myst.md (32 additions & 32 deletions)
@@ -5,15 +5,15 @@ jupytext:
     format_name: myst
     format_version: 0.13
 kernelspec:
-  display_name: pymc_env
+  display_name: Python 3 (ipykernel)
   language: python
   name: python3
 ---

 (time_series_generative_graph)=
 # Time Series Models Derived From a Generative Graph

-:::{post} July, 2024
+:::{post} January, 2025
 :tags: time-series,
 :category: intermediate, reference
 :author: Jesse Grabowski, Juan Orduz and Ricardo Vieira
@@ -70,7 +70,7 @@ rng = np.random.default_rng(42)

 ## Define AR(2) Process

-We start by encoding the generative graph of the AR(2) model as a function `ar_dist`. The strategy is to pass this function as a custom distribution via {class}`~pymc.CustomDist` inside a PyMC model.
+We start by encoding the generative graph of the AR(2) model as a function `get_ar_steps`. The strategy is to pass this function as a custom distribution via {class}`~pymc.CustomDist` inside a PyMC model.

 We need to specify the initial state (`ar_init`), the autoregressive coefficients (`rho`), and the standard deviation of the noise (`sigma`). Given such parameters, we can define the generative graph of the AR(2) model using the {meth}`scan <pytensor.scan.basic.scan>` operation.
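The recursion that the `scan` loop encodes can be sketched in plain NumPy. This is a minimal illustration of the AR(2) transition only, with hypothetical parameter values (not the tutorial's), and it does not use PyMC or PyTensor:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical AR(2) parameters, for illustration only.
rho = np.array([0.9, -0.2])  # coefficients on x[t-1] and x[t-2]
sigma = 0.5                  # innovation standard deviation
timeseries_length = 100
lags = 2

x = np.zeros(timeseries_length)
x[:lags] = rng.normal(size=lags)  # plays the role of ar_init

# Each new step depends on the two previous steps plus Gaussian noise;
# this is the transition that scan repeats inside the generative graph.
for t in range(lags, timeseries_length):
    x[t] = rho[0] * x[t - 1] + rho[1] * x[t - 2] + rng.normal(scale=sigma)

print(x.shape)  # (100,)
```

In the PyMC version, the same transition is written as a step function handed to `scan`, so the whole loop becomes a symbolic graph that `CustomDist` can sample from and evaluate.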
@@ -101,23 +101,23 @@ timeseries_length = 100  # Time series length

 # This is the transition function for the AR(2) model.
 # We take as inputs previous steps and then specify the autoregressive relationship.

-with pm.observe(model, {"ar_dist": test_data}) as observed_model:
+with pm.observe(model, {"ar_steps": test_data}) as observed_model:
     trace = pm.sample(chains=4, random_seed=rng)
 ```
@@ -319,17 +319,17 @@ $$

 Let's see how to do this in PyMC! The key observation is that we need to pass the observed data explicitly into our "for loop" in the generative graph. That is, we need to pass it into the {meth}`scan <pytensor.scan.basic.scan>` function.

 Then we can simply generate samples from the posterior predictive distribution. Observe that we need to "rewrite" the generative graph to include the conditioned transition step. When you call {meth}`~pm.sample_posterior_predictive`, PyMC will attempt to match the names of random variables in the active model context to names in the provided `idata.posterior`. If a match is found, the specified model prior is ignored and replaced with draws from the posterior. This means we can put any prior we want on these parameters, because it will be ignored. We choose {class}`~pymc.distributions.continuous.Flat` because you cannot sample from it. This way, if PyMC does not find a match for one of our priors, we will get an error to let us know something isn't right. For a detailed explanation of this type of cross-model prediction, see the great blog post [Out of model predictions with PyMC](https://www.pymc-labs.com/blog-posts/out-of-model-predictions-with-pymc/).
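The "conditioned transition step" amounts to feeding the observed series, rather than the model's own simulated states, into each autoregressive step. A minimal NumPy sketch of the resulting one-step-ahead means, using hypothetical stand-ins for the observed data and the posterior parameter draws:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins, for illustration only.
y = rng.normal(size=100)     # stands in for the observed time series
rho = np.array([0.9, -0.2])  # stands in for a posterior draw of rho
lags = 2

# Conditional one-step-ahead mean: every step sees the *observed* lags,
# not the values generated on previous loop iterations.
mu = rho[0] * y[lags - 1 : -1] + rho[1] * y[lags - 2 : -2]

print(mu.shape)  # (98,)
```

In the full PyMC version, this conditioning happens inside the rewritten `scan` graph, with `rho` and `sigma` declared as `Flat` placeholders so that `sample_posterior_predictive` must replace them with matching posterior draws.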
@@ -351,17 +351,17 @@ with pm.Model(coords=coords, check_bounds=False) as conditional_model: