Update GLM predictions #204
Conversation
View / edit / reply to this conversation on ReviewNB

chiral-carbon commented on 2021-08-06T17:11:58Z
@tomicapretto had a doubt here as to how to convert this to use bambi instead. we have to pass a dataframe rather than

tomicapretto commented on 2021-08-06T17:26:26Z
Yep! Bambi only works with data frames so far. So you need to put all the data in a pandas data frame and then pass the data as the

OriolAbril commented on 2021-08-06T17:39:27Z
does it support out of sample posterior predictive sampling though?

tomicapretto commented on 2021-08-06T19:58:18Z
Yes, it does (in the dev version)
Where
If you have
Then the new data frame must have columns for
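The exchange above can be sketched concretely. This is a minimal illustration, not code from the notebook: the predictor names x1 and x2, the formula, and the commented-out Bambi calls are assumptions based on the thread (and the predict call requires the development version of Bambi mentioned there).

```python
import numpy as np
import pandas as pd

# Hypothetical training data; column names must match the terms in the
# model formula (assumed here to be "y ~ x1 + x2").
rng = np.random.default_rng(0)
data = pd.DataFrame(
    {
        "x1": rng.normal(size=100),
        "x2": rng.normal(size=100),
        "y": rng.integers(0, 2, size=100),
    }
)

# Out-of-sample prediction needs a new data frame with the same predictor
# columns as the training frame.
new_data = pd.DataFrame({"x1": [0.0, 1.0], "x2": [0.5, -0.5]})
assert set(new_data.columns) <= set(data.columns)

# With Bambi installed (dev version per the thread), the calls would look like:
# import bambi as bmb
# model = bmb.Model("y ~ x1 + x2", data, family="bernoulli")
# idata = model.fit()
# model.predict(idata, data=new_data)  # out-of-sample posterior predictive
print(new_data.shape)  # (2, 2)
```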
View / edit / reply to this conversation on ReviewNB

OriolAbril commented on 2021-08-24T16:56:14Z
I would not compute the decision boundary with the mean of the posterior but instead use xarray; again, everything should broadcast automatically. If doing this we can delete this remark. Also, the Deterministic is only valid in pure pymc3, whereas postprocessing with xarray only needs to have the inferencedata.
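To make the broadcasting idea in this comment concrete, here is a hedged sketch using plain NumPy. The comment suggests xarray, where dimension alignment would happen automatically on `idata.posterior`; the coefficient names and toy draws below are assumptions for illustration, not values from the notebook.

```python
import numpy as np

# Toy stand-ins for posterior draws of a logistic model's coefficients
# (in the notebook these would come from idata.posterior; names assumed).
rng = np.random.default_rng(42)
n_draws = 500
b0 = rng.normal(-0.5, 0.1, size=n_draws)  # Intercept
b1 = rng.normal(1.0, 0.1, size=n_draws)   # x1 coefficient
b2 = rng.normal(2.0, 0.1, size=n_draws)   # x2 coefficient

x1 = np.linspace(-9, 9, 300)

# The decision boundary solves b0 + b1*x1 + b2*x2 = 0 for x2. Broadcasting
# (n_draws, 1) arrays against the (300,) grid evaluates the boundary for
# every draw at once, instead of collapsing to the posterior mean first:
boundary = -(b0[:, None] + b1[:, None] * x1) / b2[:, None]
print(boundary.shape)  # (500, 300): one boundary curve per posterior draw

# Point estimates and uncertainty bands then come from reducing the draw axis:
boundary_mean = boundary.mean(axis=0)
boundary_band = np.percentile(boundary, [3, 97], axis=0)
```

With xarray the same arithmetic would be written directly on the posterior variables, and the chain/draw dimensions would align without the manual `[:, None]` reshaping.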
View / edit / reply to this conversation on ReviewNB

chiral-carbon commented on 2021-09-01T12:44:44Z
I get a memory error here, but the shape of the dataset being passed is 90000x4, not 90000x25000 as printed. So what is the mistake here?

tomicapretto commented on 2021-09-01T13:48:28Z
This is definitely a problem with Bambi trying to create a very large object within the predict method. I will try to replicate it on my side to try to understand what is going on.
What I'm not sure about is why you are using

grid = np.linspace(start=-9, stop=9, num=300)
x1, x2 = np.meshgrid(grid, grid)
x_grid = np.stack(arrays=[x1.flatten(), x2.flatten()], axis=1)
new_data = pd.DataFrame(x_grid, columns=["x1", "x2"])

chiral-carbon commented on 2021-09-01T14:28:22Z
I think I simply replaced the previous

Sayam753 commented on 2021-09-19T08:46:21Z
so what is the mistake here?
I don't think there is any mistake. The shape of the dataset is (90k, 4). The shape (90k, 25k) can be interpreted as

To solve this, two obvious ways would be to

chiral-carbon commented on 2021-09-20T17:24:38Z
I'll try this and see if it works
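For scale, here is a rough sketch of why predict runs out of memory here and what thinning the posterior would buy. The 90,000 × 25,000 shape comes from the thread; the `idata.sel` thinning call is an assumption about the ArviZ API and is shown only as a comment.

```python
import numpy as np

# Rough footprint of the array predict would build here:
# 90,000 grid points x 25,000 posterior draws of float64.
n_points, n_draws = 90_000, 25_000
gib = n_points * n_draws * 8 / 2**30
print(f"{gib:.1f} GiB")  # roughly 16.8 GiB, hence the memory error

# Two obvious mitigations: a coarser grid (fewer than 300x300 points),
# or thinning the posterior before predicting. Thinning sketch with toy
# draw indices (with InferenceData one would use something like
# idata.sel(draw=slice(None, None, 10)); name and call are assumed):
draws = np.arange(n_draws)
thinned = draws[::10]
print(thinned.size)  # 2500 draws, cutting the footprint by 10x
```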
View / edit / reply to this conversation on ReviewNB

chiral-carbon commented on 2021-09-01T12:49:50Z
I think the model is overfitting? Have I made any mistakes in the model definition step?
View / edit / reply to this conversation on ReviewNB

OriolAbril commented on 2021-09-08T15:46:55Z
I don't understand this sentence, and there is still a link to the old glm module code in the pymc3 repo.
View / edit / reply to this conversation on ReviewNB

Sayam753 commented on 2021-09-19T03:59:53Z
The first sentence should use
86510ef to da4ed14
View / edit / reply to this conversation on ReviewNB

chiral-carbon commented on 2021-10-10T07:47:06Z
@OriolAbril the decision boundary does not look accurate to me and I think it could be because of the reduced sampling size and the reduced dataset size generated by
Is this close to being ready? At this point we should go ahead and convert to v4.
@fonnesbeck Hi, sorry to leave it here without updating. Will work on it this week and finish it up.
@chiral-carbon any chance of pushing this out, or do you want to hand it over to someone?
I'll handle it! If I don't push it out this weekend I'll let you know to hand it over to someone else.
@fonnesbeck should I try to update this to v4 now in this PR or update to v3 only and leave the v4 update for a new PR? This same question applies to the other open PRs I have pending.
Definitely v4, as we are trying to get as many examples ported before the v4 release as possible. Let me know if you need a hand!
@cfonnesbeck okay! will need some help in that case. you could just point out what I should refer to/start with, or anything else that should be done first.
It looks like this one is in a bit of a holding pattern until the update to Bambi is released, but if you want to get a head start you'd have to build Bambi from its
@chiral-carbon @fonnesbeck most things should work fine in the new
@tomicapretto @fonnesbeck thanks! I will update here as and when I need help.
This notebook was updated to v4 by #370, so closing this PR |
Addresses issue #85 and aims to advance it to best practices, use arviz-darkgrid, and use bambi.