
[BugFix] Fix sampling in PPO tutorial #2271


Merged
merged 1 commit into main from fix_ppo_sampling on Mar 27, 2023

Conversation

vmoens (Contributor) commented Mar 27, 2023

Sampling in the PPO tutorial discards much of the collected data.
This PR fixes the replay_buffer.sample call.

cc @nairbv
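
For context, a minimal sketch of how the tutorial's sampling loop can consume the whole collected batch rather than discarding frames (the exact diff in this PR may differ). It assumes TorchRL's ReplayBuffer with a LazyTensorStorage and SamplerWithoutReplacement, as used in the tutorial; the sizes and the stand-in tensordict below are hypothetical and for illustration only.

```python
import torch
from tensordict import TensorDict
from torchrl.data import LazyTensorStorage, ReplayBuffer
from torchrl.data.replay_buffers.samplers import SamplerWithoutReplacement

# Hypothetical sizes for illustration; the tutorial defines its own values.
frames_per_batch = 1000
sub_batch_size = 64

replay_buffer = ReplayBuffer(
    storage=LazyTensorStorage(frames_per_batch),
    sampler=SamplerWithoutReplacement(),
)

# Stand-in for the tensordict of data collected during one rollout batch.
tensordict_data = TensorDict(
    {"observation": torch.randn(frames_per_batch, 4)},
    batch_size=[frames_per_batch],
)
replay_buffer.extend(tensordict_data.reshape(-1))

# Iterate over the whole batch in sub-batches so no collected frames are discarded.
for _ in range(frames_per_batch // sub_batch_size):
    subdata = replay_buffer.sample(sub_batch_size)
    # ... compute the PPO loss on `subdata` and step the optimizer ...
```

With SamplerWithoutReplacement, repeating the sample call frames_per_batch // sub_batch_size times visits every stored frame exactly once per epoch, which is the behavior the fix aims for.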

netlify bot commented Mar 27, 2023

Deploy Preview for pytorch-tutorials-preview ready!

🔨 Latest commit: 56e2c2b
🔍 Latest deploy log: https://app.netlify.com/sites/pytorch-tutorials-preview/deploys/6421d36fc014fa0008759485
😎 Deploy Preview: https://deploy-preview-2271--pytorch-tutorials-preview.netlify.app/intermediate/reinforcement_ppo

@svekars svekars added the rl label Mar 27, 2023
@svekars svekars merged commit eb29468 into main Mar 27, 2023
@svekars svekars deleted the fix_ppo_sampling branch March 27, 2023 18:46