
Implemented StreamExt::throttle #356


Merged — 18 commits merged into async-rs:master on Nov 14, 2019

Conversation

Wassasin
Contributor

Closes #342. Still need to write documentation and tests.

@yoshuawuyts
Contributor

@Wassasin as per @binarybana's suggestion in #342, we should probably not drop any messages in throttle itself, but instead leave it up to the stream source to implement the queueing strategy. This was an oversight on my part. Would you be willing to update the implementation to match that?
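For illustration, a minimal sketch of what a non-dropping, backpressure-based throttle can look like: the adaptor never discards an item, it simply does not poll the inner stream again until the pause has elapsed. This is not the code from this PR; the crate choices (`futures-core`, `futures-timer`, `pin-project-lite`) and names are assumptions.

```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{ready, Context, Poll};
use std::time::Duration;

use futures_core::Stream;
use futures_timer::Delay;
use pin_project_lite::pin_project;

pin_project! {
    /// Yields items from `stream`, but waits at least `duration`
    /// between consecutive items. Nothing is ever dropped.
    pub struct Throttle<S> {
        #[pin]
        stream: S,
        duration: Duration,
        #[pin]
        delay: Option<Delay>,
    }
}

impl<S: Stream> Throttle<S> {
    pub fn new(stream: S, duration: Duration) -> Self {
        Self { stream, duration, delay: None }
    }
}

impl<S: Stream> Stream for Throttle<S> {
    type Item = S::Item;

    fn poll_next(self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Option<Self::Item>> {
        let mut this = self.project();

        // If a pause is still pending, wait for it before touching the
        // inner stream; the item stays queued in the source.
        if let Some(delay) = this.delay.as_mut().as_pin_mut() {
            ready!(delay.poll(cx));
            this.delay.set(None);
        }

        match this.stream.poll_next(cx) {
            Poll::Pending => Poll::Pending,
            Poll::Ready(None) => Poll::Ready(None),
            Poll::Ready(Some(item)) => {
                // Start the pause *after* yielding an item, so the next
                // item is delayed rather than dropped.
                this.delay.set(Some(Delay::new(*this.duration)));
                Poll::Ready(Some(item))
            }
        }
    }
}
```

With this shape, any queueing or dropping policy stays with the stream source, as suggested above; `Throttle` only spaces out how often that source is polled.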

@yoshuawuyts added the enhancement (New feature or request) label on Oct 28, 2019
@Wassasin
Contributor Author

Wassasin commented Nov 7, 2019

Certainly. I'll work on it during the impl days. ;)

@Wassasin marked this pull request as ready for review on November 11, 2019, 12:54
[Inline review thread on the documentation example added in the diff, which prints stream items with a pause of 2 seconds between each print.]
Contributor


If we could convert this to an actual test, that would be ideal. I think it might be a bit too specific to have as a standalone example like this.

Contributor


I still don't think we quite need this example here, but I'll file a follow-up PR to address that. Thanks heaps!

Contributor

@yoshuawuyts left a comment


This is very good! Added some comments to finish out the polish. Thanks so much for putting the time in to get this working!

Contributor

@yoshuawuyts left a comment


This seems great. Only CI is failing now; I think we should be good to merge once that's resolved. Thanks!

@Wassasin
Contributor Author

Even with 50ms/150ms (450ms max) the CI test fails. Apparently the CI server is really slow. Any ideas on how to fix it? I am reluctant to increase the max exec time for the test to more than 1 second.

@yoshuawuyts
Contributor

@Wassasin perhaps all we should do is test the lower bound: ensure that at least n millis have passed, which is the only hard guarantee we provide about the throttle duration anyway.
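A lower-bound-only test along those lines might look roughly like this. It is a sketch, not the test that was merged: it assumes `async-std` with the `throttle` adaptor available, and the durations and names are illustrative. It only asserts that *at least* the expected time has passed, so a slow CI machine cannot make it flaky.

```rust
use std::time::{Duration, Instant};

use async_std::prelude::*;
use async_std::stream;
use async_std::task;

#[test]
fn throttle_guarantees_a_lower_bound_only() {
    task::block_on(async {
        let dur = Duration::from_millis(50);
        let start = Instant::now();

        // Three items with at least `dur` between consecutive emissions
        // means at least two full pauses in total.
        let mut s = stream::repeat(1u8).take(3).throttle(dur);
        let mut count = 0;
        while s.next().await.is_some() {
            count += 1;
        }

        assert_eq!(count, 3);
        // Assert only the lower bound; an upper bound would be at the
        // mercy of CI scheduling.
        assert!(start.elapsed() >= dur * 2);
    });
}
```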

@Wassasin
Contributor Author

Testing on the CI is very insightful. Everything should be OK now.

Contributor

@yoshuawuyts left a comment


Looking great; thanks a lot!

@yoshuawuyts merged commit 338273e into async-rs:master on Nov 14, 2019
Labels: enhancement (New feature or request)
Merging this pull request may close: Add Stream::throttle
2 participants