
A basic CI asv check would be useful #1446

Closed
@kandersolar

Description

Is your feature request related to a problem? Please describe.
Currently there's no automated check to ensure that changes to the benchmarks are valid. For example, in #1443 and #1445, the best way right now to be sure the new benchmarks are valid (i.e., that they won't error when run in the nightly job) is to check out the PR locally and try it out manually. It would be nice if this was done automatically somehow.

An automated benchmark check would also prevent us from forgetting to update the benchmarks when we make breaking changes to pvlib itself.

Describe the solution you'd like
A new GitHub Actions workflow that builds the asv environments and executes the benchmarks at least once to ensure validity. Note that I'm not suggesting that we actually use the timing results for anything: the goal is to verify that the benchmarks execute without error, not to detect performance regressions. The latter will still be the nightly VM's responsibility.
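
As a rough sketch, such a workflow might look like the following. This is only an illustration, assuming the asv configuration lives under the repository's benchmarks/ directory; the workflow name, trigger paths, action versions, and Python version are all placeholder choices, not settled decisions:

```yaml
# .github/workflows/asv_check.yml  (hypothetical filename)
name: asv-check

on:
  pull_request:
    # the paths filter is an assumption; it could be dropped so the
    # check runs on every PR
    paths:
      - "benchmarks/**"
      - "pvlib/**"

jobs:
  quick-benchmarks:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0  # asv resolves commits from the git history

      - uses: actions/setup-python@v4
        with:
          python-version: "3.10"

      - name: Install asv
        run: pip install asv

      - name: Run benchmarks once to catch errors
        working-directory: benchmarks
        run: |
          asv machine --yes
          # --quick runs each benchmark exactly once and discards the
          # timings; the only goal is to surface exceptions
          asv run --quick --show-stderr HEAD^!
```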

Describe alternatives you've considered
Running the benchmarks in earnest for PRs would also solve this, but that is still a complicated problem that I don't want to take on at this point. I think this small step in that direction makes more sense for now.

Additional context
asv run --quick seems to do what I want (ref):

Do a "quick" run, where each benchmark function is run only once. This is useful to find basic errors in the benchmark functions faster. The results are unlikely to be useful, and thus are not saved.

--strict is probably also useful here, although see airspeed-velocity/asv#1199
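If --strict does behave as hoped, the run step in the sketch above would only need the extra flag, e.g.:

```yaml
      - name: Run benchmarks once to catch errors
        working-directory: benchmarks
        run: |
          asv machine --yes
          # --strict is intended to make asv exit non-zero when a benchmark
          # errors, which is what would let the CI job actually fail
          # (subject to the asv#1199 caveat above)
          asv run --quick --strict --show-stderr HEAD^!
```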
