
Pytorch backend slow with pymc model #1110

Open
@Ch0ronomato

Description

@ricardoV94 made a nice perf improvement in pymc-devs/pymc#7578 to try to speed up the jitted backends. I tried the torch backend as well, and the model ran quite slowly.

| mode | t_sampling (seconds) | manual measure (seconds) |
| --- | --- | --- |
| NUMBA | 2.483 | 11.346 |
| PYTORCH (COMPILED) | 206.503 | 270.188 |
| PYTORCH (EAGER) | 60.607 | 64.140 |

We need to investigate:

  1. Why torch is so slow.
  2. Why torch compile is slower than eager mode.
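
For reference, a minimal sketch of how the two backends can be compared on a standalone PyTensor graph, outside of sampling. This uses a toy graph rather than the actual PyMC model, and assumes the mode names "NUMBA" and "PYTORCH" match the currently registered PyTensor linkers:

```python
import timeit

import numpy as np
import pytensor
import pytensor.tensor as pt

# Toy graph standing in for the model's logp graph (assumption).
x = pt.vector("x")
out = pt.exp(x).sum()

data = np.random.default_rng(0).normal(size=100_000)

for mode in ("NUMBA", "PYTORCH"):
    fn = pytensor.function([x], out, mode=mode)
    fn(data)  # first call pays the compilation / JIT warm-up cost
    t = timeit.timeit(lambda: fn(data), number=100)
    print(f"{mode}: {t:.4f} s for 100 calls")
```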

When doing perf evaluations, keep in mind that torch does a lot of caching. If you want a truly cache-less eval, you can either call torch.compiler.reset() between runs or set the env variable that disables the dynamo cache (google it).
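
A sketch of what the reset approach might look like when timing the compiled path; torch.compiler.reset() is the real API, but the function being compiled here is just a stand-in:

```python
import timeit

import torch


def f(x):
    # Stand-in workload; the real target would be the compiled logp function.
    return torch.sin(x) + torch.cos(x)


x = torch.randn(100_000)

# Clear dynamo/inductor state so this measurement includes a fresh compile
# rather than reusing the cache from an earlier run in the same process.
torch.compiler.reset()
compiled = torch.compile(f)
t = timeit.timeit(lambda: compiled(x), number=100)
print(f"compiled: {t:.4f} s for 100 calls (first call includes compilation)")
```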
