
Implement gradient for QR decomposition #1099

Closed
@jessegrabowski

Description


QR is one of the few remaining linalg ops that is missing a gradient. The JAX code for the JVP is here, which also includes this derivation. This paper likewise claims to derive the gradients for QR, but I find it unreadable.
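For reference, here is a minimal NumPy sketch (not PyTensor code) of the forward-mode rule that the JAX JVP linked above implements, restricted to a real, full-column-rank $A$ with $m \ge n$. The function name and the finite-difference check are purely illustrative:

```python
import numpy as np
from scipy.linalg import solve_triangular

def qr_jvp(A, dA):
    """Tangents (dQ, dR) of the reduced QR of A in direction dA (real A, m >= n, full column rank)."""
    Q, R = np.linalg.qr(A, mode="reduced")
    # dA @ R^{-1}, computed by solving R^T X^T = dA^T
    dA_Rinv = solve_triangular(R, dA.T, lower=False, trans="T").T
    M = Q.T @ dA_Rinv
    M_lower = np.tril(M, -1)
    dO = M_lower - M_lower.T        # skew-symmetric part, equal to Q^T dQ
    dQ = Q @ (dO - M) + dA_Rinv
    dR = (M - dO) @ R
    return (Q, R), (dQ, dR)

# Finite-difference sanity check
rng = np.random.default_rng(0)
A, dA = rng.normal(size=(5, 3)), rng.normal(size=(5, 3))
(Q, R), (dQ, dR) = qr_jvp(A, dA)
eps = 1e-6
Q2, R2 = np.linalg.qr(A + eps * dA, mode="reduced")
print(np.allclose((Q2 - Q) / eps, dQ, atol=1e-4),
      np.allclose((R2 - R) / eps, dR, atol=1e-4))
```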

Relatedly, though perhaps worthy of a separate issue, this paper derives gradients for the LQ decomposition, $A = LQ$, where $L$ is lower triangular and $Q$ is orthonormal ($Q^T Q = I$). Compare this to QR, which gives $A = QR$, where $Q$ is again orthonormal but $R$ is upper triangular, and you can see why I mention it in this issue. It wouldn't be hard to offer LQ as well.
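A small sketch of why LQ would come almost for free once QR has a gradient: an LQ factorization can be read off the QR factorization of the transpose, so the same rewrite would carry the gradient over. The `lq` helper below is hypothetical, not an existing PyTensor function:

```python
import numpy as np

def lq(A):
    """LQ via QR of the transpose: if A.T = Qt @ Rt, then A = Rt.T @ Qt.T = L @ Q."""
    Q_t, R_t = np.linalg.qr(A.T, mode="reduced")
    return R_t.T, Q_t.T

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 5))
L, Q = lq(A)
print(np.allclose(L @ Q, A),            # reconstructs A
      np.allclose(Q @ Q.T, np.eye(3)),  # rows of Q are orthonormal
      np.allclose(L, np.tril(L)))       # L is lower triangular
```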
