
Perhaps there is a misprint at line 40 #2111

Closed
@weiguo-li

Description


Instead of

# self-attention layers in nn.TransformerEncoder are only allowed to attend

the comment should read

# self-attention layers in nn.TransformerDecoder are only allowed to attend

i.e. Decoder rather than Encoder.
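For context, the causal ("subsequent") mask the quoted comment describes restricts each position to attend only to earlier positions, which is the behavior usually associated with decoder self-attention. A minimal sketch of that mask, applied here to an `nn.TransformerEncoder` stack (the masking logic itself is standard PyTorch; the dimensions and layer settings are illustrative assumptions, not from the tutorial):

```python
import torch
import torch.nn as nn

seq_len = 4
# Upper-triangular mask with -inf above the diagonal:
# position i may attend only to positions <= i.
mask = torch.triu(torch.full((seq_len, seq_len), float("-inf")), diagonal=1)

# Illustrative layer sizes (d_model=8, nhead=2 are arbitrary choices).
layer = nn.TransformerEncoderLayer(d_model=8, nhead=2, batch_first=True)
encoder = nn.TransformerEncoder(layer, num_layers=1)
encoder.eval()  # disable dropout so the causality check below is deterministic

x = torch.randn(1, seq_len, 8)
out = encoder(x, mask=mask)

# Under the causal mask, the output at position 0 depends only on the
# input at position 0, so perturbing a later token must not change it.
x2 = x.clone()
x2[:, -1] += 1.0
out2 = encoder(x2, mask=mask)
assert torch.allclose(out[:, 0], out2[:, 0], atol=1e-5)
```

This shows that the mask, not the module name, is what makes attention causal: an encoder stack given a subsequent mask attends exactly the way the comment describes for a decoder.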

cc @svekars @carljparker
