
Commit 0bbcac5

anijain2305 authored and pytorchmergebot committed
Monkey patch Variable module to fix FX codegen
Fixes pytorch/torchdynamo#82. There is a `torch.autograd.variable` function which conflicts with the module class `torch.autograd.variable.Variable`. @jansel

Pull Request resolved: #76079
Approved by: https://github.com/jansel, https://github.com/albanD
1 parent 4766b68 commit 0bbcac5

File tree

1 file changed: +7 −2 lines changed

torch/autograd/__init__.py

Lines changed: 7 additions & 2 deletions
@@ -296,8 +296,13 @@ def _is_checkpoint_valid():
 
 
 def variable(*args, **kwargs):
-    warnings.warn("torch.autograd.variable(...) is deprecated, use torch.tensor(...) instead")
-    return torch.tensor(*args, **kwargs)
+    raise RuntimeError("torch.autograd.variable(...) is deprecated, use torch.tensor(...) instead")
+
+# Monkey patching variable.Variable to fix FX codegen. FX generates a call by roughly doing
+# f"{fn.__module__}.{fn.__name__}(...). This yields torch.autograd.variable.Variable(...) in the
+# output of an FX graph. Unfortunately the module name torch.autograd.variable is shadowed by the
+# deprecated function - variable(...).
+variable.Variable = Variable  # type: ignore[attr-defined]
 
 if not torch._C._autograd_init():
     raise RuntimeError("autograd initialization failed")
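The shadowing problem this patch works around can be illustrated with a standalone sketch. The names below are hypothetical stand-ins for the real `torch.autograd.variable` module, its deprecated `variable(...)` function, and the `Variable` class; only the attribute-lookup mechanics are the point.

```python
class Variable:
    """Stand-in for torch.autograd.variable.Variable."""

def variable(*args, **kwargs):
    """Deprecated function that shadows the `variable` submodule name."""
    raise RuntimeError("variable(...) is deprecated")

# FX-style codegen builds a call target from dunder attributes, roughly:
target = f"{Variable.__module__}.{Variable.__qualname__}"

# In torch, __module__ is "torch.autograd.variable", so evaluating the
# generated code does an attribute lookup on `variable` -- which resolves
# to the deprecated *function*, not the submodule. Functions accept
# arbitrary attributes, so attaching the class makes that lookup succeed:
variable.Variable = Variable

assert getattr(variable, "Variable") is Variable
```

This is why the patch is a monkey patch on the function object rather than a rename: renaming either the function or the module would break existing callers, while `variable.Variable = Variable` only changes what attribute lookup on the function finds.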

0 commit comments