Python DispatchKey missing for custom op

Hey!

The `__torch_dispatch__` handler is registered as a fallback kernel for the Python dispatch key: https://github.com/pytorch/pytorch/blob/e14026bc2a6cd80bedffead77a5d7b75a37f8e67/aten/src/ATen/core/PythonFallbackKernel.cpp
This means it will be called for any op as long as an input Tensor has that key enabled.

Are you sure that you’re not overriding the Python key for your custom op by any chance?

btw, you can use `torch._C._dispatch_dump("namespace::op_name")` to see all the registrations that happened for a given op (and where they come from), and `torch._C._dispatch_dump_table("namespace::op_name")` to see the computed dispatch table for that op (i.e. what will actually get called for each key). That should help you figure out the details of what is happening here.
