Proposed changes to how nn.Modules ought to be written

One of the longest-running issues in PyTorch is a request to make it possible to change the default device. We've resisted adding this feature directly for a long time. [POC] Pass on Module kwargs to Parameter initialization by ezyang · Pull Request #53144 · pytorch/pytorch · GitHub is the latest iteration of the discussion on how to avoid needing it, specifically for the problem of allocating Modules on a device other than the default one. Thoughts and opinions solicited.
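To make the problem concrete, here is a rough sketch of the difference in usage the PR is aiming at. The status quo allocates parameters on the default device at construction, so placing a module on another device requires a separate move (and a throwaway allocation); the POC explores forwarding constructor kwargs such as `device=` through to Parameter initialization. The exact API is whatever the linked PR settles on; this is just an illustration:

```python
import torch
import torch.nn as nn

# Status quo: parameters are first allocated on the default device (CPU),
# then copied to the target device, wasting an allocation and a copy.
model = nn.Linear(3, 5).to("cuda")

# Direction explored in the PR: pass a device kwarg at construction so the
# parameters are initialized directly on the target device, with no
# intermediate CPU tensors.
model = nn.Linear(3, 5, device="cuda")
```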
