One of the longest-running issues in PyTorch is a request to make it possible to change the default device. We've resisted adding this feature directly for a long time. [POC] Pass on Module kwargs to Parameter initialization by ezyang · Pull Request #53144 · pytorch/pytorch is the latest iteration of the discussion on how to avoid needing it, specifically for the problem of allocating Modules on a device other than the default one. Thoughts and opinions solicited.
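To make the problem concrete, here is a minimal sketch of the usage the POC is aiming at, assuming the kwarg forwarded to parameter initialization is spelled `device` (as in the PR title's "Module kwargs"):

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Status quo: parameters are initialized on the default device (CPU)
# and then copied to the target device, paying for an extra allocation
# and an initialization pass that is immediately thrown away.
layer = nn.Linear(4, 8).to(device)

# With kwarg forwarding, parameters would be allocated and initialized
# directly on the target device, with no change to the global default.
layer = nn.Linear(4, 8, device=device)
print(layer.weight.device)
```

The appeal of this approach is that the device choice stays local to the module constructor, rather than flowing through mutable global state the way a changeable default device would.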