One of the longest-running bug reports in PyTorch is a request to make it possible to change the default device. We've resisted adding this feature directly for a long time. [POC] Pass on Module kwargs to Parameter initialization by ezyang · Pull Request #53144 · pytorch/pytorch · GitHub is the latest iteration of the discussion on how to avoid needing it, specifically for the problem of allocating Modules on a device other than the default one. Thoughts and opinions solicited.
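
For readers who want the concrete shape of the problem: module constructors allocate their parameters on the default device, so targeting another device today means constructing the module and then moving it. A minimal sketch is below; the `device=` constructor kwarg in the second snippet is the kind of kwarg forwarding the POC explores, not a settled API, and the device chosen here is only illustrative.

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Status quo: parameters are allocated on the default device (CPU) first,
# then moved, which costs an extra allocation and copy per parameter.
linear = nn.Linear(4, 4).to(device)

# The shape of API the POC explores: forward device/dtype kwargs from the
# Module constructor down to Parameter initialization, so the tensors are
# created directly on the target device.
linear = nn.Linear(4, 4, device=device)
```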