One of the longest-running bugs in PyTorch is a request to make it possible to change the default device. We've resisted adding this feature directly for a long time. [POC] Pass on Module kwargs to Parameter initialization by ezyang · Pull Request #53144 · pytorch/pytorch · GitHub is the latest iteration of the discussion on how to avoid needing it, specifically for the problem of allocating Modules on a device other than the default one. Thoughts and opinions solicited.
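To make the problem concrete, here is a minimal sketch of the allocation pattern at issue and the shape of the API the PR explores. The `device=` kwarg on `nn.Linear` shown in the second snippet is what the POC proposes, assumed available here (a constructor kwarg of this shape later landed on factory-style modules):

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Status quo: parameters are allocated and initialized on the default
# device (CPU), then copied over, so the CPU-side initialization work
# is thrown away.
model = nn.Linear(128, 128).to(device)

# What the PR explores: forward constructor kwargs such as device=
# down to Parameter initialization, so parameters are created on the
# target device from the start, with no CPU round trip.
model = nn.Linear(128, 128, device=device)
```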