One of the longest-running open issues in PyTorch is a request to make it possible to change the default device. We have resisted adding this feature directly for a long time. [POC] Pass on Module kwargs to Parameter initialization by ezyang · Pull Request #53144 · pytorch/pytorch · GitHub is the latest iteration of the discussion of how to avoid needing it, specifically for the problem of allocating Modules on a different device than the default one. Thoughts and opinions solicited.
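To make the problem concrete, here is a minimal sketch of the status quo workaround; the commented-out line illustrates the kind of constructor-level device argument the PR prototypes (the exact API is what the PR is exploring, so treat it as illustrative rather than final):

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Status quo: parameters are allocated and initialized on the default
# device (CPU), then copied to the target device afterwards.
model = nn.Linear(1024, 1024)
model = model.to(device)

# What the PR explores (illustrative): forwarding a device kwarg from the
# Module constructor down to Parameter initialization, so parameters are
# created on the target device directly instead of being moved after the fact.
# model = nn.Linear(1024, 1024, device=device)
```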