Is there any plan to support FP8 as a datatype in PyTorch?
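
For context, recent PyTorch releases already expose experimental float8 dtypes (`torch.float8_e4m3fn` and `torch.float8_e5m2`), though operator coverage is still limited. A minimal sketch, assuming a PyTorch 2.1+ build where these dtypes are available:

```python
import torch

# Assumes a recent PyTorch build (2.1+) exposing the experimental float8 dtypes;
# availability and operator support vary by version and backend.
x = torch.randn(4, 4, dtype=torch.float32)

# Cast to the two FP8 formats PyTorch exposes:
# e4m3 (more mantissa bits, narrower range) and e5m2 (wider range).
x_e4m3 = x.to(torch.float8_e4m3fn)
x_e5m2 = x.to(torch.float8_e5m2)

# Most ops are not implemented for float8 yet, so compute is typically
# done by casting back up to a higher-precision dtype.
print(x_e4m3.dtype, x_e5m2.dtype)
print((x - x_e4m3.to(torch.float32)).abs().max())  # quantization error
```

The "Float8 in PyTorch [1/x]" topic linked below discusses the design direction for these dtypes in more detail.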
Related topics

| Topic | Replies | Views | Activity |
|---|---|---|---|
| PyTorch 2 Quantization, How it works? | 1 | 449 | June 24, 2024 |
| Float8 in PyTorch [1/x] | 1 | 15360 | April 7, 2025 |
| Supporting new dtypes in PyTorch | 2 | 2574 | January 25, 2024 |
| Quantization in Pytorch | 3 | 1392 | February 24, 2025 |
| Implementing true boolean tensors | 2 | 424 | October 12, 2024 |