FP8 datatype in PyTorch

We discussed FP8 at a recent composability meeting; the public minutes are available at Composability meeting notes - Google Docs

The summary is that, while it is a bit premature to add proper FP8 dtypes to PyTorch, we are going to add generic bits8/16/etc. dtypes to PyTorch so you can easily prototype FP8 in a tensor subclass without needing core to first add all of the necessary support. Angela Yi is looking into adding this support!
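To make the prototyping idea concrete: an FP8 value is just a bit pattern that fits in one byte, so a tensor subclass can store it in a generic 8-bit container and interpret the bits itself. As a minimal sketch (not PyTorch API; the function names here are hypothetical), the E5M2 variant of FP8 can be viewed as the top byte of an IEEE binary16 value, which makes a plain-Python encode/decode easy to write:

```python
import struct

def fp8_e5m2_encode(x: float) -> int:
    """Pack a float into an 8-bit E5M2 pattern (sign, 5 exp bits, 2 mantissa bits).

    E5M2 shares its sign/exponent layout with IEEE binary16, so encoding is
    just truncating a float16 to its high byte. (Real implementations would
    round to nearest rather than truncate; this is only an illustration.)
    """
    (half_bits,) = struct.unpack("<H", struct.pack("<e", x))
    return half_bits >> 8

def fp8_e5m2_decode(b: int) -> float:
    """Expand an 8-bit E5M2 pattern back to a Python float via binary16."""
    (x,) = struct.unpack("<e", struct.pack("<H", (b & 0xFF) << 8))
    return x
```

Exactly representable values like 1.0 or -2.0 survive the round trip, while values needing more than 2 mantissa bits get truncated; a real FP8 tensor subclass would hold an array of such bytes and implement its own arithmetic on top.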
