Hi everyone.
Heterogeneous memory such as HBM is becoming available on CPUs, so a machine may have two types of memory: DDR and HBM. The platform can configure them in Flat mode or Cache mode. If we configure the two memory types in Flat mode, how can we make a PyTorch tensor aware of the memory type?

A PyTorch tensor has a built-in ‘device’ attribute that identifies the device the tensor operates on, but extending it with an ‘hbm’ device does not seem like a good idea, because the tensor still operates on the CPU and only stores its data in HBM. Would it be preferable to add a NUMA-like attribute to the tensor for this situation?
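To make the idea concrete, here is a minimal pure-Python sketch of what a NUMA-like attribute could look like. Everything here is hypothetical, not an existing PyTorch API: `NumaTensor`, `mem_node`, and the node numbers are just illustrations of the proposal (in Flat mode, HBM typically appears to the OS as a separate NUMA node alongside the DDR nodes, and a real allocator would bind storage with something like libnuma's numa_alloc_onnode).

```python
from dataclasses import dataclass
import array

# Hypothetical node IDs: in Flat mode the OS usually exposes HBM as its
# own NUMA node next to the DDR node(s). Actual IDs are platform-specific.
DDR_NODE = 0
HBM_NODE = 1

@dataclass
class NumaTensor:
    """Sketch of a CPU tensor carrying a NUMA-like placement attribute.

    The compute device stays 'cpu'; `mem_node` only records which memory
    node the backing storage should live on. This keeps memory placement
    orthogonal to the existing `device` attribute.
    """
    data: array.array
    mem_node: int = DDR_NODE  # placement hint, not a compute device

    @property
    def device(self) -> str:
        # Unchanged semantics: the tensor still operates on the CPU.
        return "cpu"

def to_node(t: NumaTensor, node: int) -> NumaTensor:
    """Copy a tensor's storage to another memory node (simulated here;
    a real implementation would reallocate on the target NUMA node)."""
    return NumaTensor(array.array(t.data.typecode, t.data), mem_node=node)

t = NumaTensor(array.array("d", [1.0, 2.0, 3.0]))   # defaults to DDR
h = to_node(t, HBM_NODE)                             # same device, new node
```

With this shape, `t.device` and `h.device` are both 'cpu', and only the `mem_node` hint differs, which is the distinction the ‘hbm’-device approach fails to capture.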
Any feedback is welcome.
Raymond.