What are these `_foreach_` operators, and why is there no fallback?

I’m testing the OpenCL backend with 2.1.1 and noticed many calls to missing `_foreach_` functions, such as `aten::_foreach_mul_.Scalar`.
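For reference, here is roughly the kind of call that hits them (a sketch only; the device string is a placeholder for however the out-of-tree backend registers itself):

```python
import torch

# Sketch of a call that dispatches aten::_foreach_mul_.Scalar.
# "privateuseone" is a placeholder device name for an out-of-tree backend.
ts = [torch.ones(4, device="privateuseone") for _ in range(3)]
torch._foreach_mul_(ts, 2.0)  # fails if the backend lacks a foreach kernel
```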

Now, while I clearly understand the utility of such operators for optimizers, they add a huge burden for backend developers.

Is there any way to automatically fall back to the standard non-foreach functions that are already implemented?

Also, I understand it is better to implement fused_adam since it is far more efficient, and I have such a function in dlprimitives, but it would still require me to implement several optimizers…

Bump… can anybody say anything about this?

I understand they are mostly called from things like Adam, but can’t they be implemented automatically in terms of the normal kernels?

Maybe @janeyx99 can answer.

They can indeed be implemented as a for-loop over existing aten ops, and we do do that! However, before my PR [Allow slow foreach to run for any backend, not just CPU (#127412)](https://github.com/pytorch/pytorch/pull/127412), this implementation was only registered to the CPU backend key. Since that PR landed a month ago, we’ve registered the path for CompositeExplicitAutograd, which should function as a sufficient fallback kernel.
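To illustrate what that slow path amounts to, here is a minimal sketch in Python (the real fallback lives in C++ inside ATen; this is just the conceptual shape):

```python
import torch

def slow_foreach_mul_(tensors: list[torch.Tensor], scalar: float) -> None:
    # Conceptual equivalent of the slow aten::_foreach_mul_.Scalar path:
    # one ordinary aten::mul_ dispatch per tensor instead of a fused kernel.
    for t in tensors:
        t.mul_(scalar)

params = [torch.ones(3) for _ in range(4)]
slow_foreach_mul_(params, 0.5)
print(params[0])  # tensor([0.5000, 0.5000, 0.5000])
```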

Let me know if you have further questions/whether this satisfies the ask!


Yes, I have a couple:

  1. Does this mean it is present in nightly builds and not in older PyTorch versions (since it is a new PR)?
  2. Do I need to do anything as a backend developer to enable this fallback?

Thanks!

> Does this mean it is present in nightly builds and not in older PyTorch versions (since it is a new PR)?

Yes, it is relatively new, but release 2.4 is out and this PR should be included there.

> Do I need to do anything as a backend developer to enable this fallback?

I do not believe so! This is registered on the CompositeExplicitAutograd key, which is an alias key. The alias key includes the PrivateUse1 backend key, so the composite implementation will be the one dispatched to unless another kernel is specifically registered for your backend key.
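If you want to double-check what is registered on your build, one option is the dispatcher’s dump utility (note: `torch._C._dispatch_dump` is a private debugging API, so its name and output format may change between releases):

```python
import torch

# Print the kernels registered for a foreach op across dispatch keys.
# _dispatch_dump is a private API; treat this purely as a debugging aid.
print(torch._C._dispatch_dump("aten::_foreach_mul_.Scalar"))
```

If a kernel shows up under CompositeExplicitAutograd and nothing is registered under your backend key, the fallback path is what will run.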
