State of symbolic shapes branch

State of symbolic shapes: Mar 26 edition

Previous update: State of symbolic shapes branch - #46 by ezyang

Executive summary

This was a sleepy week. There’s a lot of activity on various dynamic shapes bugs on the bug tracker, though fewer fixes than I would have liked.

The numbers:

  • Model status on master. See also Symbolic shapes work items tracker - Google Sheets
    • aot_eager inference: -1 (unchanged). Still vision_maskrcnn sympy error from reshape(torch.empty(s1, (s0 + 1)//2, 2), (s1, s0)).
    • aot_eager training: -2 (unchanged). Still botnet26t_256 and eca_botnext26ts_256.
    • inductor inference: -7 (-1 WoW). The regression is tf_efficientnet_b0 from [inductor] hoist symbolic padding expressions by ngimel · Pull Request #97099 · pytorch/pytorch · GitHub; this is a case of temporarily breaking things to move globally in a better direction.
    • inductor training: -1 (+1 WoW). pytorch_unet was fixed, leaving tf_efficientnet_b0 (NameError: name ‘s1’ is not defined).
  • Graph breaks on master. Sorry, CI is still not updated to run with dynamic shapes; maybe next week. We think the graph breaks are not super high priority at the moment, unless they are also affecting performance.
  • Tracing cost of enabling dynamic shapes (aot_eager). benchmarks/dynamo/run_delta.sh --backend aot_eager --devices cuda --cold-start-latency --ci. Mean: 12s (unchanged), Max: 153s (+2 WoW, probably noise)
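To give some intuition for the vision_maskrcnn sympy error above: the reshape asks the symbolic reasoner to prove that a tensor of shape (s1, (s0 + 1)//2, 2) has the same number of elements as one of shape (s1, s0), i.e. s1 * ((s0 + 1)//2) * 2 == s1 * s0. That equality only holds when s0 is even, so it cannot be discharged unconditionally for a symbolic s0. A minimal sketch in plain Python (no torch, just the element-count arithmetic) showing the even/odd split:

```python
def numel(shape):
    # Product of the dimensions, i.e. total element count of a tensor.
    n = 1
    for d in shape:
        n *= d
    return n

s1 = 3
for s0 in (4, 5):  # even vs. odd
    src = (s1, (s0 + 1) // 2, 2)  # shape before the reshape
    dst = (s1, s0)                # requested target shape
    # Equal element counts only when s0 is even: (s0 + 1)//2 * 2 == s0
    print(s0, numel(src) == numel(dst))
# → 4 True
# → 5 False
```

This is only an illustration of the arithmetic obstruction, not a reproduction of the actual sympy failure.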

What’s coming next?