With `TORCH_LOGS="recompiles"`, it complains:
```
torch._dynamo.guards.__recompiles: [DEBUG] Recompiling function adamw in /opt/conda/envs/python3.7/lib/python3.8/site-packages/torch/optim/adamw.py:279, triggered by the following guard failure(s):
    - L['lr'] == 0.0002999999984974876  # func( # optim/adamw.py:339 in adamw
torch._dynamo.guards.__recompiles: [DEBUG] Recompiling function adamw in /opt/conda/envs/python3.7/lib/python3.8/site-packages/torch/optim/adamw.py:279, triggered by the following guard failure(s):
    - L['lr'] == 0.0002999999984964864  # func( # optim/adamw.py:339 in adamw
```
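The guard `L['lr'] == 0.000299…` fails because `lr` is passed into `adamw` as a plain Python float, so Dynamo specializes the compiled graph on its exact value; every time an LR scheduler nudges the learning rate, the guard breaks and the optimizer step recompiles. A minimal sketch of the usual workaround, assuming the recompiles come from a scheduler stepping alongside a compiled optimizer step: pass `lr` as a 0-dim tensor so it becomes graph data instead of a baked-in constant (the model, optimizer choice, and hyperparameter values here are illustrative).

```python
import torch

model = torch.nn.Linear(8, 8)

# lr as a 0-dim tensor rather than a float: Dynamo traces it as an
# input, so changing its value does not invalidate the guard.
opt = torch.optim.AdamW(model.parameters(), lr=torch.tensor(3e-4))
sched = torch.optim.lr_scheduler.ExponentialLR(opt, gamma=0.999)

@torch.compile
def opt_step():
    opt.step()

for _ in range(5):
    model(torch.randn(4, 8)).sum().backward()
    opt_step()     # compiled once; new lr values flow in as tensor data
    sched.step()   # mutates the lr tensor, no guard failure / recompile
    opt.zero_grad()
```

Tensor learning rates are supported by the built-in schedulers in recent PyTorch releases; with an older version, the same effect can be had by mutating the lr tensor in `opt.param_groups[0]["lr"]` in place.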