How to get graph weights in torch.compile with a custom backend?

In the FX graph passed to a custom backend, all weights/parameters are lifted to placeholder nodes. How can I get the original weight/parameter values in this FX graph? I want to do some processing in the custom backend during compilation.

Hi @RunnerZhong - would the target/name attribute on the placeholder node be sufficient for this use case? Since the placeholder order matches example_inputs, you could zip the two together and use a naming heuristic to identify which entries are parameters - lifted weights/parameters have names that look something like l_self_modules_linear_parameters_weight_.

import torch

def my_custom_backend(gm: torch.fx.GraphModule, example_inputs: list[torch.Tensor], **kwargs):
    # Placeholders appear in the same order as example_inputs, so zipping
    # the two pairs each lifted parameter/input with its value.
    placeholders = [n for n in gm.graph.nodes if n.op == 'placeholder']
    for i, (ph, inp) in enumerate(zip(placeholders, example_inputs)):
        print(f"Placeholder {i}: {ph.name}, target={ph.target}")
        print(f"inp is: {inp}")

    return gm
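
A minimal end-to-end sketch of driving such a backend through torch.compile (assuming a recent PyTorch where Dynamo lifts module parameters into graph inputs; the toy model and the `captured` list are made up for illustration):

```python
import torch
import torch.nn as nn

captured = []  # (placeholder target, tensor shape) pairs seen at compile time

def my_custom_backend(gm, example_inputs, **kwargs):
    placeholders = [n for n in gm.graph.nodes if n.op == "placeholder"]
    for ph, inp in zip(placeholders, example_inputs):
        # Lifted parameters carry targets resembling
        # l_self_modules_..._parameters_weight_, while real inputs keep
        # names derived from the forward() arguments.
        captured.append((ph.target, tuple(inp.shape)))
    return gm  # run the captured graph unmodified

# Hypothetical toy model just to trigger a compile.
model = nn.Sequential(nn.Linear(4, 2))
compiled = torch.compile(model, backend=my_custom_backend)
compiled(torch.randn(1, 4))

for target, shape in captured:
    print(target, shape)
```

Inspecting the printed targets against the shapes is usually enough to tell parameters from activations, though the exact naming scheme is an internal Dynamo detail and can change between releases.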