Short name describing what triggered the graph break: Data dependent operator
Values or code snippet captured at the break point: str(cause.func)
Explanation of why the graph break was triggered: Operator {cause.func} has a non-Tensor output whose value is dependent on the data of Tensor inputs.
Hints on how to resolve the graph break: No hints provided.
Example code that causes the graph break is:
import torch

def fn(x):
    return torch.equal(x, x)
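For reference, a minimal sketch of how this surfaces: compiling with fullgraph=True disallows graph breaks, so Dynamo raises an error when it reaches torch.equal instead of falling back to eager execution (backend="eager" here is illustrative):

import torch

def fn(x):
    return torch.equal(x, x)

# fullgraph=True forbids graph breaks, so tracing torch.equal raises
# an error rather than splitting the graph around the call.
compiled_fn = torch.compile(fn, fullgraph=True, backend="eager")
try:
    compiled_fn(torch.ones(5))
except Exception as e:
    print(f"{type(e).__name__}: {e}")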
A sample workaround for this is:
import torch

def fn(x):
    return torch.equal(x, x)

input_tensor = torch.ones(5)

# Workaround: allow graph breaks by using the default fullgraph=False.
# Dynamo will run torch.equal in eager mode and compile what it can around it.
compiled_fn = torch.compile(fn, backend="eager")
result = compiled_fn(input_tensor)
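Where the boolean result is only consumed as a Tensor, another option (an illustrative sketch, not from the registry) is to replace the data-dependent operator with a Tensor-valued equivalent, so the whole function compiles with fullgraph=True. Note that (x == y).all() assumes the inputs have matching shapes, whereas torch.equal also compares shapes:

import torch

def fn_tensor(x):
    # (x == x).all() returns a Tensor rather than a Python bool,
    # so no data-dependent value escapes the graph and no break is needed.
    return (x == x).all()

compiled_fn = torch.compile(fn_tensor, fullgraph=True, backend="eager")
result = compiled_fn(torch.ones(5))  # tensor(True)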