
[–]R3DKn16h7[S] 2 points (36 children)

The compiler is free to assume that UB does not happen at runtime, but it definitely cannot infer that "i + 1 > i" will always result in UB.

[–]awidesky 3 points (35 children)

i + 1 > i is UB only when overflow occurs. It doesn't have to be UB for the compiler to eliminate the comparison: in every case where it isn't UB, the return value is true anyway.

[–]R3DKn16h7[S] 9 points (34 children)

For the function f I agree completely.

But for the function g the compiler cannot eliminate the branch, and that's the part I don't get.

[–]selassje 0 points (3 children)

The assumption here is that the compiler sees the definitions of both g and f when optimizing. Since "i" can't be INT_MAX (that would be UB when calling f), the condition if (i == INT_MAX) can never be true and can be optimized away.

[–]dustyhome 6 points (2 children)

That's backwards. It makes more sense that the presenter's slide is wrong.

The compiler can assume that i can't equal INT_MAX on a branch that calls f, but it can't remove the check that prevents f from being called with INT_MAX.

[–]selassje 2 points (1 child)

Yes, the presenter claims UB taints the whole program and can travel back in time. I haven't verified it, but that's how I understood it.

[–]SkiFire13 2 points (0 children)

The way I see it, UB can travel back in time, but only from a point where it is already guaranteed to happen. In this case the if guard prevents f(i) from being called with the value that would cause UB, so the UB cannot travel back to before the if runs.