Why did the compiler do that - the one with VRP and integer size

Here’s a simple program: it right shifts a signed int by 9 bits, truncates the result to an unsigned char, and returns 1 if that value differs from 0x74, or 0 otherwise.

$ cat test.c
extern int e;
int main() {
    unsigned char result = e >> 9;
    // result = 0x74;
    if ((int)result != 0x74)
        return 1;
    return 0;
}

Let’s compile this for the avr target, choosing to optimize for size.
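For concreteness, one possible invocation (assuming an avr-gcc cross toolchain is on the PATH; we stop at assembly with -S so the undefined external `e` never needs to be linked):

$ avr-gcc -Os -S test.c -o test.s

Here -Os asks the compiler to optimize for size, and the generated test.s is what we will inspect.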