The optimizer recently added the ability to replace a compare with a subtraction under certain circumstances. This can fail for signed integers. For inputs a = 0x80000000, b = 4, int(0x80000000) < 4 is true (0x80000000 is INT_MIN), but int(0x80000000) - 4 overflows and wraps to 0x7ffffffc. That result is not less than zero, so the flags get set differently than they would for (a < b).
This really only affected signed comparisons because the subtract would always have signed source types, so it wouldn't be seen as a match for a compare with unsigned source types.
v2: Just require GL_EXT_shader_integer_mix. Remove stray change to tests/spec/CMakeLists.txt. Both suggested by Ilia. Since this is no longer a "stock" GLSL 1.30 test, move to tests/shaders/.
975fd8853 shaders: Reproduce a bug in the i965 backend optimizer
.../glsl-fs-absoluteDifference-int.shader_test | 82 +++++++++++++++++++
.../glsl-fs-absoluteDifference-uint.shader_test | 82 +++++++++++++++++++
.../glsl-vs-absoluteDifference-int.shader_test | 93 ++++++++++++++++++++++
.../glsl-vs-absoluteDifference-uint.shader_test | 93 ++++++++++++++++++++++
4 files changed, 350 insertions(+)