I have been following this discussion about the C compiler and can no longer stop myself from making a (snarky?) comment.
The K&R definition of C was very much written when C was a slightly-higher-than-assembler language for the PDP-11 (at least that's how I became acquainted with it back in 1976). Most of us, in those days, welcomed something that was more high-level than macro-assembler and yet amenable to writing operating systems and utilities (unlike the FORTRAN, ALGOL and COBOL of that era). Many of us would use the -S switch to inspect the generated assembler code, and in some cases even modify the assembler for selected functions to get exactly the desired result.
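(For anyone who never worked that way: the same trick still works with any Unix-style C compiler. A sketch, using a throwaway function of my own invention:

    /* add.c -- a trivial function to inspect */
    int add(int a, int b)
    {
        return a + b;
    }

    /* cc -S add.c   leaves the generated assembler in add.s,
       where you can read it, or hand-tune it before assembling */

Back then the output was short enough that reading, and rewriting, it was entirely practical.)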
The PDP-11 had a rather simple instruction set, so the compiler produced relatively predictable code. Many of the undefined behaviours were, in practice, perfectly predictable on the PDP-11: we knew what to expect. It was only once code was ported to other systems that those assumptions started getting sorely tested.
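To give a concrete example of the sort of thing I mean (my example, not anything from the discussion above): signed overflow is undefined behaviour, but on a two's-complement machine like the PDP-11 everyone "knew" it simply wrapped, and code got written that relied on exactly that.

    #include <limits.h>

    int wraps(void)
    {
        int i = INT_MAX;
        return i + 1;   /* undefined behaviour: on the PDP-11 this
                           "obviously" wrapped around to INT_MIN,
                           so people leaned on it */
    }

A modern compiler is entitled to assume the overflow never happens and optimise on that basis, which is precisely where the old assumptions bite.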
Fast forward to the present: we have a bloated C standard and even more bloated C++ standards. The target instruction sets are rich with special-case instructions, out-of-order execution, and multi-level caches that add further constraints. So today's compilers need to analyse complex source code to make it run well on extremely complex targets. And we shouldn't forget that in the case of the x86 family the compilers need to optimise for an ever-evolving instruction set while retaining backward compatibility across earlier variants.
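You can see that tension with something as small as a summation loop; depending on which -march target you hand GCC or Clang, the very same source may come out as plain scalar code or as vector instructions (the file and function names here are mine):

    /* sum.c -- same source, different instruction selection per target */
    int sum(const int *v, int n)
    {
        int s = 0;
        for (int i = 0; i < n; i++)
            s += v[i];
        return s;
    }

    /* cc -O3 -S sum.c                    baseline x86-64, perhaps SSE2
       cc -O3 -march=x86-64-v3 -S sum.c   may come out using AVX2
       (assuming a compiler recent enough to know the x86-64-v3 level) */

The -S habit from 1976 is, if anything, more instructive now: it shows just how far the generated code has drifted from anything you'd predict by eyeballing the source.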