Ha HA !  Good one !

I believe that the core of the problem with the C language is that it is based upon an abstraction of the PDP-11 instruction set.  CPUs such as Intel/AMD x64 are vastly more complex, so "optimising" C compilers are trying to make something simple take advantage of something far more complex.  Perhaps we should call them "complexifying" compilers.

Generally, model-to-model transformations (which is effectively what compilers do under the covers) are easier to define when we transform from a higher level of abstraction to a lower one.  As folks in the MBSE field put it, going the other way is like trying to put a pig back together from sausages.

On Wed, 5 Sep 2018 at 09:20, Charles Forsyth <charles.forsyth@gmail.com> wrote:
Plan 9 C implements C by attempting to follow the programmer's instructions, which is surprisingly useful in systems programming.
The big fat compilers work hard to find grounds to interpret those instructions as "undefined behaviour".


On Sun, 2 Sep 2018 at 17:32, Chris McGee <newton688@gmail.com> wrote:
Hi All,

I'm reading this article about how they are going through the giant heaping pile of Linux kernel code and trying to come up with safer practices to avoid the "dangers" of C. The prevailing wisdom appears to be that things should eventually be rewritten in Rust someday.


I'm curious how the Plan 9 C compiler fits into this story. I know that it was designed to avoid many of the pitfalls of standard C. Does it try to address some of these dangers or is it focused on making code more readable so that problems are more apparent?

How does everyone feel about the Plan 9/9front kernel? Have they gone through hardening/testing exercises over the years? I'm curious what tools are available to help discover bugs.

Cheers,
Chris