On Mon, Feb 28, 2022 at 1:49 PM Dan Cross <crossd@gmail.com> wrote:

> #ifdef has more or less always struck me as the solution to the wrong problem. "We have all this code and we want to shoehorn it into a new environment," instead of, "we have many environments, so we carefully structure the code to accommodate the differences." Of course, the latter is harder than the former, but it also pays larger dividends over time as compared to the former.
Absolutely - but the problem is the development process (more in a minute).

> The biggest problem with #ifdef wasn't so much that it existed, but rather that it was used for too many things that it wasn't well-suited for.
Agreed.

 
> The second biggest problem was that it was semantically unaware of the language; it was purely textual. Bummer.
Dan - I'll argue with you on that one, actually. One of the reasons C was successful as a professional language is that it could be easily preprocessed, something other languages like Pascal and even Ada sucked at when you tried. There are times when a preprocessor is just really handy, as Larry said, particularly in a product programming house.
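
Something like this, say -- a made-up sketch, not from any real code base, but the kind of thing cpp makes trivial and a strict Pascal compiler makes painful:

    #include <stdio.h>

    /* A tracing macro that compiles away to nothing in production
     * builds; __FILE__ and __LINE__ are the preprocessor earning
     * its keep. */
    #ifdef DEBUG
    #define TRACE(msg) fprintf(stderr, "%s:%d: %s\n", __FILE__, __LINE__, (msg))
    #else
    #define TRACE(msg) ((void)0)
    #endif

    int main(void)
    {
        TRACE("starting up");
        return 0;
    }

Build with -DDEBUG and the trace calls appear; build without and they cost nothing.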

And here is the issue ...

Doug and Rob are 100% right about Plan 9, but it's not a fair comparison. Plan 9 was a research system. X started out as that, but by the time of X11 it was a production system with very different types of programmers. When you are porting code, particularly under time constraints, Larry described exactly how it's done ... #ifdef NOTDEF ... ok, this works ... and off you go to the next issue with the product. In my experience, the development process tends to reinforce this behavior.
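
You know the pattern. A hypothetical fragment (the function names are invented for illustration):

    #include <stdio.h>

    static void generic_reset(int unit)      /* invented stand-in */
    {
        printf("reset unit %d\n", unit);
    }

    void console_reset(int unit)
    {
        /* Porting hack: the old path breaks on the new machine, so
         * compile it out under a name nobody will ever define and
         * move on.  "Fix it later." */
    #ifdef NOTDEF
        reset_vax_console(unit);             /* the old machine-specific path */
    #endif
        generic_reset(unit);
    }

Since NOTDEF is never defined, reset_vax_console doesn't even have to exist any more -- which is exactly why it never gets cleaned up.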

Note I am not telling you that's a good idea ... but it is typical behavior (a.k.a. whiskey to teenagers). You intend to go back and redo it, but it never gets done. The next person adds their hack, and pretty soon you have the BSD kernel with #ifdef FASTVAX or #ifdef BIG_ENDIAN scattered all over the code base.
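
The scattered version looks something like this -- a sketch in the old BSD style (modern headers spell the endianness test differently), with the same hand-rolled test repeated at every byte-order-sensitive spot in the tree:

    #include <stdint.h>

    /* One of dozens of places that each grow their own conditional. */
    uint32_t net_to_host(uint32_t n)
    {
    #ifdef BIG_ENDIAN
        return n;                        /* host already matches wire order */
    #else
        return ((n & 0xff000000u) >> 24) |
               ((n & 0x00ff0000u) >>  8) |
               ((n & 0x0000ff00u) <<  8) |
               ((n & 0x000000ffu) << 24);
    #endif
    }

Multiply that by every file that touches the network or the disk and you have the mess.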

As Larry and others have pointed out, an abstraction layer, when you can enforce it, is clearly best. But to be fair, one of the reasons C was so popular is that so much code could be moved. One of the things I hated about Pascal was that so much code had been written assuming 36- or 40-bit integers, or string limits that varied depending on byte or word size and how the length was encoded -- yeech.
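
For contrast, the abstraction-layer version of the byte-order example above (again just a sketch, with invented file names): the interface lives in one header, the machine knowledge lives in one small per-machine file the Makefile selects, and the rest of the tree never sees an #ifdef:

    /* byteorder.h -- what every caller includes */
    #include <stdint.h>
    uint32_t net_to_host32(uint32_t n);

    /* byteorder_le.c -- picked by the build for little-endian machines;
     * a big-endian machine gets a file where this is just "return n". */
    #include "byteorder.h"

    uint32_t net_to_host32(uint32_t n)
    {
        return ((n & 0xff000000u) >> 24) |
               ((n & 0x00ff0000u) >>  8) |
               ((n & 0x0000ff00u) <<  8) |
               ((n & 0x000000ffu) << 24);
    }

One file per machine, enforced by the build, instead of one conditional per call site.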

The problem, as Larry said, is that it is a powerful tool that was easy to abuse, and often was, by both neophytes and people who should have known better.