From: Charles Forsyth
To: 9fans@cse.psu.edu
Subject: Re: [9fans] new compilers
Date: Sat, 25 Mar 2006 20:02:48 +0000

> no no no... when you change code you're supposed to add more code.
> Aren't you paying attention to the way software development apparently
> works? :)

in some ways, the code reduction result is closer to the general
approach encouraged by earlier pioneers in the field.  it's one reason
there was once a little dismay at the choice of `lines of code produced'
as a productivity metric.  these days, it sells code generators: not the
sort that russ changed, which does some real work, but the sort that
generates thousands of lines of asn.1 parser that people then tweak by
hand.  but i digress...

ken's compilers used a `copy and change' method rather than an elaborate
portability layer (or many many layers, or many many #ifdefs, which is
gcc's technique).  in this case, however, after 15 or more architectures
failed to change a copied section appreciably, russ declared it portable
after nearly 20 years, and moved it to ../cc.  (a small sketch of the
two styles follows at the end of this message.)

with any luck, it might start a trend.  two trends, in fact: the
copy+change technique makes the code easier to read, and, the bit you
pointed out, revision of code leads to less of it that does as much.
and there would be much rejoicing.

sadly, there currently seems to be fat chance of thin code.
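
to make the contrast concrete, a small sketch in c.  none of this is
code from either compiler: the function names, the wordsize example,
and the TARGET_* macros are invented for illustration; only the
per-target directory names (vc for mips, kc for sparc, 8c for 386) and
the shared ../cc are the real ones.

	/*
	 * a sketch of the two portability styles; everything here is
	 * invented for illustration, not taken from gcc or ken's cc.
	 */
	#include <stdio.h>

	/* gcc's style: one copy of the source, target differences
	 * expressed as #ifdef layers inside shared functions. */
	int
	wordsize(void)
	{
	#if defined(TARGET_MIPS) || defined(TARGET_SPARC)
		return 4;
	#elif defined(TARGET_AMD64)
		return 8;
	#else
		return 4;
	#endif
	}

	/* ken's style: each target is a separate copy of the compiler
	 * (vc/ for mips, kc/ for sparc, 8c/ for 386, ...) and a routine
	 * like this is simply edited in whichever copy needs it.  the
	 * two variants below stand in for two such copies, shown in one
	 * file only so the sketch compiles. */
	int wordsize_vc(void) { return 4; }	/* the mips copy */
	int wordsize_8c(void) { return 4; }	/* the 386 copy */

	/* when a copied routine survives many targets without being
	 * changed appreciably, it is declared portable and moved to the
	 * shared directory (../cc in the real tree).  a constant folder
	 * is the sort of thing that never needs per-target edits: */
	int
	fold_add(int a, int b)
	{
		return a + b;	/* identical in every copy */
	}

	int
	main(void)
	{
		printf("wordsize (386 copy): %d\n", wordsize_8c());
		printf("folded: %d\n", fold_add(2, 3));
		return 0;
	}

the appeal of the second style is that each copy reads straight
through, with no conditional maze; the cost, duplicated text, shrinks
on its own once a copied routine stops diverging and can migrate to
../cc.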