From: "Lucio De Re" > I don't hold anything against using the web as an interface, but it > sucks as historical records go. It is firmly stuck in the present. > > > this is a lot faster to write and use than some random piece of > > junk on windows/unix/linux etc and is reasonably portable across > > bourne shells that support shell functions. > > > I take this to mean that SCCS, RCS and CVS are all unsuitable, in your > opinion, for source management? Here I think our opinions diverge: I > wish to retain a convenient record of changes. More about that in a > moment. no, i use revision control religiously (automated by make). i prefer RCS for text files. SCCS is a catastrophe. never used CVS; do you think they find me contracts where i could? the url should point at the current version. the author uses whatever revision control with the master version of the source, which is the url does not point to. new releases are shipped to the url. > Different schools. A friend and I once tackled a moderately simple > text processing task in AWK and C respectively. We were all > experienced programmers, but not proficient in AWK or C, just > differently so, hence the choice. text processing in C? are you mad? C is dreadful when it comes to smashing strings about. you wind up building some library so that you don't have to worry about it or you declare: char buf[N]; which is all very fine until n > N bytes get stuffed into buf. i go for the former, if i have to use C. ~15 years ago i'd have to do it in C, 'cos a 1 mip 11/780 wasn't that fast (well, our's were). this is a bit painful to use, but is doesn't break, until you run out of memory [it's old, this code]: echo flex.h sed 's/.//' >flex.h <<'//GO.SYSIN DD flex.h' -/* - * Flexible string definitions. - * - * @(#)flex.h 1.31 - */ - -#define FLEXZ 128 - -typedef struct -{ - char *f_str; - char *f_end; - char *f_ptr; -} - flex; - -#define flex_char(f, c) (*((f)->f_ptr == (f)->f_end ? flex_fill(f) : (f)->f_ptr++) = (c)) - -extern void flex_end(); -extern char *flex_fill(); -extern void flex_init(); -extern void flex_str(); -extern void flex_nstr(); //GO.SYSIN DD flex.h echo flex.c sed 's/.//' >flex.c <<'//GO.SYSIN DD flex.c' -/* - * Flexible string handling. - */ - -#ifndef lint -static char sccsid[] = "@(#)flex.c 1.31"; -#endif lint - -#include "mace.h" -#include "flex.h" - -void -flex_init(f) -register flex *f; -{ - f->f_str = f->f_ptr = salloc(FLEXZ); - f->f_end = f->f_ptr + FLEXZ; -} - -char * -flex_fill(f) -register flex *f; -{ - register int s; - - s = f->f_end - f->f_str + FLEXZ; - - f->f_str = srealloc(f->f_str, s); - f->f_end = f->f_str + s; - f->f_ptr = f->f_end - FLEXZ; - - return f->f_ptr++; -} - -void -flex_end(f) -register flex *f; -{ - f->f_ptr = f->f_str; -} - -void -flex_str(f, s) -register flex *f; -register char *s; -{ - while (*s != '\0') - flex_char(f, *s++); -} - -void -flex_nstr(f, s, n) -register flex *f; -register char *s; -register int n; -{ - while (n-- > 0 && *s != '\0') - flex_char(f, *s++); -} //GO.SYSIN DD flex.c > We finished at about the same time, with identical results, and we > were both surprised :-) yeah, but they were toys: ``a moderately simple text processing task'' after the reading the 150 or so pages of the 4 MIME RFC's [in near despair] i was faced with the choice of C or awk to 'parse' who was sending what to whom. C would've been months of work, but i worked out a way to give them [the rat squad] what they wanted with awk. 
so it was days of reading mime rfc's, with thoughts varying between:

	- they did WHAT!?!
	- i'm gonna hunt 'em down and shoot 'em
	- kill me NOW

and then a coupla hours with awk.

> > IIRC plan 9 was an experiment that came out of a research
> > lab and in response to the question what they used for version
> > control, well the answer was:
> >
> > 	/n/dump
> >
> That's fine where conversation is the primary communication channel.

nope. you have the code. read it. that's what it's there for.

> Rob can walk up to Dave and ask a question and get a reasonable
> answer. But with the advent of geographically remote development, the
> knowledge has to be embedded in the source code. RCS (more than CVS,
> all I like about CVS is the client-server model, other things I accept
> intellectually, but have no emotional ties to) and SCCS held in a
> nutshell batches of changes that are related to each other in a
> fashion that /n/dump cannot do. Or am I still not making sense?

at SRC (in Palo Alto) and PRL (in Paris) we used a package management
system [The Siphon], and this was back in the early 1990s:

	Francis J. Prusker and Edward P. Wobber. The Siphon: Managing
	distant replicated repositories. In Proceedings of the IEEE
	Workshop on the Management of Replicated Data. IEEE, April 1990.
	Also appeared as PRL Research Report 7 (PostScript).
	ftp://ftp.digital.com/pub/DEC/PRL/research-reports/PRL-RR-7.ps.Z

distributed bug fixing. now if engineering could have just listened.
i could tell you a story about ULTRIX engineering, but then i'd have
to shoot you, or compaq would have me for breach of confidentiality.

> > on that topic i built a version of /n/dump using a program
> > (to call ftw(3) and stat(2)) and some scripts (to select and
> > copy the files) when i was at PRL. as fast as i could free up
> > RA90's [1GB] i was headed to have the last ~30 days of all the
> > users' home directories on mag disc. presented it as a WIP at a
> > USENIX -- hell, ken, with phil in tow, walked in.
> >
> You could have been well rewarded for that, ...

i was: got treated like shit by one of the researchers for weeks on
end, and then a small thank you when he blew away some file or other.
a PhD in comp sci but can't master rm(1)?

> the first crisis the DOS
> user encounters when moving to Unix is the inability to recover
> deleted files, and a facility as you provided would have been a
> godsend to many like me who had to learn the discipline of
>
> 	alias rm="rm -i"

these people should all be given loaded, hair-trigger .357 magnum
colt pythons shipped with windows. it'd either teach them a lesson
about confirmations or improve the gene pool. do knives come with a
-i option?

> > on top of that ran the normal backups. my version of /n/dump
> > was there for when someone blew away a file that they'd created
> > recently so they could go and get it themselves with cd, ls and cp.
> >
> Very cool, as I was saying. But only viable once disk space stopped
> being the most expensive computing resource (or second, whatever, when
> it was no longer as critical).

for our needs IIRC we needed about 4Gb for 30 days and then it'd
cycle. slowly i 'stole' RA-90's [1Gb] by shifting stuff around.
now 4Gb is nothing.

> For curiosity's sake, when I was at
> university in the early seventies, admin and research (just under
> 10000 students) shared a Univac 1106, later Univac 1110, with 50Meg of
> disk space.

ever heard of running vi(1) on an 11/23 with 7th Ed and RL-02s?
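going back to the /n/dump hack for a second: the ftw(3) end of it was
never much code. this is a from-memory sketch, not the PRL original --
the 30 day window is the only number taken from above, and every name
in it (visit, DAYS) is made up. it just prints mtime and path for
recently touched plain files and leaves selecting and copying to the
scripts:

/*
 * sketch of a /n/dump-style walker: print mtime and path for plain
 * files modified in the last DAYS days, one per line, for the copy
 * scripts to chew on. not the original program.
 */
#include <sys/types.h>
#include <sys/stat.h>
#include <ftw.h>
#include <stdio.h>
#include <time.h>

#define DAYS	30			/* how far back the dump reaches */

static time_t cutoff;

static int
visit(const char *path, const struct stat *sb, int flag)
{
	if (flag == FTW_F && sb->st_mtime >= cutoff)
		printf("%ld %s\n", (long)sb->st_mtime, path);
	return 0;			/* 0 keeps ftw walking */
}

int
main(int argc, char **argv)
{
	int i;

	cutoff = time(NULL) - (time_t)DAYS * 24 * 60 * 60;
	for (i = 1; i < argc; i++)
		if (ftw(argv[i], visit, 20) == -1)	/* up to 20 open descriptors */
			perror(argv[i]);
	return 0;
}

the stat(2) side comes for free, since ftw hands the struct stat to
the callback; the selection and copying stayed in the scripts.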
> CVS was a bunch of scripts, and RCS was very much based on SCCS, which
> I presume was a mature product, at least by the time RCS was being
> undertaken. I think your points are sound, but I'm really not aiming
> at immature technology, although no doubt it still has rough edges.

i think you better check that 'cos IIRC:

	SCCS came out of USG: a dreadful piece of junk
	RCS was 'BSD' based: raw, small and kinda neat

but useless for DNS zone file version #'s.

> As I said, it is a matter of comfort zones. I have to refer to man
> pages even for rudimentary shell commands, whereas I think I have a
> good memory for C idioms (some of which I get wrong without fail :-)

i have a selective memory: 173000g

> Please write a book!

``that's not been un-thought of...''

so do what the subject line says. get a rapid prototype, see how it
goes, fix/improve it, loop. if i get some peace in the next few days,
i might do some real work. 10 years, 7 days, ~20 hours in this
'police action'. i have a firefight to attend to first, so the rest
is on hold.

	obradovitch takes his war kinda serious
		-- Pettibone's Law, John Keene