From mboxrd@z Thu Jan 1 00:00:00 1970
Message-Id: <1518614226.2839083.1270434856.67A2AB28@webmail.messagingengine.com>
From: Ethan Grammatikidis
To: 9fans@9fans.net
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
Content-Type: text/plain; charset="utf-8"
In-Reply-To: 
Date: Wed, 14 Feb 2018 13:17:06 +0000
References: <1518397859.379213.1267373104.4DD6AB1E@webmail.messagingengine.com>
 <1518440723.631581.1267818416.51CB1908@webmail.messagingengine.com>
Subject: Re: [9fans] There is no fork
Topicbox-Message-UUID: d16494b2-ead9-11e9-9d60-3106f5b1d025

On Mon, Feb 12, 2018, at 3:21 PM, Giacomo Tesio wrote:
> 2018-02-12 14:05 GMT+01:00 Ethan Grammatikidis :
> > On Mon, Feb 12, 2018, at 8:33 AM, Giacomo Tesio wrote:
> >> 2018-02-12 2:10 GMT+01:00 Ethan Grammatikidis :
> >>> linux-style package managers and bsd-style port trees facilitate
> >>> and enable coupling.
> >>
> >> What a package manager really facilitate is version management.
> >> That is when you want to use/update a software at version X that
> >> depends on libraries at version Y and Z.
> >
> > That's the marketing blurb, I've heard it a thousand times before.
> > [...]
> > So, for the last 10-12 years, maybe more, mountains of software
> > have been produced on the assumption that it will be easy to find
> > and install all their dependencies. That's only true for users of
> > big 'distributions' which have lots of people, a large professional
> > team or many contributors, to create and maintain the package tree.
>
> True, but part of cost here is the complexity of the package manager.
>
> >
> >> The use of dynamic linking make this complex and cumbersome.
> >> Also a single filesystem hierarchy does not help
> >
> > Dynamic linking is probably the largest, most visible part of the
> > problem, but your saying this makes me think you haven't tried to
> > compile very much software -- certainly not on anything other than
> > Debian or a similarly large distro where, these days, you can get
> > all the dependencies by pasting a line the package author provided.
>
> Well, I use Debian from Potato, but I've got my headaches with
> pinning, backports, conflicts and broken upgrades.
>
> Also, I think I've compiled my amount of software.
> As a rather recent example, automating the compilation of the gcc
> cross-compiler for Jehanne took its time since I had to compile and
> install locally specific versions of autotools and binutils, without
> forcing users to install texinfo.

Well done! I've only built gcc as a cross-compiler once in my life,
and I think that might have been gcc 2.95.

I think the reason I get so grumpy about all this is that it's harder
for me. I could say I never developed the mental toolset needed, but
sometimes I have managed to do these things "without killing myself",
so it's doubly frustrating when I fail.

On the other hand, you are talking about a C compiler, which isn't
going to have a lot of uncommon dependencies. Graphical programs can
be much worse, and so can some background servers for less-standard
features. I had trouble with a filesystem search indexer.

>
> I think I have an idea of what I'm doing, but I'm pretty open to
> suggestions and criticisms: the best time for them is now, since I
> did no real work on the matter.

Indeed, I'm sorry I didn't offer any in my last mail. I'd forgotten
about the operating system I planned last summer. I've found it now,
all my notes on a computer I rarely use. I put all the thought I could
into it, but of course it's not perfect. It would particularly need a
lot of directories to be searched when executing programs. (I guess it
would need a cache for that.)
My plan was to have each package in its own directory. Some of the
subdirectories were mandated: doc; cfg (the user's config, empty on
installation); cfg.def (defaults); inc (include); src; and
arch-dependent dirs, with 'all' for scripts. The arch-dependent dirs
would have subdirs: exe; lib; inc; src; test. (Like you, I wanted to
change 'bin'. It's ridiculous!) Looking at it now, I see it allows
tight dependencies between packages, so I guess it doesn't solve much.

I think a big part of the plan I didn't write down was to have large
packages: include the dependencies in the package. Windows programs
have done this for decades; it's what ended "DLL hell". It's certainly
something I intend for pretty much anything large I might develop in
the future. If there's one point I'll really stand by, it's this one.

There are some other odd things in my notes, not really on topic but
relevant to operating systems and software choices. "Find the right
layer for the task." Okay. Then there's my wish for a single scripting
language, in contrast to Plan 9's maze of little languages, none quite
alike. :) "The ministry of silly walks is a bad idea," which turned
out to be about unions: not using them as a standard feature or
implementing them in the kernel, because walk() is a bottleneck and
can of course hit deadlocked fileservers. Another: don't reject
seemingly featureful programs too quickly; there's this xgrep program
which implements \{n,n\} and character classes in under 2000 lines of
assembly language. Structured pipes? Sure, if you want to change the
whole concept of a terminal. :)

>
> > The painful ones particularly included dependencies for otherwise
> > nice programs. I'd get 2 or 3 levels down the dependency tree, and
> > suddenly, chaos!
> >
> > [...]
> >
> > Thinking about this stuff makes me bitter, so I ought to stop now.
> > It's possible the things you want won't intersect with the things
> > which caused me trouble, but I think I have considerable reason for
> > pessimism.
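Going back to the layout from my notes for a moment: written out as a
script it would look something like this. The package name and the
list of architectures are made up for illustration; the directory
names are the ones above.

```shell
# Sketch of the per-package layout described above.
# "xgrep-1.0" and the arch list are invented examples.
pkg=pkg/xgrep-1.0

# arch-independent: docs, user config (empty at install), defaults,
# includes, source
mkdir -p "$pkg"/doc "$pkg"/cfg "$pkg"/cfg.def "$pkg"/inc "$pkg"/src

# one dir per architecture, plus 'all' for scripts;
# each gets exe/lib/inc/src/test -- 'exe' rather than 'bin'
for arch in all amd64 arm64; do
    mkdir -p "$pkg/$arch"/exe "$pkg/$arch"/lib "$pkg/$arch"/inc \
             "$pkg/$arch"/src "$pkg/$arch"/test
done
```

Everything a package needs, including bundled dependencies, would live
under that one directory, which is what makes whole-package install
and removal a plain copy and a plain delete.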
>
> Well, obviously I'm naive enough to try to do something better! :-D

And that's not a bad thing! :)

>
> I think the problem is really tied to the nature of software
> development... just because bugs are.

Largely, yes. It can only be partially mitigated. I do think the best
idea is for package authors to include dependencies in the package,
not expecting users or distributors to do that work. Some will rant
and holler about insecurity, and maybe they're right, but I want the
freedom to try interesting software without the slave labour of
looking after those difficult packages in the dependency tree.

>
> To my money you have an alternative:
> - to be mostly self contained (as Plan 9/9front does), which is the
> optimal solution to the problem
> - to leverage outside softwares which evolve outside your control

I like this, but it does rather require a team.

>
> Both solution have to cope with bugs:
> - in Plan 9/9front you fix them
> - in other systems you can still fix them but if you contribute back
> the fix things turn "complex"...

Yup. My thought about "fixing things in the right layer" is relevant
here. If you have a self-contained system, you can do that. If it's
split up, it requires cooperation.

>
> Versioning, dependency trees (or sometime even graphs) and all these
> monsters comes from this problem.
>
> The self-contained approach is way more efficient... and simpler.
> Thus it's very attractive to me.
>
> But, my insight is that these monsters need to be faced, and
> defeated. :-)
> Since you can't stop (or control) the evolution of software you do
> not code yourself, there's no way to avoid versioning and using that
> software.

There is: use the big commercial OSs for most stuff, and keep the
software you want to develop private. Rather restricting, I know. :)

>
> But again my insight is that using static linking and namespaces,
> packages can be way easier to maintain.

I haven't considered namespaces in package management for a long time.
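Thinking aloud about how the namespace half might go: keep every
package in its own tree and union only its per-arch exe directory onto
/bin with Plan 9-style binds, so the "installed system" is just
whatever the namespace says it is. All the paths and package names
below are invented.

```rc
# rc sketch: per-package trees unioned into the namespace
bind -a /pkg/frotz-2.4/$cputype/exe /bin
bind -a /pkg/xgrep-1.0/$cputype/exe /bin

# wanting a different version in one process is just a different
# bind, shadowing the old one; nothing global changes
bind -b /pkg/xgrep-2.0/$cputype/exe /bin
```

With statically linked binaries there is no library path to keep
consistent across those binds, which I take to be Giacomo's point.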
I'm not convinced about the static linking part at all. It certainly
makes binary package management simpler, but the problems I complain
about occurred during compilation, or sometimes configuration.

>
> Still, I'd really like to know details about your concerns and your
> painful experiences, since they could put me on the right track.

They're too far in the past to remember them all. Mostly they took
this form: X requires Y and Z. Z's all right, but Y requires foo bar
baz quux *and* syzygy, and then I'd find 3 of those had their own
compilation problems. Slave labour required! The other thing was of
course autohell, but that got a lot less likely some time during the
last decade. It's still possible though.

>
> > I'd like to think there are enough people who don't want to be
> > tied up in all this pain to create some semblance of a software
> > ecosystem outside it, but when you consider that few people want
> > to start from the ground up, and how poettering and fd.o have
> > their tentacles in crucial parts of the posix-related ecosystem,
> > it looks impossible.
>
> Well, actually there are several hobby OSes that do not support posix
> and package management.
> (and some have interesting ideas too...)

I've always had a problem of not looking around enough. I use FreeDOS.
It has a package manager, but I use unzip and deltree instead (or
rm -r, I forget which).

>
> But the problem with your approach is not just posix compliance.
>
> For example, in Jehanne most tools you are used in Plan 9 are not
> part of the core system.

I have to say, as much as I like Plan 9 C and hate gcc, I understand
you taking the C compilers out and replacing them with gcc. My big
plan at the moment is to ditch C for a language which only requires an
extremely small interpreter, and yet can be compiled too. I'm having a
little bit of trouble following the indirection in a double-indirect
threaded interpreter, but this is something I'm confident I can
achieve.
That language is Forth, but I'm starting to wonder if the same could
be achieved with other languages.

>
> For example, porting Plan 9/9front games to Jehanne is trivial (could
> even be automated), but their changes should not cause the core
> system to "evolve".
>
> So the solution, again, is installing them as packages, with their
> own versions. And this is the reason why there are no games in
> Jehanne: they are waiting for a package manager.
>
> The problem, as always, is to get the axes right.

Oh, that's similar to "put the fix in the right layer." Yup.

>
> An OS core system should evolve as a whole.
> But since its usefulness depend on the amount of things people can do
> with it, it should also be able to run exogenous software.

Your OS should, perhaps. :) My minimum requirements for an OS I'll
consider useful are much lower, provided I have an alternative system
for web browsing and maybe games. For 2 or 3 years I used 9front full
time, primarily acme, telnet, and page. I was happy most of the time.
Sometimes I'd play Minecraft or something like Second Life, but
despite a certain amount of addiction to 3D graphics, I got almost as
much out of my Plan 9 use.

>
> The Plan9/9front approach is optimal because it's perfectly useful
> exactly to the people it want to be useful to.
>
> Jehanne is useless. It's just a toy, aka a research operating
> system. :-)
> But in it's research scope is how to enable software development to
> scale more easily than in mainstream alternatives.
>
> My insight is that, as a simpler system, Jehanne will be more
> powerful. And the additional simplicity/power will make such complex
> problems almost disappear.

Additional power can come with simplification, as everyone on this
list knows. I hope it works out for you! :)

>
> Giacomo

-- 
The lyf so short, the craft so long to lerne. -- Chaucer