Date: Thu, 16 Apr 2009 18:24:36 +0100
From: roger peppe
To: Fans of the OS Plan 9 from Bell Labs <9fans@9fans.net>
Subject: Re: [9fans] typed sh (was: what features would you like in a shell?)

2009/4/6 Bakul Shah:
> On Thu, 02 Apr 2009 20:28:57 BST roger peppe wrote:
>> a pipeline is an amazingly powerful thing considering
>> that it's not a turing-complete abstraction.
>
> "f | g" is basically function composition, where f and g are
> stream functions. Of course, this simple analogy breaks down
> the moment we add more input/output channels -- may be that
> is why anything beyond a simple pipeline seems to get people
> in trouble (see the rc output redirection thread).

actually, the analogy works fine if we add more input channels -
it's multiple output channels that make things hard, as they mean
that you have an arbitrary directed graph rather than a tree,
which doesn't have such a convenient textual representation and
is harder to comprehend to boot.

in alphabet, i had a diagnostic channel, which was strictly
textual and not easily accessible from the language; that was
arguably not the best solution, but i didn't want things to get
too complex.

> To go beyond simple char streams, one can for example build a
> s-expr pipeline: a stream of self identifying objects of a
> few types (chars, numbers, symbols, lists, vectors).

the difficulty with s-exprs (and most nested structures, e.g. XML)
from a pipeline point of view is that their nested nature means
that any branch might contain unlimited quantities of stuff, so
you can't always process in O(1) space - and O(1)-space processing
is one of the things i really like about pipelines.

i found a nice counter-example in the fs stuff - the fundamental
type was based around a "conditional-push" protocol for sending
trees of files: the sender sends some information on a
file/directory and the receiver replies whether to descend into
it or not. the tree had a canonical order (alphabetical on name),
so tree merging could be done straightforwardly in O(1) space.

this kind of streaming "feels" like a regular pipeline, but you
can't do this with a regular pipeline. for instance, a later
element in the pipeline can prevent an earlier one from descending
into a part of the file system that might block indefinitely.

every language has a trade-off between typed and untyped
representations; with alphabet i was trying to create something
where it was *possible* to create new kinds of types where
necessary (as in the fs example), but where it wouldn't be
customary or necessary to do so in the vast majority of cases.

perhaps it was folly, but i still think it was an interesting
experiment, and i don't know of anything similar.
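
p.s. to make the "f | g is function composition" point concrete,
here's a rough sketch in Go (nothing to do with alphabet's actual
code - the names and types are made up): a stream function maps
one channel to another, and piping is just nested application.
note that a stage with two inputs and one output (a merge, say)
still fits this shape because the whole pipeline stays a tree;
it's a second output that turns it into a graph.

package main

import "fmt"

// a "stream function" turns one channel into another; composing
// two of them is just nesting the calls - the moral equivalent
// of f | g.
type stream func(<-chan string) <-chan string

func pipe(f, g stream) stream {
	return func(in <-chan string) <-chan string {
		return g(f(in))
	}
}

// double sends each input line twice.
func double(in <-chan string) <-chan string {
	out := make(chan string)
	go func() {
		defer close(out)
		for s := range in {
			out <- s
			out <- s
		}
	}()
	return out
}

// exclaim appends a "!" to each line.
func exclaim(in <-chan string) <-chan string {
	out := make(chan string)
	go func() {
		defer close(out)
		for s := range in {
			out <- s + "!"
		}
	}()
	return out
}

func main() {
	src := make(chan string, 1)
	src <- "hello"
	close(src)
	// the equivalent of: echo hello | double | exclaim
	for s := range pipe(double, exclaim)(src) {
		fmt.Println(s)
	}
}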
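
p.p.s. and a similarly hypothetical sketch of the conditional-push
idea - not the real fs protocol, just its shape: the sender pushes
one entry per name in alphabetical order and blocks until the
receiver says whether to go down that branch, so a later stage can
prune subtrees the sender hasn't touched yet, and two such streams
could be merged with O(1) buffering because the order is canonical.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// entry is roughly what the sender pushes for each name; the
// receiver answers on reply, saying whether the sender should
// descend into it (only meaningful for directories here).
type entry struct {
	path  string
	isDir bool
	reply chan bool
}

// send walks dir in canonical (alphabetical) order - os.ReadDir
// already sorts by name - pushing an entry for each name and
// blocking on the receiver's verdict before descending.
func send(dir string, out chan<- entry) {
	defer close(out)
	var walk func(string)
	walk = func(p string) {
		ents, err := os.ReadDir(p)
		if err != nil {
			return
		}
		for _, e := range ents {
			name := filepath.Join(p, e.Name())
			reply := make(chan bool)
			out <- entry{path: name, isDir: e.IsDir(), reply: reply}
			if <-reply && e.IsDir() {
				walk(name)
			}
		}
	}
	walk(dir)
}

// the receiver: a downstream element can prune whole subtrees
// before the sender ever reads them - here anything called
// "tmp" is skipped, so the sender never descends into it.
func main() {
	c := make(chan entry)
	go send(".", c)
	for e := range c {
		skip := e.isDir && filepath.Base(e.path) == "tmp"
		e.reply <- !skip
		if !skip {
			fmt.Println(e.path)
		}
	}
}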