zsh-workers
* Re: PATCH: compadd (+ questions)
@ 1999-02-12  8:42 Sven Wischnowsky
  1999-02-13 19:19 ` Bart Schaefer
  1999-02-14  0:30 ` Thinking about Functions/Completion/* differently Bart Schaefer
  0 siblings, 2 replies; 4+ messages in thread
From: Sven Wischnowsky @ 1999-02-12  8:42 UTC (permalink / raw)
  To: zsh-workers


Bart Schaefer wrote:

> On Feb 11, 10:11am, Peter Stephenson wrote:
>
> ...
>
> } I, too, found the new [[-tests a little confusing at first sight.  I'm
> } inclined to think maybe they should be limited to those that can't be
> } done so easily in the standard way.  For example, I don't see why
> } people shouldn't use (( NMATCHES )) rather than [[ ! -nmatches 0 ]] .
> } But I haven't really got a proper grip on using this yet.
> 
> This is what I'm thinking, too.  I may be getting my chronology confused,
> but wasn't it the case that the new condition codes showed up before all
> the variables for NMATCHES and CURRENT etc. were available?
> 
> There's a short list of stuff you can't easily do with the variables:
> 
> 1.  Automatic shifting of bits of PREFIX into IPREFIX, as -iprefix does
>     and as -string and -class do with two args.
> 2.  Adjusting the range of words used, as -position does with two args.
> 3.  Skipping exactly N occurrences of an embedded substring, as -string
>     and -class do with two args.
> 
> Have I missed any others?  Sven didn't answer my question about whether
> the remaining condition codes are faster or have other side-effects:

(Sorry, I was rather busy yesterday and had to leave earlier...)

The -[m]between and -[m]after conditions also adjust the range of
words to use and these are the ones that are most complicated to
re-implement in shell-code.
Changing the values of the user-visible parameters is the only
side-effect they have.
As for speed: they may be a bit faster than the equivalent tests with
parameter expansion, but I don't think this is an argument for keeping
the condition codes, since normally you'll have only a few such tests.
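Most of the replicable conditions reduce to plain pattern tests on the
special parameters. As a rough illustration (the function name and the
simulated PREFIX/IPREFIX handling here are my own stand-ins, not part of
the patch), the effect of `[[ -iprefix - ]]' could be approximated like
this:

```shell
# Hypothetical stand-in for `[[ -iprefix - ]]': if PREFIX starts with
# the given string, shift that part from PREFIX into IPREFIX, as the
# two-argument forms of -string and -class are described to do.
iprefix() {
  local pre=$1
  case $PREFIX in
    "$pre"*)
      IPREFIX=$IPREFIX$pre        # append the matched part
      PREFIX=${PREFIX#"$pre"}     # and strip it from PREFIX
      return 0 ;;
  esac
  return 1
}

PREFIX='-9' IPREFIX=''
iprefix '-' && echo "IPREFIX=$IPREFIX PREFIX=$PREFIX"
```

The point being that nothing here needs a condition code; it is a single
pattern test plus two assignments.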

> The point being that, although being able to add conditions in modules is
> a cool thing, perhaps we should drop the ones that are easily replicated
> using simple tests on the variables, and then look again at the rest to
> see if there's a better way to express them.

Agreed, although I'm not sure if I'll produce a patch for this any
time soon.

> Unless there's a reason not to, such as, interpolating the string values
> of the variables during parsing of the expression is significantly slower
> than recognizing the new codes themselves (which I don't know whether or
> not is the case).

Again, I don't think that this is a problem, due to the expected
number of such tests.

> } For really clean completion functions we would need a new control
> } structure to be able to put all tests into one function and having
> } everything reset to the previous state automatically. Doing something
> } like:
> } 
> }   if comptest iprefix '-'; then
> }     ...
> }     compreset
> }   else
> } 
> } wouldn't be that easy to glark, too.
> 
> I still think the right way to do this is with shell-function scoping.
> 
> 	tryprefix () {
> 	    complocal	# Equivalent of "local CURRENT NMATCHES ...",
> 	    		# if "local" worked on special parameters
> 	    comptest ignored-prefix $1 && { shift ; "$@" }
> 	}
> 	tryprefix - complist -k "($signals[1,-3])"
>
> The unfortunate thing about this is that it clashes with $argv holding
> the command line (but that's true of any function called from within the
> top-level main-complete function, isn't it?).  There should probably be
> another array that also holds the words, but I guess it isn't that hard
> to have the main completion function set up a global for it.

Ah, now I finally understand why you want resetting in function
scopes. One problem I have with this is that we can come up with a few
functions for the tests, but the things to do if the test succeeds can
often not easily be put on a command line.
Also, we often need combinations of tests: for and'ed tests this would
lead to things like `tryrange -exec \; tryprefix - ...', and for or'ed
tests it gets even more awkward. Also, will the non-parameter-modifying
tests have equivalent functions, or will this lead to combinations of
such function calls and `[[...]]' conditions? This can get very hard
to read, I think.

About using argv for the stuff from the command line: I'm not too
happy with this anyway. If the new-style completion is slower than the
old one, this may partly be caused by the many calls to sub-functions
where we have to use "$@" to propagate the positional parameters.

> Looking at the above gives me a nagging idea ... it's half-formed at the
> moment, but it goes something like this ...
> 
> The effect of `compsave; [[ -iprefix ... ]] ; compreset` is that we want
> to try a completion with a particular prefix and if that fails, start
> over with the original state and try again.  Similarly for -position to 
> limit the range of words, etc.
> 
> So why don't we actually DO that?  That is, make a "recursive" call to
> the whole completion system?  Add -iprefix, -position, -string, -class
> options to "compcall" (or some similar new command) and have THAT adjust
> the variables, re-invoke the appropriate main completion function, and
> then restore everything if that function returns nonzero.  It would then
> be an advertized side-effect of compcall that, if it returns nonzero,
> the state of the completion variables has been adjusted -- which makes a
> lot more sense to me than having a conditional do it.

We only seldom want to do normal completion after adjusting the
word/words to use. In most cases we want to make some very specialised
calls to complist/compadd or whatever, and we want only those
matches. In particular, we normally don't want the main completion
function to be called; it would use the `first' completion definition
again, and things like that.

But maybe we are on a completely wrong track. I forgot to mention it
yesterday, but when splitting the examples into different files I had
the unpleasant feeling that all this is the wrong way to do things.
The problem is that the current approach is completion-centered, not
command-centered. Just think about someone who wants to use one of the
example completions for another command: he would have to either edit
the file, adding the command name to the first line, or call `defcomp'
by hand.
I have the feeling that this, your remarks about the `tryprefix'
function and about this recursive calling (and, btw, some of the mails
that started all this new completion stuff) all point in the same
direction. Maybe we should rebuild the example code in a completely
context-based way.
I haven't thought that much about all this yet, but it would go like
this:

We have a collection of context names (completely shell-code based;
this is only vaguely connected to the CONTEXT parameter). For each
such context we have a definition saying what should be completed in
that context (such a definition could again be a function or an array,
re-using some of the code). There would be a dispatcher function that
gets the context name as its argument and then invokes the
corresponding handler. Some of these handler functions will do some
testing and decide, based on the result of the tests, which contexts
are really to be used. The task of the main completion widget (note:
the widget, not the dispatcher) would be to check parameters like
CONTEXT, decide which context(s) to try, and then call the dispatcher.
We would then come up with files containing context definitions (some
of the files are exactly that already and have names that show it; I
think the fact that for other files names of commands seemed more
natural may be a hint that something went wrong).
Then we would need a way to connect command names to context names,
preferably in some automagically collected way.
When implementing the example stuff I already had this kind of
feeling, and thus the code already looks a bit like the above, but not
throughout. So we wouldn't need that many changes (I think); we would
mainly need a shift in our perception of the problem (well, at least I
would need it) and change the code to reflect it everywhere.

Bye
 Sven


--
Sven Wischnowsky                         wischnow@informatik.hu-berlin.de



* Re: PATCH: compadd (+ questions)
  1999-02-12  8:42 PATCH: compadd (+ questions) Sven Wischnowsky
@ 1999-02-13 19:19 ` Bart Schaefer
  1999-02-14  0:30 ` Thinking about Functions/Completion/* differently Bart Schaefer
  1 sibling, 0 replies; 4+ messages in thread
From: Bart Schaefer @ 1999-02-13 19:19 UTC (permalink / raw)
  To: zsh-workers

On Feb 12,  9:42am, Sven Wischnowsky wrote:
} Subject: Re: PATCH: compadd (+ questions)
}
} > The point being that, although being able to add conditions in modules is
} > a cool thing, perhaps we should drop the ones that are easily replicated
} > using simple tests on the variables, and then look again at the rest to
} > see if there's a better way to express them.
} 
} Agreed, although I'm not sure if I'll produce a patch for this any
} time soon.

If you're about to do new doc, though, you shouldn't spend time documenting
something that's going to go away ...

} > 	tryprefix () {
} > 	    complocal	# Equivalent of "local CURRENT NMATCHES ...",
} > 	    		# if "local" worked on special parameters
} > 	    comptest ignored-prefix $1 && { shift ; "$@" }
} > 	}
} > 	tryprefix - complist -k "($signals[1,-3])"
} 
} Ah, now I finally understand why you want resetting in function
} scopes. One problem I have with this is that we can come up with a few 
} functions for the tests, but the things to do if the test succeeds can 
} often not easily be put on a command line.

That's just a matter of more shell functions:

    listsignals() { complist -k "($signals[1,-3])" }
    tryprefix - listsignals

but you're right that a real syntactic construct would be much cleaner.

Maybe we can figure out a way to make "local" work inside { ... } ?  I've
seen some postings on zsh-users that lead me to believe some people think
it does already.

} Also, we often need combinations of tests: for and'ed tests this would
} lead to things like `tryrange -exec \; tryprefix - ...', and for or'ed 
} tests it gets even more awkward.

You wouldn't do it that way.  You'd write a special function for each
particular combination of tests you wanted, and then just call them in
the appropriate spots.  Yes, this is crude, but it's easy to explain and
use, in terms of existing zsh scripting concepts.
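As a concrete sketch of that idea (the predicate and wrapper names here
are purely illustrative stand-ins, not real comptest builtins or anything
from the patch): one wrapper function per combination of tests, run where
that combination is needed:

```shell
# Stand-in predicates; the real versions would be comptest calls or
# condition codes. These are assumptions for illustration only.
in_range()   { [ "$1" = yes ]; }   # pretend: cursor between -exec and ;
has_prefix() { case $2 in "$1"*) return 0 ;; esac; return 1; }

# One wrapper per combination of tests: it runs its action only when
# every test in the chain succeeds, so call sites stay one line long.
try_exec_command() {
  in_range "$1" && has_prefix - "$2" && echo "complete options for $2"
}

try_exec_command yes -f    # both tests pass: the action runs
try_exec_command no  -f    # first test fails: nothing happens
```

Crude, as Bart says, but each wrapper reads as ordinary shell, and or'ed
combinations are just `try_a ... || try_b ...' at the call site.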

} Also, will the non-parameter-modifying tests have equivalent
} functions or will this lead to combinations of such function calls and 
} `[[...]]' conditions? This can get very hard to read, I think.

My thought was that only the tests with side-effects get a builtin.  I
don't think the mixture will occur often enough to get that confusing.
If you don't need to "local"-ize the side-effects, which in at least
some cases you won't, you can just write stuff like

    if test $NMATCHES -eq 0; then
	...
    elif comptest ignored-prefix - ; then
	...
    elif ...
	...
    fi

I.e., use "test" if it's really important to avoid mixing [[ ... ]] with
plain ol' command syntax.

} About using argv for the stuff from the command line. I'm not too
} happy with this anyway. If the new style completion is slower than the 
} old one, this may partly be caused by several calls to sub-functions where 
} we have to use "$@" to propagate the positional parameters.

So don't propagate it via positional parameters after the first level.
Having them in the positionals is a convenience for when the top-level
completion function doesn't need to be as complex as __main_complete,
e.g., when writing special key completions.

For Functions/Completion/*, something like this:

    alias compsub='words=( "$@" ) __normal || return 1'

or just have __main_complete assign to a local $words and then use it in
place of $argv in all of the sub-functions.
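A minimal sketch of that second suggestion (all function names here are
my own assumptions): the top-level function stores the words once in a
global array, and nested functions at any depth read it without any
"$@" forwarding:

```shell
# __main_complete copies its arguments into one global array `words';
# sub-functions read that array instead of receiving "$@" at each
# call level, so no positional-parameter propagation is needed.
__main_complete() {
  words=("$@")                # set once, visible everywhere below
  __sub_one
}
__sub_one() { __sub_two; }    # note: nothing passed along
__sub_two() { echo "got ${#words[@]} words: ${words[*]}"; }

__main_complete kill -9 1234
```

Each intermediate call then costs nothing extra, whatever the nesting
depth.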

} > So why don't we actually DO that?  That is, make a "recursive" call to
} > the whole completion system?  Add -iprefix, -position, -string, -class
} > options to "compcall" (or some similar new command) and have THAT adjust
} > the variables, re-invoke the appropriate main completion function, and
} > then restore everything
} 
} We only seldom want to do normal completion after adjusting the
} word/words to use. In most cases we want to make some very specialised
} calls to complist/compadd or whatever, and we want only those
} matches.

OK, then, give compcall another parameter that's the name of a different
top-level completion function to use.  That'd also solve the positional
parameter propagation problem we just talked about.

Note that I'm talking here about adding functionality to the compcall
builtin, not to some new piece in Functions/Completion/.

More later ...

-- 
Bart Schaefer                                 Brass Lantern Enterprises
http://www.well.com/user/barts              http://www.brasslantern.com



* Thinking about Functions/Completion/* differently
  1999-02-12  8:42 PATCH: compadd (+ questions) Sven Wischnowsky
  1999-02-13 19:19 ` Bart Schaefer
@ 1999-02-14  0:30 ` Bart Schaefer
  1 sibling, 0 replies; 4+ messages in thread
From: Bart Schaefer @ 1999-02-14  0:30 UTC (permalink / raw)
  To: zsh-workers

On Feb 12,  9:42am, Sven Wischnowsky wrote:
} Subject: Re: PATCH: compadd (+ questions)
}
} The problem is that the current approach is completion-centered, not
} command-centered. Just think about someone who wants to use one of the
} example completions for another command.

I think we need to be neither command-centered nor completion-centered.
We need a relational rather than operational view.

We have all these files, many of which amount to nothing more than a map
from a name like __aliases to an old compctl option letter.  We need to
recognize that this mapping is even more important than what the files
contain.

} I haven't thought that much about all this yet, but it would go like
} this:
} 
} We have a collection of context-names (completely shell-code based,
} this is only vaguely connected to the CONTEXT parameter). For each
} such context we have a definition saying what should be completed in
} this context (such a definition could again be a function or an array, 
} re-using some of the code).

This sounds OK so far ...

} There would be a dispatcher-function that
} gets the context-name as its argument and then invokes the
} corresponding handler.

What's the reason to do this with a dispatcher-function that takes an
argument, rather than simply using a "case ... esac"?
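One property worth noting about the dispatcher variant (this sketch and
all its handler names are invented for illustration, not taken from the
patch): if the dispatcher resolves handlers by a naming convention, a new
context can be added by simply defining another function, with no central
case statement to edit:

```shell
# Handlers follow a naming convention; adding a context only means
# defining another __ctx_* function somewhere, e.g. in its own file.
__ctx_signals() { echo 'would run: complist -k "($signals[1,-3])"'; }
__ctx_hosts()   { echo 'would run: complist -k hosts'; }

__dispatch() {
  local handler="__ctx_$1"; shift
  # look the handler up by name; fail quietly if the context is unknown
  if type "$handler" >/dev/null 2>&1; then
    "$handler" "$@"
  else
    return 1
  fi
}

__dispatch signals
```

A hard-wired `case ... esac' does the same job for a fixed set of
contexts, but has to be re-edited for every new one.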

} We would then come up with files containing context-definitions (some
} of the files are exactly that already and have names that show it, I
} think that the fact that for other files names of commands seemed more
} natural may be a hint that something went wrong).

Not entirely.  By contexts, here, you mean things like "now we are
completing path names," "here we complete user names," "here we want
alias names," "now file names ending in .gz," etc.  Right?

There are only a few such contexts that are general enough to apply to
a wide range of commands.  Most of those are already represented by old
compctl options that take no arguments, and therefore can be directly
implemented by calling complist.  I don't think it's very useful to
have separate files for defining each of these cases.

Then there are a few other cases, like __x_options and __dvi, that are
specific to a particular class of commands; and lastly there are the
commands like __dd that have so many unique options that they form a
context of their own.  So if the goal was to have one file per context,
you aren't far off.

The hint that something is wrong is not the file names; it's the fact that
some only do match generation, whereas others do what might be called
"second-level dispatch."  Match generation is actually NOT the hard part!
The hard part is determining, from the command line, which context is the
one in which you're completing.

} Then we would need a way to connect command-names to context-names,
} preferably in some automagically collected way.

Connecting command-names to context-names isn't enough.  A single command
may have multiple contexts.

So it seems that what we need (with the usual caveats about contexts like
parameter substitution that need some other key than "command name") are:

1.  For each command, something that maps the command line and cursor
    position to one of a set of contexts.  (Some commands will never
    have more than one context.)

2.  For each context that can be generated by (1), something that maps
    the command name to the set of possible matches for that context.
    (Some contexts will never care about the command name or other key.)

3.  For each set of matches that can be generated by (2), something that
    maps the command and context to the code to insert the matches into
    the command line.  (There's lots of builtin support for this already.)

Some of those mappings may actually be multi-level, e.g., to implement
completion after the -exec for "find ... -exec ...".
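The three mappings above can be sketched end to end; everything below
(the context names, the candidate lists, the function names) is invented
for illustration, using find's -exec as the multi-level case:

```shell
# (1) command line + cursor position -> context name.
#     For find, the word before the cursor decides whether we are in
#     find's own options or inside the -exec sub-command-line.
context_for() {
  local cmd=$1 prev=$2
  case $cmd in
    find) if [ "$prev" = -exec ]; then echo command-name
          else echo find-option; fi ;;
    *)    echo file ;;
  esac
}

# (2) context name -> set of possible matches for that context.
matches_for() {
  case $1 in
    command-name) echo "ls cp mv" ;;
    find-option)  echo "-name -type -exec" ;;
    file)         echo "(files in current directory)" ;;
  esac
}

# (3) inserting the matches into the command line is left to the
#     existing builtin support (compadd etc.).
matches_for "$(context_for find -exec)"
```

The multi-level part shows up in step (1): after -exec, find's mapping
re-enters the command-name context, which any other command-position
completion could share.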

Does this make sense?

-- 
Bart Schaefer                                 Brass Lantern Enterprises
http://www.well.com/user/barts              http://www.brasslantern.com



* Re:  Thinking about Functions/Completion/* differently
@ 1999-02-15 13:12 Sven Wischnowsky
  0 siblings, 0 replies; 4+ messages in thread
From: Sven Wischnowsky @ 1999-02-15 13:12 UTC (permalink / raw)
  To: zsh-workers


Bart Schaefer wrote:

> ...
>
> ... (with the usual caveats about contexts like
> parameter substitution that need some other key than "command name") ...

And before I forget to mention this: people using the new-style
completion stuff may have noticed that completion after `$', `=', and
`~' still automagically works if complist is used, independently of
the options given to it. This is because it is still handled very
deeply in the completion code. I was thinking about skipping this code
when the completion code is called from a completion widget, and
instead testing for these cases and reporting them to the widget as
different CONTEXTs. This sounds like the correct behavior, right?

Bye
 Sven


--
Sven Wischnowsky                         wischnow@informatik.hu-berlin.de


