zsh-users
From: ZyX <kp-pav@yandex.ru>
To: Ray Andrews <rayandrews@eastlink.ca>,
	"zsh-users@zsh.org" <zsh-users@zsh.org>
Subject: Re: greps pipes and eval bad patterns
Date: Mon, 26 Oct 2015 16:04:07 +0300	[thread overview]
Message-ID: <1413271445864647@web24g.yandex.ru> (raw)
In-Reply-To: <562D9E85.7080006@eastlink.ca>



26.10.2015, 06:32, "Ray Andrews" <rayandrews@eastlink.ca>:
> On 10/25/2015 06:02 PM, Bart Schaefer wrote:
>>  On Oct 25, 12:47pm, Ray Andrews wrote:
>>  }
>>  } test1 ()
>>  } {
>>  } gstring=" | grep \[01;34m "
>>  } tree --du -haC | grep -Ev "^[^\[]{$levels}\[*" "$gstring"
>>  } }
>>
>>  One doesn't normally build up a pipeline that way, but if you must
>
> What would be the better way? I'm not wedded to anything, just looking
> for the
> appropriate method.
>>  do so, you're on the right track with "eval" -- you just haven't
>>  applied enough quoting.  "eval" is going to re-parse everything, so
>>  you need to quote everything to the same depth:
>>
>>       eval 'tree --du -haC | grep -Ev "^[^\[]{$levels}\[*"' "$gstring"
>>
>>  The single quotes (before tree and after the levels pattern) keep the
>>  first pipeline (and importantly the double-quotes that are around the
>>  grep pattern) from being interpreted until eval does so.
>
> Enlightenment. We freeze all expansions with single quotes until eval
> sorts it all out in
> one go.
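
[The principle can be sketched with a throwaway pipeline; printf stands
in here for tree, and the grep stage arrives via the parameter, just as
$gstring does above:]

```shell
# Single quotes freeze the first pipeline; eval re-parses everything
# once, so the " | grep two" carried in $gstring becomes a real pipe.
gstring=' | grep two'
eval 'printf "%s\n" one two three' "$gstring"
# prints only: two
```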
>
>>  The use of
>>  the parameter for $gstring has the same effect.
>>
>>  You might be able to see this better if you assign everything to
>>  variables before eval-ing, e.g.
>>
>>     test1 ()
>>     {
>>        gstring="| grep \[01;34m "
>>        glevels='| grep -Ev "^[^\[]{$levels}\[*"'
>>        tree="tree --du -haC"
>>        eval "$tree" "$glevels" "$gstring"
>
> Yeah, that's the sort of thing I'm used to doing I just didn't know how
> to handle the
> tricky characters. It makes nothing but sense now that I see it. So the
> final product
> becomes:
>
> t ()
> {
>      local gstring=
>      [ "$1" = ',f' ] && { gstring=' | grep "\[01;34m" '; shift }
>      integer levels=$(( ($1 + 1) * 4 ))
>      eval ' tree --du -haC | grep -Ev "^[^\[]{$levels}\[*" ' $gstring
>      du -sh .
> }
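
[The depth filter in isolation, with made-up sample lines standing in
for tree's ANSI-colored output: -v drops any line whose first $levels
characters contain no "[", i.e. entries nested deeper than wanted.]

```shell
# Hypothetical input: a shallow line (escape bracket within the first
# 4 columns) and a deep one.  Only the shallow line survives -Ev.
levels=4
printf '%s\n' 'ab[shallow' 'abcdef[deep' | grep -Ev "^[^\[]{$levels}\[*"
# prints only: ab[shallow
```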

Specifically this I would write as

    local -a gcmd
    gcmd=( cat )
    if [[ $1 == ,f ]] ; then
        gcmd=( grep '\[01;34m' )
        shift
    fi
    integer levels=$(( ($1 + 1) * 4 ))
    tree --du -haC | grep -Ev '^[^\[]{'"$levels"'}\[*' | $gcmd
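
[The point is that no eval is needed when the optional stage is a
command you run directly.  A POSIX-portable sketch of the same idea,
using a function where zsh would use the array (plain sh has no
arrays), with printf standing in for the tree pipeline:]

```shell
# Default filter is a pass-through; ",f" swaps in a real grep.
filter() { cat; }
if [ "$1" = ",f" ]; then
    filter() { grep two; }
    shift
fi
printf '%s\n' one two three | filter
```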

>
> ... a better tree than tree.


Thread overview: 9+ messages
2015-10-25 19:47 Ray Andrews
2015-10-26  1:02 ` Bart Schaefer
2015-10-26  1:17   ` Kurtis Rader
2015-10-26  3:39     ` Ray Andrews
2015-10-26  3:31   ` Ray Andrews
2015-10-26 13:04     ` ZyX [this message]
2015-10-26 13:36       ` Ray Andrews
2015-10-26 14:30         ` Bart Schaefer
2015-10-26 14:49           ` Ray Andrews
