9fans - fans of the OS Plan 9 from Bell Labs
From: Aharon Robbins <arnold@skeeve.com>
To: 9fans@cse.psu.edu
Subject: Re: [9fans] awk, not utf aware...
Date: Thu, 28 Feb 2008 20:54:42 +0200
Message-ID: <200802281854.m1SIsg2m004288@skeeve.com>

> Date: Wed, 27 Feb 2008 21:01:33 +0100
> From: Uriel <uriel99@gmail.com>
> Subject: Re: [9fans] awk, not utf aware...
> To: Fans of the OS Plan 9 from Bell Labs <9fans@cse.psu.edu>
>
> None of those issues are specific to AWK, they apply just as well to
> sed(1) or any program dealing with regexps. I think the plan9 tools
> demonstrate that it is not so hard to find a 'good enough' solution;
> and the lunix locale debacle demonstrates that if you want to get it
> 'right' you will end up with a nightmare.

Plan 9 had the luxury of starting over with Unicode from the ground
up. Many of the C mb* interfaces predate Unicode, as do many of the
character encodings in use in different parts of the world. Unix vendors
(and standards bodies) have the very real problems of trying to make
their software work, and continue to work for the foreseeable future,
in different countries, encodings, etc.

I am not saying that the POSIX locale stuff is wonderful, elegant,
clean, etc.  It has real problems, and for the most recent gawk
release, gawk no longer uses the locale's decimal point for numeric
output by default.
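
Concretely (a hedged illustration: it assumes a French locale is installed,
and the exact default has shifted between gawk releases):

	$ LC_ALL=fr_FR.UTF-8 gawk 'BEGIN { printf "%g\n", 3.14 }'
	3.14
	$ LC_ALL=fr_FR.UTF-8 gawk --posix 'BEGIN { printf "%g\n", 3.14 }'
	3,14

With --posix (or POSIXLY_CORRECT set) the locale's comma is honored on
output; by default a period is always printed.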

But one has to give the standards groups and Unix vendors credit for
trying to grapple with a real problem instead of sidestepping it and
then crowing about it.

> The problem with awk is that it is not a native plan9 app, and its
> simian nature shows in too many places. For example system() and | are
> badly broken:
>
> %  echo |awk '{print |"echo $KSH_VERSION"}'
> @(#)PD KSH v5.2.14 99/07/13.2

Why is this broken?  If the shell that awk runs the piped command with is
PDKSH, or KSH_VERSION is otherwise set in the environment, this is to be
expected.
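
To make that concrete: the command on the right-hand side of | is handed to
the system shell (normally /bin/sh via popen() on Unix), so any $VAR in it
is expanded by that shell, not by awk.  A minimal sketch, using a made-up
variable FOO:

	$ FOO=bar awk 'BEGIN { print "ignored" | "echo $FOO" }'
	bar

The line awk writes into the pipe is simply ignored by echo; the visible
output comes from the shell expanding $FOO.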

For awk specifically, off the top of my head, the functions that have to
be character-set aware are: index, substr, length, tolower, toupper, and
match.  Gawk has been multibyte aware for several years, although there
were some bugs initially.  And someone recently pointed out another one:

	str = sprintf("%.5s", otherstr)

has to work in terms of characters, not bytes, which I overlooked
and still have to fix.
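
A quick way to see the byte-versus-character distinction is with gawk in a
UTF-8 locale.  This is a hedged sketch: it assumes the en_US.UTF-8 locale is
installed, and exact results depend on the gawk version:

	$ echo 'naïve' | LC_ALL=en_US.UTF-8 gawk '{ print length($0) }'
	5
	$ echo 'naïve' | LC_ALL=C gawk '{ print length($0) }'
	6

In the UTF-8 locale, length() counts five characters; in the C locale it
counts the six bytes of the UTF-8 encoding, because the ï occupies two bytes.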

> Boyd made a native port of awk that fixed most (all?) of these issues;
> it can be found somewhere in his contrib dir, but I don't think it is
> production-ready.

I remember talking to him about this some, since for a long while the Plan
9 awk was one that was forked from BWK's circa 1993 and needed updating.

> On Wed, Feb 27, 2008 at 4:54 PM, Sape Mullender
> <sape@plan9.bell-labs.com> wrote:
> > > There is split and other functions,
> >  > for example:
> >  >
> >  > toupper("aí")
> >  > gives
> >  > Aí
> >  >
> >  > My guess is that there are many more little (or not) corners where it
> >  > doesn't work.
> >
> >  Yes, and then there is locale: does [a-z] include ĳ when you run it
> >  in Holland (it should)?  Does it include á, è, ô in France (it should)?
> >  Does it include ø, å in Norway (it should not)?  And what happens when
> >  you evaluate "è" < "o" (it depends)?
> >
> >  Fixing awk is much harder than anyone thinks.  I had a chat about it with
> >  Brian Kernighan and he says he's been thinking about fixing awk for a
> >  long time, but that it really is a hard problem.

Indeed.  I bit the bullet; Brian hasn't been willing to suffer the complaints,
and I don't blame him. :-)  You can see some of his travails by looking
at the CHANGES file in his distribution, available from his Bell Labs
and Princeton web pages.

As far as I know, gawk and the Solaris /usr/xpg4/bin/awk are the only
awks that are multibyte aware.  The Solaris version is derived from the MKS
one (see the code from opensolaris.org) with multibyte fixes. I can supply
simple patches to make it compile on Linux if anyone wants.  This version
doesn't handle some dark corners, but has the advantage of being
very small.
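
To tie this back to the toupper example earlier in the thread, here is a
hedged sketch with gawk; it assumes a reasonably recent gawk and an installed
UTF-8 locale:

	$ echo 'aí' | LC_ALL=en_US.UTF-8 gawk '{ print toupper($0) }'
	AÍ
	$ echo 'aí' | LC_ALL=C gawk '{ print toupper($0) }'
	Aí

In the C locale the í is left untouched, since to toupper() it is just two
opaque bytes.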

Arnold

