From: Hans Hagen
Newsgroups: gmane.comp.tex.context
Subject: Re: New \setup... / sgml processing
Date: Wed, 30 Aug 2000 00:04:17 +0200
Cc: Michal Kvasnicka, Context
To: Berend de Boer
In-Reply-To: <39ABF1B1.335EC29@pobox.com>

At 07:24 PM 8/29/00 +0200, Berend de Boer wrote:
>Hans Hagen wrote:
>
>> if there is enough interest for this, we may consider starting a
>> discussion on this.
>
>That's for sure. I think we're not far off the point where people will
>code entirely in XML instead of plain TeX. But perhaps life would be a
>lot easier if we had a tex that could read XML natively...

There are several conflicting demands and situations in processing XML,
which are further complicated by inappropriate usage of HTML tags
(abusing tags for layout rather than structure). To mention a few
complications:

(1) entities: when using tex to parse the document, one can do without a
dtd, but when using xsl the dtd should be available; rather annoying
when you are not permanently online and xt wants to look on the net. So,
concerning entities, direct processing is great.

(2) nesting: if you use macros that use delimiters, this can be coded in
macros, but nesting is a problem, since there is no way that tex can
smuggle {} into the stream so that nested elements will be mapped onto
\beginx {\beginx .. \endx} \endx; imagine that we have macros like
\def\beginx#1\endx{...} (a small sketch of the failure follows at the
end of this message).

I can process files directly using tex (several implementations) or xsl
(with entity problems) or a combination (with the nesting problem,
especially painful in nested tables). [Rather funny is defining
transformations in tex, writing an xsl file, calling xsl using write18
and then reading the resulting file.]

Actually, what would be needed is a pure internal mapping, like a tag
being replaced by a stream of tokens in the input, before further
expansion. Sort of what omega does with its input filters.

Although xslt has its advantages, it is primarily focussed on going from
xml to xml/html. So far I have not seen any solid transformation engine
that looks at documents the way tex likes it. Handling the simple cases
is not so much the trouble, but my mind is already spoiled by too much
thinking about potential problems (knowing a bit what I want to do),
which is why I still have not settled on a best way to handle things.

It may be interesting to think of a good xml to tex preprocessor that
can be fed with simple transformation tables as well as entity mappers.
It should not be that hard, since there are some five kinds of mappings
we want to do: command, argument, environment, ignore, delimited, some
with optional pre/post space stripping, outer level grouping, and a few
more (a hypothetical table is sketched below). It could be interesting
to have this ready for next generation tex's.
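To make the nesting problem in (2) concrete, here is what happens with
the \beginx/\endx macros mentioned above (a deliberately minimal
sketch):

  \def\beginx#1\endx{{\em #1}}

  % a flat  \beginx one \endx  comes out as "one" in italics, as
  % intended.
  %
  % a nested  \beginx a \beginx b \endx c \endx  breaks: the delimited
  % argument #1 stops at the *first* \endx, so the outer call grabs
  % only "a \beginx b", and the inner \beginx then trips over the
  % closing brace of the {\em ...} group while looking for its own
  % \endx (an "extra }" complaint).
  %
  % with braces smuggled in around the element content,
  %   \beginx {a \beginx {b} \endx c} \endx
  % the inner \endx is hidden inside a brace group, so the scan for the
  % outer delimiter finds the right one -- but tex has no way to insert
  % those braces while it is still reading the xml.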
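And since direct processing with tex came up: the following is a
minimal, simplified sketch (the XML: and ENT: prefixes are made up) of
the kind of internal mapping meant above, where a tag or an entity is
turned into a stream of tokens while the file is being read:

  \catcode`\<=\active
  \catcode`\&=\active

  % an active < grabs everything up to the next > and turns the tag
  % name into a control sequence; </em> simply yields the csname XML:/em
  \def<#1>{\csname XML:#1\endcsname}

  % an active & does the same for entities, up to the closing ;
  \def&#1;{\csname ENT:#1\endcsname}

  % a couple of mappings: one element mapped onto a group plus a font
  % switch, and one entity
  \expandafter\def\csname XML:em\endcsname{\begingroup\em}
  \expandafter\def\csname XML:/em\endcsname{\endgroup}
  \expandafter\def\csname ENT:eacute\endcsname{\'e}

  % input like:  some <em>accented &eacute;</em> text
  % now comes out italic with an e-acute, and nesting is no problem
  % here since begin and end tags map onto \begingroup and \endgroup

This ignores attributes, empty elements, catcode side effects and so
on, but it is roughly the trick the tex based approaches rely on.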
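Just to make the last paragraph a bit more tangible: the lines below
are a purely hypothetical transformation table for such a preprocessor
(the \mapXML... names are made up, this is not an existing interface);
they only illustrate the five kinds of mappings plus an entity mapper:

  % hypothetical declarations for an xml to tex preprocessor
  \mapXMLcommand     [em]       [\em]            % <em>..</em> -> {\em ..}
  \mapXMLargument    [title]    [title]          % content ends up as [title={..}]
  \mapXMLenvironment [section]  [\startsection] [\stopsection]
  \mapXMLignore      [comment]                   % element and content dropped
  \mapXMLdelimited   [verbatim] [\starttyping]  [\stoptyping]  % content passed on untouched
  \mapXMLentity      [eacute]   [\'e]

Each of these would carry options like pre/post space stripping and
outer level grouping on top of the basic mapping.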
Hans

-------------------------------------------------------------------------
Hans Hagen | PRAGMA ADE
Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
tel: +31 (0)38 477 53 69 | fax: +31 (0)38 477 53 74 | www.pragma-ade.com
-------------------------------------------------------------------------