ntg-context - mailing list for ConTeXt users
From: "Mojca Miklavec" <mojca.miklavec.lists@gmail.com>
Subject: Feature request: Creating tables from (tab-separated) values
Date: Sat, 22 Apr 2006 04:02:57 +0200
Message-ID: <6faad9f00604211902m5c88d10apcd5c208f1825f51c@mail.gmail.com>

Hello,

The idea described below was partially inspired by the "gnuplot
contemptations" thread, but mainly by the fact that writing tables
with both LaTeX & ConTeXt is too complex in most cases.

Natural tables are great since they offer enough flexibility to do
"just about anything" with tables, but many tables are still a simple
"m values in n rows", and for those it is an annoying task to write
all those "\NC"s, "\eTR\bTR"s, ... even if it is just a matter of
writing a script to transform the values into a suitable form.

Christopher Creutzig sent an interesting solution to the mailing
list some time ago (see
http://wiki.contextgarden.net/TABLE#Creating_tables_from_CSV_data_.28Comma_Separated_Values.29),
and since then I have been thinking about a similar, slightly more
powerful solution which could make table typesetting easier.

I don't have the skills to implement it myself, but here is a list
of properties that such a solution should have, in case anyone finds
this little project doable. It would solve quite a few headaches.

- [very important] a possibility to define one's own macros, so that
it would be easy to print any type of table (natural tables, TaBlEs,
... or whatever form could possibly come to someone's mind), for
example \def\MyDef#1#2{\bTR\bTD#1\eTD\bTD#2\eTD\eTR}; the "machinery
behind" would only have to "feed" these macros properly, one call per
data row (see the sketches after this list)

- [important] a possibility to separate the data with different
characters: tab, comma, multiple spaces, ampersand (&), ...

- [very useful] a possibility to read the data either from a block
(\start...\stop...) or from a file; it would be handy to be able to
use the same data file for making a plot and for printing the data
in a table

- [useful] a possibility to ignore lines starting with '#' (optional;
usually these are comments)

- [not crucial] if a comma (,) separates the data, a possibility to
escape it

- [useful] a possibility to select which rows (columns) to print (for
example rows "1-3,8-last" or "even", ...)

- [optional, but still useful] replacing decimal points with commas

- [optional] if the table is supposed to have 4 columns and only 2
are available, ignore the row or fill the missing cells with empty
arguments (don't panic with unnecessary errors)

- [with luaTeX in mind] calculating sums of rows & columns & other
Excel-like calculations & references across tables ;) - well, that
one wasn't meant seriously.
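
To make the first few points more concrete, here is a rough, untested
sketch of the kind of machinery I have in mind, written with low-level
plain-TeX-ish macros. All the names (\TypesetCSVfile, \splitrow,
\readtablerows, ...) are made up, the sketch handles exactly two
comma-separated columns per line, assumes the file exists, and ignores
comment lines, row selection and all the other wishes above:

  % the user-supplied macro that typesets one data row, here as a
  % two-column natural table row
  \def\MyDef#1#2{\bTR \bTD #1\eTD \bTD #2\eTD \eTR}

  \newread\datafile
  \newtoks\tablerows

  % split "value1,value2" at the (first) comma and append one \MyDef
  % call to the collected rows
  \def\splitrow#1,#2\relax{%
    \tablerows\expandafter{\the\tablerows\MyDef{#1}{#2}}}

  \def\blankline{\par}% what \read hands back for an empty input line

  % read the file line by line; the \ifeof test comes after \read
  % because the end of the file is only seen by reading past it
  \def\readtablerows{%
    \read\datafile to \dataline
    \ifeof\datafile \else
      \ifx\dataline\blankline \else
        \expandafter\splitrow\dataline\relax
      \fi
      \expandafter\readtablerows
    \fi}

  \def\TypesetCSVfile#1{%
    \tablerows{}%
    \openin\datafile=#1\relax
    \readtablerows
    \closein\datafile
    \bTABLE \the\tablerows \eTABLE}

  % usage: \TypesetCSVfile{measurements.csv}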
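
For a different separator one would only need a different delimiter
in \splitrow. A real tab is the only tricky one, since TeX normally
treats the tab character as a space, so it would have to get catcode
12 both when the splitter is defined and while each line is read;
roughly (again just an untested sketch, ^^I denotes the tab):

  \bgroup
  \catcode`\^^I=12 % make the tab an "other" character, not a space
  \gdef\splitrow#1^^I#2\relax{%
    \tablerows\expandafter{\the\tablerows\MyDef{#1}{#2}}}
  \egroup

  % the same catcode change then has to be active around the \read,
  % e.g. {\catcode`\^^I=12 \global\read\datafile to \dataline}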

Thanks a lot,
    Mojca

Thread overview: 5+ messages
2006-04-22  2:02 Mojca Miklavec [this message]
2006-04-22 18:02 ` Peter Münster
2006-04-23 11:13   ` Mojca Miklavec
2006-04-23 16:31     ` Peter Münster
2006-04-23 17:15 ` Taco Hoekwater
