From: Peter Stephenson <pws@pwstephenson.fsnet.co.uk>
To: zw <zsh-workers@sunsite.dk>
Subject: Re: Build problem on terminfo.c on FreeBSD 4.7
Date: Sun, 16 Nov 2003 20:00:51 +0000
Message-ID: <20031116200052.78B9D8543@pwstephenson.fsnet.co.uk>
In-Reply-To: "Oliver Kiddle"'s message of "Fri, 14 Nov 2003 23:46:07 +0100." <7338.1068849967@athlon>

Oliver Kiddle wrote:
> # ifdef HAVE_CURSES_H
> #  include <curses.h>
> # endif
> 
> So curses.h would be included. However, HAVE_CURSES_H is not defined.
> This seems to be because the curses.h header check is inside this in
> zshconfig.ac:
>   case "$LIBS" in
>   *curses*)
> This presumably fails because it uses just -ltermcap.
> 
> libtermcap.so is just a link to libncurses.so here. So how is that
> situation otherwise handled? Should the HAVE_SETUPTERM test perhaps
> similarly only be done if $LIBS = *curses*? Or can we perhaps safely
> substitute -1 for ERR?
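
For reference, the check in zshconfig.ac is shaped roughly like this
(from memory, so the exact header list may differ); the point is that
the header check never runs when $LIBS only mentions -ltermcap:

    case "$LIBS" in
      *curses*)
        dnl Only reached when curses appears in $LIBS by name, so a
        dnl libtermcap.so that is really libncurses.so is missed.
        AC_CHECK_HEADERS(curses.h)
        ;;
    esac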

The original idea seemed to be that the terminfo module was restricted
to systems which used terminfo, which was taken to imply curses.
However, that doesn't seem to have been taken into account everywhere,
and I think the distinction is all a bit historical now.  In
particular, on many systems (like this one) termcap, if it exists,
seems to be a vague pointer to curses anyway, so deliberately linking
against -ltermcap is pointless.  What's more, on some other systems
termcap is only statically linked (possibly with hacked-out bits of
curses, though I don't know that) while curses, with the same symbols
and more, is dynamic.

In short, it's a mess.

Here, if curses is actually being used, whether in disguise or not, I
think we have two choices: (1) continue to use the logic that applies
to termcap, i.e. not provide the functionality for terminfo; or (2)
apply *all* the definitions that refer to curses, including any
applicable HAVE_* stuff, changing configure as necessary.  I don't
think propagating a hybrid is a good idea.
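
A minimal sketch of what (2) might look like in zshconfig.ac, using
the standard autoconf macros (the list of functions here is
illustrative rather than definitive):

    dnl Check the curses header and the terminfo entry points
    dnl unconditionally, rather than only when $LIBS happens to
    dnl mention curses by name.
    AC_CHECK_HEADERS(curses.h term.h)
    AC_CHECK_FUNCS(setupterm tigetflag tigetnum tigetstr)

Whatever the checks fail to find is then reflected in the HAVE_*
macros that terminfo.c already tests, rather than the current
half-way state.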

More generally, I think the thing to do is to compile only a stub
terminfo module unless everything necessary is around (subject to the
complications above about what that actually means).  Probably even
better would be not to attempt to compile it at all; this can be done
with a suitable test (which might not be trivial, however) in the .mdd
file.  On my Linux system, the library is present but echoti simply
says `not available on this system'.  That way the module would only
be compiled at all on systems where we know it's going to work.
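
Something along the following lines in terminfo.mdd would do that,
assuming configure caches the result of a setupterm check; the cache
variable name here is only illustrative:

    # Only build the module where the terminfo interface was actually
    # found; otherwise don't attempt to link it at all.
    link=`if test x$ac_cv_func_setupterm = xyes; then
      echo dynamic
    else
      echo no
    fi`
    load=yes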

There have been endless problems of this sort.  I'd *really* like to get
this sorted out before 4.1 becomes 4.2.

-- 
Peter Stephenson <pws@pwstephenson.fsnet.co.uk>
Work: pws@csr.com
Web: http://www.pwstephenson.fsnet.co.uk

