* Large COLUMNS crashes zsh
@ 2006-05-05 14:19 Mads Martin Joergensen
2006-05-05 14:32 ` Peter Stephenson
0 siblings, 1 reply; 4+ messages in thread
From: Mads Martin Joergensen @ 2006-05-05 14:19 UTC (permalink / raw)
To: zsh-users
Hey all,
export COLUMNS=10000000000000000
When doing this, zsh crashes. I know it seems stupid, but wouldn't it be
sane to ignore such a large number, or simply clamp it to the largest
possible value?
--
Mads Martin Joergensen, http://mmj.dk
"Why make things difficult, when it is possible to make them cryptic
and totally illogical, with just a little bit more effort?"
-- A. P. J.
^ permalink raw reply [flat|nested] 4+ messages in thread
* Re: Large COLUMNS crashes zsh
2006-05-05 14:19 Large COLUMNS crashes zsh Mads Martin Joergensen
@ 2006-05-05 14:32 ` Peter Stephenson
2006-05-05 15:33 ` Miek Gieben
From: Peter Stephenson @ 2006-05-05 14:32 UTC (permalink / raw)
To: Zsh users list
>export COLUMNS=10000000000000000
>
>When doing this, zsh crashes. I know it seems stupid, but wouldn't it be
>sane to ignore such a large number, or simply clamp it to the largest
>possible value?
Probably, but the trouble is there's no single "largest possible".
We've just run into a similar problem with the maximum size of arrays.
If it doesn't crash the shell, it's potentially useful, but we don't
know a priori how large that is. Some checks on malloc might help, but
that's a big can of worms, too: once you need it in one place, you need
it all over. Furthermore, I've got a feeling that on many virtual
memory systems the malloc might succeed but cause havoc later.
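Peter's worry can be made concrete with a little arithmetic (a sketch only; the 32-bit limit and the one-byte-per-cell cost are illustrative assumptions, not zsh's actual allocation scheme):

```shell
# The requested width doesn't even fit in a 32-bit int, so any
# columns*lines size computed in 32-bit C arithmetic would overflow.
cols=10000000000000000
int_max=2147483647
[ "$cols" -gt "$int_max" ] && echo "overflows a 32-bit int"

# Even computed in 64 bits, at one byte per cell a single screen line
# of that width would need petabytes (1125899906842624 = 1024^5):
echo "$(( cols / 1125899906842624 )) PiB per line"
```

Either way the shell ends up asking malloc for an absurd or wrapped-around size, which is exactly the "succeed now, havoc later" territory described above.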
--
Peter Stephenson <pws@csr.com> Software Engineer
CSR PLC, Churchill House, Cambridge Business Park, Cowley Road
Cambridge, CB4 0WZ, UK Tel: +44 (0)1223 692070
* Re: Large COLUMNS crashes zsh
2006-05-05 14:32 ` Peter Stephenson
@ 2006-05-05 15:33 ` Miek Gieben
2006-05-06 12:51 ` Peter Stephenson
From: Miek Gieben @ 2006-05-05 15:33 UTC (permalink / raw)
To: Zsh users list
[On 05 May, @16:32, Peter Stephenson wrote in "Re: Large COLUMNS crashes zsh ..."]
> >export COLUMNS=10000000000000000
> >
> >When doing this, zsh crashes. I know it seems stupid, but wouldn't it be
> >sane to ignore such a large number, or simply clamp it to the largest
> >possible value?
>
> Probably, but the trouble is there's no single "largest possible".
> We've just run into a similar problem with the maximum size of arrays.
> If it doesn't crash the shell, it's potentially useful, but we don't
> know a priori how large that is. Some checks on malloc might help, but
> that's a big can of worms, too: once you need it in one place, you need
> it all over. Furthermore, I've got a feeling that on many virtual
> memory systems the malloc might succeed but cause havoc later.
bash seems to handle this just fine, and it resets the value to something
more sane (I don't know if that is a bug or a feature):
$ export COLUMNS=10000000000000000
$ echo $COLUMNS
10000000000000000
$ ls
...bunch of files...
$ echo $COLUMNS
100
$
This is with bash3. Zsh, on the other hand, is trying to trash my
machine :)
--
grtz,
- Miek
http://www.miek.nl http://www.nlnetlabs.nl
PGP: 6A3C F450 6D4E 7C6B C23C F982 258B 85CF 3880 D0F6
* Re: Large COLUMNS crashes zsh
2006-05-05 15:33 ` Miek Gieben
@ 2006-05-06 12:51 ` Peter Stephenson
From: Peter Stephenson @ 2006-05-06 12:51 UTC (permalink / raw)
To: Zsh users list
Miek Gieben wrote:
> bash seems to handle this just fine, and it resets the value to something
> more sane (I don't know if that is a bug or a feature):
>
> $ export COLUMNS=10000000000000000
> $ echo $COLUMNS
> 10000000000000000
> $ ls
> ...bunch of files...
> $ echo $COLUMNS
> 100
> $
>
> This is with bash3. Zsh on the other hand is trying to trash my
> machine :)
zsh actually needs to know the number of columns internally for zle,
which stores all the lines it's printing out; this is needed to get
multiline buffers to work. So if you get it wrong it's not going to
work properly anyway, even if we invent some kludge (and there are
plenty of possibilities). I can't really think of a better answer than
"don't do that", offhand, but I'm open to specific suggestions.
(The question is not "is it bad for the shell to crash?" The question
is "what do we do about it?")
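One of those kludges could be sketched as a simple clamp (the function name sanitize_columns and the 1024 cap are invented for illustration; zsh does nothing of the kind, and as noted above it wouldn't make huge widths actually work):

```shell
# Hypothetical clamp: cap an absurd COLUMNS request at an arbitrary
# maximum instead of passing it through to the display code.
sanitize_columns() {
  requested=$1
  max=1024
  if [ "$requested" -gt "$max" ] 2>/dev/null; then
    echo "$max"
  else
    echo "$requested"
  fi
}

sanitize_columns 10000000000000000   # prints 1024
sanitize_columns 80                  # prints 80
```

The cap is arbitrary by construction, which is Peter's point: there is no principled "largest possible" value to clamp to.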
By the way, if you simply don't want the shell to see a value you're
using in a function or script, there's a workaround:
fn() {
  local -hx COLUMNS=10000000000000000
}
The -h makes the parameter hide the special one; it's set in the
environment for the duration of the function but isn't coupled to the
shell's internal column setting.
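A sketch of how the workaround behaves (zsh-only; the function name fn and the child-shell probe are illustrative):

```shell
# zsh-only: -h hides the special COLUMNS parameter, -x exports the
# plain local copy to child processes for the function's duration.
fn() {
  local -hx COLUMNS=10000000000000000
  # Children inherit the huge value from the environment...
  sh -c 'echo "child sees COLUMNS=$COLUMNS"'
}
# ...while zsh's own idea of the terminal width is never touched,
# so zle keeps redrawing correctly.
```

The key is that the hidden local never reaches the code that sizes zle's display buffers, so the crash path is avoided entirely.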
--
Peter Stephenson <p.w.stephenson@ntlworld.com>
Web page now at http://homepage.ntlworld.com/p.w.stephenson/
Thread overview: 4+ messages
2006-05-05 14:19 Large COLUMNS crashes zsh Mads Martin Joergensen
2006-05-05 14:32 ` Peter Stephenson
2006-05-05 15:33 ` Miek Gieben
2006-05-06 12:51 ` Peter Stephenson
Code repositories for project(s) associated with this public inbox
https://git.vuxu.org/mirror/zsh/