* Re: [TUHS] reviving a bit of WWB
2020-09-20 20:58 ` Steve Nickolas
@ 2020-09-20 21:33 ` Brantley Coile
2020-10-07 5:43 ` scj
2020-09-20 21:35 ` John Cowan
` (2 subsequent siblings)
3 siblings, 1 reply; 44+ messages in thread
From: Brantley Coile @ 2020-09-20 21:33 UTC (permalink / raw)
To: Steve Nickolas; +Cc: tuhs, Doug McIlroy
The fact that a pointer of zero generates a hardware trap is not defined in the language, whereas 0 is defined to be a null pointer.
I've worked on systems where a 0 pointer could be dereferenced without a trap. I wouldn't recommend it. System designers do things like make the first page of memory invalid so we will get a null pointer trap. On Plan 9 the beginning of the text segment starts at 0x1020.
But that's not part of the C language. The fact that 0 is a null pointer is.
Brantley
> On Sep 20, 2020, at 4:58 PM, Steve Nickolas <usotsuki@buric.co> wrote:
>
> On Sun, 20 Sep 2020, Doug McIlroy wrote:
>
>>> (Of course, that assumes NULL is 0, but I don't think I've run into any
>>> architecture so braindead as to not have NULL=0.)
>>
>> It has nothing to do with machine architecture. The C standard
>> says 0 coerces to the null pointer. NULL, defined in <stddef.h>,
>> is part of the library, not the language. I always use 0,
>> because NULL is a frill.
>>
>> Doug
>
> I was under the impression that there was explicitly no requirement that a null pointer be 0, and that there was at least one weird system where that wasn't true - that it just so happened that null points to 0 on certain CPUs and that 0=NULL *happens* to work on most CPUs but wasn't guaranteed. (In fact, I read that my habit of using 0 for NULL relied on a faulty assumption!)
>
> I mean, I've never actually used a CPU/OS/compiler where it wasn't true, but...
>
> -uso.
^ permalink raw reply [flat|nested] 44+ messages in thread
* Re: [TUHS] reviving a bit of WWB
2020-09-20 21:33 ` Brantley Coile
@ 2020-10-07 5:43 ` scj
0 siblings, 0 replies; 44+ messages in thread
From: scj @ 2020-10-07 5:43 UTC (permalink / raw)
To: Brantley Coile; +Cc: tuhs, Doug McIlroy
This discussion of null pointers reminds me of one of the hardest things
we came upon porting what became V7 to the Interdata 4/32. Earlier UNIX
systems had returned -1 as an error indicator from some system calls
that returned a pointer. We made a strong effort to find these, but
they were everywhere and we didn't get them all (another motivation for
Lint...).
We were finding crashes on the Interdata with the machine halted and
none of the error registers (including the location of the fault) making
any sense. After a couple of frustrating weeks, we found the problem.
The Interdata was a microcoded machine. If it tried to access -1 as an
address, it immediately got an "unaligned access" fault and dove into
the microcode. Before it had saved its status registers, the memory
system chimed in with another fault -- memory access out of bounds. The
combination trashed everything.
When we met with Interdata to explore whether they wanted to sell Unix
on their hardware, fixing this was one of our non-negotiable demands.
They said no. We walked.
Several years later, of course, Unix did show up there, but they missed
a great opportunity.
---
On 2020-09-20 14:33, Brantley Coile wrote:
> The fact that a pointer of zero generates a hardware trap is not
> defined in the language, whereas 0 is defined to be a null
> pointer.
>
> I've worked on systems where a 0 pointer could be dereferenced without
> a trap. I wouldn't recommend it. System designers do things like make
> the first page of memory invalid so we will get a null pointer trap.
> On Plan 9 the beginning of the text segment starts at 0x1020.
>
> But that's not part of the C language. The fact that 0 is a null
> pointer is.
>
> Brantley
>
>> On Sep 20, 2020, at 4:58 PM, Steve Nickolas <usotsuki@buric.co> wrote:
>>
>> On Sun, 20 Sep 2020, Doug McIlroy wrote:
>>
>>>> (Of course, that assumes NULL is 0, but I don't think I've run into
>>>> any
>>>> architecture so braindead as to not have NULL=0.)
>>>
>>> It has nothing to do with machine architecture. The C standard
>>> says 0 coerces to the null pointer. NULL, defined in <stddef.h>,
>>> is part of the library, not the language. I always use 0,
>>> because NULL is a frill.
>>>
>>> Doug
>>
>> I was under the impression that there was explicitly no requirement
>> that a null pointer be 0, and that there was at least one weird system
>> where that wasn't true - that it just so happened that null points to
>> 0 on certain CPUs and that 0=NULL *happens* to work on most CPUs but
>> wasn't guaranteed. (In fact, I read that my habit of using 0 for NULL
>> relied on a faulty assumption!)
>>
>> I mean, I've never actually used a CPU/OS/compiler where it wasn't
>> true, but...
>>
>> -uso.
* Re: [TUHS] reviving a bit of WWB
2020-09-20 20:58 ` Steve Nickolas
2020-09-20 21:33 ` Brantley Coile
@ 2020-09-20 21:35 ` John Cowan
2021-02-02 23:08 ` Greg A. Woods
2020-09-20 22:15 ` Clem Cole
2020-09-21 20:46 ` Steffen Nurpmeso
3 siblings, 1 reply; 44+ messages in thread
From: John Cowan @ 2020-09-20 21:35 UTC (permalink / raw)
To: Steve Nickolas; +Cc: The Eunuchs Hysterical Society, Doug McIlroy
When 0 is coerced implicitly or explicitly to a pointer type, it becomes a
null pointer. That's true even on architectures where all-bits-zero is
*not* a null pointer. However, in contexts where there is no expected
type, as in a call to execl(), the null at the end of the args list has to
be explicitly cast to (char *)0 or some other null pointer.
As for the definition of NULL, it is indeed 0 on Linux, but can also be
defined as ((void *)0), as on FreeBSD and the Mac, or even as 0L on systems
where ints are half-size and pointers and longs are full-size.
On Sun, Sep 20, 2020 at 4:59 PM Steve Nickolas <usotsuki@buric.co> wrote:
> On Sun, 20 Sep 2020, Doug McIlroy wrote:
>
> >> (Of course, that assumes NULL is 0, but I don't think I've run into any
> >> architecture so braindead as to not have NULL=0.)
> >
> > It has nothing to do with machine architecture. The C standard
> > says 0 coerces to the null pointer. NULL, defined in <stddef.h>,
> > is part of the library, not the language. I always use 0,
> > because NULL is a frill.
> >
> > Doug
>
> I was under the impression that there was explicitly no requirement that a
> null pointer be 0, and that there was at least one weird system where that
> wasn't true - that it just so happened that null points to 0 on certain
> CPUs and that 0=NULL *happens* to work on most CPUs but wasn't guaranteed.
> (In fact, I read that my habit of using 0 for NULL relied on a faulty
> assumption!)
>
> I mean, I've never actually used a CPU/OS/compiler where it wasn't true,
> but...
>
> -uso.
>
* Re: [TUHS] reviving a bit of WWB
2020-09-20 21:35 ` John Cowan
@ 2021-02-02 23:08 ` Greg A. Woods
2021-02-02 23:47 ` Larry McVoy
0 siblings, 1 reply; 44+ messages in thread
From: Greg A. Woods @ 2021-02-02 23:08 UTC (permalink / raw)
To: The Unix Heritage Society mailing list
At Sun, 20 Sep 2020 17:35:52 -0400, John Cowan <cowan@ccil.org> wrote:
Subject: Re: [TUHS] reviving a bit of WWB
>
> When 0 is coerced implicitly or explicitly to a pointer type, it becomes a
> null pointer. That's true even on architectures where all-bits-zero is
> *not* a null pointer. However, in contexts where there is no expected
> type, as in a call to execl(), the null at the end of the args list has to
> be explicitly cast to (char *)0 or some other null pointer.
Yeah, that's more to do with C's good/bad choice to do or not do
integer promotion in various situations, and to default parameter types
to 'int' unless they are, or are cast to, a wider type. (And of course
with the rather tricky and almost non-portable way C allows variable
length argument lists, the somewhat poor way C was cajoled into
offering function prototypes to support separate compilation of code
units, and the exceedingly poor way prototypes deal with variable
length argument lists.)
--
Greg A. Woods <gwoods@acm.org>
Kelowna, BC +1 250 762-7675 RoboHack <woods@robohack.ca>
Planix, Inc. <woods@planix.com> Avoncote Farms <woods@avoncote.ca>
* Re: [TUHS] reviving a bit of WWB
2021-02-02 23:08 ` Greg A. Woods
@ 2021-02-02 23:47 ` Larry McVoy
2021-02-03 0:11 ` Dave Horsfall
0 siblings, 1 reply; 44+ messages in thread
From: Larry McVoy @ 2021-02-02 23:47 UTC (permalink / raw)
To: The Unix Heritage Society mailing list
On Tue, Feb 02, 2021 at 03:08:42PM -0800, Greg A. Woods wrote:
> At Sun, 20 Sep 2020 17:35:52 -0400, John Cowan <cowan@ccil.org> wrote:
> Subject: Re: [TUHS] reviving a bit of WWB
> >
> > When 0 is coerced implicitly or explicitly to a pointer type, it becomes a
> > null pointer. That's true even on architectures where all-bits-zero is
> > *not* a null pointer. However, in contexts where there is no expected
> > type, as in a call to execl(), the null at the end of the args list has to
> > be explicitly cast to (char *)0 or some other null pointer.
>
> Yeah, that's more to do with the good/bad choice in C to do or not do
> integer promotion in various situations, and to default parameter types
> to 'int' unless they are, or are cast to, a wider type
I've dealt with this, here is a story of a super computer where native
pointers pointed at bits but C pointers pointed at bytes and you can
shake your head at the promotion problems:
https://minnie.tuhs.org/pipermail/tuhs/2017-September/012050.html
* Re: [TUHS] reviving a bit of WWB
2021-02-02 23:47 ` Larry McVoy
@ 2021-02-03 0:11 ` Dave Horsfall
2021-02-03 0:19 ` Larry McVoy
0 siblings, 1 reply; 44+ messages in thread
From: Dave Horsfall @ 2021-02-03 0:11 UTC (permalink / raw)
To: The Eunuchs Hysterical Society
On Tue, 2 Feb 2021, Larry McVoy wrote:
> I've dealt with this, here is a story of a super computer where native
> pointers pointed at bits but C pointers pointed at bytes and you can
> shake your head at the promotion problems:
>
> https://minnie.tuhs.org/pipermail/tuhs/2017-September/012050.html
Holy smoking inodes! I'd forgotten that story... And yes, I really was
approached by Pr1me, and really did turn them down on the basis that if
they thought that "1" was prime then what else did they get wrong?[*]
Oh, they went belly-up shortly afterwards, so it was probably just as well
(plainly the innumerate marketoids were in charge).
[*]
If you assume that "1" is prime then it breaks all sorts of higher (and
obscure) maths.
-- Dave
* Re: [TUHS] reviving a bit of WWB
2021-02-03 0:11 ` Dave Horsfall
@ 2021-02-03 0:19 ` Larry McVoy
2021-02-03 2:04 ` Richard Salz
0 siblings, 1 reply; 44+ messages in thread
From: Larry McVoy @ 2021-02-03 0:19 UTC (permalink / raw)
To: Dave Horsfall; +Cc: The Eunuchs Hysterical Society
On Wed, Feb 03, 2021 at 11:11:44AM +1100, Dave Horsfall wrote:
> On Tue, 2 Feb 2021, Larry McVoy wrote:
>
> >I've dealt with this, here is a story of a super computer where native
> >pointers pointed at bits but C pointers pointed at bytes and you can shake
> >your head at the promotion problems:
> >
> >https://minnie.tuhs.org/pipermail/tuhs/2017-September/012050.html
>
> Holy smoking inodes! I'd forgotten that story...
I tend to log in to forums these days as "luckydude" and that story is
just the first of many examples. I seem to have gotten lucky,
building up the right experiences and then falling into a job that
could use them. It really made me look better than I am; things just
sort of flowed.
And having almost 6 months to work on whatever I wanted and deciding to
port the networking stack? Priceless down the road. No way they would
have given me that task, I was too green.
Fun times.
* Re: [TUHS] reviving a bit of WWB
2021-02-03 0:19 ` Larry McVoy
@ 2021-02-03 2:04 ` Richard Salz
2021-02-03 3:32 ` Dave Horsfall
0 siblings, 1 reply; 44+ messages in thread
From: Richard Salz @ 2021-02-03 2:04 UTC (permalink / raw)
To: Larry McVoy; +Cc: The Eunuchs Hysterical Society
> pointer to a bit
BBN made a machine "optimized" for C. It was used in the first generation
ARPAnet gateways.
A word was 10 bits. The amount of masking we had to do for some portable
software was unreal.
* Re: [TUHS] reviving a bit of WWB
2021-02-03 2:04 ` Richard Salz
@ 2021-02-03 3:32 ` Dave Horsfall
2021-02-03 4:32 ` M Douglas McIlroy
0 siblings, 1 reply; 44+ messages in thread
From: Dave Horsfall @ 2021-02-03 3:32 UTC (permalink / raw)
To: The Eunuchs Hysterical Society
On Tue, 2 Feb 2021, Richard Salz wrote:
> BBN made a machine "optimized" for C. It was used in the first
> generation ARPAnet gateways.
>
> A word was 10 bits. The amount of masking we had to do for some portable
> software was unreal.
I'm trying to get my head around a 10-bit machine optimised for C...
Well, if you accept that chars are 10 bits wide then there shouldn't be
(much of) a problem; just forget about the concept of powers of 2, I
guess.
Shades of the 60-bit CDC series, as handling strings was a bit of a
bugger; at least the 12-bit PDP-8 was sort of manageable.
-- Dave
* Re: [TUHS] reviving a bit of WWB
2021-02-03 3:32 ` Dave Horsfall
@ 2021-02-03 4:32 ` M Douglas McIlroy
2021-02-03 11:27 ` Peter Jeremy via TUHS
2021-02-03 22:19 ` Dave Horsfall
0 siblings, 2 replies; 44+ messages in thread
From: M Douglas McIlroy @ 2021-02-03 4:32 UTC (permalink / raw)
To: Dave Horsfall; +Cc: The Eunuchs Hysterical Society
> I'm trying to get my head around a 10-bit machine optimised for C.
How about 23 bits? That was one of the early ESS machines, evidently
optimized to make every bit count. (Maybe a prime word width helps
with hashing?)
Whirlwind II (built in 1952) was 16 bits. It took a long while for that
to become common wisdom.
Doug
On Tue, Feb 2, 2021 at 10:32 PM Dave Horsfall <dave@horsfall.org> wrote:
> On Tue, 2 Feb 2021, Richard Salz wrote:
>
> > BBN made a machine "optimized" for C. It was used in the first
> > generation ARPAnet gateways.
> >
> > A word was 10 bits. The amount of masking we had to do for some portable
> > software was unreal.
>
> I'm trying to get my head around a 10-bit machine optimised for C...
> Well, if you accept that chars are 10 bits wide then there shouldn't be
> (much of) a problem; just forget about the concept of powers of 2, I
> guess.
>
> Shades of the 60-bit CDC series, as handling strings was a bit of a
> bugger; at least the 12-bit PDP-8 was sort of manageable.
>
> -- Dave
* Re: [TUHS] reviving a bit of WWB
2021-02-03 4:32 ` M Douglas McIlroy
@ 2021-02-03 11:27 ` Peter Jeremy via TUHS
2021-02-03 20:09 ` Dave Horsfall
2021-02-03 22:19 ` Dave Horsfall
1 sibling, 1 reply; 44+ messages in thread
From: Peter Jeremy via TUHS @ 2021-02-03 11:27 UTC (permalink / raw)
To: M Douglas McIlroy; +Cc: The Eunuchs Hysterical Society
On 2021-Feb-02 23:32:29 -0500, M Douglas McIlroy <m.douglas.mcilroy@dartmouth.edu> wrote:
>> I'm trying to get my head around a 10-bit machine optimised for C.
>How about 23-bits? That was one of the early ESS machines, evidently
>optimized to make every bit count. (Maybe a prime wordwidth helps
>with hashing?)
>Whirlwind II (built in 1952), was 16 bits. It took a long while for that
>to become common wisdom.
I'm not sure that 16 (or any other 2^n) bits is that obvious up front.
Does anyone know why the computer industry wound up standardising on
8-bit bytes?
Scientific computers were word-based and the number of bits in a word
is more driven by the desired float range/precision. Commercial
computers needed to support BCD numbers and typically 6-bit characters.
ASCII (when it turned up) was 7 bits and so 8-bit characters wasted
⅛ of the storage. Minis tended to have shorter word sizes to minimise
the amount of hardware.
--
Peter Jeremy
* Re: [TUHS] reviving a bit of WWB
2021-02-03 11:27 ` Peter Jeremy via TUHS
@ 2021-02-03 20:09 ` Dave Horsfall
2021-02-03 20:13 ` Niklas Karlsson
2021-02-03 23:46 ` Tom Lyon
0 siblings, 2 replies; 44+ messages in thread
From: Dave Horsfall @ 2021-02-03 20:09 UTC (permalink / raw)
To: The Eunuchs Hysterical Society
On Wed, 3 Feb 2021, Peter Jeremy wrote:
> I'm not sure that 16 (or any other 2^n) bits is that obvious up front.
> Does anyone know why the computer industry wound up standardising on
> 8-bit bytes?
Best reason I can think of is System/360 with 8-bit EBCDIC (Ugh! Who said
that "J" should follow "I"?). I'm told that you could coerce it into
using ASCII, although I've never seen it.
> Scientific computers were word-based and the number of bits in a word is
> more driven by the desired float range/precision. Commercial computers
> needed to support BCD numbers and typically 6-bit characters. ASCII
> (when it turned up) was 7 bits and so 8-bit characters wasted ⅛ of the
> storage. Minis tended to have shorter word sizes to minimise the amount
> of hardware.
Why would you want to have a 7-bit symbol? Powers of two seem to be
natural on a binary machine (although there is a running joke that CDC
boxes had 7-1/2-bit bytes...).
I guess the real question is why did we move to binary machines at all;
were there ever any ternary machines?
-- Dave
* Re: [TUHS] reviving a bit of WWB
2021-02-03 20:09 ` Dave Horsfall
@ 2021-02-03 20:13 ` Niklas Karlsson
2021-02-03 23:46 ` Tom Lyon
1 sibling, 0 replies; 44+ messages in thread
From: Niklas Karlsson @ 2021-02-03 20:13 UTC (permalink / raw)
To: Dave Horsfall; +Cc: The Eunuchs Hysterical Society
According to Wikipedia:

The first modern, electronic ternary computer, Setun
<https://en.wikipedia.org/wiki/Setun>, was built in 1958 in the Soviet
Union at Moscow State University by Nikolay Brusentsov, and it had
notable advantages over the binary computers that eventually replaced
it, such as lower electricity consumption and lower production cost.
In 1970 Brusentsov built an enhanced version of the computer, which he
called Setun-70. In the United States, the ternary computing emulator
Ternac, working on a binary machine, was developed in 1973. The
ternary computer QTC-1 was developed in Canada.

Doesn't seem like they caught on otherwise, though.
Niklas
Den ons 3 feb. 2021 kl 21:10 skrev Dave Horsfall <dave@horsfall.org>:
> On Wed, 3 Feb 2021, Peter Jeremy wrote:
>
> > I'm not sure that 16 (or any other 2^n) bits is that obvious up front.
> > Does anyone know why the computer industry wound up standardising on
> > 8-bit bytes?
>
> Best reason I can think of is System/360 with 8-bit EBCDIC (Ugh! Who said
> that "J" should follow "I"?). I'm told that you could coerce it into
> using ASCII, although I've never seen it.
>
> > Scientific computers were word-based and the number of bits in a word is
> > more driven by the desired float range/precision. Commercial computers
> > needed to support BCD numbers and typically 6-bit characters. ASCII
> > (when it turned up) was 7 bits and so 8-bit characters wasted ⅛ of the
> > storage. Minis tended to have shorter word sizes to minimise the amount
> > of hardware.
>
> Why would you want to have a 7-bit symbol? Powers of two seem to be
> natural on a binary machine (although there is a running joke that CDC
> boxes has 7-1/2 bit bytes...
>
> I guess the real question is why did we move to binary machines at all;
> were there ever any ternary machines?
>
> -- Dave
* Re: [TUHS] reviving a bit of WWB
2021-02-03 20:09 ` Dave Horsfall
2021-02-03 20:13 ` Niklas Karlsson
@ 2021-02-03 23:46 ` Tom Lyon
1 sibling, 0 replies; 44+ messages in thread
From: Tom Lyon @ 2021-02-03 23:46 UTC (permalink / raw)
To: Dave Horsfall; +Cc: The Eunuchs Hysterical Society
System/360s, or at least 370s, could do ASCII perfectly well.
When we started UNIX on VM/370, it was clear to us that we wanted to run
with ASCII. But some otherwise intelligent people told us that it *just
couldn't be done* - the instructions depended on EBCDIC.
But I think there was only one machine instruction with any hint of
EBCDIC, and it was an instruction that no-one could imagine being used
by a compiler.
Of course, plenty of EBCDIC/ASCII conversions went on in drivers, etc, but
that was easy.
On Wed, Feb 3, 2021 at 12:09 PM Dave Horsfall <dave@horsfall.org> wrote:
> On Wed, 3 Feb 2021, Peter Jeremy wrote:
>
> > I'm not sure that 16 (or any other 2^n) bits is that obvious up front.
> > Does anyone know why the computer industry wound up standardising on
> > 8-bit bytes?
>
> Best reason I can think of is System/360 with 8-bit EBCDIC (Ugh! Who said
> that "J" should follow "I"?). I'm told that you could coerce it into
> using ASCII, although I've never seen it.
>
> > Scientific computers were word-based and the number of bits in a word is
> > more driven by the desired float range/precision. Commercial computers
> > needed to support BCD numbers and typically 6-bit characters. ASCII
> > (when it turned up) was 7 bits and so 8-bit characters wasted ⅛ of the
> > storage. Minis tended to have shorter word sizes to minimise the amount
> > of hardware.
>
> Why would you want to have a 7-bit symbol? Powers of two seem to be
> natural on a binary machine (although there is a running joke that CDC
> boxes has 7-1/2 bit bytes...
>
> I guess the real question is why did we move to binary machines at all;
> were there ever any ternary machines?
>
> -- Dave
--
- Tom
* Re: [TUHS] reviving a bit of WWB
2021-02-03 4:32 ` M Douglas McIlroy
2021-02-03 11:27 ` Peter Jeremy via TUHS
@ 2021-02-03 22:19 ` Dave Horsfall
2021-02-03 22:55 ` M Douglas McIlroy
1 sibling, 1 reply; 44+ messages in thread
From: Dave Horsfall @ 2021-02-03 22:19 UTC (permalink / raw)
To: The Eunuchs Hysterical Society
On Tue, 2 Feb 2021, M Douglas McIlroy wrote:
> > I'm trying to get my head around a 10-bit machine optimised for C.
>
> How about 23-bits? That was one of the early ESS machines, evidently
> optimized to make every bit count. (Maybe a prime wordwidth helps with
> hashing?)
23 bits? I think I'm about to throw up... Yeah, being prime I suppose it
would help with hashing (and other crypto stuff).
> Whirlwind II (built in 1952), was 16 bits. It took a long while for that
> to become common wisdom.
Now that goes back...
-- Dave
* Re: [TUHS] reviving a bit of WWB
2021-02-03 22:19 ` Dave Horsfall
@ 2021-02-03 22:55 ` M Douglas McIlroy
0 siblings, 0 replies; 44+ messages in thread
From: M Douglas McIlroy @ 2021-02-03 22:55 UTC (permalink / raw)
To: Dave Horsfall; +Cc: The Eunuchs Hysterical Society
>> Whirlwind II (built in 1952), was 16 bits. It took a long while for that
>> to become common wisdom.
> Now that goes back...
Yup. Before my time. I didn't get to use it until 1954.
Doug
On Wed, Feb 3, 2021 at 5:19 PM Dave Horsfall <dave@horsfall.org> wrote:
> On Tue, 2 Feb 2021, M Douglas McIlroy wrote:
>
> > > I'm trying to get my head around a 10-bit machine optimised for C.
> >
> > How about 23-bits? That was one of the early ESS machines, evidently
> > optimized to make every bit count. (Maybe a prime wordwidth helps with
> > hashing?)
>
> 23 bits? I think I'm about to throw up... Yeah, being prime I suppose it
> would help with hashing (and other crypto stuff).
>
> > Whirlwind II (built in 1952), was 16 bits. It took a long while for that
> > to become common wisdom.
>
> Now that goes back...
>
> -- Dave
>
* Re: [TUHS] reviving a bit of WWB
2020-09-20 20:58 ` Steve Nickolas
2020-09-20 21:33 ` Brantley Coile
2020-09-20 21:35 ` John Cowan
@ 2020-09-20 22:15 ` Clem Cole
2020-09-20 22:47 ` John Cowan
2020-09-21 20:46 ` Steffen Nurpmeso
3 siblings, 1 reply; 44+ messages in thread
From: Clem Cole @ 2020-09-20 22:15 UTC (permalink / raw)
To: Steve Nickolas; +Cc: The Eunuchs Hysterical Society, Doug McIlroy
On Sun, Sep 20, 2020 at 4:59 PM Steve Nickolas <usotsuki@buric.co> wrote:
> I was under the impression that there was explicitly no requirement that a
> null pointer be 0,
>
Indeed, section 7.19 states it is *implementation-defined*. See my
previous message.
* Re: [TUHS] reviving a bit of WWB
2020-09-20 22:15 ` Clem Cole
@ 2020-09-20 22:47 ` John Cowan
2020-09-21 20:48 ` Steffen Nurpmeso
0 siblings, 1 reply; 44+ messages in thread
From: John Cowan @ 2020-09-20 22:47 UTC (permalink / raw)
To: Clem Cole; +Cc: The Eunuchs Hysterical Society, Doug McIlroy
The confusion (I dare not call it a flame war) is arising out of the
difference between an object with all bits zero and a 0 constant (or
equivalently 2*0 or 3-3 or what not). 0 in pointer context is always a
null pointer, but it may or may not be all-bits-zero. 0 in integer context
is, on any sane machine, all-bits-zero (on 1's-complement machines it may
also be all-bits-one).
Personally, when I was programming in C I defined a macro #define
NULLPTR(t) ((t)0), so that I would write NULLPTR(char *) or NULLPTR(int *)
or whatever the Right Thing was.
On Sun, Sep 20, 2020 at 6:16 PM Clem Cole <clemc@ccc.com> wrote:
>
>
> On Sun, Sep 20, 2020 at 4:59 PM Steve Nickolas <usotsuki@buric.co> wrote:
>
>> I was under the impression that there was explicitly no requirement that
>> a
>> null pointer be 0,
>>
> Indeed, section 7.19 states it is *implementation-defined*. See my
> previous message.
>
* Re: [TUHS] reviving a bit of WWB
2020-09-20 22:47 ` John Cowan
@ 2020-09-21 20:48 ` Steffen Nurpmeso
0 siblings, 0 replies; 44+ messages in thread
From: Steffen Nurpmeso @ 2020-09-21 20:48 UTC (permalink / raw)
To: John Cowan; +Cc: The Eunuchs Hysterical Society, Doug McIlroy
John Cowan wrote in
<CAD2gp_SV19s6yHsdApn8Rfkgb3OZCVFdd++_MWTsX5b7c_Jbig@mail.gmail.com>:
|The confusion (I dare not call it a flame war) is arising out of the
|difference between an object with all bits zero and a 0 constant (or
|equivalently 2*0 or 3-3 or what not). 0 in pointer context is always a
|null pointer, but it may or may not be all-bits-zero. 0 in integer context
|is, on any sane machine, all-bits-zero (on 1's-complement machines it may
|also be all-bits-one).
|
|Personally, when I was programming in C I defined a macro #define
|NULLPTR(t) ((t)0), so that I would write NULLPTR(char *) or NULLPTR(int *)
|or whatever the Right Thing was.
And I think POSIX too is about to define this explicitly in
the future (that a null pointer is all bits zero).
--steffen
|
|Der Kragenbaer, The moon bear,
|der holt sich munter he cheerfully and one by one
|einen nach dem anderen runter wa.ks himself off
|(By Robert Gernhardt)
* Re: [TUHS] reviving a bit of WWB
2020-09-20 20:58 ` Steve Nickolas
` (2 preceding siblings ...)
2020-09-20 22:15 ` Clem Cole
@ 2020-09-21 20:46 ` Steffen Nurpmeso
3 siblings, 0 replies; 44+ messages in thread
From: Steffen Nurpmeso @ 2020-09-21 20:46 UTC (permalink / raw)
To: Steve Nickolas; +Cc: tuhs, Doug McIlroy
Steve Nickolas wrote in
<alpine.DEB.2.21.2009201654300.10605@sd-119843.dedibox.fr>:
|On Sun, 20 Sep 2020, Doug McIlroy wrote:
|
|>> (Of course, that assumes NULL is 0, but I don't think I've run into any
|>> architecture so braindead as to not have NULL=0.)
|>
|> It has nothing to do with machine architecture. The C standard
|> says 0 coerces to the null pointer. NULL, defined in <stddef.h>,
|> is part of the library, not the language. I always use 0,
|> because NULL is a frill.
|>
|> Doug
|
|I was under the impression that there was explicitly no requirement that a
|null pointer be 0, and that there was at least one weird system where that
|wasn't true - that it just so happened that null points to 0 on certain
|CPUs and that 0=NULL *happens* to work on most CPUs but wasn't guaranteed.
|(In fact, I read that my habit of using 0 for NULL relied on a faulty
|assumption!)
|
|I mean, I've never actually used a CPU/OS/compiler where it wasn't true,
|but...
I remember having to use __null for __GNUC__>=3 because 0x0 (which is
what my NIL macro used before; this was C++) did not work out
well. (Could have been a compiler bug, of course... but I just
remembered it when reading your post.)
--steffen
|
|Der Kragenbaer, The moon bear,
|der holt sich munter he cheerfully and one by one
|einen nach dem anderen runter wa.ks himself off
|(By Robert Gernhardt)