Eugene Miya visited last week and accidentally left his copy of the
book here, so I decided to read it before he came back to pick it up.

My overall impression is that while it contained a lot of information,
it wasn't presented in a manner that I found interesting. I don't know
the intended target audience, but it's not me.

A good part of it is that my interest is in the evolution of technology.
I think that a more accurate title for the book would be "A New History
of the Business of Modern Computing". The book was thorough in covering
the number of each type of machine sold and how much money was made, but
that's only of passing interest to me. Were it me, I would have just
summarized all that in a table and used the space to tell some engaging
anecdotes.

There were a number of things that I felt the book glossed over or missed
completely.

One is that I didn't think that they gave sufficient credit to the symbiosis
between C and the PDP-11 instruction set and the degree to which the PDP-11
was enormously influential.

Another is that I felt that the book didn't give computer graphics adequate
treatment. I realize that it was primarily in the workstation market segment,
which was not as large as some of the other segments, but in my opinion the
development of the technology was hugely important as it eventually became
commodified and highly profitable.

Probably due to my personal involvement, I felt that the book missed some
important steps along the path toward open source. In particular, it used
the IPO of Red Hat as the seminal moment while not even mentioning the role
of Cygnus. My opinion is that Cygnus was a huge icebreaker in the adoption
of open source by the business world, and that the Red Hat IPO was just the
culmination.

I also didn't feel that there was any message or takeaways for readers. I
didn't get any "based on all this I should go and do that" sort of feeling.
If the purpose of the book was to present a dry history then it pretty much
did its job. Obviously the authors had to pick and choose what to write
about, and I would have made some different choices. But, not my book.

Jon
Is there a symbiosis between C and the PDP-11 instruction set? The
machine was vital to C and Unix's success, but primarily due to the
availability of a department-sized machine. Was the instruction set a
significant component? Most Unix programmers wrote little to no
assembly, although perhaps more read what came out of the compiler.
But did it matter? Auto-increment and -decrement are often cited in
this story, but they are not that important, really, and were around
well before the PDP-11 made its appearance.
I'm curious to hear arguments on either side.
-rob
On Mon, Nov 29, 2021 at 7:29 AM Jon Steinhart <jon@fourwinds.com> wrote:
> [...]
Rob Pike writes:
> Is there a symbiosis between C and the PDP-11 instruction set? The
> machine was vital to C and Unix's success, but primarily due to the
> availability of a department-sized machine. Was the instruction set a
> significant component? Most Unix programmers wrote little to no
> assembly, although perhaps more read what came out of the compiler.
> But did it matter? Auto-increment and -decrement are often cited in
> this story, but they are not that important, really, and were around
> well before the PDP-11 made its appearance.
>
> I'm curious to hear arguments on either side.
>
> -rob
Well, might just be my personal experience, but most of the machines
that I had used before the 11 were classic accumulator architectures.
I feel that the 11's pointer architecture combined with autoincrement
and autodecrement was an amazing fit for C. If I remember correctly,
it was very cool to have *p++ = *q++ be a single instruction.
BTW, one thing that I forgot in my earlier post is that I think that
the book also omitted any mention of Creative Commons. The book did
talk about the business of the web and such, and it's my opinion that
CC was an essential third prong. The machines were one, the software
was another, the third was content and CC was a big enabler.
Jon
I heard that the null terminated string was a 11-build-in.
The PDP-11 had very little the syntax of B expressions.
All of that was in place in B long before the PDP-11.
To be honest, the byte addressing of the 11 was a
significant hindrance. It was the genius of Dennis
that was able to conquer the 11 as he installed types
into the language.

So, my opinion, the PDP-11 had no design on the
type system of C and moreover it was not even helpful.

On Sun, Nov 28, 2021 at 1:17 PM Jon Steinhart <jon@fourwinds.com> wrote:
> [...]
On Sun, 28 Nov 2021, Thomas Paulsen wrote:
> I heard that the null terminated string was a 11-build-in.
It's a fairly good fit for 6502, too. When I write 6502 code, all my
messages are stored as C strings. Because on an Apple, something like
this...
putch = $FDED           ; Apple II monitor character-output routine (COUT)
entry: ldy #$00         ; Y indexes into the message
@1: lda msg, y          ; fetch the next byte
beq @2                  ; zero terminator? then done
eor #$80                ; set the high bit, as Apple text output expects
jsr putch               ; print the character
iny
bne @1                  ; loop (messages under 256 bytes)
@2: rts
msg: .byte "Hello, cruel world.", 13, 0
...is pretty easy to do.
-uso.
It's been a long time, but my memory is that PDP-11 instructions were way
cleaner than any other system I've seen. My TA for my PDP-11 assembly class
could read octal like it was C; I was never that good. He told me it was
actually pretty easy once you memorized the instruction set, which he
claimed was not hard because it was so uniform. I never learned it well
enough to know, just did a handful of programs in assembler, but his
description has stuck with me.

I've had to learn enough SPARC, MIPS, and (shudder) x86 to do kernel
debugging, and I've never gotten the sense that they were as clean as the
PDP-11 was.

On Mon, Nov 29, 2021 at 08:07:57AM +1100, Rob Pike wrote:
> [...]

--
Larry McVoy            lm at mcvoy.com            http://www.mcvoy.com/lm
Ken Thompson writes:
>
> The PDP-11 had very little the syntax of B expressions.
> All of that was in place in B long before the PDP-11.
> To be honest, the byte addressing of the 11 was a
> significant hindrance. It was the genius of Dennis
> that was able to conquer the 11 as he installed types
> into the language.
>
> So, my opinion, the PDP-11 had no design on the
> type system of C and moreover it was not even helpful.
OK then. You *would* be the expert.
It's a just-so story. We have nostalgia for Unix, C, and the PDP-11
and its instruction set, and we combine them all into the story about
why it all succeeded. But nostalgia can mislead.
I loved the PDP-11 and its instruction set, I loved C, and I loved
Unix. Memory has put causation in there that is not altogether true.
The PDP-11 as an affordable commercial computer, now _that_ was important.
-rob
The ++ operator appears to have been. The PDP-11 had addressing modes to do predecrement and postincrement.
> On Nov 28, 2021, at 16:41, Steve Nickolas <usotsuki@buric.co> wrote:
>
> On Sun, 28 Nov 2021, Thomas Paulsen wrote:
>
>> I heard that the null terminated string was a 11-build-in.
>
> [...]
> The ++ operator appears to have been.

One would expect that most people on this list would have read "The
Development of the C Language", by Dennis Ritchie, which makes perfectly
clear (at 'More History') that the PDP-11 had nothing to do with it:

  Thompson went a step further by inventing the ++ and -- operators, which
  increment or decrement; their prefix or postfix position determines
  whether the alteration occurs before or after noting the value of the
  operand. They were not in the earliest versions of B, but appeared along
  the way. People often guess that they were created to use the
  auto-increment and auto-decrement address modes provided by the DEC
  PDP-11 on which C and Unix first became popular. This is historically
  impossible, since there was no PDP-11 when B was developed.

https://www.bell-labs.com/usr/dmr/www/chist.html

thereby alleviating the need for Ken to chime in (although they do allow a
very efficient implementation of it).

Too much to hope for, I guess.

    Noel
Getting a bit far afield from Unixes, but A Quick Rundown Of Instruction
Sets I Have Known, more or less in the order I learned them:

6502: you never forget your first love, and, sure, it's constrained, but
it's elegant and concise and I still adore it.

68k: Lovely. I used it before I ever used the PDP-11, but in retrospect
it's like the PDP-11 but more so. Roomy, comfortable, regular. Too bad it
lost to x86 in the marketplace.

8051: I mean, OK, I get it, you need a low-cost embedded architecture and
it's the 1980s, but...yuck.

x86-and-descendents: the less said the better. Maybe I just don't like
Intel's designs?

SPARC: It's not bad. Having lots of registers is nice. But by the time it
came along compilers were good enough that I never actually needed to use
it in anger.

S/360-and-descendents: The S/360 is OK, even nice, in a very 1960s IBM way.
And then its evolution just KEPT adding ever more baroque filigrees onto
it. Don't get me wrong, I love SIE, because I love VM, but even that is
kind of a bag on the side, and by the time you get to System z...this is
what happens when you don't start over from a clean sheet every so often.

PDP-11: There's a very good reason it was used as a model architecture in
coursework for decades. Also regular and comfortable.

TI-99/4A (more or less TI 9900): I like microcode as much as anyone but
honestly this is pretty silly here, folks.

These days I'm kinda sorta poking at RISC-V and ARM. Not that I need to,
but they seem nifty.

Adam

On Sun, Nov 28, 2021 at 4:15 PM Noel Chiappa <jnc@mercury.lcs.mit.edu> wrote:
> > The ++ operator appears to have been.
> [...]
Rob, I offer a small tweak to your statement, that I hope you will consider.

On Sun, Nov 28, 2021 at 5:20 PM Rob Pike <robpike@gmail.com> wrote:
> The PDP-11 as an affordable commercial computer, now _that_ was important.

s/computer/mini-computer/

I really believe that this distinction is important. Bell coined the term
in the late 1950s/early 1960s when he called it a minicomputer. The key is
that he meant >>minimal computer - in function and price<< (not small).
(This would eventually lead to Bell's law for the birth and death of
computer classes.)

To me, the PDP-11 ISA is the epitome of the *minimal computer architecture*
- just what you need to get the job done, be it commercial or scientific,
and it was affordable, as you said. The solution is elegant, nothing fancy,
little extra added - just the right set of features for a system to do real
work. It was also extremely regular, as Larry points out, so it was not
filled with a ton of special cases. It did have a few more features, like
addressing modes and multiple registers, that made it more complex than,
say, an accumulator-based PDP-8. But the small set of new features made
sense and was of use for almost all programmers. [FWIW: IMHO, most new
features we add to Intel*64 are all for some special cases for a specific
customer.]

I note that the VAX (which is the epitome of CISC and, while
extraordinarily successful, has always been an easy target) was way too
complicated, filled with many special cases (just for the Fortran compiler,
or for Cutler as an assembly programmer).

IMHO: C is also made from the same minimal ideal. It took the simplicity
of B and added typing and better data structures, but did not overdo it.
Again, what was added was useful to almost all programmers.

I note that while the follow-ons to both the 11 (the VAX) and C (C++)
became workhorses, both are as ugly as can be, and neither would I call
elegant. I've used them both; however, I have moved on since that time. I
do pine for something more like a 64-bit PDP-11 (more in a minute), and
still use C when I can in the kernel, or Go when in userspace.

Having kicked around DEC during some of the Alpha discussions, other than
the original lack of byte addressing, I think the PDP-11 influenced the
Alpha more than the VAX did. There was a definite "why is this needed?"
kind of thinking. Keep it simple and minimal.

As for Unix (since this is a Unix history list), again I think it is the
minimal view I miss from Sixth and Seventh Edition. I look at Linux and
mostly turn green with how much has been lost from those days. But like
the PDP-11, I cannot really go back. My hope is that something will appear
that is "good enough" and "simple enough" to get people excited again.

my 2 cents,
Clem
On Sun, Nov 28, 2021 at 07:19:08PM -0500, Clem Cole wrote:
> To me, the PDP-11 ISA is the epitome of the *minimal computer
> architecture* - just what you need to get the job done, be it commercial
> or scientific, and it was affordable as you said. The solution is
> elegant, nothing fancy, little extra added - just the right set of
> features for a system to do real work. It was also extremely regular as
> Larry points out, so it was not filled with a ton of special cases.

I remember Ken Witte (my TA for the PDP-11 class) trying to get me to see
how easy it was to read the octal. If I remember correctly (and I probably
don't, this was ~40 years ago), the instructions were divided into fields,
so instruction, operand, operand, and it was all regular, so you could see
that this was some form of an add or whatever, it got the values from these
registers and put it in that register.

I remember Ken trying to get me to see how uniform it all was and I guess I
sort of got it, but what I remember the most is his passion for it. We were
pretty friendly and if I had some big octal listing that wasn't working,
he'd come over and drink a beer and read through it. For him, it was just
faster to read the octal than to look at my tortured assembly.

> 64 bit PDP-11

That would be pretty cool. Your comments about minimalist approaches ring
really true for me. The last conversation I had with Greg Chesson was a
2 hour rant from him about the fact that nobody who is doing anything these
days understands the value of a minimalist approach, it's one complex
framework or whatever after another. There is a reason that the people I
respect the most tend to spend a lot of time on what not to put in, rather
than what to put in. I became friends with Linus Torvalds because we spent
probably almost a year talking about what not to put in to LMbench, we
wanted to get it right.

I know people look at Linux and recoil in horror, it's a long way from v6
or v7. But v7 was a uniprocessor Unix that had no networking. Linux scales
pretty well on SMPs with lots of CPUs, it has generalized NUMA support, it
has a /proc that I'd argue is way more true to Unix than the SysV /proc
(I don't know Ron Gomes but I was friends with Roger Faulkner), the Linux
/proc is all strings, it's so useful. Linux just handles way way way way
way more than v7 could even imagine handling. Pretty much all of the
supercomputers are Linux, so it scales up, and it scales down to a
Raspberry Pi.

A thing that blew my mind in Linux was drivers. PCI drivers. They were
portable to different byte order machines. I was so used to drivers being
specific to the CPU, that was eye opening.

I'd say more but the wife is calling, I just wanted you to know that Linus
definitely understands the minimalist approach, Linux started that way but
it has been asked to do a lot so you get what you get.

--lm
I suspect because we believed we understood the pdp11 we felt we'd
understand a good operating system on it.

If more tertiary education people had been on other hardware of the day,
we'd probably have invented the same myths for that host.

G

On Mon, 29 Nov 2021, 10:22 am Clem Cole, <clemc@ccc.com> wrote:
> [...]
On Nov 28, 2021, at 4:19 PM, Clem Cole <clemc@ccc.com> wrote:
>
> My hope is that something will appear that is "good enough" and
> '"simple enough" to get people excited again
My hope is for "Unix as a service". Just another service for
programs that need a Unix API. I think KeyNIX (Unix on top of
KeyKOS) had the right idea but the problem was and is that
typically hardware is not optimized for IPC & fast context
switching. That may change when we can put zillions of cores
on a chip but can't speed up individual cores. UAAS may even
facilitate a move to a sea-of-cores architecture!
Was B, or rather BCPL, influenced by Algol68? It too had
<var> <op>:= <value>
as a shorthand for
<var> := <var> op <value>
Its declaration
<type> <name>
is the same as in C. Though in A68 this was a shorthand for
ref <type> <name> = loc <type>
> On Nov 28, 2021, at 1:31 PM, Ken Thompson <kenbob@gmail.com> wrote:
>
> The PDP-11 had very little the syntax of B expressions.
> All of that was in place in B long before the PDP-11.
> To be honest, the byte addressing of the 11 was a
> significant hindrance. It was the genius of Dennis
> that was able to conquer the 11 as he installed types
> into the language.
>
> So, my opinion, the PDP-11 had no design on the
> type system of C and moreover it was not even helpful.
>
> On Sun, Nov 28, 2021 at 1:17 PM Jon Steinhart <jon@fourwinds.com> wrote:
> Rob Pike writes:
> > Is there a symbiosis between C and the PDP-11 instruction set? The
> > machine was vital to C and Unix's success, but primarily due to the
> > availability of a department-sized machine. Was the instruction set a
> > significant component? Most Unix programmers wrote little to no
> > assembly, although perhaps more read what came out of the compiler.
> > But did it matter? Auto-increment and -decrement are often cited in
> > this story, but they are not that important, really, and were around
> > well before the PDP-11 made its appearance.
> >
> > I'm curious to hear arguments on either side.
> >
> > -rob
>
> Well, might just be my personal experience, but most of the machines
> that I had used before the 11 were classic accumulator architectures.
> I feel that the 11's pointer architecture combined with autoincrement
> and autodecrement was an amazing fit for C. If I remember correctly,
> it was very cool to have *p++ = *q++ be a single instruction.
>
> BTW, one thing that I forgot in my earlier post is that I think that
> the book also omitted any mention of Creative Commons. The book did
> talk about the business of the web and such, and it's my opinion that
> CC was an essential third prong. The machines were one, the software
> was another, the third was content and CC was a big enabler.
>
> Jon
On Sun, Nov 28, 2021 at 6:37 PM Adam Thornton <athornton@gmail.com> wrote:
> PDP-11: There's a very good reason it was used as a model architecture in
> coursework for decades. Also regular and comfortable.
MIPS II / MIPS32 has also been used as a model architecture: it's 32-bit
and supported by current gcc (as is the PDP-11). A short paper on running
C programs on the JVM is at <http://www.xwt.org/mips2java/>: you compiled
them to MIPS R2K statically linked executables and then compiled the
resulting executables to Java. The regularity of the MIPS ISA made the
compiled code decently fast even before the JVM JIT.
On Nov 28, 2021, at 5:12 PM, Larry McVoy <lm@mcvoy.com> wrote:
>
> On Sun, Nov 28, 2021 at 07:19:08PM -0500, Clem Cole wrote:
>
>> 64 bit PDP-11
>
> That would be pretty cool. Your comments about minimalist approaches ring
> really true for me. The last conversation I had with Greg Chesson was
> a 2 hour rant from him about the fact that nobody who is doing anything
> these days understands the value of a minimalist approach, it's one
> complex framework or whatever after another.
Indeed.
As far as processor design is concerned, I believe one of the
problems is that there are fewer and fewer people who can do
both h/w and s/w design competently. This is why I think more
programmers should roll up their sleeves and design a
processor and understand the issues involved, especially now
that programming FPGAs is becoming common. Maybe start with
an existing RISC-V core in some HDL, and push and pull it
into (what you think is) an ideal minimalist design. Even
adding a codegen target for such a processor (to at least tcc
or Ken's C compiler) won't be all that hard. I believe this
sort of co-design is what is needed to move past the current
designs. It will likely be some young whippersnapper who
doesn't know what is impossible, rather than one of us
greybeards!
Bakul Shah <bakul@iitbombay.org> wrote:
> Was B, or rather BCPL, influenced by Algol68? It too had
> <var> <op>:= <value>
> as a shorthand for
> <var> := <var> op <value>
> Its declaration
> <type> <name>
> is the same as in C. Though in A68 this was a shorthand for
> ref <type> <name> = loc <type>
I don't know if it was purposeful or not, but Algol 68 had the notion
of deproceduring - i.e. function call, which seems to have carried over
into C where the name of a function is a pointer to it. You can do
void myproc();
void (*functptr) = myproc;
...
funcptr()
to call through the pointer. (Even though the K&R book taught us
to use (*funcptr)(), the syntax above worked at least as far back
as PCC.)
Did C pick this up from Algol 68? I have no idea, but it would not
surprise me if it had.
Arnold
arnold@skeeve.com wrote:
> void myproc();
> void (*functptr) = myproc;
> ...
> funcptr()
Make that
void (*funcptr)() = myproc.
Sorry.
On 28 Nov 2021 17:47 -0800, from bakul@iitbombay.org (Bakul Shah):
> Was B, or rather BCPL, influenced by Algol68? It too had
> <var> <op>:= <value>
> as a shorthand for
> <var> := <var> op <value>

The already mentioned https://www.bell-labs.com/usr/dmr/www/chist.html
states that "For example, B introduced generalized assignment operators,
using x=+y to add y to x. The notation came from Algol 68 [Wijngaarden 75]
via McIlroy, who had incorporated it into his version of TMG."

-- 
Michael Kjörling • https://michael.kjorling.se • michael@kjorling.se
“Remember when, on the Internet, nobody cared that you were a dog?”
On 11/28/21 6:35 PM, Adam Thornton wrote:
> Getting a bit far afield from Unixes, but A Quick Rundown Of
> Instruction Sets I Have Known, more or less in the order I learned them:
>
> 6502: you never forget your first love, and, sure, it's constrained,
> but it's elegant and concise and I still adore it.
> 68k: Lovely. I used it before I ever used the PDP-11, but in
> retrospect it's like the PDP-11 but more so. Roomy, comfortable,
> regular. Too bad it lost to x86 in the marketplace.
> 8051: I mean, OK, I get it, you need a low-cost embedded architecture
> and it's the 1980s, but...yuck.
> x86-and-descendants: the less said the better. Maybe I just don't
> like Intel's designs?
> SPARC: It's not bad. Having lots of registers is nice. But by the
> time it came along compilers were good enough that I never actually
> needed to use it in anger.
> S/360-and-descendants: The S/360 is OK, even nice, in a very 1960s IBM
> way. And then its evolution just KEPT adding ever more baroque
> filigrees onto it. Don't get me wrong, I love SIE, because I love VM,
> but even that is kind of a bag on the side, and by the time you get to
> System z...this is what happens when you don't start over from a clean
> sheet every so often.
> PDP-11: There's a very good reason it was used as a model architecture
> in coursework for decades. Also regular and comfortable.
> TI-99/4A (more or less TI 9900): I like microcode as much as anyone
> but honestly this is pretty silly here, folks.
>
When I was in high school, I loved reading about instruction sets. I
recommend the first five volumes of Annual Review in Automatic
Programming, if you are interested.
The DEC instructions sets were all quite elegant, from the minimal PDP-8
(nee PDP-5) 12-bit machine to the PDP-10 (nee 6). I maintained the BCPL
compiler at BBN for a while in the 1970's, and it was a pleasure to
figure out what machine code to generate.
Then there was RISC vs CISC, where the VAX was a major punching bag. I
was at Berkeley for RISC-I, and was a part of the small student group
that did its register windows scheme.
On Mon, Nov 29, 2021 at 12:52:06AM -0700, arnold@skeeve.com wrote:
> arnold@skeeve.com wrote:
> > void myproc();
> > void (*functptr) = myproc;
> > ...
> > funcptr()
> Make that
> void (*funcptr)() = myproc.
> Sorry.

Function pointers are the one part of C that I have to relearn every time.

-- 
--- Larry McVoy  lm at mcvoy.com  http://www.mcvoy.com/lm
I had ordered a used copy of the 2nd edition of the book (it came from the
Cherry Hill Public Library!) before the plug for it on the list came out,
because it looked like it covered/credited MIT/Lincoln and DEC systems in
shaping interactive computing. I found the book a mile wide and a
millimeter deep, and while I've only randomly scanned it (mostly looking
up index references), the organization by time period shatters the stories
of each manufacturer into snippets, to the point that I'm hard pressed to
believe that most readers would be able to stitch them back together into
a coherent idea of any particular strain of computing history.
Nonetheless, I'd be happy to hear that it was assigned reading for CS
students.
On Sun, Nov 28, 2021 at 05:12:44PM -0800, Larry McVoy wrote:
> I remember Ken Witte (my TA for the PDP-11 class) trying to get me to see
> how easy it was to read the octal. If I remember correctly (and I probably
> don't, this was ~40 years ago), the instructions were divided into fields,
> so instruction, operand, operand and it was all regular, so you could see
> that this was some form of an add or whatever, it got the values from
> these registers and put it in that register.

I've looked it up and it is pretty much as Ken described. The weird thing
is that there is no need to do it like the PDP-11 did it; you could use
random numbers for each instruction, and lots of processors did pretty
much that. The PDP-11 didn't: it was very uniform, to the point that Ken's
ability to read octal made perfect sense. I was never that good, but with
a little google and reading I can see how he got there.

Charles Sauer contacted me off list and sent me this:

https://notes.technologists.com/notes/2008/01/10/a-brief-history-of-dell-unix/

Turns out that Ken was a big deal there. Not surprised at all.

--lm
Hi Bakul,

> As far as processor design is concerned, I believe one of the problems
> is that there are fewer and fewer people who can do both h/w and s/w
> design competently.

The ARM chip came about because the UK's Acorn Computers couldn't find a
decent cost:performance 16-bit CPU to replace the 6502 in new models.
Furber and Wilson realised from a visit to Bill Mensch's WDC that chip
design could be done by a small outfit. The recent UCB reports on RISC
suggested a simple design could still have high performance, so they set
about skipping 16-bit CPUs and rolling their own 32-bit RISC CPU, the
Acorn RISC Machine.

The point of that bit of history is they were not chip designers, but knew
electronics and programming. Wilson designed the ARM's instruction set and
it was a delight to code: very orthogonal, and every instruction had four
bits of condition-flag test, e.g. Carry Set, and a bit to indicate if this
instruction should set the condition flags. Thus several instructions in a
row could test the condition flags set by an instruction a few earlier and
unaltered since; this cut the need for quite a few branches.

I think Wilson did such a good job because she had coded extensively in
assembler on several different architectures. Not just the odd device
driver or context switch, but 16 KiB of 6502 instructions which were BBC
BASIC, a structured BASIC with WHILE, PROC, integers, floats, etc. (The
BASIC ROM remains a good test of any 6502 emulator today because of all
the corner cases the hand-written assembly exercised.) This was just for
the BASIC interpreter; the OS, file system, etc., were all in other 16 KiB
ROMs of hand-written assembly. The ability to add a co-processor to the
BBC Microcomputer meant Z80 and other implementations followed.

So with all that experience it's not surprising that the instruction set,
though RISC, gave just what the assembly writer wanted, nothing more,
whilst being easy to learn and remember.
I don't think a hardware-chip designer could have done such a good job.
The later pressure to drop 32-bit instructions for a mixture of 16-bit and
32-bit, due to mobile's small flash capacity, begat Thumb, still used
today as Thumb-2, which dropped many of the nice features that made
hand-written assembly such a pleasure; but by then compilers were better
quality and more common.

> This is why I think more programmers should roll up their sleeves and
> design a processor and understand the issues involved, especially now
> that programming FPGAs is becoming common. Maybe start with an
> existing RISC-V core in some HDL, and push and pull it into (what you
> think is) an ideal minimalist design.

I agree with the sentiment, but it sounds quite a big leap for most
programmers. I knew assembly and logic but not how to fill the gap to
create a CPU. The popular book from a few years ago which plugged the hole
for me was 'The Elements of Computing Systems', which I see has now had a
second edition: ISBN 0-262-53980-2, https://amzn.to/3xE0IZo

It's a book of two halves: the first builds a simple CPU for a primitive
computer from nothing but NAND gates; the second half writes an assembler,
then a compiler for a language targeting a VM, and then implements the VM
on the CPU built in part one. It's the reader who has to do all this:
initially in the book's HDL, run against the provided simulators and test
cases, then in the programming language of his choice, as just text I/O is
required.

So to start, the other binary-operator logic gates have to be built with
NANDs, then multiplexers, etc. There's a target of how many gates to use,
so an efficient design has to be produced. By chapter two we're
implementing the ALU. After the sequential logic of chapter three, the
instruction set is introduced in chapter four. It proceeds at pace, just
giving the barest tuition needed to understand each conceptual part. Any
discussion of subtleties or alternative approaches is eschewed for
weightier textbooks.

I think it's a better introduction to CPU design than most, and from that
one can read up on Verilog, etc., and start experimenting with little CPU
designs on FPGAs.

-- 
Cheers, Ralph.
> On Nov 30, 2021, at 11:27, Ralph Corderoy <ralph@inputplus.co.uk> wrote:
>
> ... The point of that bit of history is they were not chip designers,
> but knew electronics and programming. Wilson designed the ARM's
> instruction set and it was a delight to code: very orthogonal, and every
> instruction had four-bits of condition-flag test, e.g. Carry Set, and a
> bit to indicate if this instruction should set the condition flags.
> Thus several instructions in a row could test the condition flags set by
> an instruction a few earlier and unaltered since; this cut the need for
> quite a few branches. ...

I wrote a fair amount of PDP-11 assembler, back in the early 70's (about
10K LOC). I was particularly happy with a routine that moved a cell
between a pair of doubly-linked, circular linked lists (a "free" list and
a "busy" list). The routine only had to modify six pointers, which isn't a
hard problem. The cute part was that it was able to do so using (IIRC)
only eight or nine instructions. The PDP-11's auto-increment mode obviated
the need for separate index modification code.

I had previously written a fair amount of code for a Varian 620i, which
had an AQX architecture. The 620i wasn't _hard_ to program, but it _was_ a
bit tedious. The PDP-11, by comparison, was a programmer's delight.

Which brings me to a historical notion: The DG Nova
(https://en.wikipedia.org/wiki/Data_General_Nova) came out in 1969, just a
bit before the PDP-11 (https://en.wikipedia.org/wiki/PDP-11). My
impression, when I (later on) looked at the Nova ISA, was that they had
moved in the right direction from the AQX approach, but not quite far
enough. DEC, IMNSHO, got it right. (ducking).

-r
On 12/1/21 12:46 AM, Rich Morin wrote:
> The DG Nova (https://en.wikipedia.org/wiki/Data_General_Nova) came out
> in 1969, just a bit before the PDP-11
> (https://en.wikipedia.org/wiki/PDP-11). My impression, when I (later on)
> looked at the Nova ISA, was that they had moved in the right direction
> from the AQX approach, but not quite far enough. DEC, IMNSHO, got it
> right. (ducking).

https://history-computer.com/dec-pdp-11-computer/ is a concise summary of
where the PDP-11 architecture came from (Harold McFarland at CMU), of the
origins of the Nova, and of the connection to the DEC PDP-X. Original
source documents can be found at bitsavers.