The Unix Heritage Society mailing list
 help / color / mirror / Atom feed
* [TUHS] RIP John Backus
       [not found] <mailman.21.1521314548.3788.tuhs@minnie.tuhs.org>
@ 2018-03-17 20:14 ` Paul McJones
  2018-03-17 22:27   ` Steve Johnson
  0 siblings, 1 reply; 21+ messages in thread
From: Paul McJones @ 2018-03-17 20:14 UTC (permalink / raw)


On 3/17/2018 12:22 PM, Arthur Krewat <krewat at kilonet.net> wrote:
> Leave it to IBM to do something backwards.
>
> Of course, that was in 1954, so I can't complain, it was 11 years before
> I was born. But that's ... odd.
>
> Was subtraction easier than addition with digital electronics back then?
> I would think that they were both the same level of effort (clock
> cycles) so why do something obviously backwards logically?

Subtraction was done by taking the two's complement and adding. I
suspect the CPU architect (Gene Amdahl -- not exactly a dullard)
intended programmers to store array elements at increasing memory
addresses and to reference an array element relative to the address of
the last element plus one. This allowed a single index register (and
there were only three) to serve as both the index and the (decreasing)
count. See the example on page 97 of:

James A. Saxon
Programming the IBM 7090: A Self-Instructional Programmed Manual
Prentice-Hall, 1963
http://www.bitsavers.org/pdf/ibm/7090/books/Saxon_Programming_the_IBM_7090_1963.pdf

The Fortran compiler writers decided to reverse the layout of array 
elements so a Fortran subscript could be used directly in an index register.
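The two tricks described above -- subtraction by adding the complement, and a single register serving as both array index and countdown, with element addresses computed as base minus index -- can be sketched in a toy Python model. This is purely illustrative: the real 704/7090's 36-bit sign-magnitude arithmetic and instruction set differ, and none of the names below come from actual 704 code.

```python
# Toy model (hypothetical, not 704 assembly) of two ideas from the message above.
WORD = (1 << 36) - 1  # 36-bit word mask, for illustration only

def subtract_via_complement(a, b):
    """a - b computed as a + (complement of b + 1), modulo the word size."""
    return (a + ((~b + 1) & WORD)) & WORD

# Effective addresses on the 704 family subtracted the index register from
# the base address.  So if element j is stored at base - j, one register i
# can both walk the array (address = base - i) and count down to zero.
memory = {}
base = 1000
data = [3, 1, 4, 1, 5]
for j, v in enumerate(data):
    memory[base - j] = v           # element j lives at a *decreasing* address

total = 0
i = len(data) - 1                  # a single register: index AND loop count
while i >= 0:
    total += memory[base - i]      # indexed access subtracts the register
    i -= 1
assert total == sum(data)
```

The loop visits every element while the same register counts down, which is the economy Amdahl's design made possible with only three index registers.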
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://minnie.tuhs.org/pipermail/tuhs/attachments/20180317/4da38e01/attachment.html>


^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] RIP John Backus
  2018-03-17 20:14 ` [TUHS] RIP John Backus Paul McJones
@ 2018-03-17 22:27   ` Steve Johnson
  2018-03-22 21:05     ` [TUHS] long lived programs (was " Bakul Shah
  0 siblings, 1 reply; 21+ messages in thread
From: Steve Johnson @ 2018-03-17 22:27 UTC (permalink / raw)


[-- Warning: decoded text below may be mangled, UTF-8 assumed --]
[-- Attachment #1: Type: text/plain, Size: 1224 bytes --]


Let me offer a somewhat different perspective on FORTRAN.  When an
airplane is designed, the design undergoes a number of engineering
tests under simulation at the design stage.  Many countries require
that these simulation runs be archived for the lifetime of the
airplane so that, in the event of a crash, they can be run again with
the conditions experienced by the aircraft to see whether the problem
was in the design.  Airplanes commonly take 10 years from first
design to first shipment, are then sold for 10 years or so, and can
fly for up to 30 years after that.   So these tests need to be
written in a computer language that can still be run 50 years in the
future -- that is a stipulation of the archive requirement.  There
really aren't any alternative languages I'm aware of that could meet
this criterion -- that's particularly true today, when there is a sea
change from serial to parallel programming and it's hard to pick a
winner with five decades of life ahead of it...

Does anyone have any candidates?

Steve


-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://minnie.tuhs.org/pipermail/tuhs/attachments/20180317/632600a7/attachment.html>


^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] long lived programs (was Re:  RIP John Backus
  2018-03-17 22:27   ` Steve Johnson
@ 2018-03-22 21:05     ` Bakul Shah
  2018-03-22 21:35       ` Clem Cole
  2018-03-23 10:43       ` Tim Bradshaw
  0 siblings, 2 replies; 21+ messages in thread
From: Bakul Shah @ 2018-03-22 21:05 UTC (permalink / raw)


On Mar 17, 2018, at 3:27 PM, Steve Johnson <scj at yaccman.com> wrote:
> 
> Let me offer a somewhat different perspective on FORTRAN.  When an airplane is designed, the design undergoes a number of engineering tests under simulation at the design stage.  Many countries require that these simulation runs be archived for the lifetime of the airplane so that, in the event of a crash, they can be run again with the conditions experienced by the aircraft to see whether the problem was in the design.  Airplanes commonly take 10 years from first design to first shipment.  And then are sold for 10 years or so.  And the planes can fly for up to 30 years after that.   So these tests need to be written in a computer language that can be run 50 years in the future -- that is a stipulation of the archive requirement.  There really aren't any alternative languages that I'm aware of that could meet this criterion -- that's particularly true today, when there is a sea change from serial to parallel programming and it's hard to pick a winner with five decades of life ahead of it...
> 
> Does anyone have any candidates?

I was thinking about a similar issue after reading Bradshaw's
message about FORTRAN programs being critical to his country's
security. What happens in 50-100 years, when such programs have
been in use for a long time but none of the original authors
are alive? The world may have moved on to newer languages,
and there may be very few people who study "ancient" computer
languages; even they won't have the in-depth experience to
understand the nuances & traps of these languages well enough.
There is no guarantee that FORTRAN will be in much use then!
Will it be like science fiction, where ancient spaceships
continue working but no one knows what to do when they break?

Even on a much, much smaller time scale I have seen million+
line code bases with the original programmers long gone. Newer
programmers understand enough to add new layers of code or fix
some bugs, but not enough to fix any deep problems. Such
programs are difficult to understand *today* due to their poor
structure, but they serve a useful purpose and so continue to
be used.

We move archived data to newer media, and maybe make multiple
copies, so as to be able to continue accessing it, but when it
comes to "moving" critical programs to newer programming
languages, we are stuck. I think "y2k" was just a small taste
of this. We don't have enough tests to fully characterize
these programs, nor clear specifications, etc. You can
reimplement a small program (up to 5K lines of C/C++ or so)
with some effort by analyzing its IO behavior, but this gets
exponentially harder for larger programs.
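The "analyzing its IO behavior" approach can be sketched as a characterization (golden-master) test: record the old program's input/output pairs, then hold a rewrite to them. The sketch below is a toy; `legacy_program` and `reimplementation` are hypothetical stand-ins, not any real code base.

```python
# Hypothetical sketch of characterization (golden-master) testing.

def legacy_program(x):
    # Stand-in for the old code whose specification is lost.
    return (x * 37) % 100

# Step 1: capture the observed behavior of the legacy version over
# whatever inputs we can exercise.
golden = {x: legacy_program(x) for x in range(100)}

def reimplementation(x):
    # Candidate rewrite in a newer language or style.
    return (37 * x) % 100

# Step 2: the recorded pairs become the de facto specification.
mismatches = [x for x, y in golden.items() if reimplementation(x) != y]
assert not mismatches
```

The catch, as noted above, is coverage: for a small program the input space can be exercised fairly thoroughly, but for a large one the set of behaviors worth recording grows far faster than anyone can enumerate.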

The only thing I can think of is to have programs that
translate programs in today's languages into a common but very
simple universal language for "long term storage". Maybe
something like Lamport's TLA+? A very tough job.

We may be incurring a lot of "technical debt" that future
generations will have to pay!




^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] long lived programs (was Re: RIP John Backus
  2018-03-22 21:05     ` [TUHS] long lived programs (was " Bakul Shah
@ 2018-03-22 21:35       ` Clem Cole
  2018-03-23 19:28         ` Bakul Shah
  2018-03-23 10:43       ` Tim Bradshaw
  1 sibling, 1 reply; 21+ messages in thread
From: Clem Cole @ 2018-03-22 21:35 UTC (permalink / raw)



On Thu, Mar 22, 2018 at 5:05 PM, Bakul Shah <bakul at bitblocks.com> wrote:

>
> I was thinking about a similar issue after reading Bradshaw's
> message about FORTRAN programs being critical to his country's
> security. What happens in 50-100 years when such programs have
> been in use for a long time but none of the original authors
> may be alive?
>
[...]


>
> We may be incurring a lot of "technical debt" that future
> generations may have to pay!
>

Maybe, and maybe not. I have worried about this a bit myself.  But the
difference, I think, is that while computer scientists may not be using
Fortran, it is still pretty heavily used in the 'hard sciences' - certainly
physics and chemistry - because, as was already pointed out, it is still
the best language for doing complex mathematics.   Moreover, the language
(like UNIX) has changed.  If you look at modern Fortran code, it looks more
like Algol than FORTRAN-IV.

Every scientific code that I know of (with the exception of SPICE) that
got recoded into C or C++ *has slowed down or gotten more complex.*
Even in SPICE3's case, TQ spent hours staring at Ellis's code in SPICE2.
Tom was himself a 'bad ass' Fortran programmer, so he knew the tricks
Ellis was using.   And, like other codes I know, it took a while before TQ
got SPICE3 to parity, much less made it better.   In the end, what made
SPICE3 replace its older brother was a new feature (easy addition of new
transistor models), but I think TQ would be one of the first to tell you
it was very hard to beat Ellis's FORTRAN.

The reality is that Fortran is a great tool for what it was designed to
do.   Most of us on this list don't do that work, so we don't value it.
And if the nasty math you have to do is partial differentials, complex
numbers, etc., Python, C, and C++ are fine for very small things.   But if
you have a production code that is going to run for hours, if not days, on
a supercomputer, chances are it is Fortran.

For instance, the last time I looked at WRF, about a year ago -- it is the
premier weather code (you hear about it every night on the news in the
different 'models' the weather forecasters talk about) -- it was a
FORTRAN-90 code.  We were looking at how to speed up the messaging stack
under the covers and were interested in how it stressed the networking
stack.  It has begin/end style looping, and it's very modular.   The
complex and unreadable part is MPI and the stuff to make it run in
parallel.  It's not the Fortran-ness.   I suspect anyone on this list who,
like me, reads C could look at that code and understand it pretty quickly.

The question (and a good one) is: if you are not 'a Fortran person,' are
you going to be able to understand it well enough not to do damage to it
if you modify it?  Which is of course the crux of the question Bakul is
asking.

I suspect it is not as bad as the science fiction movies profess, because
the codes have matured over time, which is not what happened with Y2K.
Are there 'dusty decks'?  Sure -- but they are not quite as dusty as one
might think, and as importantly, the codes we care about are, and have
been, in continuous use for years.  So there has been a new generation of
programmers that took over the maintenance of them.



^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] long lived programs (was Re:  RIP John Backus
  2018-03-22 21:05     ` [TUHS] long lived programs (was " Bakul Shah
  2018-03-22 21:35       ` Clem Cole
@ 2018-03-23 10:43       ` Tim Bradshaw
       [not found]         ` <CAC20D2P1CZwaD0uJpMhWg11muDeH9rEn3X+AUjXvwMKsNjs7ng@mail.gmail.com>
  1 sibling, 1 reply; 21+ messages in thread
From: Tim Bradshaw @ 2018-03-23 10:43 UTC (permalink / raw)


On 22 Mar 2018, at 21:05, Bakul Shah <bakul at bitblocks.com> wrote:
> 
> I was thinking about a similar issue after reading Bradshaw's
> message about FORTRAN programs being critical to his country's
> security. What happens in 50-100 years when such programs have
> been in use for a long time but none of the original authors
> may be alive? The world may have moved on to newer languages
> and there may be very few people who study "ancient" computer
> languages and even they won't have in-depth experience to
> understand the nuances & traps of these languages well enough.
> No guarantee that FORTRAN will be in much use then! Will it be
> like in science fiction where ancient spaceships continue
> working but no one knows what to do when they break?

My experience of large systems like this is that this isn't how they work at all.  The program I deal with (which is around 5 million lines naively (counting a lot of stuff which probably is not source but is in the source tree)) is looked after by probably several hundred people.  It's been through several major changes in its essential guts and in the next ten years or so it will be entirely replaced by a new version of itself to deal with scaling problems inherent in the current implementation.  We get a new machine every few years onto which it needs to be ported, and those machines have not all been just faster versions of the previous one, and will probably never be so.

What it doesn't do is to just sit there as some sacred artifact which no-one understands, and it's unlikely ever to do so.  The requirements for it to become like that would be at least that the technology of large-scale computers was entirely stable, compilers, libraries and operating systems had become entirely stable and people had stopped caring about making it do what it does better.  None of those things seems very likely to me.

(Just to be clear: this thing isn't simulating bombs: it's forecasting the weather.)

--tim



^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] long lived programs (was Re: RIP John Backus
  2018-03-22 21:35       ` Clem Cole
@ 2018-03-23 19:28         ` Bakul Shah
  2018-03-23 19:44           ` Larry McVoy
                             ` (2 more replies)
  0 siblings, 3 replies; 21+ messages in thread
From: Bakul Shah @ 2018-03-23 19:28 UTC (permalink / raw)



On Mar 22, 2018, at 2:35 PM, Clem Cole <clemc at ccc.com> wrote:
> 
> The question (and a good one) is if you are not 'a Fortran person,' are you going to be able to understand it well enough to not do damage to it, if you modify it?  Which is of course the crux the question Bakul is asking.

This is indeed the case, but I am asking not just about
Fortran.  Will we still be programming in C/C++/Java/Fortran
in 50-100 years?  That is, are we going through the
Cambrian Explosion of programming languages now, and will it
settle down in a few decades, or have we just started?

> I suspect, it is not as bad the science fiction movies profess.   Because the codes have matured over time, which is not what happened with Y2K.   Are their 'dusty decks' sure - but they are not quite so dusty as it might think, and as importantly, the code we care about are and have been in continuous use for years.  So there has been a new generation of programmers that took of the maintenance of them.

Perhaps a more important question is what percentage of programs
are important enough to still be around in 50-100 years.

On Mar 23, 2018, at 3:43 AM, Tim Bradshaw <tfb at tfeb.org> wrote:
> 
> My experience of large systems like this is that this isn't how they work at all.  The program I deal with (which is around 5 million lines naively (counting a lot of stuff which probably is not source but is in the source tree)) is looked after by probably several hundred people.  It's been through several major changes in its essential guts and in the next ten years or so it will be entirely replaced by a new version of itself to deal with scaling problems inherent in the current implementation.  We get a new machine every few years onto which it needs to be ported, and those machines have not all been just faster versions of the previous one, and will probably never be so.

By now most major systems have been computerized: banks,
government, finance, communication, shipping, various
industries, research, publishing, medicine, etc. Will the
critical systems in each area have as many resources, as &
when needed, as the weather forecasting system Tim is talking
about? [Of course, the same question can be asked about the
conversion I am wondering about!]

On Mar 23, 2018, at 8:51 AM, Ron Natalie <ron at ronnatalie.com> wrote:
> 
> A core package in a lot of the geospatial applications is an old piece of mathematical code originally written in Fortran (probably in the sixties).   Someone, probably in the 80’s, recoded the thing pretty much line for line (maintaining the horrendous F66 variable names etc…) into C.     It’s probably ripe for a jump to something else now.
>  
> We’ve been through four major generations of the software.    The original was all VAX based with specialized hardware (don’t know what it was written in).    We followed that with a portable UNIX version (mostly Suns, but ours worked on SGI, Ardent, Stellar, various IBM AIX platforms, Apollo DN1000’s, HP, DEC Alphas).   This was primarily a C application.    Then right about the year 2000, we jumped to C++ on Windows.    Subsequently it got back-ported to Linux.     Yes, there are some modules that have been unchanged for decades, but the system on the whole has been maintained.

I wonder if we will continue doing this sort of ad hoc
but expensive rewriting for a long time....

[This may be somewhat relevant to TUHS, from a future
 historian's perspective :-)]



^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] long lived programs (was Re: RIP John Backus
  2018-03-23 19:28         ` Bakul Shah
@ 2018-03-23 19:44           ` Larry McVoy
  2018-03-23 21:23           ` Clem Cole
  2018-03-26 13:43           ` Tim Bradshaw
  2 siblings, 0 replies; 21+ messages in thread
From: Larry McVoy @ 2018-03-23 19:44 UTC (permalink / raw)


I'm really not worried about it.  When I got into kernel programming
I had no idea what I was doing.  I just kept at it, did small stuff,
the bigger picture slowly came into focus.  After a couple of years there
wasn't much that I wouldn't take on.  I stayed away from drivers because I
had figured out that Sun had their stuff but it was going to be different
than SGI's stuff and both different from PC stuff (even the PC stuff was
different, I would have been working on ISA bus devices and that's long
gone so far as I know).  But file systems, networking, VM, processes,
signals, all of that stuff was pretty easy after you got to know the code.

Every year someone takes some young hotshot and points them at some
"impossible" thing and one of them makes it work.  I don't see that
changing.


-- 
---
Larry McVoy            	     lm at mcvoy.com             http://www.mcvoy.com/lm 


^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] long lived programs (was Re: RIP John Backus
  2018-03-23 19:28         ` Bakul Shah
  2018-03-23 19:44           ` Larry McVoy
@ 2018-03-23 21:23           ` Clem Cole
  2018-03-23 21:36             ` Warner Losh
  2018-03-26 13:43           ` Tim Bradshaw
  2 siblings, 1 reply; 21+ messages in thread
From: Clem Cole @ 2018-03-23 21:23 UTC (permalink / raw)



On Fri, Mar 23, 2018 at 3:28 PM, Bakul Shah <bakul at bitblocks.com> wrote:

>
> By now most major systems has been computerized. Banks,
> govt, finance, communication, shipping, various industries,
> research, publishing, medicine etc. Will the critical
> systems within each area have as many resources as & when
> needed as weather forecasting system Tim is talking about?
> [Of course, the same question can be asked in relation to
> the conversion I am wondering about!]


I suspect we agree more than we disagree.

I offer the following observation.    Except for high-end HPC --
particularly DoD, DoE, and big-science types of applications -- there has
been a 'Christensen'-style disruption, where a 'worse' technology was
created and loved by a new group of users, and that new technology
eventually got better and replaced (disrupted) the earlier one
(banks/finance were COBOL - now it's Oracle and the like, SAP et al.;
communications was SS7 over custom HW, now it's IP running on all sorts of
stuff).  The key is that the disruptor was on an economic curve that made
it successful.

But high-end HPC is the same people doing the same things they did before
-- the difference is that the data sets are larger, the precision
requirements higher, the data representations different (e.g., graphics).
Again, the math has not changed.  And I don't see a new customer for that
style of application, which is what is needed to bankroll the (originally
less 'good') replacement technology.  The economics just aren't there to
replace it - at least so far.

The idea of the 'underserved' or 'missing middle' market for HPC has been
discussed for a while.  I used to believe it.  I'm not so sure now.

Which brings this back to UNIX.  Linux is the UNIX disruptor - which is
great.  Linux keeps 'UNIX' alive and getting better.  I don't see an
economic reason to replace it, but who knows.  Maybe that's what the good
folks at Google, Amazon, Intel, or some university are doing.  But so far,
the economics is not there.

Clem


^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] long lived programs (was Re: RIP John Backus
  2018-03-23 21:23           ` Clem Cole
@ 2018-03-23 21:36             ` Warner Losh
  2018-03-23 22:02               ` Steve Johnson
  0 siblings, 1 reply; 21+ messages in thread
From: Warner Losh @ 2018-03-23 21:36 UTC (permalink / raw)



On Fri, Mar 23, 2018 at 3:23 PM, Clem Cole <clemc at ccc.com> wrote:
>
> Which bringing this back to UNIX.  Linux is the UNIX disruptor - which is
> great.  Linux keeps 'UNIX' alive and getting better.  I don't see an
> economic reason to replace it, but who knows.  Maybe that's what the new
> good folks at Goggle, Amazon, Intel or some University is doing.  But so
> far, the economics is not there.
>

Speaking of how ancient code works... There's still AT&T code dating to v5
or older in *BSD.... It's been updated, improved upon, parts replaced, etc.
But there's still some bits dating all the way back to those early times.
Having competition from Linux is great and keeps the BSDs honest...

Warner



^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] long lived programs (was Re: RIP John Backus
  2018-03-23 21:36             ` Warner Losh
@ 2018-03-23 22:02               ` Steve Johnson
  0 siblings, 0 replies; 21+ messages in thread
From: Steve Johnson @ 2018-03-23 22:02 UTC (permalink / raw)




Reminds me a bit of the old saying: "This is George Washington's
Axe.  Of course, it's had six new handles and four new heads since he
owned it..."

Steve

----- Original Message -----
From: "Warner Losh" <imp at bsdimp.com>
To: "Clem Cole" <clemc at ccc.com>
Cc: "TUHS main list" <tuhs at minnie.tuhs.org>
Sent: Fri, 23 Mar 2018 15:36:02 -0600
Subject: Re: [TUHS] long lived programs (was Re: RIP John Backus

On Fri, Mar 23, 2018 at 3:23 PM, Clem Cole <clemc at ccc.com> wrote:

> Which brings this back to UNIX.  Linux is the UNIX disruptor - which is
> great.  Linux keeps 'UNIX' alive and getting better.  I don't see an
> economic reason to replace it, but who knows.  Maybe that's what the
> good folks at Google, Amazon, Intel or some University is doing.  But
> so far, the economics is not there.

Speaking of how ancient code works... There's still AT&T code dating
to v5 or older in *BSD.... It's been updated, improved upon, parts
replaced, etc. But there's still some bits dating all the way back to
those early times. Having competition from Linux is great and keeps
the BSDs honest...

Warner



^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] Fwd:  long lived programs (was Re: RIP John Backus
       [not found]         ` <CAC20D2P1CZwaD0uJpMhWg11muDeH9rEn3X+AUjXvwMKsNjs7ng@mail.gmail.com>
@ 2018-03-26  0:53           ` Clem Cole
  0 siblings, 0 replies; 21+ messages in thread
From: Clem Cole @ 2018-03-26  0:53 UTC (permalink / raw)



[try-II]




On Fri, Mar 23, 2018 at 6:43 AM, Tim Bradshaw <tfb at tfeb.org> wrote:

> On 22 Mar 2018, at 21:05, Bakul Shah <bakul at bitblocks.com> wrote:
>
>
> I was thinking about a similar issue after reading Bradshaw's
> message about FORTRAN programs being critical to his country's
> security. What happens in 50-100 years when such programs have
> been in use for a long time but none of the original authors
> may be alive? The world may have moved on to newer languages
> and there may be very few people who study "ancient" computer
> languages and even they won't have in-depth experience to
> understand the nuances & traps of these languages well enough.
> No guarantee that FORTRAN will be in much use then! Will it be
> like in science fiction where ancient spaceships continue
> working but no one knows what to do when they break?
>
>
> My experience of large systems like this is that this isn't how they work
> at all.  The program I deal with (which is around 5 million lines naively
> (counting a lot of stuff which probably is not source but is in the source
> tree)) is looked after by probably several hundred people.  It's been
> through several major changes in its essential guts and in the next ten
> years or so it will be entirely replaced by a new version of itself to deal
> with scaling problems inherent in the current implementation.  We get a new
> machine every few years onto which it needs to be ported, and those
> machines have not all been just faster versions of the previous one, and
> will probably never be so.
>
> What it doesn't do is to just sit there as some sacred artifact which
> no-one understands, and it's unlikely ever to do so.  The requirements for
> it to become like that would be at least that the technology of large-scale
> computers was entirely stable, compilers, libraries and operating systems
> had become entirely stable and people had stopped caring about making it do
> what it does better.  None of those things seems very likely to me.
>
> (Just to be clear: this thing isn't simulating bombs: it's forecasting the
> weather.)
>

​+1 - exactly
​my ​
point.​

We have drifted a bit from pure UNIX, but I actually do think this is
relevant to UNIX history.   Once UNIX started to run on systems targeting
HPC loads where Fortran was the dominate programming language, UNIX quickly
displaced custom OSs and became the dominant target even if at the
beginning of that transition
​as ​
the 'flavor' of UNIX did vary (we probably can and should discuss how that
happened and why independently
​-- although
 I will point out the UNIX/Linux implementation running at say LLNL != the
version running at say Nasa Moffitt).    And the truth is today, for small
experiments you probably run Fortran on Windows on your desktop.   But for
'production'  - the primary OS for Fortran is a UNIX flavor of some type
and has been that way since the mid-1980s - really starting with the UNIX
wars of that time.

As I also have said here and elsewhere, while HPC and very much its
lubricant, Fortran, are not something 'academic CS types' like to study
these days
​ - even though​
 Fortran (HPC) pays my
​ and many of our​
salar
​ies​
.    Yet it runs on the system the those same academic types all prefer -
*i.e.* Ken and Dennis' ideas.    The primary difference is the type of
program the users are running.   But Ken and Dennis ideas work well for
almost all users and spans
​specific ​
application market
​s.​

Here is a picture I did a few years ago for a number of Intel execs.  At
the time I was trying to explain to them that HPC is not a single style of
application, and also to help them understand that there are two types of
value - the code itself and the data.  Some markets (*e.g.* financial) use
public data, but the methods they use to crunch it (*i.e.* the codes) are
private, while other market segments might have private data (*e.g.* oil
and gas) but different customers use the same or similar codes to crunch
it.  For this discussion, think about how much of the code I show below is
complex arithmetic - much of it is searching, Google style, but a lot is
just plain nasty math.   The 'nasty math' has not changed over the years,
and thus those codes are dominated by Fortran.
[Note Steve has pointed out that with AI maybe the math could change in
the future - but certainly so far, the history of these markets is
basically differential equation solvers.]


As Tim says, I really cannot see that changing, and the reason (I believe)
is that I do not see any compelling economic reason to do so.

Clem



-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://minnie.tuhs.org/pipermail/tuhs/attachments/20180325/c411356f/attachment-0001.html>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: HPC_CloudBubble_nomarks20180323.png
Type: image/png
Size: 444678 bytes
Desc: not available
URL: <http://minnie.tuhs.org/pipermail/tuhs/attachments/20180325/c411356f/attachment-0001.png>


^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] long lived programs (was Re: RIP John Backus
  2018-03-23 19:28         ` Bakul Shah
  2018-03-23 19:44           ` Larry McVoy
  2018-03-23 21:23           ` Clem Cole
@ 2018-03-26 13:43           ` Tim Bradshaw
  2018-03-26 16:19             ` Paul Winalski
                               ` (2 more replies)
  2 siblings, 3 replies; 21+ messages in thread
From: Tim Bradshaw @ 2018-03-26 13:43 UTC (permalink / raw)


On 23 Mar 2018, at 19:28, Bakul Shah <bakul at bitblocks.com> wrote:
> 
> By now most major systems have been computerized. Banks,
> govt, finance, communication, shipping, various industries,
> research, publishing, medicine etc. Will the critical
> systems within each area have as many resources as & when
> needed as weather forecasting system Tim is talking about?

I think that this is indeed a problem: it just isn't a problem for the kind of huge numerical simulation that gave rise to this thread.  In general programs where

- you continually are looking for more performance,
- you are continually updating what the program can do (adding better cloud models, say),

are pretty safe.  But programs which get deployed *and then just work* are liable to rot.  So, for instance, a retail bank writes or buys a system to deal with mortgages: once this thing works, the chances are it will keep working for a very long time, because the number of mortgages might double in ten years or something, but it won't go up by enormous factors, and mortgages (at the retail bank end, not at the mortgage-backs end) are kind of boring.

Retail banks are risk-averse so they like to avoid the risks associated with porting the thing to new platforms.  And since there's no development most of the developers leave.

And then ten or twenty years later you have this arcane thing which no-one understands any more running on a platform which is falling off the end of support.

And if that was the only problem everything would be fine.  In fact, several times during the life of this thing, the bank wanted to offer some new kind of mortgage.  But no-one understood the existing mortgage platform any more, *so they deployed a new one alongside it*.  So in fact you have *four* of these platforms, all of them critical, and none of them understood by anyone.

To bring this back to Unix history, I think this is an example of a place where Unix has failed (or, perhaps, where people have failed to make use of it properly).  Half the reason these things are such a trouble is that they aren't written to any really portable API, so the bit that runs on Solaris isn't going to run on AIX or Linux, and it only might run on the current version of Solaris in fact.

I don't know what the solution to this problem is: I think it is essentially teleological to assume that there *is* a solution.

--tim
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://minnie.tuhs.org/pipermail/tuhs/attachments/20180326/989dbb4e/attachment.html>


^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] long lived programs (was Re: RIP John Backus
  2018-03-26 13:43           ` Tim Bradshaw
@ 2018-03-26 16:19             ` Paul Winalski
  2018-03-26 16:41               ` Arthur Krewat
  2018-03-27  1:08               ` Clem Cole
  2018-03-26 19:04             ` Bakul Shah
  2018-03-27  1:21             ` Steve Johnson
  2 siblings, 2 replies; 21+ messages in thread
From: Paul Winalski @ 2018-03-26 16:19 UTC (permalink / raw)


On 3/26/18, Tim Bradshaw <tfb at tfeb.org> wrote:
> On 23 Mar 2018, at 19:28, Bakul Shah <bakul at bitblocks.com> wrote:
> Retail banks are risk-averse so they like to avoid the risks associated with
> porting the thing to new platforms.  And since there's no development most
> of the developers leave.
>
> And then ten or twenty years later you have this arcane thing which no-one
> understands any more running on a platform which is falling off the end of
> support.
>
After grad school (1978) one of my first job interviews was for a job
as system manager for an insurance company.  Their data center took
the "don't risk porting software" to an extreme.  As new technology
came out they bought it, but only to run new applications.  Existing
applications were never ported and continued to run on their existing
hardware.  Their machine room looked like a computer museum.  They had
two IBM 1400s (one in use; one was cannibalized for parts to keep the
active machine going), two System/360 model 50s, with a drum and a
2321 data cell drive.  Their only modern hardware was a System/370
model 158.

Operating systems seem to have taken one of two policies regarding
legacy programs.  IBM's OS and DEC's VMS emphasized backwards
compatibility--new features mustn't break existing applications.  VMS
software developers called the philosophy of strict backward
compatibility "The Promise" and took it very seriously.  Unix, on the
other hand, has always struck me as being less concerned with backward
compatibility and more about innovation and experimentation.  I think
the assumption with Unix is that you have the sources for your
programs, so you can recompile or modify them to keep up with
incompatible changes.  This is fine for research and HPTC
environments, but it doesn't fly in corporate data centers, where even
a simple recompile means that the new version of the application has
to undergo expensive qualification testing before it can be put into
production.  Which philosophy regarding backwards compatibility is
better?  It depends on your target audience.

-Paul W.


^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] long lived programs (was Re: RIP John Backus
  2018-03-26 16:19             ` Paul Winalski
@ 2018-03-26 16:41               ` Arthur Krewat
  2018-03-26 19:53                 ` Larry McVoy
  2018-03-27  1:08               ` Clem Cole
  1 sibling, 1 reply; 21+ messages in thread
From: Arthur Krewat @ 2018-03-26 16:41 UTC (permalink / raw)



On 3/26/2018 12:19 PM, Paul Winalski wrote:
>   Unix, on the
> other hand, has always struck me as being less concerned with backward
> compatibility and more about innovation and experimentation.
For Sun, it was quite the contrary.

It was normal to run binaries from SunOS on Solaris. For the longest 
time, the "xv" binary I used on SPARC hardware was compiled on SunOS. 
It's even an X-windows application, and the libraries work.

Even in Solaris 10, it still runs:

-bash-3.00$ ./xv.sparc
ld.so.1: xv.sparc: warning: /usr/4lib/libX11.so.4.3: has older revision 
than expected 10
ld.so.1: xv.sparc: warning: /usr/4lib/libc.so.1.9: has older revision 
than expected 160
-bash-3.00$ file xv.sparc
xv.sparc:       Sun demand paged SPARC executable dynamically linked
-bash-3.00$ uname -a
SunOS redacted 5.10 Generic_120011-11 sun4u sparc SUNW,SPARC-Enterprise


This has been deprecated as of Solaris 11, supposedly. Backwards 
compatibility for Solaris binaries is also a "thing".

art k.



^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] long lived programs (was Re: RIP John Backus
  2018-03-26 13:43           ` Tim Bradshaw
  2018-03-26 16:19             ` Paul Winalski
@ 2018-03-26 19:04             ` Bakul Shah
  2018-03-27  1:21             ` Steve Johnson
  2 siblings, 0 replies; 21+ messages in thread
From: Bakul Shah @ 2018-03-26 19:04 UTC (permalink / raw)




> On Mar 26, 2018, at 6:43 AM, Tim Bradshaw <tfb at tfeb.org> wrote:
> 
> On 23 Mar 2018, at 19:28, Bakul Shah <bakul at bitblocks.com> wrote:
>> 
>> By now most major systems have been computerized. Banks,
>> govt, finance, communication, shipping, various industries,
>> research, publishing, medicine etc. Will the critical
>> systems within each area have as many resources as & when
>> needed as weather forecasting system Tim is talking about?
> 
> I think that this is indeed a problem: it just isn't a problem for the kind of huge numerical simulation that gave rise to this thread.  In general programs where

I started thinking about Fortran programs but soon expanded to
the more general problem.
> 
> - you continually are looking for more performance,
> - you are continually updating what the program can do (adding better cloud models, say),
> 
> are pretty safe.  But programs which get deployed *and then just work* are liable to rot.  So, for

Even here attention can flag over time.

> And if that was the only problem everything would be fine.  In fact, several times during the life of this thing, the bank wanted to offer some new kind of mortgage.  But no-one understood the existing mortgage platform any more, *so they deployed a new one alongside it*.  So in fact you have *four* of these platforms, all of them critical, and none of them understood by anyone.

This is the modification problem I was talking about. Running an
unchanged binary on an emulated processor is much easier.

> 
> To bring this back to Unix history, I think this is an example of a place where Unix has failed (or, perhaps, where people have failed to make use of it properly).  Half the reason these things are such a trouble is that they aren't written to any really portable API, so the bit that runs on Solaris isn't going to run on AIX or Linux, and it only might run on the current version of Solaris in fact.

1) This is when the OS doesn't live as long as the application.
2) The rate of change in open source technologies is far too high.
   Open source is often open loop. Hundreds of bugs remain unfixed
   while a new feature will be added.

> I don't know what the solution to this problem is: I think it is essentially teleological to assume that there *is* a solution.

It is an interesting problem even if there is no clear solution.
But now I think maybe it doesn't matter in the long run.  We
will let our new AI lords worry about it :-)


^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] long lived programs (was Re: RIP John Backus
  2018-03-26 16:41               ` Arthur Krewat
@ 2018-03-26 19:53                 ` Larry McVoy
  0 siblings, 0 replies; 21+ messages in thread
From: Larry McVoy @ 2018-03-26 19:53 UTC (permalink / raw)


On Mon, Mar 26, 2018 at 12:41:15PM -0400, Arthur Krewat wrote:
> On 3/26/2018 12:19 PM, Paul Winalski wrote:
> >  Unix, on the
> >other hand, has always struck me as being less concerned with backward
> >compatibility and more about innovation and experimentation.
> For Sun, it was quite the contrary.
> 
> It was normal to run binaries from SunOS on Solaris. For the longest time,
> the "xv" binary I used on SPARC hardware was compiled on SunOS. It's even an
> X-windows application, and the libraries work.

Yeah, Sun was very good about that.  You got smacked if you broke compat.


^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] long lived programs (was Re: RIP John Backus
  2018-03-26 16:19             ` Paul Winalski
  2018-03-26 16:41               ` Arthur Krewat
@ 2018-03-27  1:08               ` Clem Cole
  1 sibling, 0 replies; 21+ messages in thread
From: Clem Cole @ 2018-03-27  1:08 UTC (permalink / raw)



On Mon, Mar 26, 2018 at 12:19 PM, Paul Winalski <paul.winalski at gmail.com>
wrote:

> ... VMS
> software developers called the philosophy of strict backward
> compatibility "The Promise" and took it very seriously.  Unix, on the
> other hand, has always struck me as being less concerned with backward
> compatibility and more about innovation and experimentation.  I think
> the assumption with Unix is that you have the sources for your
> programs, so you can recompile or modify them to keep up with
> incompatible changes.


Paul, be careful here.   Yes, BSD did that.  I'll never forget hearing Joy
once say he thought it was a good idea to make people recompile because
their code stayed fresh.

But as UNIX moved from universities to commercial firms, binaries became
really important.
DEC, Masscomp (being ex-DECies) and eventually Sun took that same promise
with them.

Linux has been mixed.  The problem is that UNIX is more than just the
kernel itself.  As the SPEC1170 work revealed in the late 1980s/early
1990s, there were 1170 different interfaces that ISVs had to think about,
and agreement between vendors - much less between releases within a
vendor - was difficult.

And here is an interesting observation ...

The ideas behind UNIX were (more or less) HW independent.   Just think how
hard it was for DEC, Masscomp or Sun to keep VMS or Ultrix/Tru64, RTU,
SunOS/Solaris binary compatible.   It's part of why the whole UNIX
standards war of API *vs.* ABI raged.   It was and is a control problem.

Linux (sort of) solves it by keeping the kernel as their definition.   But
that only partly works.   kernel.org streams out new kernels way faster
than the distros, and the distros cannot agree on the different placements
for things.   Then you get into things like messaging stacks.
It's why Intel had to develop 'Cluster Ready.'   I will not say who, to
protect the guilty, but one very well known HPC ISV had a test matrix of
144 different Linux combinations before they could ship....

Just to give you an idea: if they developed on, say, IBM/Lenovo gear under
RHEL version mumble, but tried to release a binary on the same RHEL on,
say, HP gear, it would not work, because IBM had used QLogic IB and HP
Mellanox, say.   Or worse, they both had used the same vendor's gear but
different releases of the IB stack (it gets worse and worse).

The issue is that each vendor wants (needs) to have some level of control.


Clem
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://minnie.tuhs.org/pipermail/tuhs/attachments/20180326/20c7d5d0/attachment.html>


^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] long lived programs (was Re: RIP John Backus
  2018-03-26 13:43           ` Tim Bradshaw
  2018-03-26 16:19             ` Paul Winalski
  2018-03-26 19:04             ` Bakul Shah
@ 2018-03-27  1:21             ` Steve Johnson
  2018-03-27  1:46               ` Clem Cole
  2 siblings, 1 reply; 21+ messages in thread
From: Steve Johnson @ 2018-03-27  1:21 UTC (permalink / raw)



Portability and standard API are very good for users, but
manufacturers hate them.  Unix portability was supported by AT&T in
part because they were getting flak from non-DEC manufacturers about
Unix running only on the PDP-11.   Once Unix was ported,
applications could run on hardware that was cheapest and most
appropriate to the application rather than where it was first
written.  Users, much more than computer companies, benefited from
portability.  And DARPA, with a similarly broad view, for the most
part did things that helped users rather than specific manufacturers.

In fact, the first thing most manufacturers did with Unix was to
change it.  The rot set in quickly, leading to long boring chaotic
standards efforts over POSIX and C (remember OSF?).

On the other hand, manufacturers love open source.   There are no
apparent limits on growth, and few guiding hands to prevent silly, or
downright dangerous features from creeping into the endlessly bloating
code bases.  Each company can get their own version at low cost and
keep their customers happy, serene in the knowledge that if the
customers try to use another system they will have to deal with the
100+ pages of incompatible GCC options and have to tame piles of
poorly conceived and documented code.  None of the stakeholders in
open source have anything to gain by being compatible, or even letting
people know when they change something incompatibly without warning. 
After all, it can only hurt the other stakeholders, not them...

Yes, I'm old and cynical, and yes there are some islands of sanity
fighting the general trend.   And yes, I think this is a cyclical
problem that will swing back towards sanity, hopefully soon.  But
where is the AT&T or DARPA with enough smarts and resources to do
things simply, and motivation to make users happy rather than
increasing profits?

Steve

----- Original Message -----
From: "Tim Bradshaw" <tfb@tfeb.org>. . .

To bring this back to Unix history, I think this is an example of a
place where Unix has failed (or, perhaps, where people have failed to
make use of it properly).  Half the reason these things are such a
trouble is that they aren't written to any really portable API, so the
bit that runs on Solaris isn't going to run on AIX or Linux, and it
only might run on the current version of Solaris in fact.


-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://minnie.tuhs.org/pipermail/tuhs/attachments/20180326/596b740f/attachment.html>


^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] long lived programs (was Re: RIP John Backus
  2018-03-27  1:21             ` Steve Johnson
@ 2018-03-27  1:46               ` Clem Cole
  0 siblings, 0 replies; 21+ messages in thread
From: Clem Cole @ 2018-03-27  1:46 UTC (permalink / raw)



On Mon, Mar 26, 2018 at 9:21 PM, Steve Johnson <scj at yaccman.com> wrote:

> On the other hand, manufacturers love open source.
>
HW manufacturers love FOSS because it got them out of yet another thing
that cost them money.   We saw the investment in compilers going away; now
we see the same in the OS.   But many ISVs are still not so sure.

Funny, compilers are a strange thing ... it's not in Gnu's, much less
Microsoft's, interest to get that last few percent out of any chip as much
as it is for the chip developer.   Firms like Intel have their own
compiler team; ARM and AMD pay third parties.   But because of the
competition, the FOSS and even proprietary compilers get better [certainly
for languages like Fortran, where performance is everything - which is why
'production' shops will pay for a high-end compiler, be it from PGI, Intel
or Cray say].

Truth is FOSS has changed the model.  But there are only some people who
will pay for support (particularly large HPC sites we all can name).   They
will pay some things, but those sites want to change everything (yet
another rant I'll leave for another day ;-)



-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://minnie.tuhs.org/pipermail/tuhs/attachments/20180326/61ecbf1a/attachment.html>


^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] long lived programs (was Re: RIP John Backus
  2018-03-23  2:53 [TUHS] " Doug McIlroy
@ 2018-03-23 18:27 ` Bakul Shah
  0 siblings, 0 replies; 21+ messages in thread
From: Bakul Shah @ 2018-03-23 18:27 UTC (permalink / raw)


On Mar 22, 2018, at 7:53 PM, Doug McIlroy <doug at cs.dartmouth.edu> wrote:
> 
> "The only thing I can think of is to have programs that
> translate programs in today's languages to a common but very
> simple universal language for their "long term storage". May
> be something like Lamport's TLA+? A very tough job.
> "
> 
> Maybe not so hard. An existence proof is Brenda Baker's "struct",
> which was in v7. It converted Fortran to Ratfor (which of course
> turned it back to Fortran). Interestingly, authors found their
> completely reorganized code easier to read than what they had
> written in the first place.
> 
> Her big discovery was a canonical form--it was not a matter of
> taste or choice how the code got rearranged.
> 
> It would be harder to convert the code to say, Matlab,
> because then you'd have to unravel COMMON statements and
> format strings. It's easy to cook up nasty examples, like
> getting away with writing beyond the end of an array, but
> such things are rare in working code.

I can believe that for Fortran, but for C/C++ or other such
less well defined languages this may be much harder. It is far
easier to write an emulator for x86, and that is fine if all
you want to do is run the same old compiled program; but if you
want to make changes, de-compiling x86 code to something
structured would be much harder. Compiling the original code
to a multi-paradigm language such as Scheme or Lisp may be
another alternative....
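The emulation-vs-decompilation asymmetry can be sketched in a few lines: interpreting an old binary only requires reproducing instruction semantics, while recovering structured source from the same bytes is the hard problem. The machine and "legacy" program below are invented for illustration (the index even counts down to zero, 7090-style); this is a toy, not an x86 emulator.

```python
# Toy sketch: emulating an unchanged "legacy binary" is straightforward;
# turning the same instruction list back into structured source is not.

def emulate(program, regs=None):
    """Interpret a tiny accumulator machine; each instruction is an
    (opcode, operand) pair, and the 'binary' is just this list."""
    regs = dict(regs or {})
    acc = 0
    pc = 0
    while True:
        op, arg = program[pc]
        if op == "LDI":        # load immediate into accumulator
            acc = arg
        elif op == "ADD":      # add a register to the accumulator
            acc += regs[arg]
        elif op == "STO":      # store accumulator into a register
            regs[arg] = acc
        elif op == "DEC":      # decrement a register (index counts
            regs[arg] -= 1     # down to zero, 7090-style)
        elif op == "JNZ":      # jump if register is non-zero
            if regs[arg[0]] != 0:
                pc = arg[1]
                continue
        elif op == "HLT":
            return regs
        pc += 1

# A "legacy binary": sum 5+4+3+2+1 by counting the index down to zero.
legacy = [
    ("LDI", 0), ("STO", "sum"),          # sum = 0
    ("LDI", 5), ("STO", "i"),            # i = 5
    ("LDI", 0),                          # loop head (pc = 4)
    ("ADD", "sum"), ("ADD", "i"), ("STO", "sum"),   # sum += i
    ("DEC", "i"),
    ("JNZ", ("i", 4)),                   # repeat while i != 0
    ("HLT", None),
]

print(emulate(legacy)["sum"])  # prints 15
```

The emulator preserves behavior bit-for-bit without ever understanding the program; by contrast, recovering the `while i != 0` loop from the jump at the end is exactly the structuring problem "struct"-style tools have to solve.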
 
[Aside:
Thanks for mentioning the name of this program as I had 
forgotten it.  I used "struct" once to convert a Fortran 
program to Ratfor and then manually to C. This was for 
programming PALs and we wanted to make some local changes.
IIRC, this was back in '82, back when vendors gave you
programs with sources for their devices, unlike the Xilinx &
Altera of today who don't even publish bitstream formats used
to program their devices.]


^ permalink raw reply	[flat|nested] 21+ messages in thread

* [TUHS] long lived programs (was Re: RIP John Backus
@ 2018-03-23  2:53 Doug McIlroy
  2018-03-23 18:27 ` Bakul Shah
  0 siblings, 1 reply; 21+ messages in thread
From: Doug McIlroy @ 2018-03-23  2:53 UTC (permalink / raw)


"The only thing I can think of is to have programs that
translate programs in today's languages to a common but very
simple universal language for their "long term storage". May
be something like Lamport's TLA+? A very tough job.
"

Maybe not so hard. An existence proof is Brenda Baker's "struct",
which was in v7. It converted Fortran to Ratfor (which of course
turned it back to Fortran). Interestingly, authors found their
completely reorganized code easier to read than what they had
written in the first place. 

Her big discovery was a canonical form--it was not a matter of
taste or choice how the code got rearranged.

It would be harder to convert the code to say, Matlab,
because then you'd have to unravel COMMON statements and
format strings. It's easy to cook up nasty examples, like
getting away with writing beyond the end of an array, but
such things are rare in working code.

Doug


^ permalink raw reply	[flat|nested] 21+ messages in thread

end of thread, other threads:[~2018-03-27  1:46 UTC | newest]

Thread overview: 21+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
     [not found] <mailman.21.1521314548.3788.tuhs@minnie.tuhs.org>
2018-03-17 20:14 ` [TUHS] RIP John Backus Paul McJones
2018-03-17 22:27   ` Steve Johnson
2018-03-22 21:05     ` [TUHS] long lived programs (was " Bakul Shah
2018-03-22 21:35       ` Clem Cole
2018-03-23 19:28         ` Bakul Shah
2018-03-23 19:44           ` Larry McVoy
2018-03-23 21:23           ` Clem Cole
2018-03-23 21:36             ` Warner Losh
2018-03-23 22:02               ` Steve Johnson
2018-03-26 13:43           ` Tim Bradshaw
2018-03-26 16:19             ` Paul Winalski
2018-03-26 16:41               ` Arthur Krewat
2018-03-26 19:53                 ` Larry McVoy
2018-03-27  1:08               ` Clem Cole
2018-03-26 19:04             ` Bakul Shah
2018-03-27  1:21             ` Steve Johnson
2018-03-27  1:46               ` Clem Cole
2018-03-23 10:43       ` Tim Bradshaw
     [not found]         ` <CAC20D2P1CZwaD0uJpMhWg11muDeH9rEn3X+AUjXvwMKsNjs7ng@mail.gmail.com>
2018-03-26  0:53           ` [TUHS] Fwd: " Clem Cole
2018-03-23  2:53 [TUHS] " Doug McIlroy
2018-03-23 18:27 ` Bakul Shah

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox;
as well as URLs for NNTP newsgroup(s).