From mboxrd@z Thu Jan 1 00:00:00 1970
Message-ID:
To: 9fans@cse.psu.edu
Subject: Re: [9fans] Google search of the day
From: Brantley Coile
Date: Fri, 15 Feb 2008 09:56:57 -0500
In-Reply-To: <8acf684047f1930df45bba2b9208ea3e@quanstro.net>
MIME-Version: 1.0
Content-Type: text/plain; charset="US-ASCII"
Content-Transfer-Encoding: 7bit
Topicbox-Message-UUID: 54b14c0e-ead3-11e9-9d60-3106f5b1d025

>> (2) C, as well as many other PLs, has always had a problem in
>> that there is no clean, standard mechanism to handle the
>> situation in which a function invocation finds insufficient
>> stack remaining to complete the linkage (instance allocation).
>> This is especially problematic in memory-constrained apps such
>> as many embedded systems, when the algorithm is sufficiently
>> dynamic that it is impossible to predict the maximum nesting
>> depth. At least with malloc failure, the program is informed
>> when there is a problem and can take measures to cope with it.
>>
>> I hope people working on run-time environments will find ways
>> to do better.

FORTRAN never had this problem. Its memory needs were fixed at
compile time. Neither did COBOL. But then again, you couldn't write
recursive programs either; all locals were of static storage class.

The trade-off to get recursion has been worth it and doesn't cause
problems in actual use. It wasn't a problem with Algol, PL/1, Pascal,
and C programs on very small machines. Why should it be a problem
with today's memory sizes?

  Brantley