The full standard C library isn't included in a statically linked executable; only what's needed is, at least on Plan 9. I have no idea what gcc does.
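Roughly the idea, as a sketch (assuming a Unix-like toolchain where libc is
an archive of object files; the file name here is made up):

    /* hello_write.c: calls only write(2), nothing from stdio. */
    #include <unistd.h>

    int
    main(void)
    {
        write(1, "hello\n", 6);
        return 0;
    }

A static link of this only has to pull in the archive members that resolve
write() and the startup code. Swap the write() for a printf() and compare
the nm or size output of the two static binaries to see how much more gets
dragged in.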

On Nov 14, 2010 3:14 AM, "Gary V. Vaughan" <gary@vaughan.pe> wrote:
> Hi Erik et al.,
>
> Thanks for the feedback, all.
>
> On 14 Nov 2010, at 13:24, erik quanstrom wrote:
>>> You may well be right that there's too much momentum behind
>>> autoconf/automake to change GNU. But that doesn't mean it's the right
>>> thing to do, or something sensible people ought to choose to
>>> participate in.
>>
>> to be a bit more blunt, the argument that the tyranny of the
>> auto* is unstoppable and will persist for all time is persuasive.
>
> Well, I wouldn't take it quite as far as that. My point is really that
> there is already a vast amount of (often good) software written by
> (often skilled) programmers who have invested a huge amount of time
> and energy into the existing eco-system, and (quite reasonably) want to
> enjoy the advantages of installing and utilising dynamic shared objects.
>
> I doubt that anyone would argue for a full static copy of the C runtime
> in every binary, and between there and making every code library a
> runtime-linkable shared library is just a matter of degree. Since you
> really need to solve the shared compilation unit problem at the lowest
> level anyway, you might as well expose it to at least the next few layers
> of the application stack.
>
>> so i choose at this point to get off the gnu train and do something
>> that affords more time for solving problems, rather than baby
>> sitting tools (that baby sit tools)+. i believe "no" is a reasoned answer,
>> when faced with an argument that takes the form of "everybody's
>> doing it, and you can't stop it". i suppose everybody has had that ex-boss.
>
> I would be the last person to sing the praises of the existing GNU
> build system, and I hope the fact that I lurk on this list shows that
> I like to hang around smart people in the hope of picking up some good
> ideas. However, I don't really have the time to write the next big
> build system that solves all of the growing pains of the GNU eco-system,
> and I'm almost entirely certain that even if I did... my efforts would
> go almost entirely unnoticed. Similarly, I don't have the luxury of
> letting the train leave the station without me, unless I first have
> another way of earning a living - and neither would I want to: I
> consider myself blessed that I can earn my living by being involved in
> (and, to a very small extent, helping to steer a proportion of) the Free
> Software community.
>
> On the other hand, I think that there must be room for incremental
> improvements to the incumbent GNU build system, but I doubt that I
> would see them right away when I'm so close to the development of what
> is already in fashion. My ears pricked up when I saw someone claim
> that GNU Libtool is insane, because I'm interested to hear where the
> insanity lurks, and maybe even gain some insight into what the cure
> is. Not only that, I have the rare opportunity of being able to push
> the GNU build system forward if anyone can help me to understand where
> the bad design decisions were made.
>
>> i also think it's reasonable, as anthony points out, just to avoid shared
>> libraries, if that's the pain point.
>
> :-o For an embedded system I would agree, up to a point. But when I'm
> trying to support hundreds of users each running dozens of simultaneous
> binaries, then forcing each binary to keep in memory its own copy of
> whatever version of each library (and its dependent libraries) was around
> at link time surely can't be the best solution? Or even a reasonable
> solution. I'm not even sure that statically relinking everything on the
> system (actually 30 different architectures in my own case) each time a
> low-level library revs, so that the OS memory management can optimise
> away all those duplicate libraries, is a reasonable solution.
>
>> sure, one can point out various
>> intricacies of bootstrapping gnu c. but i think that's missing the
>> point that the plan 9 community is making. many of these wounds
>> are self-inflicted, and if side-stepping them gets you to a solution faster,
>> then please side step them. there's no credit for picking a scab.
>
> I have no doubt that the Plan 9 community is doing something good for
> the future development of operating systems and software, but that doesn't
> solve anything for my customers who want to run Gnome, KDE and Emacs on
> their AIX, Solaris and HP-UX systems. I still have to build that software
> for them to make a living... and GNU Libtool makes my life immeasurably
> easier. I know this because porting an application written by a developer
> who uses the GNU build system but only ever builds and tests on Mac OS
> usually takes much less than a day, and often no more than an hour, to
> pass its testsuite on all the platforms our customers care about. The
> packages that use cmake and scons and all the other "portable" build
> systems rarely take me less than a week and often somewhat longer to port
> to systems the developer hadn't considered... to the point where nowadays,
> it's easier to port all but the very largest software packages to the GNU
> build system first.
>
> I'm still waiting to hear someone who can make a convincing argument that
> GNU Libtool is not the least bad solution to the problems it wants to
> help developers solve.
>
>> please do take a look at plan9ports. it's portable across cpu type and
>> os without fanfare, or even much code. plan 9 is similar, but much
>> simpler, since it doesn't need to fend off the os.
>
> I have looked at length already, although upgrading to VMware 4 last year
> killed my Plan 9 VMs, and I haven't yet had the time to get them
> running again.
>
> Cheers,
> --
> Gary V. Vaughan (gary@gnu.org)