Should binary-only SW be written against glibc?

David Hamilton davidh at bearoak.com
Fri Aug 28 13:58:01 PDT 1998


Davide Bolcioni wrote:
> 
> Hi everybody,
>   first linux list with no new messages for one day, so I thought I'd
> submit one. I am not an active Linux developer, but have been observing
> for some time (administering part time also). The question is in the
> subject: maybe developers wishing to distribute binary-only applications
> should be encouraged to write against a library whose evolution is not
> as fast as the evolution of glibc.
>   As a consequence, glibc could be updated when necessary (I do not mean
> bug fixes, but changes which require recompilation) with changes
> unhindered (to some extent) by compatibility with existing binaries.
>   The same argument may be extended to most public domain libraries, but
> it occurred to me for glibc because of its pervasiveness.

This has been a continuing problem for all unix distributions for as
far back as you want to go.  This is nothing new to Linux.  This is
not normally a problem for specific applications, since they know
which library they need to link with.  It is a potential problem for
the user and/or administrator, since each library to which
applications link must be loaded, thus consuming memory (real and/or
virtual).  

On the admin side, there is also the problem of determining when the
older versions of specific libraries can be safely deleted.  On some
of my systems, the library versions go back more than 8 revisions. 
Which ones can be safely deleted?  Without a standard package
management system, it is very difficult to determine this (it isn't
easy even with a standard package management system...).
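The kind of check an admin tool would need here can be sketched in a few lines. This is a hypothetical illustration, not a real tool: the library names and the `ldd`-style data below are made up, and on a real system you would gather the linked-library lists by running `ldd` over the installed binaries.

```python
# Sketch: given the libraries each installed binary actually links
# against, report which library versions on disk are no longer
# referenced by anything and so are candidates for deletion.
# All names below are hypothetical sample data.

# library versions present on disk
on_disk = {"libc.so.5.3.12", "libc.so.5.4.46", "libc.so.6",
           "libncurses.so.3.0"}

# ldd-style resolution results for each installed binary
ldd_output = {
    "/usr/bin/vi":   ["libc.so.5.4.46", "libncurses.so.3.0"],
    "/usr/bin/perl": ["libc.so.6"],
}

# any version no binary references is a deletion candidate
referenced = {lib for libs in ldd_output.values() for lib in libs}
removable = sorted(on_disk - referenced)
print(removable)  # → ['libc.so.5.3.12']
```

Even this toy version shows why the problem is hard without a package database: the admin has to enumerate every binary on the system to know what is still referenced.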

This problem is magnified by the number of libraries beyond glibc,
to which applications must link.

IF a package management system is rigorously used (no tgz
installations) and IF it is a good package management system (Red
Hat, Debian, SuSE), then it should be possible to write admin
applications that can adequately report this situation.  But this is
almost never the situation, at least at present.  Less than 50% of
the packages out there have complete information regarding
dependencies.  The situation is further complicated by minor
releases or patches to resolve security problems.
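The "incomplete dependency information" problem above can be made concrete with a small sketch: compare what a package declares against what its binaries actually link. The package records here are hypothetical; a real tool would query rpm or dpkg for the declared dependencies.

```python
# Sketch: flag packages whose metadata is incomplete, i.e. whose
# binaries link against libraries the package never declares.
# The package names and library sets below are hypothetical.

packages = {
    "someapp":   {"declared": {"libc.so.6"},
                  "linked":   {"libc.so.6", "libncurses.so.4"}},
    "othertool": {"declared": {"libc.so.6", "libm.so.6"},
                  "linked":   {"libc.so.6", "libm.so.6"}},
}

# a package is incomplete if it links anything it does not declare
incomplete = {name: sorted(meta["linked"] - meta["declared"])
              for name, meta in packages.items()
              if meta["linked"] - meta["declared"]}
print(incomplete)  # → {'someapp': ['libncurses.so.4']}
```

Any admin reporting tool built on package metadata inherits exactly this gap: it can only be as accurate as the declared dependencies.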

The situation gets even more difficult when egcs vs gcc comes into
the picture.  Recently, I have been seeing a number of software
packages that will only build under egcs.  This is problematic,
since others will only build under gcc.  I understand that this is a
period of transition, but it presents significant problems for those
of us that manage Linux installations that must work uniformly with
other unix installations.

I agree that glibc should be handled as major and minor version
releases and that all minor releases should be handled as patches. 
Applications should be coded to the most recent stable major release
- not to minor releases, unless absolutely necessary.
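The major/minor policy above amounts to a simple compatibility rule, which mirrors the existing soname convention (libc.so.5 vs libc.so.6): minor releases preserve the interface, major releases may break it. A minimal sketch of that rule:

```python
# Sketch of the versioning policy: a binary built against
# major.minor of a library runs with any equal-or-later minor of
# the SAME major; a new major release is an interface break and
# requires recompilation.

def compatible(built_against, installed):
    """Both arguments are (major, minor) tuples."""
    b_major, b_minor = built_against
    i_major, i_minor = installed
    return i_major == b_major and i_minor >= b_minor

print(compatible((6, 0), (6, 1)))  # True: minor update, same interface
print(compatible((6, 1), (6, 0)))  # False: binary needs a newer minor
print(compatible((5, 4), (6, 0)))  # False: major bump, must recompile
```

Coding to the major release, as suggested above, is what keeps the first case the common one.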

Assuming that we all buy into the file system standard, and I
definitely do, then any additional standards become a matter of
snapshots into the development process.  I have no problem at all in
coding to and supporting an LSB-99 standard that freezes various
libraries for the purposes of standardization.  If something I'm
coding needs to go to "interim" or "experimental" libraries, that's
fine, as long as I establish the dependencies based on the
standard.  But if I am building a big commercial product, then I
would build it to LSB-99.  In fact, unless there was a compelling
reason to use the "experimental" stuff, I would code to LSB-99.

-- 
dh
_____________________________________________________
David Hamilton                      Bear Oak Design



More information about the lsb-discuss mailing list