In a recent post, someone hinted that package
management was for the clueless. I don't feel
that this is the case. Package management and
"./configure; make; make install" are simply
two separate beasts.
The autoconfigure stuff was created to cope with
the differences between OSes: where the header
files live, which function calls exist, what
parameters they take, and so on.
Package management, on the other hand, is an
attempt to answer questions like these (a few
example queries follow the list):
- What files are associated with each other?
- Have any files gone missing?
- For this package to function, what other
packages (or services) are required?
- Where the Hell did THAT file come from?
- How's the integrity (file ownership,
modes, contents) of my system?
- Are all of the systems I admin running
the same version of a given package?
Yes, a system created with a series of
"./configure; make install" (or Slackware's
"untar this tarball" approach) will run. Usually,
though, these systems accumulate a LOT of cruft
and, over time, degrade into a mixed bag of sh*t.
For the Linux systems that I admin, I have a
simple rule. If an .rpm or .deb is available,
use it. If not, or if I need the latest version
out of a CVS repository, or if I have a requirement
to highly customize something (like Apache), only
then will I fall back to the "./configure"
(or cc -o foo foo.c; mv foo /usr/local/bin)
method and *document* what I did. If I were managing LOTS of
systems, I might take the time to create my own
customized packages. As it is, the package manager
takes care of 99.9% of the files on my systems.
I can track the others manually. Is it because
I'm clueless? I don't think so. It's because I'm
lazy and indifferent. I really don't care to
manually track the latest "ls" and "vim"
developments.
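A quick sketch of what that rule looks like in
practice on one of my Debian boxes (package names,
configure flags, and the notes file are
illustrative, not a recipe):

    # First choice: let the package manager do the work.
    apt-cache search apache       # is there a package for it?
    apt-get install apache2

    # Fallback, for the highly-customized case: build under
    # /usr/local and write down what was done.
    ./configure --prefix=/usr/local/apache --enable-so
    make && make install
    # (install-notes is just my own local convention)
    echo "built apache from source with --enable-so" >> /root/install-notes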
FWIW, I've used several *nix package managers --
SysV, AIX's installp, the (Free|Open)BSD tools,
and rpm. installp is EXTREMELY thorough, but
Debian's .deb and apt system wins, hands down.
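The dependency handling is most of why. On a
Debian box, one command pulls in whatever a
package needs, and one more brings the whole
system up to date (again, the package name is
just an example):

    apt-get update                # refresh the package lists
    apt-get install mutt          # missing dependencies get fetched too
    apt-get dist-upgrade          # upgrade everything, deps resolved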
D