On 12/24/2014 08:48 AM, Stephen Partington wrote:
> I like the Ubuntu release cycle a great deal. They have a long-term
> support release, and then incremental releases that swing between
> stability and features each year. This, to me, is a great model.
I fell in love with Ubuntu in the 6.06-to-10.04 days, after learning
to hate using RH in any Linux environment I'd worked in, but after
10.04 it's pretty much just the lesser evil. I like the release cycle
for a server, as I rarely need anything bleeding edge there. As a
desktop, however, it more or less sucks. More often than not, I find
that in order to fix some terrible bug that's annoying me, I have to
upgrade the distribution. Not a big deal in itself, but every release
from 10.04 to 14.04 was a horrid upgrade process, almost always
bricking my system in some new, creative way and corrupting gconf
profile data, which caused weird GTK issues across the whole desktop
even when I did get the system back.
> The part I did not like about Red Hat, even as a server, was that I
> spent more time compiling applications than anything else and
> fiddling them all into place. While educational, it's REALLY nice to
> have repositories that do this for you. And yes, there are a number
> of bundled repositories you can bolt on to Red Hat/CentOS, but they
> never quite gave me the breadth of access I needed, so I was back to
> building the applications I wanted to use. In the end, what I wanted
> to do was just easier with Debian; then Ubuntu came up with a much
> more modern installer, and that was where I really became
> comfortable. Easy to use, and able to recognize 95% of all the
> hardware I have ever thrown at it. And to top it off, some of the
> easiest and most complete documentation and support.
To this day, I find RH and Cent still dysfunctional in this fashion:
anything in the repos that isn't a common application is so dated or
horribly buggy/unusable that you end up compiling it yourself. Then
I'm taken right back to 1999, when I learned the term "dependency
hell" and the old Solaris guys used to joke about RH being
"immature"; it's still common. Anything newer you might want to
compile will require you to update enough of the OS that you'll
likely break old and new system components alike, ending up with some
broken abortion of an OS in the process.
The equivalent in Debian-ish builds is breaking apt while trying to
force in 3rd-party packages out of necessity. Luckily, Ubuntu tends
to stay modern enough that you don't end up having to rebuild the OS
just to compile something; I've generally had good luck doing that
when needed, but finding pre-compiled packages for new software is a
crapshoot.
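
To make that concrete, the usual dance looks something like this
(repo and package names here are purely illustrative, not a
recommendation):

    # CentOS: bolt on EPEL to reach packages the stock repos lack
    yum install epel-release
    yum install htop

    # Ubuntu: force in a third-party PPA (hypothetical name);
    # this is exactly where apt's dependencies start going sideways
    add-apt-repository ppa:some-vendor/some-app
    apt-get update && apt-get install some-app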
Throw in a GPU for desktop use (or specialized network NICs with
vendor-provided blob drivers), and you create all sorts of new
adventures trying to find a stable driver build that works with
anything but a "stable" release, on any distro. So much time is spent
working around xorg these days to make buggy software like compiz
work (you know you *need* wobbly windows) that Ubuntu often outpaces
the GPU vendors, especially AMD, in getting a driver working on
anything newly released, should you *need* to upgrade distributions.
It's about impossible to win these days with one general solution for
everyone, servers and desktops alike.
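
For what it's worth, on Ubuntu the quickest sanity check is to ask
the system what blobs it will even offer you (ubuntu-drivers shipped
around 12.10; fglrx was AMD's proprietary package of this era):

    # list detected hardware and candidate proprietary drivers
    ubuntu-drivers devices
    # then install the blob of the day, e.g. AMD's:
    apt-get install fglrx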
> I guess what I am saying is that this is likely a similar path a lot
> of people have taken, and this is what is giving Ubuntu its real
> market share.
I recently tried Mint, Fedora, and Cent as an out from Ubuntu, and
found them all horribly buggier for my needs than Ubuntu. I simply
fell back into complacency, figuring out a way to live with Ubuntu
again via a clean build until the next release cycle breaks it again.
At least Canonical hasn't ruined their netinstall ISO, yet.
-mb