The size of Linux’s waistline has long been the focus of recurring attention here in the Linux blogosphere, even drawing occasional criticism from Linus Torvalds himself.
Recently, however, a fresh weight-related complaint was made — not about the kernel itself, but about today’s Linux distros.
“Linux fatware? These distros need to slim down” was the title of the InfoWorld piece that got the conversational ball rolling, and it’s sparked quite a lively discourse.
“As I prepped a new virtual server template the other day, it occurred to me that we need more virtualization-specific Linux distributions or at least specific VM-only options when performing an install,” author Paul Venezia began.
“What I’d like to see is sanctioned and supported VM-only cuts of major Linux distributions, perhaps tailored for specific hypervisors,” Venezia explained. “This would not necessarily increase the number of distributions available, but could be made possible through a single install option.”
Over on Slashdot, bloggers had no shortage of thoughts to share.
‘It’s Called Debian Net Install’
“Got that,” wrote Anonymous Coward, for example. “It’s called Debian Net Install. Done.”
Similarly, “apt-get install what-I-need-and-nothing-else,” offered jedidiah.
“TFA was a complete exercise in BS,” opined Freshly Exhumed. “Here’s another example of how to do a slim Linux install: during a Mageia or Mandriva install, select the Custom option, deselect everything, click through to proceed but when it stops to check if you really, really want to have such a sparse choice, select ‘truly-minimal-install’ and you will get exactly what it says, without X or even man pages.”
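For readers who want to try the commenters' slim-install route, it can be sketched as a short shell session. This is a sketch only: `debootstrap` and apt's `--no-install-recommends` flag are real Debian tools, but the target path and package choice here are illustrative, and the commands need root on a Debian-family host.

```shell
# Bootstrap a bare-bones Debian base system into a target directory
# (e.g. a mounted VM image). --variant=minbase skips even the
# "important"-priority packages, leaving little more than apt and dpkg.
debootstrap --variant=minbase stable /mnt/target http://deb.debian.org/debian

# Inside the new system, add only what the VM actually needs,
# skipping the "Recommends" chain that drags in extras.
chroot /mnt/target apt-get install --no-install-recommends openssh-server
```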
Down at the blogosphere’s Punchy Penguin Saloon, freely flowing tequila generated even more ideas.
‘Do It Yourself’
“In the immortal words of The Divine Tuxiness himself, Linus Torvalds, ‘the Linux philosophy is, laugh in the face of danger,'” Linux Rants blogger Mike Stone began.
“Oops. Wrong one. ‘Do it yourself’ — that’s it,” Stone added.
“I’m not certain that I agree it’s worth a distro creator’s time to produce a VM-only version of their product just to solve a problem that most Linux users will not encounter,” Stone explained. “If Paul Venezia believes otherwise, it’s an open source OS, and he’s free to create them and distribute them himself.”
‘Please Cater to My Whims’
Indeed, “couldn’t this be done by compiling from source?” asked Google+ blogger Kevin O’Brien.
“Somehow this complaint reads to me like, ‘Would the community please drop all of their other pressing business and cater to my whims?'” O’Brien added.
And again: “Why EXACTLY should they ‘slim down’ when the entire point is you can make the OS as fat or thin as you want?” Slashdot blogger hairyfeet agreed. “Even a Windows guy like me knows you can download ONLY the parts that you need and then package the results. If he doesn’t want a ‘fat’ distro, why use one?
“I just typed ‘Virtualbox Linux’ and found a dozen pre-made virtual images that are ready to go… so what exactly is the problem?” hairyfeet wondered. “Why should the distros themselves use their limited resources to solve a problem that others have already solved?”
‘I Don’t See the Need’
In short, “maybe I’m missing something, but this seems to me to be a solution to a problem that was solved ages ago,” hairyfeet concluded.
“I’m not sure if the distros need to slim down specifically for virtual machine applications, or if it is a more fundamental issue with the distros in general,” chimed in Google+ blogger Brett Legree. “Regardless, an experienced Linux user will have no problem slimming down the installation to suit his or her requirements for virtual machine work.”
There are also “more modular distros like Slackware, Gentoo and Arch,” noted Google+ blogger Gonzalo Velasco C. “I’m not a specialist, but I don’t see the need (yet) to have more than two or three good options.”
‘Removing Things Is Hard’
On the other hand, “I have to say, I agree” with Venezia, consultant and Slashdot blogger Gerhard Mack told Linux Girl. “Removing things is hard, and sometimes even experts leave things behind because (I’m guessing) one thing can look like another, such as an AWS install I did where I wondered why XFS (the X font server, not the filesystem) was running in a stripped-down install.
“Some distros are better than others at this,” he added.
“I have watched Debian move in the right direction, installing less and less with each default release and greatly reducing weird dependency requirements, although there are still too many ways to pull in Apache by accident (really annoying when the machine runs another webserver), or the classic: install nagios-plugins to monitor your webserver and end up with Samba (Windows file sharing),” Mack went on.
Still, “I should at least give them credit for the fact that it no longer pulls in things like the Novell NetWare client,” he concluded.
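Mack's accidental-dependency problem can be caught before it happens with apt's own tooling. A small sketch (the flags and the `apt.conf` key are standard apt features; the package name is just an example, and the commands assume a Debian-family system):

```shell
# Dry-run first (-s = simulate) to list everything a package would pull in:
apt-get -s install nagios-plugins | grep '^Inst'

# Install without the "Recommends" chain that smuggles in surprises:
apt-get install --no-install-recommends nagios-plugins

# Or make lean installs the permanent default via an apt.conf drop-in:
echo 'APT::Install-Recommends "false";' \
    > /etc/apt/apt.conf.d/99norecommends
```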
‘I Install Just the Bare Minimum’
Last but not least, blogger Robert Pogson was also impressed with Debian’s efforts.
“As a fat person, I can relate to the problems of fat software,” Pogson began. “Fortunately, I use Debian GNU/Linux, so I can control the installation/maintenance process in detail and choose Xfce 4 over GNOME/KDE/Bloat.”
In fact, “rather than taking the default installation, I like to install just the bare minimum to get a bootable system and then add slimmed-down desktop environments and applications,” Pogson explained. “I skip otherwise worthy applications that depend on KDE/GNOME simply because I want applications that work for me without diverting the awesome power of my computer to trivial/useless features. My computer can easily run any GNU/Linux distro, but why waste the resource?”
‘Afloat in the Bloat’
There was a time “about a decade ago when all distros of GNU/Linux were lean and lively,” Pogson recalled. “Then developers began to assume that M$’s slapping on layers of paint on the old barn was the right thing to do to attract users.”
In fact, while “there may be some minority for whom feature-bloat is attractive, most of us have computers to create, modify, store and present information efficiently and quickly,” he added. “Bloat that may be competitive with that other OS is not attractive for most. A lean, athletic system with impressive performance is.”
Pogson’s advice? “Learn to use an installation program like Debian’s to keep things simple and fast,” he explained. “The benefits will last the lifetime of your hardware — a lot longer than the time your hardware can stay afloat in the bloat.”
When hardware gets to be about eight years old, “it may indeed become slow by modern standards, but even then its life may be extended by using it only to show pix and send clicks to GNU/Linux running on a newer, faster computer,” Pogson suggested.
“The venerable X window system still found in GNU/Linux makes that easy to do,” he concluded. “Learn to use it and any replacement that follows to create the ultimate in lean and lively systems, a thin client.”
So long as the ‘Linux’ people are afraid of making their system appealing to anybody but hobbyists, they will be a little people, a silly people.
The money you take from Microturkey is but a trifle, taken from a Great Chest they keep…..in Seattle!
My point exactly. What Microsoft got really really wrong was dominating the world for twenty years…..no….hang on…..wait a minute…..
Which world do you live in? MS gets faster with use? You must have a genie in your machine. The reason that you need 8GB of RAM is not because it is cheap, which is relative (if you are not from Europe or North America that changes really fast), it is because programmers dump everything that they think a computer will need into RAM. There is one thing I am fairly sure of: fill up your 8GB and your machine doesn’t go much faster than someone’s with only 2GB. In fact, once it gets near 75 percent you’ll start to notice a significant change. There is nothing wrong with demanding better memory utilization (and, for that matter, CPU scheduling).
"How low can ya go" needs to be the mantra of all OSes now, and I’ll state two reasons for that: mobile, and plain resource hogging. The march towards smaller form factors means that if Linux wants to truly compete in mobile it has to reduce its footprint. While there is some scope for hauling around 6GB of system files which may or may not be used, it also means 6GB of data that can become corrupted. The second fact is that except for the hobbyist (like myself) who just dabbles to see what they can do with conky, or how many filesystems one can have before Linux starts to get dizzy, most people want an OS that doesn’t hog resources. On a machine with 2GB of RAM which runs Debian (with Xfce) like a little sports car, "upgrade" to KDE or GNOME and it will canter along like a nice little donkey-cart. It will get the job done, but don’t expect a prize for speed.
Also recognize that the fact that you can easily come up with $100 to upgrade your toy doesn’t mean everyone can. There are people who are still happily using their "gud ole" PIII and just can’t afford to upgrade. You can always suggest that they don’t come to the party, but I’m hoping you’re too kind for that.
I will agree with you that Pogson can be a bit over-the-top, but that’s his way. Guess what? I think it’s funny and I like it. We can’t all be a piece of stiff-necked beef jerky.
FUD, and not even original FUD. What’s next, gonna talk about how Windows BSODs daily? Since it’s pretty obvious you haven’t used one since 1998. Look up "readycache": there are plenty of tests out there showing it works, and just FYI? I’m using less than 900MB of that 8GB, and that is with most of the bling turned on; the rest of that RAM is being used for cache.
But you go ahead and play "how low can you go." Linux hasn’t gone anywhere in the consumer market in 20 years (no, Android is NOT Linux; it’s an embedded OS controlled by Google made to run Java code, and you might as well claim routers as desktops if you are gonna go there), and I doubt Linux will go anywhere even with Ballmer putting out Windows 8 sized bombs; if anything, people will just look at Macs, which, hey, what do ya know, don’t care about RAM limbo either.
Linux developers have been falling into the same trap that Windows developers fell into a long time ago. They became lazy and unimaginative. The attitude seems to be "it’s there, so let us all try to max it out at once." When I was introduced to Linux in the days of Mandrake 6 or Red Hat 6 (I can’t remember which), Linux was doing things, with a fraction of the resources, that Microsoft users would have to wait until Vista to see work properly. But the imagination has stopped, and what they do now is try to see who can use the most memory and CPU power. Lots of eye candy, and some of it is nice, but not needed as a default; lots of options, but sometimes useless (or less-used, your choice) options that could fit better as additional software of choice.
Slitaz can run off a CD and provide sufficient software to get the average office job done; that’s (way) less than 700MB. How is it that Slitaz or Puppy can run in 256MB of RAM and be useful? Why can’t doubling that provide a more feature-rich OS? Why do we need a 2GB system to enjoy the best that Linux has to offer without sacrificing utility? I wonder why, for example, a browser needs to tie up 880MB of memory when the OS says it can run with 512MB? The argument is that memory is cheap; I say lazy developers.
The laziness and the lack of innovation make for what we now have: stuff that works but doesn’t show the genius of ‘yesteryear’. Well, maybe the innovative have migrated to the fruity brand or to the Droid. Downloading a piece of software that is 5MB in size can require as much as 30MB or more in dependencies. Since I can’t program and don’t know how to chip away at the fat, it adds up to a lot of wasted space. In Windows it is easy to get rid of some of that junk, but fool around in Linux and you might have to start all over.
Have you ever wondered how you can type a complete article, edit photos for that article and, when you are done, watch a movie in HD on a tablet with less than 4GB of storage? I have; I think it’s hard-working and innovative people. But like I said, I’m not a developer, so I really don’t know how hard it is, so I might be wrong.
Is this 1997? Or are you just getting all your gear from a dumpster? You DO know that no matter whether RAM is full or empty it STILL uses the same amount of power, yes?
Frankly this is one thing the Linux camp can learn from MSFT: RAM unused is RAM wasted. With Windows 7, the longer you use the OS the faster it gets, because it LEARNS and uses the memory that isn’t needed by the OS as a cache so that programs load instantly. This doesn’t affect any programs, because since it’s only a cache, if another program needs the memory the cache can be dumped.
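For what it's worth, Linux does the same thing this commenter describes: the kernel's page cache fills otherwise-idle RAM with recently read files and gives it back the moment a program asks, which is why "free" memory looks low on a healthy box. A quick way to see it (assumes a Linux system with /proc mounted):

```shell
# Print total, "free", and actually-available memory from the kernel.
# The gap between MemFree and MemAvailable is mostly reclaimable cache.
awk '/^MemTotal:|^MemFree:|^MemAvailable:/ {printf "%-14s %8d kB\n", $1, $2}' /proc/meminfo
```

MemAvailable already accounts for reclaimable page cache, so it, not MemFree, is the number that tells you whether a machine is genuinely short of RAM.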
So I’m sorry, but this constant "how low can ya go" nonsense is just that, nonsense. This isn’t 1997 anymore, and cores, HDD space, and RAM are cheap and plentiful; heck, my little $350 netbook has 8GB of RAM and 320GB of HDD, for the love of Pete. Instead of just staring at some gauge and trying to shave another MB or two, USE those resources to make the system BETTER: make it load programs faster, make it more responsive. Remember, nothing beats RAM when it comes to speed, not even SSDs, so use it.
Oh, and please note that Pogson YET AGAIN makes the Linux community look like a bunch of loonies, thanks to his "Voldemort syndrome" where he simply cannot say Microsoft or MSFT or MS for fear that Bill Gates will jump out of his basement closet and beat him. His constant "other OS" and "M$" really does make the community look like basement whackos; honestly, he makes a better argument FOR Apple and MSFT than anybody else, simply because it makes the community look loony.