You know a controversy is a big one when none other than RMS weighs in with his opinion, and sure enough, that’s what happened in the Mono debate late last week.
“Debian’s decision to include Mono in the default installation, for the sake of Tomboy which is an application written in C#, leads the community in a risky direction,” RMS wrote on the Free Software Foundation’s site. “It is dangerous to depend on C#, so we need to discourage its use.”
Of course, rather than settling the debate, Stallman’s proclamation only fanned the flames higher, resulting in a raging inferno that has since burned its way through Slashdot, LXer, Boycott Novell and beyond.
Linux Girl was glad she remembered her fireproof cape this week!
Slightly less scorching — though no less interesting — is a separate debate that’s been going on in recent days as well.
The cause this time? Nothing less than an assertion that software installation in GNU/Linux is broken.
“Every GNU/Linux distribution at the moment (including Ubuntu) confuses system software with end user software, whereas they are two very different beasts which should be treated very, very differently,” asserted Tony Mobily on Free Software Magazine.
As a result, software installation in GNU/Linux is not just broken, but “terribly broken,” he added.
Apple’s OS X, on the other hand, “got software installation just right,” Mobily asserted.
Hear that sound in the distance? That’s right — it’s the thundering hooves of the blogging Linux herds, rushing to get in their comments.
‘One Little Problem: Reality’
“I intend to show how there is one little problem” with Mobily’s assertions, wrote Roberto Alsina in his Lateral Opinion blog, for example. “Reality.”
On the other hand: “I agree that the system is flawed, but it is not broken,” wrote Thomas Teisberg on the Linux Loop. “The most important thing is that we not reinvent everything around the fragmented models that OS X and Windows use.”
Even more so: “When I first read Tony’s article I was tempted to post that he was a blithering idiot, but calmed down and moved on,” added GregE in the Loop’s comments. “You are correct in criticizing his post. We must never move to a Windows like system.”
Bloggers on LXer also picked up the topic — and Alsina’s response — in two separate threads, chiming in with a total of more than 40 comments.
We here at LinuxInsider knew it was time to dig a little deeper for some more insight.
Mobily’s article is “rather misguided,” Slashdot blogger drinkypoo told LinuxInsider. “The reason so many things can ‘just run’ on OSX is that they include all their required libraries. Fair enough, but this leads to two major failings.”
First, when the libraries are updated for some other application, “it doesn’t help all of them, so you now have to update more applications to patch security holes,” he noted.
Second, “if the applications come with different versions of libraries, they will be loaded concurrently, which will both waste memory and potentially cause unwanted behavior, especially if the programs are expected to interoperate in some way,” he said.
Unfortunately, “the author of the article repeatedly makes it clear that he is not qualified to even comment on these matters,” drinkypoo asserted. “He even goes so far as to say, ‘I can’t really fix this problem. It will take a lot of effort, and a lot of courage from major players to even start heading in the right direction.’
“If he can’t fix this problem, he shouldn’t be commenting,” drinkypoo opined.
“Every single one of his suggestions would be fairly simply implemented, many of them with a short shell script,” he added. “Some of them are basically wrongheaded, though.”
For example: “A list of libraries and versions expected to be included with the operating system? You can generate that yourself by examining the install media,” drinkypoo noted. “How will it help programmers trying to support multiple Linux distributions?”
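The point is easy to demonstrate from the installed package database, which records the same library-and-version information as the install media. A hedged sketch for Debian-family systems, guarded so it degrades gracefully elsewhere; the output file name `baseline-libs.txt` is just an illustration:

```shell
# Sketch: build the "libraries and versions shipped with the OS" manifest
# yourself. Debian-family shown; the guard skips cleanly on other systems.
if command -v dpkg-query >/dev/null 2>&1; then
    # List every installed lib* package with its version, sorted.
    dpkg-query -W -f='${Package} ${Version}\n' 'lib*' | sort > baseline-libs.txt
    wc -l < baseline-libs.txt   # how many library packages the base system carries
fi
```

On RPM-based systems the equivalent one-liner would use `rpm -qa` instead; the technique, not the tool, is the point.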
‘Did This Guy EVER Write Any C?’
Adding libraries to the path before system libraries “is done trivially by having your ‘application’ start through a script which manipulates LD_PRELOAD and/or LD_LIBRARY_PATH, requiring no modifications to the system,” drinkypoo explained.
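That launcher trick can be shown end to end in a few lines of shell. This is a self-contained sketch, not any real package: it fabricates a stand-in app (the name `myapp-1.0` is hypothetical) in a temp directory, then starts it through a wrapper that prepends the app's private `lib` directory to `LD_LIBRARY_PATH`:

```shell
#!/bin/sh
# Self-contained demo of the wrapper-script technique (all names hypothetical).
set -e
APPDIR="$(mktemp -d)/myapp-1.0"
mkdir -p "$APPDIR/bin" "$APPDIR/lib"

# Stand-in "application": it just reports the library search path it sees.
cat > "$APPDIR/bin/myapp" <<'EOF'
#!/bin/sh
echo "LD_LIBRARY_PATH=$LD_LIBRARY_PATH"
EOF
chmod +x "$APPDIR/bin/myapp"

# The wrapper: prepend the bundled lib dir, then hand off to the real binary.
# No modification to the system is needed; only this script changes the path.
cat > "$APPDIR/run-myapp" <<EOF
#!/bin/sh
LD_LIBRARY_PATH="$APPDIR/lib\${LD_LIBRARY_PATH:+:\$LD_LIBRARY_PATH}"
export LD_LIBRARY_PATH
exec "$APPDIR/bin/myapp" "\$@"
EOF
chmod +x "$APPDIR/run-myapp"

"$APPDIR/run-myapp"   # the app now sees its private lib dir first
```

The same wrapper is the natural place to hook in `LD_PRELOAD` or a self-update check, which is exactly the juggling drinkypoo refers to below.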
“Identifying the copying of a file would happen in libc, not the kernel — did this guy EVER write any C?” he wondered. “Updates could be handled by the same script that handles juggling LD_* variables, we already have numerous systems for digital signatures, hiding applications is trivial, and a ‘recipe system’ already exists for every major distribution.”
In short, “you could turn Debian, FreeBSD, or whatever into the system he wants largely by mucking with apt or ports, and making some small changes to a file manager,” drinkypoo concluded. “Anyone who has any business writing an article like this has the abilities to go forth and make it happen, without any effort or even courage from major players.”
‘No Clue How Things Work’
Mobily’s article “is written by someone who has no clue how things work in the Linux environment,” Montreal consultant and Slashdot blogger Gerhard Mack agreed. “I suspect one of his major problems is that he has never installed a third-party app that was not a part of his distro. The rebuttals have the same problem.”
There is nothing described that can’t be done already, “with the exception of having the overall package manager update locally installed software,” Mack told LinuxInsider. “The reason you don’t want the OS package manager to update locally installed software is that for all the complexity (and possible security bugs) it adds, you end up with the same problem you had in the first place: The OS would update everything to the latest version anyway.”
Keeping several versions around on a global system basis is “easy,” he added. “It’s all about how it’s packaged. Want to have three versions of Opera around? Fix the package to have separate installs for each branch of Opera.”
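Mack's "it's all about how it's packaged" point can be sketched in shell: give each branch its own versioned prefix and the versions coexist without conflict. The `opera-*` directories below are stand-ins for real packages, not Opera's actual layout:

```shell
#!/bin/sh
# Demo: two branches of the same app installed side by side (names hypothetical).
set -e
ROOT="$(mktemp -d)"
for v in 10.0 10.5-beta; do
    # Each branch gets its own prefix, so files never collide.
    mkdir -p "$ROOT/opera-$v/bin"
    printf '#!/bin/sh\necho opera %s\n' "$v" > "$ROOT/opera-$v/bin/opera"
    chmod +x "$ROOT/opera-$v/bin/opera"
done

# Both remain runnable at the same time, each from its own prefix.
"$ROOT/opera-10.0/bin/opera"
"$ROOT/opera-10.5-beta/bin/opera"
```

A real package would add a symlink or alternatives entry to pick the default branch, but the isolation itself is nothing more than a directory naming convention.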
‘A Whole Article About Nothing’
The largest problem with the article, however, is that Mobily — and most of his detractors — “make the same mistake he accuses the package manager of making in the first place: confusing local and systemwide installs,” Mack said. “In this case it’s all about the packaging.
“OS-agnostic packages are easy,” he explained. “You can have a self-extracting batch file with the tar built in for ease of use:
1: check dependencies
2: install to a local folder
3: provide symlinks or install shell scripts to the user’s bin directory (/home/user/bin). Most distros already check for executables in there.”
If there’s a need to install local libraries, “the LD search path can be overridden from the app’s shell script, or you can just statically compile and have the libraries built into the executable,” Mack explained.
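Mack's three-step recipe can be sketched end to end. The app name `hello-2.1` and the `~/.local/opt` prefix below are assumptions for illustration, not any real package:

```shell
#!/bin/sh
# Sketch of the per-user install recipe (hypothetical app "hello-2.1"):
# install into a local folder, then symlink the launcher into ~/bin.
set -e
PREFIX="$HOME/.local/opt/hello-2.1"   # step 2: a per-user install folder
mkdir -p "$PREFIX/bin" "$HOME/bin"

# Stand-in for the unpacked application binary.
printf '#!/bin/sh\necho hello from %s\n' "$PREFIX" > "$PREFIX/bin/hello"
chmod +x "$PREFIX/bin/hello"

# Step 3: a symlink in ~/bin, which most distros already put on the PATH.
ln -sf "$PREFIX/bin/hello" "$HOME/bin/hello"

"$HOME/bin/hello"
```

Step 1, the dependency check, would precede all of this in a real installer; here the "app" has none, so the sketch skips it.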
In short, “this was a whole article about nothing, and the worst part is that had he done even a small amount of research, he could have found out people already do this,” Mack concluded.
‘A Legitimate Demand’
“I disagree with several of the main premises” of Mobily’s article, though it does “legitimately point to some areas where there is room for improvement,” Chris Travers, a Slashdot blogger who works on the LedgerSMB project, told LinuxInsider.
Software installation on Linux is not “terribly broken — just in need of some further development,” Travers added.
Regarding the first issue in Mobily’s assertions, “it is a legitimate demand to be able to do a local install into a user’s home directory,” Travers said. “There are two important tradeoffs that would occur with this approach, though, and admins of a system MUST have the opportunity to disable it.”
Allowing user-installed software “opens the system wide to viruses,” he explained. “Ideally such a feature should be OFF by default but should be an option when required. In reality, this currently is ON by default but unsupported by package managers — really the worst of both worlds.”
Administrators may also need full control over the client computer, he pointed out.
‘A Moot Point’
The second issue, Travers added, is ease of use. “When package managers work well, they solve this problem; however, when things break down, they break down badly,” he noted. “The solution to my mind is to provide application-specific repositories and have these loaded in an initial installation.”
Finally, the last issue cited — the difficulty in giving software away to a friend — really is “a moot point,” Travers asserted.
“Either you give away the packages (and dependencies), or give a tarball of files,” he said. “Having every program as a monolithic package (outside of system libraries) in order to solve the installation problem is quite unwise because it will cause far more problems than it will solve.”
The package manager is in essence an “app store”. Would you call application installation broken on the iPhone? No; the app store works great there. The challenge for each Linux distribution is to make it easier for third parties to get apps into their app store.

Ubuntu is actively working on this in two ways. First, they have a partner repository, which is exactly like the traditional app store; third parties pay a small fee to Canonical, and their app gets listed in “Add/Remove Software” and hosted on Canonical’s servers.

Second, Ubuntu is considering doing the same thing with apps hosted on the vendor’s own servers. A few vendors – Google and Adobe come to mind – already run Linux repositories, and Canonical is willing to include those in their software list so that users can easily discover those apps via “Add/Remove Software”.

The Linux Foundation is working a different angle: by maintaining the LSB and providing tools for developers to check their software for compliance, they’re making it much more likely that software can be compiled once and run on all Linux systems (of the same CPU architecture).

None of this comes easy for developers who are new to Linux and are used to other systems, but it can work well once you know how. That was precisely the point of my response to the original article.