Ingo Molnar on what ails the Linux desktop (plus.google.com)
205 points by keyist on March 17, 2012 | 191 comments



The reason I stopped recommending Linux to "normal users" is _because_ of the concept of distributions.

Coupling the updates of single apps with the updates of the whole desktop or framework and libs is just plain wrong. Having to upgrade the whole distro (including all the other installed apps you don't want to upgrade) just to install a new version of one single app you _want_ to update is a nightmare. Total bullshit. Users. Don't. Want. That. Users don't want one update to trigger another update, or even to trigger the upgrade of the whole desktop.

The blog post by ESR is one prominent example: http://esr.ibiblio.org/?p=3822

He basically wanted to upgrade just one (obscure) app, and the process triggered the automatic removal of Gnome2 and installation of Unity. Just _IMAGINE_ how nightmarish this must look for normal users. You simply don't remove somebody's installed desktop sneakily from under their feet. You simply don't. That feels like the total loss of control over your computer.

Over the last 10 years, I have personally had people go from Linux (which I talked them into trying) back to Windows, _precisely_ for this reason: having to upgrade the whole distribution every few months just to be able to get new app versions. They don't have to put up with this insane bullshit on Windows, so why should they put up with it on Linux?

This "distribution" bullshit is not what is killing desktop Linux, it is what _already_ killed desktop Linux.

The other reasons why desktop Linux never made it (no games, no preinstallations on hardware) are imho just consequences of the distribution concept and the 6-month planned-obsolescence cycle. Nobody wants to bother with something which will be obsolete half a year down the road. Nobody wants to develop for a target that moves _that_ fast.

Windows installations, once installed or preinstalled, run for a decade. Develop something, and it will run on the 10-year-old Windows your grandparents use. Most people encounter new Windows installations only when they buy a new computer. PC manufacturers know that customers will hate it when their new computer's OS is obsolete within half a year and they won't be able to install new apps, so they don't preinstall Linux; it's as simple as that.

If anybody _ever_ really wants to see Linux succeed on the desktop (before the desktop concept itself is gone), he will have to give up on the distribution concept first.


I've been saying this same thing for years. Package managers are a nice concept in theory, but Linux on the desktop will never, ever succeed until upgrading (for example) Firefox doesn't result in an install of Unity. The entire concept of a milestone-based monolithic distro is so broken for desktop use that I can't believe a better alternative hasn't been developed yet.

Even as a Linux nerd I'm constantly faced with problems caused by this. I'm stuck on Ubuntu Natty, for example, because it has the last stable version of Compiz that worked for me on the fglrx ATI drivers. If I wanted the latest version of Unity (I don't, I think Unity is terrible, but this is just an example) that means I'd have to upgrade Compiz and everything else and get stuck with all the horrible bugs new Compiz versions have with my drivers. It would also mean upgrading to Gnome 3 which still has many usability regressions (unrelated to Gnome Shell) and in my opinion isn't fully baked yet. I don't want all that shit just to get the latest version of a single package!

(You could perhaps pull it off with some PPA mumbo-jumbo, but you'd have to understand what a PPA is, luck out in finding a PPA in the first place, and messing with them can more than likely bork something up. Not something for the average-Joe audience Ubuntu is targeting.)

Shared libraries made sense in the days of limited space and resources. They still make sense from a few security perspectives. But from a practical perspective, Windows did it right by allowing programs to ship their own libraries and by doing backwards-compatibility right.


The entire concept of a milestone-based monolithic distro is so broken for desktop use that I can't believe a better alternative hasn't been developed yet.

I feel like even the original 1984 Macintosh was a better alternative. Each app is contained in one file; it has no dependencies other than the OS. There is no need for installers/uninstallers. For updates, even Sparkle (annoying as it is) at least doesn't disrupt your whole system.

There are Linux distros that try to work this way, but the upstream developers have become lazy; they expect distros to do packaging for them. This leads to some skewed incentives; I think everything works better when the developer is also responsible for packaging, marketing, and support.


I understand where you're coming from, but people have the exact same problem with Windows (and maybe Mac also; but I don't have much experience with Macs). Many people are staying on XP because they dislike Windows 7/Vista. I suspect the same is happening with Ubuntu.

I personally LOVE LOVE LOVE that all my apps are updated by the same program. Instead of the Windows/Mac way of each app running its own updater.

However, I do agree that the UI for upgrading a single app should be made better.

Just this week, I finally moved out of Unity while staying with Ubuntu 11.10. My workaround is to move to XFCE (Xubuntu). So far, no problems. Btw, I have Compiz enabled with the proprietary nvidia drivers. You might want to try Xubuntu.

It's amazing how easy it was to take out Unity and replace it with xubuntu: sudo apt-get install xubuntu-desktop.

The xubuntu people have taken the trouble to interface with all the Ubuntu plumbing (networking, sound, updates etc) via XFCE. Overall I'm impressed with the Ubuntu ecosystem.


We're talking about different problems. Windows does not have the problem I'm describing. If I want to update Firefox on Windows XP, I can do that without Windows forcing me to update to Windows 7 as part of the deal. That's the problem. Updating Firefox on Ubuntu 11.04 after a certain (very soon in the future) point in time necessitates an update to Ubuntu XX.YY.

In fact you're precisely illustrating my point with your example. That people can choose to stick to XP because they don't like 7 is precisely analogous to wanting to stick with Gnome 2 because I dislike Unity. But the difference is that I can do that in Windows because I can still get app updates in XP for over a decade, but I can't in Linux because app updates are tied to the entire massive distro and everything else that goes with it... often every 6 months.

It certainly is possible to upgrade parts of the system using esoteric (to the average-Joe) solutions like PPAs or compiling from source or being forced to switch to a different desktop environment. (Most users don't even know what a DE is!) But the fact that you're forced to turn to such solutions is what I--and the OP and the linked article--argue is the biggest nail in the desktop Linux coffin.


I have to assume that the fact that Windows tries valiantly to offer backwards compatibility (and is actually forced to by its own business model) is at least partially responsible for the overall huge suckage that is working on Windows.

OSX/iOS are much more like Ubuntu -- practically any cutting-edge app requires something near to the latest OS release.

Android is suffering precisely because there is low OS upgrade adoption, as I've heard recently right here on HN.


> If I want to update Firefox on Windows XP, I can do that without Windows forcing me to update to Windows 7 ... Updating Firefox on Ubuntu 11.04 after a certain (very soon in the future) point in time necessitates an update to Ubuntu XX.YY.

I'm not 100% sure about it. As it happens, I run both Ubuntu 10.10 (at work) and Ubuntu 11.10. I just saw an alert to upgrade my 10.10 Ubuntu to Firefox 11. I give them major props for keeping a major package like Firefox updated. They probably don't do this for other packages.

Now wrt app updates, people have pointed out downthread that PPAs can be added to the system by clicking a specially-encoded URL in Firefox. People on Windows also do things like download the file, locate it, double click, respond to a UAC prompt, etc. So installing through a PPA isn't that much more complicated, IMHO.
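For reference, the command-line version of adding a PPA is only a few steps (the PPA and package names below are made up, just for illustration):

    sudo add-apt-repository ppa:some-team/some-app   # add the PPA to the sources list
    sudo apt-get update                               # refresh the package index
    sudo apt-get install some-app                     # install or upgrade the app from the PPA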

Also Ubuntu has a software store, to which developers are free to push their updates.

In summary, what people are wishing for in this thread is to have the 3-4 apps they use upgraded to the latest and greatest versions while leaving the core OS and other apps at baseline versions. But ... the solution of using a PPA (analogous to downloading the app in Windows) is deemed too nerdy ... Leading to the conclusion that the distro should take responsibility for maintaining all the tens of thousands of packages for ~10 years. Am I understanding you correctly? Probably ain't gonna happen.

Do you really think it would be a step forward if we got rid of the huge number of non-core packages in distros and had to hunt down PPA sources (again, equivalent to downloading apps from different websites in Windows) for each program? I'm not sure.

Continuing, I'm not sure the lack of updates for old releases is only a technical problem: XP has an installed base, so people jump through ALL kinds of hoops to make their programs work there. IIRC, there are major differences in audio/video/drivers between Win7 and XP, but people (e.g., Flash/Chrome) still do it.

I doubt developers would keep their apps updated and working on XP if it didn't have such a large installed base.

End long reply. :)


Firefox is a special case. Ubuntu specifically updates that one package more frequently. Perhaps it was a bad example as that's one of the very few packages that is singled out for frequent updates by Canonical.

PPAs still aren't the best solution even if distros make them easy-to-use. The reason is that now maintainers have to get their software into two places: the core repos and now an optional up-to-date PPA. Additionally they must still target those PPAs to every single new version of the entire distro. For example: I run Natty because I like Compiz, but Oneiric has showstopper Compiz bugs for my ATI card. On Natty I use a PPA that downgrades the version of Compiz to a more stable one. However that PPA has not published an update for Oneiric, so I can't upgrade to Oneiric and get other new packages. What now? My fate is in the hands of a single PPA maintainer.

Or: many PPAs only publish releases for the past 2 or 3 distro releases. What if I'm on a non-LTS release and don't want to upgrade (because upgrading would bring in more unwanted packages), but the PPAs I use no longer publish for my version? I'm out of the game again.

The core problem in these scenarios is the distro model. Devs have to keep their apps up to date with the quickly-moving target of a "distro." That's just not a situation anyone should be in. The OP and the linked article explain this better than I can.


You want stability and a slowly-changing core system, but won't use the LTS releases? Because your app provider doesn't make updated PPAs for the LTS release. At which point you blame the existence of the distro's non-LTS releases? I think the correct target of your ire should be the app developer who doesn't produce PPAs for the LTS release.

I carefully read what you wrote, and IMHO your requirements are not reasonable: in effect, you want the non-LTS releases of distros to vanish so that app providers don't have the temptation to not support previous releases.

Maybe I'm what somebody called a "technologist" downthread.


I want what Windows does: Stable core, independent apps. The entire concept of "LTS" is a red herring. It's a byproduct of the distro mindset. To put it another way: There is no Windows LTS. You buy one version of Windows and it works for a decade--a DECADE--and your apps are kept updated for as long as the developer cares to do so, often automatically in-app. (There's no Windows "app store", but as Apple demonstrates it's a matter of will, not technology, to make one.)

That is simply not the case with Linux today no matter how you frame it. And for some reason too many Linux supporters are totally blind to that because they think package managers are flawless gems of convenience. They mistake package managers for convenience when in reality they're a band-aid for a situation that shouldn't exist in the first place, and that other OS's have solved better. The OP calls this out perfectly.


Please do not blame "package managers": Windows Installer is a package manager in many ways comparable in scope to dpkg, which given your premise immediately disproves that they are the cause of the problem you are talking about; the issue is not package managers, it isn't even centralized package distribution systems like APT: it is that the ecosystem of libraries and protocols that make up the Linux desktop have horrible binary compatibility issues that distributions seem to make even worse through the usage of "rebuild the world and update all the dependency relationships while we are at it" policies.


Windows has which problems, exactly? His mentions are pretty Linux-specific and I couldn't find one that applied to Windows.


My first digital camera had drivers only for Windows 98. I had to dual boot until I got a CF card reader.


Could you elaborate on the gist of your point? I'm genuinely not getting it. Keep in mind that Windows 98 is a 15-year-old, crappy legacy OS. And we were all running FVWM and AfterStep.


OP said: "I'm stuck on Ubuntu Natty, for example, because it has the last stable version of Compiz that worked for me on the fglrx ATI drivers."

It's easy to imagine a similar problem (somebody stuck on XP because, e.g., a device driver exists only for XP). For instance, an old laptop of mine has a webcam driver only for Vista but not Windows 7.

http://www.sevenforums.com/drivers/43156-dv6000-webcam-drive...


>I understand where you're coming from, but people have the exact same problem with Windows (and maybe Mac also; but I don't have much experience with Macs).

No, they do not have the same exact problem of package managing. Preferring XP over Vista/7 whatever, is not the "exact same problem".

>I personally LOVE LOVE LOVE that all my apps are updated by the same program. Instead of the Windows/Mac way of each app running its own updater.

Well, Mac now has the Mac App Store, which handles updates for App Store installed programs.

But the problem is not about all programs being installed by the same program. It's about installations being transparent to the end user and not wreaking havoc. Also, not being inter-dependent.

Even Mac apps that update with their own updater are autonomous units. At most, the app itself breaks.


On the other hand, being able to install software so easily with package management is a huge benefit of distributions. Perhaps the solution is to create a new package management system for a desktop distribution. A system which ensures that when upgrading an application only its personal namespace (libraries etc) changes, while other applications remain untouched. Libraries are second-class citizens, almost. That way we get the benefits of packages and libraries, but each application is logically isolated from changes to any others.

I very much agree with the idea of a slow-moving core, though. That's how I've tended to use Debian: a stable foundation upon which I can install less stable higher-level software (the things I actually interact with). Of course, this doesn't address the distro fragmentation problem Ingo is talking about.


Couple other ideas:

Optionally bundling specific versions of libs (or compiling them in statically), placing them in the user's home directory, and setting the library path to look there first (maybe that's exactly what you're saying?)
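Roughly like this, as a sketch (all paths and library names are hypothetical):

    #!/bin/sh
    # Layout: ~/apps/foo/bin/foo is the executable,
    # ~/apps/foo/lib/ holds the bundled library versions (e.g. libbar.so.2).
    # This wrapper makes the dynamic linker search the bundled copies first:
    LD_LIBRARY_PATH="$HOME/apps/foo/lib:$LD_LIBRARY_PATH" exec "$HOME/apps/foo/bin/foo" "$@"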

Stop using the same tool for updating userland apps and core system stuff. Using the same app for updating "/bin/ls" and for "audacity" is, imo, at the core of the brokenness. These are different types of apps with different areas of responsibility, but we lump them all together in one tool and process.


They need to be lumped together into one update process. The package manager for each needs to be aware of the other so it can resolve incompatibilities and not overwrite files it shouldn't. Joe User is not going to run two different package managers to keep up with security updates. What you consider to be userland apps may be core system stuff to someone else. Package dependency graphs are not simple. Sometimes there are even loops, depending on what parameters you pass to ./configure

You can use one tool and still separate different sorts of packages into different areas of responsibility. A few distros do that, or at least are capable of it. Arch has AUR. Gentoo has a bunch of 3rd party overlays.


I agree with your complaints but I don't see how it is related to the concept of distributions.

Having multiple distributions is fine; someone just needs to make one with a package manager that gets this stuff right. Then your parents can just use that one distribution and never need to care about what other distros do.

In fact, Ubuntu wants to be that one distribution that is easy for normals. But as you illustrated, it still falls short...


> I don't see how it is related to the concept of distributions.

Distributions freeze the set of available app versions to a specific version of the base system. You can't get a new app without upgrading everything else too, including all other apps. I can't simply get a new Emacs on my Ubuntu, because to do that, Ubuntu also wants to remove my Gnome2 and replace it with Unity.

Windows decouples the base from the apps. When you want to update an app, you just do it; everything else remains untouched. If I want to get a new Emacs on XP, it won't force me to simultaneously upgrade to 7.

Windows would have the same problems if they bundled a new set of base libs and 20000 apps that only work with that specific set of libs every few months, but they don't. They invest a great effort into making the base system slow and stable and support it for a decade or more, and it shows.


As I said. That's a problem with the package management in most distributions. It has nothing to do with the fact that we have multiple distributions.

I.e. source-based distributions (e.g. gentoo) don't have this particular problem.


Tell this to people who ended up with broken Arch installations after the sudden upgrade to Python 3.x.


That is a case of yet another pair of problems, though: Python is incompatible with itself across versions (understandable, I guess), yet it is hard to run multiple versions side by side (major flaw in Python 3 for reusing the unadorned name Python, instead of making the number part of the name or changing the name of the installed software)


Python 3 does not reuse the unadorned name "python". It installs itself as "python3" by default. The decision to have "python" be Python 3 was solely Arch's.


Eh. First of all, you're not limited to the packages your distribution supplies. Lots and lots of software is distributed in Launchpad PPAs. Other stuff can be installed without the package manager, some games are statically compiled, etc.

And I don't see what's wrong with requiring users to have an up-to-date system. Security fixes alone make regular updates almost mandatory on all operating systems. Windows installs run for a decade, but they still get constant security upgrades, including ones that require restarts and big scary service packs.

System upgrades simply should be totally painless. The kernel gets updated constantly without users noticing, it should be the same for all upgrades. Maybe a rolling release would be a better solution, because it gets rid of the scary system upgrade user interaction, but they're more difficult to QA.

Maybe the repository administration model needs to be changed. Giving the devs more control/responsibility for their package in the repository might be a good idea. Many developers already set up PPAs to get there.


> And I don't see what's wrong with requiring users to have an up-to-date system.

The problem is not the forced, invisible security updates; the problem is forced, user-visible upgrades.

When you want to upgrade one app, you maybe don't want to simultaneously upgrade another app, or even the whole desktop. With the distribution model, there's no way to avoid this forced interdependence.


Updates are not invisible by default because the organizations behind the distros can't provide the same level of assurance that Microsoft or Apple can that update X won't break something.

Average users should have no say in keeping their apps from getting auto upgraded. Linux distros have to track upstream app releases because if they don't there will be breakage eventually. Some app will require a feature added in lib X version Y, and they're still on Y-2. If the packages aren't upgraded, users will complain when they can't install newer packages.


So, you've brought up another important point in the Linux/packagemanager ecosystem:

"Some app will require a feature added in lib X version Y, and they're still on Y-2."

Windows has had this solved for something like a decade. Sure, there's the much lampooned "dll hell", but honestly, Linux's solution was "lol let's upgrade things and break user apps".

There is zero excuse for apps in Linux to have library dependency issues. A package, when downloaded, should have its depended-upon libs and so's tagged. When some other application is updated and pulls in new version of the libs, the first app shouldn't ever see the update. Wouldn't that be nice?

Similarly, having a stable ABI to program against for system calls would be helpful. Users complain when their old apps break, and this is unavoidable under the current Linux development model (see http://primates.ximian.com/~miguel/texts/linux-developers.ht... for a good article on this problem).


Windows has even solved virtually all of "dll hell" via SxS.

Linux distros would do well to implement something similar. Disk space and RAM are cheap, having a few different versions of the same DLLs is no big deal. I don't remember the last time I had a .dll problem in Windows post Vista, whereas I still run into .so issues nearly constantly in Linux distros.


Linux has versioned libraries, but distros often ship only the latest versions. Libraries usually have filenames like "liblibrary.so.x.y.z", and an application will link to "liblibrary.so.x.y" or "liblibrary.so.x". Library maintainers also get lazy with making sure that the library stays compatible within major versions, or don't update the .so version properly.
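Roughly what that looks like on disk (library names, versions, and output are made up and trimmed for illustration); nothing technically prevents keeping both major versions installed side by side, distros just tend not to:

    $ objdump -p /usr/bin/someapp | grep NEEDED    # sonames the binary was linked against
      NEEDED   libfoo.so.3
    $ ls -l /usr/lib/libfoo.so.*                   # output trimmed; both majors can coexist
      /usr/lib/libfoo.so.3 -> libfoo.so.3.1.2      # old major, still used by someapp
      /usr/lib/libfoo.so.4 -> libfoo.so.4.0.1      # new major, used by newly built apps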


Perfect username. The problem is that the Linux ecosystem doesn't have enough QA backing the amount of new and changing code. This is partly caused by the inflated egos of the competing distro teams.

Ingo is wrong about freedom. Freedom is the cause of Linux's problems: dev teams are too free to make compatibility-breaking changes, and there are too many alternatives in core desktop infrastructure, so QA can't keep up.


No, there are definitely underlying economic motives beyond ideological "freedom". The only way to make money in the Linux Distro world is to sell 'stability' ala RHEL.

That practically requires that the free teaser product be 'unstable' (and therefore undesirable for paying customers). And the easiest way to do that is a top-to-bottom bleeding-edge system rebuild with each new release.

So it's not just a matter of "not enough QA", because there are very real scalability problems with re-QAing everything every six months to ensure that some random library or compiler flag change didn't break something.

Look at Debian for example - they very much get the idea of "freedom", but they also understand software deployment lifecycles and produce a long-term stable version. (One could argue with their management decisions, but the basic idea is correct.)


> And I don't see what's wrong with requiring users to have an up-to-date system.

I want a working-for-me system. Breaking it to make it "up to date" is a bug.

You're assuming that an application or version that has been replaced by something newer is necessarily broken/inferior/etc. You're wrong.

Yes, some updates do address security, but many/most don't and even the ones that do don't necessarily apply in all situations. Not to mention that security isn't the only priority.

Security isn't the only priority offline, so why would anyone think that it would be online? Disagree as to online? Show me your car, residence, or person and I'll be happy to demonstrate.


Quite right. Aside from backports, you can get things like the latest Firefox and LibreOffice on Ubuntu 10.04 thanks to PPAs. To those complaining that the 6 month update cycle is a turn off to "normal" users: put an Ubuntu LTS on their computer with a few PPAs to keep specific packages up to date.

One advantage of the distro model over the Windows model: you've never had a toolbar installed in your browser or your homepage changed because of a package you installed from a repo, have you? I find the Windows software ecosystem (at least the freeware portion of it) far more annoying than anything package managers do. Talk about unwanted changes to the user's computer.


For software updates without updating the distro, backports can be turned on.

The reason that everything is packaged together like that is so library updates and other things can be tested and vetted for compatibility. Part of the challenge of allowing such a variety of OS and system configurations is that library or software package maintainers are unlikely to have tested their updated code on every possible distribution out there, and so that makes it the responsibility of the distro maintainers.

Believe it or not in practice this doesn't create many issues. It might be an obstacle for casual users who are trying to transition into power users, and are trying to tweak their system. But most other users aren't encountering the same frustration.


But what is the alternative to a distribution-based GNU/Linux operating system?

How do you manage the 1000 packages and their libraries and dependencies? Many of them have separate runtimes which may or may not depend on the other packages runtime. How would you design an application sandbox to cover them all?

What you're saying, basically, is that Linux has failed. At least for me, since I can't see another way to distribute and manage the massive number of packages that sit on gnu.org.

I agree with you, and it is my opinion that whoever solves these problems is in for a lot of business opportunities.


> But what is the alternative to a distribution-based GNU/Linux operating system?

A small core set of libraries that change _very_ slowly and aren't intentionally obsoleted every few months. Think of Windows: slow, stable, and supported for a decade. Distribution of apps decoupled from the distribution of the base. Never make an app update trigger a lib update.

> How do you manage the 1000 packages and their libraries and dependencies?

You don't do that at all. Developers do that themselves, like they do on Windows and OSX. Every dev packages his own app and either puts it into the app store or distributes it himself. You manage only the libs and don't allow them to change fast or in an uncoordinated, chaotic way.

> What you're saying, basically, is that Linux has failed.

From the point of view of a normal user, yes. For a normal user, it is not an option. Everybody I personally know who tried it went back. The main reason for most of them was the insanity of application management. (And the lack of hardware drivers and games, but that's not Linux's fault.)

> since I can't see another way to distribute and manage the massive number of packages that sit on gnu.org.

Decouple libs and apps. Don't change APIs and lib versions every few months. Make the base a very reliable and slow-moving target. Don't force anybody to change everything every few damn months.

> It is my opinion that whoever solves these problems is in for a lot of business opportunities.

The problem is already solved, at least under Windows and OSX. That's why Windows and OSX get all the desktop business and Linux gets none.


> A small core set of libraries that change _very_ slowly and aren't intentionally obsoleted every few months. Think of Windows: slow, stable, and supported for a decade. Distribution of apps decoupled from the distribution of the base. Never make an app update trigger a lib update.

This is the idea of the Linux Standard Base. The concept was developed over a decade ago, and it has failed to bear real fruit.

> You don't do that at all. Developers do that themselves, like they do on Windows and OSX. Every dev packages his own app and either puts it into the app store or distributes it himself. You manage only the libs and don't allow them to change fast or in an uncoordinated, chaotic way.

You can already do this with the package management systems. For example, each game in the Humble Indie Bundle installs into its own /opt directory, with its own private copies of its dependencies. It uses the package management system to hook into desktop menu updates, etc. As a user, it's been a nightmare for me. Half of the games don't run at all and I'm at a loss for how to fix them or get replacement libraries.


I was always surprised that no one made something like "The Linux Binary Platform 2003" and then kept it up to date, maybe releasing another one later.

At the very least, developers used to Windows would understand the dynamics of it (which is probably a good idea if attracting commercial development is a goal).

I would say it is quite a bit less ambitious than something like the Linux Standard Base.


It's been proposed at various times in various ways. It's largely a competitive issue - the for-profit mechanism boils down to selling "certified" enterprise configurations, so there's little interest in making the platform even more commodified than it already is.


0-install (http://0install.net/) or even better - Nix packaging system (http://nixos.org/nix/ - they claim to be a purely functional package manager)
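For example, with Nix a single application can be installed or upgraded into a per-user profile without touching the rest of the system, and rolled back if the new version misbehaves (the exact channel/attribute names depend on your setup):

    nix-env -iA nixpkgs.firefox   # install/upgrade just this one package
    nix-env --rollback            # undo that change if it breaks something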


Damn straight! I don't understand why so few people know about Nix.

I'm hoping Nix(OS) will take off eventually. I try to play with it from time to time, but I haven't made the investment yet to really contribute.


Maybe if someone would build a tool that pulls from the Ubuntu repository and recreates an equivalent Nix package.

Come to think of it, there is nothing in aptitude that prevents this kind of approach (including the unique hash paths). It would get you to 75% of Nix's functionality, which IMHO is good enough.

I don't think Red Hat or Ubuntu will want to do that, though. Their business model revolves around the walled garden.


Dang... I'm going to go download the NixOS live image, now.


Is there a linux equivalent of PC-BSD[1]?:

_Programs under PC-BSD are completely self-contained and self-installing, in a graphical format. A PBI file also ships with all the files and libraries necessary for the installed program to function, eliminating much of the hardship of dealing with broken dependencies and system incompatibilities. _

[1]:http://www.pcbsd.org/index.php?option=com_zoo&view=item&...


but but but....

"you'd have duplication of files!" or "you'd have to update multiple libraries when there's a security patch!"

I didn't know about PC-BSD, but I espoused similar ideas to my Linux friends years ago, and was generally met with one of the two objections listed above. I think they're both bad arguments, but it's what I encountered the most.


The problem isn't that you have to update multiple libraries - it's that you have to rely on each application developer to release an update when there's a patch for one of its libraries. That simply won't happen in many cases.


For static compilation, true. If required libraries were bundled with an app, you could replace that one in the ___location specific to that app and be done, assuming that the app didn't need any extra work done to it to support the new library.

Given that all this discussion largely revolves around open source projects anyway, if a developer didn't update for a new security patch in a library, someone would likely step up to the plate if it was a commonly used app. If it's a niche/minor app, and there's, say, a new version of libssl, if the author isn't making updates, there's no guarantee the app will work with an updated version of an upgraded shared library anyway.


For static compilation, true. If required libraries were bundled with an app, you could replace that one in the ___location specific to that app and be done, assuming that the app didn't need any extra work done to it to support the new library.

But it'd still be up to the developer to update the library, no? Otherwise, how is that better than the current situation?

If it's a niche/minor app, and there's, say, a new version of libssl, if the author isn't making updates, there's no guarantee the app will work with an updated version of an upgraded shared library anyway.

But you don't have to upgrade the version of the library to release security updates: the Debian security team backports all security fixes to the library versions in Stable even if upstream doesn't, in order to prevent such breakage.


The library developer updates the library, and if the app doesn't need changes to work with the new library version, he doesn't have to do anything. You could even automate finding all copies of the library and updating them...and keep prior versions around if something breaks.
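A rough sketch of that automation, assuming apps bundle their libraries under per-app directories like /opt/<app>/lib and the patched build keeps the same soname (all paths and names are illustrative):

    # Swap the patched library into every bundled copy, keeping backups
    # in case an app turns out to need the old behaviour.
    for lib in /opt/*/lib/libssl.so.1.0.0; do
        cp "$lib" "$lib.bak"
        cp /usr/lib/libssl.so.1.0.0 "$lib"
    done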

To avoid relying on app developers at all, put apps in sandboxes where it appears that the libraries are where they've always been.


> He basically wanted to upgrade just one (obscure) app, and the process triggered the automatic removal of Gnome2 and installation of Unity. Just _IMAGINE_ how nightmarish this must look for normal users. You simply don't remove somebody's installed desktop sneakily from under their feet. You simply don't. That feels like the total loss of control over your computer.

Hmm, that's not exactly what happened, according to the link: "I upgraded to Ubuntu 11.04 a week or so back in order to get a more recent version of SCons."

Your overall point is well taken, but I wonder how much it affects what I think of as "normal users", who don't care so much about upgrading to the bleeding edge of scons. Consider a hypothetical user of Hardy, so they've had it for four years: what are they actually missing if what they do is web surfing, email, and maybe document editing?


Let's say they use instant messenger services, so they have Pidgin installed. I'm pretty sure there have been critical bugfixes in Pidgin, and the old Hardy version of Pidgin was so flaky for me 2 years ago (back before my workplace had moved to Lucid) that I installed my own from source.

Ubuntu's answer to that? Well, those aren't security fixes, so you can upgrade to $LATEST_RELEASE if you want the non-critical fixes. Ubuntu is trying to force a 2+ year bugfix cycle on software maintainers, and that's just not realistic for many small teams (both proprietary and open source).

This is a particular example, but I can think of other cases where this might be a problem. OpenOffice updates after a new MS Office release come to mind offhand.


From my perspective I don't have any trouble with Ubuntu using a combination of PPAs and Win/OSX-like self-contained apps. Things like Chrome and Mendeley come with auto-updating PPAs. Other apps that I really care about, like Eclipse, Matlab, or Firefox, come in their own self-contained packages.


Coupling the updates of single apps with the updates of the whole desktop or framework and libs is just plain wrong.

You don't have to do all that to upgrade a single app. In fact you're thinking of it backwards. Distros mean that when you upgrade the OS or libs, you get new versions of the apps for free.

You can still configure && make && make install, or grab a statically-linked binary, or any other method of getting Linux apps to run.


> You can still configure && make && make install

You aren't serious, are you?

> statically-linked binary

Nobody builds them.

> any other method of getting Linux apps to run.

There are no other methods.

> Distros mean that when you upgrade the OS or libs, you get new versions of the apps for free.

That in turn means that if I don't want to upgrade the OS and the libs, I can't get new app versions. The collective refusal to acknowledge that this is a problem is what is holding back (aka killing) desktop Linux.


You said you stopped pointing people toward Linux because of the distros. But distros only add ways to get software. You don't have to use their package management at all. If you want to install a new version of an app, with or without the help of the distro, what better way is there to avoid dependency hell?

Edit: Central-repository installation methods I've used: python easy_install, perl CPAN, Ubuntu PPAs (which are a way to add 3rd-party apps to your package manager).


> But distros only add ways to get software.

Without distros, the way to get software on Linux is unbearable for Windows converts, who are used to "click, click, done."

> what better way is there to avoid dependency hell?

All distros agreeing to ship one specific version of a lib, so that app devs can target that "standard" version instead of daily changing upstream versions.

The dependency chaos is a consequence of no distribution being influential enough (or the major players not being able to agree) to slow down the interdependent moving target that is the library space. So app devs don't care what distros ship and only target the upstream, and the upstream lib devs don't care about the overall ecosystem and just ship whenever they feel like shipping.

Nobody seems to care about the user experience of the end user, for whom getting on Linux feels like building on shaky ground. And then they both pretend not to understand why a majority of end users would rather pay for Windows and have a decade of peace of mind and hassle-free app availability than move for free to an earthquake-prone area.


> Without distros, the way to get software on Linux is unbearable for Windows converts, who are used to "click, click, done."

Sorry, what? On Windows machines, due to the lack of package management, installing software is not a matter of "click, click, done." You have to google the program, navigate to the website's download area, find the right link, download it, execute it. It asks if you're sure it isn't a Trojan or something, which you promptly ignore (and learn to ignore every other "are you sure you want to execute foo.doc.exe?" popup).

Package management ('done right') is a matter of "click, click, done." Installing software on Windows is an absolutely terrible experience.


I think you're conflating two different issues. Poor UX for installing 3rd party software has been solved with things like Steam or the Mac App Store, for example.


Steam and the App Store are nothing more than proprietary package management systems.


Again, that's not the issue. Package managers are great. The problem is how the base system is maintained/updated/versioned.


And on Linux, you navigate to the website's download area and find out they don't offer a Linux version.

There is an open source alternative which is not in your distro's repository, so you have to navigate to the website's download area, find the right link, download it, get accepted to university, take a year of 100-level CS courses, learn the command line on your own because you didn't get it at school, run configure and make, and it does not build on your distro. Not that it matters since the latest version 0.0.3 would not do what you needed if you had gotten it to run.


Hehe, harsh.

I think one of the main problems is that there is no single agreed way to distribute a program.

At least with Windows, the InstallShield wizard kind of became the de facto standard.

Sometimes I go to a download page and they offer me a .deb file, which is great, but then I find it targets a specific version of Ubuntu that I am not running. Sometimes they offer a .rpm, or sometimes they just offer a zip file with a random .bin or .sh inside it that I may or may not have to run as root and that may or may not do weird things to my system.

Also, you get stuff that asks you to do a git clone and make, etc.


All distros agreeing to ship one specific version of a lib, so that app devs can target that "standard" version instead of daily changing upstream versions.

You talk as if lib versions only change to harass developers and end users. Sometimes there are security updates that need to be made. And how would lib devs add features if all the app developers and distros are frozen on an old version?


It doesn't work that way. "Linux", even GNU/Linux, is not really a single operating system. It's an anarchically-designed, anarchically-run OS-building kit revolving around a vision of what an ideal operating system would look like and with each part adjusting to what each other part is doing on the fly.

Evolution will never truly replicate the results possible with design.


It's just Ubuntu, not all of Linux, unless there are other distros with Unity and 6-month cycles that I don't know of.


ArchLinux suffers from the same problem.

I left my cousin with ArchLinux, and he liked it a lot, so much that he was still using it 6 months later. At that point he wanted to install a new program, and from then on he stopped using Arch, because he had to call me to help him fix his system. Issuing pacman -S programname failed, so he did pacman -Sy followed by pacman -S programname again, and this time his entire system was about to be updated. He answered yes to all the questions... and suddenly his entire desktop was different from what it was before. All the programs got updated! He didn't want that!
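(For the record, that sequence is what Arch calls a partial upgrade; the supported path is to sync and upgrade everything in one go. Package name illustrative:)

    pacman -Sy              # refreshes the package databases only
    pacman -S someprogram   # partial upgrade: can pull in libs newer than the rest of the system
    pacman -Syu             # the supported route: full system upgrade in one step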

These are people who think that a computer is broken if the taskbar on the desktop is accidentally moved to another position by children. Of course they think they broke their computer, and in this case it really did break, since their settings and gadgets on the Plasma desktop stopped functioning, and some just changed. Without asking the user, they just changed! That is a pure and simple WTF. That's when I realized Linux will never, ever succeed on the desktop if it doesn't change fundamentally.


ArchLinux is a very poor distribution to leave with someone who doesn't have the knowledge to fix these problems.

When you do pacman -Syu you should be looking on the front page for update news, you should be looking at the list of packages to be upgraded. It's not a distribution that will hold your hand, because there are some of us around who don't want our hands held.


The idea that there are interfaces for expert users (or that there should be) is anathema on HN these days.


It's a pity, I can't imagine I'm the only person in the world who shudders at the idea that my computer should be made 'easy' for me (at the expense of flexibility and power). I'm a type of customer, just as much as the so called 'average user' is.


You left your cousin with arch installed but didn't tell him that it's necessary to keep a rolling release distro up to date?

Hell, I have to keep up with the mailing list and the news page on the website just to keep my OS working update to update.

It's worth it for me because of my needs but Arch is very very specifically NOT for the average user.

You basically set your cousin up to fail. And it's not like the branding and messaging isn't clear about any of these things. Why on earth did you pick Arch for this use case?


Because what else can I pick? Ubuntu!? Ugh.

He tried Fedora earlier; we couldn't get it to work with his wireless networking, SELinux kept getting in the way as soon as he took his laptop with him somewhere, and I got tired of pushing against a wall as I was becoming his support. Then he asked what I'm using and how it works for me. So Arch it was.


I suggest Linux Mint in that case. Ubuntu based, but no unity.

I totally understand where you are coming from. I love Arch and want to share how awesome it is, but all those problems were entirely based on a very poor suggestion for his use case. Arch doesn't "suffer" from this problem at all.


have you heard of Debian?


Fedora is having similar indigestion with the GNOME 3 upgrade, and longer-than-6-month cycles are even worse. How would you feel if all the apps on your system were 18 months old and the only way to upgrade them was to upgrade the whole OS? I believe that's called "enterprise Linux".


These kinds of package distribution schemes work well for clusters of homogeneous servers, but I agree that they just get in the way for a typical desktop user.


> He basically wanted to upgrade just one (obscure) app

No. He wanted a new version of the app and upgraded to a new version of the OS (with a new set of default packages). And he got surprised by getting a new GUI, something which is rather odd because Unity was one of the most publicized features of Ubuntu.

> the process triggered the automatic removal of Gnome2 and installation of Unity

Not really. Gnome2 would still be there. Just the default UI is Unity. I'm more than a little bit surprised ESR had trouble remembering you switch UIs on login. I've been doing it since my Solaris (2.5) days. I loved OpenWindows.

> having to upgrade the whole distribution every few months just to be able to get new app versions

That's not really true - you have to do so because the distro publisher won't support the newest Chrome on their 2006 OS. It's ridiculous to demand that they spend their resources on your particular needs. If you are not happy, you can ask to have your money back. And even when the distro publisher doesn't want to add newer versions to an old OS, you can always add private repos maintained by the makers of your favorite software.

And, remember, having stable versions of software (even when a newer, flashier version has been made public) is exactly what some people want. I want my servers stable.

> This "distribution" bullshit is not what is killing desktop Linux

It was never much alive. Linux is an OS that suits a couple users well, but not most of them.

> are imho just consequences of the distribution concept and the 6-month planned-obsolescence cycle.

It usually took much longer to get a new version of your favorite Linux distro; 6 months is the current standard. And, again, there is no planned obsolescence. There are many alternative places to get newer versions from.

> Windows installations, once installed or preinstalled, run for a decade.

I don't believe we've met, sir. What planet are you from?

> If anybody _ever_ really wants to see Linux succeed on the desktop (...) he will have to give up on the distribution concept first.

I don't think so. In fact, most people don't think so. And, let me say that not thinking so works quite well.

You do realize the incredibly arrogant position you are taking. You purport to be the savior of the Linux desktop (do we need one, BTW?) and to have realized what's wrong with it and, best of all, you have the solution! Just do everything opposite to how it's been working for decades and all our problems will be solved.

Let me put it simply: when you think you are the dumbest person in a room, you are probably right. When you think you are the smartest person in a room, you are most probably wrong. And if you disagree with everybody else in the room, odds are you are really the dumbest person there.

Maintaining a distro is a lot of work, but until we can make software makers agree on a single package format, a single way to manage configurations and a single way to organize the file hierarchy, the distro way will remain a very popular way to manage your computers.


It's wonderful how downvotes can replace disagreement and discussion. You've got to love the conciseness.


The solution for this is quite simple: split the distro to a core, and a separate 'apps' repository, and let app authors control (but with a mandatory QA step) when their software gets released or updated. We did this with Maemo (and later MeeGo), and it has worked great: http://bergie.iki.fi/blog/application_quality_assurance_in_l...

Cross-distro app repositories are also a possibility, thanks to the Open Build Service (http://openbuildservice.org). And since MeeGo's community apps service is open source (https://github.com/nemein/com_meego_packages), all software needed for this (including an app store client app) already exists.

What is needed is a major distribution to make the first move on this.


You would still need to maintain and manage the core distro. The packages in there would update and break the system; you'd have LVM issues and whatnot.

Anyway, care to give links to more on Maemo/MeeGo's architecture?


You still package and build an application separately for each distro version. And then the community tests it before it goes to the stable apps repo. So broken packages (either by packaging, or by app not functioning properly) are unlikely to pass QA.

So if your package conflicts with something, either fix that issue or don't release for that distro version.


Off-topic: Is active MeeGo development still happening? I'm a Qt developer and might be interested in contributing a little...


The MeeGo community effort continues at http://merproject.org

It's already used as the base distro in the Spark tablet.


The Linux desktop is dead. Face it. Accept it.

How do I know this? Go to a developers' or tech conference, and what is the prevailing OS? 9/10 times these days, it's MacOS.

As much as Apple may be control freaks, they did desktop UNIX right and developers have flocked to them.

Meanwhile, the Linux desktop community is still having the same debates, the same difficulties it had back in the mid-90's. Dependency issues. Lack of compelling use case software; and that which does exist is versions behind current. Fragmentation.

The only areas in which the Linux desktop has moved forward is UI and user-oriented management, and Ubuntu deserves much of the credit for the latter.

Add to this the gradual movement away from the desktop paradigm to mobile. The desktop paradigm will still be around for several years (especially for creators--developers, media folks, etc), but the end-user is moving more and more towards smartphones and tablets for consumption. Extend those with external keyboards and monitors (think docks) for light creation work (documents, spreadsheets, etc), and you've the future.

Nope. The Linux desktop is dead.


The Linux desktop is dead

This is really premature because there are major flaws with the other options. Apple is a closed-down walled garden (and it's expensive), for example.

The Linux ecosystem is extremely diverse, so maybe somebody will come up with something that can compete well with these other flawed models.

For example, maybe Ingo Molnar's post will inspire people to take things in new directions.

The desktop paradigm will still be around for several years

That's a ridiculous understatement. You can't do serious work on smartphones and tablets and that's not going to change until we're going around plugging them into projectors and keyboards, at which point they're just serving as desktops anyway.


Just because people who are trying to sell you something aren't using it doesn't mean it's dead.

I've been using Ubuntu for 4 years now, and the latest releases are more streamlined and usable than ever. The mainstream distros have come an extremely long way in being more user friendly to non-tech users.

Also, I have mobile devices, but I still end up using my desktop a whole bunch for two simple reasons: 1. when you actually need to type something of any length, a full-size keyboard is wonderful; 2. it's nice not to have to squint at a smaller screen to see what's going on.


I agree with you, and a natural extrapolation of the point is that android is the "desktop linux" of the future. Maybe android is the linux distro (of sorts) that has actually solved the desktop problem.


Yes. 1000 times Yes.


I've been using Linux since 1993, and back then, the Linux OS and desktop were far superior to Windows 3.1.

Then Win 95 came out and that had a decent desktop. I remember when the KDE people started talking about a desktop for Unix and people didn't get it, but when we saw the beta it was like... Wow!

Then Red Hat Linux didn't like the license of KDE, so they had to create Gnome. As a result, rather than having one good desktop, the average Linux distro has two half-baked desktops. This fork has wasted people's energy and been a distraction from delivering an excellent experience.

Another example of this is sound. I don't know how many incompatible sound APIs exist for Linux now, I know it's more than the fingers on one hand. The consequence of it all is that often sound doesn't work and unless you're a crazy enthusiast you might never get it to work.

I was a Linux zealot until 2003 or so when I had a job that had me using a Windows machine a lot, and by that point there was Win XP which was a huge improvement over Win 95.

I still use Linux on servers, but desktop Linux has largely disappeared from my life. Every so often I try to install it here or there, but I typically find the experience disappointing. I was a Fedora fan for a long time, but Fedora became increasingly finicky about where it would install. I switched to Ubuntu, but every installation ends up having some serious problem.

For instance, Ubuntu installed just fine on my PPC Mac Mini with the exception that the fan runs full speed all the time and the machine sounds like a vacuum cleaner.

Windows and Mac OS have been on a general trajectory of improvement -- sometimes there are changes you don't like, but the overall direction is good. Linux did, after years of struggle, get a stable multiprocessor kernel (2.6) but other than that I get the feeling Linux has been going backwards not forwards.


What you say used to be true, but does not appear to be any longer.

When was the last time we had useful improvements to the OS X user interface? 10.4 (2005) or 10.5 (2007), in my opinion. They've certainly improved under the hood, but the improvements to the UI have been mostly gimmicks like Expose.

When was the last time we had useful improvements to the Windows UI? That would be Windows Vista, 2006/2007. (W7 was basically just a stable version of Vista). Windows is certainly attempting to improve the UI with Metro, so it's a 5 year timeframe to wait for improvements.

OTOH, in Linux land we've had KDE4, Gnome3 and Unity all land in that time period. Every 6 months we receive useful new improvements to our UI. Sure, the initial reception to KDE4, Gnome3 and Unity were all negative, but the haters are always the loudest. I haven't tried Gnome 3, but Unity 12.04 and KDE 4.8 are both really nice, much better than the OS X or Windows 7 UIs, in my opinion.

And it's not just the UI. It takes about 10 seconds for my computer to leave the BIOS and have both Firefox & Emacs open in Ubuntu. It takes the same machine over a minute to have Steam open in Windows.


There's just one problem with your post: you're thinking like a technologist, and not like an end-user.

End-users don't want to have to learn an entirely new UI (read, a different way of doing things; or, "Where's my Start button? Everything I know how to do is under that.") every couple of years. Not because they're (all) dumb, stupid, or lazy.

It's because end-users view a computer as a tool to do what they need/want to do--quickly and efficiently. Anything that distracts from that (like having to re-learn where everything is, and how to do the task they've done the same way for several years) is a negative and annoyance.

Unfortunately, the technology community has forgotten that.


Which is one reason why the 6 month model that the Linux community has adopted is so great. Every 6 months Unity or Gnome3 or KDE4 incorporate some improvements, but the differences are not major and are easy to learn or ignore. But over time these differences accumulate and become significant real improvements. We get to have our cake and eat it too.

And hopefully these environments have learned their lessons now and haven't incurred the large technical and design debt that necessitate large breaking changes. KDE is up to 4.8 now but nobody is talking about a massive rewrite for KDE5 -- it's just another incremental update. OS X appears to be moving towards this sort of model too, with fairly regular yearly updates. Windows is the only one bucking this trend with its major upcoming breaking change in "Metro". And they're getting pummelled for it.


There's just one problem with your post: you're thinking like a technologist, and not like an end-user.

Well, a lot of people who use and hack on Linux distros aren't targeting the "end-user," they're targeting the technologist. So that mindset is not necessarily a problem.


If we're going in that direction, then everyone's participating in a different conversation. The original conversation was not whether or not this is a non-problem because end-users have issues. The original conversation was why hasn't Linux been adopted by end-users at the same rate as other OS's, if it's superior in so many ways.


It's funny, because in the games industry the assumption is that average end users are completely willing to learn a completely new UI for each and every game, as long as it is fun.


That's true, but it's something I hate about games. Having to learn a different button layout for every game is really frustrating.


You can't have a consistent button layout across all games though, as there is far too wide a variety of different functions to be mapped to.

Also, by having the attitude that the user is willing to invest time learning how to use the game, there is far more experimentation in interface styles being done.


> I still use Linux on servers, but desktop Linux has largely disappeared from my life.

And you still keep saying that "sound doesn't work". It's literally been years since I've had problems playing sound.

> I get the feeling Linux has been going backwards not forwards.

Hmm, I can play most media out of the box on Linux. I think that's a huge step forward.


> I still use Linux on servers, but desktop Linux has largely disappeared from my life. Every so often I try to install it here or there, but I typically find the experience disappointing. I was a Fedora fan for a long time, but Fedora became increasingly finicky about where it would install. I switched to Ubuntu, but every installation ends up having some serious problem.

A quick Google search reveals this was the fix and was found in 2008:

http://ubuntuforums.org/showthread.php?t=1004899

Perhaps I'm some kind of genius at using Google.


For me multiple desktop managers (KDE, Gnome etc) are not a problem, since Gnome 3 has become very competitive with Mac OS X. The bigger problem is XOrg and its constant tearing problem during video playback. People spend way too much time playing videos on their desktops to accept tearing. Wayland does not have this problem, and I therefore have great hopes for Wayland as a replacement for XOrg.


The author is mostly right: the problem is friction and market dynamics. Adding software on a given Linux distro is too nerdy. If you don't have an "app store" like GUI, you've already lost most users. Command line = friction for most users, even for a lot of developers who grew up with Windows.

As a linux nerd, I'm fine with command line + synaptic, but look at how well people have used the iTunes store, Amazon store, Google Play, etc... All of those have much less friction to find and download the right software than most linux distro's have. Ubuntu's market is close but...

WHERE IS THE NON FREE SOFTWARE?!!

If linux wants to do well for humans, paid, proprietary software NEEDS to exist on the platform. Ubuntu Software Center comes close, but it still kind of sucks.

Also, as a dev, it wasn't until VERY recently that you could even sign up to publish an app that was commercial in nature. It is hard to build a real marketplace when you're asking developers to give away all their work for free so that you can sell more operating systems (or support contracts) without the software dev seeing a dime.

As a software developer I can't feed my kids with free downloads on an open source operating system used by people who don't like paying for software.

Make it easy for devs to build software that people will pay for, then get operating system users who will buy that software for real money and you'll have fixed the Linux Desktop problem.


App stores can suck too, think Blackberry. You don't have to resort to the command line to install packages. Synaptic just needs a little freshen up that's all. I far prefer it to Ubuntu's software centre. A load of crap icons and screenshots just feel like clutter, and don't tell me anything.


Ten years ago I was trying to persuade people that two things were very important for Linux to succeed in the desktop:

1) Distribution of software as a cross-distribution package that just has everything it needs inside a directory, libs and so forth. If you said this N years ago you were an asshole because "duplication of file blabla" and so forth. The typical example was "a user should simply go to some web site of some application, download a file, and click on it to execute the program".

2) Device drivers with a well specified interface between the OS and the hardware, so that different versions of the kernel could use the same driver without issues.

People complained a lot, with technical arguments about why some different approach was better than "1" or "2" by some kind of nerd metric. So the reality for me is that Linux does not succeed on the desktop because it is "run" by people with a square-shaped engineering mind. There is no fix for this.


Molnar's point about the political and procedural difficulties of adding new applications in the official repositories of most distributions is true (although this is changing -- witness Canonical's Ubuntu Software Center, PPAs, and "universe" repositories).

But his reasoning breaks down when he says the relative dearth of commercial applications for the Linux Desktop is due to this issue. That's not true.

The main reason why OSX/iOS, Android, and Windows attract more commercial developers is because those platforms have a much greater installed base!


Another reason is commitment to binary compatibility of a set of components over a time scale of 10 years or more. Linux has historically been relatively unstable, with an ideological focus on source rather than binary compatibility; and fragmented into different distros, which may have different ideas about which versions, configurations etc. of components are pieced together.

That's a perfectly understandable focus - even laudable, if you take the GNU line - but it makes it hard to release reliable, tested binary software. The landscape has too many variables.

I think this is a significantly bigger problem than install base. And I think it would be addressed with an appropriate focus on exactly the problem Ingo is pointing out: the lack of a defined, high quality core that can be relied upon.


You can statically link binaries and distribute those, which companies do when there's enough of a market to bother. That's how Matlab is distributed, for example. I think the bigger problem is that there usually isn't enough of a market; AutoCAD was discontinued on Unix because too many shops were retiring their Unix workstations in favor of Windows desktops, not because Unix software distribution was too hard.


Sure you can statically link binaries, so long as you have a license for all those libraries, or if you write your own UI etc. from the ground up not much removed from the X protocol. And when UI refreshes come around, your UI will look frozen in time in a way that UIs using standard controls don't in Windows.

And how about those libraries that control shared resources? For example, sound output. It's been many years since I bothered to try and use Linux as a desktop OS, but I recall wholesale choices of sound subsystems, with options for one subsystem to emulate another, etc. How well would that mess work with static linking? And these are only the most basic of shared resources; not thinking about file/app association, icon display in file managers, and other really really basic OS services that have seen repeated whole reinvention in Linux.

IME Linux is unusable beyond the command-line, preferably via ssh.


The semi-standard commercial solution for widget toolkits, excluding old software tied to something legacy, is typically to use Qt, which has both LGPL and (reasonably priced) commercial license options. Audio isn't really a problem for statically linked apps not doing anything particularly strange; it works fine in anything I've tried. Even ancient audio APIs in ancient binaries are transparently emulated through something-or-other in a way that "just works".

If you haven't used Linux in "many years", that might be the source of your impressions. I remember mucking with that kind of stuff on Slackware in 1998, but I haven't touched OSS or ALSA or whatever in a decade; it just does its thing under the hood. Audio even works fine when I run Windows applications under Wine!


I use Linux every day; I haven't used desktop Linux much since a fairly unpleasant experience with the original Eee PC (701, in 2007).

But package management (I live with Debian apt-get and friends) is a bane of my life. Generally, the binaries that come when I install a package are too old, or don't have the right compile-time options configured, or there is some dependency missing that I don't have installed and is for some reason missing in the repository due to bitrot. It's usually more reliable to download the source, configure, build and install it the old-fashioned way. That also means fixing build errors, tracking down missing libraries, and generally a whole load of work I wouldn't trust an otherwise fairly competent software engineer to do, never mind the average user.


What distro do you use? It's very much not my experience on Debian (Sid).


You use debian and are complaining about old binaries? I think they consider it a feature.

I use Ubuntu desktop/server etc., and my experiences with the package managers have been flawless.


The Eee PC Linux install was terrible. I installed Ubuntu almost immediately.


Yes; first I reenabled the underlying UI so that I had access to a console; soon after I installed Eeebuntu when it was put together by the community. I never used the original limited UI for more than perhaps 60 minutes total.


> The main reason why OSX/iOS, Android, and Windows attract more commercial developers is because those platforms have a much greater installed base!

A few years ago, Linux had a bigger installed base than both iOS and Android. Both of them outran linux with ease.


Yeah, but they're in a brand new market on brand new devices. People who already have a proprietary PC OS aren't going to switch to Linux out of the blue, but they are going to buy a shiny smart phone.

So the two situations are really not comparable. I still think the main difference is that you have to install Linux yourself. It's not even that installing is difficult--it isn't!--it's that normal people don't even realize it's an option. Your average random laptop buyer who just spent $600 on a laptop from Staples would be able to use Linux perfectly well if that's what his laptop came with--I suspect some wouldn't even realize it wasn't just a different version of Windows. But since his laptop invariably came with Windows, that's what he's going to use, not for any reason but inertia.


> but they're in a brand new market on brand new devices.

But they still had no problem starting with an ecosystem with zero apps. Now they have hundreds of thousands.

> I still think the main difference is that you have to install Linux yourself.

Over the years, I've had several people, whom I talked into checking out Linux, give it up and go back to Windows because they refused to accept that they have to upgrade the whole distribution just to be able to install new versions of single apps. Did you ever try to explain to a Windows user what a "backport" is, and what it's good for, and why there are none on Windows, and why he can install whatever app and whatever version of an app he wants on Windows, but can't on Linux?

> Your average random laptop buyer who just spent $600 on a laptop from Staples would be able to use Linux perfectly well if that's what his laptop came with

For 6 months; then he wouldn't be able to update his apps any more.


Ummm yeah they're also OS's for a completely different platform. That makes a pretty big difference.


This is the most incoherent statement in this entire thread full of non-hackers saying "wouldn't it be nice if there weren't dependencies in software?".

The last thing this discussion needs is troll comments.


""" The main reason why OSX/iOS, Android, and Windows attract more commercial developers is because those platforms have a much greater installed base! """

... and therefore enables developers to make more money!

In my opinion this is one of the most important factors right here, combined with a lot of marketing from the companies which own the "platform".


I have been running the Linux desktop for 5 years already, together with 50 friends, peers etc. It does not quite seem right to compare 90-cent mobile applications, which are 'mostly' a bunch of screens interfaced to a service otherwise provided by a web site, or simple 80s-era arcade games, to the packages in a Linux distro. The author is making a terrible mistake here. Mobile apps lack size and complexity, and who said they are great? They are mostly consumables which perish in a few days or months (I exclude some utilities).

Ubuntu is already trying to be more flexible with the Software Center and applications; however, open source is not a single walled garden which can adhere to the tight, monotonous architectures found in mobile.

In a Linux distro the apps are diverse, and programmed in a multitude of ways in all possible programming languages (from Python to Lisp to C) and environments. That is reality and life, and whatever must be done must be done with awareness of that fact. And no, no one can force the broad, diverse open source world into tight control a la Apple.


>they are mostly consumables which perish in a few days or months (I exclude some utilities).

This is simply false; there are plenty of complex apps with varying degrees of usefulness, but they certainly don't reduce to 80's game clones and few-screen apps. And even if it were true, it's not a result of anything intrinsic to the app model. Also, "tight control a la Apple" indicates that you missed his point.

How many times did you make and sudo install shit on your system to get a basic app that should for all intents and purposes be walled off in a sandbox? Even without sudo, why should an install script have access to my personal files without explicit permission? Did you ever run into a situation where you wanted to install the latest version of an app for your system, but it wasn't in your distro repo, so you decided to build it, only to find out that your repo's GTK+ library was out of date and your options were to rebuild all the GTK+ packages or update the OS to an alpha? Even I gave up at that point - imagine the average user wanting to get the latest feature advertised on his favorite app's site.

I use Linux desktop daily but it can be a hell and you need to know how to wrestle with the system, it's certainly not idiot proof - and it needs to be for mass adoption. But then again "mass adoption" (IMO) shouldn't even be considered a realistic goal, being useful to techies/developers and available on servers is a good objective too.


> This is simply false; there are plenty of complex apps with varying degrees of usefulness, but they certainly don't reduce to 80's game clones and few-screen apps. And even if it were true, it's not a result of anything intrinsic to the app model.

If I write "mostly", I mean "mostly", and if you refute "mostly" with "there are plenty of", you are missing even the basic tenor of a sane discussion. Check the top 100 apps in the app stores/markets, and you will see how wonderful those sandboxed apps are that you and Ingo show us as examples of success. They are "mostly" my-web-page-as-an-app-now hype.

You miss Ingo's point if you don't read his paragraphs carefully: he compares the Android/iOS core and apps to 20+ years of the Linux kernel/desktop and software from various architectures, programming languages and technologies. He even claims their core is stable, missing the point of how young they are. The proud iOS core cannot run on more than a few devices, and the number of deficiencies/issues people have had there is also interestingly high.

Finally, if you really are doing all those compilations to get the latest version of some software, you might reconsider your choice of distro and package manager. Clearly you are doing something wrong there; if you mention what you compile, on which distro and version, someone may be able to understand what you are trying to achieve and tell you what you should do instead.


I agree that having to make and sudo install is essentially impossible for a common PC user, but how does the sandboxing have any impact in user adoption?


State is the big evil in software. Minimize shared state, and many, many problems go away.

Centralized package management with a web of specific versioned dependencies is a huge hairball of state as it is. And when installed applications poke around inside your system, it only takes one or two oversights here and there to mess up this state.

Sandboxing, where feasible, minimizes shared state; usually at the cost of other things, but with modern software and modern resource constraints, our ability to deal with complexity is usually the tightest constraint. Sandboxing reduces the probability that things will break horribly (e.g. installing one app, and finding you need to install a new version of a library, which in turn means you need to reinstall 100+ more apps, which may bring in a new desktop or some other abomination).


That has nothing to do with sandboxing: it's a matter of distributing the libraries with the package itself, either as .so files or as a big statically compiled binary, and using those instead. Nothing forces the developer to use the distro provided libraries.

And frankly, I think this is one of the things where you can't please everyone; personally, I'd rather have dependencies and know that a security fix in a library will fix all applications that use it than have to hope each developer releases an updated version, like on Windows.


That's true to the degree that the implementation of those libraries does not require other things from the rest of the system. If they are just libraries of code, attached to nothing, sure.

But when they need integration with things beyond that? That's when it breaks down. That's when you need interfacing libraries that adapt one version of a library's interface to the next version, and very careful API and ABI design. I see that there's very little reason to take that approach for open source everywhere, but that kind of environment is essential for the stability commercial development needs.

Static libraries aren't the answer. A careful API and ABI interface to the core system, with managed compatibility from release to release, is needed. This is hard work; incredible amounts of effort went into making Windows 95 work as smoothly as it does, and some of the best war stories from Raymond Chen's blog date from this effort.

Things like preserving bugs in the moral equivalent of malloc to accommodate software that used memory after freeing it. This very concept - bug compatibility - is something that I see OSS community in general as being fairly hostile to, thinking that the software with the bug should be fixed, rather than making things work for the user. Of course the buggy software should be fixed; but making things work for the user is more important still.


Actually, your post is making me optimistic.

On Linux, the 4 most common API layers encountered by an application are:

1) system calls
2) standard (and not-so-standard) libraries
3) X11
4) d-bus

#1 is notoriously stable. Linus often posts quite a vehement response to anybody who proposes breaking #1.

#3 was for a long time, TOO stable, it wasn't taking advantage of new developments in hardware and design. The breakup of XFree86 freed that up.

#4 is relatively stable because it's a cross-distribution partnership. They're designed to be the same across multiple distributions which by necessity makes them stable.

The problem area for Linux is #2. Most standard libraries are actually very stable on the micro level. The problem is the vast area they expose: if a single backwards-incompatible change can break your app, it's a problem if there are a million things that can potentially change.

Windows & OS X applications "solve" this problem by shipping their libraries with their applications.

I think Linux has a potentially even better solution: simply allow multiple versions of applications and libraries to be installed simultaneously, a la rubygems & bundler. You'll get all the disk space savings of shared libraries at the point of initial install, which will slowly erode as users install new apps, upgrade some apps but not others. But who cares? Disk space is cheap, but more importantly, if the user cares he can do something about it, only upgrading applications when the whole application upgrades.
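For shared libraries, the ELF soname mechanism already hints at what this would look like on disk; the missing piece is package managers being willing to keep several major versions around. A sketch with a made-up library name and versions:

  # two major versions installed side by side; each binary loads
  # whichever soname it was linked against
  /usr/lib/libfoo.so.1 -> libfoo.so.1.0.5
  /usr/lib/libfoo.so.2 -> libfoo.so.2.3.1

  $ ldd ./old-app | grep libfoo
        libfoo.so.1 => /usr/lib/libfoo.so.1 (0x...)
  $ ldd ./new-app | grep libfoo
        libfoo.so.2 => /usr/lib/libfoo.so.2 (0x...)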


Yes, but then we all have to use Plan 9.


On any of the modern distros there should be no scenario in which a common user should have to run 'make' directly. Even for distros which don't use precompiled binaries the install process is simpler than that.


Ingo's point about distros trying to "own" 20K packages is well-taken; it's simply not possible.

In my own life the 'solution' I have found is to use FreeBSD; I get a stable, well-maintained core with a sharp distinction between core, userland and third-party (the ports system).

I have found the ports system to be a lightweight, agile alternative to GNU/Linux package managers:

When you install FreeBSD you are left with a kernel, standard UNIX command-line utilities and everything you need to hammer the system into a finely-honed tool.

Right now, I'm using FreeBSD exclusively on my servers because I'm willing to accept the trade-offs of Ubuntu (beta 12.04 on my dev box, XUbuntu 11.10 on my netbook) on the desktop; a little instability and fully-automated updates are OK in exchange for not having to fiddle with graphics drivers, sound, Flash, etc.


I'm just an ordinary user, but I'll put in my 2¢.

Nowadays, any developer can create a package for his application (and any of its dependencies) and publish it in a self-hosted repository. Users can easily add such a repository to their system's sources list, and the bureaucracy problem is over. Well, some developers are doing this already - the only thing that holds the rest of them back is either ignorance or the complexity of the packaging process.

I believe there's no need for an Android market clone (which is yet another centralized repository). What users may need is just a directory, pointing to external repos. Ubuntu market seems somehow promising (at least I remember seeing some dialogs like "you need to enable this source to install that package").

Content duplication is not a problem. A real problem is keeping the system up-to-date when you want to update some library, because of an important feature or bugfix. And it's nearly impossible with every application's bundling their own copy of that library, with some copies being actually incompatible forks (and a lot of copies being just different builds - think of different compiler versions - of exactly the same sources).


What you think is easy is actually not easy. Any time you've added one, two, three more steps to a problem after clicking the link in the browser you've already lost. Adding repositories? Already too late. Touching the command line? Sorry, as much as we love it, for most users it's already too late. Going into Aptitude or Synaptic and pasting in the repository URL? Yep. Too late.

This is all compounded by the fact that there is no app bundle. Mac OS X has the bundle and a terrific way to install it: Drag and drop it into the Applications folder, just like we did in the days of the Mac Classic and MacPaint. It hasn't changed. (Well, it did for a while, but thankfully they went back.)

If you say, "Well, that means you end up installing 5 different versions of the same library on the same system" - Who cares? Disk space isn't a priority any more, and from a developer standpoint it makes a lot more sense to target a dependency whose version number I know, instead of a dependency whose version only the package maintainers know for sure. I don't want to be forced to target v1.3 of a library if it's only been tested (by me) on v1.2. It makes for much better application stability for the developer to be in control of dependencies and not these package maintainers.

I want to ship a self-contained bundle of awesomeness, not a dysfunctional shard among ten thousand other shards.
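One possible shape for such a bundle, sketched with made-up names (a tiny launcher script keeps the bundled libraries private to the app):

  myapp-1.2/
    myapp               # launcher, see below
    bin/myapp-real
    lib/libfoo.so.1.2   # the exact versions the developer actually tested
    share/myapp/...

  # the launcher itself:
  #!/bin/sh
  HERE="$(dirname "$(readlink -f "$0")")"
  LD_LIBRARY_PATH="$HERE/lib" exec "$HERE/bin/myapp-real" "$@"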


Some of the bloat could be dealt with by the file system. If you download a library you already have, then simply discard it - and make do with a reference.

What I'd like to see, is the possibility of having multiple versions of software and libraries residing on the OS with ease.

I think these problems are surmountable.


Maybe the experience is different on other systems, but on Fedora with KDE there already is no friction to installing random stuff. If I want to install, say, Google Chrome, which is not in the normal repositories, all I have to do is download the package and open it. A nice GUI program then takes care of the rest for me automatically; the main differences from installing a program on Windows are that it then updates automatically and I have to enter my password.

So really, there is less friction than on OS X--I don't have to drag anything anywhere! In fact, on Firefox, I just download the package and it figures out to open it with the right program automatically. And then it just works.

Of course, if a program is in the repositories it's even easier, but that's a different story...


> Adding repositories? Already too late. Touching the command line?

Nope. As simple as clicking a link with special URL scheme, like `apt+hXXp://archive.canonical.com?package=acroread?dist=feisty?section=commercial`
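For illustration (assuming Ubuntu's apturl helper is installed and registered for the apt: URL scheme; the package name here is just an example), the same handler a browser link would trigger can be poked from a terminal:

  # hypothetical package; apturl asks for confirmation, then installs
  xdg-open "apt:acroread"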

> This is all compounded by the fact that there is no app bundle.

I'm all for the bundles (which single-app repositories, actually, are!), but I want them to be non-monolithic (i.e. contain multiple separate packages).

I don't care about disk space — if I'm that constrained for disk space, that's another story, and one that'll probably never happen to most ordinary users, who have terabytes of storage. But I certainly care about bugs, and if libXYZ 1.2 has a critical one, I want my system to be free of that version ASAP.

And I don't care that you've never tested your awesome app with 1.3 — it's better to be possibly unstable than certainly unstable or, far worse, vulnerable.


What makes package maintainers uniquely qualified to patch dependencies and upgrade them? Either we say Canonical or Red Hat hires the best possible people to watch over their package repositories or we say that a qualified application developer could do just as well. Either way we end up having to trust somebody.

Both package maintainers and developers have an interest in making sure their programs don't introduce vulnerabilities into the system. Therefore, if there's a serious problem with one of their dependencies, vulnerability patching will happen either way.

The distribution maintainers should be in charge of maintaining a core set of low-level dependencies that are needed by many applications. Beyond that they should leave the dependency management to the application developers. Seriously. That would free up so many millions of man-hours of work for say, Canonical, that they could actually make the core system usable to the average user.


"And I don't care that you've never tested your awesome app with 1.3 — it's better to be possibly unstable than certainly unstable or, far worse, vulnerable." Not all library bugs affect all programs though. If you change a library and break things, then the user usually has to wait for the maintainers to fix it, even if the programs would not have exposed the vulnerability. I am thinking of something like a program that uses libPNG to load included images that might break because libPNG has been changed because malicious images could cause a buffer overflow.


This is all compounded by the fact that there is no app bundle.

A "collection" type for installing multiple packages transparently would be nice, but I have to point out that nothing prevents you from doing the exact same thing that you do on Mac OS X, which is to distribute the .so libraries in the same package as the application, or even compile it statically.


I do this at http://pagekite.net - I build both RPMs and DEBs and host my own repos. This works fine for the most part and is relatively easy for technical users to take advantage of. It was a bit of work to get it working (and my packages arguably aren't good enough yet), but it's doable, even for a small shop like ours.

The problem is, this is horribly, horribly insecure.

If you add my repo to your sources.list, I can offer a "security update" to any app on your system, a binary which does anything I want.

Obviously, it would be signed by me and could be traced back to me, and I would never do that ... but it would still be a really bad idea to go around adding repos to keep up with the latest GNOME FartButton Free. Add 1000 repos, and the odds of a successful hack or a stupid packaging mistake breaking your system go up by at least that much.


Package managers certainly need improvements.

Repositories must declare what packages (or, better, package name prefixes, like `foobar-*`) they intend to host, and package managers must restrict them from installing something not from this list.

Then you can, for example, host your own libsqlite3, but it'll be namespaced as foobar-libsqlite3 with some `Duplicated-By: libsqlite3 (tested with >= 3.7.3, <= 3.7.7)`.

[Added after some thought] Or, better, let's just namespace package names, based on DNS. I.e., a repository at sqlite.org can provide org.sqlite/sqlite3, but not org.kernel/linux2.6. Obviously, trusted repositories won't be subject to such restriction.
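Purely as a sketch of the idea (no such syntax exists in apt today), a namespaced repository entry might look like:

  # hypothetical: this repository may only ship packages under its
  # own DNS-derived namespace
  deb [namespace=org.sqlite] http://repo.sqlite.org/debian stable main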


Yes, I do think this could be worked out - the packaging infrastructure we have today is really powerful and could be built upon. No reason to throw out the baby with the bathwater, I just wanted to illustrate some of the obvious drawbacks of the current "state of the art".

I really hope the ideas from the OP get some traction, Ingo makes some very good points which I haven't heard expressed so well before.


UNIX in general never transitioned well from single binary applications that could just be placed in /bin /usr/bin or /usr/local/bin as appropriate to multi-file applications that needed an array of libraries, resource files etc. That coupled with a file system layout that nobody could ever agree on made it a mess once UNIX moved outside the curated environments it was typically found in prior to Linux.

NextStep made a good stab at the problem with bundles (.app, .service, .framework etc), but once you moved outside of the abstraction layer you were right back into the mess. Most Linux distributions seem to want to emulate SunOS circa 1992, and any attempt to "improve" on the solution is drowned out by fundamentalism.

I actually thought the author nailed it in the last paragraph of part 2 of his post. The free software movement needs to start looking forward and not try to emulate what worked 20 years ago. There is definitely potential for a brave organization that is willing to try to tackle the challenge.


My impression is that the problem is more fundamental: a lot of open source and free software which finds its way onto linux is made by people who don't seem to care about the end-user experience. If it works for them, there is no need to improve it.

Not all open software projects are receptive to changes, improvements or bug reports from strangers, so nothing really gets resolved without forking, which adds its own complications.

Of course there is still good open source / free software. It's just hard to come by.


Even more proprietary software that doesn't make it to Linux is written by people who don't seem to care about the end-user experience. I've spent enough time battling horrible programs on both OS X and various versions of Windows to know that this isn't unique to open source software.

Of course, on the proprietary platforms, the core programs--browsers, office, media...etc are all good. But that is true of Linux programs as well. And open source projects--even ones that are not terribly responsive--are still more responsive than most proprietary programs.


I disagree. The main problem is that the Linux desktop doesn't (yet?) have a large, fast-growing installed base, so mass commercial developers don't target it!


So, how does the fact that any person or company can host their own packages or even full repositories that can be added with a single click¹ and are not dependent on hierarchical organizations fit in that?

GNU/Linux distros, at least APT based ones, are perfectly distributed if the person wants to.

¹ If you have apt-url installed, which Ubuntu has by default


I agree that distributed repositories are the way forward.

The only thing holding that back is the atrocious user-interface. Debian urgently needs to fix that and push their apt-infrastructure out of the 1990s.

In short: /etc/apt/sources.list must die.

This is how it must work:

   apt-get install https://foobar.com/debian/squeeze/widget-1.0
A package installed like that must add itself automatically to a proverbial sources.list for future updates. Don't bother me with the housekeeping.

Then add central indexing ('apt-get install widget' is nicer), a web-of-trust ("1234 users have installed packages by this author"), package-signing looks about alright already (except: no, don't make me run gpg).

This can (will and does) co-exist happily alongside the centralized repositories. Someone just needs to implement it and push it through the glacial Debian processes.

And while we're at it, there's no reason 'apt-get install github://foobar' can't be made to work.

tldr; apt needs to absorb homebrew.


But what's the problem with sources.list, as long as the user doesn't have to manage or even know that it exists?

Provide a .deb from your website, use the install script to add your repository to the sources.list. There, the user doesn't have to know or care about that file. And this is all possible - no, easy - to do right now.

This is how it must work:

I disagree, having to run apt-get is too cumbersome for a regular user. But a better way already exists in the form of apt-url. Click a link, have the package downloaded and installed automatically.

This can (will and does) co-exist happily alongside the centralized repositories. Someone just needs to implement it and push it through the glacial Debian processes.

But my point is that most of the infrastructure (support for multiple repositories, one-click installation of third-party packages) already exists. That's why I don't agree that this is the problem with Linux.

And while we're at it, there's no reason 'apt-get install github://foobar' can't be made to work.

I don't see how - there's no standard on GH projects for installation; some projects are installed by simple make/make install, others with easy_install/gems/npm, etc.


But what's the problem with sources.list, as long as the user doesn't have to manage or even know that it exists?

You are right. My point was that I (the user) shouldn't need to know that a file by this name exists. Of course debian is free to maintain the bookkeeping in any way appropriate.

I don't see how - there's no standard on GH projects for installation; some projects are installed by simple make/make install, others with easy_install/gems/npm, etc.

People would check in either the deb or the debian/ meta-files. The latter would be preferable, but the deb package-building process needs to be simplified for developers to consider participating.

The current deb building procedure is a trainwreck, which is in fact a related problem. Homebrew recipes and Gentoo ebuilds are trivial in comparison; there's no reason the deb process needs to be as convoluted as it is.
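For concreteness, the debian/ meta-files in question are roughly these (a minimal debhelper-style sketch with a hypothetical package; most of the convolution lives in the policy and tooling around them):

  debian/
    changelog    # versions and history; drives the package version
    control      # metadata, build and runtime dependencies
    copyright
    compat       # debhelper compatibility level
    rules        # executable makefile; with modern debhelper often just:
                 #   #!/usr/bin/make -f
                 #   %:
                 #           dh $@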


You are broadly describing PPAs. Adding a package from a PPA is a three-line (easily scriptable) process: add-apt-repository <url|ppa-name>, apt-get update, apt-get install <package>. Obviously, that doesn't just work for PPAs, but for all apt repositories.
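Spelled out, with hypothetical PPA and package names:

  sudo add-apt-repository ppa:some-team/some-app
  sudo apt-get update
  sudo apt-get install some-app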

Launchpad does some central indexing ("other versions of this package") for PPAs, but I don't think it's accessible via command line. There is no indication how popular a PPA is, but PPAs are linked to admin user accounts which might help. Packages are signed and the key is auto-imported. Packages co-exist alongside packages from the repos, you can easily switch between different provided versions.

Installing a plain deb is much easier still, but you don't get upgrades (unless the repository is auto-added during install) or signing.


You are broadly describing PPAs.

Yes, indeed I am. The point where PPAs fall short is that the user still needs to add a repository when all he wants is to install a package.

I maintain `apt-get install https://foobar.com/pkg` is where it's at. Feel free to query me all you want ("really add this untrusted repo?", "fetch updates from there?", "trust this key?" etc.), but by all means make it a one-liner. Not three, not two, one.

And have proper procedures for all the little corner cases where PPAs fall down. I.e. when the repo goes away or changes URL, when the signing key changes, etc.

It's a matter of polish. Non-technical users need that polish. And technical users like it, too.


Like I said, the three line process is trivially scriptable, so you could make a one-liner script that works like you describe, ie. install https://foobar.com/repositoryX pkg or install ppa:xorg-edgers/ppa nvidia-graphics-drivers. I suppose you could combine the two parameters into a single URL to make it more familiar for users.
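A rough sketch of such a wrapper (hypothetical name, no error handling beyond set -e):

  #!/bin/sh
  # usage: install-from <ppa:user/name | repository-url> <package>
  set -e
  sudo add-apt-repository "$1"   # may prompt before adding the repo/key
  sudo apt-get update
  sudo apt-get install -y "$2"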

I don't see why it'd have to be integrated into apt-get itself, which does other things. The next step would be to make this a GUI process and to integrate it into apt-url, to make it easy to install stuff from the web. Of course, making it easier to install stuff from untrusted sources also makes it easier for users to install malware, which I reckon is why they haven't made it this easy yet.

If you want to have the ability to pull updates, you need some ___location to query for them, and that ___location is the repository. Even if it's just for a single package. If you want to install a single package without updates, there's deb files.


I don't see why it'd have to be integrated into apt-get itself

That's exactly what I'm talking about. To give just one of many reasons: Because currently 'apt-get update' fails hard as soon as a PPA repo disappears. These corner cases must not exist. This is fundamental infrastructure, save the "easy workarounds" for higher levels.

Apt has been "almost there" for about 10 years now. But "almost" is not good enough - neither for desktop-users, nor for professional deployments with puppet/chef that have to fight the same and similar issues (e.g. idiotic license seed files, key conflicts, daemons auto-starting after installation, unreliable exit codes/error reporting, insufficient logging, insufficient hooks to override defaults).

We're in a state of inertia where everyone is so used to working around the same old bugs that they're not even recognized as bugs anymore. I should absolutely not need a ~80 line puppet manifest or chef cookbook to make apt behave right. Desktop users should absolutely not have to find a bash-script (no matter how small) to be able to seamlessly install third party software.

If you want to have the ability to pull updates, you need some ___location to query for them, and that ___location is the repository. Even if it's just for a single package. If you want to install a single package without updates, there's deb files.

I'll quote pg on this: If you think something is supposed to hurt then you're probably doing it wrong.

Apt is a beautiful concept. Sadly the implementation stopped evolving shortly after becoming "good enough" rather than proceeding to "the best we can possibly make it".


Because currently 'apt-get update' fails hard as soon as a PPA repo disappears.

No, it doesn't:

  W: Failed to fetch http://repository.spotify.com/fake-url-to-fool-apt/dists/stable/non-free/binary-i386/Packages  404  Not Found

  E: Some index files failed to download. They have been ignored, or old ones used instead.
All the other repositories are refreshed as usual. Again, apt-get and the rest of the underlying infrastructure is not the problem. I agree that users should not have to find a script, I'm just saying that's a reasonable way to solve the problem. The script should be part of the distribution, just like add-apt-repository, another script, already is.

(That said, I can agree that integrating this functionality into apt offers a chance of deeper integration in the future. And while it's not the Unix way, it may be that the Unix way has stood in the way of an ideal Unix desktop experience for some time now.)


> tldr; apt needs to absorb homebrew.

An excellent package manager that has existed for 14 years is being told to mimic a largely inferior set of crutches? The world has definitely gone mad.


I disagree. The author seems to forget the fact that packages are only a layer of added ease of installing new software. The act of installing new software from source or binaries distributed by the author is still as free as ever.

I admit that I do install about 90% of my programs as packages, but the problem of central authorities responsible for patching and distributing software and taking too long to do it isn't present in every distribution. I've used Arch Linux for years now, and it solves this problem by separating the packages into an 'official' channel of reliable maintainers testing and releasing new versions on the package system and a user repository where anyone can add packages.

To me, this seems like the optimal solution. Community-maintained packages can be promoted to official ones; from what I can see, new versions are released from testing within days; and if you're not satisfied with how others maintain the packages, building them from the newest versions yourself is almost as easy as installing binaries from the repository, because anyone can use the build and packaging scripts used by the maintainers themselves.


There are "core" linux distros, like Slackware. There are many attempts at autonomous apps distribution like zero-install and openpkg that mostly work, and probably would work fine with a reasonable effort backing them.

[1]: http://0install.net/ [2]: http://www.openpkg.org/

Of course the problem is that the big distros (redhat, debian) can't be bothered to care.


And in fact, with their sysadmin focus, they explicitly are opposed to this method. If your goal isn't a desktop experience but rather protection from every adversary at all times, then only vetting a small core and letting people run various and sundry packages from third parties is flat-out irresponsible. But of course this is what desktop users want -- to run any software they like with no hoops to jump through or update schedules to keep on top of, so they will always be at odds.


Yes, I see the point being made in the original article. I tend to use LTS Ubuntu and conservative distributions (currently PUIAS) and so see a slower rate of change.

Another Hacker News thread is discussing the new release of Audacity.

http://news.ycombinator.com/item?id=3714766

and it occurred to me that I would like to try to compile a statically linked build of Audacity that could work on any version of GNU/Linux from Ubuntu 12.04 down to (say) CentOS 5.7. Just a big binary blob that I could copy and run.

How would I find out how to do this? I've compiled little things before (dwm window manager, qalculate)
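For what it's worth, a rough sketch of the usual compromise (not a tested recipe): fully static GUI builds are rarely practical because of glibc and the graphics stack, so people typically build on the oldest distro they target, statically link only the compiler runtimes, and bundle the remaining shared libraries next to the binary:

  ./configure LDFLAGS="-static-libgcc -static-libstdc++"
  make
  # list the shared libraries the binary picked up; bundle the non-core
  # ones (skip libc, libX11 and friends) into ./lib and launch through a
  # small wrapper script that sets LD_LIBRARY_PATH to that directory
  ldd ./audacity | awk '/=> \// {print $3}'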


It really frustrates me that authors are not responsible for compiling and distributing their own software; only when they are will we have software upgraded regularly and quickly.


I believe proper packaging is quite a complex process, so it scares a lot of developers away.


Isn't creating one of those Windows installers also complicated? To me, it seems even more complicated than packaging software for Linux, yet it has never stopped anyone.


> Isn't creating one of those Windows installers also complicated?

Not for years now, no. The process for creating an .msi for a basic stand-alone app in Visual Studio is pretty simple and well-documented.


which distribution should they compile and distribute (sigh) for? Not to mention basic problems like "should I integrate with GNOME, Unity, KDE or nothing at all?"


There are only 2 popular package formats: deb and rpm.

I don't have any experience with RPM, but to build a package for a reasonably big part of the Debian-based world, you have to set up a build system (pbuilder/cowbuilder) and tell it something like `for DIST in lenny squeeze wheezy sid lucid maverick natty oneiric pangolin; do git-buildpackage ...; done`

The problem is, to get it right one has to find and read TONS of documentation.
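For reference, a rough sketch of what that loop tends to look like with git-buildpackage and cowbuilder (assuming a base image has already been created for each target, e.g. with `DIST=squeeze git-pbuilder create`):

  for DIST in squeeze wheezy lucid oneiric; do
      git-buildpackage --git-pbuilder --git-dist=$DIST
  done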


It'd be a lot more popular if it was as easy as the Android apk export wizard.

My 2c: Improve the tools and the developers will follow.


I don't think the Gnome/Unity/KDE split is nearly as significant as people find it. You don't have to support them all! I use KDE, and programs written for Gnome fit right in. As far as I know, there is no reason at all to depend on some particular environment exclusively; even things like notifications work uniformly across different systems.


Why not all of them? OBS (the Open Build Service) makes this easy. Big projects can run their own OBS instance, and for the rest of us there is the public instance provided by openSUSE. It builds for many versions of all the major distros.


This is basically why Ubuntu has PPAs and what they are trying to turn Software Center into right?


I've been happily using the stable branch of Debian for about 4 years now. Whilst I agree bureaucracy and politics should be avoided where possible, it seems to me that handing everything off to a 3rd party to focus on the "core packages" harms overall system performance.

Time and again I hear people in the lab complain about all these bugs in their applications; running Debian, I can honestly say I don't have this problem. Of course, I don't have the latest software either, but for me that's the price I pay for a stable system.


Most of the complaints here don't even have any suggestions, just a complaint that can be easily translated to "dependencies in software are hard to manage". Well, no shit. How about some interesting ideas around this from the "hackers"?

The ideas that are proposed are mostly things that have already been tried but have failed, either for social/manpower reasons (they would take enormous amounts of effort and time from all involved for little benefit) or for technical ones (they don't work).

A large percentage complain that the distro in question updates too frequently, when there are clearly distros that cater to stability (they just aren't the "cool ones").

Some of the complaints are demanding some mythical OS that allows you to install it once, never have it update or change but still have access to all the newest software. That would be wonderful. No one has figured that out yet, not windows, not mac, not any *nix.

I know bitchy comments aren't helpful, or likely to be well received, but there must be some other people of my ilk still on HN. Now that it's Product Guy News, where should I be going? Where is this story posted with people actually talking about interesting ways to improve things who actually understand the problem and could be called hackers without the technically skilled people laughing?


Part 2, where he discusses his solutions, is more enlightening I think.

He's dead-on about the impassable problem of package updates affecting other packages. Having everything sandboxed with a general permissions system for directories instead of per-file is also better (this is how MacOS wanted to work before OS X). A free and open mesh network with reputation-based security is also the future.

But hey, linux is wide open, if these are the changes that are needed, we will see them.


Isn't the problem that the distributions are downstream of the apps and libraries they install?

Citing the Android and iOS ecosystems is all well and good, but in those ecosystems the OS comes first and the apps are developed downstream. You simply can't have that in a Linux ecosystem.


Why do linux distros keep trying to recreate the gaudy, confusing experiences of the proprietary systems? I want my drivers to work, and a window manager that wraps the file-system - allows interaction with the files. In an ideal world, someone would restructure the four bin directories so that system stuff was in one area, and user stuff was somewhere else so we could easily open what we want by navigating the tree.


Just for another POV -- I love the fact that I can apt-get a version of practically any FOSS project of note, and within a minute or two, I have something that works with the rest of my system. If I need bleeding-edge, I go to the project page and download a later binary or source, but 90% of the time that's not necessary.

For me, it's a perfect combination of a vetted ecosystem (ok, somewhat closed but closed in the way I like -- no crapware, all legit source distros and mostly mature projects) with the ability to go outside that system at any time, at my own risk.

Anyone can set up a repository to add to/compete with Canonical's, and of course they do. So with my Willow Garage repos, I can keep up with their concept of what's stable, etc. It works nearly perfectly, IMSHO.


Isn't the real issue simply that the desktop distributors lack the manpower required to put together a stable desktop release, partly due to the massive fragmentation?

I assume MS and Apple have huge teams dedicated simply to making sure that all of this software works together nicely.

If there were a desktop distro that cost $100 a throw, and that money were re-invested in testing the desktop platform more thoroughly and making it past- and future-proof, wouldn't that solve many problems?


Sounds like he wants a distro-agnostic ports system, one that hides the dirty work of watching compile-time crap fly across the screen and just gives the user a suitable package for their distro.

Either that or he's essentially suggesting we move to static compiled packages, which while tremendously inefficient from a space and security standpoint would alleviate at least some of the headaches of trying to do cross distro binary offerings.


Maybe Android will conquer the desktop then?

All it needs is mouse and keyboard support in the interface and higher-resolution apps, which will come for tablets anyway.


Android has pretty good mouse and keyboard support already. Try one of the Asus Transformers in laptop mode, for instance. Keyboard shortcuts, mouse cursor, two-finger scrolling, all works as you would expect on a desktop.

High-resolution apps are still somewhat lacking because the larger tablets haven't sold that well, but I hope this will improve.


Honestly, if Android came on top of or with a real Linux base so that I could be assured my existing Linux applications and command-line environment will work... I would totally switch.

The new Ubuntu for Android project will take over the world.


Good point. But most of these apps are written because they can make money for the author. There just isn't a large enough user base in desktop Linux to make that money, and hence far fewer people willing to invest the time to write apps. And the much smaller user base is split up between Gnome and KDE and now Unity and Xfce and so on. So there is even less incentive to write apps.


A program that works in KDE will work in Gnome and Unity and Xfce, so that divide is not strictly relevant. In KDE, at the very least, even GTK programs look very good. And, moreover, those programs would also work on Windows and OS X.


Linux is being used by more developers than I have ever seen, and even regular people are finding out about it and trying it out.


The author mostly talks about maintaining packages. Some people say the problem is the missing games and other desktop applications. However, we all know that Linux is the best OS for development. It has the best set of libraries, languages and utilities. Yet why are there not many cool desktop applications?

I believe the answer is that Linux doesn't have an IDE for desktop development. Look at Windows: it has the .NET platform and an IDE. Look at Mac: it has Xcode. And Android uses Eclipse as its IDE.

Most application developers want an IDE, which is what is lacking on Linux. I hope someone makes an IDE for Linux desktop development.


QT Creator for cross-platform. KDevelop for KDE-focused stuff. Not sure what there is for Gnome.

What is wrong with the above?


I didn't know about QT Creator; it looks really good for application development. A lot of people are using Ubuntu for their desktop, including me and my friends. But Ubuntu's landing page doesn't have a tab called "develop" for leading users to develop applications for Ubuntu. So my impression was that Linux is a great OS with the libraries and frameworks necessary for system development, but that it didn't have the tools to build applications. Apparently it does. Linux distributions should encourage users to write applications. The best way to do this would be having the information right on their landing page.


Part II of Ingo Molnar's blog post: http://news.ycombinator.com/item?id=3719719


I have to wonder if people's expectations are just a little too high for something that is free. I think a large number of the people complaining about Linux did not use it when it really, really sucked (back when Slackware and RedHat were the only options).

Linux is a hacker's operating system. When people try to make it into a desktop operating system it starts to suck (Gnome 3).


Honestly, Linux is a lot like the tiptronic setting on my car's gearbox. I've thought it's a cool feature and I'm glad it's there, but really... I never use it 'cos it's kind of fiddly and dumb.


What I like about these posts is the interesting discussion that comes with the article. It seems attention whores and trolls have not yet reached G+; it looks pretty sane for now.

Religious mode on: please, G+, try somehow to make a good karma system to keep the NSR low.



