Hacker News

Molnar's point about the political and procedural difficulties of adding new applications in the official repositories of most distributions is true (although this is changing -- witness Canonical's Ubuntu Software Center, PPAs, and "universe" repositories).

But his reasoning breaks down when he says the relative dearth of commercial applications for the Linux Desktop is due to this issue. That's not true.

The main reason why OSX/iOS, Android, and Windows attract more commercial developers is because those platforms have a much greater installed base!




Another reason is commitment to binary compatibility of a set of components over a time scale of 10 years or more. Linux has historically been relatively unstable, with an ideological focus on source rather than binary compatibility; and fragmented into different distros, which may have different ideas about which versions, configurations etc. of components are pieced together.

That's a perfectly understandable focus - even laudable, if you take the GNU line - but it makes it hard to release reliable, tested binary software. The landscape has too many variables.

I think this is a significantly bigger problem than install base. And I think it would be addressed with an appropriate focus on exactly the problem Ingo is pointing out: the lack of a defined, high quality core that can be relied upon.


You can statically link binaries and distribute those, which companies do when there's enough of a market to bother. That's how Matlab is distributed, for example. I think the bigger problem is that there usually isn't enough of a market; AutoCAD was discontinued on Unix because too many shops were retiring their Unix workstations in favor of Windows desktops, not because Unix software distribution was too hard.


Sure you can statically link binaries, so long as you have a license for all those libraries, or if you write your own UI etc. from the ground up not much removed from the X protocol. And when UI refreshes come around, your UI will look frozen in time in a way that UIs using standard controls don't in Windows.

And how about those libraries that control shared resources? For example, sound output. It's been many years since I bothered to try and use Linux as a desktop OS, but I recall wholesale choices of sound subsystems, with options for one subsystem to emulate another, etc. How well would that mess work with static linking? And sound is only the most basic of shared resources; that's without getting into file/app association, icon display in file managers, and other really basic OS services that have seen repeated wholesale reinvention in Linux.

IME Linux is unusable beyond the command-line, preferably via ssh.


The semi-standard commercial solution for widget toolkits, excluding old software tied to something legacy, is typically to use Qt, which has both LGPL and (reasonably priced) commercial license options. Audio isn't really a problem for statically linked apps not doing anything particularly strange; it works fine in anything I've tried. Even ancient audio APIs in ancient binaries are transparently emulated through something-or-other in a way that "just works".

If you haven't used Linux in "many years", that might be the source of your impressions. I remember mucking with that kind of stuff on Slackware in 1998, but I haven't touched OSS or ALSA or whatever in a decade; it just does its thing under the hood. Audio even works fine when I run Windows applications under Wine!


I use Linux every day; I haven't used desktop Linux much since a fairly unpleasant experience with the original Eee PC (701, in 2007).

But package management (I live with Debian apt-get and friends) is the bane of my life. Generally, the binaries that come when I install a package are too old, or don't have the right compile-time options configured, or some dependency is missing from the repository due to bitrot. It's usually more reliable to download the source, configure, build and install it the old-fashioned way. That also means fixing build errors, tracking down missing libraries, and generally a whole load of work I wouldn't trust an otherwise fairly competent software engineer to do, never mind the average user.


What distro do you use? It's very much not my experience on Debian (Sid).


You use Debian and are complaining about old binaries? I think they consider it a feature.

I use Ubuntu desktop/server etc., and my experiences with the package managers have been flawless.


The Eee PC Linux install was terrible. I installed Ubuntu almost immediately.


Yes; first I reenabled the underlying UI so that I had access to a console; soon after I installed Eeebuntu when it was put together by the community. I never used the original limited UI for more than perhaps 60 minutes total.


> The main reason why OSX/iOS, Android, and Windows attract more commercial developers is because those platforms have a much greater installed base!

A few years ago, Linux had a bigger installed base than either iOS or Android. Both of them overtook Linux with ease.


Yeah, but they're in a brand new market on brand new devices. People who already have a proprietary PC OS aren't going to switch to Linux out of the blue, but they are going to buy a shiny smart phone.

So the two situations are really not comparable. I still think the main difference is that you have to install Linux yourself. It's not even that installing is difficult--it isn't!--it's that normal people don't even realize it's an option. Your average random laptop buyer who just spent $600 on a laptop from Staples would be able to use Linux perfectly well if that's what his laptop came with--I suspect some wouldn't even realize it wasn't just a different version of Windows. But since his laptop invariably came with Windows, that's what he's going to use, not for any reason but inertia.


> but they're in a brand new market on brand new devices.

But they still had no problem starting with an ecosystem with zero apps. Now they have hundreds of thousands.

> I still think the main difference is that you have to install Linux yourself.

Over the years, I've had several people, whom I talked into checking out Linux, give it up and go back to Windows because they refused to accept that they have to upgrade the whole distribution just to be able to install new versions of single apps. Have you ever tried to explain to a Windows user what a "backport" is, what it's good for, why there are none on Windows, and why he can install whatever app and whatever version of an app he wants on Windows, but can't on Linux?
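For the curious: a backport is just a newer package rebuilt against the stable release. On Debian it's enabled with one extra apt source; a sketch, with `<codename>` standing in for the release name:

```
# /etc/apt/sources.list.d/backports.list
deb http://deb.debian.org/debian <codename>-backports main

# then: apt-get update && apt-get -t <codename>-backports install <package>
```

That's exactly the kind of extra ceremony a Windows user never has to learn.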

> Your average random laptop buyer who just spent $600 on a laptop from Staples would be able to use Linux perfectly well if that's what his laptop came with

For six months; then he wouldn't be able to update his apps any more.


Ummm yeah, they're also OSes for a completely different platform. That makes a pretty big difference.


This is the most incoherent statement in this entire thread full of non-hackers saying "wouldn't it be nice if there weren't dependencies in software?".

The last thing this discussion needs is troll comments.


""" The main reason why OSX/iOS, Android, and Windows attract more commercial developers is because those platforms have a much greater installed base! """

... and therefore enables developers to make more money!

In my opinion that is one of the most important factors, combined with a lot of marketing from the companies that own the "platform".



