You can have my mouse and keyboard when you pry them from my cold, dead hands! (Or alternatively: When you invent input devices that are genuinely better for the purposes of inputting huge amounts of text and indicating intent on a large display mounted in an ergonomically suitable position)
I think that our current idea of what a PC is is in its twilight, however. A co-worker bought a brand new Macbook which proceeded to self-destruct (it quite literally melted) 2 weeks after they purchased it. The Apple store turned around a new one in minutes, obviously, but I couldn't help thinking - a year or two from now, all his apps would be storing their data through Apple's cloud. The hardware would be completely fungible in that case, and failure would be an almost non-event compared to how much hassle it is currently.
Yeah, I am really interested in this question: how do mobile computers become capable of much faster data input and more fine-grained control than the current point-and-grunt interfaces allow?
I can actually get quite close to the world record for typing a text message on a Motorola Droid, to the point where I have (in a pinch) done some emergency programming / server configuration on it. With a little more work I can see me getting up to 40WPM on a mini keyboard... but that's never going to replace the 100WPM I can hit on a full-sized one!
For me the big issue is screen ___location and ergonomics. Touch screens are fundamentally incompatible with good ergonomic practices.
I'm not sure I'm convinced by the argument that the cloud is a good thing because hardware failures will have fewer consequences.
Apple sold a laptop that melted two weeks after it was bought. They could, instead, have made a laptop that WOULD NOT melt two weeks after it was bought.
I see the cloud as the biggest threat to individual online freedom and privacy. I will continue to buy big ThinkPads, and avoid the cloud, tablets, and the like, at all costs.
You'd have to be very narrow in your online usage to guarantee your privacy. I wouldn't say privacy is dead, but complete privacy is impossible. One could also argue that the cloud was the major force behind the Arab Spring, so it isn't all bad.
Regarding the laptop, I haven't heard of this being a widespread problem with the Air. Once in a while, everyone comes across a device that has problems, regardless of the manufacturer. What I take away from the story is the quality customer service. Good support almost guarantees a returning customer.
No, I don't think it's great that his laptop melted after two weeks, quite the opposite. I do think it is a good idea to plan for failure, though. One thing I like about Windows (from Vista onwards) is that it is really fault-tolerant. I've been using OS X for about 9 months now, and while programs lock up less frequently, when they do, they sometimes manage to lock up the entire O/S.
Windows used to do this too, but programs on Windows have gone catastrophically wrong often enough that it is now expected and planned for - it's rare that a user space program impacts performance to the point where I cannot kill it.
Whole system? No. X? Yes. And when X crashes, so does every other program I am using, so other than the shorter recovery time, that isn't a lot better than crashing the whole system (OK, I have to admit there is much less chance of catastrophic disk corruption, as all the higher level processes will continue unimpeded).
While I do absolutely love being able to ctrl+alt+F(x) no matter how bad things are and regain control, the fact that Windows has been able to recover from the window manager, graphics card driver, or even graphics card crashing without my music even skipping a beat for years now makes the reliance on X (by far the least stable component at least on my install) feel wholly unnecessary.
> by far the least stable component at least on my install
This is often the result of bad graphics drivers. Would you like to share your setup?
My laptop has that ACPI bug (the BIOS enables ACPI on peripherals but reports back to the OS that they don't support it, and the OS then makes terrible decisions). I had to change my screensaver to a non-3D one to prevent the machine from seizing up when entering low-power mode.
PCs are definitely not going to disappear, but most of the people that don't have a professional need for them will no longer have them in the future.
This is because the majority of the population (that is, everybody except geeks, developers and other people that use PCs for work) don't really need a PC. It's just that, until now, there were no devices other than PCs that would give them everything they wanted - the ability to communicate, play games, watch videos, listen to music, surf the web, read mail etc. - all in one package.
Now, as smartphones and tablets get more and more powerful, it's very likely that people will sooner or later abandon their PCs in favour of devices that offer those same features at a lower price, and that are both simpler to use and more portable than PCs.
PCs will survive, though. It's just that they'll no longer be used by average consumers, but by professionals and enthusiasts.
I think you are spot on. Just as big sound systems aren't really dead in an era of miniaturisation, and vinyl isn't really dead in an era of digitisation, PCs and laptops won't really die in an era of cloudification.
It should be interesting to see the potential metamorphosis of the desktop as well. Does anybody doubt that Apple will eventually release an iPad with a 30-inch screen?
There is one hole in this line of thought. All of those mobile devices, all of the "social spaces": where do they get planned, designed, and programmed? On desktops.
Until we have a mobile device powerful enough that you could use it to build applications for itself, we're not going to see the full feedback loop of better tools leading to better apps leading to better tools.
> where do they get planned, designed, and programmed?
In 2000 or so, I was developing things with Zope. Most of what we did was through the web and it was quite a shocking experience to have almost nothing on my desktop. While Zope development has shifted to filesystem, there are other platforms that took the through-the-web idea and ran with it.
And, BTW, if you give me a terminal, I can run Emacs just about everywhere. Version 22 even runs on VMS.
Why is this a "hole"? One could argue that while such a feedback loop was necessary when computers first appeared, it may not be necessary in a world where computers are ubiquitous.
I don't think anybody is suggesting that PCs will vanish. Mainframes, after all, are certainly beyond their golden age, yet they are still in use today.
Moreover, the penetration of PCs does not need to decrease in order for them to be eclipsed by something else.
Is anyone saying that PCs are going to disappear? This seems to be a common response to this sort of article. I think instead more and more people will rely on their mobile devices, while plenty will still use PCs.
Most mobile devices are, in fact, powerful enough to develop for themselves; what they lack is mostly decent input ergonomics and tools with redesigned UIs. (And in Apple's case, and a few other companies', an excess of lockdown.) But that's not really the point.
I'll be using my laptops for the foreseeable future (seriously, people still use desktops?), but the IBM guy's point is that the mass market is moving on from what we consider "PCs". Geeks will absolutely still have them, other people not so much.
Orders of magnitude? I can believe your desktop is faster than a laptop, but 100x faster seems a stretch. Still, I think there are valid reasons for preferring a desktop. Besides it being faster, you can have bigger screens, etc.
I never understood that line of thinking. I have a wireless keyboard and mouse hooked up to my MacBook. When I hook up my external monitor, I have two screens, giving me more desktop area than a single desktop. I don't see why anyone would want a desktop unless they have serious horsepower requirements.
What kind of development is that, exactly? I bet your IDE runs just as well on my laptop as on your desktop. Does your company not have a build farm? Why not?
An external monitor, keyboard, and mouse make a laptop exactly as ergonomic as any desktop, but still perfectly easy to pick up and go.
It eludes me why any developer would be using anything other than a laptop. Headless machines for VMs and heavy lifting, sure, but why would a desktop be what you're actually "working" on?
Why does a developer need to upgrade the video card in their laptop? What is it they're doing? There are a lot of developers, and a fairly small set of them care about fast graphics.
Why does a developer need to upgrade their processor more than every few years? What is it they're doing? For the relatively rare developer that still has long compiles to deal with, why doesn't their company have a build farm?
Why does a developer need two or more video cards? What is it they're doing? Again, most developers are not in gaming or other high-end graphics work.
My laptop can use two or more optical media drives.
Laptops are not a little black box with zero expandability. Ports exist for a reason.
With a company-issued desktop sitting in your office... Why are you playing games?
Anyway, my laptop is my one and only front-and-center machine, used for development, gaming, videos (why you brought that up, I have no idea, since it takes hardly any power at all), etc., and I have no performance problems.
Well, in my case the answer would be "because I'm bored" (though I actually have a laptop at work and a desktop at home, so I'm playing games on a company-issued laptop in my office).
But there are plenty of developers (on HN in particular) for whom "their machine" isn't just something that sits in their office for work use, but is what they have for personal use as well (maybe they're freelance, maybe they work for their own startup, whatever).
I've seen you reference this multiple times now. Based on my experience as a consultant, I think you are giving most companies (or at least their IT departments) way too much credit.
Well, I've referenced it twice, but it's an important question.
Most companies have no need of build farms; the code their developers build simply isn't that big, and definitely doesn't need the latest hardware to be built quickly.
I would definitely expect companies with massive projects to have build farms, even if the developers have to take matters into their own hands.