Twenty years ago, when I helped cover IT hardware (including AAPL) for a large investment bank, our analyses consistently showed that Apple products were priced comparably to competing products with similar specs.
I agree that Apple Silicon has given Apple an additional leg up on the competition, even aside from the more-than-competitive price.
Agreed. My (former) x86 MacBook was the best out there at that price. The only other laptop I found that was comparable (this was ~10 years ago) was the ThinkPad X1 Carbon (and ThinkPads in general, back when they were still high quality; not sure I would buy one now), but it was similar in price to the MacBook.
I knew the very basics of the story, but hadn't put it together in my mind that kernel 1.0 appeared in early 1994, and not somewhere in the 1991-1993 timeframe as I had vaguely assumed. Certainly, the Red Hat Linux 2.1 (kernel 1.2.13) I installed in December 1995 (making my home 100% Microsoft-free ever since) felt pretty feature-complete.
>At its soul, Apple is a software company that also makes their own hardware. Nokia was a hardware company that also made their own software.
... and bad software, of course. Worse than that, multiple versions of bad software.
Apple is the only company in history to build consistently good hardware and good software and UI. Not IBM, not DEC or the other Seven Dwarfs. It really does go all the way back to the Woz-Jobs duo providing a maniacal focus on UX and one of the most brilliant engineering minds of the century.
Nokia did have software chops, just by a different metric than UI: according to a presentation at my university, they were very deep into testing and verification and had a lot of expertise there.
And in all my years of using Nokia phones I can’t remember a software bug. But of course we wanted more from our phones than just stability, we wanted features and better UI.
> For example, the bullet "scaling the user experience to lower HW specs may be challenging. iPhone mini may be closer to iPod UI" comment still suggests they were stuck in the mindset of the time. They thought it was unlikely that Apple could deliver a horizontal platform; rather, Apple would launch a series of individual phones at different prices, each with bespoke interfaces, just like all the players had been doing, over and over.
Indeed. I referred to it at the time as the 50-model strategy.
I helped cover IT hardware companies including Apple at a bulge-bracket investment bank. Not just Nokia, but the entire phone industry was caught flatfooted by iPhone as willvarfar and anonu said, despite rumors going around the industry. (The joke slide in Jobs' announcement presentation showing an iPod with phone dial was not too far off what we and most people expected.)
Thoughts on the presentation:
* "There is not much coolness left for Motorola" - The day of the announcement, I saw a press release from Motorola come across the wire, in which the company announced yet another phone with a keyboard. I felt pity for the unfortunate souls who had designed it, worked on its launch, and wrote the copy for the press release, and who now had to see their efforts fly into Hurricane iPhone.
* Predictions of lower-priced iPhones - Average iPhone prices of course rose, as opposed to falling. As JSR_FDED said, Apple has always played upmarket. I heard Apple's CFO say at a Citigroup-hosted investor conference that his company could release a $799 computer "but we don't want to".
(That said, it is quite possible to find deals, at least in the US. I got my iPhone 13 by agreeing to pay $200 over 30 months on top of my already super-cheap T-Mobile plan. The iPhone before that, I bought carrier refurbished for $100 from Sprint.)
And of course, there never was an iPhone mini with a fundamentally different UI. Despite the repeated commitment to improving on UI, etc., I guess it would have been too much to ask a company like Nokia, the king of releasing a new model with new UI and new form factor weekly, to imagine that another company would just not play the infinite-SKU game. (Conversely, it's not hard to imagine that had Apple entered the phone market in the 1990s during the years of endless indistinguishable Performa models, it might have tried to play along.)
* The MVNO mention refers to rumors of Apple launching its phone in conjunction with an MVNO. We thought this was quite possible, but because Apple had the credibility to get millions of customers to switch to it as their carrier immediately, not because Apple—of all companies—could not get whatever it wanted from carriers.
* Third-party app support - Most have forgotten that Apple really did expect webapps to be the app experience for iPhone's first year. But even that would have been an improvement over what things were like before iPhone. I speak as one who purchased my share of Palm apps. $20 was the norm for, say, DateBk6 (which, by the way, has at least one function that macOS's Calendar just got with Sequoia).
* "Expect RIM and Palm to suffer" - I never liked using my company-issued Blackberries. I didn't leave Palm until 3GS in 2009; besides DateBk6, I also liked being able to tether my computer to my Palm Treo 700p.
* I'm pretty sure there was no sharing of data revenue or iTunes revenue. Apple got what it wanted from Cingular/AT&T regarding marketing and in-store push without having to preload bloatware or plaster the carrier's brand name all over the device and packaging, and the carrier got the exclusive of the decade. Remember, Deutsche Telekom's decision to sell T-Mobile USA in 2011 came directly from not having iPhone (so that tells you how the presentation's repeated mentions of T-Mobile turned out).
Oh man… I forgot about the software branding on pre-iPhone phones. Everything had the carrier's brand on it, from the boot screen to all the “special apps” and crap. iPhone had none of that, and it absolutely pissed off the carriers. Apple turned them all into dumb pipes and they hated that.
>It's going to be like the glory days of the Tel Aviv binary options scammers, who at one time were 40% of the Israeli finance sector and had good political connections.
When I started at Goldman Sachs 25 years ago, I was told early on of an "Israeli discount" and "Canadian discount"; that is, investors were more skeptical of companies based in those countries.
I wasn't told any more details than that at the time, but I now wonder whether what you describe is the cause.
I played the Linux version the article mentions while at Goldman Sachs; a colleague on the Red Hat coverage team gave me a boxed copy of Corel Linux including the game. The port ran very well on my Red Hat Linux box at home.
In retrospect it was part of a brief flurry of Linux ports of major games. I also got to play Return to Castle Wolfenstein and Neverwinter Nights; in both cases the publishers made Linux clients available for download that used the retail version's assets. Despite the valiant efforts of Wine and related projects, the world would have to wait 15 more years before Proton leveraged Wine technology to bring quasi-native games to Linux, and 20 years before Steam Deck made it the norm or close to it.
That reminds me of 1999, when I threw a party to help my friends modify their Celeron 300A CPUs so they could run dual-socket. My dual 300As running at 450MHz would run StarCraft under WINE faster than Windows could run it, because the consumer Windows of the time couldn't use the second CPU. Under Linux one processor would run the graphics (in X) and the other would run the game mechanics, and it would blaze.
Yes, the dual Celeron 300As, if you could take advantage of both CPUs, were faster than the higher-end CPUs, particularly if you overclocked to 450MHz. My box was stable at 450MHz for around a year, then I had to gradually down-clock it, eventually back to 300. Never really did much to track down why that was, just rolled with it and figured I should be grateful for the overclocking I had.
I also ran a dual Celeron system overclocked to 450MHz - it was great value in 1999. Abit even launched a motherboard that let you run dual Celerons without modifying the processors, the legendary BP6.
This was the first board to let you use unmodified Celerons; the "hack" to make dual CPUs work with those chips was performed at the motherboard level, no CPU pin modifications needed.
The real problem with this setup was that a vanilla Pentium III would run circles around the dual Celerons. I had my Celerons clocked to something ridiculous at one point, like 600MHz, and still could not beat the Pentium.
You are forgetting the massive price difference though. For sure a P3 was great if you had an unlimited budget, but a quick look at pricing sheets for September 1999 shows a 600MHz P3 at ~$650.
The 300MHz Celerons, easily overclockable to 450/500MHz, were only ~$150 each. These prices are in 1999 dollars too; I haven't adjusted for inflation.
It was the value proposition, not the outright performance, that made dual Celeron builds attractive, especially in an age where we had to upgrade far more often than we do today to keep up with the latest trends.
In 1999 I vividly remember not being able to afford a P3 build, which was largely why I ended up with the BP6. The P3 also had significant supply issues throughout its lifespan, which didn't help pricing at retail either.
About a year later, I got a P3-550 that overclocked to 733. Not quite as good an overclock percentage-wise, but I ran that machine for 5 years with no issues.
Well, just envious hate, and only momentarily. Back then, such hacks were harder to find out about. I would have loved to do that hack; I yearned for true multi-CPU.
For some reason I feel like running stuff at home fell out of favour. Or perhaps I just stopped doing it. I would prefer to do it again, but I never have an idea what to do with it, since these days I just stream everything from the internet. And I have plenty of cloud compute for whatever I want to do.
People call it "homelab" or "data hoarding" these days, but yes, easy access to hours and hours of movies and music was "solved" for the average person by the content streaming sites, so there's not as much of a drive for it as there used to be.
It really is a world of difference - XBMC 20+ years ago made your TV a wonder to behold, more powerful than anything anyone else would have (and desirable, too!).
Now a full Plex + Jellyfin + Infuse setup just makes it feel like some sort of knockoff Netflix.
There are still advantages, but they're not as noticeable (the main one being that if you can find it, you can have it, instead of having to search various streaming platforms).
Yes, I remember setting up a system through XBMC that made the TV appear to have cable channels built from everything on my storage. You had channels for different genres, or cartoons, or movies, or whatever, and you could flip through the channels and watch whatever was on. And now you can even watch digitizations of old VHS recordings of television on YouTube if you want. There just seems to be no need to do all that work, and when you do, it never seems terribly nice and half the time things are falling over. It was fun when I was a student, but now I actually want to watch television when I sit in front of the TV, not try to fix whatever broke to make it work again.
Yeah, though one really nice thing I've discovered is that Jellyfin has support for what it calls "home videos", and that + Infuse means the family has access to all those recordings we take but never watch, like kids' recitals, etc.
It was the first Linux I ever used, from a PC magazine CD in 1999. A significantly hacked-up KDE 1.1 w/ integrated Wine. To this day, you can find Corel in the copyright dialogs of a few notable KDE apps, e.g. the file archiver Ark.
I'm now looking back on 25 years of Linux use, 19 of them as a KDE developer, including writing large parts of the Plasma 5/6 shell, 6-7 years on the KDE board, and working on the Steam Deck (which ships with KDE Plasma) as a contractor for a hot minute, which brought gaming back into the picture as well. At least on the personal level it was an impactful product :-)
Same here. The Spanish edition of PC Mag included Corel Linux. It was the most pleasant install experience in a long, long time (next, next, next, finish).
I had the box set; it was the first Linux game I bought. The flurry was largely Loki Games, a porting house. They let me help as a beta tester! I got to test Descent III and MindRover. Next would have been Deus Ex, but they flamed out. One of them, Sam Lantinga, built SDL and I believe is still active.
You probably don't; old Linux binaries are notoriously hard to get to function properly on a modern distribution.
While the kernel interface has remained stable across all those years, userspace libraries have changed quite a lot, so it's much easier to run the Windows version with Wine.
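To make that split concrete, here is a minimal sketch of my own (not taken from the actual ports): the raw syscall below leans on the kernel interface that has stayed stable, while the gnu_get_libc_version() call stands in for the userspace symbols an old binary was linked against, which are what a modern distro typically no longer ships in compatible versions.

    /* Illustration only: stable kernel syscall ABI vs. moving userspace libs.
     * An old game binary still "speaks" the same syscalls, but the shared
     * libraries (libc, SDL, GL loaders, ...) it was linked against are what
     * a modern distribution no longer provides in compatible versions. */
    #define _GNU_SOURCE
    #include <stdio.h>
    #include <unistd.h>
    #include <sys/syscall.h>
    #include <gnu/libc-version.h>

    int main(void)
    {
        /* Stable side: a raw write(2) syscall, covered by the kernel's
         * "we don't break userspace" rule for decades. */
        const char msg[] = "raw write(2) syscall: still works\n";
        syscall(SYS_write, STDOUT_FILENO, msg, sizeof msg - 1);

        /* Moving side: whatever glibc happens to be installed today. A
         * binary from 2000 expects sonames/symbol versions from that era. */
        printf("glibc found at run time: %s\n", gnu_get_libc_version());
        return 0;
    }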
I feel relying on WINE and Proton instead of building a proper GNU/Linux ecosystem will eventually backfire; it hasn't happened already only because, thus far, Microsoft has chosen to ignore it.
However, as Steam vs Xbox slowly escalates, Microsoft might eventually change their stance on the matter: forcing devs to rely on APIs that aren't as easy to copy, offering free licenses for handhelds, pulling all Microsoft-owned studios out of Steam, seeing which company has the bigger budget to spend on lawyers, whatever.
WINE and Proton piggyback on Microsoft's guarantees of Win32 stability. As long as that remains in place (which should be for all intents and purposes forever given MS's customers) they can't really do anything about it.
So, next time you hear the joke about Win32 ABI being the only stable ABI on Linux, remember it's funny because it's true!
Don't forget Windows finally made the Year of the Linux Desktop(tm) a reality: Windows is the best desktop Linux distro (Android gets the mobile Linux distro crown).
Windows' desktop environment is much too lackluster for that. It's uniquely inconsistent (many distinct toolkits with irreconcilable look-and-feel, even in the base system), has poorly organized and not very capable system configuration apps, takes a long time to start up before the desktop becomes usable, is full of nasty dark patterns, and suffers an infestation of ads in many versions.
Besides the many issues with the desktop itself, Windows offers piss-poor filesystem performance for common developer tools, and leaves WSL users to contend with the complexity of a split world thanks to the (very slow) 9pfs shares used to present host filesystems to the guest and vice versa.
And then there are the many nasty and long-lived bugs, from showstopping memory leaks to data loss on the virtual disks of the guests to broken cursor tracking for GUI apps in WSLg...
> It's uniquely inconsistent (many distinct toolkits with irreconcilable look-and-feel, even in the base system)
While I agree that Windows has long since abandoned UI/UX consistency, it's not like that is unique: On desktop Linux I regularly have mixed Qt/KDE, GTK2, GTK3+/libadwaita and Electron (with every JS GUI framework being a different UI/UX experience) GUIs and dialogs. I'm sure libcosmic/iced and others will be added eventually too.
> On desktop Linux I regularly have mixed Qt/KDE, GTK2, GTK3+/libadwaita and Electron (with every JS GUI framework being a different UI/UX experience) GUIs and dialogs.
And you can choose to install GTK+, Qt, and Electron apps on Windows or macOS, too. That has no bearing on the consistency of the desktop environment itself (not on Linux, not on macOS, and not on Windows). That fact is simply not relevant here.
You could point to some specific distros which choose to bundle/preinstall incongruous software—those are operating systems that ship applications based on multiple, inconsistent UI toolkits. But that's neither universal to desktop Linux operating systems nor inherent in them. And many of the cases that do qualify by that definition—for instance, KDE distros that ship a well-integrated Firefox as their browser—are still, on the whole, much more uniform than the Windows UI mess.
> could point to some specific distros which choose to bundle
Why does that matter if that’s not how most users do it? There is no magical dividing line between a distribution and the user choosing to install a random collection of apps on their own.
'Desktop Linux' isn't an operating system but a family or class of operating systems. Linux distros are operating systems. If we are to make meaningful comparisons to macOS and Windows, then we must compare like to like.
But they are inherently different and not really comparable to macOS or Windows so it wouldn’t make a lot of sense.
For instance, where exactly do you draw the line between which app/package/component is part of a Linux distribution and which is third-party? OTOH, it’s more than obvious for proprietary systems.
If the API only has additions, then Microsoft would still need to convince game devs to actually use them (and Valve will point out that if they do, their game will not work on Steam Deck, so there's a clear downside).
If some APIs are removed, it breaks older Windows games. I can't think of any historical API that has been completely removed in this way - even stuff like DirectDraw and DirectPlay is still there even though it has been deprecated for decades.
They could create a new interface that's somehow more efficient, and work with Unreal / Unity / Godot and a few others so it's just a recompile for them, but it's a bigger problem for Wine, perhaps? I'm just thinking out loud.
They “could” also create a new interface that's somehow more efficient for Windows. Oh wait..
I don’t think MS has the attention span for stuff like that. Especially considering the limited short to medium term payoff.
They could buy Unity though. Considering how mismanaged that company is it wouldn’t be such a bad outcome. Of course large acquisitions are very costly and risky these days.
Note this is a huge improvement over the standard model of Linux distributions, where a binary is all but guaranteed to stop working at some point in the future, probably not too distant.
If Linux gaming picks up and gains significant market share, then that is not an issue. Game developers will not use APIs that don’t work on the machines of ~20% of their users (or won’t make them mandatory, anyway).
Considering the alternative (i.e. the native approach) would result in having very few games on Linux anyway, that doesn’t seem so bad.
There's a good chance that if Microsoft doesn't act soon enough, and a lot more devices running SteamOS are released, Proton might become the de facto platform against which many new games are developed, and which engines target.
Agreed. I actually think it might be too late at this point since it takes so long to turn the aircraft carrier.
Microsoft can't realistically deprecate/remove Win32, so all they could do is entice with new APIs. That will work for some games, but especially with the frameworks in place, they'll have to be really good to get people to abandon Steam Deck compatibility to use them.
They bought a lot of companies and are doing their level best at running them into the ground. Xbox is a dying platform. They may try some things that they've tried before (GFWL) but they're not going to succeed this time either.
Kernel-level anti-cheat is a bigger threat to gaming on Linux than anything Microsoft has directly done, but even that is fixable.
> Kernel-level anti-cheat is a bigger threat to gaming on Linux than anything Microsoft has directly done, but even that is fixable.
Agreed. Valve providing some service that gives the local game less info, so it literally doesn't know where other players are until it needs to, might spell the end of wallhacking at least.
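To sketch what that could look like (my own assumptions, not anything Valve has announced): the server simply withholds an opponent from a client's update until that opponent is close enough and passes a line-of-sight check, so a wallhack has nothing hidden to draw.

    /* Server-side interest-management sketch: only replicate opponents a
     * player could plausibly see. line_of_sight() is a stub standing in
     * for a real occlusion query against the level geometry. */
    #include <math.h>
    #include <stdbool.h>
    #include <stdio.h>

    struct player { float x, y; };

    static bool line_of_sight(struct player a, struct player b)
    {
        (void)a; (void)b;
        return true;   /* a real server would ray-cast through the map here */
    }

    /* Decide whether this opponent goes into the client's next update. */
    static bool should_replicate(struct player viewer, struct player other)
    {
        const float max_relevant_dist = 50.0f;   /* tuning knob */
        float dx = other.x - viewer.x, dy = other.y - viewer.y;
        if (sqrtf(dx * dx + dy * dy) > max_relevant_dist)
            return false;                        /* too far: the client never learns of them */
        return line_of_sight(viewer, other);     /* occluded: also never sent */
    }

    int main(void)
    {
        struct player me = { 0.0f, 0.0f };
        struct player nearby = { 10.0f, 5.0f };
        struct player distant = { 200.0f, 0.0f };
        printf("nearby opponent replicated? %s\n", should_replicate(me, nearby) ? "yes" : "no");
        printf("distant opponent replicated? %s\n", should_replicate(me, distant) ? "yes" : "no");
        return 0;
    }

The cheat-resistant part is that the filtering happens on the server: by the time data reaches the (untrusted) client, there is nothing extra for a hack to reveal.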
The Steam Deck is basically the successor to the Steam Machines. The actual hardware didn't go that well, but they laid the foundation in software for what we have now.
So, in a way, the Steam Machines were a great success.
Also, Valve has (for better and worse) far more power and control in the gaming ecosystem than most companies Microsoft has to deal with.
> Microsoft controls Windows and DirectX, Valve only gets to play until Windows landlord allows it.
DirectX has to stay reasonably close to Vulkan. And Vulkan is not an afterthought for graphics card manufacturers, quite unlike OpenGL of yore.
And Win32 (sans Vulkan/DX) is mostly feature-complete for gaming purposes. Manufacturers can just target the current state of Win32 for a decade more, if not even longer.
> It certainly is, in what concerns NVidia, they keep innovating first with Microsoft on DirectX, and then eventually come up with Vulkan extensions.
I don't get that impression. I can't remember the last significant feature that was present in DX first, and not immediately or shortly available in Vulkan.
On the other hand building Linux binaries and keeping them running for years without maintenance has proven far more difficult than emulating Windows.
For an example, track down the ports Loki Games did many years ago and try to get them running on a modern machine. The most reliable way for me has been to install a very old version of Linux (Red Hat 8, note: not RHEL 8) in a VM and run them in there.
It just means Microsoft has put more emphasis on ABI compatibility. This makes sense. In the open source world ABI compatibility is less of an issue because you can just recompile if there are breaking changes. ABI compatibility is far more important in a commercial closed source context where the source may be lost forever when a company shuts down or discontinues a product line.
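To make the recompile point concrete, here's a hedged, single-file sketch (hypothetical structs, not any real library's ABI): a library adds a field to a struct it shares with applications, so a binary built against the old layout misreads it, while a simple rebuild against the new header fixes everything. A closed-source binary whose vendor is gone never gets that rebuild.

    /* Hypothetical ABI break: a library's "event" struct changes layout
     * between versions. Both layouts are 16 bytes here so the demo stays
     * in bounds; the misread fields are the point. */
    #include <stdio.h>

    /* Layout the application binary was compiled against (library v1). */
    struct event_v1 { int type; int x; int y; int reserved; };

    /* Layout the library itself now uses (v2 added a timestamp up front). */
    struct event_v2 { int timestamp; int type; int x; int y; };

    /* Library-side code, built against v2; it trusts the caller's layout. */
    static void lib_print_event(const void *e)
    {
        const struct event_v2 *ev = e;
        printf("type=%d at (%d,%d)\n", ev->type, ev->x, ev->y);
    }

    int main(void)
    {
        /* An application still built against v1 passes a v1-shaped object:
         * every field gets read from the wrong offset. */
        struct event_v1 old_build = { .type = 7, .x = 10, .y = 20, .reserved = 0 };
        lib_print_event(&old_build);   /* prints garbage: type=10 at (20,0) */

        /* Recompiling the application against the v2 header is all it takes. */
        struct event_v2 new_build = { .timestamp = 0, .type = 7, .x = 10, .y = 20 };
        lib_print_event(&new_build);   /* prints: type=7 at (10,20) */
        return 0;
    }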
Even then the rights get dicey when they include third party libraries and development systems. Doom famously had issues with the sound library they used.
Plus, with commercial software it often happens that the code only builds cleanly on one specific ancient version of a closed source compiler in a specifically tweaked build environment that has been lost to the ages. Having the source helps a lot, but it is not a panacea.
Yet both of these issues seem to plague closed-source software more than open source?
Doom wasn't developed with open source in mind, was it?
What open source software "only builds cleanly on one specific ancient version of a closed source compiler in a specifically tweaked build environment that has been lost to the ages"?
On open source projects the build system needs to be reasonable enough that anybody can set it up. There are lots of conventions and even tools to help people. On closed source projects it is just Joe the sysadmin who sets up the machines for everybody working on it. Also, open source projects rarely include requirements like "buy a license of this specific version of this proprietary library and install it on your machine".
Doom had the advantage that it was written by a really excellent team with some standout programmers, and it has had plenty of people maintaining the codebase over the years.
> I feel relying on WINE and Proton instead of building a proper GNU/Linux ecosystem will eventually backfire; it hasn't happened already only because, thus far, Microsoft has chosen to ignore it.
Microsoft can't do shit against WINE/Proton legally, as long as either project steers clear of misappropriated source code and some forms of reverse engineering (Europe's regulations are much more relaxed than in the US).
The problem at the core is that Linux (or, more accurately, the ecosystem around it) lacks a stable set of APIs, or even commonly agreed-upon standards in the first place, as every distribution has "their" way of doing things and only the kernel has an explicit "we don't break userspace" commitment. I distinctly remember a glibc upgrade that went wrong about a decade and a half ago, where I had to spend a whole night getting my server back to a usable state (thank God I eventually managed to coerce the system into downloading a statically compiled busybox...).
Microsoft is doing the opposite of what you're suggesting. Their games are coming to Steam, PlayStation, and Switch. Also, their game division isn't exactly thriving right now. They have a ton of studios, but they are not selling hardware very well right now.
The more time goes on, and the more entrenched SteamOS/Proton becomes, the harder it will be for them to pull off any sort of lock-in to Windows. Even now, in the earliest days of SteamOS, there is blowback when a game does not support the Steam Deck (which means Proton).
I would posit that in this scenario it is Valve who has the deeper pockets. It's a privately owned company, not beholden to the whims of a quarterly-driven revenue cycle, and gaming is a matter of life or death for the organization.
In contrast, gaming is essentially a sideshow for Microsoft. The resources required to push Valve off its pedestal would have higher returns invested elsewhere.
Games aren’t going to suddenly start targeting only updated copies of Windows 11, though… if they target even Windows 10 then they need to be API-compatible with what’s currently there in Windows. It doesn’t matter what new stuff comes out. Just like how we had to keep writing IE6-compatible code for ages for the 5% of people still on Windows XP, even though it kept us from using modern web tech for everyone else.
They can't stop publishers from targeting Steam/Proton, though. The publishers will go where the market is. Sure, maybe they can restrict the version published to whatever store Windows has, but they can't prevent the one distributed on Steam from targeting an older version.
Would that still not be easier than developing something stable and finding ways to force third-party developers to support Linux (when you can't offer them anything in return)?
Was the Linux port made by Loki Software, by any chance? That short-lived company did a lot to make Linux viable for gaming. They developed a bunch of new libraries to help with porting and open-sourced them. SDL was one of them.
Anything erroneously marked as spam cannot be released to the forwarding address, meaning they fail at their one job: forwarding email. Pobox had a great interface for quickly releasing messages to the forwarding address.
Pre-Fastmail, I did not have mail storage space at Pobox; just forwarding ability. I did not use Pobox's own interface for releasing spam mail; I used the standard filter (can't remember the exact name) and almost never saw nonspam in there (not that I checked often).
Post-Fastmail, I still forward from Pobox/Fastmail to the same other Google Workspace account from which I pull mail to my local system with `fetchmail`. I have Fastmail send all mail, spam or not; while the settings UI does not allow setting the spam protection level to "Off" when forwarding is used, the same thing can be achieved by using "Custom" then disabling "Move messages with a score of ___ or higher to Spam". I thus can let Google's spam filter deal with the inflow and, if necessary, manually sort miscategorized mail with my IMAP client.