
There are some factual "gaps" there about how good Snow Leopard was, but I understand the sentiment. As someone who's been a Mac user since System 6 and has been consistently using Macs alongside PCs _daily_ for over 20 years I can say that Apple's software quality (either in terms of polish or just plain QA) has steadily decreased.

It's just that other old-time switchers and I have stopped complaining about it and moved on (taoofmac.com, my blog, was started when I wrote a few very popular switcher guides, and even though I kept using the same ___domain name, I see myself as a UNIX guy, not "just" a Mac user).

For me, Spotlight is no longer anywhere near as useful for finding files (it sometimes forgets app and shortcut names it found perfectly fine five minutes ago), and there is no longer any way to effectively prioritize the results I want (apps, not internet garbage).

Most of the other examples in the article also apply, but to be honest I've been using GNOME in parallel for years now and I consider it to be my "forever desktop" if PC hardware can ever match Apple Silicon (or, most likely, if I want something that is _just a computer_).




> Most of the other examples in the article also apply, but to be honest I've been using GNOME in parallel for years now and I consider it to be my "forever desktop" if PC hardware can ever match Apple Silicon (or, most likely, if I want something that is _just a computer_).

I'm there as well. I've been really enjoying desktop Linux lately, but I can't go back to a non-Apple laptop at this point. There's just nothing else on the market that comes close, they all make some tradeoff I'm not willing to make - either screen, speakers, keyboard, heat/battery life/fan noise, touchpad, etc. Apple is the only one that has the entire package.

There's Asahi, but no Thunderbolt yet, and I'm not sure about the future of that project with the lead burning out and quitting. I just want an Apple Silicon-esque laptop, with no tradeoffs on components, that runs Linux, and there's no OEM out there offering that experience.

So, until that happens I'm staying on the Mac, and even with declining quality it's not all that bad compared to the alternatives yet. I've learned to mostly work around/ignore the odd bugs.


Also, while Apple's software quality has definitely diminished over the years, Windows in the same period has utterly CRATERED. I get along fine with 11 on my gaming PC, but with every single update one feature or another becomes notably broken.


My job gave me an expensive, high-specced laptop with Windows on it. This is the first time I'm stuck using Windows daily. It's W10. With Windows Defender and a bunch of windows open, it slowly becomes unusable. Today it blue-screened on me while I was fixing, yet again (and again and again), the Bluetooth headphones that never get switched to automatically when I turn them on. Forget about having Visual Studio open on it for an extended period of time.

Meanwhile, the 7-year-old laptop with Fedora that I'm typing this on is wonderfully snappy and stable. I've started to get tempted to switch back to a Mac just for some predictability and stability, but I've avoided Macs for years. (And I'd never have to deal with constant line-ending issues.)

All I hear from other co-workers is how their perfectly specced laptops lag with Windows. It's freaking Stockholm Syndrome here!


Oof, I re-read this comment. I would only switch to the Mac because the company didn't set up any of the required security rigamarole to run on their internal network, but otherwise, nah. I dropped that years ago: expensive, disposable machines to run emacs, a terminal and a browser...


Might be. Still, I no longer feel like babysitting Linux on laptops the way I used to (yes, I had another go at it just last year, so I know how well it works), and I will never pay the Apple tax outside of assigned work laptops.


Windows is indeed an execrable shitshow. Every aspect of it assaults the user with incompetence or outright hostility.

First is the endless badgering to log in, LOG IN, LOGGGG INNNNN with an asinine Microsoft account. If you can tolerate that and actually get the OS running, you're wading through a wonderland of UI regressions and defects.

The default hiding and disabling of options is infuriating. Try showing content from your Windows computer on a TV, for example. You plug your HDMI cable in, and you can select the TV as an external monitor in a reasonably logical manner. Great.

But wait... the sound is still coming from the laptop speakers. So you go to Sound in the system settings. Click on the drop-down for available devices. NOPE; the only device is the laptop speakers.

So you start hunting through "advanced settings" or some such BS. And buried in there you find the TV, detected all along, but DISABLED BY DEFAULT. WHY??? Not auto-selecting it for output is one thing, but why is it DISABLED and HIDDEN?

This is the kind of shit I have to talk my parents through over the phone so they can watch their PBS subscription on their TV. The sheer stupidity of today's Windows UI isn't just annoying; it's demoralizing to everyday people who blame THEMSELVES for not being "computer-savvy" or for being slow learners. NO; it's Microsoft's monumental design incompetence and user-hostile behavior.

Microsoft doesn't get the relentless excoriation it deserves for its miserable user experience. There's no excuse for it.


I can't say I've ever had HDMI audio mystery-disabled when I try to use it; that's for sure a new one for me. That said, the entire audio stack is an utter fucking nightmare. Selecting sound devices usually works, unless the game/software you're using either isn't set up to know about it or isn't being told by Windows. Then of course there's the fun game you play with two HDMI displays, where Windows will constantly re-enable the one you've disabled because it doesn't have any fucking speakers on it.


Win 11 must have some sort of contextual HDMI audio switching where it figures out exactly where you want your audio to go and then does the opposite. Because my Win 11 work laptop loves to re-enable HDMI audio and make it the active audio connection despite the fact that neither of my monitors have built in speakers.


I thought I was the only one having issues with HDMI sound; I usually have to unplug and replug a few times before it switches the audio to the TV speakers.


Wow, it’s almost like Linux in the late 1990s/early 2000s now!


One word: Bazzite.


Fedora Atomic distributions in general are great. I recommend Bluefin-dx over Bazzite for developers (they're both GNOME-based Fedora Atomic images from the same group, Universal Blue), because it's really easy to install the packages that Bazzite gives you, and it comes pre-installed with Docker.


I use Bluefin-dx as well, but pointed out Bazzite due to the mention of gaming. It's been rock solid for me for that use case.


Yep Bazzite is great. But the difference between them is mostly just the packages installed. To me it’s easier to install the gaming related packages from Bazzite onto Bluefin.

I had a problem with Docker sockets while installing onto Bazzite, and didn't care enough to look further into it.


Is it comparable to gaming on Windows? Last time I tried, the performance wasn't as good for some games (Deadlock), and it took ages to compile shaders (it takes 30 seconds on Windows with the same specs).


I saw long shader compile times for at least one game last month, might have been Deadlock. I have a Radeon RX 7600 & Ryzen 9 7900X3D for reference.

There is mention on the Arch wiki about enabling multi-threaded compiles, but I have also read that you perhaps don't even need to precompile them now, and might get better performance, as they're JIT-compiled via a different Vulkan extension (VK_EXT_graphics_pipeline_library).

I disabled pre-caching (which affects the compile too, AFAICT) and never noticed any stuttering; possibly past some level of compute it's inconsequential. I also noticed that sometimes the launcher would say "pre-compiling" but actually be downloading the cache, which was a lot slower on my internet.

Certainly on my (very) old Intel system with a GTX 1060, Sekiro would try to recompile shaders on every launch, pegging my system at 99% and running for an hour. I just started skipping it and never really felt it was an issue; Sekiro still ran fine.


It's comparable for nearly every game I've tried, and it takes less than 30 seconds to compile Vulkan shaders on my rig.

That said, I think anything with kernel-level anti-cheat either does not run or runs poorly.


I also recommend Bluefin-DX. Been running it for about a year now and love it.


That’s not a word


Wiki tells me: https://en.wikipedia.org/wiki/Bazzite_Linux

    > Bazzite is named after the mineral, as Fedora Atomic Desktops once had a mineral naming scheme.
More: https://en.wikipedia.org/wiki/Bazzite

    > Bazzite is a beryllium scandium cyclosilicate mineral with chemical formula Be3Sc2Si6O18.


"That's" is two words, contracted.


I have some hope that Framework and AMD can fix some of those issues. Would love to try out their new desktop (because it's a simpler, more tightly integrated thing) and replace my Mac mini -- then wait for Linux power management to improve.


Linux power management is pretty good. The problem is that defaults favor desktop and server performance. On a MacBook Air 11, my custom Linux setup and Mac OS had the same battery autonomy, despite Safari being much more energy efficient.

The real problem is that, just like the grandparent post pointed out, Apple's software quality has been declining. The Tiger-to-Snow-Leopard epoch was incredible: apps were simple, skeuomorphic, and robust.

Right now the whole system feels a lot less coherent, and robustness has declined. IMHO, there are not that many extra features worth adding; they should focus on making all software robust and secure. Robustness should come from better languages that are safe by construction, and Apple can afford to invest in this due to their vertical integration.



The iron law of bureaucracy happens because humans have a finite amount of time to spend doing things. Those dedicated to bureaucratic politics spend their time doing that, so they excel at that, while those dedicated to doing the work have no time for bureaucratic politics.

It's related to why companies with great marketing and fund raising but mediocre or off-the-shelf technology often win over companies with deeper and better tech that's really innovative. Innovation and polishing takes work that subtracts from the time available for fund raising and marketing.


Great insight, thanks for sharing. It strikes me that bureaucracy is inherently self-perpetuating: once established, it rewards compliance over creativity, steadily shifting the culture until innovation becomes the exception rather than the rule.

Perhaps the real challenge isn't balancing innovation and marketing; it's creating a culture that genuinely rewards bold ideas and meaningful risk-taking.


> [Bureaucracy] rewards compliance over creativity

Imho, this is the wrong takeaway from parent's point.

Bureaucracy rewards many things that are actual work and take time. (Networking, politicking, min/max'ing OKRs)

Creativity and innovation are rarely part of the list, because by definition they're less tangible and riskier.

A couple effective methods I've seen to fight the overall trend are (a) instill a culture where people succeed but processes fail (if a risky bet fails then the process goes under the spotlight, not the person) and (b) tie rewards to results that are less min/maxable (10x vs +5%).


It seems most organizations naturally become more risk-averse as they age and grow since the business becomes more well-defined over time and there is more to lose from risky ventures. The culture has to reward meaningful risk-taking even when that risk-taking results in a loss, which can cause issues when people see the guy who lost a bunch of money getting a bonus for trying (not to mention the perverse incentives it may create).


What's the actual argument that will credibly convince the top leaders of Apple to push fixing macOS up the list of priorities?

Because right now it’s clearly so far down, beneath dozens of other priorities, that expecting it to just happen one day seems futile.


IMHO, Mac OS X contributed decisively towards making Apple cool, which was followed by lots of boutique apps and the success of iOS. Losing that critical mass of developers, even if it's a tiny userbase, would worry me if I were a top leader of Apple.


Apple has had a contemptuous attitude towards developers since... the App Store? Since the iPhone came out? The last two decades? They don't seem to care about this.


App Store was a big improvement for developers when it was new, relative to the alternatives.

The things it does may not seem important today, but back then even just my bandwidth costs were a significant percentage of my shareware revenue.

ObjC with manual reference counting wasn't much fun either; while we can blame Apple for choosing ObjC in the first place, they definitely improved things.


Apple was incentivized to deliver a polished App Store DX when it first released, because it meant apps which meant iPhone sales.

Now that the platform is cemented, they don't have an incentive to cater to developers.


This is a ret-con. If you - as a user - were philosophically and inherently against the App Store, then it may seem that way, I guess?

The reality is that there was a long period of time where Apple built up lots of goodwill with a developer ecosystem that exceeded by many orders of magnitude the pre-iPhone OS X indie Mac developer scene.

There were many, many developers that hadn’t even touched a Mac before the iPhone came out, and were happy with Apple, and now are certainly not.


>This is a ret-con...

Another way to see it is that people who programmed for Mac OS already had reasons to be annoyed by Apple (e.g. 64bit Carbon). The iPhone let it get new people, who eventually found out why the pre-iPhone scene felt that way.


And that’s a huge part of the reason why the Vision Pro will never take off.


I disagree - if the Vision Pro had some strong use-cases then developers would hold their nose and make apps for it. The platforms that get apps are the ones where businesses see value in delivering for them. Of course businesses prefer it when making apps is easier (read: cheaper) but this is not a primary driver.


I think the potential high-return use-cases for VR and AR are (1) games, (2) telepresence robot control, (3) smart assistants that label (a) people and (b) stuff in front of you.

Unfortunately:

1) AVP is about 10x too pricy for games.

2) It's not clear if it can beat even the cheapest headsets for anything important for telepresence (higher resolution isn't always important, but can be sometimes).

Regardless, you need the associated telepresence robot, and despite the obvious name, the closest Apple gets to iRobot is if someone bought a vacuum cleaner; Apple doesn't even have the trademark.

3) (a) is creepy, and modern AI assistants are the SOTA for (b) yet still only "neat" rather than actually achieving the AR vision that has been around since at least Microsoft's HoloLens; and because AI assistants are free apps on your phone, they can't justify a €4k headset. Someone would need a fantastic proprietary AI breakthrough to justify it.


They stopped caring about developers when they dropped the price of the developer program and no longer gave you a T-shirt for being one.


I think the best argument is to remind Apple that they aren't selling the OS anymore, so they don't need a new version every year, and that macOS features are not what is pushing Mac sales. People aren't buying the M-series machines because of the new macOS version; they are buying them because of the hardware. The M-series chips are impressive and provide some great benefits that you can't get elsewhere.

And that hardware needs to be coupled with solid software to hook and keep people on this computer. So they can take more time to create more compelling upgrades and sand off more edges.

I think they need to desync all their OSes and focus on providing better releases. There really is no benefit to spending the day updating your Mac, phone, tablet, Apple TV, and HomePod, especially when there are no good reasons to update. I feel like Apple became far too addicted to habit and routine; keeping that has become more important than delivering product. Apple Intelligence is a good example.


> What's the actual argument that will credibly convince the top leaders of Apple to push fixing macOS up the list of priorities?

Unrelenting bad press. People talking about nothing else but the decline of their software quality. We can already see that with the recent debacle which caused executive shuffling at the top of the company.


That shuffling was caused by Apple utterly failing to deliver a major feature, that was a key selling point for the latest generation of their hardware.

"Bad press" for their declining software quality is like people complaining there's no iPhone mini/SE anymore. Apple just doesn't give a fuck. They've joined the rest of the flock at chasing fads and quarterly bottom lines.


What was the major feature? The complete uselessness of “AI” on macOS? I updated and enabled all the AI features and I would ask Siri from my M1 and it failed every time. Would just continuously try with its annoying ping sound and never work. Blew my mind that they let this out.


Yeah I was talking about the "AI". It's such an utter failure that even Gruber has been calling it out.

It was already the same story with AirPower (the wireless charging mat). They pre-announced it, and even tried to upsell it by advertising it on the AirPods packaging. It just turned out physics is ruthless.

TBH I've been increasingly sceptical about voice assistants in the "pre-AI" era. I sold my HomePods and unsubscribed from Apple Music because Siri couldn't even find things in my library.


For quite a few years, Siri (in the car) would respond correctly to "Play playlist <playlist name>". As of about two months ago, it interprets that as meaning it should play some songs of the genre (I have a playlist named "modern").

No idea what changed, but it sucks.


> I sold my HomePods and unsubscribed from Apple Music because Siri couldn't even find things in my library.

I have almost the opposite problem this year. I tell the HomePod to turn the office lights on; it sometimes interprets this as a request to play music, even though my library is actually empty, and its response is therefore to tell me that rather than turn on the lights.

Back in the pandemic, I had the same problem with Alexa. Except it was in the kitchen, so it said (the German equivalent of) "I can't find 'Kitchen' in your Spotify playlist", even though we didn't even have Spotify.


The decreasing effectiveness of machine-local search is just developers fucking up integrations and indexing.

This is a solved problem since ~1970 -- they're just not spending enough time on it.
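It really is old tech; a toy inverted index is only a few lines of Python (purely illustrative, and obviously nothing like Spotlight's actual indexer):

```python
from collections import defaultdict

def build_index(files: dict[str, str]) -> dict[str, set[str]]:
    """Inverted index: token -> set of file names containing it."""
    index = defaultdict(set)
    for name, text in files.items():
        for token in text.lower().split():
            index[token].add(name)
    return index

def search(index: dict[str, set[str]], query: str) -> set[str]:
    """AND-query: files containing every query token."""
    sets = [index.get(tok, set()) for tok in query.lower().split()]
    return set.intersection(*sets) if sets else set()

idx = build_index({
    "notes.txt": "meeting notes about spotlight indexing",
    "todo.txt": "fix spotlight ranking bug",
})
print(search(idx, "spotlight indexing"))  # → {'notes.txt'}
```

Everything hard about real desktop search (ranking, incremental updates, metadata plugins) sits on top of this, which is exactly where the integration bugs live.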


It's all their OS software. The Messages app on 18.3 will just... not open the menu to send a photo attachment about ~10% of the time now...


I’m pretty sure the touch target only covers the text label. Tap anywhere other than the text labels and it does nothing but close the menu. Really bizarre.


Ya, and it's something in maybe the top 3 most-used user actions. Really indefensible.


Apple is addicted to growth. It is as big as it should be, but it acts like an early stage startup always trying to build some new flashy thing to attract the next customer.


It's not Apple, it's capitalism. "Unlimited growth is the ideology of the cancer cell", yet for Apple (or any corporation), it's not good enough to sell 100,000,000 phones. Next year you must sell 105,000,000. And the year after 112,000,000 (not even 110 or your growth is stagnating).

So you get rid of removable batteries so customers have to toss their phones away more often, you gimp other features, you spend more money on advertising than you did actually developing the product (read this bit several times until it sinks in how crazy it is, yet that's how it goes with every major phone, every major movie, etc.), and so on.


In 2016 RedLetterMedia did a breakdown of the movies that year, the top and bottom ten grossing films, and stated that they assumed each advertising budget was the same as the production budget unless they had knowledge of a different number.

I don't doubt that after 2020 the advertising budgets far outstripped the production budgets, multiple times over; I'm curious whether that trend continues now that production isn't hamstrung by COVID restrictions.


I'm sure everyone has seen this 100 times already but it really fits given modern advertising practice of every major company, especially in designing products to fit advertising plans.

There are also entire "industries" designed to shield people who want to find quality content from big 'A' advertising.

https://www.youtube.com/watch?v=tGKsbt5wii0 For context: John Sculley said "Apple was the marketing company of the decade" in the 80s, and kicked Jobs out of Apple.


I love how he uses the word "craftsmanship", something that he understood quite well (considering how closely he worked with people like Bill Atkinson, Andy Hertzfeld, Burrell Smith, etc.).

Today engineers have to put up a fight to do anything resembling craftsmanship.


Do you want to retire?

Capitalism works this way because its customers, the investors, want it to work this way, because growth is how you get compound interest. Investors include anyone with an interest bearing bank deposit, a 401k, stocks, bonds, etc.

No growth means it would no longer be possible for an investment to appreciate.

I think of a similar thing when I see people complaining about how companies don't want to pay good wages. When you go shopping do you buy the $10 product or the $5 essentially equivalent alternative? Most people will buy the $5 one. If you do that, you're putting downward pressure on wages.

It's in your (purely economic) best interest for your wages to be high but everyone else's to be low. That's because when you're a worker you are a seller of labor, while when you're a customer you are an (indirect) buyer of labor.

Everything in economics is like this. Everything is a paradox. Everything is a feedback loop. Every transaction has two parties, and in some cases you are both parties depending on what "hat" you are wearing at the moment.


Growth isn’t necessary for high returns on equity. And it isn’t necessary for the investment to provide a return.

Equity returns ultimately come from risk premiums. (Which are small now in US equities BTW).

I’m invested in a microcap private equity fund that has returned >20-25% for years. They have high returns because they buy firms at 3-4x cashflow. You will get the high returns even with no growth. And with no increase in valuation. The returns are a function of an illiquidity premium.

With Apple explicitly, growth is expected given the valuation level. If it doesn’t grow, the share price will decline. So yes, in their case, firm is certainly under pressure to grow.

I also don't agree with your "best interest for wages to be high and everyone else's lower" claim. That is one aspect, but it is more complicated. Consider the Baumol effect, for starters.


I'm talking about macroeconomics, not micro. Risk premium means there is risk; not everyone gets a return at all. The entire society, as a whole, cannot experience consistent returns unless there is macroeconomic growth. If the pie is not getting bigger, someone has to be losing for someone else to gain.

Things like retirement, 401ks, etc., are society-wide institutions subject to macroeconomic rules.


I buy the $10 one because the margin has to come from somewhere. 9/10, the more expensive product is better.


Okay, if you are so confident in your convictions, convince enough Apple shareholders of this.


Why should I? That's not my job.


b- was probably hinting that the confidence of your conviction may be unsupportable?


It's also not my job to prove my conviction to aggressive internet people.

Regardless: that kind of message doesn't feel like HN-worthy productive discussion.


The actual argument would be people voting with their wallets and moving away from the Apple ecosystem, but that is all but impossible, at least in the USA, due to these bullshit "blue bubbles".


How do blue bubbles make any difference?


For most of the people here they don't. In popular culture and especially among teens and non-technical twenty-somethings there's this absurd "eww green text!" thing. A blue bubble is a status symbol for some reason, even though there's lots of Android phones that cost as much as iPhones.


At this point this is not an argument anymore; it's just a thought-terminating cliché.

Expecting users to change their daily habits in order to marginally improve the operating system of a trillion dollar company feels naive and a bit disrespectful to people who actually use these machines for work.

Even developers… the vast majority of developers ignored Apple for decades (and Apple was also hostile) and it managed to grow despite that.

Might as well ask people to contribute to Gnome or whatever so in the future everyone can go somewhere better. Feels way more feasible.


But the opposite is assuming that Apple has a "responsibility" towards its existing users and has to acknowledge their expectations.

A sentiment which famously led Steve Jobs to respond that he doesn't understand this, because "people pay us to make that decision for them" and "If people like our products they will buy them; if they don't, they won't" [0]

So according to Steve Jobs himself, the only Apple-acknowledged way to disagree with Apple is to NOT buy their products, which, by extension into the services world of today, means to STOP USING their products.

Now Steve Jobs doesn't officially run this company anymore, but I don't see any indication that this philosophy has changed in any way.

[0]: https://www.youtube.com/watch?v=i5f8bqYYwps&t=772s


I don't think that's the opposite. The opposite is admitting that people have more than one reason to choose computers, and "voting with your wallet" only works for easily replaceable items, like groceries, clothing, etc.

Most people are not going to migrate to Android, Windows, Linux or whatever else just to make macOS marginally better.

And it's fine: marginal quality improvements of a product are not the "responsibility" of consumers.


This is absurd. You quite clearly don't experience "blue bubble" envy yourself, because you've so obviously corrupted the sentiment and the argument.

Nobody is saying "gosh, macOS is so damned unstable, but I've gotta use it, because... blue bubbles on my iPhone?"

You’ve just read some story about a company you already hate and are parroting it.


I don't think you're taking their argument in good faith. At least my read on what's being said here is that the psyops lock-in effects that Apple uses are too strong.

It's not just "blue bubbles," but "blue bubbles" seems like a good shorthand to me. It's also things like Handoff, or Universal Control, or getting Messages on both iPhone and Mac seamlessly, or being on the same WiFi network allowing your iPhone/Watch to work as a TV remote for the Apple TV even if you're just visiting a friend. These are features that any platform can and does enable, but that, due to Apple's vertical integration, work seamlessly out of the box across all the product lines, while securing network access in the ways most users will want, creating a continuous buy-in loop wherein the more Apple products you buy, the more incentive there is to buy exclusively Apple.

And it's a collective "you." If your entire family uses exclusively Apple products, then you'll be the only person who can't easily use eg the Apple TV in the living room, or the person "messing up" the group chats with "User reacted with Emoji Heart to [3 paragraph text message]," or the one trying to decide between competing network KVM software platforms so that you can use your tablet when your 12-yo can just set their tablet next to their laptop and get a second screen without any setup. Nevermind that these are all social engineering techniques that only exist BECAUSE Apple chose not to play nice with others, they still socially reinforce a deeper commitment to Apple products with each additional Apple product in the ecosystem.

I say this as someone "stuck in the blue bubble" with eyes open about what's going on. I'll keep picking Apple as long as they're a hardware-oriented company, because their incentives are best aligned with mine for the consumer features they are delivering (for now): consumer integration that sells hardware. It's insidious in its own way, but not like "hardware that sells eyeballs" (Google/Meta) or "business integration that sells compliance" (Microsoft).


Probably nothing, as it seems the major push is to get iPadOS and macOS into parity (and, I assume, to eventually retire macOS completely).


> What's the actual argument that will credibly convince the top leaders of Apple to push fixing macOS up the list of priorities?

That their own products depend on it, because they develop their products on Macs. And that the professional people they pretend to cater to depend on Macs, and are steadily moving away.


> Robustness should come from better languages that are safe by construction.

Nah, robustness comes from the time you can spend refining the product, not from some magic property of a language. That can help, but only a bit. There was no Swift in Snow Leopard, there isn't much Rust in Linux (often none), and there is even less (none) in one of the most stable OSes available, FreeBSD.

They should just release a new version when the product is ready and not when the marketing says to release it.


  > They should just release a new version when the product is ready and not when the marketing says to release it.
Bingo. The yearly WWDC-turned-marketing-extravaganza cycle has kind of ruined it for Apple, I think.


> Linux power management is pretty good

> defaults favor desktop and server performance

Desktops are in S3 half the day consuming ~0 power. During use, electricity costs are so much lower than hardware costs that approximately nobody cares about or even measures the former. Servers have background tasks running at idle priority all day, so power consumption is effectively constant. Laptops and phones are the only platforms where the concept of "Linux power management" makes any sense.


My Mac mini (M1) sips ~6W idle and is completely inaudible. It acts as a desktop whenever I need it to, and as a server 24/7. I only power up my NAS (WoL) for daily backups. The rest of the homelab is for fun and experiments, but mostly gone.

"Idle" x86-64 SOHO servers still eat ~30W with carefully selected parts when correctly tuned, and >60W if you just put together random junk. "Cloud" works because of economies of scale. If there's a future where people own their stuff, minimising power draw is a key step.


The low power draw is definitely not exclusive to Macs, a similar x86 mini PC with Linux will also draw around 5W idle.


Does the mini PC go from zero to eleven though? Can I play BG3, Factorio, or Minecraft on the same hardware? Can I saturate a TB3 port? Transcode video? Run an LLM or text2img? Any of that while remaining responsive, having a video call?

If I already need a powerful machine for a desktop, why would I need a second one just so it can stay up 24/7 to run Miniflux or Syncthing? Less is more.


Yes; for about $1000. Eg:

https://www.bee-link.com/products/beelink-ser9-ai-9-hx-370

I have the SER8 model, and can confirm everything works under Linux. This one has an 80 TOPS AI thing, since you asked about LLMs.


Huh, pretty cool. Would you mind submitting your scores to Geekbench? Can you also test idle power? Genuinely interested.


Currently also looking at Framework+AMD.

I want Mac hardware but Linux software. The other makers' build quality is horrendous, especially in the 13-inch segment, which is my favorite. I'm using a pretty old laptop because there is no replacement right now.

The new Ryzen AI looks really interesting! Sadly there is no Framework shop for me to look at it, and they don't ship to Japan.


Thinkpad line from Lenovo. Amazing build quality, and you can order them with Linux.

I have a P1 Gen 7 and it’s fantastic. It feels premium, and it’s thin, light, powerful, has good connectivity and 4K OLED touch screen. I’d take it over Mac hardware any day.


Aren't the only Thinkpads with displays in the 4K neighborhood 16-inchers? The 14-inch Macbooks are 3024x1964 and have all been like that for a while. I don't know why the PC world (and Linux-ready hardware by extension) undervalues high DPI so much, because it makes it hard to consider going back.


The screen keeps me on macbooks as well (well, and the touchpad, the speakers, and the lack of fan noise).

But it is baffling how 1920x1080 (or 1200p) are still the "standard" elsewhere. If I want an X1 carbon, the best screen you can get at 14" right now is 2880x1800 (2.8k). Spec it with 32GB of RAM and it's clocking in at $2700, for a laptop that still has a worse trackpad, worse sound, and worse screen than a 14" MBP at $2399. And the Ultra7 in the thinkpad still doesn't beat the Mac, and it'll be loud with worse battery life.

There truly is nothing else out there with the same experience as an Apple Silicon MBP or Air.

So my only options for the foreseeable future are to wait for Asahi Linux, or to suck it up and deal with macOS, because at this rate I don't think there will ever be a laptop with the same quality (across all components) as the Mac that can run Linux. The only one that came remotely close is the Surface Laptop 7 with the Snapdragon X Elite, but no Linux on that.


Non-Thinkpad Lenovos have some standouts too. I'm running Debian Stable on an AMD Yoga Slim 7 from a couple of years ago and sure, it's not an Apple, but for the £800 or so I paid for it, it's a really polished machine. Loads of ports, and it's approximately performance-competitive with a Dell XPS13 from about the same time that cost literally twice as much.

The one snag I ran into was that when it was new, supporting the power modes properly needed a mainline kernel rather than the distro default. But in the grand scheme of things that's relatively trivial.

I have an M1 Macbook Pro from work and honestly I'm not tempted to get one for myself. I am tempted by the M3 and M4 beasts as AI machines, but as form factors go I'm just not sold.


The biggest issue Framework has right now is shipping. I can order a ThinkPad practically anywhere. Not so with Framework: they are literally leaving money on the table from what I would assume is their core segment, affluent tech-savvy users trying to get off the planned-obsolescence cycle.


I'm not sure I follow. Your complaint is that Framework only sells direct and not through retailers?


No, there are a lot of us who live in countries that Framework doesn't ship to.


And if you use a mail forwarder, they deny your warranty.


ditto (insert sad puppy face here)


Tell me you're from the US without telling me you're from the US.

Jokes aside, I had to wait years for Framework to finally allow shipping via a friend in Berlin. I think they ship to Sweden now; they seemed to have an unfortunate misunderstanding that they needed to produce a Swedish keyboard and translate their website before shipping here, which of course is poppycock.


I am pretty sure that if you have reached the point of ordering a laptop online from a brand unknown to the general public, you are past the point of needing the physical keys to match the keyboard layout in your OS settings. You could just have blank keys.


To be fair, some international keyboard layouts actually have variations of key shapes and locations. The shape of the Enter key and the cluster around it is the main example. So it's more than just the labels.


I own both ISO and ANSI keyboards on different laptops and use the same software keymap. I don't think it is such an important factor, as I switch from one to the other without thinking about it.


> The other makers build quality is horrendous

Out of curiosity, what are you basing this on? From having spoken to people who manage IT fleets, and being the person regular people ask for advice for what device to get, with the occasional exception (which Apple also had plenty of, cf. the butterfly keyboard), you get what you pay for. A 1k-1.5k+ Asus/Dell/HP/Lenovo will get you decent and good build quality.

The cheapest $500 Acer won't.


> A 1k-1.5k+ Asus/Dell/HP/Lenovo will get you decent and good build quality.

And it still won't be on par with a $999 apple silicon air, or a MBP.

I've deployed latitudes, precisions, and thinkpads. They all still make tradeoffs that you don't have to deal with on the mac.

The X1 Carbon is probably the "best", but even with that you are still getting a 1920x1200 screen unless you spend more than a MBP costs for the 2.8K display (which is still less than the 14" MBP's, and costs more than an equivalently specced M4 Pro). The trackpad is worse, the speakers are worse, battery life is worse, and they're loud under load.

They're all fine for a fleet where the end user isn't the purchaser, which is why they exist, but for an individual that doesn't want tradeoffs (outside of the tradeoff of having to use macOS), there's no other option on the market that comes remotely close to the mac. For someone that wants Apple silicon MBP level hardware but wants to run Linux, there are zero options.

The screen is the most egregious tradeoff, though; the PC world is still averse to HiDPI displays, and even on high-end models 1080p or 1200p is still the standard. I can excuse poor speakers (it is a laptop, after all), and if I really had to I could deal with fan noise, but I shouldn't have to spend more than a MBP costs to get a decent 120Hz HiDPI screen with sufficient brightness and color accuracy.


Fully agree. Asahi is pretty good now though, and the list of missing features continues to shrink.


At work, our Windows devs use expensive XPSes that are complete crap, constantly failing in both hardware and software. As someone who used Latitudes and Precisions when those were the reliable workhorses you describe, the new stuff is just outrageous. (My personal laptop is still an E6440.)

My work machine is an M2 Pro MBP, and except for the shitty input hardware (compared to the golden era of ThinkPads/Latitudes without chiclet keyboards) and macOS being quite bad compared to Linux, it completely trounces the neighbouring Dells that constantly need repairs (mostly USB-C ports and wireless cards failing).


Maybe if you run a fleet that's statistically true. If you're a regular person you can have incredible bad luck with specific models.

Got two "2k" Lenovos at 4 year intervals.

The first one worked fine but that model was known to have a weak hinge. Had to replace it three times.

The second one had a known problem that some units simply stop working with the internal display and the only solution is replacing the motherboard. My unit worked about a week for me. Seller refunded me instead of repairing because it was end of the line and they didn't have replacements.

Got a "2k" Asus ordered now, let's see how that goes :)

Compared to that, even the one emoji-keyboard MacBook Pro that I had worked for years. The keyboard on those models is defective by design and kept degrading, and I still think Cook should take his dried frog pills more regularly, but the rest of the laptop is still working. Not to mention my other, older Apple laptops that are still just fine(tm), just obsolete.


I think price isn't the only thing. PC gaming/consumer laptops lean pretty heavily on price to performance ratios and I think they cut build quality to do it. Business lines like Thinkpad/EliteBook tend to offer worse performance dollar for dollar but they are built better.


Yeah, I happen to need a laptop for that niche between gaming laptop and high end workstation.

Where's a Thinkpad that can run Maya comfortably, for a student? AFAIK the only models that can are the ones with Quadros, at anything but student prices.

So I'm stuck with "gaming" models.

Besides my daughter likes the bling :) If only they could sell me something that doesn't die in a week...


Consider a ThinkPad or Lenovo Yoga Pro. I don't think the difference is that pronounced anymore (maybe it never was), but you always need to look at the premium segment. Somehow people end up comparing budget PC laptops to MacBooks.


Asahi?


Yeah, I heard good things about it. I do a lot of gamey development stuff and x64 makes that easier. But Asahi seems to be catching up a lot recently, maybe I should look at it again! https://news.ycombinator.com/item?id=41799068


Asahi is an adventure. I am in the same camp where I got a MacBook for the hardware, but am really a Linux guy. I got really excited when the fex/muvm patches came out for Asahi, and switched to mainly booting it for a couple months. 80% of what I needed to do worked, but that 20% still wasn't there. It was mainly the little things too:

1. Display output from USB-C didn't work
2. Couldn't run Zotero
3. Couldn't compile Java bioinformatics tools
4. Container architecture mismatches led to catastrophic and hard-to-diagnose bugs

There were things that worked better, too (better task management apps, and working gamepad support come to mind). Overall, even though I only needed those things once or twice a week, the blockers added up and I erased my Asahi partition in the end.

I really appreciate the strides the Asahi project has made (no really, it's tremendous!), and while I would love to say that Linux lets me be most productive, features like Rosetta 2 are integrated that much better into macOS, so I can't help but feel that Asahi is getting the worst of both worlds right now. I'll probably try again this summer and see what has developed.


What do you mean by "more integrated"? It is a regular desktop PC with an APU (like is totally common for office PCs, just bigger) and soldered instead of upgradeable RAM.

It would be kind of funny, but also very sad, if Apple guys mistook the copying of Apple's worst behaviour (producing throwaway devices) for a sign of quality. Though I think we have been there for years now with phones, I wouldn't expect such thinking here.


It is fully designed around the limitations of that particular APU and makes the best of it, without being a generic motherboard.


It is "integrated" in the sense that the processor is an APU with specific memory bus requirements. That's all. It is not a fine-tuned, integrated software-hardware system, and that board is not any better than a generic motherboard would be for a regular processor.

My point is that this system is not integrated in the way Apple fans usually define the word. I'd claim it is not integrated at all. It is a regular PC (but with soldered RAM), which is exactly what Framework announced.

There should be no need to sprinkle Apple marketing BS on it to make it attractive.


I really wish everyone would stop entertaining these borderline crackpot hypotheticals that all rely on the notion of “those damn Apple dummies not getting it!”

It’s absurd.


Thanks, what a nice characterization.

As someone who actually studied human-computer interaction, who has had to work with borderline unusable Macs multiple times in my career, and who has watched relatives utterly fail at just using an iPhone (bought because "it is so much easier", and now they can't even place a call through the car system because it is so buggy), Apple's popularity is absolutely a case where you have to look at external factors like social status. If that translates to "the users are dummies" for you, then that's your interpretation. And yes, examining marketing/status concepts like a bogus "integrated" status absolutely is interesting, hence my intent to clarify whether that is really happening here (plus some criticism, admittedly).

Probably not worth it going further into this though, it will only derail.


As a former Mac user, I'm really happy with my System76 Linux laptops. The only tradeoff is the terrible built-in speakers. My Lemur is lighter and has better battery life than my Macbook Air and has been bulletproof despite my ill treatment. Each of my Macs, however, has had various hardware failures, or the famous keyboard recall on the horrible Touch Bar Macbook Pro. I also prefer matte screens to glossy, so that's a win for me, but ymmv.


The screen quality is why I didn't get a system76 laptop the last time I did a refresh a couple years ago.


I have found this old comment:

* Battery life is a lie, especially since it drains almost as much battery closed as it does open.

...

Overall, I think I am probably going to switch back to a macbook after this, not being able to go a day without charging and your laptop always being on low battery is a bit anxiety inducing.

https://news.ycombinator.com/item?id=38206173


They must have a bug, because my System76 laptops drain way less with the lid closed than my Macbooks.
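A laptop that drains almost as much closed as open is often using s2idle (suspend-to-idle) instead of deep (S3) sleep. A quick sanity check on Linux, assuming the firmware exposes both modes via `/sys/power/mem_sleep` (not all machines do):

```shell
# The active suspend mode is shown in brackets, e.g. "s2idle [deep]".
cat /sys/power/mem_sleep 2>/dev/null || echo "mem_sleep not available"
# Switch to deep sleep for this boot (requires root and firmware support):
#   echo deep | sudo tee /sys/power/mem_sleep
# Persist it by adding mem_sleep_default=deep to the kernel command line.
```

Whether deep sleep actually works well then depends on the firmware; some machines only behave with s2idle.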


This really is exactly how I feel. There are too many tradeoffs to switch to non-Apple hardware at this point. I'd love to run Linux/BSD full-time, as many of the apps that I frequently use on my Mac are FOSS (e.g., R, PyCharm, darktable, etc.) I've been a Mac user since 2002, and Mac OS X served as my gateway to the Linux/BSD world (that, and a short-lived use of RH 6.2 on an old Dell laptop). IMO, macOS really does need a Tiger/Snow Leopard-esque release, but I'm not sure the vast majority of macOS users would even appreciate such a release.


The newer XPS 13 comes with the Snapdragon X Elite now (Qualcomm's answer to Apple Silicon). Curious if anybody here runs Linux on one of those.


That is highly unlikely to happen in the near future (say 2 years).


I feel like this is the first real step towards a Mac like experience on a Linux system.


It's still waiting for good linux support.


Your loss. I haven't been able to tolerate the MacOS experience since Catalina, running GNOME with a Magic Trackpad has felt head-and-shoulders better for the past 3 years at least. Apple Silicon is neat but was never an option for native development in my workflows anyways. The software matters more to me, and MacOS has been sliding down the subscription slopware slope for years now.

I am perfectly happy to use last-gen hardware from Ebay if it runs an OS that isn't begging me to pay for subscriptions and "developer fees" annually. My dignity as a human is well worth it.


Not quite what you're after, but if you want a fanless option that runs full Linux and doesn't use much battery, the new Argon40 CM5 laptop that's being built looks like it could be viable, as long as you'd be happy with that much of a drop in performance and a few Pi-based niggles (no USB-C video, only one PCIe lane for the SSD, etc.)

https://liliputing.com/argon40-is-making-a-raspberry-pi-cm5-...


The reason that keeps me on Windows is that you left gaming and 3D graphics on laptops out of your list.

Metal isn't really on par with Vulkan and DirectX in terms of relevance for graphics programming, and the M chips aren't up to the NVIDIA ecosystem or SYCL, the two major compute APIs for any kind of relevant GPGPU workload, so they don't really matter.

And gaming, well: even though all major engines support Metal, there is a reason the DirectX porting kit is now a thing.

So why pay more for a lesser experience? And then there is the whole issue that macOS doesn't have native support for containers like Windows does (its own ones), and WSL is better integrated and easier to use than the Virtualization framework.


Gaming is pretty great on Linux now. I just finished a little Elden ring session and it still blows my mind that when I close the game my Linux desktop is there behind it. No more dual booting, hopefully will never need windows for anything ever again.


You mean translating Windows and DirectX APIs is great; there is hardly a Linux gaming ecosystem.

Proton is the acknowledgment of Valve's failure to entice game studios, which already target Vulkan/OpenGL ES/OpenSL on the Android NDK, the Switch (which has OpenGL 4.6/Vulkan support), and the PlayStation (Orbis OS being a FreeBSD fork), to also target GNU/Linux.

I'd rather have the real deal, not translations.


As an actual dyed-in-the-wool game developer:

There's no such thing as "native"; all the things you're talking about are translation layers for hardware instructions themselves, and the overhead of software-based translation is significantly less than that of hardware-accelerated virtual machines, which we as an industry love.

The reason for this is that the translations are very cache-friendly and happen in userland, so the performance impact is negligible, and the scheduler on Windows is so poor compared to Linux's that it's even common for games to perform better on Linux than on Windows. Which is crazy when you consider the difference in quality of the GPU drivers.

I understand that you want it to “just work”, but that tends to be the experience anyway.

You can do what you want, it's your life, but this is not a terribly good excuse. Valve's "failure" is essentially rectified.

I will add, though, that it was actually Stadia that made Linux gaming the most feasible: many game engines (all of the ones I worked on) were ported to Linux to run on Stadia, and those ports changed essential elements of the engines that would have been slow or difficult to translate, so when Proton came around quite a lot of the heavy lifting had already gone away. I only say this because Valve gets some credit for a lot of work our engine programmers did to make Linux viable.


> and the scheduler on Windows is so poor compared to Linux ones that it’s even common for games to perform better on Linux than on Windows..

I play most of my games in a window and switch away a lot. A million years ago when I was still playing world of warcraft, the system overall was much more responsive on the same hardware with wow on wine on linux than with wow natively running on windows :)

> it’s actually Stadia that made linux gaming the most feasible

Stadia was the most predatory gaming offering aside from IAP games, sorry. Buy your games again on top of the subscription? Lose them when Google cancels the service? No thanks.

Nvidia's GeForce Now was a lot more honest. Pay for the GPU and streaming, access your owned games from Steam. I'm not using it any more so I don't know how honest they still are, but I did for like a year and it was fine(tm).

The fact that Stadia advanced wine compatibility is great, but technical reasons aren't the only reasons that make a service useful to your customers.


OP is talking about Google (Stadia) throwing money at the problem and incentivizing game engine companies to better support Linux. They're not talking about how pro- or anti-consumer the tech was.


I know and even agree with them. I'm also surprised that Stadia was useful for something...


So how many of those ported game engines are actually making a difference in GNU/Linux gaming today?

There is certainly such a thing as native: one thing is the platform the APIs were originally designed for and battle-tested on, and the other is another platform emulating/translating them by reverse-engineering their behaviours with varying degrees of success.

Valve's luck is that so far Microsoft/Xbox Gaming has decided to turn a blind eye to Proton, and it will run out when Microsoft decides it has gone on long enough.


> So how many of those ported game engines are actually making a different on GNU/Linux gaming today?

Not sure, Unreal Engine is pretty popular though and Snowdrop is increasingly common for Ubisoft titles.

https://www.protondb.com/app/2842040 https://www.protondb.com/app/2840770/ https://www.protondb.com/app/365590


> https://www.protondb.com/app/2842040

Star Wars Outlaws

Natively Supports: Windows only

> https://www.protondb.com/app/2840770/

Avatar: Frontiers of Pandora

Natively Supports: Windows only

> https://www.protondb.com/app/365590

Tom Clancy’s The Division

Natively Supports: Windows only

----

You were saying?


I know, I worked on those games.

Specifically, I worked on those games so I know what they natively support and how things transpired behind the scenes.

Proton has absolutely no hope of working without the changes we made because of Stadia; the code we wrote was deeply hooked into Windows, and we made more generic variants of many things.

The Division 1 PS4 release was significantly shimmed underneath compared to the Win32 and Xbox releases. This became much less true over time, as porting the renderer to Linux (specifically Debian) made us genericise issues across the OSes, and when Div 2 shipped we had a lot more in common across the releases; we didn't rely on deep hooks into Microsoft APIs as much.


> this became much less true over time as porting the renderer to linux (specifically debian)

Strange how you ported the renderer to Debian, and yet you couldn't even find a link to a game that has native Linux support.

Was there ever a port?

> Proton has absolutely no hope of working without the changes

You keep saying this as the absolute truth, and yet at the time when Stadia launched Proton already had 5k working games under its belt.

Strange how Stadia is this monumental achievement without which Linux gaming wouldn't have happened, according to you... and yet no one ever mentions Stadia contributing any code to any of the constituent parts of what makes Proton tick, apart from the changes that engines supposedly made to work on yet another game-streaming platform.


I don’t know how to say this without being unkind.

There is a functioning version of The Division 1, Division 2, Avatar and Star Wars outlaws that run on Linux internally at Ubisoft.

Nobody will release it because it can’t be reasonably QA’d. (Stadia was also very hard to QA, but possible, as it was a stable target and development was essentially funded).

I'm not sure what your problem is; I said, as clearly as I can, that architectural changes to the engine were necessary for Proton.

I know this for an absolute fact, because Proton was a topic when I worked on those games, and it was not until Stadia (codename Yeti) was on the roadmap, and our rendering architect lost all his hair working on it, that Proton started to even function slightly.

I’m not shilling for Stadia - there’s nothing to shill for, it is dead.

Get over yourself, if you don’t like the truth then don’t start going in on me because my reality does not match your fantasy. Sometimes corporations do things accidentally that push other things forward unintentionally.

I just want to share my thanks to Stadia because I know for a concrete fucking fact that at least some of the AAA games I worked on would not function at all on Linux without Stadias commercial interference.


> I’m not sure what your problem is

All I'm saying is that the "it's actually Stadia that made Linux gaming the most feasible" statement is at best contentious, because in reality gaming on Linux had already been made (more) feasible by the time Stadia launched.

And Stadia used the same tech without ever giving back to Proton at all (at least nothing I can quickly discover). So the vast majority of the work on Proton was done by Valve, which you dismissed with "when Proton came around" (it came around before Stadia) and "quite a lot of heavy lifting had gone away" (Valve did most of the heavy lifting).

That's the extent of my "problem".

> at least some of the AAA games I worked on would not function at all on Linux without Stadias commercial interference.

So, not "actually Stadia that made gaming feasible on Linux" but "because Stadia used all the same tech, and there were possible commercial incentives early on until Google completely dropped the ball, bigger studios also invested in compatibility with the tech stack"


You’ve taken a weird position here.

Stadia did a lot to help by being a stable target and by being seen as commercially viable. Google also did a lot to aid developers, not just financially.

That they didn't contribute code to Proton doesn't factor in at all; I just hate to see people not get their dues for their part in the proliferation of Linux gaming, because I saw it first hand.

You are labouring under the delusion that I've implied Proton did nothing. No: they leveraged a lot of existing technology and put in a lot of polish. They were helped by Stadia, by Wine, by DXVK, and by others.

They didn’t do it alone, that doesn’t minimise their contribution, it contextualises them.

Also: Stadia ports of games were native; they did not use Proton. It was architectural changes to the games themselves that made Proton work better, not Google making Proton itself function better.

That Proton was running some games is a weird revisionist take; very few AAA games ran at all, those that did were super old, and there were always some crazy weird bugs. Proton got better, but AAA games also coalesced into conforming to linux-y paradigms underneath, so support got better much quicker than expected. You can even see this if you track the “gold” released games over years: some of the worst-supported games for Proton are from 2015-16, before Stadia but after game complexity started rocketing up with the next-gen game engines of the day.

Hope that helps, because honestly this conversation is like talking to a brick wall.


> They didn’t do it alone, that doesn’t minimise their contribution, it contextualises them.

Oh, you very much minimised their contribution, from "when Proton came" (again, Proton came before Stadia) to "Stadia made gaming feasible on Linux" (when Proton made it feasible before Stadia).

> Also: Stadia ports of games were native, they did not use proton- it was architecture changes of the games themselves that made proton work better- not Google making proton itself function better.

So, Stadia games were Linux ports. But as a result there are still literally no Linux ports: none of the tech behind Stadia ever made it back into the software behind Proton. And "native Stadia ports" are somehow responsible for more games that target Windows and DirectX running better via Proton.

> That proton was running some games is a weird revisionist take

Funny to hear this coming from a revisionist. I literally provided you with links, which you carefully ignored:

--- start quote ---

A look over the ProtonDB reports for June 2019, over 5.5K games reported to work with Steam Play

https://www.gamingonlinux.com/2019/07/a-look-over-the-proton...

--- end quote ---

> You can even see this if you track the “gold” released games over years, some of the worst supported games for Proton are from 2015-16; before stadia but after game complexity started rocketing up with next game engines of the day.

Or because the actual heavy lifting that Valve did with Proton paid off, and not the nebulous "native ports" and code that never saw the light of day.

> because honestly this conversation is like talking to a brick wall.

Indeed it is.


Yet most games don't make use of Unreal's GNU/Linux targeting capabilities; they use Proton instead.


Unreal can target Linux, sure, but not all of the plugins you might use will, nor any of your own plugins.

Unreal is almost worse because their first party tools (UGS, Horde) will not work on Linux, so you have to treat linux as a console, and honestly the market share isn't there to justify it.


Which kind of validates the point of Valve's "success" in a Linux ecosystem.


Speaking from experience, Helldivers 2 and Monster Hunter Wilds both ran better on Linux from day one, before any special fixes, and still do. I'm not sure what "original design and battle testing" is worth if the underlying kernel and/or OS is a mess.


Stadia's impact on gaming in general is next to zero. And given that the vast majority of gaming on Linux is happening via Proton, its impact on gaming on Linux is similarly next to zero.


What games have you made to justify this statement?

I worked closely with productions using proprietary game engines, I feel qualified in stating that Stadia had an outsized impact on our development process in a way that helped proton succeed.

That you don’t see it as an end user, is exactly my point.


> What games have you made?

You don't have to be a chef to judge what's coming out of the kitchen.

What is the objective impact of Stadia which at its height had a whopping 307 titles [1]? At the time of writing ProtonDB lists 6806 titles as "platinum, works perfectly out of the box" and 4839 games as "gold, works perfectly after tweaks". Steam Deck alone has almost 5x the number of games with "verified" status [2].

What games are being made for Linux thanks to Stadia, and don't just target DirectX and run through Proton? How many Stadia games were ported to Linux thanks to Stadia?

Also, to put things into perspective. Proton was launched in 2018. Stadia was launched in 2019.

In 2019 there were already over 5000 games that worked on Proton. [3]

In 2022 there already were more games with verified status for Steam Deck than there were games for Stadia, and 8 times more games verified to work by users [4]. Stadia shutdown was announced half a year after the article at [4].

Stadia had zero impact on gaming in general, and on gaming on Linux in particular, as judged by the results and objective reality. Even the games you showed as examples don't support Linux; they only target Windows and are only playable on Linux through Proton [5]

> I feel qualified in stating that Stadia had an outsized impact on our development process in a way that helped proton succeed.

> That you don’t see it as an end user, is exactly my point.

It's strange to claim things like "when Proton came along" when Proton was there before Stadia and already had over 5k games working in the year when Stadia only just launched.

It's strange to claim outsized impact on development process when there are no outcomes targeting anything even remotely close to Linux development, with studios targeting Windows as they have always done.

It's strange to claim Stadia had an outsized impact when none of the work translated into any games outside Stadia, and when Stadia did not contribute any significant work to the tech that runs Proton. In 2022 they even started work on their own emulation layer, which went nowhere and AFAIK never contributed to anything [6]

It's strange to claim that "it's actually Stadia that made Linux gaming feasible" when there's literally no visible or measurable impact anywhere for any claim you make. Beyond "just trust me".

[1] According to https://www.mobygames.com/platform/stadia/ According to wikipedia, at the time of shutting down it had 280 games, https://en.wikipedia.org/wiki/List_of_Stadia_games

[2] https://www.protondb.com/dashboard

[3] https://www.gamingonlinux.com/2019/07/a-look-over-the-proton...

[4] https://www.protondb.com/news/how-many-games-work-on-linux-a...

[5] https://news.ycombinator.com/item?id=43503018

[6] https://www.gamingonlinux.com/2022/03/google-talk-about-thei...


You are literally arguing that your ignorance is as valid as my experience. And you’re arguing that you didn't see the impact; which was kinda my entire point - there was impact beyond what was visible that propelled Proton forward.

You don’t know how the sausage is made just because you ate a hotdog.

Maybe you should consider things more carefully before making yourself look like an idiot on the internet and simultaneously raising my blood pressure.


Strange take. Proton is an acknowledgment that the Windows APIs are the de facto standard for gaming. Not sure why the runtime matters. Some games even run better. Not sure why that's not the "real deal", but whatever, I'm glad you're happy with your spyware gaming OS.


Not really, I'd rather play on the platform they were designed for in the first place.


Do you own a Playstation? :)

If you're playing the likes of Fromsoft/Resident Evil/Kojima games on a PC, be it Windows or Linux, you're not playing on the platform those games were designed for.


The problem with your reasoning is that Windows/PC doesn't need to emulate Orbis OS and libGNM; Sony's engines also support DirectX and Win32 directly.


"supports" as in I see articles in the PC gaming press about technical problems with From/Kojima games a year after I've finished said games on console with zero issues.


Where is Windows Proton like for Playstation APIs?

"Technical issues" has many meanings.


The point is, "the platform those games were designed for" is the Playstation API for some titles. So you'll get the best experience on there.

Unless you play benchmarks instead of games, and care about 8k/1200 fps of course.


The point is, the game uses the platform APIs on the target OS, and doesn't need to emulate APIs from 3rd-party platforms.


I'd rather use Linux and game with an imperfect translation layer, than put up with Windows.

Proton is a lesser implementation of Windows API, sure, but Windows itself is a lesser implementation of an operating system for power users.


It's not really a failure. The diverse ecosystem of Linux distributions brings a level of complexity. The only way to support it long term is to have your team continuously update and release builds of the game, which is an impossible task to ask of a lot of studios.

The initial approach of runtimes did help, but it still has its limitations.

If a studio now just needs to test their game under a runtime+Proton, the same way they would test a version of Windows, to ensure it works under Linux, it's a win/win situation. Proton becomes the abstraction over the complex and diverse ecosystem of Linux, which is both its strength and weakness.

Another solution would have been everybody using the exact same distribution which would have been way worse in my opinion.

And who knows, maybe one day Proton/Wine would be the Windows userland reference and Windows would just be an implementation of it :D


So it is not really a failure when the solution is to adopt Windows and DirectX APIs through a translation layer?

I thought only Apple had a distortion field.


It's a complete failure across the board to create any compelling graphics APIs for desktop platforms (both Linux and Mac) beyond DirectX.


That’s not the goal though. The goal is to play games on Linux. If Valve’s goal was to end up with a Linux-specific graphics API for most games that run on Linux, then they probably would have tried to do so.


Is it a failure when everyone writes javascript/html/css instead of doing native applications for non gaming?

Most of HN seems to think using a web browser as a translation layer is a good idea, yet they complain when games use a translation layer.


Is it a failure when everyone writes javascript/html/css instead of doing native applications for non gaming?

Yes?


Yes, definitely, that is why now we have ChromeOS developers instead of Web developers.


You better have made this comment via a native windows hacker news desktop application.


I would gladly have used one, if it existed without being a web widget wrapper.

I miss the days of native apps with Internet protocols, and USENET discussions.


Hacker news is a web site, not an application.

A web site makes for a crap application and the reverse.


When I was gaming on Linux, every game with a native version worked better using the Windows version in proton. I think the only exception was Factorio.


Gaming/WSL kept me on Windows for a lot of the last decade, but after Windows 10 was EOL'd and Windows started turning into ad/spyware, I finally gave it up over a year ago after 25+ years on Windows desktops.

Anyway, Linux is liberating. Fedora Desktop is great: no ads in the OS, and a software store/installer I actually like to use, curated by usefulness instead of scam apps. All the Windows Steam games I frequently use just worked; I have to log in to X11 for one title (MK11), but everything else runs in the default Wayland desktop. I'll still check protondb.com before purchasing new games to make sure there'll be no issues, though. Thanks to Docker, JetBrains IDEs, and the fact that most daily apps I use are cross-platform desktop web apps (e.g. VS Code, Discord, Obsidian, etc.), I was able to run everything I wanted to.

The command-line is also super charged in Linux starting with a GPU-accelerated Gnome terminal/ptyxis and Ghostty running Oh My Zsh that's enhanced with productivity tools like fzf, eza, bat, zoxide and starship. There's also awesome tools like lazydocker, lazygit, btop and neovim pushing the limits of what's possible in a terminal UI and distrobox which lets me easily run Ubuntu VMs to install experimental software without impacting my Fedora Desktop.

Image editing is the one area still lacking in Linux. On Windows I used Affinity Designer/Photo and Paint.NET for quick edits. On macOS I use Affinity & Pixelmator. On Linux we have to choose between Pinta (a Paint.NET port), Krita and GIMP, which are weaker and less intuitive alternatives. But with the new major release of GIMP 3 and having just discovered photopea.com, things are starting to look up.


I hardly find anything interesting about the command line. I grew up in a time when the command line was the only way to interact with home computers, so I fail to see the appeal of staying stuck in an early-1980s computing model.

Xerox PARC is the future many of us want to be in, not PDP-11 clones.


Weird flex; most commands, utilities, server software and remote tools are going to be run from the command line. All our system administration of remote servers uses the command line as well, since we've been deploying exclusively to Linux for 10+ years.

Sure you can happily avoid the command-line with a Linux Desktop and GUI Apps, although as a developer I don't see how I could avoid using the terminal. Even on Windows I was using WSL a lot, it's just uncanny valley and slow compared to a real Linux terminal.


> Weird flex; most commands, utilities, server software and remote tools are going to be run from the command line.

It's not a weird flex. Weird flex is this: "The command-line is also super charged in Linux starting with a GPU-accelerated Gnome terminal/ptyxis and Ghostty running Oh My Zsh" and then listing a bunch of obscure personal preference tools that follow trends du jour.


That’s not a flex - it requires no skill to install software. They’re just some of the better tools you can install to boost productivity in Linux terminals. I doubt they’re obscure to any Linux CLI user who has spent time improving the default OOB UX of bash terminals.

And you can just alias them, so you keep using the core utility names.
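For what it's worth, the aliasing mentioned above is just a few lines of shell config; a minimal sketch for ~/.zshrc, assuming eza, bat, zoxide and starship are installed (guarded so a missing tool doesn't break the shell):

```shell
# ~/.zshrc - map the classic names onto the modern tools,
# but only when the replacement is actually installed
command -v eza >/dev/null && alias ls='eza --group-directories-first'
command -v bat >/dev/null && alias cat='bat --paging=never'

# zoxide provides `z`, a `cd` that learns your most-used directories
command -v zoxide >/dev/null && eval "$(zoxide init zsh)"

# starship replaces the default prompt
command -v starship >/dev/null && eval "$(starship init zsh)"
```

With the guards in place, the same file can be shared across machines that only have a subset of the tools.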


None of those tools are obscure, they might just seem like it from the perspective of mouse dependent vscode users.


Sadly, because many of the authors live stuck in the UNIX CLI model instead of the Xerox PARC REPL approach.

It is like praising Ratatui for what Turbo Vision, Clipper and curses were doing in the 1990s; if I wanted that, I would have kept using Xenix and MS-DOS.


There are huge interoperability advantages to CLI and TUI tools. Composing them, using script(1) on them, etc, are much simpler than the same for GUI tools. They are also much easier to rapidly iterate on.

GUIs are very useful but they are not clearly better (or worse) than CLIs.


REPL-ify your command line then? There's nothing that says you have to be stuck on bash for your command line needs. https://www.nushell.sh/


Already doing that a much as possible.


Gaming on windows is fine, but there's no reason to use windows for anything else. Dual boot to linux for a better desktop and none of the crud that Windows 11 has in it.


I haven't been gaming since the days when there was a huge gap between graphical possibilities and actual design (that is, the beginning of the 3D era), so I do not miss that. However, I can see the decline in macOS: pushing 'Apple Intelligence', an ever more restrictive Gatekeeper, iOS-ification of the desktop (i.e. the mentioned System Settings), constant connections to AWS, etc.

But since I'm not gaming I cannot imagine going back to Windows. On the other hand I'm quite enjoying Linux...

> So why pay more for a lesser experience

...however, with few exceptions, I haven't used a mouse in a decade... and I haven't found anything like the MBP's touchpad yet. Maybe I just need to do better research.


> Metal isn't really on par with Vulkan and DirectX in terms of relevance for graphics programming

As if Vulkan had relevance to graphics programming.

> and WSL is better integrated and easier to use than Virtualization Framework.

you don't need WSL on MacOS because, well, MacOS is already a *nix environment.


> As if Vulkan had relevance to graphics programming.

It surely has relevance on 80% of the mobile market, and on a small handset from that little Japanese games company.

> you don't need WSL on MacOS because, well, MacOS is already a *nix environment.

Agreed, if everything one wants out of it is the classical UNIX experience; that breaks down when having to work with containers and Kubernetes locally.


> It surely has on 80% of a mobile platform,

And which platform brings in more money?

> and on a small handset from this little japanese games company.

And not on PS, not on XBox, not on PC (that is, no first-party support).


> you don't need WSL on MacOS because, well, MacOS is already a *nix environment.

Right up until you need Linux syscalls. If you're doing anything with containers it's an annoyance.


My ideal laptop would be the macbook trackpad, monitor and battery life stuck inside any thinkpad. Or just anything non MacOS, even Windows, in the macbook. I despise MacOS with every fiber of my being, but the hardware is damned good.


Windows is far worse.


I would highly recommend giving a virtualized ARM Linux installation a go, using the built-in Apple frameworks, which are blazingly fast.

Have a look at this sample code: https://developer.apple.com/documentation/virtualization/cre...


Why not start supporting Asahi financially, if you aren't already?


i used to run debian on an intel macbook air. regular debian. was pretty nice.


Apple's software quality (either in terms of polish or just plain QA) has steadily decreased

I think the decline of software went hand-in-hand with the decline of the native indie Mac app. They still exist, but when I started with the Mac (2007), there was a very rich ecosystem of native Mac apps. Most stood head and shoulders above their Linux and Windows counterparts.

Apple has nearly destroyed that ecosystem with: race-to-the-bottom pricing incited by the App Store; general neglect of the Mac platform (especially between ~2016 and Apple Silicon); and a messy, reactionary toolkit story with Catalyst, SwiftUI, etc. The new toolkits seem to signal the end of AppKit, but most SwiftUI applications are noticeably worse.

With their messy toolkit story and general neglect, developers have started using Electron more and more. Sure, part of the popularity is cost savings, since Electron apps can be used on multiple platforms. But part of it is also that a Catalyst or SwiftUI app is not going to provide much more over an Electron app. They will also feel weirdly out of place and you become dependent on Apple working out quirks in SwiftUI. E.g. 1Password tried SwiftUI for their Mac app, but decided in the end that it was an uphill battle and switched to Electron on Mac instead.

I recently bought a ThinkPad to use besides my MacBook. Switching is much easier than 10 or 15 years ago, since 80% of the apps that I use most frequently (Slack, Obsidian, 1Password, etc.) are Electron anyway. Even fingerprint unlocking works in 1Password. I was vehemently anti-electron and still don't like it a lot, but I am happy that it makes moving to a non-Apple platform much easier.


I think most of this is just downstream of the Mac being eclipsed by the iPhone in terms of Apple’s revenue. The Mac just isn’t critical to Apple’s business like it was in 2009 when Snow Leopard came out. They would have started development on SL in 2008, when the iPhone was still a fairly niche product and there wasn’t even an App Store.

Now, ios gets the executive attention and it will generally get the best developers assigned to it, and the Mac has to live with the scraps.


Yeah, I think this is the one, in terms of number of users, revenue, etc. The iPhone is more than 50% of their revenue; the Mac is only ~8%. Lower volume and higher price, but it doesn't come anywhere near their phone. Same with tablets, although they share an app revenue stream with the iPhone, which makes up for the difference in hardware sales.


Apple's AirPods alone generate twice the revenue ($18B) of Macs ($8B). So we can see where Apple's priorities are.


> I recently bought a ThinkPad to use besides my MacBook.

I'm in the same boat here. Something is driving me away from my MacBook M1 (Pro? Don't even know). I have a gut feeling that it's macOS but can't really put a finger on it yet.

Bought a heavily used ThinkPad T480s (from 2018) and replaced almost every replaceable part of it, including the screen. Being able to replace many parts easily is a nice touch, since I had been using MacBooks exclusively since 2007 - guess that's why I somehow overdid it here. Slammed Pop!_OS 22.04 on it and I'm very pleased with the result: the first Linux desktop I've actually enjoyed since trying SuSE 5-something. Pain points are Teams (running in the browser), bad audio quality with AirPods when using the microphone, and CPU speed and heat. I guess one has to stop using Apple Silicon laptops to realize how amazing those processors are.


and cpu speed and heat

Intel CPUs from that era were quite bad and everyone has upped their game since then. I was thinking about getting a second-hand machine from ~2021-2022, but my wife convinced me to get a new one, so I got a Gen 5 T14 AMD. It has a Ryzen 7 Pro 8840U and I rarely hear the fans, mostly only when Nix has to rebuild some large packages (running NixOS unstable-small).


> 1Password tried SwiftUI for their Mac app

1Password had a beautiful native Mac app that works to this day. Even assuming SwiftUI is actually bad, why did they have to migrate at all? What was wrong with the existing app?

I'm not disagreeing with the opinions on Apple software quality, but I think the 1Password case is more down to their taking of VC money and having to give (JS) devs some busywork to rebuild something that worked perfectly well.


1Password is also now subscription-only and online-only. Gone are the days of a forever license and a fully offline encrypted database allowing for 3rd-party syncing via iCloud or others. The death of their old app went hand in hand with their race-to-the-bottom, VC-backed subscription ecosystem. It's only a matter of time until they suffer a breach like everyone else.


> Gone are the days of a forever license and fully offline encrypted database allowing for 3rd party syncing via iCloud or others.

While it's true for 1Password, there are other password managers. KeePass is great for local password database files if that's what you're after.


>What was wrong with the existing app?

It didn't work on Windows and Linux desktops.


Regarding Spotlight, one thing that started happening for me on Sequoia was that Finder and other apps became very slow to react to file changes. For example, I can save a new file to a directory, and the Finder window takes maybe 10-20 seconds before the file shows up in the list. If I navigate to a different folder and then back, the file is there. I notice the same delay in apps like IntelliJ.

I could be wrong, but apparently Spotlight is the service that drives this kind of file system watching. I think macOS has a lower-level inotify-style file system event API (FSEvents), which should be unaffected, but Finder and these other apps apparently use Spotlight. I really wish I had a fix, because it's just crazy having to constantly "refresh" things.


My favourite feature is when spotlight tells me that indexing is paused when I am searching for something.

You went through the effort to show some UI when something I am looking for may not be there because indexing is paused... but you didn't think to just unpause the indexing so that I can find it? I feel like I am being spat on: "Yeah, you're not finding what you're looking for? I know, I'm not even trying."


I highly recommend using Alfred. I’ve been using it since before Spotlight came out, tried and then disabled Spotlight, and went back to Alfred. It’s extremely configurable but highly usable out of the box. Sort of like creating your own CLI shortcuts to open files, apps, copy things to the clipboard, etc.

https://www.alfredapp.com/


Alfred is nice. I use Raycast these days: https://www.raycast.com/.


I still use Quicksilver[1], the open source app that long predates Alfred and was the inspiration for it. I tried Alfred a few years ago but didn't see anything compelling enough to switch. Am I missing anything?

[1] https://qsapp.com


I use Alfred and I used to use Quicksilver.

Probably not.


This KILLS me. It's so frustrating. APFS is supposed to be great at deduping files and such, but in practice it seems like it really sucks. It's bad at both saving a file to the desktop and dumping a million npm files into a directory.


Same here. Spotlight used to be my everything, i.e. I never used the Dock; I would always use Spotlight to launch applications or navigate to folders. Now it is littered with internet garbage, takes seconds to even return any results, and the results are always useless.

Who the hell thought integrating internet search was a good idea - because "aösldkfjalsdkfjalsdkfj", just like everything else, is a valid search query in Spotlight now, showing me "Search for aölsdkfjöalsdfjasdlfkj in Firefox"...


Spotlight was never useful, because of an absurd and glaring design defect: It doesn't show you WHERE it found stuff. There's no path shown with hits. Same blunder in Finder's search, and you can't even optionally add "path" as a column. WTF.

So... when the hits include six identically-named files, you can't eliminate ones that you know are wrong (on a backup volume or whatever). The level of stupidity here is just mind-boggling.


You hold down command to see the path.


Where? And how is that option displayed to the user?

I also just tried it in Spotlight and Finder, and it did nothing. Which I consider a relief, because undiscoverable bullshit is worse than the feature not existing.


macOS and iPadOS are full of those undiscoverable "if you do this combination of buttons/swipes while at full moon, something happens" moments. As a Mac user not by choice (work-issued), I hate how impossible these are to discover.


As a Mac/iOS/iPadOS user it seems that it’s almost mandatory to watch each Keynote / product announcement video if you want to keep up with new features. Lots of cool features that I only knew about by watching those videos that are completely undiscoverable otherwise.


These kinds of shortcuts are part of Apple software as a whole, and apparently have been a thing since at least OSX. These behaviors were supposed to be covered in the documentation, but I don't know how true this is nowadays.

Special mention to all text input fields in macOS having Emacs-style shortcuts.


It goes back further than that. I remember being able to buy key-combo cheat cards for System 7, and I have no reason to think the shortcuts they covered wouldn't also have been present in System 6.


It's in the documentation for Spotlight:

https://support.apple.com/en-gb/guide/mac-help/mchlp1008/mac

I agree that discoverability could be better, but macOS has pretty consistently had hidden power user shortcuts and modifiers, to keep the basic workflow streamlined/simple for those who don't need it.


Seeing where stuff is in a search function is not a "power user" feature; it's the whole point of what you're doing.

And I don't buy the "keeping things simple" excuse for secret hotkeys in other areas. Falling back on gimmicks like undisplayed hotkeys and "long presses" and "gestures" is lazy abandonment of the design task.

I hate this "saving the user from complexity" lie. It's hypocritical: The "non-power" user isn't going to go looking for these options in the first place.

Finder search is a great example. A "non-power" user isn't going to right-click on the column headings in the results and try to add "path" as a column. So how does it help that user to deny everyone else the ability to add it?

Apple mocked IBM for needing a thick user manual back in the day. To suggest that anyone (especially anyone on this site) should have to read documentation to perform a basic file search (in a GUI, no less) is apologism to the extreme.


And press command+return to open the ___location in Finder (and the item selected)


> There's no path shown with hits

I guess you do know the path is shown at the bottom of the window if you select the filename in the list of results?


Yep, but that's totally unacceptable because you have to tediously select every entry, one at a time, and peer at the status bar.

It also doesn't allow you to sort results by ___location, as you could if it were a column.


In all fairness, you do need to hold down the Command key to show the file ___location in Sequoia. It is an interesting default behavior to pretend the file's ___location doesn't exist - mobile-centric.


No you don’t. In Finder search results, the path is always shown at the bottom. For regular Finder windows, you can optionally show the path with “View -> Show Path Bar”


Not a solution, because, again, you have to click on every single entry one at a time, and you can't sort by it.


In all fairness, secret hotkey BS may as well not exist. Are you supposed to mash every modifier key and every combination thereof on every screen and in every menu, looking for hidden goodies?

Absurd.


we have simplified the interface to just one home button and the screen interface, as well as the volume up/volume down key.

To select, just press on the item.

To hover, press and hold for at least 2 seconds.

To get a list of options, press and hold for at least 2.5 seconds, but not more than 3.5 seconds.

To delete, press and hold for 3.6 seconds, but not longer than 3.9 seconds.

To save, press and hold for 4.1 seconds. Pressing and holding for exactly 4.0 seconds activates the archive action. Pressing and holding for 4.2 or more seconds sends the item to the blocked list.

To retrieve the list of items in the blocked list, press and hold and simultaneously press the volume up and volume down key.

To delete all items in the block list, press and hold and simultaneously press the volume up key only.

To completely reset your device, press and hold and simultaneously press the volume down key only, whilst holding the device in a completely vertical plane, and rotating clock-wise and counter-clockwise, smoothly, at precise 2.35619 radians every 30 seconds.

To trigger the emergency call feature, drop the device at an acceleration of no less than 9.6 m/s² and no more than 9.7 m/s².

/s (kind of)


No: you are supposed to read the documentation to learn about power user features. Microsoft also doesn’t shove the advanced keyboard shortcuts in your face; you need to read the manual to learn stuff like this.


Showing WHERE things are found when you do a search is not a "power-user" feature. It's an essential aspect of what the user is trying to accomplish.

The whole point is that secret hotkeys are design dereliction.


Is it, though? Most people don’t really have a notion of the file system, or hierarchical file structures. They drop files onto their desktop, or keep them in the downloads folder. Just ask a parent or your next-door neighbour.

That’s a bit of a problem when discussing problems of normal users with power users, because they don’t even realise how what they’re doing is actually not what normies do.

I’m inclined to agree that hotkeys in MacOS are hard to discover, but cluttering the interface with stuff many users simply do not need cannot be the correct answer.


If it's cluttering the interface, the interface design was incompetent to begin with.


That’s just ridiculously broad. You cannot cram an infinite amount of information into an interface, there is a maximum density for your design goal.


Wow, I feel like I almost could have written this except I prefer Plasma/KDE to GNOME. I use Linux + Mac laptops somewhat interchangeably since 2012, and have also seen the marked decline in quality. In fact, it seems like Linux has gotten better at almost the same pace (or maybe a bit faster) than macOS has gotten worse.

The things that most frustrate me about Macs is that they've violated the never spoken but always expected "it just works" in so many ways. Things like how Thunderbolt Displays containing a USB hub which are Apple-certified handle re-connection to a Macbook, should "just work", but require fiddling every time. That's just one of numerous examples I could come up with.

Apple historically was probably the best company in the world in understanding the full depth of what "User Experience" means, and it seems like they've really retreated from this position and are regressing to the mean.


Spotlight is unbelievable bad, especially on iOS. If I type a substring of the name of an installed app, it should find it effectively instantly (say, within 1-2 frames of the input showing up). Instead, it finds it sometimes. On occasion I need to hit backspace (removing a letter that should match) to get it to find it.

I struggle to imagine the software design that works so poorly.


I've yet to find a decent implementation of search-as-you-type anywhere, not just Spotlight. I have that same issue on Firefox, and with Windows Search, for example.

And it makes no sense whatsoever. If "foo" matches "foobar", so should "foob". I honestly don't know how the hell can they still f up such a simple piece of technology in 2025.


> I've yet to find a decent implementation of search-as-you-type anywhere

https://www.voidtools.com/en-uk/support/everything/


Windows 7 start menu search was always reliable and had predictable behavior from my experience. It can be done, just that modern software engineers' skills and career incentives no longer permit it.


Finder search is just as bad. You can be viewing a directory full of JPEGs, all with the jpg extension.

Then you do a search for .jpg, and get NOTHING. But only sometimes. Other times it'll work.


I see this same search issue everywhere these days, for what was a solved problem a decade ago. What "best practice" is causing this?


I've been using Macs since Mac OS 9, and Snow Leopard was indeed very good. It remains my favorite version of Mac OS. I actually think it was Snow Leopard that started the rush of developers to Mac as _the_ platform to use.


Exactly.

People don't want animojis, and they don't want other trite new features that only seem to exist because Apple feels it needs to demo something new every year.

What they want is something that just works without annoyances, distractions, failures, or complications.

Give them that and they'll break down the doors trying to get their hands on it, because it's so far from how most tech works today.


Animojis really feel like peak corporate board asking, "What do the kids like these days?" and dumping that shit into the world. Honestly... the AVERAGE age of the Apple board is 68!! This is a company that's reached some sort of corporate red giant stage where its influence is massive but its ability to grow is over, and its only real purpose is to generate heavy metals and seed them throughout the rest of the universe after its eventual explosive death.


To be fair, I'd wager the average for nearly all Fortune 500 companies' boards hovers around the 65 mark.


Something that just works and is stable is bad business for companies these days.


Why would it be bad business for Apple? Their business model is based on selling a holistic ecosystem. They don't have any need to chase new features, and their steady stream of high-margin hardware revenue is at stake.


> Their business model is based on selling a holistic ecosystem

Yeah and they succeeded in that so now it's about selling subscriptions on top of that.


Spotlight straight up broke on both of my Macs after Sequoia. It can't even find exact matches in many directories marked for indexing, and re-indexing did nothing. Just searching for apps under Applications doesn't seem to find all apps.


I’ve had so many issues with it as well! To the absurd level where I could not search for settings in the Settings app… People all over the net have had all kinds of issues and there’s never been any help other than „oh go and reindex”.
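For reference, the "go and reindex" advice usually boils down to Apple's own command-line tools; a sketch of the usual incantations (run in Terminal on macOS; the volume and query are just examples):

```shell
# Show the indexing status of the boot volume
mdutil -s /

# Erase the Spotlight index and force a rebuild (needs admin rights)
sudo mdutil -E /

# After reindexing finishes, query the index directly from the CLI
# to check whether Spotlight itself can find a known file
mdfind -onlyin /Applications -name "Safari"
```

If mdfind finds the file but the Spotlight UI doesn't, the index itself is fine and the problem is in the search frontend.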


iOS has this problem as well. You search for a setting in the Settings app. It’ll say “doesn’t exist” (or whatever) while it’s looking for something extremely obvious (like “software update”) instead of just showing a processing icon.

Then when it does show the results, they’re usually in some terribly unhelpful order. It took me ages to try and go through the CUJ of “this app isn’t sending me notifications because I turned them off now I want them back on”


Just yesterday I was trying to find a file in Finder, using the search, and it could not find it even though I was just one directory up from the directory it was sitting in. It made no sense to me at all. Reading these stories, it’s clicking for me.


It’s a relief to hear this is common. I thought this was user error or a consequence of frequently filling up the internal SSD thus nuking the index.


Just adding a "me too" here, Spotlight used to be incredible. Now it's basically only good if you wait 5-10 seconds... sometimes.


I gave up on it because of this and installed Raycast which seems a lot more reliable. I used Spotlight effectively as my launcher for apps/settings, and have the Dock completely hidden and Spotlight set to hide everything else. But when it can't even do that consistently, I have no idea how!


The nice thing is that there are several apps which replace it and do a lot more at the same time. (Like LaunchBar, Raycast, Alfred)


I can't believe I'm saying it, but I agree with you about GNOME being my forever desktop. I used to really make fun of GNOME during the 2->3 transition, which seemed so profoundly misguided, but now I love it. I don't know if they've massively improved it or if my perspective has just changed with time.


Unified button that disguises as two different icons hiding other useful options

You can only cycle windows in one direction even if you try to do some remapping

Choosing keyboard languages hides a lot of options. Once you understand you need to click on English US to see more detailed options then you get them all, UK, Canadian... Then it's unclear which keyboard layout is currently selected and how to select one from the list you made.

I can't fathom how a DE that is all about human-machine interface guidelines, and supposed to be the epitome of UX, can't figure out basic stuff like discoverability and clarity.


Default keybindings have Shift+Super+Tab doing reverse window cycling in GNOME. Just tried it. Also, which unified button masquerades as two icons?
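
To the remapping point: both cycling directions live in plain dconf keys, so they can be inspected and changed from the command line too. A sketch, assuming a stock GNOME session (key names are the real `org.gnome.desktop.wm.keybindings` schema):

```shell
# Show the current forward/backward window-cycling bindings
gsettings get org.gnome.desktop.wm.keybindings switch-windows
gsettings get org.gnome.desktop.wm.keybindings switch-windows-backward

# Remap the backward direction explicitly
gsettings set org.gnome.desktop.wm.keybindings switch-windows-backward "['<Shift><Super>Tab']"
```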

Keyboard layouts are a pain, but there are some solid extensions that clean the flow up and may be upstreamed into GNOME at some point.

It's all opinions, but boy, compared to the mess that is macOS and iOS regarding discoverability ... I'll take GNOME any. day.


True! So why can I only remap window cycling in one direction and not the other...?

The volume and power icons on the top right are actually one button, and it hides other options like screen brightness, volume, Wi-Fi, etc. If at least they had used three vertical dots or stacked bars, as is the convention for hamburger menus...

From what I've heard, GNOME devs do not like change, and it sucks to be a GNOME extension developer; a quick Google search seems to confirm that, which casts some doubt on them upstreaming any extensions, but maybe you know better. Has that ever happened with other extensions?

https://discourse.gnome.org/t/developing-gnome-shell-extensi... https://www.reddit.com/r/gnome/comments/pvvku5/why_do_extens...

I haven't really used macOS or iOS for more than five minutes, so I can only take your word on that.

On the other hand, for example, it is very easy to remap Caps Lock to Escape on macOS: just go to Settings -> Keyboard and you easily find the option. GNOME? No, not in Settings. Wait, I have to use an app called GNOME Tweaks? OK, it's in "Advanced keyboard options", which opens a big list of loosely classified options. Oh well, it was in the miscellaneous category.
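
For what it's worth, the same remap can be done without Tweaks; GNOME reads XKB options from a dconf key, and Tweaks writes that same key under the hood. A sketch, assuming a stock GNOME session:

```shell
# Remap Caps Lock to Escape for the current user's GNOME session
gsettings set org.gnome.desktop.input-sources xkb-options "['caps:escape']"

# Verify what is currently set
gsettings get org.gnome.desktop.input-sources xkb-options
```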


I can believe that it's easy to bounce off software because of a million paper cuts. But the problem with trying to address every one of those proactively is that GNOME is a huge undertaking, and they do their best to move at a fairly slow pace (now, after the 3 transition, which was akin to ripping a bandaid off: go fast, piss the person off, but then the bandaid is gone).

I don't know if the CapsLock -> Escape switch is on a roadmap somewhere, but that is a little bananas. That said, my partner comfortably uses GNOME every day to browse the web and manage some files. Has she EVER wondered how to remap CapsLock? No. The people who do want to? Google can give you the answer pretty quickly. Not saying it's good UX, but GNOME balances a lot of use cases, and as this thread suggests, I think they've actually (with a LOT of complaining from engineers and power users) kept that balance pretty damn well, to the point where I haven't been surprised by GNOME in a long time, and it seems to slowly and progressively get better.

And yes, whoever jumps in here with their own papercut story, I know there is pain in not being the primary audience for software. But honestly, at least I'm in the same Venn diagram with my partner. The primary audience for macOS or iOS now appears to be ... I don't even know anymore. Used to be content creators, now it seems like even Apple doesn't actually know who uses their computers.


It's not just you, the early GNOME 3 releases sucked. It has seen a lot of gradual improvement over time. Of course there are reasonable alternatives also, such as Xfce, MATE or Cinnamon. (And these three 'alternative' desktops have also edged closer over time, sharing more and more of the underlying tech stack even as GNOME itself has continued to develop in a rather separate direction).


It could be a third option: the bloat in other OSes has made a less bloated OS look very pleasant and useful.


Did you know you can set your wallpapers to be continuously updating and make Macs use terabytes of your network in hours or days, depending on speed? https://discussions.apple.com/thread/255329956


I also wish I could preview the wallpapers without triggering a 100MB download. There's nothing in between the 320x240 thumbnail, and the 4k video.

And so many tiny thumbnails wedged into the too-narrow System Settings window.


My biggest annoyance with recent macOS versions is that most QuickLook plugins stopped working. Apparently one could re-develop them with their new framework-of-the-day, but I have no doubt the lion's share of what I'm using will just become abandonware.


At one point a few years ago, Spotlight improved enough that I could use it instead of relying on Alfred. So I deleted Alfred, and whaddaya know...a few years later Spotlight got worse and worse, making me regret that move.


I have been using a Mac since the 128k came out. System 7.5.3 and Snow Leopard 10.6.8 are in my opinion the high water mark for both OS’s.

I still have some 10.6.8 install media for both server and client. Truly loved them both.


I worked at Apple Retail during the Snow Leopard launch. I think I still have a boxed disk somewhere, too. I remember it was not a product I had to sell to customers. People came in asking for it.

Another highlight of that job was selling a green iPod Nano to "John Locke" from LOST


Those were the days…

The most ridiculous thing that happened to me was in the early days of the Apple Store in SoHo: I stopped in to see if I could just buy RAM.

The music was loud, so I was practically shouting to be heard when I asked for RAM, and they thought I was asking if I could buy a gram.


Interesting that you think that of 7.5.3 — it worked, sure, but it could be painfully slow. System 6 was preferable as an OS — MultiFinder was better than 7, at least in the first couple iterations — but much of the software I needed demanded 7. 7.6.x was the first bright spot since 7.1 fixed much of what went wrong in 7.0, & there was a ton of waiting after that. 9 just chugged along for me, for the most part, which was nice.

Loved Snow Leopard too, & was shocked by how bad Lion was in comparison. Glad they got back on track after that.


System 7 was better for me due to AppleTalk file sharing. System 6 was confined to LocalTalk or printer sharing.


You're right, I forgot about 7.6.1. I think I had a WiredInc MPEG video card server based on System 7.5.3 for a project, so it's burned into my memory. I suppose I ended up using System 9, since all life forms were supported by Carbon.


> System 7.5.3 and Snow Leopard 10.6.8 are in my opinion the high water mark for both OS’s.

Wasn’t 7.5.3 the worst of the string of terrible releases between 7.5 and 7.6? In my memory 7.5.5 was much better, but I still preferred 8.1.


> and there is no longer any way to effectively prioritize the results I want (apps, not internet garbage)

OMG this one drives me bonkers. If anyone out there knows how to turn off internet results, please share!


Open Settings, scroll down to Spotlight. Unselect the things you don’t want.

You’re welcome :-)


Just noticed that I was sharing my entire Safari, Spotlight and Siri search history in that menu. Why is that setting in Spotlight settings and not under Privacy/Analytics?


Because spotlight indexing is local and not shared with Apple?


Like I pointed out elsewhere, it doesn’t stick for me.


I've found that kind of thing is often caused by a damaged preferences file. The easy way to check that is to make another user account, and see if it happens there too.


Thanks!


Spotlight seemed to go from great to unusable in 5 years


Eight-year Mac user here; I never use Spotlight, it's trash.


Yeah I stopped using spotlight a few years ago. I didn't really notice that I stopped using it until recently. It just became useless. I reorganised my stuff carefully so I know where I put it. I think that turned out to be more powerful than hoping a search engine over the top would be able to sift through the nuances.


... I'm not sure I've ever found any GUI system search reliable enough that I've used it on-purpose (though accidentally, sometimes) on Windows, Linux, or Mac. I always just use "find" and "grep" (and ripgrep when I remember that exists and realize I just grepped a lot and will be waiting for like an hour if I don't re-run the command with rg instead). Or nothing on Windows, which is fine because I haven't used Windows for anything but launching video games in about two decades.
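
That combo is hard to beat for reliability. A minimal sketch of the name-search vs. content-search split (the paths below are made up for illustration):

```shell
# Scratch tree to demonstrate on (illustrative paths, not from the thread)
rm -rf /tmp/searchdemo
mkdir -p /tmp/searchdemo/sub
echo "needle in a haystack" > /tmp/searchdemo/sub/notes.txt

# Find files by name, case-insensitively
find /tmp/searchdemo -iname '*notes*'

# Search file contents recursively; -l prints only the matching paths
grep -rl "needle" /tmp/searchdemo

# ripgrep does the same, much faster on large trees (if installed):
#   rg -l "needle" /tmp/searchdemo
```

Both commands print `/tmp/searchdemo/sub/notes.txt`; the point is that name lookups and content lookups are separate tools, which is exactly the distinction GUI search tends to blur.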


"everything" app on windows works well for me for file search. Incredibly fast.


^^^ this so hard. Voidtools Everything is how I find what I need 90% of the time now.

https://www.voidtools.com/


I've only started using GUI search since using Fedora. The Tracker search in the Activities view is fast and finds files in the home folder by name pretty well. The only shame is that the PDF-content search doesn't work in the main search interface, only when searching in the file manager.


Windows 11 LTSC one is quite good because it's so damn stupid. You can indeed hit start then just type what you want. Only does files though, by name, which is fine.


Windows 11 is beyond the pale. It's infuriatingly bad. But it can be tolerable if you do a bit of manual organizing and ignore most of its dumb features. I only use it for work; I will never use it at home.


Try the LTSC version. All the infuriating bits are not installed :)


Alfred


Spotlight was bad back in the day, so I installed Alfred and started using that. Then Spotlight suddenly improved a lot, enough that it was usable for me, and I deleted Alfred. Then about five years ago something happened internally at Apple to the Spotlight team and it just got worse and worse and more difficult to use, making me regret deleting Alfred.

I wish Apple would just fix Spotlight. They don't seem to think it's worth fixing.


I wonder if Apple has internal metrics showing that most people just stick everything in the Dock and on the desktop and don't use Spotlight.


That is a good question. I like my dock uncluttered. I have it placed vertically on the left side, with only the apps I use every single day: Alacritty, Brave, Cursor, and Zoom. With Finder and Launchpad included, that's only six docked apps. Everything else I use Spotlight to open, so I feel the pain when the usability gets degraded or buggy.


> As someone who's been a Mac user since System 6 and has been consistently using Macs alongside PCs _daily_ for over 20 years I can say that Apple's software quality (either in terms of polish or just plain QA) has steadily decreased.

Similar for me but started in system 7.

It’s lucky for Apple that Windows has got worse faster.


> prioritize the results I want (apps, not internet garbage).

Settings -> Spotlight -> Websites, UNCHECK


Does not work. I happen to know a fair bit about mdutil and the like and confirmed that does exactly nothing for my particular issue. A full Spotlight index reset works temporarily, but after a while it just conks out again.
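
For anyone who hasn't tried the index reset mentioned above: on macOS it's done with `mdutil`. These commands are macOS-only, shown here as a sketch (re-indexing can take a long while on a large volume):

```shell
# Check Spotlight indexing status for the boot volume
mdutil -s /

# Erase and rebuild the index (needs admin rights)
sudo mdutil -E /
```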

Also, I vaguely remember there being a way to _order_ results, not just disable them.


> Also, I vaguely remember there being a way to _order_ results, not just disable them.

Good memory! Apple removed this feature in El Capitan.


I'm ancient by today's AI standards :)


Yea, this has been happening on all of my family’s MacBooks. Spotlight indexing just hammers the CPU, yet seems to be doing nothing at all.


It seems to churn the index. I have no other explanation for its behavior. And of course filing Feedback in Apple's tools doesn't help.


I have a vaguely related, kind of interesting story related to search indexes, but on windows instead of mac.

My C drive was super full for some reason I couldn't understand, and Explorer couldn't tell me where the data was. There was about 100GB just unaccounted for.

I don't even use the search index.


> There are some factual "gaps" there about how good Snow Leopard was

Here are some data points I collected at the time:

https://blog.rongarret.info/2009/08/snow-leopard-is-disaster...

https://blog.rongarret.info/2009/09/esata-on-snow-leopard.ht...

In retrospect Snow Leopard deserves the love it eventually got, but at the time it was not entirely clear.

> Apple's software quality (either in terms of polish or just plain QA) has steadily decreased.

Amen to that.


I just wanted to say that I've been a keen reader of your blog for ... I guess, decades. I appreciate your work. Thank you.


Thanks!


> consider it to be my "forever desktop"

Feelings shared, if only GNOME would provide the column-based file navigation that I miss so much.


I do love Gnome. If only we had hardware to run it :/.

I'm stuck on a MBP because it's the only laptop with a great screen, speakers, and battery life. Meanwhile my keyboard keys keep getting stuck after a year of usage, and OSX is garbage. Soon as there is similar hardware I can load Linux on, I'll be insta-switching.


AMD AI Max 395 (superb name) proved that x86 can get to Apple Silicon performance (and it's not even that far behind in power efficiency), but there seem to be zero devices from non-trash brands (I am not buying ASUS or HP).

I would love to finally get out of the Apple ecosystem, I just don't have any decent alternatives right now. Hopefully next year.


I'm going to purchase a Framework just because I value repairability. And honestly, before the M1 MacBook I was using a T480s, and I'm okay with compromising on hardware, esp. having been burned with the 2016 butterfly MacBook.

Apart from the haptic touchpad I wouldn't miss much; other makers are finally ditching low-resolution 16:9 screens, and you can even find nice OLEDs. I'm mostly missing the polished software that's only available on macOS (things like Carbon Copy Cloner or Pixelmator).

But with my M1 having a degraded battery, and having to send it off for a week or two to the nearest service center just to get a new battery, the prospect of a repairable laptop like the Framework, where I can just order a new battery and replace it myself, is looking all the more enticing.


I personally think that it is reasonable to "want" an Apple notebook. They have great hardware, great battery life, and an ecosystem where every device integrates. Only on macOS can you nicely develop software for iOS. Furthermore, most vendors release software for macOS while they don't for Linux (not only Adobe). BTW, the apps I miss most on Linux are the Preview app and Apple Mail.

However I'm done with Apple. I think it's a decision - not "reasoning". That decision takes time and is painful. It's also a decision specifically against "the best" ecosystem available in favor of something "ok".

Not only have they repeatedly disappointed my expectations - they just suck as a company (in my opinion). It's not about being less innovative or decreasing software quality; they have done so much for the market that I think GNOME wouldn't even exist as it is without them. It's about sealing off every inch of their software and hardware that they can. No repair without paying... making RAM and SSD upgrades ridiculously expensive; you cannot even put standard NVMe drives into a Mac mini - everything is proprietary. Even their sensors have serial numbers, preventing hibernation if you swap them out without "hacking" the firmware.

Hardware-wise, I have high hopes for Framework working with AMD - although they did not address the issues I suggested (speakers, LPCAMM2), they're constantly improving without breaking their promises. Hopefully that's not going to change when they get bigger.

OS-wise I'll stay on Linux. After a long journey going from Ubuntu to Debian to Fedora using GNOME, KDE and even NixOS with Hyprland for a short period, I gained enough knowledge required to really enjoy Linux. System76 is working on COSMIC, which could be pretty amazing, once it is released.

In case anyone would like to try my current Linux config, I'm constantly working on an "install everything" script (pretty early stage):

https://github.com/sandreas/zarch

HF ;)


Apple delivered on Steve Jobs' vision of an "appliance computer".

You might not want one though.


Yeah... probably. I forgot to mention that Apple computers are a pretty good deal if you are looking for an AI / LLM experimentation machine due to unified RAM which nearly translates 1:1 into VRAM.


"Apple is ripping you off on DRAM" vs. "Apple is a great deal for VRAM." ;-)

You don't get nearly as much compute as you would with 6 GPUs, but it also uses less power than a single GPU.


Oh wow. I am realizing I have just been living with these bugs as tiny frustrations all day long not understanding how pervasive they are!

This issue with spotlight is so bad. I use the switcher to pull up my Downloads or Documents directories and half the time it can’t even find them!


> if PC hardware can ever match Apple Silicon

What is wrong with an AMD Ryzen 9 with 16 physical cores? If you need more and you have a virtually unlimited budget, then Ryzen Threadripper is even better. Also: is Asahi Linux an option for you?


Of course Asahi isn’t an option. The hardware support is far from finished.


Indeed. I've complained for a decade that Apple design gets a free pass while being haunted by Steve from beyond the grave. Your comments resemble my habits, except I went from Sway right into the COSMIC desktop alpha and was done.


> no longer any way to effectively prioritize the results I want (apps, not internet garbage)

FWIW you can massively improve things by just disabling the internet results. It's easily done in the System Preferences


I would much prefer if you could change the order so that _local_ results come first, web results after — not possible (anymore). Sad.


Like I pointed out elsewhere, it doesn’t stick for me.


It's wild how much of the original "it just works" ethos has eroded


> if PC hardware can ever match Apple Silicon

IIRC some competitors are starting to offer a few laptops with ARM processors, I think Samsung has a few. How do you feel about those?


I generally agree it’s decreased steadily but I also remember macOS 9 and early X versions especially being pretty buggy and having awful performance.


Honestly yeah, Raycast is the replacement I'd recommend these days for spotlight.


And now spotlight defaults to the whole computer even when I start a search within a folder... for items in the folder... Turned to garbage sometime in the last ~18-24 months.

At least there's quicksilver


If only there was a good Linux version of Alfred.


Apple (at least its current leadership) is deliberately degrading its products so that people will buy new ones. Who expects anything good from such a team?


That statement makes no sense, as the new products are worse than the old ones


Umm the new products are better than the old ones. Not sure if you are being nostalgic.



