I understand and appreciate the sentiment, but I see the intent very differently. Apple isn't employing a frog-boiling strategy; it's responding to an increasingly sophisticated adversary.
It’s like the criticism that the quality of Google search is dropping. It has absolutely tanked, but not because the algorithm got worse: the internet has grown by orders of magnitude, and most of it uses the same hyper-aggressive SEO, such that the signal-to-noise ratio is far worse than ever before.
You can block specific subdomains, too. Useful when I want to see finance.yahoo.com items in my search results, but nothing else from the yahoo.com ___domain.
That rationalization ignores a lot of confounding evidence, such as other search engines being able to deliver great results and adequately keep the SEO garbage out.
That’s kinda the SEO equivalent of security by obscurity though, right? SEO spam puts a lot less effort into optimizing for other search engines, whereas Google is dealing with being the primary target of every adversarial SEO spam site.
This is a great theory but it isn't the reason. Google management made a conscious decision about five years ago to prioritise profit over search quality.
We know this because the emails came out in discovery for one of the antitrust suits.
The biggest struggle is that the original Macintosh was so simple to manage. The original concept of system extensions to expand capabilities, and a file structure built on a hierarchy with the desktop as the top level, was broken by the shift to Unix.
Suddenly the user's file hierarchy started wherever the Home folder was located, and it became an island of user-controlled environment surrounded by the complexity of the operating system.
I found the result overall well thought out, but when the desktop became just a folder, I felt the Mac moved away from its simplicity and embraced the complexity that Windows offered.
Simplicity is fine for a hobby project. An operating system having zero concern for any kind of security is a non-starter today.
It's amazing the rose-tinted glasses people have about the original Macintosh environment. It was insanely janky and (unless you were ruthlessly conservative) insanely unstable by today's standards. By version 10.5 (Leopard), the modern UNIX-based MacOS was unequivocally superior to Classic MacOS in every metric other than nostalgia.
I understand the trade-offs and accept them. I was trying to point out where the split is and why it won't go back. I think the point of view expressed in your comment is as distorted as the ones you're deriding.
I also believe that simplicity could have delivered security just as well. The real advantage of the Unix layer is compatibility, which the Macintosh was missing.
I sincerely tried to interpret what you meant here, but I failed. I understand the words, and the fragments of every sentence, but I wasn’t able to deduce the intent of your reply.
Are you trying to say that it’s possible for a system to be both simple and secure? Absolutely that’s the case, but with a trade-off — either it needs to restrict the user’s freedom, or be fully disconnected from the outside world.
I have been pondering these ideas for a long time, and what is needed is an intense glossing over of all the details. The original Macintosh did exactly this; it was called a toy and, with 128K, completely useless. Yet my unsophisticated Mom saw the Mac 128K demoed at the mall and went into a frenzy to get that tool. She wanted to publish documents.
The threats in the world are real and the internet doesn't help. I 100% agree that a network connection needs to be kept at a distance to make things simpler.
I think the power of language used to describe a system is where simplicity begins.
What I'm working on is creating a crisp line of delineation between "local" and "public" networks.
If everything is on the "local" network by default, auto-discovery is secure. When things are explicitly needed outside, a user can publish them to the outside world through physical manipulation.
The outside world can then be described using classic Users and Groups, which is culturally easy to understand.
I'm trying to create an environment that focuses on making those two things, plus a third element, simple to understand and physically manipulable.
The freedom I'm looking for is available on the "local" network. The "public" network is where our data is interchanged with the outside world based on what we publish. I don't expect people to interact with this layer much; I expect them to configure it for whatever institution, organization, or government they deal with.
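To make the default-local, explicit-publish idea concrete, here's a minimal Python sketch. Every name in it (Item, Node, publish, visible_to) is hypothetical and illustrative, not the actual system; the "physical manipulation" gate is just represented as an explicit method call:

    # Hypothetical sketch of the local/public split described above.
    from dataclasses import dataclass, field

    @dataclass
    class Item:
        name: str
        # Public visibility is expressed with classic users/groups.
        # Empty sets mean the item stays local-only.
        shared_users: set = field(default_factory=set)
        shared_groups: set = field(default_factory=set)

    class Node:
        def __init__(self):
            self.items: dict[str, Item] = {}

        def create(self, name: str) -> Item:
            # Everything starts on the "local" network: discoverable
            # by trusted local peers, invisible to the outside world.
            item = Item(name)
            self.items[name] = item
            return item

        def publish(self, name: str, users=(), groups=()):
            # Publishing is an explicit act (the physical manipulation
            # would gate this call); nothing leaks by default.
            item = self.items[name]
            item.shared_users |= set(users)
            item.shared_groups |= set(groups)

        def visible_to(self, user: str, user_groups: set) -> list[str]:
            # What a given outside user can see of this node.
            return [
                i.name for i in self.items.values()
                if user in i.shared_users or (user_groups & i.shared_groups)
            ]

    node = Node()
    node.create("family-photos")          # stays local
    node.create("newsletter")
    node.publish("newsletter", groups={"subscribers"})

    print(node.visible_to("alice", {"subscribers"}))  # ['newsletter']
    print(node.visible_to("bob", set()))              # []

The point of the sketch is the asymmetry: local is the default and costs nothing, while public visibility requires a deliberate, per-item act expressed in users-and-groups terms.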
Most of the complexity I see in computing these days is market-driven demand for eyeballs/clicks/...
Actively depleting the goodwill they accumulated over the years definitely makes it worse. It's that much harder to give the benefit of the doubt to a company that is also showing the middle finger to its devs.
> Google
Giving priority to AdSense sites, fucking around with content lengths (famously penalising short-stay sites), killing advanced search options. That's just from thinking about it for 10s, but to me most of it is totally of Google's making.
Of course Google's algorithm is worse. Google prioritises showing you search results that make money for Google. Google has no incentive to show you anything else.
I can't believe I even have to say this out loud. Look up enshittification.