
1) No true Scotsman fallacy at work.

There are situations where you will interact with the desktop, if only for debugging. Saying anything else is hopelessly naive. For example: how do you know your program didn't start because of a missing DLL dependency? There is no automated way: you must check the desktop, because Windows itself only shows a popup.

2) What displays on the screen is absolutely material to the functioning of the operating system.

The Windows shell (UI) is intrinsically intertwined with the NT kernel. There have been attempts to build headless systems on it (Windows Server Core, etc.), but in those environments a popup can crash the process, because the dependencies needed to actually display the popup aren't present.

If you're in a situation where you're running Server Core, and a program crashes when auto-updates are not enabled... well, you're more likely than not to enable updates to avoid the crash. After all, what's the harm?

You may also be aware that when a program has a UI (the Windows console), the execution speed of the process is tied to the draw rate of the screen, so a faster draw rate, or fewer things on screen, can measurably affect performance.
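That coupling is easy to measure yourself. A minimal sketch (the function name and line count are my own choices, not from the comment): time identical writes against the live console versus a null sink; on a real terminal, especially legacy conhost on Windows, the console run is typically far slower.

```python
import os
import sys
import time

def time_writes(stream, n=20_000):
    """Write n lines to the given stream and return elapsed seconds."""
    start = time.perf_counter()
    for i in range(n):
        stream.write(f"line {i}\n")
    stream.flush()
    return time.perf_counter() - start

if __name__ == "__main__":
    # Same work, two sinks: the difference is pure rendering cost.
    with open(os.devnull, "w") as null_sink:
        null_time = time_writes(null_sink)
    console_time = time_writes(sys.stdout)
    print(f"null sink: {null_time:.3f}s  console: {console_time:.3f}s",
          file=sys.stderr)
```

The absolute numbers depend on the terminal emulator; it's the ratio between the two runs that illustrates the point.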

Those who write Linux programs know this is also true on Linux (a write to STDOUT is blocking), but on Windows you can't offload that I/O to another thread in the same way.
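The POSIX-side workaround the comment alludes to is to push output through a queue to a dedicated writer thread, so the worker never blocks on a slow terminal. A minimal sketch (class and method names are mine, assuming only the standard `threading` and `queue` modules):

```python
import queue
import sys
import threading

class AsyncWriter:
    """Drain queued lines to a stream on a background thread."""
    _STOP = object()  # sentinel that tells the drain loop to exit

    def __init__(self, stream):
        self.stream = stream
        self.q = queue.Queue()
        self.thread = threading.Thread(target=self._drain, daemon=True)
        self.thread.start()

    def _drain(self):
        while True:
            item = self.q.get()
            if item is self._STOP:
                break
            self.stream.write(item)  # only this thread blocks on the terminal
        self.stream.flush()

    def write(self, line):
        # Returns immediately; the writer thread absorbs console latency.
        self.q.put(line)

    def close(self):
        self.q.put(self._STOP)
        self.thread.join()

if __name__ == "__main__":
    w = AsyncWriter(sys.stdout)
    for i in range(5):
        w.write(f"message {i}\n")
    w.close()
```

This decouples the producer from terminal latency on Linux; per the comment's point, the equivalent trick doesn't cleanly apply to Windows console rendering.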

Anyway, all this to say: it's clear you've never worked in a serious Windows environment. I've deployed many thousands of bare-metal Windows machines across the world, and of course it was automated, from PXE/BIOS to serving applications on the internet, the whole nine yards. But believing that the UI has no effect on the system, or on how you administer it, is just absurd.



