Hacker News
Modern software is at its worst (ddns.net)
62 points by adito on Nov 29, 2016 | 52 comments



Ah yes, the good old days: When cars had carburetors (and none of that fancy electronic stuff), maintenance was so much simpler. Cold weather sometimes got in the way of starting up, but once you were running, the joy!

Using punch cards and mainframes really forced the programmer to think his solution through and optimize the hell out of it. Once in a while your cards would not get entered properly, but in the end this made you a better programmer even if you are still in therapy for it.

When games had fewer than 16 colors, and all you needed were a few pixels on a cathode ray tube. Playing Pong, and Tennis, and hockey all on the same machine! Sometimes you'd forget which one you were playing, but that was also part of the fun!

Progress entails transitions, and we are now transitioning. It is sometimes painful and ugly, but it's always necessary unless you stay static forever ...


The only problem: We're downgrading functionality, while upscaling file sizes, bandwidths, processor load, the complexity of the tool chains involved, the costs of development, etc.

Most of what we do has been around for decades, and there's not that much change at all. E.g., there's little difference between requesting live data in a browser by using a hidden frame and reacting to its onload event (or using the mechanism that later became known as padded JSON), as in the 1990s, and using a nifty AJAX library. (You're still requesting some data from a server to load it into the browser and eventually process it in an onload event handler in order to partially update the UI.) The real difference is the amount of processing and tooling involved in pulling this off.
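For readers who never saw it, the 1990s-era mechanism mentioned above can be sketched in a few lines. This is a minimal illustration of "padded JSON" (JSONP), simulated outside a browser; the endpoint, callback name, and data fields are all invented for the example.

```javascript
// The server wraps its data in a call to a function the client names,
// so a plain <script> tag or a hidden frame's onload can deliver live
// data -- no XMLHttpRequest (and no AJAX library) required.

let received = null;

// The client-side handler that would partially update the UI.
function handleData(data) {
  received = data;
}

// What a server might send back for a request like
// /live-data?callback=handleData -- executable script, not bare JSON.
const response = 'handleData({"price": 42, "symbol": "ACME"})';

// The browser executes the returned script on load; simulated here:
eval(response);

console.log(received.symbol, received.price); // → ACME 42
```

The nifty modern library ends up doing essentially the same round trip, just with more machinery around it.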

And then, there's the bitter irony of the history that started with the iPhone. Before it, there had been smartphones, too, like SE's, perfectly capable of displaying a subset of HTML4, and even a wide range of feature phones perfectly able to display a subset of XHTML. The selling point of the iPhone was its ability to display perfectly normal websites without any further modifications or restrictions applied. Now, file sizes are shooting up (because mobile?) and even your 4K screen is restricted most of the time to what may be done on a 5-inch touch display, at the cost of the high processing loads implied by an abundance of abstraction layers that are never put to sensible use in the first place.

So transition is nice and a good thing (like, if we had made use of multipart HTTP, as in HTTP/1.1 -- but, yes, there was this MS bug regarding HTTP chunks; however, we may now get a second chance), but, nevertheless, we may question the direction of a particular transition from time to time.


You can always compare a skateboard to a car if you want to make that kind of point, I guess.


The way I remember it, carburetors were finicky, needed tuning, and could fail in all sorts of interesting ways.


Yes, my post was meant to be sarcastic; clearly the 'old' is not always 'good'. It's a counterpoint to the OP's rant.


This. I spend probably 4GB+ RAM only to send IMs to people. (I run WhatsApp, Telegram, Slack and Chrome/Facebook)

The sad fact is that the general public only seems to care about the fancy buttons. And from a business standpoint, cheap software (high level langs/tools that are bad for perf) and following current design trends are more important than CPU cycles.

Perhaps we'll see some kind of "back to basics" movement in the future where UI and software become much simpler.


> The sad fact is that the general public only seems to care about the fancy buttons.

The general public is (by our standards) astonishingly bad at using computers.

http://boingboing.net/2016/11/28/people-really-really-suck-a...

https://www.nngroup.com/articles/computer-skill-levels/

Perhaps most users need the fancy buttons, need the leading-users-by-the-nose-through-a-garden-of-visual-delights sort of experience that modern, beautiful, dumbed-down web UIs mostly provide.

The dense old interfaces the author champions are very rewarding and powerful for people who understand how to use them, but they're a huge turn-off for people who don't understand them and who aren't interested in investing the time needed to learn new tools.

Pretty much every web UI post-mortem you see finds the same thing: The more clicks you require of users to perform a function or use a feature, the less users engage with that function or feature. One click is often too much to ask.

Even so, the web as it stands today seems like (from a comp-sci point of view) a monumentally inefficient way of producing the functions it provides, with a ridiculous amount of hardware and software consumed by the problem of displaying, say, a news article which the reader has maybe a 5% chance of actually reading through and a 1% chance of commenting on.


> The sad fact is that the general public only seems to care about the fancy buttons.

I think the general public does care. Every couple of years, people lament the lack of speed their laptops have. All of your programs are beefy because people want quick turnaround. And then the devs never come back to cut the fat.

A friend of mine joked that 32GB of RAM was the new minimum standard, and someone else went a step further to say that disk I/O is the new RAM. We're still quite a ways away from that, but we're trending in that direction.

Edit: This isn't to say that we'll be using disk I/O as RAM. It's hyperbole, but there's no denying that it's now acceptable to ignore the memory footprint of desktop apps.


> disk I/O is the new RAM

If you try to run OSX on a machine with a magnetic drive, you'll see that this is the case now. :/

To make it clear, my Mac Mini was performant about three major OS releases ago, and went to crap after an OS upgrade, so it's not the hardware.


Well, with the SSDs in them approaching 3,000MB/sec reads, it's probably faster than RAM from a few generations ago. Can't wait for desktops to fully embrace PCIe SSDs; I'm talking like 6 slots for them plus the PCIe lanes required to read them all at full speed. Hopefully Zen will get us closer.

Edit: ooh, and shared HBM2 memory between the CPU and GPU. And make it socketed + upgradable :)


There used to be little animated gifs that we'd put on our Geocities pages with flashing "thank you for not using AOL" or "don't surf and AOL" slogans.

IMHO the web needs a movement akin to "nosql", but for bloated crap like JavaScript.

My web browsing experience has only improved after installing noscript.


I think the world is starting down this path; this is the second link to a .txt file in my news feed today :) I've recently been longing for a way to easily filter the internet to text/plain-only content. Not only is there no bloat, the content tends to have actual substance.

Edit: Holy smokes, the site hosting the content is what I'm talking about: http://weeb.ddns.net/1/gopher I'd still like a text/plain-only Google, though; I'd be willing to POST the search request for such a thing via cURL.


This was also the second link I followed to a text file today, both on this ___domain (same author?). I went through the hassle of pinch-zooming to make the first one sort-of legible on my phone, because the content (C without the stdlib) was worth it, but I immediately hit the back button on this one. I really don't understand the inclination to make things harder to use, for what purpose, nostalgia?


I think there's a middle ground to be struck between making something harder to use and making a page that won't even load text without JavaScript, something I find more and more.


Same here. Tried to read the "C without the stdlib" article on my phone; it was only readable in landscape mode, and I gave up after the third time the screen rotated and I lost the line I was reading. I tried it later on my desktop with a 27" screen and all the content was aligned left, keeping 80% of the screen black. I opened developer tools to align the content to the middle and wondered where the layout came from; there was almost no CSS. Then I realised that the author did every line break manually with a <br> in the HTML. Yeah, that's oldschool... facepalm


According to the bit at the top, it's proxied from gopher. The column width limit is probably automatically inserted by the intermediary.


Is it really harder to use? You get nothing but the raw content, which you are free to style as you wish.


Yes, it is really harder to use! The use case here is "user reads the text", not "user with large screen reads the text". I click on the link on my phone and I can't read the text. Even with pinch-zoom to fill the screen, I can still only sort of read the text. If I must gain expertise with the stylesheet of a site in order to manually re-style it to fulfill its primary use-case, then that site is very hard to use.

I'm not even arguing with the theory here. Proxying gopher text files to a simple text-based website is a really nifty idea, but this implementation has thrown the baby out with the bathwater. An implementation of this that conscientiously picks and chooses web platform features that aid in usability (for instance, perhaps responsive layout and typography) while rejecting those that are unnecessary for this use case (perhaps javascript) would be neat.


Sorry, I missed your reply.

I'd argue that a client purpose built for viewing this kind of raw-text content with formatting to fit the user's preference is better - and defaults as you describe.

That said, the enforced BRs are silly.

I'm finding as I move more and more to a CLI for everyday tasks, things like gopher and plain-text web sites are much more accessible to me. Part of my response above comes from a desire to see a simple, consistent means of providing textual documents with minimal requirements or assumptions about what makes something usable for people.


I've fantasized about creating a browser that can only load markdown documents, and render them with a pretty default style. Maybe I should get to work on that.
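A toy sketch of the rendering core such a browser would need, handling only headings, bold, and links; a real one would implement the full CommonMark spec, and every name here is invented for illustration:

```javascript
// Render a tiny Markdown subset to HTML: ATX headings, **bold**,
// and [text](url) links. Everything else becomes a plain paragraph.
function renderMarkdown(md) {
  return md
    .split("\n")
    .filter((line) => line.trim() !== "")
    .map((line) => {
      const h = line.match(/^(#{1,6})\s+(.*)$/);
      if (h) {
        const level = h[1].length;
        return `<h${level}>${inline(h[2])}</h${level}>`;
      }
      return `<p>${inline(line)}</p>`;
    })
    .join("\n");
}

// Inline spans: bold first, then links.
function inline(text) {
  return text
    .replace(/\*\*(.+?)\*\*/g, "<strong>$1</strong>")
    .replace(/\[(.+?)\]\((.+?)\)/g, '<a href="$2">$1</a>');
}

console.log(renderMarkdown("# Hello\n\nPlain **bold** text."));
// → <h1>Hello</h1>
// → <p>Plain <strong>bold</strong> text.</p>
```

A real version would fetch the .md document over HTTP and apply a pretty default stylesheet; the parser stays this small only because Markdown is this small.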


If you do and need help ... I would love to!


I would love a no-javascript, no-external-urls, no-captchaflare movement so much.


It's not high-level languages that yield bad performance. It's developers not taking the time to optimize their architectures at the macro level (and unlike micro-optimizations, these are very rarely premature), or who generally don't have the first clue how to do so.

You can write badly performing code in C++; it's dead easy.


In fairness, it's not the developers but their managers setting unreasonable schedules that preclude any chance of optimization.


That happens a lot as well!

But it's our job as developers to challenge the managers when the expectations are unrealistic. They are very receptive when you put it in terms they understand.

I've also seen plenty of developers jumping head first into code with a very shallow understanding of the requirements or the tech stack they're working with.

For a large fraction of developers I worked with at previous jobs, the mention of "optimization" meant arcane low-level micro-optimizations using techniques the compiler can mostly automate nowadays. Very, very few of them knew what cache lines or branch prediction are. I've seen entire game engines with shipped games on all previous- and current-generation consoles built that way. Then they wonder why a PS3 has trouble running a simple 2D game with 20 sprites on screen.


I would love a super lightweight messaging application similar to what we had with IRC and AIM but I don't think that will happen again. The whole point of faster and larger hardware is for software to take advantage of it.


> 4GB+ RAM only to send IMs to people

On my machine, especially if people are getting GIF emoticon happy, Hipchat will gladly consume a full core of my processor as well. Reinstalls, stops/starts, etc... none of these work to make it behave again. A real battery killer.


So this is just a txt file on a server... why is it completely blank when I open it? I could understand if there was a SQL database and a heavy PHP application serving requests and the system ran out of memory or Apache ran out of open connections... but it's a txt file. I could even see there being an incompatibility between my browser and the site's coding. Maybe they're using fancy new Javascript techniques or their CSS doesn't play well with Safari... but it's a txt file.

I guess it's modern software at its worst: we can't even display a plaintext file in the browser reliably.


From what I understood, it's actually a proxy to some gopher server. And it returns an HTML page with CSS etc., not just a txt file. Funny stuff.



I think the page is just down at the moment. Even when using curl it doesn't return anything.


On Vivaldi here, can't see it either :)


macOS Sierra, Chrome, empty page.


Author complains of UI design degrading, later goes on to mention "(the URL is actually one line, had to split it to fit into 67 char width)".

Fantastic.


It's like he reacted to overly complicated websites by going to another extreme of making a text document on gopher. Simple HTML on www would have made his point beautifully and been functional so that I didn't have to copy paste the URL.


Philip Greenspun's site has always been my golden standard of what you can do with minimal, hand-edited HTML. http://philip.greenspun.com/writing/


Is it, though? I find it extremely difficult to read paragraphs that are that wide.


You could always shrink your window.


Yea he had some very good points but that one made me chuckle.


I'd prefer it if most web sites were textual. Also, disabling JavaScript can have some hilarious effects for some web pages :D


I agree. Most blogs should not require Javascript to read - that includes GA. Just a little bit of styling to aid readability and you should be set.

Related:

motherfuckingwebsite.com & bettermotherfuckingwebsite.com


Just do it in HTML(with functioning links), not a text document on gopher like this article.


I think the main reason for worse performance is not adding unnecessary features or covering laziness. It's the additional complexity introduced by additional layers like VMs, sandboxes, and frameworks. These things make development more fun, quicker, cheaper, and easier to target more platforms with. I don't think software would have evolved so fast if every dev were still doing manual memory management in C.

Don't get me wrong, close-to-metal coding and native, fast UIs are wonderful where appropriate, but there's no sense in doing it for every small cookie bakery website.


I think the sacrifice is worth it. Frameworks and ever growing abstractions mean more people with less experience can implement more ideas faster. We get faster iterations which means more features that help real people get stuff done.

This trend actually helps more people than ever use computers for an ever growing array of purposes. Back in the good old days you had to hire developers from a very small group of qualified individuals and then have them spend months on optimizing new code they just wrote instead of using an existing framework. Today you can just read a tutorial online and create a usable POC yourself. This opens so many doors to both developers and new users.

The resource heavy UI practices actually attract more users. Average users don't want dense console UIs or old fashioned native Windows forms. They want a touch app on their mobile that just works. All the tools we have today allow easily sharing and customizing UI components. This in turn fuels fast iteration and improvement of UI so that actual users actually want to use it.

All this doesn't mean you have to be lazy and just freely consume as many resources as you can. It also doesn't mean you shouldn't shame applications for being slow (I'm looking at you Slack for desktop...) or resource hungry like the examples in the article. But overall I think we are definitely going in the right direction.


Ironically, the text is too small to read on my Android HN client. No zooming either.


This debate will never end. Money. It's money. Start talking to the guy who pays the pipers, not the pipers. Although, unfortunately they won't read your blog because it isn't on Facebook.


The guy that actually pays the piper doesn't even have an opinion on stuff like this.


Archived copy: https://web.archive.org/web/20161129135442/http://weeb.ddns....

I guess this might explain why it broke: Hello there! You are currently visiting gopherspace through a proxy.


The modern software industry can't even come up with a coherent mechanism to evaluate what it produces. It's factionalized according to everyone's path through life.


Kind of ironic complaining about UX, given the website...


I prefer that sort of UX. It places the principle of least surprise first. But then again, I'm not defending a budget for "doing UX." I just wanna be able to read the thing.


Hacker News is at its worst.



