The biggest problem facing the front-end space today isn't so much the complexity of a particular library, rendering technique, or view/model architecture, but rather lots of bad ideas glued together, creating nightmare scenarios for companies trying to maintain products.
A micro-dependency ecosystem with never-ending breaking changes needed to glue different tools and libraries together - bad idea.
Using un-opinionated "libraries" that don't scale well, and using them at scale - bad idea.
Technology organizations trying to stay relevant by simply adopting every next hyped fad out there, rather than stepping back to get a bigger picture of what the front-end space actually needs - bad idea.
The list goes on, for quite a long time.
And all of these issues are further exacerbated by an army of junior developers entering the front-end development space, along with recruiters subscribing to buzzwords to hire them.
To me one of the unsung skill sets of the industry is tool selection. The ability to look at a tool and imagine how it’s going to behave for different pay grades of coworkers, different specialties, and to predict how that will pan out in the future.
Sometimes you pick the simpler tool, and hope it has legs. Sometimes you tweak your product roadmap to dovetail with the tool’s. Sometimes you push to get 25% of a feature set released so you can make progress. Sometimes you temporarily add another tool, and sometimes you fork for a while.
What most people do instead is measure the power of the system, ignoring the Principle of Least Power for third-party code, if not for the entire project. Approachability is often better than power. Quickly knowing what a tool can or cannot do is generally more productive and results in less politics. (Who hasn't worked on a project where person A keeps criticizing Group B for not knowing the framework can already do the thing they wrote by hand, when it's not immediately obvious that it can? That's someone tearing down the team to boost their own ego. I don't like it. I liked it even less when I was the one doing it. Don't be That Guy; he's an asshole.)
Yes, and even though a lot of tech companies killed the architect role, this is exactly the job function assigned to architects, or more specifically enterprise architects, in a lot of more traditional companies.
I haven’t had a lot of luck with architects. In one case I came to a company that got rid of their architect and hired me and another person to replace him (to be fair they were too small to warrant a full time role anyway, not sure what they were thinking).
Especially in the post-refactoring era, they tend to have an inaccurate picture of what the code is actually like and make bad calls based on bad info. It's a responsibility that works better when the people have their hands in the code. I've had better luck with architecture as a job responsibility shared by the lead developers. Large enough companies have a Staff Engineer role that scratches that itch with perhaps slightly less emphasis on people skills (though I really don't recommend it).
yep, if your architects don't have their hands directly on the code, they're making bad decisions.
People talk about enterprise architects as these pie-in-the-sky people who shit diagrams and everything works wonderfully. In practice it's "seagull architecting", with everyone else forced to deal with the reality on the ground.
At some scale architects need to be more hands off. When that happens they need to have a _very_ strong relationship with the people on the ground or it doesn't work well. Even then I would argue they should be getting their hands dirty in terms of reading code, etc, they just may not have time for implementation duties.
I like to go back to basics. It irks me that folks don’t think HTML/CSS/JavaScript are good enough as-is—but nowadays the default browser capabilities are incredible compared to a decade ago. We basically have a full programming environment, and you can add WASM to the mix! Back in the day, the frameworks were first and foremost a platform-compatibility solution.
People often have it ingrained in their psyche that reinventing the wheel is evil, but it’s not—we do it all the time when we write paragraphs.
It’s of course tempting to grab a library but there’s a learning curve and an often hidden long-term cost with using libraries.
I work at a place that is mostly like this, and I can tell you that using raw HTML/CSS/JavaScript, with a bit of Bootstrap, is a nightmare with an SPA.
Menus are constantly broken, back button is a game of roulette, caching is constantly a problem showing stale data, xss and other vulnerabilities are ubiquitous.
There are modern affordances in many of these frameworks that others take for granted.
The underlying issue with web applications is that it's a square peg in a round hole. Remember, HTTP was traditionally designed as a stateless protocol delivering static web pages, more or less.
But here we are, with stacks-on-stacks-on-stacks of layers emulating what truly should be a native application. “Web Application” should not be a thing. HTML/CSS were supposed to be for content and presentation; JavaScript was for sprinkles of functionality.
So no matter how you spin it—each framework is a workaround making the browser do something it wasn’t designed for.
When I say that I don’t mean “abandon all APIs”—I’m just stating why things are so complicated in the web space.
Moral of the story—give us more static content please. “Dynamic” means ads and wasted CPU cycles.
Things change. Just because it was originally made for documents doesn't mean it hasn't undergone such huge changes that it can now literally drive FPS games at 60fps without breaking a sweat. And I say that as someone who really doesn't like many of the underlying abstractions; IMO CSS and HTML are just thoroughly badly designed. Neither is whole without the other, resulting in extremely close coupling (layout, for example, cannot be done in one without the other).
Also, should I really download a random exe to order a pizza? Plus, precisely because the protocol is stateless, a web app makes so much more sense (rather than replicating the state on the backend side).
Ordering a pizza can be done with static pages and some forms.
You want to make Photoshop or a CAD program? That is an app. You want to order a pizza? Just use HTML and forms, maybe JavaScript to reload the progress page once per minute. (You don't need WebSockets or SSE or anything to check pizza progress.)
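To make the "forms plus a refresh" idea concrete, here is a minimal sketch of such a progress page. The order shape, step names, and URL are all invented for illustration, not any real pizza API:

```javascript
// Render an order-progress page as plain HTML from server-side data.
// No websockets, no SSE: the page just reloads itself once a minute.
function progressHtml(order) {
  const steps = ["received", "baking", "out for delivery", "delivered"];
  const done = steps.indexOf(order.status) + 1; // 1-based step index
  return `<h1>Order #${order.id}</h1>
<p>Status: ${order.status} (step ${done} of ${steps.length})</p>`;
}

// In the browser, this one line replaces an entire realtime stack
// for this use case:
//   setTimeout(() => ___location.reload(), 60_000);
```

The server re-renders the same HTML on every reload; the client needs no state at all.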
Not everyone is a fan of throwing away the secure, sandboxed environment easily accessible apps run in for native applications that might not even be built for your OS.
None of what you have outlined is actually a problem.
If your requirement is to deliver an application to users on multiple platforms then a web application is a great target.
People complain about HTML; meanwhile, new platforms (Swift, MAUI, Flutter) still design tree-like documents. It's insanely fast, optimised, accessible, and the tools are fantastic.
Frameworks don't change browser behaviour; they allow you to go up a layer of abstraction. Static content is great for static content. It would be a terrible fit for a rich text editor, a dynamic chart, or anything that requires interactivity.
The ecosystem is great. Native applications can get you a more optimised experience at a cost. That cost is not worth it for many solutions.
Ads are on native compiled apps too, it has nothing to do with HTTP/HTML.
I’ve always been in favor of an entirely different file type, specifically for applications. Imagine if we’d just had something like .aml (application markup language). Then html could have stayed as a document format and we could have avoided all this nonsense.
I liked that react/jsx was born out of the web, and then influenced swiftui, jetpack compose, and flutter for a new way of writing application markup. I wonder: if not for the web and its pain points, would we have landed on similar patterns? Maybe.
Yes! This is the core problem due to the historical trajectory (browser invented for static documents, now being used to build applications) -- one that's no longer possible to sidestep at this point.
I believe there is a phrase "path dependence" to describe such situations.
Many times when people try to forgo frameworks, they end up making their own poorly specified, half-baked version of one that only some people (who may leave the company) understand.
I don’t think an ORM is the same, since it’s more of an API than a framework (in my mind). ORMs also fall into the platform-abstraction arena because they typically have multiple backends, which provides other benefits. An ORM also serves as a wrapper library converting from one language to another (SQL).
Web frameworks are more like… clever hacks to HTML to wedge in a “new way to do it”, each more clever than the last attempt.
Game engines are also somewhat different, because the level of abstracted complexity there is vast and heavily ___domain-specific. They target multiple platforms, like ORMs, and multiple GPU backends as well. They provide physics APIs and other heavy maths capabilities too.
But the browser standards-bodies provide that for us now. HTML5/CSS3/JavaScript will run well across all modern browsers.
I've only used Laravel and Knex.js over the last 10 years and have no idea which ORMs people are complaining about, because the ones I used do not have the issues they talk about.
I agree. I like pure JS, but it just does not scale by default. You need awesome architecture skills, you need to constantly watch your codebase, and you need to write half a framework yourself if you want to avoid frameworks. Of course it's possible, but it's not possible for the vast majority of developers, including myself.
Maybe we need some education: how to write a 100 kLoC pure-JS web app and keep your sanity. I haven't seen that kind of article. I know that my pure-JS web apps can survive a few hundred LoC; then it becomes a mess. With React it's much easier to structure an app so it's maintainable, with different parts separated, etc.
Yeah, the frameworks do force a disciplined approach. It still boils down to separation of concerns: presentation from business logic, etc. Historically the front-end computation lived on the backend, but with Node.js everything sort of mushed together (creating more complexity and possible architecture mistakes). Template libraries are a great way to split View from Model, but even that is taken care of in languages now, like Go templates and JavaScript template literals.
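As a quick (contrived) illustration of splitting View from Model with nothing but template literals, no template library involved; the cart data is made up:

```javascript
// Model: plain data, no markup concerns.
const cart = [
  { name: "Margherita", qty: 2, price: 9.5 },
  { name: "Calzone", qty: 1, price: 11.0 },
];

// View: a pure function from model to HTML string.
function cartView(items) {
  const rows = items
    .map((i) => `<li>${i.qty} x ${i.name} = ${(i.qty * i.price).toFixed(2)}</li>`)
    .join("\n");
  const total = items.reduce((sum, i) => sum + i.qty * i.price, 0);
  return `<ul>\n${rows}\n</ul>\n<p>Total: ${total.toFixed(2)}</p>`;
}
```

Because the view is a pure function, it can be unit-tested without a browser at all.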
I suppose the question is: how much time does it take to master stock HTML5/CSS/JavaScript versus mastering a framework, through-and-through.
Frameworks are constantly in flux, but the foundation they are built upon is a more lasting skill set. And the more time we spend learning framework X, the more time we spend away from the foundations.
> Menus are constantly broken, back button is a game of roulette, caching is constantly a problem showing stale data, xss and other vulnerabilities are ubiquitous.
So… what’s the difference between this and SPAs using frameworks again? Because it sure seems to me that I see many of these issues on sites that are apparently using frameworks. Hell, Facebook — presumably the poster child for the react ecosystem and certainly with the resources to do everything right — is still introducing nav-state related bugs.
Frameworks might focus people’s attention on what needs to be done, but the fundamental capabilities aren’t in the framework, they’re in the browser and the heads of the devs.
And of course, the other possible point the parent is making is not that people should be doing SPAs from scratch (which probably wouldn’t be wise in many cases) but that it’s not wise to start from the assumption that you should be making an SPA.
> Hell, Facebook — presumably the poster child for the react ecosystem and certainly with the resources to do everything right — is still introducing nav-state related bugs.
This doesn't necessarily disprove the framework's value proposition. Bugs like this are hard to squash and at great scale (like Facebook) they're a huge challenge. Frameworks propose trade-offs to manage them, but can't eliminate all classes of bugs. We don't know how much worse it'd be without the framework approach.
If browsers could agree on a themable UI component framework - it would solve so many things for so many people. The jazzy designers can still have their complicated CSS/JS animations and cool layouts. But having a standard solution for normal developers would be so good.
As someone trying to create a frontend with plain HTML/CSS for quite a complex backend, I can attest that it's impossible to maintain consistency between browsers.
It's not even about how each block's styling behaves, but how different combinations of tags, blocks and widgets can exhibit very specific issues in each of the 3 main rendering engines around, in very different ways that require incompatible solutions.
I'm primarily a backend developer, but I've dabbled in frontend at times where needed, and I've always kind of felt there's value in polyfills if nothing else. Maybe you don't go full Angular/React, but jQuery adds a lot of value for not much effort.
I realize jQuery is terribly unfashionable these days, and maybe people would rather use some smaller niche polyfill library, but with how quickly the JavaScript world churns and deprecates, that age is kind of a virtue, tbh. jQuery is 16 years old, which is ancient in the JavaScript world, and the Lindy effect says it will likely continue to be a pillar going forward. You're probably just better off using the standard even if you're not using all its capabilities.
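For what it's worth, the polyfill pattern itself is simpler than it sounds: feature-detect, then patch only if the native implementation is missing. `Array.prototype.at` is a real, relatively recent method; the implementation below is a simplified sketch, not the full spec algorithm:

```javascript
// Polyfill pattern: only define the method when the engine lacks it,
// so modern browsers keep their fast native version.
if (!Array.prototype.at) {
  Array.prototype.at = function (index) {
    const i = Math.trunc(index) || 0;     // coerce to integer
    const n = this.length;
    const k = i >= 0 ? i : n + i;         // negative indices count from the end
    return k >= 0 && k < n ? this[k] : undefined;
  };
}
```

Code written against the standard API then works the same whether the engine is new or old, which is the whole appeal over framework-specific helpers.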
Precisely. There was someone here bemoaning how difficult it was to follow generic advice of using profilers to optimise hotspots in the application code.
He started off with something like: "Why don't you try your 'simple' techniques in a tangled web of hundreds of microservices written in different languages and running on different platforms?"
It's like some people can't see the forest for the trees.
In the last few years, I've come across about half a dozen existing web sites with hideous performance problems, all of which should have been vanilla HTML but were written as Angular monstrosities. The same teams -- against repeated advice -- have started new Angular projects for sites showing static data, anonymously, to the general public.
They start off conversations with "We'll need a web app, an API app, a mid-tier, a service bus, and then this, and then that..."
Choosing a framework for that is bad enough, though you could make a case depending on how complex the rendering is. Still, to go with Angular sounds like the literal worst choice. At least React is just a UI library and can be used server-side.
Unrelated to the debate at hand, my front-end team's junior members' only known design pattern is drop-all-state-on-refresh-driven-design and it pisses me off.
Basically every part of the website written before their time functions correctly.
I thought it's the other way 'round. Everyone loves their cool new reactive framework, until they have to tackle the state library/library helpers/library alternative/other library alternative/library helper maintainer social media posts.
Well, the thing is… for any non-trivial web app, you do need a framework (SPA or Unpoly-like). The question is whether you use an existing one or end up writing your own.
I don’t buy the “just use HTML, JS and CSS” line: given the same developer skills, without a framework it becomes a mess much sooner than with one.
As your code grows you end up creating your own libraries and your own conventions, and as soon as you (the one with the vision, who knew how to do it) leave the company and other team members come and go, it ends up being a disaster: there is no documentation, no maintenance, and since you reinvented the wheel nobody has used your framework before, so everyone has to start from scratch with it.
Popular libraries and frameworks are popular for a reason. Business-wise, it makes sense not to reinvent the wheel and to rely on existing battle-proven, secure and documented tools.
Just don’t reinvent the wheel. Web applications are not “paragraphs”.
> HTML/CSS/JavaScript are good enough as-is—but nowadays the default browser capabilities are incredible compared to a decade ago.
I'm not sure they remember a time when the choice wasn't "X Corporate Framework" vs "Y Corporate Framework" but "Native Desktop Application" vs. "Website Only Application."
I feel like I'm already accepting a huge set of "dependencies" by choosing to develop a web application, and I have a vast stack of technologies to draw upon in building my application already.
In a modern web framework you still have HTML and you still have CSS files.
The problem is JavaScript, and in particular the way you interact with the DOM: browsers expose an imperative API that is obsolete these days, quickly makes writing web applications a mess, and produces spaghetti code that is difficult to modify and isolate.
Practically all modern frameworks instead use a declarative approach: you have a component with internal state and a function that renders DOM elements from that state. When you need to update the view, you don't directly manipulate the DOM elements; you update the component state, which causes the framework to call the render function again, updating the DOM elements as required. That is so much simpler, because you don't have to manually ensure that the state of the application stays aligned with what the user sees on the screen!
I definitely see the elegance of the modern reactive approach, but in practice I'm not sure how much better it really is. I still see spaghetti code, and I still see stupid bugs in production. Doesn't seem to matter whether it's JQuery or Vue. Right now I'd bet that careful architecture and thorough testing are still #1 for making good software.
It does often turn into a weird kind of spaghetti when it comes to things that are inherently imperative. Sometimes hooks make me feel like I’m trying to follow a Tarantino script.
What irks me is the JavaScript part. You don't need to run random scripts on my system; just send me some fucking text! Make it a touch prettier with CSS! The JavaScript is so bloated, spies on you, is often a malware vector, and offers little real user benefit.
> It irks me that folks don’t think HTML/CSS/JavaScript are good enough as-is
I'm one of those folks. Allow me to explain.
I compare using HTML/CSS/JavaScript (or something that transpiles to JS/WASM) to making GUIs in Qt (C++) or with the help of QML. I find HTML/CSS/JS hopelessly complex compared to Qt, with or without QML.
String-based binding of CSS to HTML classes/ids is super error-prone. CSS is not really "connected" to the HTML the way styling my GUI with Qt would be.
Also, the widgets (dropdowns, etc.) I get in HTML are often underpowered and under-featured, so I have to use widget libraries on top, or roll my own.
There is a reason why modern UI frameworks like SwiftUI or Jetpack Compose look more like React rather than pure HTML/CSS/JavaScript. And it is not because iOS and Android can’t run WASM.
I agree with everything you say, but I'm in the camp that thinks that front-end is stabilizing.
I feel many web projects can go a long way with something like Next.js, a few classic libs (e.g., Lodash/Underscore/Ramda), and maybe a few libraries for handling data if you really need them. The design frameworks (Material UI, Tailwind, etc.) are also fairly stable.
Next.js is really bad, IMO and perpetuates more bad practices.
Both Target.com and Walmart.com are Next.js apps.
Both utilize SSR to render the pages (view the markup in the network tab).
Both then STILL send the full data model to the UI (check the `__NEXT_DATA__` on Walmart.com and `__TGT_DATA__` on Target.com) because Next.js doesn't quite offer the right amount of control over what to send back (compare this to Astro.js, which does offer control over which data is needed for the client-side binding).
Next.js handling of images is ugly. It creates tag soup for responsive images instead of using native HTML and CSS capabilities (again, compare it to Astro.js and it's night and day).
Their stunt and exaggerated numbers with Turbopack further contributes to fracturing the front-end community and introduces Yet Another Tool instead of plugging into Vite.
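The embedded-payload claim is easy to check yourself from the browser console. The script tag id `__NEXT_DATA__` is what Next.js actually emits; the helper below just measures how many characters of serialized state the page shipped:

```javascript
// Measure the serialized state embedded in a server-rendered page.
// Pass `document` in the browser; accepts any object exposing
// getElementById, so it is testable outside a browser too.
function embeddedStateSize(doc) {
  const tag = doc.getElementById("__NEXT_DATA__");
  return tag ? tag.textContent.length : 0;
}

// In a real console: embeddedStateSize(document)
```

Running it on a large storefront page makes the "full data model sent twice" complaint concrete, once as rendered HTML and once as JSON.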
> Both then STILL send the full data model to the UI (check the __NEXT_DATA__ on Walmart.com and __TGT_DATA__ on Target.com) because Next.js doesn't quite offer the right amount of control over what to send back (compare this to Astro.js, which does offer control over which data is needed for the client-side binding).
Those two particular customers use `getInitialProps`, which allows them to respond with exactly the data they need to render those pages. Can you tell me more about what other capabilities you'd like to see there?
> Next.js handling of images is ugly. It creates tag soup for responsive images instead of using native HTML and CSS capabilities (again, compare it to Astro.js and it's night and day).
Have you seen our updated image component? It's just an `<img>` tag, it uses all the native browser capabilities, and doesn't depend on React hydration (it even works with JS off!). Here's a demo[1] and here are the details[2].
> Their stunt and exaggerated numbers with Turbopack further contributes to fracturing the front-end community and introduces Yet Another Tool instead of plugging into Vite.
We've been working really closely with Evan You from Vite to present the data in the clearest way possible. Next time we'll make sure that any project we reference has a chance to submit feedback before we publish. We also contribute to SWC, which Vite 4 has shipped plugins for[3]. We're quite confident Turbopack will continue to have a positive impact on the ecosystem.
I don't understand the "right amount of control" bit, there are plenty of options to control what data moves to and fro in NextJS, but the CEO of Vercel has replied to you in a sibling comment; he's going to be able to address your technical concerns better than me.
I'll just add that I'm not familiar with either of those sites, because I'm sitting in Spain and neither operate here, but on first inspection the UX has felt brilliant to me - particularly Walmart's. Everything loads very fast, the search is great, etc. I cannot say if their code or eng practices are a tire fire but the product looks good to me.
> I don't understand the "right amount of control" bit
Try Astro.js and you'll understand. If the data has been used to SSR and has no function on the client side, don't send it to the client.
If two multi-billion dollar, multi-national corporations with multi-million dollar project teams can't get it right in a space where every ms counts, then the average team has no hope.
Also, its implementation details include monsters like monkey-patching native fetch, which cost me insane amounts of money and time to track down (file uploads with more than 15 kB of data broke everything).
Next.js is nothing more than yet another bloated collage of poor ideas with poor execution, but they have marketing going for them.
Next.js has also gotten into many of my government's websites. It's the worst in every direction. I could explain why with bullet points, but I'm so tired of these repetitive, well-known reasons already.
Next.js is really easy for developers of all levels to work with, though. Its main benefit is ease of use, not optimized client-side data transfer.
What I see from Astro.js is a lot of magic. This is great if it works, but chances are it won’t for a lot of people.
It'll be stable when the JS community stops reinventing the wheel several times a year. Every year we have the 'next big thing': a whole new JS framework and ecosystem that's set to be the one to beat them all.
Fast-forward 2 years: it's mostly abandoned, and upgrading a 12-month-old project is a nightmare. That is not stable.
I can pick up a PHP, Python, <insert other language here> script written 2-3 years ago and know it'll work if I try to do something with it. With JS you can guarantee that one of the bajillion dependencies has had a breaking change or vanished off the face of the earth, and the whole thing falls apart.
JS tooling is, and always has been, a total disaster, and NextJS hasn't fixed that.
I've been coding my front-ends in React for 5 or 6 years. I don't have any of the problems you describe; I can also pick up old projects and run them without much trouble. There was the change from classes to hooks, but I still remember classes fairly well. Then there's been NextJS, which I picked up in about a weekend. That's it.
There are new frameworks popping up all the time. Some look very interesting. But React has been around since 2013 and it's a world standard.
I agree that JS/TS tooling is largely a mess, it's why I don't like JS/TS back-ends other than maybe some simple NextJS API routes, or an Express server with something like three files. But on the front-end, React CRA and NextJS (with TypeScript) both have served most of my needs with just a few CLI commands and minimal config. I rarely have the need to meddle with the tooling and I can get up and running in seconds.
I was given an old, unmaintained React project to build a pipeline for, and the only way to run it was on Node 12. It would not even build on recent Node versions.
JavaScript itself is entirely backwards-compatible back to 2002. If anything, I'd argue raw .NET is far worse than raw JS for backwards compatibility, because .NET has deprecated entire languages like VB6.
To compare something like React to .NET, let's look at .NET libraries and frameworks. For example: Silverlight, WebForms, WPF, WCF SOAP, old versions of EF, or old versions of MVC. It's not been pretty for .NET, and you're in a bad place if you still have a Silverlight app in 2022, because it only runs on IE11.
Every language and library suffers/benefits from the onward march of progress. It's disingenuous to claim JS is the only ecosystem that has significant churn.
It's becoming a standard, and yes, I personally like it. There's plenty of other great choices if you don't like it.
And even some of the up-and-coming libs, like SolidJS, feel stable compared to the 2010s: they add incremental improvements but keep the things some people really like, such as JSX. In the early-to-mid 2010s, everyone was reinventing the wheel constantly.
Do you have any real commentary against Next.js not being standard, or only generic snide and feelings of superiority directed at "everyone" but yourself?
I'll share a few facts about Next.js:
- Their showcase: https://nextjs.org/showcase#all. The number of super-scale websites using it speaks for itself (and that doesn't even include, among others, Walmart, which another commenter pointed to as an example of how terrible Next.js is, but which I've found to be surprisingly good)
- Explosive growth from 2017 to 2021 in the State of JS surveys, with 91% purported retention in 2021
- The core tech, React, was voted by far the most commonly used front-end framework in the SO dev survey 2021, and the 4th most loved. You would never guess that by reading HN.
Many, if not most, people building an SSG or SSR site in React are going to reach for it. If this does not point to a standard, then I guess all that's left is to run in circles and argue about what a standard is.
Don't take this the wrong way, but all those impressive figures you've shown don't compare to actual experience. Some of us detractors have 15+ years doing stuff on the Web and we've seen this cycle play out 2 or 3 times. Everything is cool and the best way of animating divs, until it isn't.
I do not like playing the experience card, but when someone tells me React is good and simple, it just tells me they have no idea whatsoever what good and simple has ever been.
And if Next.js is supposed to be a standard, then I might as well quit doing framework at all because it is not a very good library, it just has great marketing, and thrives upon the shoulders of the most common frontend library, React. Sorry to the devs which are often here to PR, but that's how it is. It gets you easily to 80% of the way, the last 20% are really where the issues (bad docs, bugs, constant churn) lie.
I built my first website in the late '90s or very early 2000s; I can't even remember. I also think I knew what simple was, back when users wanted very little interaction. Then, in the middle between then and now, building a website with jQuery was a nightmare. So pray, do tell: what do you build your websites or web apps in?
> React is good and simple
I in particular never said it was simple. React, particularly in SPA form, starts creating challenges pretty early on with regard to state management and app architecture. But I have never seen a solution for building large SPAs that doesn't have pitfalls.
In the SSG/SSR realm, I'd argue Next.js is quite simple.
> don't compare to actual experience
> Some of us detractors have 15+ years doing stuff
> I do not like playing the experience card
> it just tells me they have no idea whatsoever
> it just has great marketing
> Sorry to the devs which are often here to PR
> the last 20% are really where the issues (bad docs, bugs, constant churn) lie.
About 80% of your comment is about how much better, more experienced and more genuine you are than everyone else: as opposed to us schmucks you accuse of doing "PR", you are here to deliver simple, unadulterated truths. I would have engaged with any technical points you had made, but since there aren't many to speak of, I guess all that's left for me to say is that it's fantastic that you feel so good about yourself.
Nearly 10 years after the release of the library, React is by far the most commonly used framework according to the 2021 SO dev survey. It is also the number 4 most loved. [1]
I'm a React dev as much as I am a Python or C# dev. I reach for tools that I know I can build in and hire for. For years now I've used React by running a single command line to kickstart a CRA or Next app and taking it from there. Yes, there are pitfalls once you start building anything non-trivial, but the same is true of Django, which has been around since 2005 and, by virtue of being back-end, should be more stable.
Your choice of examples got me interested - because mine would be similar. Would you recommend something else?
Besides Material UI, I wouldn't recommend any React-based UI framework (Tailwind is only CSS, but a solid pick with React nonetheless).
I would not recommend NextJS without context, but it is, to my knowledge, the most mature full-stack React-based framework.
As the old saying goes: "We have a new framework that will make the existing ones obsolete." This makes n+1, not 1.
An agency usually deals with different customers, different settings.
What I achieved by migrating dozens of apps to an Angular-only frontend is a platform: reusable components, and devs that can easily switch projects. This is the one and only framework we use: monoculture.
It's a beast: we could abstract away certain processes and developed a no-code editor on top of the component platform. This editor alone handles 100+ apps.
No pain whatsoever migrating from Angular version to version.
Best decision ever, with so many benefits, even in abstract logical terms. For example, we could simply compile into any other framework by rebuilding some of the components in that framework.
You won't get these benefits if you fall for fad after fad. Monocultures help in certain settings.
I could see Angular working out great for certain types of projects and teams. I used it for the front end of one project and didn't dislike it at all. Unfortunately, I see Angular mostly relegated in favor of React.
Angular is still the go to framework in Java and .NET shops when a SPA is part of the project.
It is kind of changing in Sitecore projects, because they have a collaboration deal with Vercel and are pushing Next.js/React as the main framework for headless CMS workloads.
Any project that changes often is going to break often. When it involves dependencies that are constantly being upgraded or added, the risk of breakage compounds.
Test coverage and being conservative with adding new dependencies/concepts is always a valuable culture to develop for any frontend team working with junior devs. A good CTO/senior dev can clamp down on this behaviour.
I agree with the dependency commentary. Tech organizations should value their supply chain. Development is slower, but power comes from ownership. I'm also anti-JavaScript, at least in abundance.
I noticed this in gaming as well as software... engineers always build for the best hardware, despite everyone knowing that extremes (in the wild) are a minority.
I believe tech needs to focus on building better 4-cylinders rather than expanding cubic inches.
Partly I think that's why there was some dismay at Deno's decision to tunnel packages from npm: a contingent of folk wanted a reset on the JS ecosystem and its (lack of?) culture.
This! The problem is also that as our apps grow, the server-side and client-side parts grow at pretty much the same rate. For each new piece of functionality added on the backend, we end up writing more UI code. This is a problem. We need to start thinking of how a UI frontend can be implemented as a separate app, completely independent of and agnostic to the backend - to the extent that I can take the same frontend code, plug it into any compatible backend, and things will just work. We need a frontend client that can work with any backend (your app, my app, her app). The UI across all apps is mostly the same, and we keep rewriting the same JS/HTML (albeit with different JS libraries) each time we build a new app. Let's start thinking of how to build a general-purpose frontend client that works not on low-level details (JS/HTML) but on higher-level abstractions.
The primary solution to all this is to remove technologies that are not required, especially those that duplicate what the web as a platform already provides (i.e. works in browsers/runtimes directly). For example, React has been technically obsolete for a few years, yet that doesn't mean anyone knows what to do instead, regardless of whether those implementations were ubiquitous 1 year ago or 15 years ago. I find it remarkable to read job listings for up-and-coming concerns (e.g. Anduril, BioNTech) basing their platforms on long-outdated notions and implementations, which bring with them multiple vectors of security risk and unnecessary costs in developer productivity and complexity. Sass and npm/yarn are other obvious examples of tech that costs more than it benefits in modern work.
Over the last 3 years I've started to feel more and more vindicated.
I explicitly opted out of most web development when it turned into angular/react and later vue. I've done work in both angular and vue so I've not been able to avoid them completely, but mostly.
And the reason was a fundamental disagreement with their approach. It has _always_ felt to me like people took a good idea (AJAX) and took it entirely too far.
And they found themselves fighting fundamentally with the browser, so the industry's solution was to create new standards such as the history API, when the REAL solution was to just not do what they were doing.
Does react bring legitimate good ideas? Probably, somewhere, but thank god we're starting to see the light at the end of the tunnel.
imo the best thing to come out of it all is typescript.
I use React and Svelte for different projects. I keep things really simple and it's rare that I need to rely on an external runtime dependency (build tools are another thing), if something is relatively small I will code it myself. I have also built my own state management library (not open sourced yet) that is literally exactly what I wanted out of state management, so I don't need to rely on gazillion plugins to make something possible, I just code it myself.
Seems like a half-baked agency-built version of htmx[1], no?
I'm actually very intrigued by the whole "let's take a step back and move a lot of our logic back to the server" approach to modern dev, but IMO when you need more than statically rendered pages, it should look a lot more like htmx or Phoenix LiveView (if you're using a framework) or something ultra-minimal like Slimvoice[2] if you want to go the bespoke route. Not this.
Alternatively, web developers could calm down and just write mostly HTML with some plain javascript where warranted.
I've recently done some web development for the first time in 12 years, and I was horrified at all these pointless frameworks that break every usability win web browsers have made in the past 20 years, cost extra bandwidth and have atrocious performance. Like, why? Can web devs really not learn the 4 functions that make up the websockets API? Do they really need a special framework named after a coffee bean?
This kinda crap is why the gmail tab uses a GB of memory.
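For what it's worth, the "4 functions" of the WebSocket API mentioned above really are about all there is on the client side. A minimal sketch (the URL is a placeholder; the WebSocketImpl parameter only exists so the sketch can run outside a browser - in a page you'd omit it and get the global WebSocket):

```javascript
// Construct, send on open, handle messages, handle close - the whole API.
function connect(url, onMsg, WebSocketImpl = globalThis.WebSocket) {
  const ws = new WebSocketImpl(url);
  ws.onopen = () => ws.send('hello');   // fires once the socket is up
  ws.onmessage = (e) => onMsg(e.data);  // called for each incoming frame
  ws.onclose = () => onMsg(null);       // null signals "closed" in this sketch
  return ws;
}

// In a page: connect('wss://example.test/echo', (msg) => console.log(msg));
```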
Tools like htmx aren't the reason the gmail tab uses a GB of memory. You'll find the majority of cases where pages are using way more resources than it should, are due to reasons forced on web developers, like a billion different trackers and several different ad networks, and workarounds to ad blocking to sell more subscriptions, etc.
This is what is so shocking to me when HN spends such an absurd amount of time rallying around this idea that frameworks are the reason sites are bad. Mind you, I do think a lot of them are misused, but 99.9999999% of poor websites aren't because of the framework chosen.
According to my browser, the default view in gmail needs 8.32 MiB of javascript spread over 90 files to render.
While IDLE, the gmail tab uses 10-30% of an M1 CPU core.
That stuff is not down to tracking - it's because the damn thing keeps messing with the DOM, because it has some insane structure where javascript controllers are attached as strings to DOM elements, because it uses about 100 divs to render every row in the list of messages, and because every message is nested literally 15-30 layers deep in a pointless tree of divs.
I am sorry, but web frontend is an insane dumpster fire.
I haven’t tested the following claim and I'm not at the computer atm (^U, see below), but divs must be cheap. They’re just structures that a rendering engine walks through, computing hierarchical offsets via stylesheets.
I’m not arguing for using hundreds of divs, because that’s a sign of engineering insanity imo, but it’s not a performance issue. You can create a static page with thousands of them and (likely) see that it still doesn’t use 250 MB+ or idle at 30% of a CPU core.
Added:
damn thing keeps messing with the DOM … web frontend is an insane dumpster fire.
Totally agree with these parts.
—————
… I just tested 50 div-rows with 200 spans containing numbers from 1 to 199. Total 10k elements, text doesn’t fit on the screen. Bootstrap css+js is imported via CDN.
Mithril.js implementation uses 50Mb of RAM consistently.
A static page with the exact same content uses 71-96Mb depending on the run.
Both use 5-14% cpu when I shake my mouse and 0% when I don’t.
Refresh is significantly faster with Mithril, idk why, probably because static html parser is slow. (Opera, Windows 10, no extensions).
With a single span per row instead of 200 both take 23Mb.
Removing bootstrap from the head inconsistently frees 0-2Mb for both.
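The static half of a test like that takes only a few lines to generate. A sketch, with row/span counts mirroring the numbers above (the `buildRows` helper is made up for illustration):

```javascript
// Build 50 div-rows of 200 spans each (10,000 elements) as one HTML string.
function buildRows(rows = 50, spansPerRow = 200) {
  let html = '';
  for (let r = 0; r < rows; r++) {
    html += '<div class="row">';
    for (let s = 1; s <= spansPerRow; s++) {
      html += '<span>' + s + '</span>';
    }
    html += '</div>';
  }
  return html;
}

// In a page: document.body.innerHTML = buildRows();
```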
Chrome on an M1 Pro Mac. This is the number that Chrome’s task manager shows me.
I do have a lot of emails, but I think it should only have to worry about 100 of them at a time, so I don’t know why it’s different for me and you. It is possible that my account still receives beta features, because I used to work there, but I remember it being like this for multiple years, since the new gmail came out.
Are you using Chrome through Rosetta? I'm still unable to replicate this on either my desktop or laptop, both of them just idle when I open Gmail and neither will push past 5% usage while I'm opening emails. Your metrics seem a bit odd here.
This used to be true but isn’t anymore. For a while they pushed Angular, nowadays I think a lot of teams are using react. GWT is long deprecated and gone.
From what I understand (no special insider information) React is essentially non-existent inside of Google, and NPM is not even available to engineers (be default)
It depends what org you're in. Cloud seems to use a lot of Angular, but the rest of the company is coalescing around an in-house framework called Wiz (you can poke around the open sourced jsaction library to get an idea of the design philosophy).
To be fair, Gmail was quite great in the GWT era. Not sure how much use their Closure “platform” still gets, but I think it was very ahead of its time in terms of maintainability of large JS codebases (maybe even ahead of what we have now).
If you can limit yourself to exactly one, maybe two external deps, which are really just to make your dev life easier, you may have something of a point.
You have to remember that a large chunk of "web apps" means "electron apps" which are absolute bloated pieces of junk next to their desktop cousins.
You also have to remember a 'framework' typically doesn't stand in isolation - tens, hundreds, thousands of dependencies typically lurk.
We all agree tracking junk makes it even worse, but the whole thing is already overcomplicated before you get to that point.
> You also have to remember a 'framework' typically doesn't stand in isolation - tens, hundreds, thousands of dependencies typically lurk.
While this is true, at least by having a large number of people commit to the same set of dependencies, you stand a higher chance of complications between those dependencies being ironed out, either directly by the framework maintainers or by the user base of the framework.
> This kinda crap is why the gmail tab uses a GB of memory.
Actually no, and the real reason is organisational.
It has been a pattern for years now for front-end projects to consist of multiple, independent modules developed by separate teams - banking apps are a prime example of this.
Gmail appears to have gone the same route, because it now sends over 200 requests when loading - a hallmark of a highly modularized front-end.
The organisational benefit is less knowledge required per developer, which in turn brings other advantages like resistance to turnover.
You can say that about any engineering problem and be right 99% of the time. Because it’s always true, it’s not a super useful analysis.
Somehow, software on embedded, the kernel, graphics, flight control, etc. is all developed by equally large organizations and while they’re far from perfect, they produce far better software than the vast majority of web UX organizations.
Something about web dev is uniquely terrible, in addition to the common factors you mentioned.
I think you judge it like that because errors in UI are usually the most visible.
Meanwhile I can't get my laptop into sleep mode at times because such a seemingly simple feature can't be reliably implemented for some reason and it's been like that for years - that's just one out of many instances of non-web software being terrible.
There's a lot of great software that's not written by people who work with web development. There's also plenty of utter garbage software that's not written by people who work with web development. I don't think web development sticks out enough to make it unique.
I’m actually happy about how garbage reddit is. I considered installing a redirect plugin to send me to the old.reddit subdomain[1], but ultimately decided I shouldn’t encourage myself to use Reddit, having it load into that dumpster fire of a front end has really helped discourage me from using or interacting with it. I can happily report my reddit usage is at an all time low!
[1] fun fact, you can shorten this to “ol.reddit.com” which is nice because it only requires one hand to touch type “ol.” and get an autocompletion. I always thought reddit was a great name because for right handed users, “redd” could be typed with the hand that would typically stay on the keyboard and would 100% get an autocompletion. Alas, most users are now on mobile.
Yeah, I agree - the old version of gmail was perfectly reasonable. But the current version is IMO representative of trends in web dev. Google is both influential and tends to follow the recent trends. Gmail is by no means unique - every web app created in the past few years is the same insane jumble of frameworky js code and a DOM 100 layers deep, just because someone can't be bothered to learn CSS and plain JS.
Google's frontend engineering tends to be pretty poor and out of step with wider industry practice in my experience. They have created not one, but 3 of their own frameworks (Angular, Angular 2 and Polymer), two of which (Angular and Polymer) were rather poorly performing, and the third of which is kinda ok but doesn't seem to be that widely used internally. They tried to create their own frontend language (Dart), which nobody uses.
Meanwhile most of the industry is using React, which is actually pretty well engineered and perfectly fast (you may well find yourself using a slow React app, but that's not React's fault).
P.S. Talking of layers of DOM nodes, I actually did a few samples of this the other day. Gmail has 34, which was by far the deepest of any website I checked and probably isn't helping its performance.
The original version of GWT was mainly about solving compatibility issues between browsers (of which there were many). Modern GWT/J2CL is all about leveraging the mature Java development tools, especially code completion. GWT can be used with just the standard DOM APIs very effectively these days via the elemental2 library.
That makes sense, but I also recall it was about trying to use the same language on the server as on the client. And it worked about as well as the efforts to use JavaScript on the server. Which is not to say it can't work. There are a lot of traps there, though.
Material Design is also Google’s fault. Regrettably a pretty sizable chunk of the industry either uses it directly or—worse—incorporates elements of it in their design.
I also have a pet peeve with the Roboto font family developed by Google and used everywhere on the web. But I’ll admit that it is a fine font face for its purpose, just very boring and used way too much.
    function getMaxNestLevel() {
        var i = 1, sel = '* > *'; /* html > body is always present */
        /* keep lengthening the selector until no element matches it */
        while (document.querySelector(sel)) {
            sel += ' > *';
            i++;
        }
        return i;
    }
    console.log('total nodes', document.getElementsByTagName('*').length);
    console.log('max nest level', getMaxNestLevel());
I feel qualified, because I’ve been a SWE for 20 years, have worked in lots of domains (including web, sometime ago) and so I feel I am able to draw comparisons.
Not for nothing, but when I wrote a couple of web apps recently (most recently a market monitoring tool and a physics simulation in wasm and webgl for rendering), I tried different frameworks, but always ended up dropping them. The end result is snappy and fits in kilobytes, and the freaking back button works.
I did of course use some libraries for rendering things like graphviz. I am not a crazy person. But I fail to see what value e.g. react.js* adds, or why there are libraries to wrap websockets.
* For react specifically, I can actually imagine scenarios where it’s useful, but in practice, 95% of places where it’s used should have probably just been HTML with some minor JavaScript for bits of interactivity.
I've also worked as a SWE for many years, and I can tell you hands down that self-rolled frontend codebases are _always_ harder to maintain and add features to than ones that lean into frameworks and libraries. Frameworks are shared understandings that transfer easily between contexts.
You don't mention maintainability in your examples, purely things like back button support and disk footprint. They're just pieces in the puzzle.
React and Angular are both blazing fast, and are definitely not pointless. Open create react app, spin up a page with 50000 buttons, compare that to a plain js version, then come back and tell us about the atrocious performance.
Apparently you've never heard of event delegation, which has been around for at least twenty years? (I could swear I was using it on IE6, but I might be misremembering.) At any rate, the plain JS way of handling 50,000+ HTML buttons efficiently is a long-solved problem.
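For readers who haven't seen the pattern: event delegation means attaching one listener to a container instead of one per button, so 50,000 buttons cost no more than one. A minimal sketch (the element id and data-index attribute are made-up names; the routing logic is pulled into a helper so it can run outside a browser):

```javascript
// Pure helper: given the clicked node, find the enclosing button's index.
function buttonIndexFromTarget(target) {
  const btn = target && target.closest ? target.closest('button') : null;
  return btn ? Number(btn.dataset.index) : null;
}

// Browser wiring: ONE listener on the container handles every button.
// document.getElementById('button-list').addEventListener('click', (e) => {
//   const i = buttonIndexFromTarget(e.target);
//   if (i !== null) console.log('clicked button', i);
// });
```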
> This kinda crap is why the gmail tab uses a GB of memory.
It isn’t. Ironically things like memory leaks are why pages use a ton of memory and in my experience everyone coding from scratch results in more of them, not less.
I really don’t agree. When I worked at Google, I filed a couple of bugs for memory leaks in the new gmail, and to my knowledge they still haven’t been addressed two years later, because there are a million dependencies and the thing is so complex no one can debug it, or even come up with a reliable repro.
Strong disagree. Frameworks are almost always going to be better profiled, better tested and have more eyes than any home crafted code. The number of lines isn’t really that relevant.
Well, the thing is… for any non-trivial web app, you do need a framework (SPA or Unpoly-like). The question is whether you use an existing one or end up writing your own.
I don’t buy the “just use HTML, JS and CSS” line: given the same developer skills, without a framework that becomes a mess much sooner than with one. Unless you’re just building a small landing page or todo app.
Unpoly is pretty good, and operates on a slightly different level than htmx, which is all about AJAX. Also, this slideshow is from 2016, when htmx was pretty much unknown.
>I'm actually very intrigued by the whole "let's take a step back and move a lot of our logic back to the server" approach to modern dev,
What's old is new again.
There's an entire generation of developers now who have no concept of the old world. They started with React/Angular/Vue and have no idea that SSR was the standard default for decades. And now it's a "new idea" that is held up against JS development with no understanding of why and how we got to where we are, and the tradeoffs that were made along the way.
I started out hand-coding static html pages, graduated to Drupal and Wordpress php stuff circa 2009 (glad that's over!), then worked on a largely server-rendered Rails SAAS in the APM space that I guarantee you've interacted with if you've been in the web game for more than a couple years. These days, I may sling React for my 9-5 but a lot of my personal projects and internal tools are vanilla express apps with very little client-side code.
It's true that a lot of newer devs really don't have a great deal of context for why React & company became the de facto default for building websites and how much we used to get done with largely server-based architectures. I wish we did a better job of teaching fundamentals here rather than bootcamping everyone straight into building SPAs.
However, it's also true that the baseline expectation for web experiences is a lot higher in 2022 than it was in 2009 in terms of the level of responsiveness, interactivity, and overall "app-like-ness", to the point where I think that even with the massive improvements in bandwidth, latency, and web protocols, we still need to accept that there are many cases where just shoving full HTML documents over the wire isn't enough to satisfy user & stakeholder expectations. That's where stuff like LiveView, htmx, and Web Components become interesting to me. The web has evolved, and finally the server-side-application paradigm feels like it's starting to evolve along with it, and these evolutions do feel like they're novel & useful enough to deserve being called "new ideas".
> However, it's also true that the baseline expectation for web experiences is a lot higher in 2022 than it was in 2009 in terms of the level of responsiveness, interactivity, and overall "app-like-ness", to the point where I think that even with the massive improvements in bandwidth, latency, and web protocols...
What? The exact opposite is true: the average web app (including ones like gmail) is far less responsive, slower and heavier than it was 10 years ago. On my M1 Pro, gmail renders at about 20 fps and takes 2-3 seconds to load on gigabit fiber. The web has never been shittier than it is today.
Agree 100%. It also makes me laugh when people talk about server-side rendering: serving an HTML page isn’t rendering. The fact that a couple of megabytes of JavaScript ever became responsible for making the simplest of pages even display is possibly the worst technology regression I’ve ever seen.
I think 'rendering' here means something like 'doing what's needed to produce the final HTML for the browser to render'. You can produce the final HTML on the server (server side rendering), or the front-end (e.g., via DOM manipulation), or mix them up (e.g., HTMX replacing parts of the DOM with pre-rendered HTML from the server).
i.e., there are two different steps that can each be called 'rendering': templates to HTML, and HTML to what the user sees in their browser.
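The HTMX-style mix mentioned above is small enough to sketch in plain JavaScript: the server produces the final HTML fragment, and the client only fetches it and swaps it into the DOM. (The URL and selector are placeholders; the fetchImpl/doc parameters exist only so the sketch can run outside a browser - in a page you'd call it with just the first two arguments.)

```javascript
// Fetch a server-rendered HTML fragment and patch it into the page.
async function swapFragment(url, selector, fetchImpl = fetch, doc = document) {
  const res = await fetchImpl(url);              // server did the "template" step
  const html = await res.text();
  doc.querySelector(selector).innerHTML = html;  // client just patches the DOM
  return html;
}

// In a page: swapFragment('/fragment/inbox', '#inbox');
```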
I'm inclined to agree with the main sentiment of your post though, and lean myself towards solutions that rely on the server heavily with just a splash of javascript on the front end.
> However, it's also true that the baseline expectation for web experiences is a lot higher in 2022 than it was in 2009
Is it? I think this expected behavior and increased complexity comes more from designers and product owners than from real, actual users.
I, as a user, still enjoy the old Reddit (old.reddit.com) a lot more than the new one. I even prefer Hacker News to many other, more “modern” forums that feel slow and bloated. And I also prefer the current GitHub to what I bet it will become the moment they start moving it to React.
I mean, the broad problem is web performance. I think that client-side rendering is a red herring, and really not the main problem. I think part of that issue is that developers don't spend as much time optimizing performance, because it's less clearly necessary up front.
Web in particular is very hard to optimize for performance because of its inherent platform limitations. For applications, people expect as-good-as-desktop. But you can't just download a big .exe and install it -- you have to transfer everything over the wire, which is obviously very expensive. So the big optimizations are about reducing how much you need to do that. In apps where people click to new pages and make server round-trips for interactive data constantly, you can reduce the amount of time you're blocked by caching in the browser. For pages that transfer a lot of data, you must use webpack and similar tools to compress and optimize the various bundles.
The second issue is that when people experience slow and annoying websites, it's normally because of ridiculous ad tracking and subscription popups, which are things most web devs don't really care for. But this is also how the web is funded, so you can't just wish it all away. But this is completely unrelated to client-side rendering frameworks.
Another point is it's not like servers are automatically fast. Sure, more powerful than client computers, but also serving many times more people at once. You'll have to do performance optimizations for servers too.
So my main point is that 1. client-side rendering being slow is usually a red herring. and 2. with any site, you should be optimizing performance. That includes both client and server, both of which can be tricky to optimize
And I do think baseline expectations are higher. For example, on old.reddit.com, when I open a post, it opens a new page. When I go back to the post list, the whole page reloads, which takes time (and I'm on a gigabit network). It's a lot faster in new Reddit, where everything happens on the client. At the same time, I'll certainly agree that it's very poorly optimized in many cases. I think there are some memory leaks. It's a solvable problem, though. It's not bad because it's using React; it's bad because they haven't taken the time to optimize its performance.
And talking about GitHub. The new client-side nav is a lot faster for navigating around a codebase than the full page reloads it used to do. It makes sense, that's a huge part of the page you can just leave there. It's a performance optimization to use more client-side rendering in this case, pretty clearly. Fewer round trips to the server (very slow), less data to fetch overall because half the stuff on the page doesn't need to change... etc.
And products like Google Sheets and Google Docs have also really pushed hard on client-side interactive features that were not common 10+ years ago, but are more common today.
And talking about hacker news... here are some problems hacker news has that modern websites typically don't have:
- Form submission weirdness, particularly around the back button
- Have to go to a new page and reload the old page when writing/editing a comment. More round trips to the server.
- Styles are poorly optimized for visibility on both desktop and mobile.
- On mobile, touch targets are very small.
- Form markup is limited and opaque (e.g. most people expect to have a toolbar for rich text options).
Yes "app-like-ness" when that's appropriate. But a lot of the web isn't that. Yet devs / agencies are using a sledgehammer (e.g., React) when a Phillips head screwdriver is what's need.
Users get experiences they don't need (nor want). Site owners gets a maintenance dependency they don't want (nor need).
I think the basic divide is something like "CRUD forms" vs "interactive applications".
I've built basic CRUD forms with ASP.NET MVC. I've built them with Rails. I've built them with React (+ a hundred random libraries). I've also built interactive "apps" in those languages.
Looking back, the amount of "interactivity" that React adds to a CRUD form is NOT worth the added complexity. But! Right now my dayjob is creating an _insanely_ complex app that you could not have done five years ago with Rails (or, like this presentation is about, something like Unpoly).
I think a problem is that React is just more _fun_ to work with than basic server stuff so devs want to work in it. The added complexity is worth it to have more fun. Maybe that's just me, though. I know a lot of people see React as a hammer to hit every nail with, and I was like that for a long time, but I'm starting to come back around to more server-driven use cases for simple sites.
I've been messing with Fresh a lot lately and it's a nice middle ground of defaulting to rendering _most_ stuff on the server, but you can have "islands" of interactivity that get sent to the client as JS. I'm not sure if it will end up gaining traction, but it's pretty nice.
This stands out to me as one of the cases very few JS frameworks have gotten right. Remix, with its "all mutations are just form actions" approach, is the only one that stands out to me as having done it well.
I agree with everything you said up to the part about Fresh and Deno, where it looks like you’re falling into the same trap as those wanting to use React. You don’t even notice it, but this is the problem: always thinking the next shiny tool will solve the problems.
Rails, Laravel and similar frameworks are great for CRUD apps. No need for Fresh or Deno or Svelte or Next.js for that.
I don't think that Fresh or Deno will solve any problems that aren't already solved. In fact, there are a lot of very frustrating bits about Fresh that worked a lot nicer in i.e. Rails. I would not use Fresh in production right now; if I were making a generic CRUD app I would probably use something a lot easier and with a bigger community around it.
From my experience, this problem is also rooted in designers and product owners.
My own team “modernized” a forum/blog tool used for internal documentation by moving a lot of it to React and an SPA architecture and adding a ton of “app-like” features, and I just hate the thing now. The old version, which we still run on some installations, is way, way faster, easier to use, more responsive and more reliable.
SPAs solved a labor problem I think. It was the reason that front end development grew so fast.
It is much faster to train people on a combo js+css framework rather than training someone on backend languages, databases, queues, authentication, scaling + html & css for server side rendering.
That seems strange to me, as being an "old school" developer I have a lot more trouble trying to get the hang of the new front-end stuff than I do learning a different back-end framework.
But I guess that is just me/generational? Ironically, I feel like there are a lot more job openings for front-end than for back-end now, and I'm much more comfortable on the back-end.
We use SPAs and a pretty strict typescript/react stack. And I still have to know:
Cloud tech (AWS), which also includes Lambda, IAM management, DynamoDB, CDK or Serverless, API Gateway, S3, secret managers, etc.
Add to that list the technologies that often get thrown in for extra monitoring testing etc. Jest or mocha/Chai for unit tests. Dynatrace for monitoring. Kibana or something for logs. Some tool for analytics. Github actions for setting up deployment and CI. Maybe you need redis for intermediate caching, etc.
In addition, before you kind of sort of had an intuition of how the data flowed through your basic stack from the database to the web page. Nowadays who the heck knows what's happening. You'll hit some api gateway endpoint which auths through a random lambda who knows where on what server, then it will go hit the actual lambda that holds the function you want to call which may reach out to a database but get intercepted by the redis cache etc etc etc.
I see this “AWS for everything” Well-Architected stuff everywhere. AWS benefits greatly from inserting itself in between all our architectural layers.
There’s nothing stopping companies from using the cloud for its primitives (compute and storage), maybe with managed FOSS services (RDS Postgres). We don’t _need_ to go all in on AWS to build a ‘modern’ web application. Yet somehow much of the industry dances to AWS’ tune on how to architect software.
Architects in my company are required to have AWS certificates. They can be monkeys, but if they can list a dozen AWS products and put everything there, they are given the keys to everything.
You may work at a company that requires fullstack expertise, but there are plenty of other companies where frontend developers are fully insulated from all those cloud technologies. I'm not sure what your point is here tbh.
A friend of mine moved to a company where they use plain Laravel (with just some sprinkles of js using Alpine) and they deploy to heroku. A team of 10 devs, no systems/devops/Infra and he’s fascinated how well things go and how fast they ship stuff compared to where we worked together before (with all the usual react/redux stuff and an elixir backend on kubernetes)
I wouldn’t say modern web dev is simpler, I just think it is more decoupled. This helps with training for & staffing the more specialized roles that emerged.
Users wanted responsive UIs and Gmail showed the power of AJAX in the browser. In the mid-2000s, server power, network latency, and maintaining state were the challenges. The UX was more powerful when the client tracked state, only requested the data it needed, etc.
Things have flipped. SPAs became bloated as abstractions were introduced. Network latency and server power are not the issues anymore. Rendering a bunch of HTML is as quick as rendering JSON.
As a vet of the IE7 days, I love this trend. Leveraging the best of server compute and browsers is going to simplify web app development a LOT.
Not quite what I remember. XMLHttpRequest was invented by Microsoft and used in Outlook Web Access. This was only possible in IE6.
But Ajax is merely a pattern enabled by the 'dynamic HTML' that having a DOM and JavaScript made possible. It was possible in Netscape years before IE6. I did a production app with Ajax in 1999, using IE4, before the term Ajax had even been coined.
AJAX in early Gmail was a massive improvement. It was mostly hand-coded JavaScript in a relatively thin page, and it was the sweet spot. Today's version of Gmail is the most bloated pile of spaghetti framework code imaginable and is far less responsive and usable than the plain HTML version.
People want responsive interfaces, absolutely. Whether the interface is implemented as full page requests to servers or AJAX requests is irrelevant to users.
Bare AJAX itself almost always will perform better than a full page request, but as you layer on additional requirements, frameworks, libraries, etc. that isn't always true.
I think old reddit and new reddit are a great example of this - both are processing the same data and presenting a very similar UX. But at least for me, the relatively javascript light old reddit interface with full page reloads feels much more responsive and usable than the new site.
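For reference, the "bare AJAX" core really is tiny. Here is a minimal sketch in plain JavaScript; the function name and the injectable `fetchImpl` parameter are illustrative choices, made so the logic can be exercised outside a browser:

```javascript
// Fetch an HTML fragment from the server; the caller then swaps it into
// the page (e.g. element.innerHTML = html). Accepting fetchImpl as a
// parameter lets the network layer be stubbed out in tests.
async function loadFragment(url, fetchImpl = fetch) {
  const res = await fetchImpl(url);
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.text();
}
```

Everything beyond this (retries, caching, history handling, loading indicators, framework glue) is the layered-on weight where the performance win starts to erode.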
Ask yourself: do "people" prefer looking at "spinners" or more-or-less animated "page loading..." texts?
Waiting time is the issue, the technology is not.
A server-generated page that loads fast beats a framework-generated page that hangs every time!
As for technology:
With "classic" server-generated pages, "people" know what is going on (the browser indicates that the page is loading) and they know what to do (wait a few seconds at most). With frameworks, "people" are left out in the cold with no indication of what the problem really is and no apparent remedy or path to a solution, as a page refresh might interfere with state logic or whatnot, bringing totally undesired results.
A lot of dev managers like that they can turn one team into two teams by splitting everyone into front and back end. This makes hiring easier. I cannot emphasize this enough. Easier hiring is a huge deal.
As a secondary effect it also allows for more kingdom expansion. It's much easier to have two teams of five than one team of ten.
That being said, I'd rather manage a team of five good full stack engineers than ten average front/back end engineers. The communication cost of trying to get features out the door with two teams of five is very high.
IMO, I think this is a valid question (though asked a bit crudely). To generalize, it's because someone who has done JavaScript programming can point to NodeJS and say, "I'm full stack." Which, while technically true, skates over the reality of NodeJS as a less than ideal backend (see Ryan Dahl and the dawn of Deno). Second, it's more lucrative to bill oneself as "full stack" even if in reality a person isn't. At one point in my career I would have considered myself "full stack". Over time I realized that a "full stack" engineer is a jack-of-all-trades type, which in theory, can be valuable in the right circumstances as an individual doing work alongside non-full-stack engineers. Asking a room full of "full stack" engineers to design and build a product of any complexity above CRUD will naturally lead to a self-sorting of UI/UX and server side engineers.
Exactly. Full stack engineers are sorely needed in every project because they can see the forest for the trees, not because they are experts in everything from botany to carpentry.
As somebody who started in the days of cgi-bin and SSIs, I've lost track of how many wheels I've seen re-invented over the past 25 years. Fortunately I do backend which is at least slightly more stable (modulo the fact that I'm sending more or less opaque-to-me javascript rather than html).
OTOH I have really surprised some PMs by producing a Perl cgi wireframe site during the course of the meeting where we designed the wireframe so we could actually try it. It's a skill more devs should really have in their back pocket.
I've tried to use htmx exactly two times. Both times it ran directly into a brick wall, because it's so limited and has no escape hatch where you can put your own logic. Their solution to this is a half-baked new language, which isn't ready and seems very much unclear. All they needed was proper hook points.
for client side scripting, I think you are talking about hyperscript, which is definitely more speculative than htmx, but I wouldn't call it half baked: a lot of people are using it successfully in production
a less-esoteric alternative would be to use alpine.js, which is similarly embedded but uses plain ol' javascript (and offers reactivity as well, hyperscript is more of a pure event-driven language)
I had this same experience. I thought it sounded like an interesting idea, but in practice I didn't think it lived up to some of the hype it often gets on HN.
I think htmx can do the same with its events, just not built-in ones. In fact you may even skip that one today, because custom elements have been widely supported since 2020.
More like Hotwire, but instead of being extracted from Basecamp's work it feels like it could have been extracted from makandra's work maintaining many different Rails apps.
I think it's very good and for me, the best of the bunch. I'm moving a very large app to it (from old school RJS) and it seems to have thought of everything that I need.
I'd say 95% of the apps we build are now based on Unpoly, the rest on React.
We now believe SPAs are not a good default for the type of apps we're building. We're still reaching for SPAs when requirements demand high-frequency user input and optimistic rendering, e.g. for a chat or online game.
My problem with LiveView is that it requires an active internet connection and has no offline support, which to me is a step in the wrong direction in terms of UI performance.
It makes sense for a subset of applications that require connectivity but many apps or tools should be able to work offline or without a constant connection.
Livewire is intended to be used where you’d otherwise need to do an API call anyway. It is not intended to handle every single click and toggle and key press. You still do that on the front end, usually with Alpine.
It is a way to avoid writing backend APIs.
If you're sending every click or every key press, then it's not the tool's fault. I agree, though, that this should be better explained in their docs.
A web application requires connection to the web, which only exists with network connectivity. What you're trying to do is create a desktop application with tools built for the web. Therein lies the problem.
This is not true. Applications only require network connectivity for the initial resources to be downloaded. After that they can be cached, and service workers can be used to provide offline support. Plenty of applications can run only in the browser and don't need a constant connection to a back-end. Think of a calculator app, where all of the application logic exists on the client in JavaScript; there a network connection is an unneeded dependency. But once you build it with a framework like LiveView, it cannot work without a constant connection.
To me it looks like the opposite. It's more batteries-included, easier to use, and has been around longer. I like htmx too, but to me it's only better on the marketing side.
The way it componentizes the different 'cards' or fragments with URL-style lego blocks seems to be the primary novel feature layered on top of htmx.
This is already the direction that Remix and Deno’s Fresh framework seem to be taking, at least spiritually. Isolated server-rendered-first approach and only bare minimum JS loaded for fragments/islands that need to be interactive beyond what can be loaded on pageload.
The HTML attributes vs. JSX-style templates question is really a matter of taste IMO. Although I haven't used htmx, I get the feeling it would be limiting for more complex cases and involve a lot of hackery, or pigeonhole complexity better served by straight-up TypeScript.
I'd try to optimize logic on the client before moving more logic back to the server. This is essentially what Svelte and SolidJS do - they clean up a lot of the existing overhead of SPA solutions. Of course, rendering server-sent HTML as in the HTMX model (or perhaps HTML generated in-client via WASM) instead of doing costly fine-grained DOM updates could then be added as a further optimization.
I had this going back around the late 2000s with jQuery and Spring Web Flow. It was absolutely glorious. Server-side declarative navigation (great for dev coordination), seamless transitions, less bandwidth consumed, super simple client-side rendering, no more issues with the PRG pattern and user interference, and I even think I put in event listeners to execute animations based on the returned fragment.
Never was allowed to put it into prod - "didn't need it". Oh well.
Like others have mentioned, this seems to be from ~2016. The lack of HTTPS on the provided link ages this some for me, but the use of CoffeeScript really dates this[1]. I even thought CoffeeScript had been deprecated, but it does seem that the project is being kept alive[2], which is really cool.
Perhaps what is most interesting is that it took nearly 4-5 years for the front-end community to collectively come to the conclusion that SPAs are not _always_ the answer. I don't think the zeal for SPAs came from a bad place either. I can remember how poorly ASP.NET and other frameworks of the 2008-2012 era packaged an overcomplicated way to pass data to view layers. There's lots of curmudgeoning from non-frontend folks but, in my opinion, the lack of performance and ergonomics in existing frameworks, combined with the newness of Node.js, is what brought about the explosion of tooling and frameworks.
There is a place for SPAs, though. VS Code, Spotify, and other apps that need a desktop / browser experience to feel like a mobile app are great candidates. Twitter, for example, shouldn't be a SPA or SPA-like application. I find that it frequently over-caches content and will randomly refresh my feed at times while I'm browsing. It feels as if a simple web page that needs to deliver more JSON responses as I scroll is trying to do too much.
It's interesting that they appear to plan to use with Rails as a back-end (they mention Rails bindings), as Rails 7 release corresponded with a solution which appears somewhat similar to me, turbo/stimulus. (I want to provide a link to it, but I honestly don't know any great docs for it!)
But I haven't used stimulus/turbo myself. I'd be interested in a compare/contrast between Rails' Stimulus/turbo and "Unpoly" covered here.
There also seem to be a number of other non-Rails-related offerings in this space too. They seem to be really multiplying, which I think shows the exhaustion with JS front ends and desire for things in this space (I am not sure quite what to call it). But I wonder if we can converge on one or two instead of creating more and more. One of the benefits of a popular layer is that you can start building share-able re-usable solutions which compose with it -- you can find lots of solutions for certain things for React, like you used to be able to for JQuery, but splitting between stimulus/unpoly/htmx/etc...
I don't consider a demo app with no docs (that presumably uses stimulus and turbo?) to be anything to close resembling good overview docs on what stimulus and turbo are, but thanks.
Why can't anybody talk about the fact that HTML was not designed for being dynamic?
Isn't there any format that is a real alternative to HTML: truly interactive, lighter than the DOM, using text as a medium, multi-purpose and platform-agnostic, able to use whatever scripting language, and easily rendered?
It's not that I want yet another format for its own sake; it's that a new format is needed to make things simpler.
I have zero patience when it comes to learning Angular and its framework model thing. I don't want frameworks, I want formats and protocols.
I don't even know Gopher, but something new and fresh and different is really needed. Maybe as long as it's used in a controlled environment for internal/pro software.
HTML wasn't designed to be dynamic but it certainly evolved to be, and personally I don't think it's that bad per se. But modern frameworks such as Angular and React have added so many unnecessary layers of complexity that most developers have to reason about and work with several layers of abstraction just to make simple things work. There isn't a real alternative because all the popular modern browsers only fundamentally understand HTML, and have been evolving and refining this over the past three decades or so. What would be nice is to have a new markup that is natively rendered and handled by the browser and that behaves similarly to the document/page model everyone is used to, but with built-in persistence, session handling, and dynamic rendering with a very simple and intuitive API. Obviously we have all of these already in one form or another, but it's all just hacked-together HTML/JS/CSS under the hood. Edit: minor corrections because mobile devices suck
Whenever I build something on the web, I can't help but feel like I'm dealing with a word processor. A very advanced one, but nevertheless still a word processor at its heart. Coming from native apps, I despise the idea of text just being out there without a TextView or something.
That’s exactly what it feels like. A word processor distorted into an application development platform. With nearly 30 years of technical debt as well.
This is becoming untenable. The web has only gotten more and more difficult for novice users and developers alike to publish onto. That spells doom for the web in the long run.
I know exactly what you’re talking about from when I transitioned to web dev, but I feel that much less these days with TypeScript (done well) and VS Code. It’s actually a pretty decent experience even if it does feel like a bit of a facade over the top. Once you’ve got all your setup and dependencies sorted of course…
Same here, that is why when coding Web apps, I gladly take the back seat, and although I tend to rant about WebAssembly, I am grateful for it bringing Flash like development back.
Have you tried htmx? Maybe HTML wasn't designed to be dynamic, but IMO htmx does a great job extending it to provide interactivity in a way that feels natural and simple.
This is on point. The frameworks will always remain crippled due to the underlying platform. Of all the things in the Web Landscape, WebAssembly seemed like a practical move towards what you are speaking about. You can compile Rust/Golang/C++ into a "Web Binary", which in turn can be shipped realtime to clients. However, there's a long way to go, before WebAssembly works as a smooth container environment with better APIs for enhancing the user/dev experience for dynamic apps.
No markup format will support interactivity well, you need a full scripting language with all of the standard features.
I don’t think it’s possible to couple the interactivity and visual layout in a single language and have it make sense. We can certainly do better than JavaScript, HTML and CSS but I think there will still need to be at least two languages to describe layout and interactivity.
Nobody cares that you can build up a DOM differently, that's decades old. What matters is: how do changes behave? Your example shows nothing of the complications of lifecycles or state-render loops.
The example above is brief due to forum limitations, so yes, it doesn’t include lifecycles or the whole implementation of a rendering loop. But the context of that comment was using one or more languages for building interactive hierarchies of widgets, and not the topic you brought up, so it didn’t even have to.
Yes and no. That you can build a hierarchy with Element(attrs, children) where attrs may include callbacks is really decades old. Plain JS does it, react and angular do, and also a million template languages.
The question is, how can we not end up in a spaghetti ball? Your scripting style alone brings nothing to the table.
I would look up to something like Elm, which does answer OP's question.
This is so overdue. I'm ecstatic to see this finally happening. The frontend framework bloat trend honestly went on ten years too long. It was like the thin client and thick client debate turned into the thick client and thicker client debate.
People will roll a gigantic create-react-app mess for the tiniest of frontend projects. Yes, it's an instant codebase! But that's all code you have to maintain. Stuff like hotwire and htmx can't come fast enough.
This isn't the only one... the JS frontend world as a whole has been moving back to SSR for the last 5 years, but importantly while still maintaining the benefits of desktop-style interactivity: instant page transitions, componentized code organization, tree-shaking, lots of small JS/CSS files only loading what's needed vs. one massive asset, cached/offline-friendly JSON data streams, etc. etc.
It is still SSR, but it's much, much more than just going back to 2005. It's combining the lessons of the last 15 years with as much of the past SSR world as possible. It's not simply throwing it away and regressing to the old ways.
Yeah it's gonna be somewhere. I guess it's really about the sweet spot that minimizes the complexity on both sides. This captured it: https://imgur.com/a/mx7Y0uD
That might sound convincing to someone who already prefers working on the backend, but you can basically invert your statement to describe a typical serverless setup: “You need a frontend anyways. Shifting to a client-side codebase is eliminating duplication, and arguably browsers have better DX so win win.”
You still have to connect with data somewhere, which often means running an API server connecting to a database. There are certainly options to outsource this to managed services, but "serverless" and distributed systems can often be more complexity, unnecessarily, without any corresponding productivity or functionality gain.
There's a clear trend back to the server to some degree, so on the margin what I'm saying seems to have some basis in many people's experience: the sweet spot for many apps does not require the added complexity of SPAs, the lack of expressiveness of writing so much app code in JavaScript, building distributed systems, or having to use Firebase-style non-relational database services. Moving much of this to the server reduces some of this complexity, often with productivity gains.
Additionally, you as the developer have full control over the backend environment which isn't true of the front end.
That reason alone is why I prefer having the server doing the bulk of the heavy lifting. It reduces variables in the most functional parts of the app and reduces my need to try to herd users into using particular browsers.
Probably worth mentioning that the submitted link is just the presentation by the author of Unpoly, with some history and reasoning, but the better explanation of Unpoly itself is on their website:
I think it was about breaking up with SPA frameworks/libraries. The whole slideshow is about Unpoly, a JS library which provides an SPA-like experience without the complexity of a full-blown SPA.
The title is poorly chosen at best and misleading at worst. They’re not abandoning JavaScript, they’re just abandoning client side rendering (angular, etc)
I think the intention was to spark conversation loosely around SPAs vs SSR. It seems we're at yet another turning point and frankly, I'm enjoying the debate.
I don't understand why so many people are dismissing this because the post is from 6 years ago. The project is still actively maintained, which, after the 3 lifetimes that 6 years represents in development timelines, seems to be a point in its favor.
This is not a website. It's a presentation made with the reveal.js framework, built as slides for a talk. It is very much a feature that people don't accidentally back out of their presentation while they are giving it.
There’s a browser API for preventing people from leaving pages if they are in a context like that. And unlike this method, it works for the first page, works if you try to close the tab, and works if you try to navigate to a different link. And it doesn’t become entangled with browser navigation.
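The API in question is the standard `beforeunload` event. A rough sketch of using it, with the window object taken as a parameter so the logic can be exercised outside a browser (`installLeaveGuard` is an illustrative name, not a built-in):

```javascript
// Prompt the user before they leave the page, close the tab, or navigate
// away. Modern browsers show a generic confirmation dialog and ignore
// custom message strings.
function installLeaveGuard(win) {
  const handler = (e) => {
    e.preventDefault();
    e.returnValue = ''; // legacy requirement for the prompt to appear in some browsers
  };
  win.addEventListener('beforeunload', handler);
  return handler; // keep a reference so the guard can be removed later
}
```

Unlike pushing entries into the history stack, this approach leaves the back button alone.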
Well, my point is that it's a bit unfair to blame the authors of the presentation for this functionality. Reveal.js is pretty much the standard open-source JavaScript presentation framework; it's used and developed by slides.com.
I don't know why it works the way it does. But mind you, this is a 6-year-old presentation using reveal.js from 2015.
The back button has evolved alongside SPAs, and users mostly don't expect or want their back button to take them back through the hundreds of small state changes they've caused by interacting naturally with a UI.
Yep. Browsed the slides and then couldn't go back to HN using the back button. Is there a max history depth setting per ___domain in Chrome? I haven't run into this issue in a while and forget why it happens.
Is it really worse now though? As a full-stack developer since 2013 (Django on backend since the start, React on frontend since 2015), I think it's finally getting better in the past couple of years, and I think it was worst between 2016-2019...
After years of not knowing what to use because the libraries I used 2 years ago are no longer maintained or their APIs changed 3 times since then, there's now Next.js, which seems like a well-supported, batteries-included, opinionated framework with good documentation...
with vite and esbuild on the rise, the days of fiddling with webpack and other complicated build configurations may soon be behind us...,
typescript vs. flow seems to have ended with typescript being the clear winner and having great support in most libraries, frameworks and IDEs... (although I'm a bit scared that the JS native type annotations proposal may again fragment the typing world here...)
browser-side APIs are no longer evolving so rapidly, IE11 & EdgeHTML are dead and there aren't that many features/bugs specific to Firefox/Chrome/Safari anymore...,
Does Nextjs make you use Django less? I think my basic process going forward is going to be Nextjs + something like Supabase as default and only add a more complex backend as needed.
At work we don't have enough manpower*time to convert our React spaghetti to Next.js, but I'm starting to experiment with Next.js & Prisma & Postgres on the side and plan to use it for my next hobby project, which I'm never gonna finish just like the rest of them... If I like the approach and don't miss some of Django's features right from the start, I'm gonna think about leaving Django in the past...
Supabase could be enough for a simpler project where you know you won't need any advanced features... I wouldn't start with it if I knew the project was gonna get huge in the future, but I did a couple of smaller projects with Firebase alone.
Is it really? I have the exact opposite opinion. I mean, I feel like the industry has pretty well standardized on React in Typescript for the front-end on web apps. Sure, there are other technologies that do different things (e.g. Svelte, and someone else mentioned Phoenix LiveView), but for the standard "I'm building a CRUD-focused web app", there are simple choices to make and it's easy to "do the right thing". I contrast this with 2016, when things were still in flux so it was much easier to make what turned out, in hindsight, to be the "wrong" choice (e.g. Angular or Flow). Plus, tooling support is much better now.
I'd still argue that React is very much the default for new applications where legacy interop or existing team familiarity isn't a factor.
I mean, this is obviously an extreme example, but COBOL is also still used extensively in the enterprise, yet it's still not really used for any new projects.
I would say things are improving. Through WASM we are now getting more languages running in the browser. They are starting to bring alternative frameworks and ways of doing things to the table as well. Instead of this frontend / backend notion, we should start thinking again about the notion of networked applications; like we used to do before the web became a thing.
Inevitably you want some computation to be close to the source data (for efficiency) and some other computation to be close to the point of interaction (for responsiveness). It's not an either / or proposition. You can do both. And phones and browsers can do a lot locally these days. So there's no need to pretend that it is still 1999 in terms of browser capabilities. They can do so much more now.
Instead of AJAX, we now have companies like Tailscale doing all sorts of funky networking stuff in a browser. Likewise, people are running entire 3D games, photo and video editing tools, or design tools like figma, etc. in a browser. All enabled by WASM. Most of that stuff does not involve a whole lot of css, javascript, or html. That stuff is increasingly optional. Browser application development and desktop application development are finally merging after being considered completely separate things for more than 2 decades. It's all just application development. It may or may not involve talking to servers via a network. You don't have to limit yourself to HTTP when doing that.
At this point web development has become so complex and fast-changing that by the time you're done explaining how everything works the whole world has moved on to the next generation of complexity. We're almost at the point where web development is becoming "unknowable"; like knowing the position of an electron... You can know where it was at some point (maybe) but really the best you can do is estimate its rough position in the cloud.
Yeah, perhaps if only there were a structured markup language that could be used to provide this information in an accessible manner with separation of concerns between content and display.
I still don't consider myself a veteran, but I saw the web evolve from 2010 up till now. And IMHO we took a wrong direction. Web apps and pages have become so bloated and complex it's crazy.
I'm just starting to recover from my previous work. I had to maintain, migrate, and add features to a legacy system (built in 2017) which initially had a GraphQL API and a SPA, but was later split into 12 microservices (with cyclic dependencies, because hey, why not) and 3 big React applications, all of that in JavaScript + TypeScript (added later, with all the gradual typing stuff that makes you think your code is correct).
All of that to serve 300 monthly users between 35 and 50 years old, who just cared about filling in forms now and then and having some charts over their data. Zero rocket science.
So far I remember history going like this:
- 2010 Clean SSR with rock-solid boring technologies and some vanilla JS where it made sense. I remember page loads were crazy fast and you did not feel the need for an SPA.
- 2012 NodeJS arrives. It came to save us from slow blocking IO apparently (??)
- 2016 React arrives. It came to save us from boring old SSR apps
- 2017 All React + SPA frameworks (in house often)
- 2017+ Someone has the brilliant idea to do SSR with React. "It's a revolution"
- 2017 GraphQL arrives. You are saved.
- 2020 Nextjs arrives. You can now do static generation for SEO and SSR and apis ! That's great. Thank you Next !
- 2022 Someone now wants to do Server Components that will make your pages load insanely fast (which we already did in 2010 when we carefully loaded vanilla JS at the time)
I learned a bit of ASP.net core recently, and I think that boring razor pages are still relevant.
I remember my first lead in 2010 telling me that ORMs and SPAs were completely overrated. I didn't agree with him at the time. Now, 12 years later, I think he might have been correct.
I'm not saying SPAs don't have their place. Some apps can't do without one. I'm saying that we are victims of "hype-driven development", which involves having some trendy hashtags in your resume and convincing your manager that you have seen an incredible new tech that will make the business incredibly wealthy.
Except it does not, and you end up with a legacy system that nobody wants to maintain because it's too complex.
Maybe, like Jonathan Blow stated in one of his videos, we confuse moving forward with progress.
Now all the new kids learn React in some boot camp or online but have no idea how to implement a map function. And I feel old and grumpy. And tired by all this.
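For what it's worth, the map function in question is only a few lines of plain JavaScript: walk the array, apply the callback, collect the results.

```javascript
// A from-scratch map: no Array.prototype.map involved.
function map(arr, fn) {
  const out = [];
  for (let i = 0; i < arr.length; i++) {
    out.push(fn(arr[i], i)); // pass the index too, like the built-in does
  }
  return out;
}
```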
> ASP.net core recently, and I think that boring razor pages
Came to a similar conclusion lately. It's a superior, cleaner alternative to traditional MVC controllers. Currently building a side project with Razor Pages + HTMX.
What most people do not get is that modern frameworks have long allowed for a mixed approach. One can easily render templates server-side in a non-JS backend framework and merely use some Vue.js or whatever for a really interactive component on one of those rendered templates. You can even have a noscript block. It will be easy-to-understand code, and it will work without people having to trust JS code, except for those few interactive parts. No harm done. Instead people go all-in and think it will be better to have everything in JS as an SPA.
This unnecessary SPA approach eats up so much time; it is not rare for things to take weeks instead of days compared to simply rendering templates server-side in a better language than JS.
It should be a simple thing to understand: use the right tool for the job, not JS for everything just because of hype. But I guess part of the issue is the mentality of some frontend people to only learn JS, which makes them incapable of going for the mixed approach.
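As a concrete illustration of that mixed approach, server-side rendering can be as simple as a function from data to an HTML string. It's sketched here in JavaScript for familiarity, though the whole point is that any backend language works; the function names and markup are illustrative:

```javascript
// Escape untrusted values so user data can't inject markup.
function escapeHtml(value) {
  return String(value).replace(/[&<>"]/g, (c) =>
    ({ '&': '&amp;', '<': '&lt;', '>': '&gt;', '"': '&quot;' }[c]));
}

// Render one server-side template; the browser just receives finished HTML.
function renderUserCard(user) {
  return [
    '<div class="user-card">',
    `  <h2>${escapeHtml(user.name)}</h2>`,
    `  <p>${escapeHtml(user.bio)}</p>`,
    '</div>',
  ].join('\n');
}
```

A sprinkle of Vue or Alpine can then be mounted on the one component of the rendered page that actually needs interactivity.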
I got very excited at the premise and the first few slides, thinking someone was going to talk about what a secret weapon vanilla JS is for productivity...only to find out they're advocating use of a front end library much larger than react or Vue.
At this point I'm getting a bit tired of backend people claiming that the cure to the frontend mess is a big library that pretends to be html.
Maybe it's just me, but needing to press "back" twenty or so times to get back to HN after viewing some of this presentation really made me question the author's authority when it comes to "how the web should work".
Unfortunately this particular delivery used a typical slide presentation, which detracts from a chunk of its message; and unfortunately, polluting history is a pretty typical problem with these slides-as-webpage affairs. I really wouldn't conflate the medium (which was typical of the web well before the complexity timelined here) with the message.
From my experience, the horrors of javascript come from letting the client handle too much stuff.
Frontends tend to run a ton of logic that should be handed off to the proper tool for the job: SQL.
The time and complexity you save on the backend is simply not put to good use these days.
I like to program SPAs like a reactive thin client and let the backend be more SQL or ORM heavy.
That is true. I have seen a big aversion to writing JPQL or native SQL whenever an ORM like Hibernate or JPA is involved. It seems like people home in on a single technology and try to make it fit into things where another tool can do a better job in fewer lines of code.
I once worked with a team where the devs didn't know what Hibernate generates inside the database; it was shocking how much trust is placed in these layers of abstraction.
The concept of unique and foreign key constraints, sequences or indexes was unknown.
It may be cleaner to you and written in the same language you are familiar with, but some problems simply aren't that easy to solve.
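To make the point concrete, here is a toy sketch (the data shape is invented for the example): per-customer totals computed by looping over ORM-loaded rows in application code, versus the single statement the database could run itself.

```javascript
// Illustration only: totals per customer computed in application code
// over rows an ORM has already loaded into memory.
function totalsPerCustomer(orders) {
  const totals = {};
  for (const o of orders) {
    totals[o.customerId] = (totals[o.customerId] || 0) + o.amount;
  }
  return totals;
}

// The database can express the same intent in one statement, without
// ever shipping the rows to the application:
//   SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id;
```

The in-memory version also silently loses the pushdown benefits: indexes, constraints, and the optimizer never get a chance to help.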
Looks very similar to what NextJS is doing, especially with their app directory API [0], ie have everything on the server with React Server Components but then stream in client-side JS as necessary. It's basically the "sweet spot" that TFA talks about.
Implementing these libraries/frameworks is just a small piece of the application. You implement strict development policies and standards that these frameworks wrap around and that fit within the architecture. Too many devs rely on the framework to be the architecture.
A SPA is a piece of an overall architecture and these frameworks are still just a piece of the overall architecture.
Almost all of these frameworks can/will assist in strict standardization. Bootstrap is a good example: BS with a well-designed style guide is a blessing; without one it's the wild west. Angular, React, and Vue are all very useful (back button included) if thought out and used as part of an architecture.
In the end, they get you much further along, with support, than a team could get with vanilla.
I'm always interested in ways of building partial page fetches, dropdown menus etc in a way that is thoroughly tested to work well with common screenreaders.
I'd love to see a frontend framework that includes detailed documentation (and ideally video demos) demonstrating how effective their ARIA screenreader stuff is.
Unpoly takes special care to always move the focus to the next relevant element in an interaction. E.g. when a link updates a fragment, the focus is moved to that fragment. Or when an overlay is closed, focus is returned to the link that originally opened that overlay.
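The pattern described above can be sketched in a few lines (this is my illustration of the idea, not Unpoly's actual source; the function name is made up): after swapping a fragment, make it programmatically focusable and move focus onto it so keyboard and screenreader users land on the updated content.

```javascript
// Sketch of the focus-management pattern: non-interactive elements are
// not focusable by default, so tabindex="-1" enables script-driven
// focus without adding the element to the tab order.
function focusFragment(fragment) {
  if (!fragment.hasAttribute('tabindex')) {
    fragment.setAttribute('tabindex', '-1');
  }
  fragment.focus();
}
```

A full implementation would also restore focus to the opener when an overlay closes, as the comment above describes.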
Why would the story be different than SPAs? Screenreaders read the final HTML in the browser. It shouldn't matter if it comes from a server-rendered template or a JavaScript template rendered on the client.
Seems like it depends on how you code the html/templates not how they are rendered?
SPAs very often have terrible accessibility. I think it's important for frameworks that help people build interactive applications (SPA or otherwise) to provide the best possible defaults, plus guidance as to how to use them to build accessible interfaces.
> Looking for developers to maintain 150K lines of Angular 1.5 and 7000 abandoned npm packages.
I completely get that reason. Any front-end framework app is on life support within a few years. It feels like the equivalent of buying a cheap refrigerator that will break in 4 years, where it is easier to get a new one than to try to fix the old one.
Nice, though it would be even better to eliminate the need to write any JS. I disagree with page 17; I think server-side apps could handle those things fine if they were made to support them.
I've been working on a Ruby framework with all the fancy stuff (reactive VDOM, hot reloading, scoped css) but it's 100% server side. Imagine React, but with Ruby and Haml instead of JS, and all logic runs on the server.
The GitHub link can be found in my comment history. It's not ready to be used for anything serious, but I think it's an interesting approach and maybe there is someone who would like to play with it.
This is where Angular shines vs React et al. One "batteries included" library that does everything, with no external dependencies unless you go out of your way to add them.
I have written (and still do write) SPAs that only use Angular and nothing else. I don't even have NPM installed.
While I don't know that I'd go that route to build my own thing when Rails 7 already has turbo_streams and Stimulus, I understand wholeheartedly why they'd move away from SPA-for-everything. I can't help but feel that the whole industry is under the spell of React and other single-page application frameworks, using them even where other approaches would have been faster to build, cheaper to maintain, and less complex.
I don't get page transitions being handled server side. They're basically just lying about that, right? Maybe the server-side framework baked the page-transition JS into its response, but it is still client-side code.
I was just building a very light version of this today. I'm just a backend developer sick of the bloated crap I have to maintain in my day job with React. I'm going old school, and going back to server side templates. Native compiled jinja2 templates using askama in Rust, tailwindcss, Postgres, and a little JavaScript magic to do things like server side form validation without refreshing the entire page.
I'm loving it so far. The pages load instantly, they're extremely lightweight, and all the state is on the server where it's easy to debug and all strongly type checked with Rust. I called my little "framework" curmudgeon, because, well I'm under no illusions about what I'm doing. Get off my lawn with those SPA JavaScript frameworks.
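That "little JavaScript magic" for server-side form validation without a full refresh can be tiny. Here's my guess at the pattern (not the actual curmudgeon code; the fetchFn injection is just to make the sketch testable): POST the fields, let the server re-render the form with any validation errors, and swap only that markup back in.

```javascript
// Submit a form's fields to its action URL and replace the form's
// markup with whatever HTML the server renders back (which may
// include validation errors). The rest of the page keeps its state.
async function submitForm(form, fields, { fetchFn = fetch } = {}) {
  const response = await fetchFn(form.action, {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams(fields).toString(),
  });
  form.outerHTML = await response.text();
}
```

All validation logic and state stay on the server; the client only knows how to swap markup.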
It’s wonderful to build this for yourself. However, when working on larger projects or with larger teams, they’d have to learn all the random things you came up with, in your own way of thinking, and so on. Sometimes that doesn’t go well, though sometimes it does work out. But if you leave the project, your favorite creation could become a newcomer’s nightmare, which is one reason we need to standardize quite a bit. If someone had to maintain your code, would they feel the same way you do about the code you hate maintaining now? If the answer is a resounding no, then you most likely are on the right path.
It's literally a hundred lines of JavaScript; they'd take more time to learn the changes in a point release of React. I think it's a non-issue.
But the bigger thing is it's just for me, because I'm the only one working on it right now. If the project is successful and I hire people one day, they'll put up with it because I'm the boss :) Meanwhile I'll enjoy working on frontend again, something that hasn't happened in years.
Micro dependencies and dependency hell are two of the Javascript ecosystem's most frequently mentioned issues. Anyone aware of any initiatives to lessen dependencies in important projects? For instance, the typical number of dependencies in a JS project would probably decrease significantly if large libraries and frameworks like webpack, babel, react, etc. all stopped using dependencies like left-pad or is-boolean (and also a bit more sophisticated ones). It would probably be sufficient if the top 1% of packages tried to remove as many dependencies as they could, because those are the dependencies that 90% of developers use. Additionally, large packages likely have the resources to implement helper libraries on their own rather than relying on them.
I find the complexity graphs to be a little misleading. If the client side complexity grows, but server side complexity doesn't decrease at all, then something has gone wrong.
If you have a fully client-rendered SPA (not saying an SPA is the only thing you should ever build, but many of the complaints seem to be aimed at SPAs), then why do you still need Models, Controllers, and Views on the server? Or Routing, for that matter. Many modern-ish SPAs would handle all the routing client side. So your backend complexity might look more like:
Asset Packing, API, Authorization, Dependencies
So you've moved some of the complexity from the server to the client. You may or may not like that, and there may still be a net increase in overall complexity, but it's not necessarily the same as suddenly doubling the complexity.
Also I don't know what Authorization is doing in both graphs. Or why Virtual DOM is there - yes you might be using a Virtual DOM, but it's presumably part of one of your dependencies, and not something you're interacting with directly or maintaining. Also what is a "Controller" in React? React (and similar frameworks) don't fit neatly into the MVC paradigm. I guess it's more like Components + Models? Although the Components aren't always cleanly separated from Models/business logic, which can be a strength and a weakness.
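For concreteness, the routing that moves to the client is essentially a path-to-view map. This is a toy sketch, not any particular framework's API, and the routes are made up:

```javascript
// Minimal client-side router: map paths to view functions, with a
// wildcard fallback for unknown paths.
function createRouter(routes) {
  return function navigate(path) {
    const view = routes[path] || routes['*'];
    return view(path);
  };
}

// In a browser you would wire this to link clicks and popstate, e.g.:
//   history.pushState({}, '', path); render(navigate(path));
```

The History API wiring is what makes the back button keep working, which is exactly the part many SPAs get wrong.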
This dance, shuffling code between backend and frontend, has been happening for a long time. It's not the first and not the last time someone claims to have found a solution (not even the best, just a solution for certain scenarios). That is fine until the client comes up with a new requirement that completely demolishes your assumptions, or some new awesome JS API appears, and then we're back to SPAs. Oops, who could have known, right? A couple years later, browsers adapt and a bunch of people, again, come up with a revolutionary idea: if we squint a little we can just about squeeze that code onto the backend and save a bunch of work. Until new big products come out that change what users consider state of the art and...
It's a never ending cycle.
SPAs are a pain in the ass, like all distributed systems are, but they're also the most flexible and you don't need to reinvent half your framework or use dirty hacks if you have to do something slightly out of the beaten path. They're not the problem.
The real friction comes not from where you put the code but from the fact that you have to use different languages with different ecosystems. I'd claim most of the arguments against SPAs would suddenly disappear if one could write most of the code in whatever, compile to WASM, and slap a generic JS bridge on top.
This reminds me a lot of frames [1]. I wonder if the creators are aware that at one time html did have a way to update different parts of a page independently, but due to various limitations, it fell out of favor and is now deprecated.
Over the last 10 years we have been growing our JS + PHP + MySQL framework and unifying the code across all concepts.
We started before Promises were standard, before composer and NPM. Before Virtual DOM. Back when we started, there was CodeIgniter and Kohana and jQuery (anyone remember?)
Along the way, we kept the vast majority of our code in-house. We didn’t want to pull in libraries or updates unless we understood their code.
Looking at the latest and greatest, we often wondered if we did the right thing. Some developers (especially JS devs) used to chastise us for not using the latest techniques, like the "two-way bindings" of Angular 1.0 or the JSX of React.
Then - lo and behold - Angular 1.0 is totally rewritten and all those concepts are redone. Over and over. And people come around to our way of thinking. Web Components and Templates appear. Shadow DOM.
Our way of thinking is: use the standards for HTTP/REST, HTML, JS, and CSS as they were intended. Every concept should work well with all the others.
Result is at https://github.com/Qbix/Platform if you want to give any feedback. We have been using it for ALL our projects since 2011.
I’ve been saying this for years, and receiving the subsequent down votes.
Front-end frameworks are an anti-pattern. They violate DRY principles and exemplify premature optimization.
They did not grow organically out of the open source community because they solve a problem only a handful of companies on the planet have. They had to be pushed into the dev mind share by the mega cap tech companies. Unfortunately, they’ve been successful over the years.
The key question for me is not whether there is too much complexity, but whether or not this complexity is something that adds value for the actual user. Too many things do not actually do that. I use too many pages where I can tell it's some kind of SPA simply because the user experience is buggy, clunky, and shitty. Why that is is a really big topic and there could be many reasons, but it's not really discussed.
I think innovation needs to happen in package management. Has it happened yet? Because I don't know for sure, so I am asking because I might have overlooked it.
Even projects from a few months ago can become unusable unless you are willing to sit down and spend a couple of hours fixing the problem. And don't get me started on packages that have been abandoned altogether but are the glue holding functional software together.
Innovation isn’t going to prevent people from abandoning projects.
You have to make smart decisions when adding dependencies, based on how mature and supported each one is. This often means sticking with the big names even if the little project has more hype.
The core thesis is to turn the browser into a RIP terminal ( https://en.wikipedia.org/wiki/Remote_Imaging_Protocol ) to some degree where the server is 100% in control like a BBS/MUD. As an example app, the main IDE uses it, and it's great how lean everything is.
Clients are getting more interactive over time. People want more interactivity, not less. Interactivity increases with visual complexity, which in turn requires increasingly complex code. UIs are evolving in the direction of more realtime updates, more animations, more menus and options, etc., etc.
It's no coincidence the most popular frameworks are analogous to video game engines. Both listen to user input, apply the input to the state, and then render the results to the screen. It's also no surprise UIs are moving in the direction of becoming "gamified" (an actual term), adding elements that guide the user/player's attention. What this means is that we can expect UIs to increase in complexity until they match game engines, at least in how they model interactivity. I should add, there will always be static, simple pages, but web apps that are currently considered complicated will certainly keep evolving by providing a more immersive experience.
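The input → state → render cycle described above reduces to a very small sketch (all names here are illustrative, not any framework's API): events flow through a reducer, and the UI re-renders from state, much like a game engine's update/draw loop.

```javascript
// Tiny unidirectional store: dispatch an event, compute the next
// state via the reducer, and re-render from the new state.
function createStore(reducer, initialState, render) {
  let state = initialState;
  render(state); // initial draw
  return function dispatch(event) {
    state = reducer(state, event);
    render(state);
    return state;
  };
}
```

This is essentially the skeleton shared by React-with-reducers, Elm, and a game loop; the frameworks differ mainly in how cheaply `render` can be re-run.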
Nobody cares about "interactivity" or "immersive" experiences except the devs who make them. People want to get their shit done as quickly as possible using software that feels simple and works as they expect the first time.
A SPA might be the best way to achieve that. But often it isn't.
That's not what the trend shows. Apps have added more interactivity, like live chats, conference calls, and more editing options (i.e. editing videos when uploading to YouTube), and this is probably going to keep going, adding things like voice commands, customizable layouts, and more realtime information displayed at any given time. The reason it converges with video games is that both are clients that process lots of user input and then display the result to the screen.
edit: i should add I'm not saying it's ALL going this direction, but certainly the big apps with lots of users.
There is a jump from "I see this happening" to "this is what users want." For example, the trend is also toward massive tracking, more ads, and heavier and heavier pages. Users don't want these things, but business forces create them.
I think this is probably true of many (not all) of the kinds of things you are calling "interactivity". In any case, the reasoning isn't sound, even if you think your conclusion is.
I originally tried using htmx a couple of times because I was a stubborn backend engineer who needed/wanted to build some frontends. The idea of partially swapping a subset of the DOM made a lot of sense, especially as a way to make the old-school full-page-reload style more modern.
However, I migrated to react-router 6.4, and it has a very similar concept of embracing web technologies: partials can be defined as an outlet of another page, where only the data and DOM for the partial are loaded when its sub-tree is navigated to. They say they took this idea from emberjs.
I felt the same way as the authors of unpoly and many in this thread looking at most UI frameworks, until I stumbled on react-router 6.4, and it very much enabled me, a primarily backend engineer, to be productive in the frontend. I highly recommend it to people who are trying htmx but want something more IBM: "No one ever got fired for picking React."
Wow - lots of opinions on front-end dev. Not much of a surprise. It is a bloated mess. But "breaking up with JavaScript"? There's no way to build anything useful without JavaScript.
Build it without frameworks? Absolutely! Build with HTMX? Makes sense to me. It's a light-weight and isolated extension. It would be great if such partial load semantics was added to the web standard.
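For anyone curious what those partial load semantics amount to, here is a vanilla sketch (not htmx's actual implementation; the fetchFn injection is just to make the sketch testable): fetch a fragment of HTML and swap it into a target element instead of reloading the whole page.

```javascript
// Fetch an HTML fragment from the server and swap it into the target
// element, leaving the rest of the page untouched.
async function loadFragment(url, target, { fetchFn = fetch } = {}) {
  const response = await fetchFn(url);
  target.innerHTML = await response.text();
}
```

htmx packages this same idea declaratively (e.g. attributes on the triggering element), which is what makes it feel like a natural extension of HTML.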
Isn't the core question whether the web client is stateless or not?
My front ends have been very stateful for a long time. But I build desktop-equivalent web apps. I jumped on the AJAX track in 2000 and never looked back. My state has been represented in XML ever since. I sync with the server via "updategram" semantics (borrowed from SQLXML) and transform state into HTML with XSLT in the browser. I use no frameworks or libraries. It has served me very well for 20 years and likely will until retirement.
Much of this and more was achieved with NextJS. Yes, javascript, but you have absolute control of what you want to render server- or client-side. You don't have to write APIs. And you still have the massive Node or React catalog at your fingertips. I don't think there is a better realisation currently for what they aimed for in this presentation.
Web front ends have been expanding, per Parkinson's Law, to fill the bandwidth allotted.
If it's not an explosion in JS/CSS from frameworks, it's the content image/video size.
Upside is that a "fast" user experience is easily achieved by defaulting to first principle JS capabilities of the browser, and delivering compressed content from the CDN edge.
Since TFA is from 2016, it's pretty silly to discuss trends here, isn't it? There's certainly no less JS today, except insofar as there are fewer web sites in the first place.
What's with this htmx shilling anyway? If you want an app, just use JavaScript; what's the point of arbitrary syntactic barriers between page content and logic, except maybe for CSS done by UX experts? Though if you're bad at CSS, why tf are you into web frontends anyway?
If OTOH you think about content-heavy sites, document-oriented workflows, and publishing (ie. what the web was originally made for) then just use a competent markup processor. HTML is based on SGML after all, giving you templating, markdown processing, stylesheets, pipelines, toc and search indexing, and whatnot; there's a wealth of SGML/XML-based tools available.
Wow. So much complexity here that they seem to not realize that they've added. Feels like they're just replacing one problem with another, rather than solving anything.
As someone that pretty much has stayed with Java and .NET SSR frameworks, with snippets of dynamic content where it makes sense, this kind of back and forth is kind of ironic.
As a web developer, I can now distribute and "install" a sophisticated software application on the work or home computer of one of my users within less than a second just by sending them to a specific link, and that user can even use it offline, with a lightning-quick database saving all of their data for them. Pretty awesome if you ask me! None of that would be achievable without Javascript frontends.
whut? You can simply distribute a statically linked binary that does not require admin rights to run and stores its data in the local user's data folder. How big could a chat app be, for example... 10mb? 20mb? (without excessive resources) With minimal memory use? Use WTL and the plain platform API? Pretty doable IMHO. With a lot fewer dependencies and probably lightning-fast dev iteration cycles.
Sort of sounds like Apache Wicket (https://wicket.apache.org/). I used it for a few projects in the mid-late 2000s. I really liked it being server side and the concept of having object-oriented HTML (code paired with HTML snippets). I haven't had a need to use it since 2014, so haven't kept up with the project.
Depends on what you're doing. There is great value in SSR but how does one even make anything dynamic/interactive without JS? You can't, so you start adding pure non-spa tainted js. Then you add features until your amazing purist solution is an unworkable mess stuck in the mid 2010s.
This is just a problem of structure. I am not convinced that there is a particularly higher chance of ending up with an "unworkable mess" with SSR compared to SPAs (of course most projects end up being a mess).
Look at the hottest trend, the "islands architecture", and new frameworks like Astro (https://astro.build/) and Fresh (https://fresh.deno.dev/). They are basically what you are describing: all SSR but with dynamic parts. It seems these "islands" are by their nature less entangled than a full SPA.
Separating the frontend from the backend doesn't necessarily mean putting all of the frontend logic in the client. Your server can also have separate frontend (or "view") logic.
The one reluctant use I see for JS-based UI (does anyone have an alternative solution?) is to reuse an API. Right now we have two code paths: one where a user action renders the HTML server side and returns it, and another where an API call for similar work returns JSON. A JS UI page can consume the JSON and render the page, reusing the core API.
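The single-code-path idea above boils down to a small client-side renderer over the same JSON the API already returns. This is a sketch under assumed data: the payload shape and function name are invented for illustration.

```javascript
// Render the JSON payload the API returns into HTML on the client,
// so the server only ever produces JSON.
function renderUserList(users) {
  const items = users.map((u) => `<li>${u.name} (${u.email})</li>`);
  return `<ul>${items.join('')}</ul>`;
}
```

The trade-off is that templating logic now lives in two places unless the server-rendered path is retired entirely.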
Soon Flutter will have WASM support too, so it'll be a great time. I've been looking into stuff like Iced, Yew, Tauri etc but the multiple client situation for cross platform isn't too great, unlike with Flutter.
Although, this solution feels heavily inspired by https://hotwired.dev/ Is there a reason why a new framework (with some extra cool features) has been developed, as opposed to contributing to hotwired/turbo frames?
The biggest problem with front-end development is that it goes all the way from static websites to complex web apps, and developers can't tell the difference.
Everyone jumps on the latest hype, going from either server to front-end, or nowadays front-end to server side, without thinking where your application resides on that scale.
So if one were starting a project in 2022, what would one recommend in terms of frameworks, tooling, and/or the key questions to ask to determine the right frameworks and tooling?
(I'm using Nuxt on a project, but not finding v3 to be as stable as I expected...)
1. If you expect to need a build (since you’re using Nuxt now that seems like a fair given): Vite is probably the best current intersection of easy setup, good defaults, easily discoverable tweaks, high quality integrations. I’ve held off build tool churn for several projects for a couple years now, but after spinning up yet another local exploratory project with Vite today, and minimal fuss for a very unusual use case[1] I can’t recommend anything else anymore.
2. If you’re already familiar with Vue, and have no problems with it, the only reason I’d stray is because stuff in JSX-land is getting around to addressing many of the problems people (rightly or wrongly) complain about here. Whether React Server Components, Qwik which ships a minimal JS bundle and data by design, or Solid with recent support for partial hydration… the component story being good for UX is increasingly compelling for JSX. I wouldn’t recommend any of these in particular unless you share more about how you’d prefer to develop.
3. If I were to recommend any meta-framework it’d be Astro. I’d have stronger recs for Solid Start but it’s explicitly not stable at present. Astro also has the benefit of being UI library agnostic which is a good gradual story.
4. Take literally all of this with a grain of salt. Part of the reason all these tools have a bad rep is from trying new stuff when existing tools are imperfect but serviceable. Happy to share some ideas of where to look, but if I’m choosing anything for my current projects under maintenance, I’m starting with how they can be adapted and reconfigured to solve whatever problems they don’t already solve. I’m only even looking at other tooling because there are serious gaps in the existing setup.
1: Porting a legacy XML-centric project away from proprietary server dependencies which take ~8 min to build and has gobs of incidental complexity and tons of performance problems… to run in a browser really fast. It took a few minutes to figure out how to set up the project to load static XSLT assets as ESM modules, as well as dynamically loading XML payloads both from local fixtures and as library input. Builds are instantaneous and runtime is fast enough I can sell running it in a headless browser as a major incremental improvement. I can probably also sell scrapping a whole server target, or at least its persistence layer, because it’s fast enough the primary responsibility (caching) is either moot or solvable in the browser.
This slideshow is relatively old, and frameworks have gotten better, not worse, since then.
Server-side DOM mutation is deeply interesting, but I don't think it can be grafted onto existing server-side frameworks. Someone will need to rethink the MVC model with client-side state in mind.
The cost of maintaining up to date JS (security patched, fixing breaks in functionality afterwards, etc.) seems larger than any boutique site with 1 dev could handle. I’ve been exploring ripping all my JS out and using LiveView instead for this reason.
Welp, I couldn't back out of that site after it made 100 URL changes. Your argument would be a lot more convincing if you addressed that issue. That's such an old issue; please don't bring it back with your new throwback style.
C’mon, put in a minimum of effort to understand what it’s all about before making useless comments.
It is not a framework to organize your own JavaScript (like react, vue, etc). It is a framework so that you don’t have to write JavaScript in the frontend.
Of course it is written in JavaScript, because that’s the only thing that can run on the browser.
C'mon, put in a minimum of effort to realize that the same original intention was behind every JS framework: so that you don’t have to write something unnecessary.
This is an absurd statement.
That doesn’t mean the OP framework is bad, or that the SPA idea isn’t generally abused and doesn’t call for some changes, etc.
It’s just a ridiculous and somewhat arrogant way to promote yet another fancy JS thing.
Is there a version of this that’s, like, a web page? That you can just scroll through in a couple of minutes, rather than clicking and waiting for a second or two after each tiny slide?
Modern browsers make a no-JS approach (and that means not even HTMX) fairly nice. Live updates might be an issue, but if you don’t need those…
When the browser can reason about your site it opens up nice cool stuff like right click to open in a new tab, bookmarks always working, fast load times, caching, etc.
Author here. The link is a 6-year-old presentation. The demo app has since been rewritten.
The old demo app focused on the downsides of classic multi-page apps (MPAs), i.e. the reason the world moved to SPAs all these years ago. In an MPA, clicking a link loses all transient state, like focus, scroll positions, and unsaved form state. This can all be solved by updating fragments instead of full pages, while still keeping rendering logic on the server.
One of my startup projects failed in 2014. One reason was the complexity shown on the slide for 2014. After that experience I don't do SPAs anymore. React only complicated the problem further, and it is expensive for anyone creating their own projects to hire React devs. What's worse is that new devs think React is the best thing to learn and start with that.
There are ways to keep things simple with the front end, only add js for the things where you want instant visual feedback. It requires a little pushback on the part of the dev team, against UX and Product management.
From what I've seen, streamlit seems very focused on data-science interactive-dashboard-type apps. Have you been able to develop anything much more extensive with streamlit, e.g. apps that power multiple pages and user-interface-driven workflows?