Modern Web Applications are Here (pocoo.org)
262 points by craigkerstiens on Nov 15, 2011 | 108 comments



With the last three projects I have started (none live, yet) the backend is nothing more than a REST API and the frontend a single HTML page with javascript, javascript, javascript.

This is what web applications are now and will be. The user experience cannot be compared to the old style of application. If you are still building applications today that are GET, fetch, pause, render, etc., then you are years behind. It is awesome being able to click on a link to a 10-page forum thread or blog comment page and have it render in ~100ms.

I think all of the web server frameworks will have to adapt - from RoR to Django etc. - since a lot of what they do is being moved to the client (and it becomes even cheaper to run large-scale web services because of this). There are also tons of gaps on the client side - from a more capable and cacheable templating engine through to a full-stack framework (something like RoR for JavaScript, but less confusing and hard to use than the current options).

The server now is just db+REST, auth and pubsub - soon enough somebody will release a generic PaaS that does all this based on a schema. Very little, if any, server-side code (unless you insist on supporting old HTML clients).


> With the last three projects I have started (none live, yet) the backend is nothing more than a REST API and the frontend a single HTML page with javascript, javascript, javascript.

You might be doing that already, but the article mentions (if I got that right) that if you GET an arbitrary URL, the server will still serve the whole rendered page - asynchronous loading only happens on subsequent requests.

This way, you don't "break" URLs, pleasing search engine crawlers and allowing copy/paste of the links, while still delivering a great experience for the user.
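A minimal sketch of that pattern with Express (loadThread() and the 'thread' view are hypothetical stand-ins, not anything from the article): the same URL answers a plain GET with the fully rendered page and an Ajax request with bare JSON.

    // Sketch only: loadThread() and the 'thread' view are made up.
    var express = require('express');
    var app = express();

    app.get('/thread/:id', function (req, res, next) {
      loadThread(req.params.id, function (err, thread) {
        if (err) return next(err);
        if (req.xhr) {
          // Subsequent, Ajax-driven navigation: send only the data
          // and let the client-side templates render it.
          res.send(thread);
        } else {
          // First hit on an arbitrary URL: serve the fully rendered
          // page, so crawlers and copy/pasted links keep working.
          res.render('thread', { thread: thread });
        }
      });
    });

    app.listen(3000);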

I truly believe there is a big demand for a Node.js framework to facilitate this. There are already a few that allow this kind of architecture (to some degree), but none are quite "there" yet.


> I truly believe there is a big demand for a Node.js framework to facilitate this.

How do you feel about derby? (http://github.com/codeparty/derby)


First time I've seen this; it looks extremely promising. Thanks for the link!


What are some of the frameworks you've seen? I've been looking for frameworks that allow client-side javascript routing to be reused on the server. So far I've found two, both using a combination of backbone.js and jsdom to render pages on the server:

https://github.com/Morriz/backbone-everywhere https://github.com/developmentseed/bones


Rendering arbitrarily on either the client or server is hard. Other solutions I saw forced the developer to think about it, so I developed FaxJs, which lets the system handle that complexity for you.

https://github.com/jordow/FaxJs

The gist is that you just write standard declarative UI structures in pure javascript and the system disassembles the markup on the server, and reassembles it on the client with all the events still intact. You wouldn't know all that is happening just by looking, though.

    ...
    var twoDivs = {
      className: 'outerDiv',
      onClick: this.outerDivClicked,
      innerDiv: {
        className: 'innerDiv',
        content: 'inner-most-div!'
      }.Div()
    }.Div();
Or you can just do it all on the client too, if that's your thing. You'll need to work to get this integrated into your routing technology as with any rendering system.


Try this combo: Express (server routing), Sammy (client routing), Jsdom, Jsviews (templates).


ASP.NET MVC does a really nice job. I have a pattern that detects whether the request is for JSON or HTML (via a GET or POST parameter) and returns a user control (module or what have you) or the entire page, respectively.


I'm mostly on the same page as you, trying to somehow integrate Backbone on the server side. In the end, the mess I was creating became really unmaintainable and I put the project on hold.


I am doing async on even the initial requests. If you land somewhere 'within' the site, the path will be passed from the backend router to the javascript router and then triggered on the client side.

The decision to be made there is whether you want to load all the objects (models, views, etc.) up front, or have an init where it is all loaded either async or from the localStorage cache. I went with the latter so that the HTML page can be statically cached.

This has taken me a while because there was a lot of JS to write - e.g. a sync replacement that uses the local cache and the server smartly, responds asynchronously with 'ok' so the view can be updated rather than waiting on the server, and then triggers server updates every x seconds or when the user closes the window (this all has to be cross-browser, which is the hardest part).
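A rough sketch of that kind of Backbone.sync replacement (cacheKey(), queueWrite() and flushWrites() are hypothetical helpers; conflict handling and error paths are omitted):

    // Sketch only: cacheKey(), queueWrite() and flushWrites() are made up.
    var serverSync = Backbone.sync;

    Backbone.sync = function (method, model, options) {
      if (method === 'read') {
        var cached = localStorage.getItem(cacheKey(model));
        if (cached) {
          // Answer from the local cache so the view renders immediately.
          options.success(JSON.parse(cached));
          return;
        }
      } else {
        // Queue the write locally and acknowledge right away,
        // instead of making the view wait on the server.
        queueWrite(method, model.toJSON());
        options.success(model.toJSON());
        return;
      }
      // Cache miss: fall back to the normal server round trip.
      serverSync(method, model, options);
    };

    // Push queued writes to the server every 10s and on window close.
    setInterval(flushWrites, 10000);
    window.onbeforeunload = flushWrites;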

A Node.js framework that does all this would be great; while I am using Python atm, I think these types of apps are very well suited to run on Node.


TBH there's no reason it needs to be Node, you just need to use a template engine that works on both client and server.

Not that I know of any yet.


Mustache (http://mustache.github.com/) has implementations in 20+ languages including javascript. The templates themselves are entirely declarative - any logic is contained within a separate object.

If you don't need any custom logic at all, you can easily use JSON to pass that object between client and server with no further changes. If you do need some custom logic, it'd be slightly more work, but then again you could just opt to use javascript for any small pieces of logic, as there are V8 and/or Spidermonkey wrappers for most or all of the languages supported by Mustache.
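A minimal sketch of what that looks like on the client side; the same template string can live in a shared file and be rendered on the server by any of the other Mustache implementations, with only the JSON view object passed between the two:

    // Sketch only: the template and data are made up.
    var template = '<li class="comment"><b>{{author}}</b>: {{body}}</li>';
    var view = { author: 'armin', body: 'Modern web apps are here.' };

    // Mustache.to_html() is the classic mustache.js entry point.
    document.getElementById('comments').innerHTML += Mustache.to_html(template, view);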


Jade (very HAML/Slim-like) can do both client- and server-side rendering, and it can also precompile your template into pure JavaScript so users don't need to download a bulky template engine for it to work.


Zappa with CoffeeKup can do server and client template sharing. Or jsdom and jsviews if you find Zappa too fancy.


"...then you are years behind." I don't agree with you. With AJAX you still have to GET, fetch, pause, and render. The only difference is the amount of data transmitted. Battlelog is transmitting layout and data the first time. Afterwards it is transmitting data only, but still needs rendering on the client side. When your website has little layout-data the overhead is marginal.

So I don't know what a 10-page forum has to do with it. I can render a 10-page forum in ~100ms without Javascript. It all depends on server-time and layout-data.

Maybe you should tell PG this site is years behind ;)


I agree that Modern Web Applications are here. The key point is that they don't have to be architected the same as Battlelog to be considered modern.

Most of us are so concerned about (or straight up scared of) the JavaScript/HTML5 revolution that we can't see what is possible right now. Sharing templates server- and client-side is a magic bullet that everyone should be using.

Web applications are only going to get faster, and if you want to keep up, you're going to have to implement modern solutions like Battlelog and Google+ do.

I can envision a time when full page refreshes are the exception and that isn't a bad thing.


> Most of us are so concerned about (or straight up scared of) the JavaScript/HTML5 revolution that we can't see what is possible right now.

What I see in practice is the exact opposite. Everyone is writing about "what is possible" with JavaScript (a custom 3D rendering engine for every website!), while very few people are concerned about the architectural and practical issues of it all.

> Web applications are only going to get faster, and if you want to keep up, you're going to have to implement modern solutions like Battlelog and Google+ do.

> I can envision a time when full page refreshes are the exception and that isn't a bad thing.

Full page refreshes are not what makes web applications slow. I used to browse the web on a 133MHz computer over a mediocre modem connection. Today, despite all these promises of speed, many JS-heavy websites seriously lag on anything slower than 1.5MBps and a 2GHz dual-core. (Try browsing via Kindle.)


Even on my BF3 gaming PC, the UI feels sluggish and lags on most clicks. In addition, it feels like heavy-JS sites are much more error-prone than simple, hard-to-fail get+get+get sites. Battlelog is often not in sync with the real state of the server and your computer. Only page reloads can help in those situations.


You don't need to involve the server in generating the UI at all. You can have a static index.html, that bootstraps a javascript environment from static js files, whose first order of business is to contact a web service to fetch the initial content and configuration as JSON data. This makes all content static except that which is truly dynamic, which offers amazing CDN and caching possibilities. The JSON data can be cached in local storage, for easy offline support without constantly revving manifest files.
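A bare-bones sketch of that bootstrap flow (the /api/bootstrap URL and the render() function are placeholders):

    // Sketch only: /api/bootstrap and render() are made up.
    // Everything except /api/bootstrap can be served statically from a CDN.
    function bootstrap() {
      var cached = localStorage.getItem('bootstrap');
      if (cached) {
        render(JSON.parse(cached));           // instant paint from the local copy
      }
      var xhr = new XMLHttpRequest();
      xhr.open('GET', '/api/bootstrap', true);
      xhr.onload = function () {
        localStorage.setItem('bootstrap', xhr.responseText);
        render(JSON.parse(xhr.responseText)); // refresh with live data
      };
      xhr.send();
    }
    bootstrap();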

Once you get to that point, the dev experience starts to look a lot like traditional desktop client-server app development. Except desktop apps are built using components instead of templates, components that combine rendering and behavior into one encapsulated entity that you can treat like a black box. There's nothing preventing a web app from using that same architecture. In fact, ExtJS does just this.

But I don't think it stops there. I think we're going to get meta-configuration languages like mxml to send the layout of these components to the javascript environment, which will, essentially, mean that we're building another browser inside the browser. And so all software evolves until it can render a web page, including web apps.


Can somebody tell me what the point is? How much of a saving are you getting by transferring JSON data+metadata versus what would normally be sent? I'm ignoring the extra time that would be required to implement and test it.

And like you say, it starts to look like a traditional native app, so why not just create a native app, if you're working in a ___domain that requires...whatever benefit you're getting here. It seems like you're just trying to approach a similar appearance to the user, while simultaneously wrestling with lots of technologies that don't suit the idea (CSS for example).


If you do this, crawlers can't index your pages. Also, your users see the page load blank and get gradually populated with content, which is not good UX.


If it's a true application, with user interaction and everything (and not just a set of glorified Web pages), then the former is not an issue; you don't really want your "pages" to be indexed. And the latter is easily solved with a "loading" bar or any of the other UI tools previously invented for the exact same purpose on the desktop.


Not true. If you are building such an application:

1. You need to have a non-JavaScript version. That is, your main page index.html is a fully functioning page without JavaScript. When JavaScript is enabled, it turns the page into a Web App.

2. A simple loader will solve the problem. You might not be able to show the progress for the moment.


Why does he need to have a non-javascript version?

There is (thankfully) no law which says that your service must be usable for people who turn JavaScript off.


Yeah, there's no law which says your website must be accessible...


Working with Javascript off != accessibility. Most screenreaders (the common face of accessibility technology) run on top of regular browsers, so your JS experience needs to be accessible too if you are legally mandated to (or even better if you care).


If JavaScript being off makes your site not work, or at least not viewable, then that's the very definition of not being accessible (I'm not limiting 'accessible' to only mean 'available to people with disabilities' like you appear to be).

Edit: Graceful degradation should be a tenet of web development. Having no provision for a person who doesn't have JavaScript enabled seems bizarre to me, and being bullish about the requirement seems even more bizarre... maybe I'm just getting cranky with age or something.


That is like saying that I must make sure my programs work for people who don't have a computer and rely on annotated screenshots being sent back and forth. To my programmer brain, that is just stupid.

It is one thing if we were talking about a webpage a la 1995 for Joe's auto shop. At most it will have a comment field (or, the horror, a guestbook) and each URL will map logically to a specific public part of the site. Most likely there will be no concept of users at all.

The same thing goes for a blog, although you won't get the auto-updating Twitter feed and, if the blog uses Disqus, you may have to click a couple of links to see the comments.

But a webapp is more like Google writer. It just loses all meaning to make it work without JavaScript.


Accessibility cuts both ways, though. Choosing to prioritize the HTML-only version decreases accessibility to people who have limited bandwidth or download speeds.


This surely hasn't been a major problem for typical sites since probably the 14.4k modem days, and, even if it was, it wouldn't limit accessibility (except in the sense that it'd delay viewing by a few milliseconds to seconds). Especially when the majority of the bandwidth cost will be data that has to be sent regardless (maybe with a necessarily increased amount of metadata).


Your first point is not correct. Crawlers can crawl JavaScript-only websites using the Ajax crawling API: http://code.google.com/web/ajaxcrawling/docs/specification.h...

Your second point has merit. However I'd counter it thus:

* You can load an HTML page first, containing a representation of the end content, so you can avoid the flash.
* You can show some sort of loading indicator on page load. Users are usually fine with waiting for the initial page load - it's only subsequent interactions that need be fast.

In other words, there's no reason why the page need be blank.
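For reference, the crawling scheme boils down to the crawler requesting #! URLs as an _escaped_fragment_ query parameter, which the server can answer with plain HTML. A rough sketch, assuming an Express-style route and a hypothetical renderSnapshot() helper:

    // Sketch only: renderSnapshot() is made up.
    // The user-facing URL /#!profile/42 is requested by the crawler as
    // /?_escaped_fragment_=profile/42; answer that variant with static HTML.
    app.get('/', function (req, res, next) {
      var fragment = req.query._escaped_fragment_;
      if (fragment === undefined) {
        return next();  // normal users get the JavaScript app shell
      }
      renderSnapshot(fragment, function (err, html) {
        if (err) return next(err);
        res.send(html);
      });
    });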


Only Google does that currently.

And while people can build crawlers that can fetch JavaScript-rendered pages, these scripts will have to drive real browsers with rendering engines, not just simple HTTP clients. And this is a problem, because fetching millions of web pages will require serious infrastructure work, versus a crawler using libev that can fetch dozens of pages in parallel without breaking a sweat.

Also, that spec you gave is not exactly how Google operates. That's the advice Google gives for making web apps compliant, but the reality is that Google is smarter than that and it takes Google a good amount of effort to fix other people's screwups.

Accessible to crawlers doesn't mean Google-only, and you should be careful not to throw the baby out with the bathwater.


For most purposes, search engines seeing you = Googlebot seeing you. I have never had a competitor past single digits in percentage of search engine referrals.


A crawler is not necessarily providing content to a classical search engine, as it can serve many purposes. Ultimately such crawlers may generate links to your website, and while not bringing you much referral traffic, they may increase your ranking, which ultimately means more traffic from Google.

Also, if search engine referrals are all you're interested in, then if Google drops you from their index, your website might as well not exist, right? As I said, be careful not to throw the baby out with the bathwater. Google is not what made the web great, but a Google dependency might break it, and website authors can be blamed for this.


However, by using hashbang URLs, you have to keep that little snippet of JavaScript forever to maintain permalinks, even after the web has moved on to another solution.

It's not a future-proof solution.


Hashbangs are only a stop-gap at this point - modern browsers ship with the HTML5 History API.


Does IE8 ship with the History API?


No, IE8 is not a modern browser by most definitions.


Which definitions? I've seen definitions of "modern browser" that only exclude IE 6 and 7.


They are wrong. They use modern to mean "quite new and available to most clients" instead of "actually implementing the majority of the latest standards".

IE 9 comes close, and can be considered modern.


I'm not sure what you mean by little snippets of javascript; a link is a link. If you change formats, why couldn't you just 301 them to your new one? It's a trivial redirect that you're keeping for the g-juice.


The part after a hashbang is not sent to the server, so you need client-side code to load the content.

This is because the # in a URL is supposed to be an identifier of an element in the page, not an actual resource.


Or you could just keep a tiny snippet of Javascript that searches for old style hashbangs and redirects all of them to the "another solution" handler to decide what to do with them. It can be just 3-4 lines of JS.
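Something along these lines, roughly:

    // If the URL still carries an old-style #!path, rewrite it to the
    // plain path and let the current routing solution take over.
    if (location.hash.indexOf('#!') === 0) {
      location.replace(location.pathname + location.hash.slice(2));
    }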


The problem is that you have to keep that snippet forever if you ever used hashbangs. That's why I'm saying it's not a future-proof solution. Not that a few bytes matter, though.


> Your first point is not correct. Crawlers can crawl JavaScript-only websites using the Ajax crawling API: http://code.google.com/web/ajaxcrawling/docs/specification.h....

This sounds like a huge imposition to implement -- so much so that I seriously doubt anyone will bother for anything even moderately complex. Not only do you have to produce an API for the client-side javascript to call, but you also have to render static HTML for Google to index.


Remember this is an application: would a search engine want to crawl the innards of Photoshop or Outlook, or a web page promoting them? Just because it's on the web doesn't mean it's a web site that wants to be searchable.


In most community-based, social-networking sites, the companies behind them do want more exposure. The more people who join, the more vibrant the community is.

FYI.


"the dev experience starts to look a lot like traditional desktop client-server app development"

I have been wondering if we might see the rise of a general-purpose JSON/RESTful API that plays the same role for these kinds of apps as general-purpose database engines did for the old thick-client model.


Funny thing is that we were building precisely that nearly ten years ago: https://bitbucket.org/BerislavLopac/waexplorer :)


And what Flex and Flash developers have been building for 5 years ;)


Everyone understands that but we really want to move to open standards. Html5 everywhere vs Flash wherever Adobe has the resources and time to do a good port.


Or in most cases a not very good port...


> Once you get to that point, the dev experience starts to look a lot like traditional desktop client-server app development.

... with all the security/privacy problems that go with that, and then some.


"I can envision a time when full page refreshes are the exception and that isn't a bad thing."

Agreed. Different clients should get different views. I don't think we're that far off from having to support the js-light "web crawler" view, although we already have the tools to make this really easy.


> Sharing templates server and client side is a magic bullet that everyone should be using.

Out of interest, what solutions are there for this currently?


I've been playing with Mustache, as it works in Ruby and JavaScript (and something like 13 other languages.) Have to be a little careful to ensure that you only pass objects to the template, of course, so you don't bleed any language-specific stuff into your template.

But what I've tinkered with is very exciting.


I'm not sure I agree with the "work of beauty" statement. A browser plugin? Rendering all pages on the client via some massive JS framework? Intercepting all page loads and hooking into browser navigation? Compiling templates into Python & Javascript? Is all this complexity really justified?

I know that web apps are all the rage these days, but given the native plugin, the pickiness about browser versions, the fact that they apparently don't care about being indexed by search engines, and all the trouble they went through to make it all work together - wouldn't they be better off just implementing the whole thing as a native app?


"Work of beauty" is relative. The field that we're talking about here is video game server browser design, which isn't exactly known for being cutting edge. In general, game designers have almost always produced interfaces that are passable at best, because no one really cared. The only one I've used that was moderately decent was Valve's[1], the rest are a tacky mess with scrolling that barely works and filters restrictions that make it hard to find the game you actually want.

Battlelog is really good. It responds almost instantly. You can middle click open server info pages, then middle click open the stats for every player on that server. Updates don't require a patch to the client, just change the HTML and JS that Battlelog is sending out (this was already done to add queues when servers are full).

It could certainly have been done in game with embedded WebKit/IE or something though, prepackaging the plugin alongside. I don't really mind either approach, since Battlefield 3, for once, is a game on Windows that has absolutely no problem with the use of alt+tab.

[1] http://www.blogcdn.com/news.bigdownload.com/media/2008/11/le...


This is a throwback to thick client-server computing. Welcome to 1992 everybody!

I'm not a web dev, but I like the client-side rendering. (Even if I hate JS with a passion). This is a fantastic move as it lets you very clearly separate presentation from business logic and the database. Also, the rendering HTML/JS can be easily cached. So you get a normal page load the first time you try the app; every time after that you get _instant_ results. Nice.


Why do you hate JS?


It's an immature language with lots of design mistakes. When the bible for JS developers is called 'JavaScript: The Good Parts', it's a worry...


1. The browser plugin is just to launch the game, not render the pages.

2. The "massive JS framework" won't seem so massive once you browse a few pages – the payload savings will likely pay off pretty quickly. Notice that the AJAX data for the index page was 4KB, vs. 18KB for the fully rendered page.

3. Cross-language templating languages aren't so rare (Mustache being a very popular choice).

Consider this: once you want even just one tiny widget on the page to update via AJAX or push notifications, you've already got the code to support half of these features.


If you render the initial view on the client you still get indexability. Twitter actually still uses the old version of their site for this purpose.


So does this mean you have to support two... paradigms and keep the results similar for each version?


Yes and no. If you have a templating solution that works well on the client and server then this shouldn't add much complexity. Rendering the initial view on the server has the added advantage of supporting javascript-disabled clients too.

I muddled my point by mentioning Twitter - it was very nearly a non sequitur. They're supporting two paradigms but not because they want to. My understanding is that they don't have a single solution that works equally well on the client and server so they are using two different solutions.


Ah ok understood, thanks.


While I'm excited about "modern web applications," my understanding is that they're harder to develop for and test (I haven't done one yet). If that's true, it's worth considering whether the ROI is there to make your app a single-page javascript app versus how fast you can iterate via the more standard method. You may not have the traffic where offloading rendering to the client makes sense. It's likely many actions your app does are "fast enough" as well, so a rich UI experience isn't going to be a huge improvement. Reserving the highly interactive bits for where it really counts may be a better idea.

That said, it's only a matter of time before the "harder to develop and test for" goes away and rich apps will become more of the norm.


I think they're only really harder to develop insomuch as it's more like developing two applications: one backend that throws out JSON/XML or whatever, and a frontend in HTML/Javascript.

I actually find that bit easier, because I can separate out the parts and worry about things a piece at a time. The other benefit of this is if you choose to develop for iOS, Android or any other platforms you already have an API. Equally, if you wanted a developer program, again it's already there.

The complexity comes when you make the decision as to whether or not to support clients that can't or won't execute Javascript, if you want to make it work for them too you're stuck with more work and things do get more difficult.

As far as testing goes, there are plenty of mature javascript unit testing frameworks, and if you're inclined to unit test your javascript already this doesn't add a great deal of overhead. You're really just testing something in JS you would have otherwise done and tested server-side.


I have to agree with you that "developing two applications" is actually simpler, especially as your application matures and when you want to start developing different kinds of clients (e.g. mobile).

However, my experience is that it still has some major drawbacks. JavaScript frameworks (testing and otherwise) are still less mature than their server-side counterparts. It is not always as obvious how things should be done. There are still lots of incompatibilities between browsers. And when an error happens you cannot log it (or at least it takes more effort).


Interesting, as Battlelog has been criticized by the gaming press including the Penny Arcade guys, who call it buggy and hard to use.

I have to say, too, that in my experience the more client-side state a web page keeps, the buggier it tends to be. Building applications this way is harder, and I hope we don't end up losing the characteristics that have made the Web so successful in the transition.


Battlelog by itself would be fine if it was just for statistics and socializing. On the PC it's also how you launch the game as there's no in game server browser. When you launch BF3 it launches origin which in turn opens your browser to battlelog. The context/app switching is annoying and slow as you go from your browser to the game and then back to the browser when it's time for a server change. When the game first was released and things were slow and buggy (in game and on battlelog) it was very painful.


I never had any problems, and I have to say it is easily the best server browser I've ever used. Valve's pre-TF2 updates one (still toggleable back on) is probably the only other one I've actually liked before. It's a huge improvement over DICE's browsers in the past, which have always been merely bad at best (the BF2 menu had to load when you pressed escape).

A lot of users complained that there was no feature to allow you to wait in a queue for a full server. So... they added it. Pushed a server update. No patch, no new binaries to download. A new checkbox simply appeared.

Now, there's no reason that this has to be in browser. EA and Valve both clearly have WebKit or IE implementations (I think I heard the "clicking" noise in the Origin browser) that play nice in fullscreen games. They could certainly be integrated as part of the game interface, but Battlelog makes it clear to me that HTML and CSS are the way to go with video game server browsers in the future.


During the first few days the game was out I routinely received errors trying to get the server list at all but as I've said, they have been incrementally improving battlelog since release. My biggest complaint is that the server browser isn't available in game. The game completely closes when leaving a server from in game or closing a game from battlelog. Then it has to be relaunched when starting the next game. This is really slow for many of us and there doesn't seem to be a very good reason for it. The use of origin also doesn't add anything for the player. I also would prefer being able to launch the game directly without having to open three applications (origin, browser, bf3).


Exactly. As it currently stands, battlelog is not an improvement over previous systems. If you were to compare the time it took to get into a game with friends on Bad Company 2 (the previous Battlefield game) and Battlefield 3 you'd see this.


I've heard the complaints about getting friends in game, and then on the same squads. I haven't run into this problem yet as none of my friends bought the game due to origin.


It definitely is harder.

I've done my share of such web applications using GWT.

This technique, known as the "single-page application", is like writing a desktop app but with additional complexity, such as maintaining history on your own and deciding what the "back" button action means depending on the context.

The other additional complexity is the "offline" mode. Now suddenly you have sync issues.

Pretty difficult even with frameworks.


> in my experience the more client-side state a web page keeps, the buggier it tends to be.

I've noticed that too. You still need to load a fresh DOM from time to time. I'm sure future frameworks will have some sort of a semi-refresh where all of the static elements stay the same, but the framework runs a sort of cleanup on the DOM.


FYI, Armin doesn't mention it in his article, but the guys that built Battlelog, ESN, are releasing the web framework behind it: http://www.esn.me/product/planet/. They also have a great service, BeaconPush, similar to Pusher, only better IMO - both because it supports the notion of users and because it has both a Flash websocket and an XHR long polling fallback, whereas Pusher only has a Flash websocket fallback (believe me, this matters, I've tried both).


We are a startup called Glancee. We build a mobile app (iPhone version + Android version) that finds people in your area with friends or interests in common with you. The apps are native Objective-C and Java apps, and the backend is a mix of Python, MongoDB, and a bit of Erlang.

A month ago we decided to build a facebook app to reach users that don't have a smartphone. We chose not to change one bit of code in the backend, and we were able to build the web app in 3 weeks with backbone, jquery, and websocket-js.

You can try it here: http://apps.facebook.com/glancee

The app is just one 40-line html page, the rest is javascript (and templates embedded in js). You never refresh the page when clicking a link, which gives you the feeling of using something as fast and robust as gmail.

CSS files and JS files are compressed with requirejs before being deployed, so to load the page you need three requests (plus images). Right now our biggest bottleneck is the facebook api, which is tremendously slow.


My problem with Battlelog isn't necessarily that it's browser-based. My problem is that I'm forced to run two other system-based applications on top of it. Namely Origin, the sole focus of which seems to be forcing me to buy EA games through EA exclusively.

Battlelog as a stat tracker is great. As a system of convenience run in conjunction with Origin and the actual Game EXE it sux.


And herewith another example of history repeating itself.

If we make the assumption that:

- The vision of Web 1.0 (mid- to late 90s) was Web 3.0 (the modern web app).
- Web 2.0 really just evolved the technologies and tools.

During the 90's we deplored thick client apps. We had 2/3/n tier on the desktop, and we wanted web apps.

Now, 15 years later, we're building 2/3/n tier apps in the browser - but we make the same architectural mistakes we made with thick clients, we ignore user control and consent, we expose devices to all sorts of attacks that don't exist in thick client apps...

That's hard-core irony, right there.


For those of us who weren't around, could you please tell me why we deplored thick client apps in the 90's? Was it just that they had to be MS Windows Win32 or MFC apps? Or is there some other reason?


"All the pages can be rendered on both the client side via JavaScript as well as the server. How this work I cannot tell you"

There are plenty of templating engines that have been ported to Javascript. Mustache is the first example that comes to mind: http://mustache.github.com/

Once you have the templating engine the rest of the logic is pretty easy to build.

Rendering web apps entirely on the client is pretty awesome in general, although there are two problems:

1) The push state API is not supported in all browsers yet, which forces you to resort to the fragment identifier + onhashchange to approximate the same functionality. And of course the fragment identifier only affords you a fraction of the luxuries of the push state API. And of course onhashchange is not supported in older browsers.

2) When you fall back to the fragment identifier, rendering on the client is actually a little bit slower. The fragment identifier is not sent to the server, meaning the JavaScript has to be loaded in the browser before anything at all is rendered. Does this lead to several seconds of delay? No. But it is noticeable. At least with push state you have the option of rendering the initial content on the server and all subsequent requests on the client without increasing complexity too much, assuming you have a good templating solution in place.
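A rough sketch of how the two cases are usually bridged (router() here is a placeholder for whatever actually renders a given path, not anything from the article):

    // Sketch only: router() is made up.
    // Prefer the History API; fall back to hashbang + onhashchange.
    var navigate;
    if (window.history && history.pushState) {
      navigate = function (path) {
        history.pushState(null, '', path);
        router(path);
      };
      window.onpopstate = function () { router(location.pathname); };
    } else {
      navigate = function (path) {
        location.hash = '#!' + path;  // onhashchange fires and routes
      };
      window.onhashchange = function () {
        router(location.hash.slice(2) || '/');
      };
    }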

But yes I agree. Modern Web Apps are Here :)


> The real interesting thing about Battlelog however is a Windows PC specific component. If you are heading to Battlelog from a Windows PC and you own the PC version of Battlefield 3 you can launch into a game right from within the browser. How does this work? It works with the help of a browser plugin that exposes additional functionality to the in browser client. Namely it has a function to start the game and pass it information as well as a general purpose function to ping an IP address which is used for the server browser.

Unreal did that back in 1998. They registered the unreal:// protocol in Windows, so any hyperlink containing an address such as unreal://127.0.0.1 will launch the game and connect to that IP. The good thing about this is that it can also be used by third-party websites - for example, to promote your clan's server - and it is completely browser-independent. I don't know if it could be abused to "rickroll-launch" the game, but I haven't heard of any such incidents.


Even before "Unreal" you could find the servers, check the ping, see the players' nicks, scores and statistics via browser in both "Quake" (technically: "QuakeWorld") and "Quake 2". The plugin to do that was called "QPlug"/"Q2Plug" and it was co-created by John Carmack himself.

Personally, I'm not a big fan of such solutions (Battlelog included) - mainly because switching between fullscreen game and browser is, in most cases, very jerky (although it has improved since "Unreal" times). Also, the difference between in-game and in-browser UX ruins the flow of the whole thing. Have to admit though that viewing friends, statistics, etc. through a web browser is very convenient.


I was thinking this as well. Why not just register a protocol handler calling out to the Battlefield client? Seems much more cross-browser. Why wouldn't that work, and why is the browser plug-in really interesting? What am I missing?


You are missing the two-way communication that is impossible with URL handlers. A URL handler can only transmit information in one direction and cannot do so implicitly: the user has to click on the link.


Same with e.g Call of Duty 4. If you go to Game Monitor and click on a server's IP address, it'll open a link like:

    cod4://212.85.69.40:28960/


I don't see how this is different from using any of the myriad JavaScript MVC-style frameworks out there in tandem with websockets.


I'm glad to see someone on here doesn't think this is quite so sensational. The only relatively new thing they've done is launching the native game from the browser plugin - and I also wondered if a custom URL scheme would not work instead (and be simpler).

In terms of game manufacturers it is light years ahead of what they've done before. I just don't see why the author is quite so blown away by it.


I tend to agree with you. XSLT web apps can render on the client and just transmit XML. Same idea. GWT does this too. And I'm sure there are older examples.


The point here is that progressive enhancement is baked in all the way through, e.g. using the same URLs regardless of whether a page is being requested by the thick JS client, an old dumb browser, or a search engine crawler. The realtime stuff works regardless of whether websockets are available (Flash & long polling fallbacks).

What this means is that you can actually deploy these new features in the real world without freezing out part of your audience, as happened with Gawker's ill-fated redesign.

What's more, you can do this without having to develop everything twice, using abstractions like client/server shared language-agnostic templates and realtime libraries like Socket.IO.


Now that the client is the MVC execution environment with the client-server interaction used mostly for data replication, plus some extra invokable server-side behaviours, we can congratulate ourselves on having more-or-less reinvented Lotus Notes.


I was surprised to only see one brief mention of GWT in this comment thread. I use GWT on one of my personal projects and SmartGWT on two client projects, and even though the development process has some difficulties like long Java to Javascript compile times, it is great to be able to write/debug both client and server side code in the same IDE.

Something I read a few months ago: Thoughtworks paper on technology that described GWT as a bad idea, very well implemented. :-)


I should have added for people who are not familiar with GWT: you write your client app in Java, almost like writing a Swing app, and it gets compiled to 6 different combinations of Javascript. Google's setup code determines browser capabilities and downloads the compiled Javascript best for your environment. After that, the only data passed between client and server is model data for the UI.


I also use GWT on a project I'm working on, except the back end is a combination of nginx, nodejs, and postgresql, so I don't get the luxury of coding everything in the same language. However, so far I've found GWT pretty great to work in as it gives you the discipline of java while letting you easily write client side code.


Companies that properly implement service-oriented architectures will be very well positioned to create these advanced web applications that push the computational cost of rendering the UI to the client machines. An added benefit that many overlook is the ability to execute on mobile strategies. I do not buy the idea that mobile applications will one day all be HTML-based; client-based applications will always produce a richer, more integrated experience. But if you have already designed your HTML application to function more like a rich client, you already have the API calls that any other mobile or desktop application would need. You have forced yourself to design the back-end based on services that encapsulate a good amount of business logic that will not have to be repeated when implementing the different "views" of your application.


That's the big win I think this architecture has: it's already an API, ready for platform-specific implementations.

It also makes a developer program something that's quick and easy to support; again, the API is ready and waiting.


This reminds me of Quora's LiveNode stack: http://www.quora.com/Quora-Infrastructure/How-does-LiveNode-...


Is this phrased accurately?

The framework then hooks into your browser's navigation code and intercepts all page loads. Instead of letting the browser replace the page with something new on load, it instead does the HTTP request via Ajax and adds an additional header to the HTTP request: X-Ajax-Navigation.

Wouldn't it be better to intercept the page leave event, rather than load?


I figure it probably just intercepts all click events, which usually trigger loads, hence "intercepting page loads".
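The article's actual code isn't shown, but the rough shape of that idea looks something like this (the #content element and the header value here are assumptions):

    // Sketch only: catch clicks on same-host links, re-request the same
    // URL via Ajax with the extra header, and swap the content in place.
    document.addEventListener('click', function (e) {
      var link = e.target;
      if (link.tagName !== 'A' || link.host !== location.host) return;
      e.preventDefault();

      var xhr = new XMLHttpRequest();
      xhr.open('GET', link.href, true);
      xhr.setRequestHeader('X-Ajax-Navigation', 'true');
      xhr.onload = function () {
        document.getElementById('content').innerHTML = xhr.responseText;
        history.pushState(null, '', link.href);
      };
      xhr.send();
    }, false);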


Does anyone have any good examples of a substantial web app using this architecture that doesn't require purchasing a video game or creating an account? I'd like to take a look at Firebug (actually, Chrome developer tools) and experience the snappy performance the author talks about.


I like poking around the javascript "applications" of Twitter and Stack Overflow. The latter uses jQuery, and both provide for a nice code "book" to read (granted, the variables are minimised, and the code compressed, but you can fix the latter by running it through jsbeautify).


Quora uses a very similar architecture.


Worked on something similar with javascript, nginx, gevent, 0MQ, and a C backend. The only issue we had was with file uploads for transferring large amounts of data.

How do other people do file uploads with an asynchronous web server?


I would suggest using the nginx file upload module, which passes only the path of the uploaded file to your gevent/tornado app server.


Thanks, I'll check that out.


To get Battlefield 3 working on my friend's Windows Vista PC I had to install 215 updates. It took 6 hours. Why? Because the machine had to have IE9 installed, even though that wasn't his default browser and wasn't the browser started when Origin kicked off Battlefield. So for sure this is cool in theory - the idea is certainly plausible and fun to talk about - but melding the complexity of web apps with the complexity of system apps makes me shudder.



