It's incredible multi-process took this long. Just goes to show that your architecture decisions last a long time and are often difficult to change. Chrome had this from day one and never had a big, old codebase to worry about. Yet it took Firefox many years to get multi-process going, and my understanding is that it's much more limited and simpler than what Chrome or Edge do.
I'm also a little surprised there hasn't been an attempt to launch a completely new Firefox from the ground up. Regardless of what they're doing right now, it's still a legacy code monster and much more laggy than the competition. Maybe this is Servo's ultimate purpose, but every Firefox advance, while welcome, always feels like another layer of lipstick on this pig.
Disclaimer: I use Firefox as my main 'non-work' browser several hours a day. It's good, but it's very obvious when I'm not in Chrome from a performance/stability perspective.
> I'm also a little surprised there hasn't been an attempt to launch a completely new Firefox from the ground up.
I (hopefully) think this is what Servo is and will end up being.
It has no legacy code, and is tiny in comparison to Gecko. Its OS support is fairly modern, and it has no interest in supporting Windows XP.
It's also written in a way that allows it to be more multi-threaded and more concurrent than traditional rendering engines.
At some point you have to draw a line in the sand and start again, and I hope Servo is that.
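To make the "more multi-threaded" point concrete, here's a minimal sketch of the data-parallel style Servo's design enables, using only the Rust standard library. `style_node` is a hypothetical stand-in for real style computation (Servo itself uses a work-stealing scheduler, not raw threads); the point is that the compiler enforces that the threads touch disjoint data.

```rust
use std::thread;

// Hypothetical stand-in for real per-node style computation.
fn style_node(id: usize) -> usize {
    id * 2 // placeholder "computed style"
}

// Style all nodes in parallel with scoped threads: each thread borrows
// a disjoint chunk of `results`, and any aliasing mistake is rejected
// at compile time rather than becoming a data race.
fn style_all(nodes: &[usize]) -> Vec<usize> {
    let mut results = vec![0; nodes.len()];
    thread::scope(|s| {
        for (chunk_in, chunk_out) in nodes.chunks(2).zip(results.chunks_mut(2)) {
            s.spawn(move || {
                for (out, &id) in chunk_out.iter_mut().zip(chunk_in) {
                    *out = style_node(id);
                }
            });
        }
    });
    results // all threads have joined when the scope ends
}

fn main() {
    println!("{:?}", style_all(&[0, 1, 2, 3, 4, 5, 6, 7]));
}
```

This is the kind of parallelism that's hard to retrofit onto a C++ engine: nothing stops two C++ threads from silently sharing a node, whereas here the borrow checker makes the disjointness a compile-time guarantee.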
> I (hopefully) think this is what Servo is and will end up being.
Not really the plan. Servo may eventually become a product, but that would be in the very far future. But Gecko can use lessons learned from Servo, and try to share code with it, so you can incrementally replace parts of Gecko.
> and has no interest in supporting Windows XP.
I don't think that's the case? We don't have users on XP so we may not support it right now, but that's just an issue of priorities and not having the resources to fix all the things at once.
There are more XP users of Firefox than there are Linux users (probably for other browsers too). If Servo were a product, it would probably care about XP.
Yup, hence my "hopefully". I'm aware small parts of Servo may go into Gecko incrementally, but I'm still rooting for something like browser.html to reach a usable state, and for Servo to run on its own.
Well, that was 2014. And yes, there's not much interest in putting the effort to support XP now.
That is because Servo is currently not a product. You said "I (hopefully) think this is what Servo is and will end up being.", which talks of a future where Servo is a product (which can happen, though it would be in the far future). In such a case Servo would reevaluate supporting XP (unless it is so far in the future that XP support is no longer important, in which case most probably other browsers will drop XP support too). I say it in that comment too, "we're currently a research project" and "If/when we stop being a research project, maybe, but I doubt it". I'm less doubtful now, but yes, it's possible that Servo would continue not supporting XP as a product if there was a good enough reason for it.
The reason Servo doesn't support XP is not because Servo is Servo, it's because Servo is not a product.
Add-ons are a big part of the Firefox ecosystem, and they reach deep into the internals of the browser. So replacing the browser wholesale would break a lot of things over the course of a single release.
> I'm also a little surprised there hasn't been an attempt to launch a completely new Firefox from the ground up.
It's interesting how the history goes.
We had Netscape 4, and it was crap (remember when resizing a window reloaded the whole page?)
So there was a ground-up rewrite of the rendering engine, resulting in Gecko, which was put into Mozilla.
Gecko was great, but Mozilla was a bloaty amalgamation of features, so a pared-down version was created, called...
~Phoenix~ ~Firebird~ Firefox, which had the great rendering engine and a lean, native-like UI.
Then there was KHTML/KJS which was built for KDE, which had a lean architecture but didn't have the investment to get the compatibility 100% there...
Which Apple then poured investment into via their fork, WebKit, which paid off in spades on the resource-limited iPhone a couple of years later.
But Safari was only on the Mac (aside from their "Cocoa on Windows" version and a few open source ports), which Google took as a reason to create...
Google Chrome, whose biggest innovation was rendering each page in its own process (since Safari could freeze up due to one page going into an infinite loop in JavaScript).
So will someone out-Firefox Firefox and do what Chrome was to Safari for multi-processing? Take the rendering engine and put a new chrome on top of it?
KHTML was originally forked from QHtml (part of Qt) I believe (or they shared the same origin?).
WebKit later became a Qt component; the circle of life.
I would argue Chrome's biggest innovation was the UI, putting the address bar as part of the tab where it belongs (cue holy war), and better tabbing in general.
Mozilla was also the first to integrate the search and address fields, which they unfortunately stripped out of Firefox (and many people think Chrome introduced it).
> Mozilla was also the first to integrate the search and address fields, which they unfortunately stripped out of Firefox (and many people think Chrome introduced it).
Do you know when this was introduced/removed? I'm curious as to what this implementation was as I don't recall when this happened.
> So will someone out-Firefox Firefox and do what Chrome was to Safari for multi-processing? Take the rendering engine and put a new chrome on top of it?
I highly doubt it, as there's very little incentive to create a Gecko-based browser versus a WebKit- or Chromium-based one.
As a fully rewritten browser engine, how will it handle malformed HTML? Will it, like other browsers, try to 'understand' and fix some errors, or will it stick to the specification?
The HTML5 specification now fully specifies what to do in the face of malformed input. That's one of the biggest differences between HTML5 and previous specifications.
Well, it is for now. This isn't yet a concrete plan that Mozilla has, but from what I've read, the Servo developers would definitely also like to write a JavaScript engine in Rust, simply because of the gain in security.
Also, browser.html already exists in an early form (and is bundled with Servo). It provides a new UI written in HTML, CSS and JavaScript, so it can be rendered by Servo as well.
> Servo developers would definitely also like to write a JavaScript engine in Rust, simply because of the gain in security.
(Servo developer here)
This isn't really the case. There is interest in doing this, but it's not something we definitely want to do. The problem with writing a new JS engine is that you would need to duplicate years of performance tuning, so while it might be safe, it would take an immense amount of work to make it efficient. Rust's safety benefits are also reduced once a JIT and all that is involved.
It would be nice to have, sure, but the amount of work in making a usable one is huge.