To echo what Paul said, this will not be a browser you will want to use as a replacement for all your day-to-day browsing. If you try to, you won't end up happy.
Web compatibility is a long road, and it's crucial for us to be able to know what missing functionality is most important and where we most need to focus on performance. The purpose of this release is to help us find and prioritize bugs and missing features. We want to know which sites are the most broken (and, even more importantly, which missing features are breaking those sites). From the Web developer side, we also want early feedback on use cases that may be slow today, so that the browser engine can eventually become a great experience for everyone.
Thank you for the warning, but I suspect I for one might end up happy anyway. :)
I already see lots of broken pages because of NoScript, I often use bookmarklets to strip the page of excessive (mis)use of design... so as long as you can show me the text I will be fine, thank you. ;) Not the typical user though, I know.
On the other hand, I am thrilled about the security and privacy possibilities this brings to the table! So I can't wait to try it out. Keep up the good work!
Ironically, it's NoScript that blocks the javascript: URI scheme. I just never bothered to find out how to unblock it selectively (to stop sites linking to those) or at all.
I already use a crippled browser for my day-to-day browsing: Tor Browser. I'd be happy with a browser that's as non-leaky as that. Tor Browser works fairly well, but I'd prefer a lighter browser if I had the option.
I would recommend against using the Servo alpha with Tor, though; some web security features (e.g. stuff like CORS, though our CORS support is okay) aren't there yet, IIRC.
(Of course, this will change over time, just that I don't expect it to be 100% secure by June)
> From the Web developer side, we also want early feedback on use cases that may be slow today
Have you ever tried Google+? In my experience it is one of the most demanding pages out there, and it is almost unusable in Firefox for me. The only reasonable way for me to use it is to switch to Chrome. (Linux, cheap Intel Celeron at 2.6 GHz)
Even worse, Google Keep with ~3 years of notes. It's actually unusable in a web browser for me at this point: Chrome, Safari, Firefox, doesn't matter. Trying to figure out the best way to export all my notes. Sucks, because it's fantastic on mobile, and the built-in reminder features are really useful. Anyone have a good mobile + webpage alternative?
I would give you a count if I could. My guess is on the order of thousands, but not tens of thousands. Seems like the browsers are trying to download every single note I've ever created at once (from looking at the network requests and seeing the huge amounts of data being transferred).
Also, just tried loading the site on a much faster internet connection and everything worked, although text entry was slightly laggy on a new note. Still couldn't figure out how to get a count of my notes though.
Have you tried it without any extensions enabled? Google+ has always performed roughly the same in Firefox as Chrome in my usage — fairly good but not great due to the massive amount of work they're doing in JS. That's improved over the years but it's taken a long time to approach native scrolling performance in any browser.
I have an idea for a great Servo use case. At the moment, PhantomJS and Selenium are used for crawling websites and testing web apps. While those tools are okay, they could definitely be replaced by something better.
PhantomJS is based on WebKit, which is a commonly used HTML rendering engine; using Servo in PhantomJS will only make sense once Servo has double-digit market share. Selenium isn't a browser engine, so I'm not sure how Servo could replace it.
Are there daily binaries available somewhere? It stopped building for me some time ago and I was too lazy to figure out what's wrong (plus, build times for release builds are quite long).
Are you on OS X? Apple stopped shipping an OpenSSL header that's needed by a Servo dependency (https://github.com/servo/servo/issues/7303). If you're using Homebrew, you should be able to build by manually pointing to a brew-installed copy of OpenSSL:
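The actual commands are missing from the comment above; a minimal sketch of the usual Homebrew workaround might look like this. The exact environment variable names are an assumption here and depend on the version of the openssl crate the build pulls in:

```shell
# Assumed workaround: install OpenSSL via Homebrew, then point the build at
# its headers and libraries (Homebrew keeps OpenSSL keg-only on OS X).
brew install openssl
export OPENSSL_INCLUDE_DIR="$(brew --prefix openssl)/include"
export OPENSSL_LIB_DIR="$(brew --prefix openssl)/lib"
./mach build --dev
```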
>To echo what Paul said, this will not be a browser you will want to use as a replacement for all your day-to-day browsing. If you try to, you won't end up happy.
Well, we're obviously all gonna use it that way anyway :-)
Congratulations to both the Servo and Rust team for making it this far. You set out to slay not one, but two of the biggest dragons in all of software engineering, at the same time, and while you may not be done yet, you, uh, err.... have the lance definitely sliding pretty far in and the dragons are definitely noticing and quite upset?
Sorry. The metaphor kinda broke down there. Point is, congratulations. Rust+Servo is one of the most absurdly ambitious projects I've seen in the last twenty years, to make a new browser engine and a new systems-level language. The level of success achieved even to this point is astonishing. I know the road is still long, but I wish you the best in finishing this journey!
To be clear, this will be a very early release (nightly builds) of Servo with an HTML UI (browser.html). You won't be able to replace your current browser with Servo just yet :) … there's still a long way to go. The goal is to make it easier for people to test Servo and file bugs.
Do you have any advice for how quick we should be to report bugs? I just noticed my old homepage[1] crashes webrender (not normal rendering) after scrolling a bit and was wondering whether that kind of thing is worth reporting or if the code is in such active work that it's not worth wasting someone's time managing the issue.
If it's a known bug we'll just close it. If you want, see if you can find dupes on the issue tracker before filing, but don't spend too much time doing that.
A year or so ago, I read that the Servo project is fairly easy to contribute to even if you have no prior Rust knowledge beyond the core basics, as long as you're willing to learn it as you go. The reasoning was that tons of core functionality is missing, so there is plenty of low-hanging fruit.
I was wondering, is it still true (or ever was true)?
This is still true. We're pretty good at creating easy issues, though there's also a steady flow of newcomers, so they often get snapped up quickly. A lot of the low-hanging fruit is gone, but with careful planning it's easy to create more.
It's definitely true. I've only written a couple toy projects in Rust (no more than a few hundred lines total), and I jumped into the Servo codebase and started contributing. It can be slow going as you wrestle with the language and learn the codebase, but the project maintainers are very helpful.
Is there any chance we'll get a browser with support for discretionary access controls in the render processes? Given that pretty much every OS supports locking down the rights a process has, it would be a big win, security-wise, if the OS could catch anything the browser doesn't.
Could someone in the know clarify for me what the difference is in aims of this new Browser ("Servo") and Firefox? The Servo landing page said its aims are
> Servo project aims to achieve better parallelism, security, modularity, and performance.
Parallelism: All current browsers are written to do layout in a single thread and the specs were written assuming this. When Servo started, whether you could write a parallel layout algorithm for the web was an open question. Servo has a multithreaded layout system that's faster than Firefox per-thread.
Security: The Rust programming language is memory safe, which rules out a large class of security vulnerabilities. The number I saw was that ~50% of the critical security vulnerabilities reported in Firefox in 2014 wouldn't have been possible in Rust. Servo is also architected from the ground up to be multi-process and sandboxed. I think there are other layers of defense, but those are the two I know about.
Modularity: The pieces of Servo are in separate repositories and the repos (crates) are actually used by the Rust community. Similarly, Servo pulls in crates from the broader community. This, combined with the desire to port bits and pieces to Gecko, makes the browser a lot more modular than you might expect. It's also designed for embedding and (I think) matches the CEF API, so it should be usable wherever WebKit/Blink are, but I haven't been keeping up with things there.
Performance: Parallel layout with a GPU-based scene graph [1] should result in a browser that's several times faster than anything currently available. The Servo team has been hesitant to really push benchmarks given the incomplete support but everything published is really promising.
Servo is a research project by Mozilla, developed in tandem with their Rust programming language. It gives them a "real world" app to build in Rust, so they don't end up with ivory-tower designs that don't work well in real apps.
Despite the way some news articles might write about it, Mozilla does not intend this to be a new browser for regular people; it's an extension of their new language, and R&D.
Over time, Firefox may incorporate components written in Rust, potentially including parts from Servo, but AFAIK there is no timeline for that.
> Mozilla does not intend this to be a new browser for regular people - It's an extension of their new language, and R&D.
I disagree. While I'm not sure what the internal planning is, we are doing a bunch of things to move towards having a shippable Servo browser. If Servo were more of an experiment with no possibility of shipping, there would be a whole host of features that we wouldn't even bother implementing.
Servo's existence is also not to serve as a use case for Rust. If anything, it's the other way around: Mozilla got interested in Rust development because of Servo. However, over time Rust has gained a life of its own, which is pretty awesome. (Yes, Servo does serve as a pretty good canary for Rust, but that's not its raison d'être.)
As far as Servo's future is concerned, there are a bunch of non-mutually-exclusive steps that can be taken going forward:
- Start moving components into Gecko (already being done) and write new ones (already being done): https://wiki.mozilla.org/Oxidation . Gecko can use Servo's URL parser, and wrote its own MP4 metadata parser. There's active work going on replacing Gecko's style/CSS system, and discussions about webrender. I've also seen some interest in dropping Rust into SpiderMonkey.
- Expose a webview-like library (not being done yet, but there's interest)
- Release servo as its own browser, perhaps using browser.html (this is being coordinated at browser.html)
- Replace Gecko on Firefox for Android (this is not too hard, since the browser UI is simpler. Not being done yet)
- Replace Gecko in Firefox for Desktop (this is pretty hard -- there's no clear delineation between "Gecko" and "Firefox" and the UI uses things like XUL which I'd rather keep out of Servo). This is not being done yet.
Gecko is something of a pig. I find it hard to believe that Servo will be developed and then thrown away as a purely research exercise. I suspect Mozilla is hesitant to lay out a roadmap considering how young Rust is and how long rendering engines take to write, but it seems to me that it's likely Servo will replace Gecko once it's mature.
> "It's an extension of their new language, and R&D."
I'd suggest there's another set of aims that comes out of the work to make Servo easier to embed than Gecko, but the time isn't right to make the most of that yet.
Any plans to make the browser cache aware of cross-___domain resources? Basically a hash-based cache: as long as the hash is valid, a resource can be served from a shared hash-keyed pool. This could be integrated with SRI to reduce unnecessary network load without compromising user privacy.
It will not provide complete privacy, but it might reduce exposure to third-party CDNs.
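As a concrete sketch of what the cache key could be: an SRI integrity value is just a base64-encoded cryptographic digest of the resource bytes, so a hash-keyed pool could index entries by exactly that string. The file name and snippet here are made up for illustration:

```shell
# Compute the SRI-style integrity value for a local file (hypothetical lib.js).
# A hash-keyed cache could use this string as the lookup key, regardless of
# which ___domain the resource was fetched from.
printf 'console.log("hello");' > lib.js
echo "sha384-$(openssl dgst -sha384 -binary lib.js | base64 | tr -d '\n')"
```

Any ___domain serving bytes with the same digest would map to the same cache entry, which is what makes the scheme ___domain-agnostic.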
I wonder if the other browser vendors are working on a similar parallel browser engine? Perhaps using some custom version of Clang that enforces Bjarne Stroustrup's C++ Core Guidelines as errors/warnings.
I don't remember the C++ Core Guidelines having anything to say about parallelism, nor proposing any sort of lint or static analysis involving concurrent code. Though I may be misremembering, it's a rather long document.
I am more interested to see how a major Rust project holds up when its attack surface gets larger. So the question is: when does Servo get added to Pwn2Own?
That's a political question, not a technical one. Things get added to Pwn2Own when its sponsors (e.g., Google and Microsoft) push for them to be added, as far as I can tell.
Mercurial isn't a Mozilla-wide thing. Firefox uses Mercurial, but a lot of other Mozilla projects (Gaia, emscripten, Shumway, Servo, browser.html) use GitHub, and others (Bugzilla) use Git directly (over gitolite or something).
We occasionally have discussions on whether or not to switch to Bugzilla. GitHub has some limitations when it comes to organizing things; however, it's more newbie-friendly, so there's a tradeoff.
A meta bug, in Mozilla parlance, is basically the same as an epic: a bug that acts as a collection of other bugs, which can be closed when all its dependencies are closed but has no direct work associated with it.