
RLBox is used for more than one library: "Now, we’re bringing that technology to all supported Firefox platforms (desktop and mobile), and isolating five different modules: Graphite, Hunspell, Ogg, Expat and Woff2"

https://blog.mozilla.org/attack-and-defense/2021/12/06/webas...


Their concern is not with theoretical vulnerabilities, but with actual ones that are being exploited. If no attacker ever tries to find a vulnerability in some code, then it might as well not have one.


Firefox uses unified builds, where a bunch of .cpp files are globbed together and compiled at once. That helps a lot, but a build still takes a while unless you are on an absurdly fast machine. Chrome used to support this too, under the name "jumbo builds", but they didn't want to deal with the maintenance overhead. Presumably all of the Chrome developers employed by Google use some kind of massive distributed build infrastructure, so slower builds have little impact on individual developer productivity and the use case of building on a single computer is not as prioritized.
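
For anyone who hasn't seen one, a unified translation unit is just a generated file that #includes a batch of ordinary source files, so the headers and templates those files pull in get parsed (and their inline code compiled) once per batch rather than once per file. A rough sketch, with file names invented for illustration:

    // Unified_cpp_dom0.cpp -- generated by the build system; this is what
    // actually gets handed to the compiler.
    #include "Document.cpp"
    #include "Element.cpp"
    #include "EventTarget.cpp"
    // ...more .cpp files, up to some per-batch limit...

The maintenance overhead comes from everything now sharing one translation unit: two files that each define a file-local static helper with the same name suddenly collide, and a stray "using namespace" directive in one file leaks into all the others, so the batches have to be kept buildable as the code changes.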


That's the CTO, not the CEO.


According to news stories, Apple received $20 billion in 2022 from Google to make Google the default search engine in Safari.

https://www.theverge.com/2024/5/2/24147007/google-paid-apple...


Which is easy money that Apple uses for the company as a whole. They don’t make Safari because of Google’s money nor is it likely they would stop developing it if that money was no longer paid.


Firefox also uses reference counting plus a trial deletion based cycle collector to manage C++ DOM objects (and cycles through JS). In fact, Graydon was responsible for the initial implementation.
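
For anyone curious what "trial deletion" means, here is a minimal, self-contained sketch of the idea. This is not Firefox's cycle collector, just the core trick it is built around: take a set of suspected objects, check whether their reference counts are fully explained by references among themselves, and anything not reachable from an externally referenced object is an unreachable cycle.

    #include <cstdio>
    #include <map>
    #include <set>
    #include <vector>

    // Toy refcounted object. In a real engine AddRef/Release maintain the
    // count; here it is set up by hand for the demo.
    struct Obj {
      const char* name;
      int refcount = 0;            // total strong references held on this object
      std::vector<Obj*> children;  // strong references this object holds
    };

    // One pass of trial deletion over a set of suspected objects.
    std::set<Obj*> FindGarbageCycles(const std::set<Obj*>& suspects) {
      // 1. "Trial delete" the internal edges: count how many of each object's
      //    references come from other suspects.
      std::map<Obj*, int> internal;
      for (Obj* o : suspects)
        for (Obj* child : o->children)
          if (suspects.count(child)) internal[child]++;

      // 2. Any suspect whose refcount is not fully explained by internal edges
      //    is still referenced from outside the graph, so it is a live root.
      std::vector<Obj*> worklist;
      for (Obj* o : suspects)
        if (o->refcount > internal[o]) worklist.push_back(o);

      // 3. Everything reachable from a live root stays; the remaining suspects
      //    form unreachable cycles and can be freed.
      std::set<Obj*> live(worklist.begin(), worklist.end());
      while (!worklist.empty()) {
        Obj* o = worklist.back();
        worklist.pop_back();
        for (Obj* child : o->children)
          if (suspects.count(child) && live.insert(child).second)
            worklist.push_back(child);
      }

      std::set<Obj*> garbage;
      for (Obj* o : suspects)
        if (!live.count(o)) garbage.insert(o);
      return garbage;
    }

    int main() {
      // a <-> b is a cycle with no outside references; c <-> d is a cycle, but
      // d also holds one extra reference from outside (say, a stack variable).
      Obj a{"a"}, b{"b"}, c{"c"}, d{"d"};
      a.children = {&b}; b.children = {&a};
      c.children = {&d}; d.children = {&c};
      a.refcount = 1; b.refcount = 1;
      c.refcount = 1; d.refcount = 2;

      for (Obj* o : FindGarbageCycles({&a, &b, &c, &d}))
        std::printf("garbage cycle member: %s\n", o->name);  // a and b
      return 0;
    }

The real collector is vastly more involved (it traverses edges into the JS heap, runs in slices, and so on), but the basic trick is the same.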


I got 35.7 ± 2.3 on a MacBook Pro M3, Chrome 122.


33.3 with an Intel 14900k and Ubuntu. And people will sit there and try to tell you that computers aren't getting faster any more.


If you are running untrusted code in Node, subtle JIT bugs are probably the least of your problems.


What does trusted really mean? If you use node (or other JS packaging systems), you are running code that someone else wrote and that you almost certainly didn’t review, as it’s huge, constantly changing, etc. How about companies that use v8 to run JavaScript extensions to their app that their customers wrote? That describes many apps. Are you saying they are all vulnerable?

The answer is that they are all vulnerable, precisely because of problems like this. Any user code (JS in this case) is untrustworthy, and everything has JS extensions. What’s the safe way to run user JS? Running v8 in its own, separately limited process is, I think, what people do.


Whether or not you review your deps' code is on you; it doesn't make it untrusted. You're trusting them whether or not you do the due diligence to see if that trust is warranted. Untrusted means code that comes from outside your system, like third-party extensions to your app, which is presumed to be completely broken and actively malicious and which nonetheless shouldn't be able to crash your own program or reveal sensitive data.


There is a massive difference between the supply chain risks of open source packages and actively fetching and executing remote code provided as user input like the browser inherently does.

The case of user provided extensions definitely falls a lot closer to the supply chain threat model.


>What does trusted really mean?

I agree; the question of "trusting" third-party/remote code or not frankly went out the window, bathwater and baby included, when we moved on to Web "JavaShit" 2.0, or was it 3.0.

Feels like we're in a worse security hellhole than we ever were with Flash or ActiveX back in the day, frankly.


For context, assuming by JavaShit Web 3.0 you meant JavaScript:

Flash and ActiveX were proprietary technologies that often required users to install plugins or additional software on their devices. These plugins operated with high levels of access to the system, which made them a significant security risk. They were notorious for being frequently exploited by attackers as a way to run malicious code on users' machines.

In contrast, JavaScript is a core part of the web and is executed within the browser in a sandboxed environment. This means that JavaScript operates with limited access to the system's resources, reducing the risk of system-level security breaches.


> In contrast, JavaScript is a core part of the web and is executed within the browser in a sandboxed environment. This means that JavaScript operates with limited access to the system's resources, reducing the risk of system-level security breaches.

Flash (and probably ActiveX) were also executed in a "sandboxed environment", including "limited access to the system's resources". All 3 have (or well, had, in the case of Flash and ActiveX) regular vulnerabilities - including JavaScript. JavaScript is not any better than Flash or ActiveX and I really don't understand why people pretend it is.

BTW, Flash was definitely a core part of the web in its heyday, too.

ETA: Oh, and Java was also executed in a sandbox (and a virtual machine!) and had plenty of vulnerabilities back when applets were a thing.

At least with Flash, ActiveX, and Java you could choose not to install them and most sites would continue working. For JavaScript you have to install (and trust) some third party extension to block it and then no sites work...


Flash was never a core part of the web. That was the problem: it was loosely bolted onto browsers but the company behind it didn’t understand or care about the web, spent their time inventing random new things for demos trying to get you to build on top of their platform INSTEAD of the web, and was never willing to spend time on support.

> JavaScript is not any better than Flash or ActiveX and I really don't understand why people pretend it is.

Because it is. Both of those were hard to use without crashing the browser - the primary selling point for Chrome originally was that it used process sandboxing and so when Flash crashed you wouldn’t lose every open window - whereas what we’re seeing now are complex attacks requiring considerable investment finding ways to get around the layers of precautions. It’s like saying that there’s no difference between leaving your money under the mattress and putting it in the bank because banks still get robbed.


[flagged]


> That was most definitely not the primary selling point of Chrome.

It was very popular, especially for people who did support or were the “tech guy” for their friends & family. Chrome had multiple nice features, but the one which most frequently got people to switch permanently was not losing all of your work when Flash crashed. Not having to tell people that the big comment they’d been working on for an hour was permanently gone because some ad in a different window crashed Flash led to a lot of installs.

This was especially bad for anyone developing in Flash because Adobe was motivated to sell licenses and only cared about reliability to the extent that it impacted sales. Their vision was that instead of using web technologies you’d target Flash and run things maybe on the web, desktops, mobile, or set top boxes but always Flash. The problem with that was that they mostly focused on shiny new things which demoed well but didn’t spend money on QA or support. I shipped a couple of Flash apps in the 2000s where we hit basic bugs in their library code, and the support process was basically that you filed a bug report with a ton of details, didn’t hear anything back until the next major release, and then the issue would be automatically closed with a generic suggestion that you try buying the new version and reporting if it wasn’t fixed. $800 later, you could repeat the process as they never once fixed a bug even with a simple reproducible test case.


An important point that you're both missing:

Flash did things "web technologies" didn't. There was no module system, no React, no custom elements. JavaScript was a bit like bash: usable for simple things, but only masochists would use it for anything complex.

The early Google web properties - Maps and Gmail - were the vanguard of the web as an interactive platform. As I recall, they were actually written in Java and translated to JavaScript (by GWT) in part because of how limited JS was as a platform at the time.

To this day, there are things that were easy in Flash that are borderline impossible on the web. The dual paradigm of drawing when drawing is easiest and coding when it isn't was really powerful, and the modern web still misses that.

I was thinking the other day of making a button that transitions into a throbber when you click it. Would have been a fun afternoon project in Flash. Would be a couple weeks of math homework in JS.


> As I recall, they were actually written in Java and translated to JavaScript (by GWT) in part because of how limited JS was as a platform at the time.

GWT - at least the open source version - came a few years after Gmail launched. I’m pretty sure Gmail and other Google products of that era were written using the Google Closure compiler, not GWT. (Not to be confused with Clojure the language.)

The closure compiler was written in Java, but it was a JS-to-JS compiler similar to a js bundler, minifier and the typescript compiler all rolled into one. Its input was (is) a semi-typed JavaScript dialect where types were put in special comments. It has a lot of code analysis & optimisation tricks that I think still aren’t present anywhere else in the JavaScript ecosystem. And it came with a batteries-included standard library that was really needed at the time, to both pave over browser differences and provide a lot of missing functionality. (Js was a scrapyard back then).

I’m surprised the closure compiler never saw much use outside of Google. It was a lovely piece of software for its time. It’s opensource - but nobody seems to know about it.


> Flash did things "web technologies" didn't.

This is true. Especially before HTML5's canvas, video, and audio elements were standardized. Things have gotten so much better now with the advent of the Web Animation API also.

As powerful as Flash was though, sometimes that power was misused. I remember websites with slow intro animations that you had to wait to load before using the website. Flash also powered some of the most annoying advertisements I've ever had the displeasure of viewing.


Oh, sure, I wasn’t saying that Flash was entirely without value - we all used it for a reason - but simply that it wasn’t a web native technology. That proved to be its downfall: the entire ecosystem depended on the whims of a company which just isn’t interested in supporting platforms, and that proved to be fatal in the end.


> To this day, there are things that were easy in Flash that are borderline impossible on the web...

My only experience with Flash is playing Flash games, heh. What would you say made Flash so powerful to achieve such things?


That's a longer answer than I have scope to type right now…

The short version is the seamless interplay between drawn and coded objects. The viewport was called a Stage (Illustrator's equivalent is the Artboard). The Flash tool had all the vector drawing goodies you'd expect. Everything you drew on the Stage was a DisplayObject. Everything you coded was also a DisplayObject. Things that were easier to draw than code, you would create in the Flash app. Things that were parametric (e.g. you want 200 of them, or you want to explain in code how a thing moves), you'd code. You could easily import drawn assets into code, or drag visual containers for coded assets onto the Stage.

Modern JavaScript does an okay job at the coded aspect of it, but first class support for hand-drawn assets is totally missing. SVG (Adobe's attempt to compete with Flash before they bought it) can handle still images, but nothing is seamlessly multi-modal like Flash was.

The other thing Flash had going for it that nothing else does is animation. Even today, motion designers are stuck using After Effects (a tool designed for movie titles and broadcast news graphics). We still don't have a good tool for making interfaces that move, which means our ability to make properly interactive interfaces was severely knee-capped when Flash was removed.

So Flash let you draw and animate at whatever level of fidelity you wanted - draw an arm, animate a progress indicator - and use those assets seamlessly in code. Or code the parts that are painful to draw and lay everything out visually. It really let you use the best tool for the job, and it provided some best-in-class tools for those parts of the job that still aren't matched today.

PS: One of the coolest parts of Flash that nobody talks about was its ability to treat vectors like clay. Illustrator and the tools it inspired make you compose a vector out of points and bezier handles. Vector creation in those tools requires deft use of barely-abstracted math - it's a craft that's hard to master.

In Flash, on the other hand, you could just push and pull on the edge of a shape until it looked how you wanted. It abstracted the bezier into a more humane form, and that's just one of countless microtools that we lost when we lost Flash.


> The other thing Flash had going for it that nothing else does is animation. Even today, motion designers are stuck using After Effects (a tool designed for movie titles and broadcast news graphics). We still don't have a good tool for making interfaces that move, which means our ability to make properly interactive interfaces was severely knee-capped when Flash was removed.

We definitely need an easy to use GUI for building complex animations. Exporting animations from After Effects via Lottie is not an ideal solution. Maybe Rive or some other startup will fully supplant this workflow at some point.


very popular != primary selling point


Take it down a notch. Please try to adhere to the Hacker News guidelines when posting [1].

[1] https://news.ycombinator.com/newsguidelines.html


LOL.


Way too condescending and hyperbolic in turn, the general excuse being that you believe the other post was hyperbolic. Even if every assertion were 100% correct... D-. Revise and edit, and please come see me after class.


No, actually I just don't have time for idiots who make unfalsifiable statements when they've been shown they're wrong.

(BTW, uncrashable Chrome just "crashed" for me: all windows, across two profiles, stopped accepting input properly. But hey, Flash bad!)


> Flash (and probably ActiveX) were also executed in a "sandboxed environment", including "limited access to the system's resources".

IIRC, the main issue with ActiveX was that it did not execute in a sandboxed environment, unlike Flash and Java. With ActiveX, all you had was a cryptographic signature saying it came from a trusted publisher; past that, the full Win32 API was available, with complete access to the operating system.


That wouldn't particularly surprise me. I never used ActiveX, so I can't really speak to that one. But then, there also weren't many (public) websites that I ever ran into that wanted to use it.


> But then, there also weren't many (public) websites that I ever ran into that wanted to use it.

As I understand it, there were weird pockets where organisations went hard into ActiveX. IIRC it was used heavily by the South Korean government and in a lot of internal corporate intranet projects, for all sorts of things.

That obviously caused massive problems a few years later when Microsoft tried to discontinue ActiveX and make IE/Edge a normal web browser.


As someone who still has to support users of several ActiveX apps, turning off the "block unsigned ActiveX" setting goes with the territory of using it.


Either “sandbox” was the funniest and most appropriate name ever chosen in tech, or the person who came up with it has never actually seen a sandbox.


Everyone who runs Node runs untrusted code (depending on your definition of untrusted). No one I’ve ever worked with made an effort to review the source of the thousands of dependencies they were slinging around.


I’m pretty sure untrusted code means code you can’t trust, which includes any code that you haven’t either analyzed yourself or put through some sort of institutional review with auditable sign-offs.

This is how these conversations always go:

There’s a hole in the sandbox.

If you were trusting the sandbox, you were already doomed.

Nobody validates their code well enough to trust it. (we are here)

The ecosystem and industry is just irreparably damaged.

What am I supposed to do about that?

Non-solutions, because it is an impossible problem to actually fix


Web browsers rely on the sandbox. Almost everyone runs untrusted code every single day. There are very few people who do not trust the sandbox.

It does not directly affect servers if one rejects your rather broad definition of untrusted, but it does indirectly.

> I’m pretty sure untrusted code means code you can’t trust, which includes any code that you haven’t either analyzed yourself or put through some sort of institutional review with auditable sign-offs.

That is so broad that very few people are running trusted code. You would need to ensure your entire stack, down to the firmware it runs on, had been analysed.


Sounds like the ecosystem and industry is just irreparably damaged.


I would expect "untrusted code" to mean "code in a sandbox" or "code I'm not gonna run at all anytime soon", so running code from thousands of dependencies in node is effectively trusting all of that code, unless it is your direct expectation that it is malicious (and even then, aren't you trusting it to be malicious?).

The trust we give to random dependencies like that is quite arguably unwarranted to a large degree, but it doesn't mean the code isn't trusted.


Serverless and CDN-style edge compute are two scenarios this may be relevant to, where untrusted or semi-trusted code may run in some construction on top of V8. Providers of those services especially are probably tuned in right now, or ought to be.
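
For concreteness, the usual construction is an isolate (and, for anything security-sensitive, a separate OS process) per tenant. Here is a minimal sketch of the embedder side, adapted from V8's public hello-world embedding sample; exact API names shift between V8 versions, so treat it as illustrative rather than copy-paste ready:

    #include <cstdio>
    #include <memory>

    #include <libplatform/libplatform.h>
    #include <v8.h>

    int main(int argc, char* argv[]) {
      // One-time process setup.
      v8::V8::InitializeICUDefaultLocation(argv[0]);
      v8::V8::InitializeExternalStartupData(argv[0]);
      std::unique_ptr<v8::Platform> platform = v8::platform::NewDefaultPlatform();
      v8::V8::InitializePlatform(platform.get());
      v8::V8::Initialize();

      // Each tenant gets its own isolate: a private heap and GC.
      v8::Isolate::CreateParams create_params;
      create_params.array_buffer_allocator =
          v8::ArrayBuffer::Allocator::NewDefaultAllocator();
      v8::Isolate* isolate = v8::Isolate::New(create_params);
      {
        v8::Isolate::Scope isolate_scope(isolate);
        v8::HandleScope handle_scope(isolate);

        // A fresh context per request: the guest script only sees the globals
        // the embedder chooses to install here.
        v8::Local<v8::Context> context = v8::Context::New(isolate);
        v8::Context::Scope context_scope(context);

        v8::Local<v8::String> source =
            v8::String::NewFromUtf8Literal(isolate, "6 * 7");
        v8::Local<v8::Script> script =
            v8::Script::Compile(context, source).ToLocalChecked();
        v8::Local<v8::Value> result = script->Run(context).ToLocalChecked();
        v8::String::Utf8Value utf8(isolate, result);
        std::printf("guest script returned: %s\n", *utf8);
      }
      isolate->Dispose();
      v8::V8::Dispose();
      // (Platform teardown call omitted; its name differs across V8 versions.)
      delete create_params.array_buffer_allocator;
      return 0;
    }

Bugs like this one are why the isolate boundary is not treated as a security boundary on its own: a memory-corruption bug in V8 can compromise the whole embedding process, so serious multi-tenant hosts also wrap each isolate in an OS-level sandbox or separate process.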


There's not a lot of context in this submission, but presumably it is being linked because the release notes for this CVE say "Google is aware of reports that an exploit for CVE-2024-0519 exists in the wild."

https://chromereleases.googleblog.com/2024/01/stable-channel...



That's a different bug (CVE-2024-0517 vs CVE-2024-0519)


I’m pretty sure the chain is all three bugs. 517, 518, 519.


I'm pretty sure it isn't; the write-up only uses 517 to get an arbitrary write primitive and then does a pretty standard chain into a sandbox escape via wasm (disclaimer: I work on V8).


Hmm. I also thought the type confusion in 518 was the same one from the blog post, but looking at the patch, it's not either. I think I stand corrected overall.


I wonder if they got a bounty from it?


Exodus sells vulnerabilities to the government.


To quote their "ethics" statement:

"We pride ourselves in our skillsets to parallel those of nation state hacking groups, and we tout that our expertise is unrivaled in our ability to discover and exploit vulnerabilities in a variety of product.

Our intention as a Company is to provide this intelligence to US and Allied countries for their enterprise and governments to have a leg up over the malicious actors from around the world."

Ouch.



Are you sure? That might be one of them; I think one of them is this one (see my other comment):

https://chromium-review.googlesource.com/c/v8/v8/+/5173470


That commit says it is for 1515930. The Chrome releases page says that CVE-2024-0519 is associated with 1517354, which is what I linked to. There may be a connection between CVE-2024-0519 and CVE-2024-0517, but none is mentioned on the Chrome releases page, which is what I'm going by.



