This isn't driven by a "fad" but by the simple fact that any trace of JS engine performance will show a sizable chunk at the beginning dedicated to just parsing scripts. Heck, we even have "lazy parsing", where we put off as much parsing work as possible until a piece of code is actually needed. Replacing that with a quick binary deserialization pass is a straightforward win.
I wouldn't call this "compiled JavaScript". In broad strokes, what's happening is: you take the result of parsing JavaScript source into an AST and serialize it; the next time, you deserialize the AST and skip the parsing stage.
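To make the round trip concrete, here's a minimal sketch. The parser (acorn), the serialization format (JSON), and the file names are my stand-ins; the actual proposal uses a much more compact binary encoding, but the flow is the same.

    // Sketch only: acorn and JSON stand in for the real binary AST format.
    const acorn = require("acorn");
    const fs = require("fs");

    // First load: pay the full parse cost, then cache the resulting AST.
    const source = fs.readFileSync("app.js", "utf8");
    const ast = acorn.parse(source, { ecmaVersion: 2020 });
    fs.writeFileSync("app.ast.json", JSON.stringify(ast));

    // Later loads: deserialize the cached AST and skip parsing entirely.
    const cachedAst = JSON.parse(fs.readFileSync("app.ast.json", "utf8"));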
(Source: I spent a summer working on JS engine and JIT performance at Mozilla.)
I don't think the point was that this project is a fad.
Instead, I think the point was that Javascript is a fad (we'll see whether this is true by watching how popular compile-to-WASM languages become compared to JS, once WASM becomes widespread and stable).
Alternatively, we might say that JS was created (in 10 days, yadda yadda) at a time when the fads were dynamic typing over static typing; interpreters over compilers; GC over manual or static memory management; OOP over, say, module systems; imperative over logic/relational/rewriting/etc.; and so on. JS's role as the Web language has tied developers' hands w.r.t. these tradeoffs for 20 years, long enough that a significant chunk of the developer population hasn't experienced anything else and may not really be aware of some of these tradeoffs; some devs may have more varied experience, but have developed Stockholm Syndrome to combat their nostalgia ;)
As an example, one of the benefits of an interpreter is that it can run human-readable code; this is a sensible choice for JS since it fits nicely with the "View Source" nature of the Web, but it comes at a cost in code size and startup latency. The invention of JS obfuscators and minifiers shows us that many devs would prefer a different balance between readability and code size than the one Eich picked in the 90s. This project brings the same option w.r.t. startup latency. WASM opens up a whole lot more of these tradeoffs to developers.
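For a feel of that readability-vs-size tradeoff, here's a hand-minified example (illustrative only, not the output of any particular tool):

    // Readable source, about 80 bytes:
    function clamp(value, min, max) {
      return Math.min(Math.max(value, min), max);
    }

    // Minified equivalent, about 50 bytes: same behavior, opaque names.
    function c(v,n,x){return Math.min(Math.max(v,n),x)}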
Actually, I think it was created at a time when it was used very sparingly and had a very limited scope. The joke used to be that the average JavaScript program is one line. I don't know if that was ever exactly true, but a lot of early JavaScript lived in inline event attributes (think "onclick" and the like).
For that use, a forgiving scripting language is a good fit. What changed is what JavaScript, and the web, are used for. Embedded Java and Flash showed that there was an appetite for the web to do more, and the security problems involved with those technologies showed they weren't a good fit.
JavaScript was adapted to fill that void by default, as it was the only language that every browser could agree on using.