Hacker News

Apparently this can actually get decent frame rates, although I suspect they are using the patched version of the JS interpreter mentioned on GitHub.

http://yfrog.com/nmng0z

Still not sure how this is meant to actually be useful though. The problem with H.264 isn't availability of implementations, it's being non-free and heavily patented.

I can kind of see a use for this if you are a big content provider with a bunch of cash and want to distribute video to Firefox users without transcoding but it still seems pretty derpy.




> Although I suspect they are using the patched version of the JS interpreter mentioned on GitHub.

We are using standard Firefox, no special patches. However, we used the Firefox nightly, not the current stable. The decoder runs much faster in nightly, due to JS engine improvements that landed over the last few months and are not yet in stable.

> Still not sure how this is meant to actually be useful though. The problem with H.264 isn't availability of implementations, it's being non-free and heavily patented.

First, this at least gives you another option. That is, if we get this codec to run as fast as a native one, then either the browser or the video website can provide the decoder (and properly license it, if they are in a country that has pure software patents). More options are never a bad thing.

But I think the real potential in this approach is something entirely different. The opportunity is that you can download arbitrary decoders. So instead of the current world we live in, where you have a few decoders installed, you can have custom ones for different websites. Imagine a website that has cartoon videos or anime etc. - in principle, they could use a custom codec that is heavily optimized for that kind of content, as opposed to being forced to use stock decoders.

Also, it prevents being frozen in time: If you can download decoders from the web, you can improve them constantly while making sure your users have the proper decoder (since you ship it to them yourself), which you can't do if you rely on stock preinstalled decoders.
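The "download arbitrary decoders" idea above could look something like the following sketch. Everything here is hypothetical: `registerDecoder` and `decodeFrame` are illustrative names, not any real Broadway or browser API, and the "codec" is a trivial stand-in for a real site-shipped decoder.

```javascript
// Hypothetical sketch of a site-supplied decoder registry.
// A site ships its decoder as ordinary JS and registers it by name,
// instead of relying on whatever codecs the browser preinstalled.
const decoders = new Map();

function registerDecoder(codecName, decodeFn) {
  decoders.set(codecName, decodeFn);
}

function decodeFrame(codecName, bytes) {
  const decode = decoders.get(codecName);
  if (!decode) throw new Error("no decoder for " + codecName);
  return decode(bytes);
}

// A trivial "identity" codec standing in for one tuned to the
// site's own content (e.g. an anime-optimized decoder):
registerDecoder("example-codec", (bytes) => Uint8Array.from(bytes));

// Usage: the site hands compressed bytes to its own decoder.
const pixels = decodeFrame("example-codec", [1, 2, 3]);
```

Because the site controls both encoder and decoder here, it can update them in lockstep, which is the "not frozen in time" point above.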


> if we get this codec to run as fast as a native one

Native ones have chunks written in hand-tuned assembly language, offload parts to specialized hardware, and other such tricks not available to ECMAScript. I'm not even sure why "as fast" is being considered a possibility.


I agree in general that native decoders can be faster - they can in principle do anything a JS decoder can, and in addition the things you mention. However,

1. JS can also use hardware acceleration through WebGL. We have not done this yet, but will.

2. JS has some proposed extensions, WebCL and Intel's River Trail, which let it utilize SIMD and other computing hardware. We will investigate using those too.

With those two things, we believe JS performance will be very good. How close it will be to native code, though, is hard to say at this point in time.
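One concrete candidate for WebGL offload is colorspace conversion: the decoder produces YUV frames, and converting them to RGB is a per-pixel affine transform that a fragment shader handles in parallel. A sketch of the per-pixel math in plain JS (assuming full-range BT.601 coefficients; the actual pipeline and matrix may differ):

```javascript
// Full-range BT.601 YUV -> RGB for one pixel. In WebGL the same
// transform would run in a fragment shader, one pixel per invocation.
function yuvToRgb(y, u, v) {
  const d = u - 128; // blue-difference chroma, centered on zero
  const e = v - 128; // red-difference chroma, centered on zero
  const clamp = (x) => Math.max(0, Math.min(255, Math.round(x)));
  return [
    clamp(y + 1.402 * e),                   // R
    clamp(y - 0.344136 * d - 0.714136 * e), // G
    clamp(y + 1.772 * d),                   // B
  ];
}

// Mid-gray: zero chroma leaves luma unchanged.
yuvToRgb(128, 128, 128); // [128, 128, 128]
```

Doing this on the GPU frees the JS main thread for the entropy decoding and motion compensation that are harder to parallelize.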

However, there is one big advantage a JS decoder has that native code does not: a JS decoder can be downloaded and run securely. As a consequence, you can continually improve your decoder in JS and your users will always run the latest, most optimized version, while standard native decoders are typically upgraded much, much less frequently. Also worth noting is the potential to ship specialized decoders, as I mentioned in another comment: imagine an anime website that ships a video decoder heavily optimized for that specific type of content. That could be much more efficient than a stock native decoder.

Finally, it's worth noting that the decoder we compiled from C, Android's H.264 decoder, does not have any substantial amount of handwritten assembly. I had assumed, as you said, that real-world decoders would include such things, and am curious why this one doesn't. If anyone reading knows the answer, I'd be very interested.


Well, for instance, we can take advantage of hardware features like shaders through WebGL, or make use of WebCL, River Trail, etc., which are all available to JavaScript. We may not reach the performance of native codecs, but we can get close enough.


Brendan's demo was running at 30fps, I believe on a MacBook Pro (I'll find out). Some content will run slower right now, which is true of all codecs AFAIK, but more so for this one at the moment.

The patches linked from the github README aren't necessary to run the transpiled version that was in the demo -- it's a memory optimization that's being used in the tuned-for-JS version.

As regards derpiness: it lets content distributors decide to pay for H.264 licensing themselves, if they want, and paves the way for other codecs to be deployed by such distributors as well. It also runs the codec in a managed environment -- format decoders are often very fertile territory for exploitable bugs, since they are pretty much by definition all about pointer math and byte-poking. But the initial intent, when they decided a week ago to try it, was to push the envelope of JS performance such that we find new ways to extend said envelope.
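A small illustration of the "managed environment" point: in JS, the typed arrays a decoder uses are bounds-checked by the language itself, so an off-by-one bug can't read or corrupt adjacent memory the way raw pointer arithmetic in a native C decoder can.

```javascript
// Typed-array accesses are bounds-checked by the JS engine.
const frame = new Uint8Array(4); // a tiny stand-in for a frame buffer
frame[0] = 255;

frame[100];       // undefined: an out-of-bounds read can't leak memory
frame[100] = 7;   // an out-of-bounds write is silently dropped
frame.length;     // still 4: no buffer overflow occurred
```

The same indexing bug in C would silently scribble past the buffer, which is exactly the class of exploit that makes native format decoders such fertile territory.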


It was on a MacBook Pro. We tried it on a MacBook Air too and it performed reasonably well.


My 2007 MBP manages a decent 20 fps but starts pretty slow and averages 15.

That's a 2.4GHz C2D. Need to try it on my i5-2400.


You should give the latest version a try, we improved the performance considerably.


Interestingly, my i5-2400 only manages 25 fps (average over 1 minute). Hmmm. Ahhh, it's only single-threaded.

Is that intentional, or a problem with Nightly?


It's proof of the power of JavaScript. In another 10 years it'll be even faster, thanks to better compilers and faster processors.

Perhaps by then it'll be completely routine for video to be encoded and decoded using JavaScript on a mobile device.


Useful? Not everything needs utility. It can just be cool. This is definitely the latter, although more as a JavaScript exercise than a video one.



