Now this is big. Doesn't this mean Node.js would benefit from asm.js support being added to V8 as well?
A few months ago, Brendan Eich said: "Get your tums out, pal. We're taking PNaCl down for good this year with http://asmjs.org/. Cross-browser."[1] With the announcement of Unreal Engine and this, it seems like he's one step closer to the goal. Now only JSC and Trident are left...
I don't see this as big. Kbr wouldn't in any way be responsible for adding asm.js support to v8. He's one of the core guys on accelerated graphics, but isn't involved in the JS engine. Also, you'd be well served to consider anything Brendan Eich claims in the context of his own motivations. Because other JS engine developers aren't necessarily enthused about signing on to support a spec that Mozilla developed mostly in secret without their input, relies extensively on engine quirks, and doesn't yet (ever?) address many hard problems like code caching and threading. I'm not saying asm.js isn't interesting, but it's not a guaranteed win either.
> Because other JS engine developers aren't necessarily enthused about signing on to support a spec that Mozilla developed mostly in secret without their input,
I don't think that's accurate. We started work on asm.js only a few months ago; at the beginning, we had no idea if it would work, so we just threw around some ideas between each other. When we had something we felt could actually work, we immediately put it up on
And worked on it there. asm.js has been discussed on IRC (#emscripten and other places), the emscripten mailing list, etc., as we further improved the spec.
Around the middle of last month, we got some performance numbers and felt we could say something about speed, so I mentioned asm.js in my talk at mloc.js and in other more noticeable places, but it was definitely not a secret well before that.
> relies extensively on engine quirks
That would be a bad bug in the asm.js spec. Can you please elaborate?
Yet when Google spent only a few months developing the first Dart, then released it to the public and iterated on it for two years in full public view (and it still isn't even in Chrome nightlies, while asm.js has already been dumped into Firefox nightlies), the same criticism was leveled at Google from Mozilla.
I think Mozilla would get a lot more benefit if people stopped with the juvenile "You're going down!" adversarial mentality. If someone from the Dart or NaCl team had tweeted "2013 is the year NaCl and Dart destroy Javascript!" imagine the uproar. This is about exploring the solution space for solving technical challenges related to maximizing performance, helping developer productivity, and strengthening the Web, and there are many potential solutions. (I for one don't see the "Web" as Javascript; I see it as HTTP, URLs, 'drive-by' access, indexability, transparency, platform independence, and location independence. HTML, CSS, and JS are just particular concrete embodiments of those principles, but 30 years from now, they could be implemented in a radically different way.)
I think asm.js is great for a slice of game genres, and I fully support it. The cheerleading and politics just aren't needed; let the code, benchmarks, and game demos speak for themselves.
Couldn't agree more. I think asm.js is very cool, and very likely a better approach than PNaCl (which I have been a big fan of for a while). But the "take THAT, Google," sentiment is really off-putting. Native Client was never about "take THAT, Mozilla" (or anyone else), it was about serving an unmet need that no one else was serving.
> I think Mozilla would get a lot more benefit if people stop with the juvenile "You're going down!" adversarial mentality. [..] The cheerleading and politics just isn't needed, let the code, benchmarks, and game demos speak for themselves.
I agree.
Also, btw, I think NaCl is wrong for the web, but an amazing technology, designed by super-talented people (I met a few). It solves a very hard problem in an impressive way. It already has other use cases than the web, and I imagine will continue to. For the web, however, as I say I think it is inappropriate, due to portability and standardizability concerns (PNaCl solves some of the portability concerns).
> Yet when Google spent only a few months developing the first Dart, then released it to the public and iterated on it for two years in full public view (and it still isn't even in Chrome nightlies, while asm.js has already been dumped into Firefox nightlies), the same criticism was leveled at Google from Mozilla.
I think a big part of the PR problem Dart has began with the fact that it was leaked. The leak was an internal document, and it spoke somewhat bluntly - which is fine for an internal document, but it was an unfortunate way for the project to be revealed to the world.
> Yet when Google spent only a few months developing the first Dart ... the same criticism was leveled at Google from Mozilla.
Yeah right. Dart was originally Dash. Before that it was planned modifications to JS, but nobody wanted to turn JS into Java, so you threatened that JavaScript "would be replaced".
Dart/dash/google.js is just open-washing. Nobody outside of Google has any say in it, and that's why they don't take to it.
No, Dart had nothing to do with planned modifications to JS. And if you think it was in the oven for a long time, why then have two years passed with it still not shipping in the browser, and why has the language grammar undergone so many changes? It has spent an enormous amount of time, the vast majority of its life, in public development. The dart2js compiler has been rewritten from scratch multiple times since then.
I work for the GWT team, which also owns the Google Closure Compiler. At the same time, our team was kicking around taking the Closure type annotations and making them actual (optional) grammar terminals instead of boilerplate JsDoc comments - pretty much exactly what TypeScript or JSX is today - and then proposing those changes to TC39 (one difference is, we didn't have classes, we had Structural Types). A separate team was working on Traceur, which was prototyping a different set of changes to JS, and they too were planning to go the TC39 route. Google is a big company and can have multiple teams prototyping many approaches. This stuff about turning JS into Java sounds like religious nonsense from people who have an aesthetic derangement over classes and OO; there are advocates on both sides, not just Googlers (and in fact, there are Googlers opposed to classes in JS), and ES6 has been fielding class proposals. We were testing out how various extensions "felt" and worked with tooling; for example, Structural Typing in the presence of recursive types has issues.
There's nothing "non-open" about coming up with language extensions, doing a prototype implementation, and then putting it out for public review and drawing up a spec. That's exactly how all of the IETF specs get done -- rough consensus and running code. Talk is cheap; a proposal is better understood if it comes with an example prototype. Mozilla does its own prototyping of extensions to JS; it's the only way to really do a sanity check. Unlike with IronMonkey, no one was under any illusions that something like Traceur was being pushed as some kind of official product for people to adopt and use; it was, very clearly, a test bed for experimentation.
If we take your definition, no one outside of Mozilla had any say over asm.js: they developed the complete asm.js spec without involving the public, then dumped it fully formed, along with an implementation VM, Emscripten integration, and even an Unreal Engine demo, all before any standardization activity. The draft spec has no non-Mozilla editors/contributors listed or acknowledged as far as I can tell.
One of the real shames here is all the focus on Dart/NaCl while native apps distributed on mobile OSes are eating our lunch. I want ChromeOS and FirefoxOS to be a success, but they are more likely to succeed if Mozilla, Google, et al. can work together and avoid attacks. It pains me to see this because both sides are doing tremendous work to move the web forward, and pettiness can harm the spirit of cooperation.
> if you think it was in the oven for a long time, why then have 2 years passed and it still isn't shipping in the browser, and the language grammar has undergone many changes?
As an outside observer, I'd say that at a company with basically unlimited resources for a decade, Google's devs are probably slacking off quite a bit, not really driven to get anything done in language design.
Asm.js is basically just a spec for what Emscripten was already doing. There's not much to it beyond that, other than the linking/module bit, so there's not much to even discuss with anybody else. Dash/Dart/Pepper is a major piece of technology that is not a subset of existing tech the way asm.js is. It carries a much higher expectation of collaboration.
So rather than believe that Dart wasn't anywhere near done when it was announced, as evidenced by the massive changes, 20000+ commits since then, and the time it's taken to get close to version 1.0, you prefer to think that the engineers are just slacking off.
Why is this a game technology and not a general purpose one? I think Mozilla might want to expand their message a bit, and provide some non game engine demos.
>I don't think that's accurate. We started work on asm.js only a few months ago; at the beginning, we had no idea if it would work, so we just threw around some ideas between each other. When we had something we felt could actually work, we immediately put it up on
As others have pointed out, Mozilla has been less than receptive when similar situations were reversed. I also know from working with the NaCl and PPAPI teams that they repeatedly reached out to Mozilla in an effort to publicly develop a mutually agreeable standard, but they were not well received. So, given past precedent, the way asm.js was introduced isn't necessarily bad, but it certainly feels a bit hypocritical.
>> relies extensively on engine quirks
>That would be a bad bug in the asm.js spec. Can you please elaborate?
Perhaps "quirks" isn't the right word for things that are idiosyncratic but legal either by standard or de facto convention. It's certainly an ingenious way to get backwards compatibility while basically defining a new IR, but it is a bit... quirky.
As I mentioned elsewhere, I think asm.js might really catch on. It could be the IR that finally becomes universal, in large part because your backwards compatibility strategy is so damn clever. That said, I also see a few obvious pain points that will need solutions:
* Reasonable debugging support (maybe just a metadata standard morally equivalent to symbol files)
* Threading (even just co-routines, it seems some behavior needs to be specified)
* Load performance (validation and code caching seem needed, which are very tractable but just mean more work)
> As others have pointed out, Mozilla has been less than receptive when similar situations were reversed.
That's a long conversation, and I likely know nothing of the non-public aspects of it. From what I know of the public stuff, I don't think it is the same in reverse. I do see the general similarity you refer to, I just think there are some fundamental differences that explain the different responses.
Happy to discuss this more if you want.
> Perhaps "quirks" isn't the right word for things that are idiosyncratic but legal either by standard or de facto convention.
I still don't know what you mean, can you please say what concretely in the spec you are referring to?
>> Perhaps "quirks" isn't the right word for things that are idiosyncratic but legal either by standard or de facto convention.
>I still don't know what you mean, can you please say what concretely in the spec you are referring to?
You're using a subset of JS as an IR, which is something the language was never designed for and not inherently good at. To make that performant you have to hack in things like manual memory management and use type coercions to game existing engine behavior. To get full performance you need significant engine changes including a mode switch, validation step, and special purpose compiler. And of course, the JS as IR doesn't really qualify as human readable in any meaningful sense.
And I happily admit that type coercions and similar tricks are a very clever way to structure your IR while simultaneously maintaining compatibility and improving performance in existing JS engines. However, in doing so you're already relying heavily on unspecced implementation details of existing engines.
So, my point is that it's a hack... a really brilliant hack that might actually be the best path forward for a universal IR on the Web. But it's still a hack, and it brings with it some serious pain points that still need to be resolved.
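For anyone following along who hasn't read the spec, the coercion idiom being discussed looks roughly like this. This is my own illustrative sketch, not an excerpt from the spec: ordinary JS operators double as type annotations, and a single typed-array view over an ArrayBuffer plays the role of the manually managed heap.

```javascript
// Hypothetical sketch of the asm.js-style coercion idiom (not from the spec).
// `x | 0` forces x to a 32-bit integer; `+x` would force it to a double.
// An engine that recognizes the pattern can emit type-check-free machine code;
// any other engine just evaluates the ordinary JavaScript semantics.

var heap = new ArrayBuffer(0x10000);   // the "manual memory" mentioned above
var HEAP32 = new Int32Array(heap);     // 32-bit integer view of the heap

function addInts(a, b) {
  a = a | 0;                           // parameter annotation: a is an int
  b = b | 0;                           // parameter annotation: b is an int
  return (a + b) | 0;                  // return annotation: result is an int
}

function storeAndLoad(byteOffset, value) {
  byteOffset = byteOffset | 0;
  value = value | 0;
  HEAP32[byteOffset >> 2] = value;     // raw heap store at a byte address
  return HEAP32[byteOffset >> 2] | 0;  // raw heap load
}
```

The point of the "quirkiness" complaint is visible here: `| 0` and `>> 2` are legal JS that any engine already runs correctly, but they only deliver full performance when the engine treats them as type and addressing annotations.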
> You're using a subset of JS as an IR, which is something the language was never designed for and not inherently good at
I agree it was not designed for it. However, I am not sure what "inherently good at" means - after V8 showed up in 2008, for example, many types of code suddenly became very fast. Were they always inherently fast?
> To get full performance you need significant engine changes including a mode switch, validation step, and special purpose compiler.
I disagree. Firefox did pick that option because it seemed best, but I believe it is possible to achieve similar speeds with other approaches. As the slide here shows,
on many benchmarks, even without any new optimizations modern JS engines are already fast on asm.js code. And with new optimizations, even without things you just mentioned, they should be able to get fast on the rest. I saw some activity on the v8 bug tracker indicating possible work in that direction, which I am very curious and hopeful about.
> And of course, the JS as IR doesn't really qualify as human readable in any meaningful sense.
Yes, it is not intended to be - like the output of closure compiler, etc. Note though that emscripten can generate asm.js in debug mode, which is not minified, and actually quite readable - you can recognize function and variable names, for example. It looks a little quirky to be sure ;) but it is more readable than compiler IRs like LLVM for example, in my opinion (and certainly far more readable than x86 or ARM assembly).
> However, in doing so you're already relying heavily on unspecced implementation details of existing engines.
Performance is not specced at all for JavaScript. Again, when v8, nitro and tracemonkey came out in 2008-2009, many types of JS code suddenly got fast. There wasn't a spec for any of that. No one says modern JS engines should use int32 when a value never goes out of the int32 range, but all modern JS engines do that (in hot functions).
> So, my point is that it's a hack... a really brilliant hack that might actually be the best path forward for a universal IR on the Web. But it's still a hack, and it brings with it some serious pain points that still need to be resolved.
I fully agree it is a hack, and it has various pain points. It's a compromise, not a clean solution from scratch. I hope we can resolve many of those pain points in time, and that we can do that in collaboration with all browser vendors together.
> I also know from working with the NaCl and PPAPI teams that they repeatedly reached out to Mozilla in an effort to publicly develop a mutually agreeable standard, but they were not well received.
They weren't well received, I suppose, because Pepper is a binary API that duplicates everything already in the browser. Why does Pepper have to be so large? Because with NaCl it's incredibly awkward to call JavaScript functions from native code, and vice versa. It's not webby.
That would be a reasonable argument if Mozilla wasn't still adding extensions to NPAPI, a far less webby technology that's a much worse design and outright dangerous for end-users. And yes, NaCl has its awkwardness, but it's far better than NPAPI, and if you look a bit you'll realize that NaCl's awkwardness is specifically because it can support things that asm.js can't. NaCl has real native threads, clean debugging, and a simpler legacy code path, but the cost is the complexity of the implementation.
The funny thing in all this is that I'm really not a proponent of NaCl. I find it technically very interesting, but I never considered architecture-specific NaCl viable for the Web, and there are still kinks to work out with PNaCl. I just find the attitude towards asm.js curious, because it carries so many of the past criticisms of NaCl (not human readable, initially developed in private, etc.).
Why would Mozilla add extensions to NPAPI for more advanced plugins that must be trusted or expensively sandboxed because they are natively compiled? Mozilla is doing everything in the web, adding sound and graphics APIs to HTML, not duplicating them for legacy native code. It doesn't even make sense for Google to do Pepper, let alone Mozilla.
Real threads essentially mean that every call must copy all data (or else another thread could modify the data while it is being used by trusted code), or all threads have to be suspended. That's clumsy and not a good solution.
Your statements here are just nonsensical. And given that your account was created the day of this post, and hasn't commented on anything else, it's hard not to read this as simple trolling.
> other JS engine developers aren't necessarily enthused about signing on to support a spec that [company] developed mostly in secret without input from other JS engines, relies extensively on engine quirks, and doesn't yet solve many hard problems
And I would argue that any spec tagged with these kinds of complaints rarely ever sees broad adoption, regardless who made it. From my perspective, the only thing asm.js has going for it that offsets the downsides is that the code at least runs in other JS engines even if they don't explicitly support it. And that alone might actually be enough.
> Because other JS engine developers aren't necessarily enthused about signing on to support a spec that Mozilla developed mostly in secret without their input, relies extensively on engine quirks, and doesn't yet (ever?) address many hard problems like code caching and threading
It's a spec for a subset of JS with guaranteed AOT compilation — it's hardly one that needs to go through a thousand committees to get somewhere where all those concerned are happy. If you want to do AOT compilation, everyone basically needs the same subset. Indeed, the most contentious part of asm.js is probably how you isolate it off from the rest of the script. It's simply not a spec that defines very much.
And what engine quirks does it "extensively rely on"? If it relies upon anything apart from ES5, it's broken and the spec needs fixing.
Similarly, it's out of scope for it to deal with anything like code caching and threading — the former is depending on how you tackle it either implementation detail (see how Carakan and V8 implement it today) or something really quite radical to the platform (but not something that needs to be immediately tackled), and the latter is one solution to a broader problem (parallelism in JS). Neither seem like things asm.js should be rushing to solve.
An AOT-compiled subset of JS is not in itself novel: this is not the first time it's ever been talked about. What is novel (to my knowledge) is the idea of making compiler behaviour explicit for it, by having a token in the code, along with isolating globals. Enabling AOT compilation is in many ways the holy grail, as although one must still be wary of compilation time (see SunSpider - many of the benchmarks complete in under 10ms, so you don't have the time to spend compiling them), the warmup time is currently relatively large.
asm.js is not a redesign of JS from the ground up, it is an evolutionary step in improving support for a certain style of Emscripten/Mandreel generated code that Google is at least somewhat interested in, given its inclusion in the Octane benchmark.
Performance improved in Chrome on at least one Emscripten benchmark when generating asm.js code rather than the previous backend, without any further change to Chrome, probably because asm.js does a better job at formalizing the various assumptions that this style of code relies on.
What I meant to say was not to use asm.js as a target for compilation, but rather to implement performance-critical modules in the asm.js subset and get the extra performance out of V8. Sort of like how Rusha[1] did it.
Calling asm.js code is not free either, it's worth mentioning. I measured it at (very) roughly 2ms/call for the code I posted yesterday. (At asmjs.org they mention they intend to address this.)
It's not a no-op, it has to check types and maybe convert representations, much like a C/C++ FFI (but with more freedom for the implementor, who's not committed to the C ABI, so there's potential to go faster than a C FFI).
But I agree it needs attention, and apparently so do the Firefox devs. In http://wry.me/hacking/Turing-Drawings/ the asm.js version ran much slower than the regular JS version, until I adapted it to make fewer calls. The slowness also makes some other uses I've thought of impractical.
(The figure I gave was not carefully measured; just enough to say "yep, that'd account for why this is amazingly slow".)
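For the curious, the kind of rough measurement I mean is nothing fancier than the loop below. The module function here is a stand-in I made up for illustration, not the actual code I benchmarked, so treat the shape of the harness, not the numbers, as the point.

```javascript
// Rough sketch of measuring per-call overhead across a module boundary.
// `moduleCall` stands in for an asm.js export; in a real measurement it
// would be a function returned by the compiled module.
function moduleCall(n) {
  n = n | 0;
  return (n * 2) | 0;
}

function measurePerCallMs(fn, calls) {
  var start = Date.now();
  var acc = 0;
  for (var i = 0; i < calls; i++) {
    acc += fn(i);
  }
  var elapsed = Date.now() - start;
  // Returning acc keeps the engine from optimizing the loop away entirely.
  return { perCallMs: elapsed / calls, checksum: acc };
}

var result = measurePerCallMs(moduleCall, 100000);
```

A harness this crude is only good for order-of-magnitude answers (Date.now() has millisecond resolution, and JIT warmup pollutes the first iterations), but that was enough to say "yep, that'd account for why this is amazingly slow."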
It could help solve the cross platform extension problem. Windows users often get left out. This would make node.js more like java as a write once run anywhere platform.
Anyway, I agree. People in the previous thread clamoring for NaCl, claiming asm.js isn't supported by other browsers, now look really, really foolish in hindsight.
I'm moderately happy the Web is moving in this direction. I'm glad Mozilla is moving it in that direction :)
I have no idea why Mozilla was so resistant to pnacl. It's a nice step beyond the nasty legacy JavaScript mess. Among other things, JavaScript is necessarily single threaded... Yea.
DRY? I think NaCl would require Mozilla to implement another VM or parser and do the same optimizations it already does on JS. So why should they repeat themselves?
"WebWorkers are not really a multithreaded model. Game engines could be written to leverage message-passing-based isolates in some cases, but doing it in an optimal form isn't exactly developer friendly. One of the pains of developing for the PS3 SPEs was that they didn't share system memory and you had to DMA stuff around."
Anyway, he said that the problem with Pepper is that its API is gigantic and its only spec is the implementation in Chromium. Other browsers could port the API, but it would come at a very high cost due to WebKit-specific glue code.
The "problem" with pnacl is that it means the end of Javascript's death grip on the browser, which Brendan Eich is interested in extending for obvious reasons
To be clear, there are two things that have historically been called "Pepper". The first was basically a better version of NPAPI, with the limited scope that implies. That was replaced by something that's basically a Google-proprietary replacement for the entire web stack, with both the scope and the problems that implies.
It may run on multiple threads, but I don't think it exposes threading to the runtime. ECMAScript is certainly single-threaded and async by design, so V8 would be straying far from the spec.
Workers are for opening up a new, separate thread that you can only communicate with through message passing. They're not part of the core language but of the browser library, hence why you can't use them in Node.
They're also more directly comparable to processes than threads, as they do not share memory (though there are a couple of proposals to allow them to).
"Cross-browser". Why exactly is asm.js cross-browser? Because they just said they'd put it in V8? The argument is moot, since if Mozilla had said that first about PNaCl (that they'd adopt it), it would be PNaCl that would be the cross-browser solution.
No, I don't care about backwards-compatibility. I care about the solution leveraging existing infrastructure and not being a hack.
asm.js is, at its heart, nothing more than a compiler hint. It tells the runtime that the JavaScript you're writing in this function follows some conventions that happen to be easy to optimize, but it's still JavaScript code. A browser that doesn't support asm.js will pause for a few nanoseconds to wonder what the heck that strange string at the start of the function is, then go on to execute the JavaScript code the same way it always does, with the same result as if it had known what asm.js is.
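To make that concrete, here is a minimal, hypothetical module in the asm.js style (my own toy example, not one from the spec or from Emscripten output). An engine that recognizes "use asm" can validate and AOT-compile the whole module; any other engine treats the string as a no-op expression statement and runs the function as ordinary JavaScript, with the same results.

```javascript
// A toy asm.js-style module. The module function takes the standard library,
// a foreign-function object, and a heap buffer, per the asm.js convention.
function MiniModule(stdlib, foreign, heap) {
  "use asm";  // the "strange string": a compiler hint, not new syntax

  function double(x) {
    x = x | 0;             // annotation: x is a 32-bit integer
    return (x + x) | 0;    // annotation: the result is a 32-bit integer
  }

  return { double: double };  // exports: the module's public functions
}

// Linking: any JS engine can call this, asm.js-aware or not.
var mod = MiniModule({}, {}, new ArrayBuffer(0x10000));
var answer = mod.double(21);  // → 42 in every engine, fast or slow
```

Whether the engine compiles this ahead of time or interprets it as plain JS is invisible to the caller, which is exactly the backwards-compatibility property being praised and criticized in this thread.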
As far as I can tell they haven't said they'd put it in V8. One member of the chromium team, who doesn't work in that area, has opened up a feature request.
This seems to be much ado about nothing unless and until it's accepted and assigned.
On the note of Node.js: I was wondering if there will ever be a Node.js based on IonMonkey + the Baseline Compiler instead of V8.
The engineer in me says: cool idea. And judging by libuv[0], the Joyent/Node.js team seems to be open to abstractions, if help is offered.
Assuming the technical task isn't too expensive in terms of leaky abstractions or ripple effects, one has to ask the next question: if V8 works... why would it be worth it?
[1]: https://hackernews.hn/item?id=5226967