
> The browser vendors are doing exactly what they're supposed to do: come up with new things. If the W3C was doing its job, it would watch those new things carefully and, when enough momentum existed, start ratifying them as standards.

Both the browser vendors and the W3C could be doing a better job. But neither is really the problem here.

The problem is that WebKit, a single implementation, is utterly dominant in the mobile space (except for Opera on low-end phones in the Far East). This isn't a criticism of WebKit. Any single implementation that becomes so dominant is bad.

Any single implementation, once it ships vendor-prefixed features in release builds, will lead to the problems the mobile web has now: people use the vendor prefixes, after which that implementation can't remove them without breaking websites, and other vendors are eventually forced to implement those same prefixes in order to render the web in a non-broken way.
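
To make that concrete, here is a rough sketch (plain JavaScript, not anyone's actual code; it assumes an element with id "box", and the property list is only illustrative) of the pattern a lot of mobile sites ship versus the more defensive one:

    var box = document.getElementById('box');

    // What much of the mobile web ships: the WebKit-prefixed property
    // alone. Every other engine silently ignores it.
    box.style.setProperty('-webkit-transform', 'translateX(100px)');

    // The more defensive pattern: set each prefixed form plus the
    // unprefixed one, so the effect survives in other engines and
    // after the prefix is eventually dropped.
    ['-webkit-transform', '-moz-transform', '-o-transform',
     '-ms-transform', 'transform'].forEach(function (prop) {
      box.style.setProperty(prop, 'translateX(100px)');
    });

Once enough sites ship only the first version, other vendors face exactly the choice described above: implement -webkit-transform themselves or render those pages broken.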

One way to avoid that is for vendors not to ship vendor prefixes in release builds at all, but every browser currently does. If that is out, the only thing that can save us is for no single implementation to become dominant. But WebKit won on mobile - Android and iOS are by far the leaders, and both use WebKit; in fact, even most of the runners-up do (BlackBerry, Bada, webOS, etc.).

My only criticism of WebKit is that, when it saw itself becoming so dominant, it should have been much more careful with vendor prefixing, ideally stopping entirely in release builds. Vendor prefixes are fine in a competitive market, but shipping them when you are dominant, or about to be, ends in the current debacle.



In case people aren't aware of just how badly competing browsers are locked out of the mobile web, imagine a world where most of the buttons in Google Maps are broken in Firefox. Where bing.com doesn't work in Opera. Where non-WebKit browsers get a degraded experience in Google web search, and a barely-functional version of Gmail.

All of this is true on mobile phones today. And it's getting worse over time rather than better. If you want to build a browser that provides decent compatibility with popular mobile websites, you have little choice but to emulate WebKit's experimental (and often unspecified) features.

[Disclosure: I work for Mozilla on Firefox for Android, and as the editor of the W3C Touch Events spec.]


The web - the browser programmers, the HTML developers, and certainly all levels of management - is incapable of learning from its mistakes. We've been over this with Netscape, then IE6, and now the exact same thing is happening with WebKit.


In my view, what we've learned is that the standardization process doesn't work. We had IE and Netscape doing what they liked, and we got innovation: we got JavaScript, we got iframes, we got embedded video. Sure, each of these is a mess to a greater or lesser extent, but they let us build the sites we want. Then, in ten years of the W3C, we got nothing - just the blind alleys of XHTML and CSS 2.1. The web only started moving again after the WHATWG basically told the W3C, "we're going to implement this stuff; you can either call it HTML5 or become irrelevant." We have learned from our mistakes. We're returning to the Netscape and IE6 days because they were better for innovation than the ten years of emptiness that followed.


The ten years of emptiness were caused by the browser wars producing an ultimate, unequivocal victor whose technology then became entrenched for an extended period. It's not as though the IE team stopped innovating because the W3C held them back.

During the emptiness, technologies like alpha-transparent PNGs appeared but never got anywhere, because the gorilla didn't support them, and what were you going to do? We got the idea of creating layouts without tables, but oops, box model bugs - I sure was glad to be working around the legacy of the days of innovation - and hey, wouldn't position:fixed be great? Upstart browsers couldn't access many websites because, back in the happy days of innovation, the web had decided that if (document.all) ... else if (document.layers) was a decent way to write code, and it's not like anyone was going to rewrite those sites. Remember that innovative native-client technology a whole bunch of banks and DRM sites decided would be good for a secure, controlled internet experience? I think they called it ActiveX. It sure did wonders for the usability and practicality of the innovative browsers that could not support it.
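
For anyone who never had to maintain that sniffing pattern, here is roughly what it looked like, along with the feature-detection alternative that doesn't tie a site to one vendor (a sketch from memory in plain JavaScript, not any particular site's code):

    // The legacy pattern, roughly: pick a code path based on which
    // proprietary DOM the browser happens to expose.
    function getElementLegacy(id) {
      if (document.all) {            // old IE object model
        return document.all[id];
      } else if (document.layers) {  // Netscape 4 object model
        return document.layers[id];
      }
      return null;                   // anything newer: tough luck
    }

    // Detecting the capability you actually need keeps the site
    // working in engines that didn't exist when it was written.
    function getElementSafely(id) {
      return document.getElementById
        ? document.getElementById(id)
        : getElementLegacy(id);
    }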

You say we've learned from our mistakes, but to me the use of -webkit- prefixes and the continued use of (buggy, natch) user agent sniffing suggest otherwise. Who'll be updating those sites two years from now, when Firefox Mobile's or X Mobile's rendering engine is the innovative stuff all developers love?



