
If it is true that we have moved past the era of formal standards, we will have lost something that is important. More important than most people understand.

If a technology has only an implementation, even an open source one, that implementation will have all sorts of hidden assumptions and biases. No one will know about or understand those assumptions and biases until the time comes to use that technology in a new context.

In my work, I've dealt with multiple implementations of things like simulators, models and even languages. The additional implementations usually had some specific reason behind them (performance, alternative features, supporting different execution environments, different mathematical properties and so on). Nevertheless, the value of each alternative implementation (in terms of surfacing hidden assumptions, finding bugs, illuminating more general and flexible alternatives, etc.) went well beyond the specific reason that implementation was made.

These considerations are even more important in the case of software that is part of a large, chaotic, heterogeneous system like the Web. Imagine where the mobile web would be today if there hadn't been a W3C and, at the turn of the century, the Web really had been "what Internet Explorer renders" with no attempt to document and codify anything else.

Or, to take your example of Objective C, how much does Objective C matter outside of Apple's ecosystem?

Contrast this with:

- How much does JavaScript matter outside of Netscape's ecosystem?

- How much does Java matter outside of Sun's?

- Or, to take the granddaddy of examples, how much does C matter off of the PDP-11 and outside of Bell Labs?

None of those languages would be anywhere near where they are today without standards. On the other side of the coin, there is a reason languages like Python, Ruby and Haskell have had (and, each to a varying extent, continue to have) a "bus problem". Standards matter and even just the standardization process itself matters. I wish more people understood that.



How much does Objective C matter outside of Apple's ecosystem?

It doesn't need to. Apple has demonstrated the agility of a wholly owned, vertical stack. The pace of change is qualitatively different now from the days of language standards set by plodding standards bodies.


It depends on what your ambitions are. I'd agree that it doesn't matter much to Apple that Objective C isn't standardized (and not standardizing may be an intentional part of Apple's business strategy, since it raises porting and switching costs).

But that also means that, no matter what its true potential is, Objective C will never be more than what Apple makes of it. It won't be seriously used outside of application areas Apple is interested in. It won't develop new features Apple doesn't care about. And if Apple decides to go in a different direction and/or runs into serious trouble, well, that's it.

Apple has changed the world, but, unlike standardized languages such as the ones above, Objective C never will. If we lose the full potential of future great languages because they are limited to a single company and/or a single implementation, we will all be poorer for it.


I used to think so, but I've come around to the idea that the language is just part of the platform. Learning the APIs and idioms is always more work than learning a new syntax anyway.


I guess this is where we're actually parting ways. I agree that learning a platform, its APIs, and its idioms is important, non-trivial work, but...

As you might guess from my username, I'm a fan of more than a few languages where (at least I believe) the differences go well beyond syntax. That means I want standards because I want to know what I can (or could) take with me to another platform and what I can't.



