A Language Is More Than a Language (docs.google.com)
74 points by guiambros on Oct 4, 2015 | 63 comments


"And you're right: we were not out to win over the Lisp programmers; we were after the C++ programmers. We managed to drag a lot of them about halfway to Lisp. Aren't you happy?"

-- Guy Steele - Sun Microsystems Labs (about Java)

source: http://www.ai.mit.edu/~gregs/ll1-discuss-archive-html/msg040...


Back then I was disappointed with the state of C++ compilers across vendors and OSes (the standard was still being defined).

Meanwhile, I had also discovered that it was possible to use a GC-enabled systems programming language for writing an OS (Oberon and Modula-3), and was exposed to Prolog, Caml Light, Smalltalk and Eiffel.

So Java seemed a nice compromise for industry adoption. Our university adopted it right away for its distributed computing and compiler development classes.

However I never liked the religiosity about a JIT, preferring the .NET way of having a choice between JIT and AOT.

So I still like C++ a lot, but unless I am shaving off ms or bytes from a specific algorithm or writing portable code for mobile devices, I tend to use other languages.


> However I never liked the religiosity about a JIT, preferring the .NET way of having a choice between JIT and AOT.

I wonder how much of this comes down to the fact that Java methods are virtual by default and in C# they are not. The former makes devirtualization more important and is harder to do ahead of time.
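
A minimal sketch of that difference, with hypothetical Shape/Circle classes just to illustrate default-virtual dispatch (in C#, the base method would have to be declared virtual before it could be overridden at all):

    class Shape {
        double area() { return 0.0; }          // overridable by default in Java
    }

    class Circle extends Shape {
        private final double r;
        Circle(double r) { this.r = r; }
        @Override double area() { return Math.PI * r * r; }
    }

    public class Dispatch {
        public static void main(String[] args) {
            Shape s = new Circle(2.0);
            // The JIT can observe at runtime that only Circle is loaded and
            // devirtualize (then inline) this call; an AOT compiler has to be
            // more conservative, since another subclass could be loaded later.
            System.out.println(s.area());
        }
    }

Marking a method or class final is how a Java programmer opts out, roughly the inverse of C#'s non-virtual default.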


Might be. Most other OOP languages back then (mid-90s) used static dispatch by default.


I think what Bruce Eckel is arguing here is that the creators of Java and the Java API would have done a much better job if they had been able to apply, in advance, the lessons they learnt from years of Java being actively used. That is, in my opinion, a somewhat unfair and unrealistic point of view.

But yes, a real Java 2 or Java++ or next generation Java would indeed be a good thing.


> But yes, a real Java 2 or Java++ or next generation Java would indeed be a good thing.

I thought C# was considered pretty much Java with fewer warts.


I agree. But, until recently, it had the huge disadvantage of being a Windows-only product.


So Mono, released in 2004, is recent?


There was that recent article about FogBugz vs JIRA, where a former FogBugz guy was participating. He said that the buggy and incomplete standard library for Mono was a large part of their product's failure on UNIX / Linux.


From almost everything I've seen so far Mono wasn't really considered production-ready until recently.

I remember reading, for example, some horror stories from the early days of Fog Creek FogBugz support on Unix via Mono.

Even today, is any big company using Mono? Are there any big commercial (or Open Source) applications using it?


Agreed on Mono hardly being production-ready until recently: that said, Unity uses Mono extensively for production purposes.


You don't have the benefit of retrospection in advance, it's true.

But you can take more time before releasing major components/features, and try to get them right. Eckel thinks they could have done better if they hadn't been in such a hurry, hadn't focused on marketing over quality, and maybe had had some more skilled people involved. Especially in a platform so committed to backwards compat, if you release something designed poorly, you're stuck with it.

I guess the 'marketing' rejoinder would be that if they had taken more time before releasing an attempted solution, it might have prevented Java's success in the 'market'.

I don't know. But I think it's the general consensus at this point that, e.g., the J2EE APIs and design were/are pretty awful -- and I don't think it actually takes retrospection to see that; it was awful right away, and was seen as such by many right away too.


Was J2EE obviously awful at the time? I'm honestly not sure.

I had a summer job working on EJBs at a company that had built everything on CORBA. It was basically a research project; they knew they had to switch to EJB, because it was going to be the standard (ha), and they wanted to know how. This was in the EJB 1.x era, so somewhere between 1998 and 2001 - i don't remember exactly when!

I remember thinking that it was all rather complicated, especially around persistence. And i remember being appalled that beans didn't actually implement their interfaces. We came up with some rather complicated pattern to work around that - i think there was a master interface where all the methods threw RemoteException, then the bean interface trivially extended that, and there was a local interface that extended it but overrode all the methods to not throw RemoteException, then the bean implemented that. And i spent most of my time there wrestling with the fact that the performance characteristics of local and remote calls are very different, and how that impacted design; we ended up with DTOs mirroring every entity bean to save network round-trips.
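
A rough, modernized sketch of the general shape of that workaround (names hypothetical; in the real EJB 1.x setup the remote interface would also extend javax.ejb.EJBObject, and @Override didn't exist yet):

    import java.rmi.RemoteException;

    // "Master" interface: every method declares RemoteException, so the
    // remote and local views can share the same signatures.
    interface AccountOps {
        void credit(long cents) throws RemoteException;
        long balance() throws RemoteException;
    }

    // The remote view simply inherits the throws clauses.
    interface AccountRemote extends AccountOps { }

    // The local view narrows the signatures, dropping RemoteException so
    // same-JVM callers don't have to catch it.
    interface AccountLocal extends AccountOps {
        @Override void credit(long cents);
        @Override long balance();
    }

    // The bean implements the local interface, so at least local callers get
    // a compile-time check that the implementation matches the contract.
    class AccountBean implements AccountLocal {
        private long total;
        @Override public void credit(long cents) { total += cents; }
        @Override public long balance() { return total; }
    }

The trick relies on the rule that an overriding method may declare fewer checked exceptions than the method it overrides, so the local view loses RemoteException while the remote view keeps it.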

But the thing is, there was nothing else around at the time that did anything similar. Remote object invocation, with declarative transactions, and container-managed object lifecycles, including pulling objects out of a database on demand. That seemed like saucer tech! So it seemed, glaring design flaws aside, fair enough that it was complicated. Hey, i might have to write an XML document, but at least it wasn't ten thousand lines of C++.


The problem is that many who suffered from J2EE v1.0 didn't have the "pleasure" of having worked with either CORBA or DCOM.

So they criticise J2EE, but don't get that it was actually a pleasure compared to those technologies, especially when you add the C++ portability issues across OSes and compiler vendors.


At the time, CORBA definitely seemed more sophisticated. CosTrading, IORs, lots of cool stuff. It was just a colossal pain to use in practice, because of the tooling and interoperability.


Last time I had to deal with it was in 2005, luckily not more.

It had lots of nice ideas, but it was a pain to use in reality.

Typical enterprise stuff.


See also SOAP. I think the promise of this kind of standard turns out to be false.


If, right off the bat, you had to come up with complicated patterns to work around EJBs... to me that says, yes, it was obvious at the time that J2EE/EJB was awful.

There might have been nothing at the time _in mainstream languages/platforms_ (important distinction) that did something similar, but that doesn't mean they couldn't have done better if they had spent more time on it. I mean, even if there's nothing better to use as a model (debatable), is it acceptable to release something that requires complicated workarounds? (I guess that's debatable too; but we know what side Eckel comes down on!).

(Does anyone still want remote object invocation these days? I ask for real, I don't know. In my circles, it seems generally to have been found to be a mistaken thing to try. Based on experiences like yours in fact. So the other question is whether 'something similar' is even what was needed -- again, I think, a design question that takes time. Figuring out the right features of the right solution to actual problems is design.).


> There might have been nothing at the time _in mainstream languages/platforms_ (important distinction)

Going off mainstream, I read somewhere that the Objective-C WebObjects framework was the inspiration for J2EE, given the Objective-C work going on at Sun back then.

Then there was also Taligent, but it was short lived as well.

> Does anyone still want remote object invocation these days?

Apparently yes, now they call it REST and microservices.


But no one tries to pretend they are the same as local calls.


Except almost everyone is providing SDKs that do exactly that. Should I provide links?
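
For instance, a typical service SDK wraps the HTTP round-trip in something that reads just like a local method call; a hypothetical client sketched in Java:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    // Hypothetical SDK facade: the caller writes what looks like a local
    // method call, while the HTTP round-trip, serialization and network
    // failure modes are hidden behind it -- the old remote-object illusion
    // in REST clothing.
    class WidgetClient {
        private final HttpClient http = HttpClient.newHttpClient();
        private final URI base;

        WidgetClient(URI base) { this.base = base; }

        // Looks local, is actually a remote call over REST.
        String getWidget(String id) throws Exception {
            HttpRequest req = HttpRequest.newBuilder(base.resolve("/widgets/" + id)).GET().build();
            HttpResponse<String> resp = http.send(req, HttpResponse.BodyHandlers.ofString());
            return resp.body();
        }
    }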


Really? Every 10 years the same mistakes roll around again, I guess.



I don't know whether to laugh or cry.


In its defence, that was one thing that was really ugly. One obvious but fairly trivial thing. And yes, it was something that they obviously could have fixed if they'd had a competent person working on it for a bit longer. But that didn't qualify it as 'awful'.

As for remote invocation, i think it became clear quite a while ago that there's no point trying to pretend that remote objects are local. So, indeed, the original motivation behind EJB and CORBA has fallen away. The current version of EJB does still have remote invocation, and it appears people use it; i think it's used in much the same way as protobuf+HTTP/Thrift/Avro are, as an efficient coarse-grained intramural RPC mechanism.


A fairly trivial thing that you spent non-trivial time developing a non-trivial in-house workaround for?

I dunno, it is indeed just one example (and it's yours, not mine; I don't have enough experience with this stuff to use my own, having successfully avoided it in my career), and it alone isn't enough to make the case, but it seems to me to be an example of Eckel's point. For something you're going to put in a stdlib that is going to be used by possibly millions of developers, you have a responsibility to design it right.

Eckel writes "collective losses of billions due to bad design" -- this is not an exaggeration. How many aggregate developer-hours were spent fighting with this stuff that could have been prevented by many, many fewer developer-hours up front doing it right in the first place?


> How many aggregate developer-hours were spent fighting with this stuff that could have been prevented by many, many fewer developer-hours up front doing it right in the first place?

How many aggregate developer-hours would have been lost waiting for it to be perfect rather than just better than whatever else was available at the time? Especially since perfection is a target that keeps moving as we learn more about what works and doesn't.


It was probably a pioneer effect combined with a dominant position and far too many resources. They were doing complicated stuff (compared to <?perl for @range { say date(); } ?>), so it led to premature abstraction over a still-cooking paradigm.

But I'm sensing some historical farce coming; the isomorphic web is starting to feel a lot like remote objects in javascript/protobuf clothing.


J2EE was awful because it was rushed; the early vision of Java was all applets, and when that didn't pan out, Sun was desperate to find a market, and settled on Java as the new COBOL. Even now I'll wager 95% of what people use EJBs for they could do with POJOs.


That is indeed what Eckel suggests in the OP, that it could have been a heck of a lot better if it hadn't been rushed for market penetration purposes.

I don't know enough about the history of Java to say. But I know from my experience that good design takes time. And I agree with Eckel that a lot of historical Java API's aren't very good design.


I was getting paid to write Java in 1995 so I saw all this first hand, but I got out of the Java game in the early '00s.


EJBs are POJOs now :-)

Source: I am a Java dev and I simplify old code a lot by introducing EJBs.
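
A minimal sketch of what "EJBs are POJOs now" means in practice (hypothetical class, assuming a Java EE container and the javax.ejb API on the classpath):

    import javax.ejb.Stateless;

    // A modern session bean: a plain class plus one annotation. No mandatory
    // home/remote interfaces, no deployment descriptor, no RemoteException --
    // the container supplies pooling and transactions around it.
    @Stateless
    public class GreetingService {
        public String greet(String name) {
            return "Hello, " + name;
        }
    }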


Especially the heavy EJB part - when the heaviness of your designs starts to make it into book titles as something to be avoided, it may be time to reconsider the design :)

"Expert One-on-One J2EE Development without EJB Paperback – July 2, 2004"

http://www.amazon.com/Expert-One-One-Development-without/dp/...


I have no doubt that you cannot win the 'war' on the language front. All those bloggers, internet and conference luminaries that never write real-world production code miss the point. It's about the platform, the whole programming ecosystem, not about 'beautiful' syntax. That's why Java - the platform (Java EE, Android, ...), not the language - won and is still going strong. New languages (D, Rust, ...) don't have a ghost of a chance unless they are able to create a platform.


I agree. But don't forget that it took years of hard work to build up enough of "Java the platform". Things that help there are traditional man-power, which costs money, and a strong community, which takes lobbying/evangelising/advertising. In a way, all those "luminaries" you mention may or may not miss the point individually, but the buzz they create is necessary to build the community that makes the platform, and that in turn makes the language succeed. Developers, developers, developers, developers, developers...


Can someone please clarify something for me... I'm not defending Sun/Oracle's decisions here, but isn't Bruce Eckel confusing Java the language with some Java libraries, some of which happen to be provided via the JDK? That was what I felt while reading this presentation.

Thanks


Sounds like a good argument for Clojure.


And the Oracle legal shenanigans seems like a good argument for Clojure on the CLR. I don't know how mature the CLR ecosystem is on Linux, but the licensing at least seems ok.


Android Java is J++ relaunched.


ClojureScript is starting to look like a potential successor, isn't it?


> And the Oracle legal shenanigans

You mean the Google legal shenanigans that tried to find loopholes around two Java licenses (one of them completely open-source, and the other would have cost them $25M) that would hopefully hold up in a court case (which they were sure would follow, because they went straight after Sun's business), failed, and in the process forked the ecosystem?

Oracle did exactly what Google dared them to do (after a reprieve while Sun was dying). Only, to get what they wanted -- payment for the use of Java not under its open-source license -- they used whatever legal argument they could come up with, in the hope that one would stick. One did (well, at least partially), and now Oracle is at fault? This entire court case -- except the decision -- was orchestrated by Google. They could have avoided the whole thing by either abiding by the open-source license (which, BTW, is exactly the same as Linux's) or by paying Sun $25M. Oracle, OTOH, had no choice because Google went after them with their own weapon.

Oracle didn't just go after someone who'd forked Java. They went after someone who'd forked Java with the intent to harm one of their main income sources from Java (at least at the time), while forking the ecosystem and breaking compatibility -- the ecosystem's prime directive -- and all that while Java was available under an open-source license, which Google rejected. They don't care about the principles of copyright. They just happened to use that in their arsenal of legal tools to go after a company that they believed had harmed them.

Then, Google's PR tried to terrify the world with the possible repercussions of the one argument that did stick by blowing it out of proportion and neglecting to mention that it hardly applies to any software these days.

Neither of these companies is a saint in this story. They are just two corporations playing their parts. But Google was the one who could have stopped all this, and blaming the whole thing on Oracle is just ridiculous.


Frankly, one doesn't need to know anything about the case to see that the idea that Google would balk at a single payment of $25M to ensure that Android was free from claims of infringement is clearly nonsense. They spend more than that per year on food for their employees.

Of course, that's not true, and you know it's not true, since you replied earlier this year to someone who explained that the alleged offer was $20M + 10% of Android's revenue (capped at $25M) for a three year license - and this was in 2006, years before Android had even launched. FSM knows what Oracle would have demanded after Android was a proven success and Google had caved earlier.

Of course, one may still think Google should have paid whatever Sun/Oracle asked for, but let's not spread misinformation.


" Just because Sun didn't have patent suits in our genetic code doesn't mean we didn't feel wronged. While I have differences with Oracle, in this case they are in the right. Google totally slimed Sun. We were all really disturbed, even Jonathan: he just decided to put on a happy face and tried to turn lemons into lemonade, which annoyed a lot of folks at Sun. "

-- James Gosling, http://nighthacks.com/roller/jag/entry/my_attitude_on_oracle...


Err, if he really felt that way, why did he join Google?


No idea, but he didn't stay there long.

If you check his blog, he has other posts about the whole issue.


OK, then. Google was free to fork Java under its open source license and pay nothing. I'm sure they had their reasons not to, but painting Oracle as the only bad guy in this story is preposterous. Those are two "evil" corporations caring for nothing but their own interests. There are no saints in this story and no one who has "developers' best interests" in mind.


Not really. They were free to fork Java for desktops, but not for embedded devices. Java ME was a major revenue stream for Sun, and when that dried up Oracle decided to shake down Google.

So, what do you think the SSO (structure, sequence, and organization) of 37 Java APIs is worth?


That's just not true. Java is 100% open-source with 0 additional restrictions (extra restrictions are strictly prohibited by the GPL). All restrictions on mobile use apply when you choose to not use the open-source license.


>The Google lawsuit has brought the contradictions out in the open. Java is controlled by a single corporate entity, in much the same way that .NET is. Yes, you can make a GPL clone of JDK 6 that runs on desktops and servers. That's nice. But if you want to run Java on tablets, smartphones, or embedded devices, you'll pay. If you want to use a version of Java that is not “substantially derived from OpenJDK Code”, you'll have to negotiate a TCK license, which Oracle may or may not grant. https://weblogs.java.net/blog/cayhorstmann/archive/2010/09/0...

Also, since all Oracle has left is the SSO claim over 37 Java APIs, what do you think that's worth? And do you really support copyright on API SSO?


> But if you want to run Java on tablets, smartphones, or embedded devices, you'll pay.

WRONG! OpenJDK is 100% open source with absolutely no restrictions whatsoever[1]. Restrictions apply to other Java licensing methods, some of them may also be free as in beer but not as in speech.

> And do you really support the copyright of API SSO?

I think there are valid arguments either way (e.g., a book's TOC is copyrighted), but the use of an API for implementing a compatible version should be allowed regardless of the copyrightability issue.

But whatever your opinion is, blaming Oracle for this is hypocritical. Their goal wasn't to copyright their APIs. Their goal was to get Google to pay for Android, and they used whatever tool they could think of in their legal arsenal. Any other company would have done the same. And keep in mind -- they didn't do it on principle. Google directly attacked one of their major revenue streams -- licensing (under a non-GPL license) of Java for mobile devices. What Google did was highly unusual. I'm unaware of any such event in the past two decades or maybe even ever. It was Google who provoked Sun to sue them, and now they're trying to gain sympathy by crying over the weapon that happened to hit. But they knew Sun (or Oracle) would come after them, they started this, and now they want people to blame Oracle for any collateral damage.

The collateral damage, BTW, is quite small (although Google is also trying to inflate that to gain sympathy). The copyrightability ruling applies only to "classical" software APIs; not protocols, not REST APIs, etc. There is no way, no how, to interpret the ruling to apply to anything other than this kind of API (copyright cannot apply to anything that isn't fixed). Implementation of an API by software that does not comply with the API's license is rare. In addition, implementation of an API for the purpose of providing a compatible, competing product may still be fair use. But that, too, is not what Google did. Android does not comply with a single one of Java's editions, so it was certainly not Google's intention to create a competing implementation that would allow Java programs to run unchanged.

[1]: http://programmers.stackexchange.com/a/214724


You seem confident of the licensing terms. But, you really have no facts to back up your claims so it's entirely speculation and most likely very incorrect. To suggest Sun wanted a measly 25 million for a perpetual license was silly. Sun wanted a piece of the pie and there was no way Google was ever going to let that happen. Oracle failed to assert their VM patents and so their copyright claim on the SSO of 37 Java API's was their last desperate hope of clawing any compensation.

Android is the reason Java is the #1 programming language. I hope Google transitions to another language and leaves Java in the dust. Google is responsible for making Java relevant in the consumer space and Google is also going to be responsible for making it irrelevant thanks to Oracle.


> Sun wanted a piece of the pie and there was no way Google was ever going to let that happen

Or Google could have complied with the open-source license, as they did with Linux. But there was no way Google was ever going to let that happen.

> Android is the reason Java is the #1 programming language

Android hardly accounts for a few percentage points.


>Or Google could have complied with the open-source license, as they did with Linux. But there was no way Google was ever going to let that happen.

How exactly did Google not comply? Are you suggesting that if Google had released the Android bits under the GPL license and not the Apache license they would have been compliant and Oracle would not have sued? I highly doubt that.

>Android hardly accounts for a few percentage points.

Java would be in a constant annual decline had it not been for Google and Android. Just as Objective-C has sunk dramatically in the TIOBE index we'll see the same result when Google switches to Go, Dart or perhaps even Python.


> Are you suggesting that if Google had released the Android bits under the GPL license and not the Apache license they would have been compliant

Of course! OpenJDK is licensed to the entire world under the GPL, and anyone is explicitly permitted to do whatever the hell they please with it.

OpenJDK is probably the world's second-largest open-source project (after the Linux kernel), and possibly the world's most active. OTOH, with the rare exceptions of Android (that has significant parts forked from Apache Harmony) and Go, Google has made scant contributions to open-source in general. Netflix, Yahoo, LinkedIn, Twitter and Facebook are all much bigger open-source contributors than Google. True, Oracle is not a big fan of the ton of open-sourceness it inherited from Sun, and has even made the terrible decision to stop contributing to OpenSolaris, but painting Google as a champion of the developer community and a defender of OSS is ridiculous. I have no love for Oracle, and even less love for Google, but in this story Oracle weren't the bad guys.

> Java would be in a constant annual decline had it not been for Google and Android.

That is completely unfounded. Even within Google (one of the world's biggest Java shops), Android is not one of the biggest Java codebases. Objective-C is a client-side language. Java is first and foremost used on the server. Client-side Java (including Android, which isn't really Java) is a very small part of the Java ecosystem.


>Android is the reason Java is the #1 programming language.

Hardly. Java had been #1 for nearly a whole decade before Android even appeared.


No it hasn't. According to TIOBE it was down to 13.5% in Oct 2014.


Which is totally irrelevant -- not to mention reinforcing my argument that Android is not what made Java the #1 language.

Java has been #1 from at least 2002 (and even earlier, long before Android) until now according to TIOBE -- the fact that it had a dip to #2 in 2014 doesn't change anything substantial about what I said, which was: "Java had been #1 for nearly a whole decade before Android even appeared."

http://www.tiobe.com/index.php/content/paperinfo/tpci/index....


You stated Java was #1 according to TIOBE a decade before Android was released. Android was released in 2008. Looking at the data it's clear your statement is incorrect. Furthermore, according to the TIOBE long term index, Java was #2 in 2005 and #3 in 2000.


>You stated Java was #1 according to TIOBE a decade before Android was released.

No, I wrote (quote) "nearly a whole decade before Android was released".

Besides, you couldn't have missed the point more.

First, the whole thing under discussion is whether Java became #1 due to Android or not (and it didn't: it was #1 way before Android, and has kept that position until now, for over a decade).

Second, TIOBE is indicative, not absolute. A language that's #1 with 25% in early 2014 doesn't suddenly drop 5-10 points to go below another for a few months -- that's just a momentary trend in searches, repos and the other things that TIOBE looks at, not some indication that millions abandoned Java and adopted C for six months and then came back.


Don't like Java? Try COBOL.

Then you will like Java.


tl;dr: try elm


For those who don't want to google it: http://elm-lang.org/


A conference in Crested Butte, Colorado in February is kind of a bad idea. I lived up there one winter and it is very very cold and covered in snow and ice. It's the kind of place/season where you pretty much need special clothes to walk a block.

Full disclosure: I grew up in SF and didn't really believe in seasons until that year I spent in CO, so take this comment with a grain of salt, or better yet a big bag of that road salt they use to melt the ice on the roads, and a four-wheel-drive vehicle with a working heater, and winter underwear, and a warm preferably-goose-down jacket, and boots and "smart wool" socks, and a hat and mittens. Seriously though, unless you like to ski, or you're just really into ice and being cold and you can't get to Antarctica, going to Crested Butte in winter is nuts. Nice town. Freezing cold.



