I find it hard to believe any company, unless you were applying to Oracle to work on the JRE, would care what your favorite programming language was.
I don't work at Google, but having had a lot of other technical interviews, I'm almost certain that was a primer question in order to:
1) Get you to talk in a relaxed fashion, to calm your nerves. You get to talk about something you already know.
2) Gauge your "passion level"
3) See how you think by probing why you picked that language.
Assuming you're right though, and it's because you said you love Arc and they decided to dismiss you, why exactly do you assume it was C++ you were supposed to answer? Did they specifically tell you "Sorry, the correct answer was C++"?
I'm really going to go with the grandparent and assume your answer to the "What is your favorite programming language" turned out to be so obnoxious that they rejected you based on personality. This is honestly something you can work on though.
No, it was clear enough that they wanted C++. I said PL/I. I wasn't arrogant about it at all. I doubt that the recruiter knew anything about PL/I or its pros and cons. So, I didn't get into a description of the pros and cons or get near any religious battles about programming languages.
In fact, PL/I has a lot of really nice design features missing from all the other programming languages popular now. Much or all of Multics was written in it. The Prime operating system Primos, much like Multics, was in part written in PL/I.
Why missing? C came forward in the 1970s because Bell Labs designed it for Unix for word whacking and wanted everything to run on an 8 KB machine or some such. At the time, IBM's PL/I ran fine on a 128 KB 360 Model 40, but then 128 KB was in every sense a LOT bigger than 8 KB. Also, at the time, writing a PL/I compiler was considered expensive, say, $40 million or more. That later PL/I got handled for quite modest funding was a surprise.
Due to anti-trust issues, Bell couldn't sell Unix so essentially just gave it away. Many universities got DEC computers and ran Unix. So, a lot of students learned C. C has a lot of problems with some traditional solutions given pointers, 'structures' of some kind, dynamic memory allocation, and 'entry' variables. Then C++ was just a pre-processor to C to make more definite these traditional solutions. Alas, both the syntax and the semantics of C++ are a mess.
C, and still C++, were, in a word, cheap. When Bell did C, and then C++, it was considered that implementing anything like PL/I would be far too expensive. PL/I has a much better collection of lessons for progress than C does. That C got so popular and PL/I was largely forgotten was a sad day for practical computing. Object oriented programming? It's easy enough with PL/I as it is, and, really, in part or whole, long was popular before C++. But that object oriented programming got to be mostly C++ built as a pre-processor to C instead of drawing from the lessons of PL/I (and more, e.g., Algol) was sad.
Net, for me, PL/I is much, much better than C and, still, even if you want to use 'objects', better than C++.
But I didn't go into any of this with the recruiter at all. Such a description would have been considered too long and arrogant.
Net, Google is bending their arm all out of shape patting themselves on their back telling themselves that they are eager to hire Michelangelo to paint their ceiling but are using at best house painters to do the recruiting. There is a wide range between house painters and Michelangelo that Google doesn't know how to recruit. It won't work, and it doesn't work.
Google is violating a simple rule in technical recruiting: Under no circumstances should anyone in 'recruiting' or HR have any technical communications at all with a candidate. None. Zip, zilch, zero. The recruiting and HR people can schedule phone calls and visits, help with coffee, tea, water or soft drinks, explain where the restroom is, hand out the benefits packet, smile, be nice, ask what they can do to help, help with plane and hotel reservations, help with car rental, make getting reimbursed easy, etc. But technical? NEVER!
Any technical communications have to be limited to the management chain and, really, some other processes.
There is a fundamental problem: The need and the goal is to hire people who know things that some or all of the company so far does not know, or who have capabilities the company does not have. So, that broad idea that the company will look down and 'examine' the candidate on material the company does not understand is fundamentally hopeless. Can't work. There are ways to select experts, but anything like the Google process is hopeless.
In particular, that book as "preparation" is an insult to any employee who would bring something new to Google.
Google believes that they are high up and looking down. In market value, at $195 billion, they are. Technically, especially in their recruiting, they are not.
This thread is not about me; it's about Google's recruiting. My experience is relevant only as a source of data I do have about their recruiting. Again, it's Google's recruiting, not me.
Since you like name-dropping credentials in relevant areas, I do minimal research in PL and I can say that PL/I is objectively far, far worse than C/C++. Why? Well, C and C++ both have huge flaws. Huge flaws -- no one who has ever programmed anything nontrivial in these languages would argue otherwise. So why are they better than PL/I?
PL/I has just as many flaws, if not more. Let's look at some of them.
1. PL/I compilers are awful. The number of people working on PL/I is a fraction of the people working on C/C++. C/C++ are getting faster, more correct, and more succinct every day. One of the most promising techniques for PL/I is compiling it into C and then calling gcc, because the PL/I compilers are so bad. (And if you don't think build times are a factor in compiler construction you are sorely mistaken).
2. No one knows PL/I. This one's pretty easy. PL/I is write once, read once. C++ is write once, read many.
3. PL/I provides no compensation for its flaws by providing higher-level transformations and PL features (e.g. true higher-order functions, strong type safety, garbage collection). The cost of switching to PL/I is actually made worse by the fact that PL/I is at best slightly better to program in than C++.
4. PL/I build tools and deployment tools suck. Who cares if you can run it in a hundred processors on a mainframe? How is that useful in scaling to millions of people every second?
5. Where's the support for interoperability with other languages? C++ can be used with other languages through well-known and well-maintained paths. PL/I has no support.
Basically, you may have a solid theoretical background but your practical experience in modern PL engineering is sorely limited.
First you are angry. You are not so much attacking PL/I as attacking me personally.
Second you are pursuing religious arguments about programming languages.
Third you are arguing things about PL/I that are really not part of the language.
Fourth much of what you say about PL/I is technically wrong.
"PL/I compilers are awful".
What PL/I compilers are awful? As far as I know, there aren't very many, and the common ones, essentially all from IBM, are highly polished.
Compared with C/C++, the polished PL/I compilers are terrific because, with the language features, they actually compile the work instead of just calling functions. In fact, the early versions of PL/I did string, etc. manipulations by having the compiler call run-time functions; the result was much like what C/C++ programmers are forced to do.
The compiling of functions for string and bit manipulation was done in part to have PL/I be faster than the then common practice of using functions for such things in Fortran. Net, for string manipulation, PL/I is faster than Fortran, C, and C++ because it actually compiles the work and avoids the overhead of function calls. Here C/C++ are behind and have no easy way to catch up.
For 2, how many people know PL/I is not a fact about the language itself. And as for it being my favorite, I know it!
That it's my 'favorite' doesn't mean that I suggest that others use it. I used the IBM PL/I on OS/2 a few times; I have the IBM PL/I for Windows but don't even have it installed. On Windows I use Visual Basic .NET, if only because it has such good access to .NET, ADO.NET, and ASP.NET. In many ways, I would prefer PL/I, but it is not a practical option.
PL/I has some features that were deliberately included to make learning it relatively easy. E.g., PL/I has no reserved words! That is, all the 'key' words in the language can be used by programmers for their own identifier names. So, a beginner doesn't have to worry about using a reserved word. I taught some elementary parts of PL/I to some not very good students in the business school at Georgetown University, and they learned fine.
For 3, the only serious problem with 'type safety' is for pointers. True, in PL/I, pointers do not have 'types' based on what they point to. That is, any pointer can point to anything. But, then, using pointers in PL/I is not nearly as necessary or common as in C/C++, is quite advanced, and is not common. I liked using pointers because one can really work with the memory and, at times, write some 'polymorphic' functions. Tricky work with pointers is always tricky, and that the pointers in PL/I are not 'strongly typed' didn't make the work harder. Again, PL/I is not nearly as dependent on pointers as C; a C programmer is forced to use pointers frequently, and a PL/I programmer can do fine using pointers only rarely.
Otherwise on types, PL/I took the attitude that converting a string to floating point, etc., should need just an assignment statement. For execution time, there is a warning from the compiler when such a conversion might be expensive.
But PL/I is far better than 'cast': Cast is just an override of the 'strong typing', that is, immunity from the strong typing police. The problem with cast is that it's super tough to find HOW THE HECK the conversion is done. Mostly I don't much care about pleasing the strong typing police, but I do usually very much care about the details of how the conversion is done. Right from the start, the IBM PL/I documentation was fully explicit on the conversions of all the pairs of the 'elementary' (no aggregates) data types -- good.
For garbage collection, where is that in C/C++? There is garbage collection in Visual Basic .NET, but it was not easy to implement. There is always some question whether garbage collection should be implemented by the run-time. The way I'm depending on garbage collection in Visual Basic is quite similar to how I depended on automatic storage in PL/I, and PL/I automatic storage is MUCH more efficient to implement than garbage collection. Here the excellent, and advanced, scope of names in PL/I is a big help.
PL/I does do quite well on memory management, especially with its attribute 'automatic' which, for each 'task', has in effect a 'stack of dynamic descendancy' and allocates and frees automatic storage just as one would want across the quite advanced scope of name rules.
This automatic storage works great with the PL/I 'structures' where a 'structure' is a list of elementary data types, arrays, or arrays of structures, all efficiently mapped to sequential storage in a clever, easy to understand way. The 'extents' (string maximum lengths and array bounds) need only be known when the structure is to be allocated. So, in particular, the extents can be passed as arguments to routine parameters. Or can do a calculation, enter a Begin-End block and do the automatic allocation inside that block. Works great. That one can't do such things in C, especially for arrays, not even array parameters, is one of the most serious failings of C and, thus, C++. To get around the problem, end up using C structures or C++ classes, both of which are much less efficient. PL/I structures are about as efficient as Fortran arrays, depending on what is being done, a little more or a little less.
Then scope of names and dynamic descendancy are well coordinated with exceptional condition handling so that the code that executes in response to an exceptional condition can be 'on the stack' several levels back and, then, if it wishes, do a 'non-local goto' to pop the stack back to its own level. In this way, all the relevant automatic storage gets freed and lots of memory leaks get avoided.
The problem here with C/C++ is that a C programmer, and, thus, also the pre-processor definition of C++, do not have enough access to memory management to do such good things.
Moreover, C and C++ have nothing in the language about 'tasks' or threads, and PL/I does, did from the beginning. In particular, the storage attribute 'controlled' (roughly like malloc and free) is 'task-relative', that is, goes away when the task does. Files are also task-relative and get closed when the task goes away. Nice.
4. For the 'build tools', never had a problem. I was in the group at Yorktown Heights that did the artificial intelligence language KnowledgeTool (KT) which was a pre-processor to PL/I, and we did a lot of building but had no problems with 'build tools'. I was the guy who used dynamic descendancy in a tricky way to make the 'rule subroutines' in KT simple and efficient and won an award for the work.
Of course, for a scripting language we had Mike Cowlishaw's Rexx, and for an editor we had XEDIT with its macro language, right, the same Rexx. Rexx is a great candidate for the most elegant scripting, macro language going. Rexx was the main reason we had no problems with build tools.
Actually, can claim that for some years Rexx, with a few extensions for some lower level OS access, basically 'ran' all of IBM: There were about 4000 mainframes around the world connected with simple bisync lines. In the end, it all looked much like the Internet today. So, the mainframes were acting as both the servers and the routers. The hard work of the routing, security, etc. was done with 'server virtual machines' programmed mostly in Rexx with a few routines for some lower level access. It worked surprisingly well. Rexx was no toy.
"5. Where's the support for interoperability with other languages? C++ can be used with other languages through well-known and well-maintained paths. PL/I has no support."
A lot of nonsense. On IBM, PL/I used standard OS calling sequences. Calling Fortran, Cobol, assembler, and C was routine. I wrote a collection of routines in PL/I to call C to call the TCP/IP routines. Occasionally I called assembler from PL/I.
Gotta tell you, calling Visual Basic .NET 'managed code' from C won't be a picnic! In some of my current project, at one point I call some C from Visual Basic .NET managed code, and the effort was fairly simple. On Windows, calling one language from another is okay as long as they are both 'managed code'. Otherwise, on Windows or nearly anything else, calling one language from another is at least a little tricky and, in general, tough.
You are making four big mistakes:
First you are angry. You are not so much attacking PL/I as attacking me personally.
Nope. Besides noting that you have very little knowledge in PL research at the end, I made no comments about you personally.
Second you are pursuing religious arguments about programming languages.
Nope, I specifically mentioned practical things that matter.
Third you are arguing things about PL/I that are really not part of the language.
If you're talking about things like, "no one knows PL/I," so what? This is entirely relevant from an engineering standpoint. Just because it's not "really part of the language" is irrelevant because we're not debating whether or not PL/I is better than C/C++ in the 1950s in a perfect world, we're talking about practical engineering right now on real systems.
Fourth much of what you say about PL/I is technically wrong.
Not a one.
What PL/I compilers are awful? As far as I know, there aren't very many, and the common ones, essentially all from IBM, are highly polished.
Only if you're using compiler benchmarks from the 90's. I'm looking for things like packrat parsing and partial and incremental linking. Auto SSE/SIMD would be nice. If you're so convinced that IBM PL/I is good enough, why don't you benchmark it against GCC (on Intel ICC often outperforms GCC but I'm confident even GCC will far outperform PL/I).
That it's my 'favorite' doesn't mean that I suggest that others use it. I used the IBM PL/I on OS/2 a few times; I have the IBM PL/I for Windows but don't even have it installed. On Windows I use Visual Basic .NET, if only because it has such good access to .NET, ADO.NET, and ASP.NET. In many ways, I would prefer PL/I, but it is not a practical option.
My entire post was about how it's not a practical option and how you shouldn't be surprised when no one wants to use PL/I in production...
PL/I has some features that were deliberately included to make learning it relatively easy. E.g., PL/I has no reserved words! That is, all the 'key' words in the language can be used by programmers for their own identifier names. So, a beginner doesn't have to worry about using a reserved word. I taught some elementary parts of PL/I to some not very good students in the business school at Georgetown University, and they learned fine.
This is unimportant. Arguably it's worse than having reserved words because it allows for inadvertent shadowing, but really it's just a back-and-forth thing that no one cares about.
For 3, the only serious problem with 'type safety' is for pointers. True, in PL/I, pointers do not have 'types' based on what they point to. That is, any pointer can point to anything. But, then, using pointers in PL/I is not nearly as necessary or common as in C/C++, is quite advanced, and is not common. I liked using pointers because one can really work with the memory and, at times, write some 'polymorphic' functions. Tricky work with pointers is always tricky, and that the pointers in PL/I are not 'strongly typed' didn't make the work harder. Again, PL/I is not nearly as dependent on pointers as C; a C programmer is forced to use pointers frequently, and a PL/I programmer can do fine using pointers only rarely.
You completely misunderstood -- C and C++ are the standards. If you want people to use something not-standard you need to provide something which is far better in your own language than in the standards in order to compel people to switch. Being a little bit better isn't good enough, you need to be a lot better. Strong typing, garbage collection, and higher-order functions are all things which suck in C/C++. If you want people to adopt your language you should offer full support for these things because that gives you a compelling reason to switch.
Otherwise on types, PL/I took the attitude...
Read more about strong typing and read about type or category theory (preferably both). C and Java types are not strong typing, they're typing done in probably the worst possible way. Learn ML.
Or can do a calculation, enter a Begin-End block and do the automatic allocation inside that block.
Welcome to RAII, circa 2000.
To get around the problem, end up using C structures or C++ classes, both of which are much less efficient.
Only in C or C++ compilers from 1995...
Actually, can claim that for some years Rexx, with a few extensions for some lower level OS access, basically 'ran' all of IBM: There were about 4000 mainframes around the world connected with simple bisync lines. In the end, it all looked much like the Internet today. So, the mainframes were acting as both the servers and the routers. The hard work of the routing, security, etc. was done with 'server virtual machines' programmed mostly in Rexx with a few routines for some lower level access. It worked surprisingly well. Rexx was no toy.
It really is. IBM hasn't even come close to building what would be termed a modern distributed system infrastructure. There's no PL/I equivalent for MapReduce, BigTable, GFS, etc.
A lot of nonsense. On IBM, PL/I used standard OS calling sequences. Calling Fortran, Cobol, assembler, and C was routine. I wrote a collection of routines in PL/I to call C to call the TCP/IP routines. Occasionally I called assembler from PL/I.
OS calling sequences are the bare-minimum. If I wanted to optimize my Python code by dropping down and rewriting the code in a systems language, C has my back. Good luck with PL/I.
You seem to be using a lot of examples from HPC but they're not really relevant. HPC is a pretty easy target because you get to make a lot of assumptions about the DS you're architecting and the software that will be run on it. PL/I is just a dead language -- there's no reason to ever switch to it and while it may have been slightly better than C++ during IBM's heyday, the world has moved on. Even scientific computing prefers parallel Fortran, which, as of the latest version, is incredibly fast.
PL/I was designed in IBM by a committee headed by George Radin in about 1963. First versions were running by 1966. Version 4 was running by 1969 and quite clean. There have been later versions. By the time IBM slowed maintaining the language, it was polished. Finally there was just one guy in CA maintaining PL/I. I suggested adding AVL trees (see Knuth, TAOCP).
The main intention of PL/I was to serve people using any or all of Cobol, Fortran, and assembler (at least for applications programming).
Again, versions of PL/I have been used for system programming in at least Multics and Primos.
It has been noted that the 1960s were "the golden age of programming language design". In comparison, progress for the next several decades was disappointing.
C was designed at Bell Labs in the 1970s. Likely the designers of C knew PL/I if only because they borrowed the semi-colon to end a statement and the syntax of comments.
C was supposed to be a minimal language, as simple as possible, to work on an 8 K DEC machine. The clever part of C was that while it had so little, with pointers, structures, malloc, and free, it still had enough for system programming. Also, since C was so simple, it needed no 'run-time' and could be used in embedded code in read only memory.
C++ was just a pre-processor to C to formalize some of the then standard ways to use C to make it livable for more complicated applications programming.
So, PL/I started off with much more than C. Since neither PL/I nor C can or will change, PL/I is still far ahead of C.
The pre-processor C++ is a bit ugly. Tough to say that C++, as just a pre-processor, is much better than PL/I.
So, net, when Google asked me what my favorite programming language was, I said PL/I instead of the answer they wanted, C++.
I did not say that PL/I was the ultimate programming language, the end of programming language design, a programming language for 10,000 machines with 10 processors each with 1000 cores, etc. I didn't say that people should convert to PL/I. But, then, I would feel sorry for anyone to start a new project with C or C++.
I had my fling with programming language design with KnowledgeTool and a subsequent project. At the time I looked at the literature of programming language design and was not impressed. To me the literature looked like it was just rehashing old ideas back to Algol, looked like 'research' in cooking that was just remixing Escoffier's collection of sauces.
If since then the design of programming languages, compilers, operating systems, etc. have made progress, then about time and good.
For what is 'practical' now, I've voted with my feet: At present, I'm concentrating on my project in applications programming and am using essentially just Visual Basic .NET, ADO.NET for getting to SQL Server, ASP.NET for Web pages, and .NET for some of the other functionality it has, e.g., a lot in time and date manipulation.
That Microsoft went for their common language runtime (CLR), 'managed code', 'garbage collection' (with memory 'compactification'), common 'intermediate language', invited others to write their own syntactic sugar on top of these, and provided the syntactic sugar C# and Visual Basic strikes me as good for now.
For my project, I decided to stand on Microsoft Windows instead of flavors of Unix. On Microsoft, I went with Visual Basic (VB) .NET. So far Windows and VB .NET have been as promised. I miss some of the features of PL/I, but the missing features don't keep me from getting my work done.
The VB .NET compiler has been terrific: It compiles my programs in what is, in practice for me, effectively 0 seconds. I will never type in enough code to slow down that compiler. The error messages, including in the context of ASP.NET, have been quite nice. I've found no bugs at all. The compiler has been easy to use just from command lines driven by some simple ObjectRexx scripts. I'm thrilled.
The main problem I have in my software development is some of Microsoft's documentation: (1) It is horrendous, thousands of Web pages, and, thus, tough to work with, even when it is good. (2) For SQL Server, especially management and administration, especially the 'security model', and a lot more in Windows, the documentation was awful in ways that have cost me unbelievable time and effort: Things didn't work anything like promised; I had to mount side projects to diagnose the problems and work around them; I had to write my own documentation, develop my own scripts to lock in the solutions, etc., to get around the nonsense and get back to my work. It was horrendously expensive.
But, just for a programming language that is practical for applications programming now, I selected VB .NET. I certainly didn't try to continue with PL/I.
All this started just because Google asked me what my favorite programming language was and I said PL/I instead of C++ like they wanted. My answer was fine. The HR-recruiter rube didn't see PL/I on their list of acceptable answers and ended the interview. That was Google's error, not mine.
While the history is an interesting aside (I exaggerated my dates because they weren't particularly relevant), I was specifically speaking to modern implementations of the languages. Your history of C++ is correct but the modern version looks completely different.
I had my fling with programming language design with KnowledgeTool and a subsequent project. At the time I looked at the literature of programming language design and was not impressed. To me the literature looked like it was just rehashing old ideas back to Algol, looked like 'research' in cooking that was just remixing Escoffier's collection of sauces.
I completely agree that PL spent (and is spending) a huge amount of time rehashing old ideas but I think all the effort spent on the Fortran legacy (including C, C++, Algol, and PL/I) was a waste of time -- it took almost 20 years to get around to rehashing all the old ideas in LISP, which was a much better idea anyway.
As far as the Microsoft stack goes, I see no problem. My own experience with the internals of SQL Server makes me extremely reticent about using it myself but my time spent with the MS VC++ and CLR teams has made me very impressed with the entire .NET stack.
All this started just because Google asked me what my favorite programming language was and I said PL/I instead of C++ like they wanted. My answer was fine. The HR-recruiter rube didn't see PL/I on their list of acceptable answers and ended the interview. That was Google's error, not mine.
If that is actually what happened then I agree. That said, I think that the question of "what's your favorite programming language" is a pretty stupid one anyway and my answers of Haskell, ML, or Prolog wouldn't have been on the recruiter's list either.