Every time I read a "scathing" attack on C/C++ for being the worst language the author ever heard of, I hear Jack Sparrow's reply in my head "Ahh... but you have heard of me".
Somehow, despite the litany of crippling deficiencies, C/C++ have managed to be the foundation of every piece of technology a consumer or a computing professional will ever touch in their daily life. But yeah, other than that, we should all be using... oh, I don't know, Scala or Haskell or something like that.
Your point is correct, but it seems like a defensive stance. Rather than debating the technical facets of the article, you fall back on the ubiquity of C++ projects as an overwhelmingly positive aspect. For what it's worth, people say the same thing about PHP; how do you feel about that language?
It's hard reading vitriolic attacks on things you enjoy. There was a talk given by a Rubyist on why he thought Scala sucked, and I found it very irritating (and incorrect). So, that being said, I don't want to dismiss people feeling this way when the subject is broached. It would be nice to find a way to raise these points without ruffling feathers.
...but I think it should still be talked about. Times change, we learn lessons, life moves on. In spite of this, our most-used software is written in a language that is decades old. C++ is not a mathematical theorem... there is no universal truth to it that gives it a timeless quality. It's very common in the medical profession to rapidly change treatment methods: although a majority of the underlying knowledge is static, the actual techniques and tools change as better ones are discovered. In the realm of programming, can't we at least have the discussion?
I'm not advocating totalitarian FP. What I would love is for people to just take some of the biggest lessons learned and apply them. Hell, if I could just get immutability by default (we all know untamed mutability is dangerous), no untyped null (Hoare's billion-dollar mistake), and truly useful first-class functions (with everything that implies), I'd say that's enough to mollify me for the present. Thinking way, way, way forward, I would like to see people really studying and rethinking programming in general. Just because this is how we are doing this now doesn't mean it's the best way. "Normality" is sort of an accident - I'd love to see a more studied approach.
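For what it's worth, each of those three lessons can be approximated even in today's C++ with enough discipline; a minimal sketch (all names here are mine, invented for illustration):

```cpp
#include <algorithm>
#include <optional>
#include <string>
#include <vector>

// Immutability by default: declare everything const unless it must change.
const std::vector<int> prices{3, 1, 4, 1, 5};

// No untyped null: std::optional forces the caller to handle absence.
std::optional<std::string> find_user(int id) {
    if (id == 42) return "deep_thought";
    return std::nullopt;   // explicit, typed "no value"
}

// First-class functions: a lambda passed to a standard algorithm.
int count_above(const std::vector<int>& xs, int threshold) {
    return static_cast<int>(
        std::count_if(xs.begin(), xs.end(),
                      [threshold](int x) { return x > threshold; }));
}
```

The point stands, though: none of these are defaults, so you only get them if every programmer on the team opts in, every time.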
> Rather than debating the technical facets of the article, you fall back on the ubiquity of C++ projects as an overwhelmingly positive aspect. For what it's worth, people say the same thing about PHP; how do you feel about that language?
As Stroustrup fittingly said: "There are only two kinds of languages: the ones people complain about and the ones nobody uses."
How can you know the downsides and warts of something you haven't used in a wide range of applications? Are you honestly saying there won't be any? Granted, an old, proven language will have far more warts than a fresh one that's still idealistic. Fixing old warts introduces new ones etc.
The point I was making (somewhat sarcastically) is that the claims of deficiencies in the C/C++ family of languages, such as they are, are way overblown. There are lots of problems with the model, of course. The author is not the first person to express his utter dismay at the inelegance and crudeness (as they see it) of this model, and won't be the last. They have seen the light. How can the others be so blind, etc.
Meanwhile in the real world, every single computing device, billions of them, are working non-stop, days and weeks and months on end, on a foundation of millions and millions of lines of imperative code (mostly a combination of C/C++/java/ObjectiveC ). If this programming model was so horribly broken, so inadequate, so crude, how could mere humans have created such an enormously complex technological edifice on top of such shaky foundations?
> ...how could mere humans have created such an enormously complex technological edifice on top of such shaky foundations?
Mere humans have split the atom, mapped the human genome, walked on the surface of the moon, transplanted organs, achieved flight, cloned a sheep, and invented apple pie. I'd say next to these accomplishments, the act of writing good software using tools that make some aspects of the process difficult is hardly an achievement.
I agree that hysterics claiming we are a hair away from a total imperative meltdown are ridiculous. However, I think that "make some aspects of the process difficult" is a reasonable assessment of imperative, low-level programming. Can you tell me you've never been bitten by an unexpected null? Or, that you've never tired of having to write a class that's essentially a wrapper around a List/Map because the interface is not convenient or fluent?
I'm not suggesting we get a mob together and overthrow the tyrannical imperative government, leaving corpses dangling from the gallows as a warning to the next person who wants to mutate an input parameter. I just want the discussion to move away from "C++ is fine, quit your bitching" - this is totally unproductive.
If medical doctors thought like this, laparoscopic surgery would never have been taken seriously, since conventional surgery was totally adequate. "I mean, what's the problem? Yeah, being sliced open creates very long recovery times and is more likely to lead to infection, but that's just something we have to deal with. An incalculable number of lives have been saved through conventional surgery, so I really don't think we need to consider new methods."
I like your approach. We do need discussions on the pros and cons of languages rather than being so religious or "conservative" about them.
From what I understand, different tools are suited for different tasks. C is even more dangerous than C++, but I am sure everyone admits it is great for high-performance computing. Dangerous as it is, it would slow programmer productivity a HELL of a lot if you tried to take care of all the possible bugs.
IMO, Java and the OOP paradigm are designed for the manageability of really large software. It's far easier to group related code together, delegate responsibilities to those classes, and use inheritance when you need to reuse old code. Not to mention it's designed to let you borrow code from other people and use it with ease. This is very helpful when working in large teams where people can work on separate parts that talk to each other.
PS: I apologize for my lack of vision. I have no experience with Ruby, Scala, or FP languages.
"every single computing device, billions of them, are working non-stop, days and weeks and months on end"
Working non-stop? No, they don't. Today's computers are filled with bugs because of bad implementations, which come from bad implementation languages and bad tooling.
The reason C++ is used is that backwards compatibility/legacy/history trumps everything else. Just look at MS Windows for proof of that. Or the x86 CPU architecture.
Those are claims that go against my personal experience (pretty much every device and server system I control has uptime in weeks or months). So unless you can actually show me some hard data to the contrary...
You on the other hand seem to be implying that a different programming model will result in a never crashing system at the scale of the current internet.
Remember that the Erlang runtime is itself ultimately written in C. I would have believed claims that Erlang "never crashes" if I hadn't had to personally debug mysterious hangs in the Erlang runtime (running RabbitMQ).
"C/C++ have managed to be the foundation of every piece of technology a consumer or a computer professional will ever touch in their daily life"
That has nothing to do with the technical merits of either language, and everything to do with a small number of early programmers' choice to use those languages -- which left us with a massive legacy codebase and an incentive to keep teaching people those languages. To put it another way, Unix is the reason C is popular and Windows is the reason C++ is popular. People who write software for Unix and Windows have compelling reasons to use C and C++, since every program ultimately needs to interact with the OS once in a while.
Every time someone brings up the crippling technical deficiencies of C and C++, someone responds with a point about C/C++ being popular. All that says is that technical merits are not necessarily a determining factor in language choice.
Well, there is also the difficulty that there has not been, save Ada, a serious and concerted effort to replace C and C++ for so many of the things that they are used for. Except for Ada, no other modern language is suitable for writing things like kernels, drivers, emulators, etc; except for Ada, basically all modern languages are dependent on garbage collection.
So if you want to complain about C, remember that the only serious alternative is and has been Ada, and ask yourself why you're not using Ada, why operating systems and window managers, mupen64plus and Wayland, et cetera don't use Ada, and the truth is: people like C more than Ada.
Ada is the anti-C++. It's the answer to every criticism. It's also basically unheard of outside of situations where people really, really need safety cf. airline industry. In fact most of the people who complain about C++ would never even consider Ada. The author of the article seems to be unaware it even exists, which says a lot.
The takeaway is that safety is overrated, consistency is overrated, reliability is overrated, because if we really cared about these things... we'd use Ada. And we don't, because "Ada is annoying!".
There is another attempt at it. That language has been in development for a few years and is almost ready. Rust combines the best features of C and functional programming, and through a very clever pointer system manages to eliminate data races, null pointers, and dangling pointers at compile time. It has a powerful type system and smart type inference. It's possible to have some memory be garbage collected, but by default it isn't, so it can be used for tasks that prohibit garbage collection. The compiler ensures that freed memory will never be accessed. It's built for modern parallel hardware and offers tons of concurrency features, like lightweight tasks, actors, async, etc. Its structs are compatible with C, and Rust can integrate seamlessly with C code and libraries. It compiles to native code using LLVM. It's backed by a major non-profit, Mozilla. It's a fantastic project, and it seems to have all the right cards to be the perfect alternative to C++.
Check it out. My only disappointment (and the reason why I'm not using it seriously at the moment) is that because the language is still in development, it changes extremely fast and 80% of the documentation online is wrong and won't compile. Even the official documentation contains a lot of outdated syntax. Hopefully that will change once the language reaches version 1.0, in the next 3-6 months.
I haven't tried Rust, but what about considering Python for embedded targets?
Is it that the language won't fit, or are we embedded people just too masochistic to find something easier?
You can use something like py2llvm to compile using the same system that Rust uses and get fairly good performance. I am working on this project actually.
Rust[1] is aimed squarely at this use case. The GC and runtime are optional. A toy kernel[2] and Linux kernel module[3] have already been written in it. The language is young and still unstable, but it's being designed by veteran C++ programmers with the goal of using it for a safer, faster, more maintainable successor[4] to the browser engine in Firefox. Worth checking out!
"Except for Ada, no other modern language is suitable for writing things like kernels, drivers"
I think the existence of Lisp machine OSes is a prominent counterexample to this claim. I understand your point, and I would not try using modern Lisp compilers to write an OS, but there is nothing about Lisp that makes it a bad language for low-level code.
> Well, there is also the difficulty that there has not been, save Ada, a serious and concerted effort to replace C and C++ for so many of the things that they are used for.
Except Ada is older than C++, so it cannot be an effort to replace it.
Around the time C was still UNIX only we also had Modula-2.
An advantage C and C++ have over many other languages is that they are part of the standard tooling from OS vendors.
The only way to get people to use other, safer systems languages, is when OS vendors push for them.
For example, Objective-C would be dead if it wasn't the only language to fully explore iOS and Mac OS X capabilities.
Sure you can write bindings in other languages, but Objective-C will remain the language to go to.
The same with any language that would be targeted to replace C or C++. Without OS vendor support, no chance.
> For example, Objective-C would be dead if it wasn't the only language to fully explore iOS and Mac OS X capabilities.
It's a symbiosis. Cocoa would be much worse and harder to program if it wasn't for Objective-C.
Objective-C gives a stable, object-oriented ABI without requiring a VM.
The Objective-C runtime is as powerful and dynamic as Python's or JavaScript's: you can replace any method of any instance of a class, even ones you haven't defined yourself, and Cocoa uses this for UI bindings.
I haven't said otherwise, just that had Apple not decided to push it, the language would be dead.
This was the reason behind the Cocoa Java support in the early versions, as Apple was not sure if mainstream developers were willing to pick up Objective-C, even with them pushing for it.
It is funny, from what I understand one of the old complaints about Ada was how verbose it was in comparison to C at the time.
Comparing modern Ada to modern C++, C++ has tried its best to equal Ada in verbosity, although C++11 fixed a good deal of C++03's excesses.
From what I can tell, Ada is still missing a good way to do really primitive bit operations in a simple manner. Then again, bit ops destroy Ada's type safety, so I can see why they were made difficult. Still, if one needs to start poking around the individual bits of arbitrary data, C makes that easy-ish. (I mean, it could be a lot better; it isn't hard to come up with a better syntax than C's for bit-wise access.)
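For comparison, this is the sort of bit poking C (and C++) makes easy-ish; a trivial, hypothetical sketch:

```cpp
#include <cstdint>

// Extract bits [hi..lo] of a 32-bit word: bread-and-butter code in
// drivers and emulators. Terse but cryptic, and nothing checks that
// the arguments are sane (assumes hi < 31 and hi >= lo).
inline std::uint32_t bits(std::uint32_t word, unsigned hi, unsigned lo) {
    return (word >> lo) & ((1u << (hi - lo + 1)) - 1u);
}

// Turn on bit n of a word.
inline std::uint32_t set_bit(std::uint32_t word, unsigned n) {
    return word | (1u << n);
}
```

For example, `bits(0xABCD, 11, 8)` pulls out the 0xB nibble. Nothing here respects any type boundaries, which is exactly why Ada makes the equivalent harder.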
But everything I have read about the latest Ada spec makes it seem like a nice language.
I also believe another problem is that Ada has always been a rather academic language in many ways. Whereas for the longest time no one in the C community really talked about pointer aliasing (and I'm guessing that if you took a random cross-section of programmers in the late 80s/90s, many of them wouldn't know what the term even meant), Ada requires you to know about pointer aliasing just to use pointers at all!
Then there is the fact that the language syntax is defined in BNF, and the main reference pointed to for Ada is the official reference manual. On one hand, hey awesome, the language spec is freely available for all! On the other hand, it is about as readable as any other official spec. The experts who can easily understand the spec end up just pointing to it (after all it is so easy for them to read) and thus simplified explanations don't get created.
> In fact most of the people who complain about C++ would never even consider Ada.
If I had tooling, and other people in my group had more familiarity with it, I sure as heck would. Holy crap am I sick and tired of being limited to straight C and a very tiny subset of C++. And 90% of the new stuff in C++ is not targeted at the embedded market (quite the opposite!), whereas Ada has stayed closer to its roots.
It's also difficult to find a compiler, programming tutorials, or a community for. I think there are a LOT of reasons programming languages are chosen by developers and that most of those reasons aren't all that rational or well informed.
Good compiler support, a wealth of documentation, an active community, and a number of well-maintained libraries are all rational and informed reasons to choose a programming language. Network effects matter.
> Unix is the reason C is popular and Windows is the reason C++ is popular.
I never understood why people say C++ is popular because of Windows.
C++ was already popular before Windows mattered.
My first C++ applications were targeted at MS-DOS, back in the days when most PC compatibles were still sold with 1 MB of RAM and almost no one had Windows installed.
Before Java was created, many of us were already doing C++ with CORBA in commercial UNIX systems.
Microsoft compilers only mattered to us around the time Windows NT started to enter the enterprise and the 32 bit version of Visual C++ was introduced.
C++ got popular due to its almost 100% C compatibility and as with C, UNIX also contributed to its spread in the enterprise, not Windows.
I am speaking from experience here. The only language I have used where this does not apply is F#, and only if you count .NET support.
I love CL, I never want to go back to C or C++, but the most annoying aspect is the lack of good system call support. Sure SBCL has a posix compatibility layer, but anything beyond basic system calls becomes a drag. In the worst case you wind up having to write a C function to wrap a system call just to expose an easier-to-use interface for the FFI. It is a complete mess, and it becomes a maintenance nightmare.
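For anyone who hasn't hit this: the wrapper in question usually looks something like the following (a hypothetical example; the function name and the choice of syscall are mine):

```cpp
#include <sys/stat.h>

// FFIs tend to cope far better with "string in, integer out" than with
// C functions that fill in a caller-supplied struct, so you end up
// hand-writing flattening shims like this, then keeping them in sync
// with the Lisp side forever.
extern "C" long file_mtime(const char* path) {
    struct stat st;
    if (::stat(path, &st) != 0) return -1;  // collapse errno into a sentinel
    return static_cast<long>(st.st_mtime);
}
```

Multiply this by every struct-returning syscall you need, and the maintenance-nightmare complaint above writes itself.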
The author clearly didn't say that C++ was the worst language he's ever heard of. He didn't necessarily even say C++ is a bad language. What I'm reading is more along the lines of "Ok sure, C++ was important at one point in time. But in 2013, we have lots of things to worry about that C++ doesn't help us with that other languages do help us with."
His conclusion is that C++ is obsolete and that the power of new hardware capabilities is being squandered because of it. I'd call that 'scathing.' I'll always be fond of C++, but I hope I never have to use it again.
That conclusion on squandering the power of new hardware is being backed by absolutely zero data and is obviously false. Just look around you and note how much software you depend on runs in a parallel fashion on multicore systems (using the SMP model of programming) and is written in imperative languages (mostly C and C++).
The only arguments presented by the author are his own subjective opinions of how bad various features of C++ are. There's merit in many of his arguments, but the effect of each problem is exaggerated way beyond its actual impact. For example, the bold, unhedged claim:
"Two words: data races. Imperative languages offer no protection against data races — maybe with the exception of D."
is a ridiculous claim to make right after his (mostly dismissive) nod to the memory model and synchronization primitives introduced in C++11. If it were that difficult to implement parallel programming in imperative languages, we'd have to scrap every OS kernel (which pretty much all scale to large numbers of cores today) and every hosted language environment (all of which are ultimately written in C/C++) and rewrite them in Haskell.
Careful readers (who read the references at the bottom of the article) will note the author clearly has an agenda having to do with more automated resource management in C++. For whatever reason, it has not found its way into the standard until now, and in his frustration he's lashing out at not just C++ but basically the entire installed base of computer technology (mostly built on top of imperative, manually memory-managed languages like C and C++).
It's worth re-reading the quote again: "Imperative languages offer no protection against data races"
...and this is true. An imperative language (in and of itself) has no protection against data races. That's not to say that imperative languages can't have code without data races. It just means that the language itself offers no protection.
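A minimal C++11 illustration of that point: a plain shared counter bumped from two threads is a data race (undefined behavior), and nothing in the language stops you from writing it; you have to reach for `std::atomic` or a mutex yourself.

```cpp
#include <atomic>
#include <thread>

// Two threads each bump a shared counter 100000 times. With a plain
// `long` this would be a data race; std::atomic makes it well-defined.
// The compiler would have accepted the racy version just as happily,
// which is exactly what "the language offers no protection" means.
long count_safely() {
    std::atomic<long> counter{0};
    auto work = [&counter] {
        for (int i = 0; i < 100000; ++i)
            counter.fetch_add(1, std::memory_order_relaxed);
    };
    std::thread t1(work), t2(work);
    t1.join();
    t2.join();
    return counter.load();
}
```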
Functional languages (basically only Haskell - I consider Scala an imperative language) have protections due to the Church-Rosser theorem[1], which states that (in essence) pure code can be executed in any order and there won't be any race conditions.
The rub (which is what I think you're getting at) is that it's dubious whether languages like Haskell are practical in a real-life production environment. Imperative languages "get things done", and given judicious use of appropriate libraries may be able to do concurrency just as well if not better than a purely functional language like Haskell.
I don't know any languages that are practical in a real-life production environment. They all kind of suck in their own special ways. What makes you think Haskell's practicality is more dubious than any other?
The author is on this thread, so maybe he'll answer you. Of course you can do these things in C++, but does anyone really want to when there are languages better suited to it?
> That conclusion on squandering the power of new hardware is being backed by absolutely zero data
I believe that the C++ language and its philosophy are in direct conflict with the requirements of parallel programming. This conflict is responsible for the very slow uptake of parallel programming in mainstream software development. The power of multicore processors, vector units, and GPUs is being squandered by the industry because of an obsolete programming paradigm.
It's not an opinion. It's a simple matter of observation. I see software all around me (and I write some of it) written in C/C++/Java and other imperative languages scaling perfectly fine on large scale multicore systems. I use linux, OSX, Android and iOS on a daily basis. Almost all the code running on these boxes is written in imperative languages, much of it written as parallel, shared memory code.
Google (where I used to work, but I don't think I'm revealing any secrets here) runs a gigantic cluster on foundational software almost entirely written in multithreaded C++ with the kernel written in C of course (some of the application level stuff is in other languages, but still a whole lot even at that level is in C++). Almost any large scale, parallel computational systems you could name are written in imperative languages.
To call all of the above "just my opinion" is simply denying reality. There's no further argument to be had unless there's a basic, shared reality ground to stand on.
What you're saying, basically, is that it's not an opinion because you're right. That's...not a very good point.
Pointing out that most parallel systems are written in imperative languages doesn't help. Most systems in general are written in imperative languages, so it's unclear what you're even comparing it to. Imagine standing on a street corner in 19th century England saying, "What do you mean that open sewers aren't as nice as closed ones? Look at all these sewers; they're all open!" Similarly, "but dude, Google!" just isn't responsive to the argument at hand. An easier-to-grapple-with framing might be: if FP languages continue to grow in popularity, will it become much easier to build parallel software?
I don't actually know, and I'm not even claiming you're wrong. And it's a predictive claim, so you could argue the semantics of whether it's an opinion or not. But your argument sucks and your pretense that your view is some sort of law of nature is lame. It's completely reasonable for someone to disagree with you on how good C++ is for parallelism.
The original claim this subthread is arguing is that multi-core processors are being "squandered" somehow due to the deficiencies of C-type imperative languages. I'm pointing out that I see all those multiple cores gainfully and heavily employed using nothing but imperative languages so primitive that they are being compared to the open sewers of 19th-century London at this point :-)
"Ease of writing parallel software" was never the question we were arguing.
The C++ problem with concurrency is not that C++ can't efficiently do concurrency (as you state, it definitely can), but that it can't safely do concurrency, therefore it's a big draw on programmers, debugging and productivity.
Exactly. The reason parallel computing is being "squandered by the industry" is that it seems to be too hard to do. There is no "ease" of parallel programming, and hence no one does it.
I was under the impression he was speaking of parallel programming (software algorithms utilizing parallelism,) not sequential code running in parallel.
If you want to get into proper parallel numerical algorithms, most useful programs are C, C++, or Fortran, sometimes with CUDA or OpenCL kernels. I don't think I've seen any large-scale HPC applications in Haskell, as much as I like that language.
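For a taste of what the shared-memory style looks like at its absolute simplest, here is a two-thread reduction sketched in plain C++11 (real HPC kernels are of course far more elaborate):

```cpp
#include <cstddef>
#include <numeric>
#include <thread>
#include <vector>

// Split a sum across two threads: a worker takes the bottom half while
// the main thread takes the top half, then the partial sums combine.
double parallel_sum(const std::vector<double>& xs) {
    const std::size_t mid = xs.size() / 2;
    double lo = 0.0;
    std::thread t([&] {
        lo = std::accumulate(xs.begin(), xs.begin() + mid, 0.0);
    });
    double hi = std::accumulate(xs.begin() + mid, xs.end(), 0.0);
    t.join();   // after the join, reading `lo` is safe
    return lo + hi;
}
```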
I'm not trying to be argumentative, I'm just trying to clarify my perspective, which I think the author shares with me and I feel is not being understood. We know what has worked in the past. To me the author is saying functional programming with language features helpful for parallel programming is a better path to take. If I say I believe electric cars are better suited for the future and you say millions of internal combustion cars exist and work just fine, you haven't told me anything I don't already know or addressed features of electric cars that you feel make me wrong. That a large electric car infrastructure doesn't exist yet also doesn't disprove anything. The internal combustion engine is going away and so is C++. That's my opinion.
"I was under the impression he was speaking of parallel programming (software algorithms utilizing parallelism,) not sequential code running in parallel"
The argument was about squandering the powerful multi-core hardware because of deficiencies in imperative languages. No notion of some platonic ideal of a parallel application was raised.
My point is that the "squandering" claim is clearly and obviously false as I argued above.
I agree that the use of the word 'squandering' is provocative and incorrect. Reddit points out that he is evangelizing Haskell (FPComplete) and trolling C++ users on Twitter ("I used to be a C++ masochist"). He doesn't make his financial bias clear in the article, which would have been helpful for those not familiar with him.
Ouch! Ad personam attacks are a sign of desperation. For the record, I used to work for FPComplete (I architected the School of Haskell) but I quit. It was an interesting experience, having seen a whole complex web site and an online Haskell IDE built in record time by a few Haskell programmers.
Sorry man! If you read the rest of my posts, I was agreeing with you. I was just trying to address the reaction chetanahuja had to your article, to say I wasn't blinded and had healthy skepticism, but I still agreed wholeheartedly with you. C++ is great for special circumstances, but I'd hate to have to go back to it for general coding. Functional programming offers some very intriguing ideas, which I'm only starting to grasp. I don't really mind you being provocative, because it provokes and discussion follows.
This quote reminds me of the US Navy saying I used to hear from officers: "A bitching sailor is a happy sailor." Sometimes, but more often sailors have real grievances that higher ups are choosing to ignore. I had a choice to re-enlist in the Navy and later with C++. Didn't happen with either.
Edward Scissorhands is a tragic figure, tragic in the classical sense: the essence of his strength is the source of his weakness. Correct or not as a metaphor for C++, that, not attack, is the author's theme.
I take the author to be issuing a warning that he wished he did not feel compelled to give, and that warning is that for some programming tasks which are increasingly becoming common the language beloved in his youth is feeling like a Turing tarpit.
Yes, with enough effort the relationship could be maintained, and for many people doing so makes sense for the sake of the kids, but for others like the author, the spark has died and there are younger more attractive "local girls who want to meet."
"You might think of the C subset of C++ as bona fide assembly language which you shouldn’t use it in day-to-day programming, except that it’s right there on the surface. If you reach blindly into your C++ toolbox, you’re likely to come up with naked pointers, for loops, and all this ugly stuff."
It's all rolled into one as far as the author is concerned.
The author definitely does not consider them all rolled into one; the author was a member of the C++ standards committee. That quote is discussing the C-backwards-compatibility baggage of C++ as a design flaw in C++. In the context of C itself, it might not be a design flaw, and the author appears less interested in that question (he's been writing about C++ since the '90s, and doesn't really write about C).
I don't know about you, but I read "which you shouldn’t use it in day-to-day programming" and "all that ugly stuff" to mean a complete and utter dismissal of C as a language even worthy of consideration. Are you somehow getting a message that the author is endorsing going back to the simple days of pure C?
You could believe C is an absolutely fantastic systems-programming language, but believe that in day to day applications programming, it is a poor choice. Actually, I think that belief is fairly common.
I don't think it would be going too far in the direction of looking for a charitable interpretation of the author's statements to consider this possibility.
I think not - C has extremely thin abstractions - it's basically a portable assembler, with all the trade-offs so implied. C++ is a beast of a language that includes everything and the kitchen sink (plus decent backward compatibility with C), and as such is hard to optimize for new contexts (like concurrency), because there are so many interlocking bits that all need to work (ever tried to write code combining templates with class inheritance? It's mind-boggling, and that's likely a simpler example than those the author cites).
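To make that parenthetical concrete, here is about the tamest example of templates meeting inheritance, the CRTP idiom (class names are mine, invented for illustration):

```cpp
#include <string>

// CRTP: a base class templated on the very class that derives from it.
// Perfectly legal and widely used, but it only makes sense once
// templates, name lookup, and inheritance rules all interlock in your
// head at the same time.
template <typename Derived>
struct Printable {
    std::string describe() const {
        // Downcast to the derived type that was supplied as a template
        // argument; safe by construction of the idiom.
        return "I am a " + static_cast<const Derived*>(this)->name();
    }
};

struct Widget : Printable<Widget> {
    std::string name() const { return "Widget"; }
};
```

And this is the gentle end of the spectrum; the author's own examples go much further.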
Sorry but I have to call bullshit on this.
Even in addition to the passage I quoted, which explicitly calls out compatibility with C as the biggest problem with C++, the bulk of the rest of that article is directed at resource management, starting with dismissing malloc as the worst possible API for memory allocation and going on to bitterly complain about various problems with pointers, global memory, side effects, for loops, and anything and everything that C++ inherits from C.
The overwhelming thrust of the article is that C++ needs to stop looking like C and start looking like Haskell or whatever other functional Kool-Aid is being drunk these days. If you are interpreting this blog post as a call for programmers to go back to C, you're not reading the same words that I am.
Call bullshit? Please. Just say you disagree or something. Bullshit implies that I'm intentionally trying to feed you false information. And you've misunderstood. The line of reasoning you quote is about how C++ introduced new and delete to circumvent the follies of malloc and free. But the upshot was that even new and delete had pitfalls that developers fell prey to, so then came tr1's shared_ptr, followed by unique_ptr, which then necessitated, make_shared, etc. Where did I say this is a call to go back to C? Where did you learn how to read?
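For readers who haven't followed that history, the progression being described looks roughly like this in miniature (a compressed sketch; `Buffer` is a stand-in type of my own):

```cpp
#include <memory>

struct Buffer { int size; };

int resource_management_history() {
    // Step 1: new/delete replaced malloc/free, but still leaks if an
    // early return or an exception happens between the two lines.
    Buffer* raw = new Buffer{64};
    delete raw;

    // Step 2 (tr1, then C++11): shared_ptr frees itself, at the price
    // of reference counting even where there's only ever one owner.
    std::shared_ptr<Buffer> shared(new Buffer{64});

    // Step 3 (C++11): unique_ptr for the common single-owner case...
    std::unique_ptr<Buffer> owned(new Buffer{64});

    // Step 4 (C++11/14): make_shared/make_unique close the remaining
    // exception-safety gaps around the raw `new` calls above.
    auto best = std::make_unique<Buffer>(Buffer{64});
    return best->size;
}
```

Each step papers over the pitfalls of the previous one, which is precisely the pattern being criticized.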
I think the author is saying that for what C++ is used for, its backwards compatibility with C is an anti-feature, because it's too low-level. That doesn't mean C is a bad language if you want or need that low level of control, just that modern systems programming should be using a higher level of abstraction which can be more efficiently optimized.
90% of the people I come in contact with who claim to know C++: "Hey, I know C++! malloc, strlen, memcpy! Oh, and std::cout."
The article sums this up nicely. Pure, idiomatic C++ is beautiful but too many people are stuck using the C constructs when they should be using C++ alternatives.
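A small before/after of what I mean, with the C habit on top and the idiomatic C++ replacement below (a contrived example of my own; error handling omitted in the C version, as is traditional):

```cpp
#include <cstdlib>
#include <cstring>
#include <string>

// The C habit: manual allocation, manual copying, manual free, and a
// leak if anything between malloc and free returns early or throws.
std::size_t c_style_len(const char* src) {
    char* copy = static_cast<char*>(std::malloc(std::strlen(src) + 1));
    std::strcpy(copy, src);
    std::size_t n = std::strlen(copy);
    std::free(copy);
    return n;
}

// The C++ alternative: RAII does all the bookkeeping automatically.
std::size_t cpp_style_len(const std::string& src) {
    std::string copy = src;  // deep copy, freed when `copy` leaves scope
    return copy.size();
}
```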
Yeah, I guess that was ambiguous, wasn't it? C could equally well be the original sin, if one goes by analogy with the whole song. I'm just glad that C isn't C++.
I recently had to maintain C++ code from about 15 years ago. It is nothing like the new C++11. C++11 is a completely different beast, with many enhancements. It looks good - I recommend reading Stroustrup's C++11 book; if you are already familiar with C++, "A Tour of C++" covers most of the new C++11 features.
Have you used it about 20 years ago? I think what we have now is way better than that.
Of course, what was there 20 years ago, language wise, still is there, but IMO that does not make it a legacy language. A language with legacy, yes, but it still is very much alive.
Heard that saying before, but didn't know it was from Stroustrup! That definitely adds to the hilarity.
In that case, let's hope Haskell and Rust move ever more solidly from the second group over to the first :) Always gonna be stuff to complain about. Doesn't excuse C++'s issues.
We know programming is not incremental; often what we did to get 70% of the requirements will be completely useless for achieving the remaining 30%. C++ will take you everywhere, and I find it a solid choice for any project, especially because I am often unsure of performance requirements. I can confidently say my C++ code will scale.
So will wire-wrap, but are you going to try wire-wrapping a word processing program? C++ is able to do anything that Lisp, Haskell, or Scala can do, but with at least an order of magnitude more work for any non-trivial project. That order of magnitude can make a difference in getting your project done at all.
"I can confidently say my C++ code will scale."
I can say the same about my Lisp code. In fact, I switched to Lisp to improve scalability over the original C++ codebase, because improving scalability required a higher-level approach. Sure, it could have been done in C++ -- which would have added at least a month of extra work, which I really cannot spare right now.
We all know that C is primarily a low-level systems language; it is precisely for everything else that other languages have thrived. "C++ since its appearance brought continuously on the table things to cover these higher levels of programming." This is what makes C++ a language that "will take you everywhere". Use the newer incremental additions and you will do what Lisp, Haskell, or Scala does without that "order of magnitude more work".
"C++ since its appearance brought continuously on the table things to cover these higher levels of programming."
I see two issues with this statement:
1. The attempts to bring high-level constructs to C++ seem to always come up short. Example: it is still not possible to create something as simple as a doubly-linked list using the standard automatic memory management constructs in C++.
2. Low-level issues creep into high-level constructs rapidly and conspicuously. Example: you have an iterator pattern for sequence types, but if you are not careful you can create a dangling pointer (and no exception will be thrown).
This is why I say (and why my experience has been) that C++ adds an order of magnitude of work to any non-trivial project. The high-level features are poorly conceived and muddied by low-level issues.
Assembler will always scale too. The fact that something will always scale does not mean that it really is the proverbial golden hammer. Assuming that you will need scalability for any project regardless of what it entails is premature optimization.
There are also any number of cases where by using C++, you are missing the opportunity to use something more sensible. Some example use cases include data processing using massively distributed Hadoop installations, data analysis using Matlab/Octave or R, scripting using shell, perl, python, or ruby, and web client programming using JavaScript. I can't imagine web client programming counts among your "anywhere", but if you are doing all of these other things and doing them in C++, then I must ask why.
And in my personal experience[1], developer velocity has proven to be much higher in, say, Python than C++, and there are any number of use cases where rapid development is more important than scalability. Do you really find C++ to be the fastest language for a team to develop in (in your experience)?
ASM doesn't scale in structure (it's impossible to manage) and certainly not in architecture - for example, Intel's new Haswell chips offer vectorization opportunities, and any code that doesn't use those opcodes will run slower.
Lastly, it is increasingly difficult to beat C/C++ performance with handwritten ASM, especially when comparing tight-loop performance between handwritten asm and Intel's icc - although maybe I suck.
Web-client doesn't count in my "anywhere" because my browser doesn't run a compiled language. That said, backends are often written in C++. A frequently used pattern is a glue language (Java, Python) calling performance-critical C++ routines that are exposed as a library.