Yeah, amongst all the "MS has lost its way" chatter recently, I don't think anybody has noted how it's treating C++ devs. Annual full releases of VS? Really?
I think the "use D or Go or Rust" or "there are no alternatives" debate has run its course until people actually do big projects in Rust.
I do feel sorry (to the extent you can feel sorry for somebody who probably makes $150k+) for the guy on Reddit who's started to talk about features he desperately wants in the C++17 spec.
The fact is that there's a huge amount of legacy code that provides irreplaceable functionality. What would you do if you had a complex data-crunching library in C that needed to be integrated into your program?
Rewriting stuff from scratch is sometimes good, but usually you'd prefer to keep some well-tested routines and be sure they won't fuck up.
Edit: you may also argue that projects like LLVM make it possible to compile stuff to IR and still keep the old code... but there may be situations in which choosing a different compiler is not an option.
Both Clang and GCC have a standard-selection switch (`-std=c++11` vs. `-std=c++98`), so whether or not they accept the new standard is just a matter of a compiler parameter.
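To make that concrete, here's a minimal file (hypothetical name `example.cpp`) that compiles under one setting of that flag and not the other:

```cpp
// g++ -std=c++11 example.cpp    # accepts the file below
// g++ -std=c++98 example.cpp    # rejects every C++11 construct in it
#include <memory>
#include <vector>

int main() {
    auto v = std::vector<int>{1, 2, 3};   // auto + brace-init: C++11 only
    std::unique_ptr<int> p(new int(42));  // std::unique_ptr: C++11 only
    for (int x : v) { (void)x; }          // range-for: C++11 only
    return 0;
}
```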
I would introduce a "backward compatibility" switch that has to be turned ON per source file to accept old constructs. All new code from now on would be nice and shiny, and there'd be an (easy) way to still keep old code around.
Only if you like writing all your libraries yourself.
To answer a little less flippantly, you'd need to add some sort of "unsafe" construct to the language, because most of the nice, high-level abstractions have nasty low-level stuff buried in them, and because of the way C++ templates are compiled, all of that code gets included in any program you write. In short, all "new" code is built on top of an awful lot of "old" code, and if you were going to make a huge breaking change like this, you may as well write a new language and libraries from the ground up (like Go or Rust).
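A minimal sketch of what "buried in them" means (a toy container, nothing like a real std::vector implementation): the safe-looking call site at the bottom pulls raw allocation, placement new, and pointer arithmetic straight into your own translation unit, because the template is instantiated there.

```cpp
#include <cstddef>
#include <new>

template <typename T>
class tiny_vector {
    T*          data_ = nullptr;   // raw pointer: "old" construct
    std::size_t size_ = 0;
    std::size_t cap_  = 0;
public:
    void push_back(const T& value) {
        if (size_ == cap_) {
            std::size_t new_cap = cap_ ? cap_ * 2 : 1;
            T* fresh = static_cast<T*>(::operator new(new_cap * sizeof(T)));
            for (std::size_t i = 0; i < size_; ++i) {
                new (fresh + i) T(data_[i]);   // placement new
                data_[i].~T();                 // manual destructor call
            }
            ::operator delete(data_);          // manual deallocation
            data_ = fresh;
            cap_  = new_cap;
        }
        new (data_ + size_) T(value);          // pointer arithmetic
        ++size_;
    }
    ~tiny_vector() {
        for (std::size_t i = 0; i < size_; ++i) data_[i].~T();
        ::operator delete(data_);
    }
};

int main() {
    tiny_vector<int> v;   // "new", safe-looking call site...
    v.push_back(42);      // ...expands all the raw-pointer code above here
    return 0;
}
```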
Furthermore, even if you restrict yourself to the new stuff, C++ is not memory-safe. C++ would need some sort of Rust/Cyclone-like lifetime/region semantics to become safe.
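For example, this sketch uses only C++11-era constructs, yet has undefined behavior that a Rust-style borrow checker would reject at compile time:

```cpp
#include <functional>
#include <iostream>

std::function<int()> make_counter() {
    int count = 0;
    return [&count] { return ++count; };  // captures a local by reference
}                                         // ...which dies right here

int main() {
    auto counter = make_counter();
    std::cout << counter() << "\n";       // dangling reference: UB
    return 0;
}
```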
Scala has the same debate - there are a couple of constructs that (normally) only library writers use.
So they compile the library with "enable all" turned on. Everybody else gets a warning and either refactors the code or turns on the switch to allow it.
The difference is that you can compile your Scala libraries separately from the code that calls them - if you do that in C++, you give up templates. All those #includes at the top of your C++ code are basically just dumping code into your source files, and if you're using templated code, it gets macro-expanded together with the types and code you supply from your own program. There are some separately compiled C++ libraries, but most of the standard library and Boost code you use is recompiled every time you compile your program. You could probably define a new pragma telling the compiler that all the code from a given file, class, or function is unsafe, but it would take source-level modifications to every library you use to properly implement the flag you suggest.
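A small illustration, with a hypothetical header `mylib.h`: the "library" below is a template, so there is no separately compiled artifact at all - the compiler stamps out a fresh copy of it inside your translation unit for every iterator type you call it with.

```cpp
// --- mylib.h (the "library"; nothing of it exists until instantiated) ---
template <typename Iter>
int sum_legacy(Iter first, Iter last) {
    int total = 0;
    while (first != last) total += *first++;  // raw-pointer-style iteration
    return total;
}

// --- your code -----------------------------------------------------------
#include <iostream>

int main() {
    int data[] = {1, 2, 3};
    // This instantiates sum_legacy<int*> right here, in *your* object file:
    std::cout << sum_legacy(data, data + 3) << "\n";
    return 0;
}
```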
And my point was that, due to the way C++ is designed (specifically, not having a proper module system and having this massively powerful macro-replacement templating engine), there is no way for the compiler to distinguish between "legacy" code and "new" code.
Consider the following case study: you have a library with a template class that uses raw pointers as its iterator type. In code that uses it, you accidentally write `iter + 9` instead of `*iter + 9` (this will likely compile, though there should be a warning). Now, even assuming there's some language extension added to mark that library as "legacy" (maybe #include-legacy <mylibrary>), can you tell me whether the "new" code you just wrote using raw pointers counts as "legacy" or not?
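Here's that case study as runnable code, with a hypothetical `legacy_buffer` standing in for the library: both spellings compile, and the line that was meant to add 9 to an element silently becomes pointer arithmetic instead.

```cpp
#include <iostream>

// Legacy-style container whose iterator type is a raw pointer.
struct legacy_buffer {
    int data[16] = {0};
    int* begin() { return data; }
    int* end()   { return data + 16; }
};

int main() {
    legacy_buffer buf;
    int* iter = buf.begin();

    int  value = *iter + 9;  // intended: element plus 9
    int* oops  =  iter + 9;  // compiles fine: pointer arithmetic instead

    std::cout << value << " " << *oops << "\n";
    return 0;
}
```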
I think this is basically Bartosz' point as well - that C++'s legacy design features are too baked in to make positive changes easily, or even tractably (even small changes like your --no-legacy compiler flag).
Yes, there's a way. Precompiled headers already do a pretty powerful analysis.
Thinking about it a bit more, I realized I've just reinvented the wheel: this is how Microsoft phased out sprintf, strcpy, and so on.
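That precedent is easy to demonstrate on MSVC: the old CRT functions compile but emit deprecation warning C4996, and you opt back into the "legacy" behavior per translation unit by defining `_CRT_SECURE_NO_WARNINGS` before the CRT headers:

```cpp
// Without the define below, MSVC flags strcpy with warning C4996
// ("may be unsafe, consider using strcpy_s instead").
#define _CRT_SECURE_NO_WARNINGS   // per-TU opt-in to the "legacy" CRT
#include <cstdio>
#include <cstring>

int main() {
    char buf[16];
    strcpy(buf, "legacy");        // deprecated on MSVC, not removed
    printf("%s\n", buf);
    return 0;
}
```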
So you could choose between 3 compile modes: `--new-cpp`, `--advanced-library`, and `--allow-legacy`.
You could turn a mode on with a pragma before including an .h file and turn it off after, if you have to.
By default `--new-cpp` is on and throws errors on naked pointers, new, delete, pointer arithmetic, and so on. Basically C++ would become a safe language, almost like CLR-style managed code. It could even have "const by default" variable declarations, like Scala.
The "--advanced-library" would allow a lot more (e.g. manual memory management), and "--allow-legacy" would be for full backward compatibility.
Is it worth it? I think it is: with metaprogramming, lambdas, named(!) closures, deterministic destructors, etc., C++11 is one of the best programming languages for anybody who cares about performance.
There comes a time when no one uses a feature anymore, so it might as well be dropped. How many of you still have floppies on your computer?