And my point was that, due to the way C++ is designed (specifically, not having a proper module system and having this massively powerful macro-replacement templating engine), there is no way for the compiler to distinguish between "legacy" code and "new" code.
Consider the following case study: a library exposes a template class that uses raw pointers as its iterator type. In your own code that includes it, you accidentally write `iter + 9` instead of `*iter + 9` (this will likely compile, though there should be a warning). Now, even assuming there's some language extension added to mark that library as "legacy" (maybe `#include-legacy <mylibrary>`), can you tell me whether the "new" code you just wrote using those raw pointers counts as "legacy" or not?
I think this is basically Bartosz's point as well: C++'s legacy design features are too baked in for positive changes to be made easily, or even tractably (even small changes like your --no-legacy compiler flag).
Yes, there's a way. Precompiled headers already do a pretty powerful analysis.
Thinking about it a bit more, I realized I've just reinvented the wheel: this is how Microsoft phased out sprintf, strcpy, and so on.
So you could choose between three compile modes: --new-cpp, --advanced-library, and --allow-legacy.
You could turn it on with a pragma before including a .h file and turn it off afterwards, if you had to.
By default --new-cpp would be on, raising errors on naked pointers, new, delete, pointer arithmetic, and so on. Basically C++ would become a safe language, almost like CLR-style managed code. It could even have const-by-default variable declarations, like Scala.
The "--advanced-library" would allow a lot more (e.g. manual memory management), and "--allow-legacy" would be for full backward compatibility.
Is it worth it? I think it is: with metaprogramming, lambdas, named(!) closures, deterministic destructors, etc., C++11 is one of the best programming languages for anybody who cares about performance.