An organization can go through and update its C++, because ultimately they're distributing binaries (or doing everything internally and not distributing anything at all).
Web "pages" aren't called that for no good reason. If in 2005 you bought a novel, or some punk writer–artist's printed pamphlet, and now you can't read it because in the meantime some engineers changed a spec somewhere, that would be a failure, not just in the small but on a societal level. "Just rev the language" is something that people who spend 40+ hours a week in an IDE or programmer's text editor think up when they're used to dealing in SDKs and perpetually changing interdependencies, fixing them, and getting paid handsomely for it. But that's not what the Web is. The Web is the infrastructure for handling humanity's publishing needs indefinitely.
To rely upon another observation:
"[This] is software design on the scale of decades: every detail is intended to promote software longevity and independent evolution. Many of the constraints are directly opposed to short-term efficiency. Unfortunately, people are fairly good at short-term design, and usually awful at long-term design. Most don’t think they need to design past the current release."
https://roy.gbiv.com/untangled/2008/rest-apis-must-be-hypert...
Your post sounds good... Until you realise that nearly any nontrivial web page from 10+ years ago is broken today...
No Flash... Iframes don't work properly anymore... HTTPS servers from 10 years ago are unsupported by today's browsers... Most of the IE hacks no longer work (remember progid:DXImageTransform?)... Any images/resources hosted elsewhere are likely now nonexistent...
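For anyone who doesn't remember, the IE filter hack in question looked roughly like this (a sketch from memory; exact parameters varied by effect):

```css
/* IE-only filter syntax; every modern browser ignores this line */
.fade {
  filter: progid:DXImageTransform.Microsoft.Alpha(Opacity=50);
  /* the standardized property that replaced it */
  opacity: 0.5;
}
```

Note that pages which used only the proprietary `filter` line render at full opacity today, while pages that also included the standard `opacity` property still work, which is rather the point of the standardization argument below.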
Plenty of web features have been introduced and then dropped just a few years later. Backwards compatibility is great... But if it's practically broken anyway, I think there is a good argument for breaking it further. People who need to read an old page will probably need to use IE6 in a VM anyway.
The problem with this argument is that it demands we apply a false equivalence. The key word in your comment:
> hacks
Flash was never standardized. Same with IE's proprietary extensions (and Mozilla's, for that matter; XUL is proprietary, even though people often use "proprietary" as an antonym for "open source"). Most of the "web features" people have in mind are in the same boat: experimental, draft-level proposals that eventually fell by the wayside for one reason or another. The Web is actually the single most successful attempt at a vendor-neutral, stable platform that exists. It's why we're having this conversation now.
The argument is that, because some people did something hacky or bleeding edge and then bled from it, there's no real point in any amount of stability, so we should punish everyone. What a double whammy that would make for! First, you spend all your time taking care to do things correctly, so you pay the penalty inherent in that, what with moving more slowly than everyone around you, and then someone decides, "ah, never mind, screw the whole thing," doubles back on the original offer, and breaks your shit? I can't abide by that. Imagine all your friends getting driver's licenses and racking up speeding tickets for their recklessness, and then one day you get pulled over and ticketed, too, even though you weren't speeding.
> nearly any nontrivial web page from 10+ years ago is broken today
Can you provide some examples? In my experience, a broken 10+ year old website is the exception, not the rule. And most of the exceptions are because of Flash (which has a workaround; plus, the most popular Flash sites have been ported).