Layering things on for legacy reasons isn't anything new. It was never a smart idea to drive a flat digital panel through the display adapter's DAC, a VGA cable, and the panel's own ADC, yet many did it, and some people are still doing it today. It doesn't make any sense whatsoever.
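To make the complaint concrete, here is a minimal sketch (not tied to any particular hardware) of that analog detour: the adapter's DAC turns an 8-bit pixel code into a voltage, the cable adds noise, and the panel's ADC quantizes it back. The function name, the idealized 0.7 V swing, and the noise figure are assumptions for illustration only; a digital link would carry the codes unchanged end to end.

    import random

    def dac_vga_adc(pixel_8bit, noise_mv=5.0):
        # DAC: map the 8-bit code onto an idealized 0-0.7 V VGA video swing
        volts = pixel_8bit / 255.0 * 0.7
        # Cable: add a crude uniform noise term (in millivolts)
        volts += random.uniform(-noise_mv, noise_mv) / 1000.0
        # ADC in the panel: clamp and quantize back to an 8-bit code
        volts = max(0.0, min(0.7, volts))
        return round(volts / 0.7 * 255)

    original = list(range(0, 256, 16))
    recovered = [dac_vga_adc(p) for p in original]
    print("worst-case error:", max(abs(a - b) for a, b in zip(original, recovered)), "codes")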
It makes perfect sense when, say, that's the only cable you have on hand and when "mostly working for cheap" beats "not working, but technically correct and optimal." And then there are those to whom "lossy but non-DRM analog" (VGA) typically beats "lossless digital with DRM capability that will sneak up on you when you least want it" (HDMI).
What the hell am I going to do with a VGA display when the only output I have is a serial port? And why should I pay for a VGA cable? I can connect my VT100 anywhere with a cable I can make myself.
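(For anyone curious, the "cable I can make myself" is presumably the classic three-wire RS-232 null-modem crossover. The sketch below uses DE-9 pin numbers purely for illustration; if memory serves, a real VT100 has a 25-pin connector, where the same crossover is pins 2, 3, and 7.)

    # Hypothetical three-wire null-modem crossover, DE-9 numbering
    NULL_MODEM_DE9 = {
        3: 2,  # TD (transmit data) on this end feeds RD (receive data) on the far end
        2: 3,  # RD here is fed by the far end's TD
        5: 5,  # signal ground runs straight through
    }

    for here, there in sorted(NULL_MODEM_DE9.items()):
        print(f"pin {here} -> pin {there}")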
I'll bite. I didn't say VGA was cutting edge, or the one true cable, or that real programmers use butterflies; only that it remains practical for a very large number of uses. I believe the longevity of VGA has a lot to do with its compatibility and its explicit lack of DRM features. I have often had the choice and gone with "good enough" VGA when digital was an option, simply because I am aware that by buying HDMI I am not only financially supporting and licensing DRM, but committing to a technology that can be used against me. I may not be the norm, but I am far from a Luddite.
Yes, DVI was popular for a while, and there is an assortment of others, but HDMI seems to have outpaced DVI in my encounters (and the others are quite niche: some Mac, Intel, etc.). DVI is still attractive, but in my experience things are shipping without it in favor of HDMI, and that only further extends VGA's lifespan. People already have VGA cables, and their only "upgrade" path to digital is often HDMI, so VGA remains the lesser of evils based on price and personal lifestyle/ethics.