As amazing and convenient as Docker is in practice, containers hide the inherent mess that is modern computing, and the more they are used, the less chance there is that this mess will ever get cleaned up. Ultimately this is just another dependency, more complexity hidden behind another layer...
I feel like calling modern computing a mess is harsh. Modern computing encompasses a plethora of applications, to the point where we can model most if not all workflows. It's the product of millions of humans working together for half a century, a very large graph with little connection between clusters.
Nothing at that scale is a "mess". It's simply what was created by our collective distributed system of humans and we should appreciate that it's not really a problem that can be solved instead of talking about it as if we could "fix" it.
I agree. While I understand the practicality of Docker and facilitating development using it, I wouldn't call it "forward" progress.
The fundamental problem, as you say, is that our dependency ecosystems don't meet our requirements. Docker is one way to avoid the problem without fixing it since it's easier. Forward progress would be to fix the problem.
On one hand, Docker removes some pressure to fix the problem and encourages perpetuating it. On the other hand, maybe it gets people to think about the problem more. I don't know which influence is stronger.
> containers hide the inherent mess that is modern computing, and the more they are used, the less chance is that this mess is getting cleaned up... ever
If you have a roadmap for how 150 or fewer engineers can "Clean up the inherent mess that is modern computing" in less than 5 years, then I'd be eager to read it. In the meantime, tools which enable people to manage the symptoms of that mess are good.
The Chunnel lets us work around the fact that the ocean has not yet been boiled away.
I understand that people quite reasonably feel this way, but I personally don't.
You’ve got to pick your battles. If you’re, for example, a front-end dev working right up at the top of the stack, then delivering value to your clients means getting them their marketing webpage, CRUD app, what-have-you. To do that you have to abstract away a vertiginous amount of stuff under you, all the way down the stack. We’re all standing on the shoulders of giants.
Docker is an amazing tool for just this sort of thing.
The great mistake happened way back in the 1980s (maybe earlier) when most OS developers didn't implement a proper permissions system for executables. Basically, the user should always be prompted to allow a program read/write access to the network, the filesystem and other external resources.
Had we had this, executables could have been marked "pure" functional when they didn't have dependencies and didn't require access to a config file. On top of that, we could have used the refcount technique from Apple's Time Machine or ZFS to keep a single canonical copy of any file on the drive (keyed by the hash of its contents), so that each executable could see its own local copy of its libraries rather than descending into dependency hell by having multiple library versions share the same directories.
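The content-addressed scheme described above can be sketched in a few lines. This is a hypothetical toy, not any real OS facility: files go into a store directory named by the SHA-256 of their contents, so identical files are stored once, and each app gets its own "local" view via a hard link to the canonical copy.

```python
import hashlib
import os
import shutil

STORE = "store"  # assumed location of the canonical-copy store

def add_file(path):
    """Copy `path` into the store under its content hash; return the hash.

    Two byte-identical files hash the same, so only one canonical
    copy ever exists in the store.
    """
    os.makedirs(STORE, exist_ok=True)
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    dest = os.path.join(STORE, digest)
    if not os.path.exists(dest):  # first writer creates the canonical copy
        shutil.copy(path, dest)
    return digest

def link_into(app_dir, name, digest):
    """Give an app its own 'local' copy of a library via a hard link,
    so apps needing different versions never collide in one directory."""
    os.makedirs(app_dir, exist_ok=True)
    os.link(os.path.join(STORE, digest), os.path.join(app_dir, name))
```

With this, two apps that bundle the same library share one copy on disk, while an app that pins a different version simply links a different hash; nothing in any app's directory can clobber another's dependencies.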
Then a high-level access-granting system could have been developed, with blanket rules for executables that have been vetted by someone. Note that much of this has happened in recent years with macOS (tragically tied to the App Store rather than an open system of trust).
There is nothing in any of this that seems particularly challenging. But it would have required the big OS developers to come on board at a time when they went out of their way to impose incompatibility by doing things like using opposite path separators (slashes vs. backslashes), refusing to implement a built-in scripting language like JavaScript, or even to provide cross-platform socket libraries, etc.
The only parts I admire about Docker are that they kinda sorta got everything working on Mac, Windows and Linux, and had the insight that each line of a Dockerfile can be treated like a layer in an installer. The actual implementation (not abstracting the network and volume modes enough so that there's only one performant one, the many idiosyncrasies between docker and docker-compose, etc.) still leaves me often reaching for the documentation and coming up short.
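That layer insight is easiest to see in a toy Dockerfile (the file names here are made up for illustration): each instruction produces one cached layer, and a layer is rebuilt only when it or something above it changes.

```dockerfile
# Each instruction below yields one immutable, cacheable layer.
FROM python:3.12-slim

# Dependency list is copied first so the expensive install layer
# is rebuilt only when requirements.txt itself changes.
COPY requirements.txt .
RUN pip install -r requirements.txt

# App code changes most often, so it goes last: edits here
# invalidate only this layer, and everything above stays cached.
COPY . .
CMD ["python", "app.py"]
```

Ordering instructions from least- to most-frequently-changed is what makes rebuilds fast; it's exactly the installer-layers framing the comment describes.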
That said, Docker is great and I think it was possibly the major breakthrough of the 2010s. And I do love how its way of opening ports makes a mockery of all other port mapping software.
I'm not sure I quite agree with that. Having a controlled environment in a sandbox of its own clearly has benefits, both for consistency of what you're running and for safety if it doesn't work as you expected. It doesn't need to be Docker specifically that we use to create such an environment, but if not Docker then we'd surely have looked for some other way to achieve the same result.
Docker is literally the specification of all the missing parts of the operating system. It's not a very good specification, but it is fairly comprehensive.