The problem is that distribution packages are often horrendously out of date. I have stopped using them entirely, even for things like NumPy and SciPy, which I would really rather get from the distro (they take a long time to compile and depend on several C libraries and a Fortran compiler). It isn't my favorite thing to have my Fabric task compile all the dependencies, but there isn't really an easy way around it.[1]
In general I think pip has drastically improved my workflow and code structure. For instance, I no longer use git submodules for my library code; I just pip install the libraries straight from their git repositories. Huge improvement, and way easier to manage.
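A sketch of what that looks like in practice (the repository URL, tag, and package name here are invented): a requirements.txt line can point straight at a git repository, pinned to a tag so deploys stay reproducible.

```
# requirements.txt -- install a library straight from git (URL/tag are hypothetical)
git+https://example.com/you/yourlib.git@v1.0.3#egg=yourlib
```

`pip install -r requirements.txt` then clones and installs it like any other package.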
[1] Obviously, you can get around it if you do things like copy the virtualenv and fix up the paths, as in the OP's post. However, I wouldn't call that "easy".
It's true that a distro's packages are often out of date, particularly with "Enterprise" releases like RHEL/CentOS. That doesn't mean one shouldn't use the package manager, though. Building custom packages to backport updates and tracking upstream for security and bug fixes isn't the most fun thing in the world, but it is often a necessary evil. It's certainly better than a "compile from source, then dump everything into a tarball" approach, which leaves you with no good way to track which versions of which software are installed on which nodes.
It isn't a necessary evil when it isn't necessary. If you have isolation (like virtualenv), each app gets exactly the versions it needs. Life is too short to waste trying to make every bit of Python on a machine use the same versions of dependencies, pinning versions in the package manager, and so on.
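A minimal sketch of that isolation (paths and version numbers are made up; the envs are created with `--without-pip` only to keep the example self-contained and offline — a real env would have its own pip):

```shell
# Two apps, two isolated environments, each with its own site-packages.
python3 -m venv --without-pip /tmp/app1-env
python3 -m venv --without-pip /tmp/app2-env

# In a real setup, each env's pip pins whatever that app needs, e.g.:
#   /tmp/app1-env/bin/pip install 'Django==1.4.5'
#   /tmp/app2-env/bin/pip install 'Django==1.5.1'

# Each env has its own interpreter, so nothing is shared system-wide:
/tmp/app1-env/bin/python -c 'import sys; print(sys.prefix)'   # prints /tmp/app1-env
```

Neither env can see the other's packages, so there is no cross-app version conflict to resolve in the first place.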
The universe does not revolve around lazy sysadmins
I think a 'best practice' is to create a virtualenv for your app, package the entire virtualenv with the distro's package manager, and then version that package.
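A rough sketch of the idea (all paths, names, and versions are invented, and fpm is just one possible tool for the packaging step): build the virtualenv at the exact path it will occupy in production — virtualenvs embed absolute paths, so building at the final location sidesteps the relocation problem — then wrap that directory in a distro package.

```shell
# Build the env at its final install path (venvs hard-code absolute paths).
# --without-pip keeps this sketch offline; a real build would also run e.g.:
#   /opt/myapp/bin/pip install -r requirements.txt
python3 -m venv --without-pip /tmp/myapp    # /tmp/myapp stands in for /opt/myapp

# Then wrap the directory in a .deb so the package manager can track and
# version it, e.g. with fpm (invocation is illustrative):
#   fpm -s dir -t deb -n myapp -v 1.0.0 /tmp/myapp=/opt/myapp
```

The package manager then handles install, upgrade, and rollback of the whole env as one versioned unit, without caring what's inside it.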
I don't understand why they can't all play along. Why can't pip packages be generic enough to be transformed into rpm/deb packages? Why can't a repository of those packages be maintained, with a comprehensive and up-to-date selection for each OS?
Why must everyone reinvent the wheel, but do so in an incomplete way?