I am going to guess you are coming at this entirely from the perspective of a sysadmin who doesn't do a lot with python.

virtualenv allows you to easily isolate dependencies so that multiple versions can coexist in different projects without fighting. If you sysadmin significant amounts of Python stuff then you would have already seen that benefit. It also allows you to control PYTHONPATH more elegantly than setting it in every script. This lets you write clean sensible imports rather than using fragile path tricks or wrestling with relative imports.
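To make that concrete, here is a minimal sketch (package names and versions are purely illustrative): two projects each get their own environment, so each interpreter sees only its own site-packages and you never have to touch PYTHONPATH.

    virtualenv ~/envs/project-a
    ~/envs/project-a/bin/pip install 'requests==2.25.1'

    virtualenv ~/envs/project-b
    ~/envs/project-b/bin/pip install 'requests==2.31.0'

    # each interpreter imports only from its own site-packages
    ~/envs/project-a/bin/python -c 'import requests; print(requests.__version__)'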

Your project dependencies are often not exactly the same as what the distribution gives. On one hand, distributions can be mind-bogglingly slow about releasing updates to python libraries. On the other hand, you may not want to force your project to use a new update of something because it may introduce bugs. It is pretty essential to have some lead time to port if your app is anything important. So you need control of the versions of your dependencies.

So if your project dependencies are not exactly the same as what the distribution uses, then you will need some kind of isolation mechanism in order to allow your project to work as a distribution package. You could do that, but you could also save time and just use virtualenv.
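As a rough sketch of what that looks like in practice (paths and file names are assumptions, not a recipe): you record the exact versions your project needs, then recreate that same environment wherever it is deployed, independent of whatever the distro ships.

    # record the exact versions the project was developed against
    /opt/myproject/env/bin/pip freeze > requirements.txt

    # recreate the same set on another machine, isolated from the system Python
    virtualenv /opt/myproject/env
    /opt/myproject/env/bin/pip install -r requirements.txt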

Creating platform-specific packages is nontrivial, also it is platform-specific. If I write a library for Python I only want to specify the packaging metadata/scripts once. I am certainly not interested in trying to get it accepted in repos for every distribution.



> Creating platform-specific packages is nontrivial, also it is platform-specific.

Unfortunately, Python-specific packages are Python-specific. I don't want to manage one package manager for Python, one for Ruby, one for Perl and yet another one for system libraries. And I'd prefer to have to trust only one vendor for security updates for everything.

And anything that is not pure Python depends on system libraries, which are supplied by apt or yum and live in a distro-specific namespace. How does setuptools declare and pull in a dependency on a system library?

I understand the need for Python, Ruby etc. to have their own packaging for cross-platform use, including on non-Unix. But system packaging tools have their uses too. There is no easy answer, and certainly no One True Answer.


I never even implied that system packages do not have their uses, or that Python packages are somehow independent of libc. That is just a straw man.

In fact, I agree with you about system libraries or tools which happen to use Python and do not need virtualenv, because they really are system-global.

Where I don't agree with you is with use cases you clearly don't understand, where you are developing and/or running multiple python apps in which you NEED isolation or you NEED to manage the versions of your dependencies. Your package manager is not helping with that at all.

And if you don't understand these tools and configuration management tools, then you should not be a sysadmin for projects which involve significant amounts of Python or Ruby.


A system packaging system that was better about sandboxes and running certain programs in certain contexts would be the One True Answer. NixOS (a Linux distro) at least has the requisite features as bullet points, but I've yet to get it to install successfully in either of the two times I've tried. (And my level of experience with Linux installations is "no longer need to consult the Gentoo manual to install Gentoo".)


> I am going to guess you are coming at this entirely from the perspective of a sysadmin who doesn't do a lot with python.

I am going to guess you are coming at this from the perspective of a small web developer who hasn't deployed large projects at various customer sites, and then had to support them. Your projects are also just Python, not much of anything else (C++, Erlang, data files).

For example, if you deploy on a well-known OS, deliver a software package to customers, and expect to upgrade and maintain it, then OS packaging makes sense. Otherwise you are re-inventing the wheel and shooting yourself in the foot.

I, for one, don't see how you can just dump files onto a system and then, when it comes to upgrading, never remove the previous files and simply overwrite them again.

> Creating platform-specific packages is nontrivial, also it is platform-specific.

As much as everyone bashes distutils, it does (or at least did) have a reasonably easy way of building RPMs -- setup.py bdist_rpm. It has bugs, but it works.
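For reference, the whole thing is a one-liner from the project directory (the project name here is just a placeholder):

    cd myproject/              # directory containing setup.py
    python setup.py bdist_rpm  # drops source and binary RPMs under dist/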


How do you use rpm to provide isolation of dependencies and simple control of PYTHONPATH? This lack of awareness shows that you don't understand the purpose of virtualenv. Nobody is arguing about the utility of packages, but you really don't understand the Python world at all if you think that packaging systems cover the use case of virtualenv.


I wasn't talking about virtualenv only. I was referring more to just using setuptools and pip to manage dependencies and upgrades.

However, system packaging is orthogonal to virtualenv. You can have a virtualenv setup packaged inside an RPM (an isolated, self-contained package).
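Roughly, the %install section of such a spec might do something like the following. This is only a sketch; the paths and the shebang fix-up are assumptions, not a recipe.

    # build the app into its own virtualenv under the package prefix
    virtualenv %{buildroot}/opt/myapp/env
    %{buildroot}/opt/myapp/env/bin/pip install -r requirements.txt
    # the %{buildroot} prefix gets baked into the env's scripts, so a real
    # spec has to rewrite those paths (or use virtualenv --relocatable) afterwards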

> How do you use rpm to provide isolation of dependencies and simple control of PYTHONPATH?

Our product has about 50 or so RPMs. Only a couple of packages needed self-containment. Isolation has its downsides -- you cannot reuse code. So we have managed to avoid it as much as possible.

> but you really don't understand the Python world at all

I understand the "deliver a product" world though. Python is part of the product, not the whole product. I think it is a little short-sighted to assume everything is just Python and that is the end of it -- just run "setup.py install" and you are done. And yes, we have been through "just run pip" or "just run setup.py install as root" before, and it creates a dependency and upgrade tracking nightmare.

Install a plugin using "setup.py install". Then update the package, remove some modules, and run "setup.py install" again.

Now all of a sudden your system is messed up, because it picks up old modules that would have been cleaned up if the previous package had been properly removed during the upgrade.
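Something like this (hypothetical package and module names) shows why:

    # version 1.0 of a hypothetical plugin ships plugin/legacy.py
    cd plugin-1.0 && python setup.py install    # copies plugin/* into site-packages

    # version 1.1 removes legacy.py, but reinstalling only copies new files,
    # it never deletes what the old version left behind
    cd ../plugin-1.1 && python setup.py install

    python -c 'import plugin.legacy'            # still imports the stale module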


My perspective is a developer on a small team who had to try to support a legacy environment in which no two machines had the same versions of the kernel, native libraries, web server, script engines, and our code, and every box was an irreplaceable work of art which we basically had to /bin/dd to migrate. It took us a traumatic year to get from there to "yum install our-app" producing a usable app server on a freshly-imaged box, and I don't think we would have succeeded without being able to express all our dependencies regardless of language.

As for mixing different versions of a library, that way lies madness. I regard it as a problem I want the package manager to prevent. An app that isn't on the company-wide version of a library is like radioactive waste—keep it tightly contained on separate hardware and dispose of it as soon as we possibly can. This does make upgrades a Big Deal involving a lot of regression testing, but it's better than troubleshooting code that both is and isn't what's running.


> Creating platform-specific packages is nontrivial

Not so much any more - for some platforms, anyway: http://lists.debian.org/debian-python/2010/01/msg00007.html


Also look at fpm: https://github.com/jordansissel/fpm/wiki

This sysadmin uses it frequently and it's a big time-saver for many software packages.
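For example (package names illustrative), it can turn a PyPI package or a plain directory tree straight into an RPM or deb:

    # build a native package directly from a PyPI package
    fpm -s python -t rpm virtualenv

    # or package an arbitrary directory (say, a pre-built virtualenv)
    fpm -s dir -t rpm -n myapp -v 1.0 /opt/myapp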



