Sorry, this isn’t a very original post. There are many guides on using virtualenv and pip. Some of them will be more comprehensive. Still, I find value in posting this because to some extent I do use my web site as notes for myself.
First, install virtualenv. This is probably as easy as easy_install virtualenv, or apt-get install python-virtualenv, or something similar for your platform.
Then, to create a virtualenv, go to your chosen work directory and type:
virtualenv --no-site-packages my_new_project
This will create my_new_project, and inside it bin, lib, and include directories. You could also use ./ as the project directory.
To work in your virtualenv, cd to my_new_project, and type:
source bin/activate
Now, Python commands in that shell will use the virtualenv’s interpreter and packages, ignoring whatever is installed on the system in general.
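As a sketch of what activation actually changes: the hypothetical helper below (not part of virtualenv itself) checks whether the running interpreter belongs to a virtualenv. Classic virtualenv sets sys.real_prefix on the env’s interpreter; the later stdlib venv module instead makes sys.base_prefix differ from sys.prefix.

```python
import sys

def in_virtualenv():
    # Hypothetical helper: classic virtualenv sets sys.real_prefix on the
    # env's interpreter; the stdlib venv module (Python 3.3+) instead makes
    # sys.base_prefix differ from sys.prefix.
    return hasattr(sys, "real_prefix") or (
        getattr(sys, "base_prefix", sys.prefix) != sys.prefix
    )

print(in_virtualenv())
```

Run it with the system python and with a virtualenv’s python and you should see the answer flip.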
Doing easy_install or the like inside the virtualenv will install packages there instead of system-wide. But you probably shouldn’t keep using easy_install. Apparently the hot new tool is pip.
I don’t have strong opinions about pip. I am mainly using it because it seems to be the current best practice. On paper, pip offers compatibility with both distutils and setuptools based packages, as well as the ability to install unpackaged stuff from svn repositories.
Setting up a new project in the virtualenv then is just a matter of a few commands like:
pip install psycopg2
pip install PIL
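A quick way to see where those installs land, as a sketch using only the standard library: run this with the virtualenv’s python and the paths point inside the env; run it with the system python and they point at the system-wide location.

```python
import sys
import sysconfig

# "purelib" is where pure-Python packages are installed for this interpreter.
# Inside an activated virtualenv, it sits under the env directory rather
# than under the system prefix.
site_packages = sysconfig.get_paths()["purelib"]
print(sys.prefix)
print(site_packages)
```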
And of course, the beauty is that the packages you just installed are only in your virtualenv, not polluting the entire system. Project A won’t use Project B’s libraries unless you intentionally install the same libraries in Project A. This is valuable when you go to install Project A on a new machine.
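One way to carry that set of libraries to a new machine, as a sketch (assuming pip is runnable as `python -m pip`): pip freeze lists the exact versions installed in the active environment, and pip install -r replays them.

```python
import subprocess
import sys

# Ask pip for the exact versions installed for this interpreter.
# Inside an activated virtualenv, that is just the env's own packages.
frozen = subprocess.check_output([sys.executable, "-m", "pip", "freeze"])
print(frozen.decode())
# Saved to requirements.txt, the list can be replayed elsewhere with:
#   pip install -r requirements.txt
```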
Even better, you can painlessly try new packages without polluting your system.
Anyway, I don’t know if I helped you, but at least I now know that my notes on how and why are some place more secure than my memory.