I’m now a uv convert. I haven’t really updated my Python tooling and workflow for the past two years.
In fact, I’ve been using the same workflow ever since I started working:
python -m venv venv
source venv/bin/activate
# While in venv
pip install -r requirements.txt
But there are a lot of limitations in this workflow. For example, I’m limited by the Python version installed on my machine, and it’s hard to install ad-hoc dependencies for my one-off scripts. This workflow also doesn’t manage dependency conflicts; I still remember those days when I had to check whether version X of pytorch or version Y of spaCy would get things running.
The uv tool solves all of that.
- First, uv sync does all the dependency management under the hood, and it automatically installs everything into a virtual environment. I was surprised how fast the installation is compared to your good old pip install. Really awesome! Adding new requirements is a breeze too via uv add: I just need to think about the name of the dependency, and uv will figure out the correct version to use (I sketched the basic commands right after this list).
- What’s magical for me is that the virtual environment can have its own Python version. So if a project needs 3.10 and I have 3.12, I can just set 3.10 in a .python-version file (and also in my pyproject.toml), and uv will automatically install the correct Python version and dependencies. There’s a small example of this after the list too.
- The best feature for me is adding one-off dependencies for my ad-hoc scripts. This is really cool because my standalone scripts can be wholly separated from the rest of my project. When I organize my research code, I often keep a separate folder for all my plotting code, and sometimes those scripts only use pandas and matplotlib. I can just specify those dependencies and run the script in an isolated fashion, as in the last example below.
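Here is roughly what the day-to-day workflow looks like now (the project name is just a placeholder):
uv init my-project   # scaffold a project with a pyproject.toml
cd my-project
uv add spacy         # uv picks a compatible version and records it in pyproject.toml and uv.lock
uv sync              # rebuild the virtual environment from the lockfile, e.g. on another machine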
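Pinning the Python version is a one-liner too; a minimal sketch, assuming the project wants 3.10:
uv python pin 3.10        # writes 3.10 into .python-version
uv run python --version   # uv downloads 3.10 if it isn't installed, then runs the project with it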
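And for the ad-hoc scripts, something like the following works (the script path here is made up; pandas and matplotlib are the only dependencies it needs):
uv run --with pandas --with matplotlib plots/make_figures.py   # one-off deps, project left untouched
# Or record the dependencies in the script itself as inline metadata:
uv add --script plots/make_figures.py pandas matplotlib
uv run plots/make_figures.py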
Yeah, uv is cool. For the past two years I never shifted to tools like poetry or pyenv; I found them quite bulky and cumbersome to use. uv’s interface, however, feels lightning fast. The folks at Astral are real geniuses!