Back in 2020, I dove into Python during the pandemic, taking online Digital Humanities courses in the Netherlands. Those late nights of coding were simpler times: I was blissfully naive about package management, installing everything through Jupyter notebooks into my base environment. If it worked on my machine, I assumed, it would work anywhere.

Fast forward a few years, and my workflows had grown significantly more complex: running deep learning models on university HPC clusters, managing multiple Python versions, juggling conflicting dependencies across projects. My tooling had to evolve with them.

The Scripts Concept

One of uv’s most elegant features is support for inline dependency declarations (standardized as PEP 723) in quick scripts. Instead of setting up a full virtual environment for a one-off analysis, you simply declare what you need at the top of the script:

# /// script
# dependencies = [
#   "requests<3",
#   "rich",
# ]
# ///

import requests
from rich.pretty import pprint

resp = requests.get("https://peps.python.org/api/peps.json")
data = resp.json()
pprint([(k, v["title"]) for k, v in data.items()][:10])

Then run it with uv run test.py. uv reads the metadata, resolves the dependencies into a cached, ephemeral environment, and executes the script. No virtualenv creation, no pip install, no activation. It just works.

The Project Concept

For full projects, uv manages dependencies through pyproject.toml:

uv init my-project
cd my-project
uv add torch torchvision
uv add --dev pytest ruff
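After these commands, pyproject.toml records the declared dependencies. Roughly what a recent uv release writes (version specifiers here are illustrative; older uv releases used a [tool.uv] dev-dependencies table instead of [dependency-groups]):

```toml
[project]
name = "my-project"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
    "torch>=2.3",
    "torchvision>=0.18",
]

[dependency-groups]
dev = [
    "pytest>=8.0",
    "ruff>=0.4",
]
```

uv add updates this file and uv.lock together, so the declared and locked dependencies never drift apart.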

The lockfile ensures reproducibility. When I push to the HPC cluster, uv sync recreates the exact environment in seconds — not minutes.
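The cluster-side workflow is then just one command, sketched here assuming a recent uv release (--frozen installs exactly what uv.lock specifies without re-resolving):

```shell
# On the HPC login node, after pulling the project
uv sync --frozen

# Or make batch jobs fail loudly if uv.lock is out of date
uv sync --locked
```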

Python Version Management

Perhaps most useful for HPC work: uv manages Python versions directly:

uv python install 3.12
uv python install 3.11
uv run --python 3.11 test.py

No more relying on whatever Python version the system admin installed. No more pyenv. uv handles it all.
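For project work you can also pin a version once instead of passing --python every time; as I understand uv's behavior, uv python pin writes a .python-version file that uv run and uv sync then respect (a sketch, assuming a recent uv):

```shell
uv python pin 3.11       # writes .python-version in the project root
uv run python --version  # uses the pinned 3.11 interpreter
```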

The speed difference is remarkable. What used to take minutes with pip now takes seconds. On HPC clusters where every minute of compute time counts, this adds up fast.