At some point I counted the tools I needed just to run a Python script reliably: pyenv for Python versions, virtualenv for isolation, pip-tools for lockfiles. I tried Poetry to consolidate the package management side — it helped a little, but it never felt standard. New team members didn’t know it, the community was split, and pyenv was still sitting there regardless. uv replaces the entire stack.
## The Trouble with Python Tooling
Python ships with macOS and most Linux distros. That sounds convenient until you try to replace it. Your OS depends on that Python installation. Upgrade it carelessly and you’ll break system tools. So you leave the system Python alone and install a newer version alongside it. Now you have two Pythons. Which one runs when you type python?
That’s where pyenv comes in. It manages multiple Python versions and lets you pin a version per project. It works — but the setup involves modifying shell configuration files, the PATH needs careful ordering, and shims can behave unexpectedly across terminals and editors. Every new machine means redoing this setup, and CI environments add another layer of complexity. For a while I also used a tool called pyver, inspired by nvm from the Node.js world. It didn’t seem to have much traction in the community, and I eventually drifted back to pyenv.
Once you have Python sorted, you still need to manage packages. pip installs globally by default, so packages from one project bleed into another. You learn to use virtual environments. Then you learn to remember to activate them. Then you add requirements.txt, discover pip-tools for lockfiles, and realize you’ve assembled a small ecosystem of tools just to get a reproducible Python environment.
Poetry came along and tried to collapse the last two layers — it manages a virtualenv and packages together, with a pyproject.toml and a lockfile. A single poetry install and you’re set. On paper, a real improvement. In practice, it never became the obvious standard. Team members coming from other projects often hadn’t used it, onboarding took explanation, and the community remained split between it and the plain pip-tools approach. And it never touched the Python version management problem. You still needed pyenv underneath it all.
## How I Discovered uv
I didn’t find uv by searching for a better Python tool. I found it inside MCP server configurations.
When working with Claude’s Model Context Protocol, many server configs include a command like this:
```json
{
  "mcpServers": {
    "my-server": {
      "command": "uvx",
      "args": ["my-mcp-server"]
    }
  }
}
```

That uvx command kept appearing. I looked it up, found uv, and realized it does everything I’d been assembling from separate tools.
## Python Version Management
uv manages Python versions directly — no pyenv required. Install a version:
```shell
uv python install 3.12
uv python install 3.11 3.12 3.13  # multiple at once
```

Pin a version for a project:

```shell
uv python pin 3.12
```

This creates a .python-version file that uv respects automatically. List what’s available or installed:

```shell
uv python list
uv python list --only-installed
```

uv finds managed installations first, then falls back to system Python — so it coexists cleanly with whatever Python your OS provides. No shell shims, no PATH juggling.
## Project Dependencies and Lockfiles
uv manages project dependencies through pyproject.toml and a uv.lock file. The pyproject.toml format is a Python standard — defined in PEP 518 and PEP 621 — and uv follows it correctly using the standard [project] table. Your project metadata is standard Python, not tied to any particular tool. Initialize a new project:
```shell
uv init my-project
cd my-project
uv add boto3 requests
```

This updates pyproject.toml and generates a lockfile with exact resolved versions. Commit both. Anyone cloning the repo runs uv sync and gets an identical environment.
```toml
[project]
name = "my-project"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
    "boto3>=1.34",
    "requests>=2.31",
]
```

Run code without manually activating a virtual environment:
```shell
uv run python main.py
```

uv ensures the environment is in sync before running. No more “did I activate the venv?” moments.
## Upgrading Dependencies
Have you noticed that whenever an agentic coding tool generates a requirements.txt or pyproject.toml, the dependencies are outdated from the very first commit? That’s not a coincidence. LLMs were trained on code written months or years ago, so the versions they suggest reflect what was current then — not now. The result is projects that start life already full of known vulnerabilities. Plain pip has no structured answer to this — you can run pip install --upgrade, but nothing records the result in a reproducible way. pip-tools gets you closer with pip-compile --upgrade, but that’s yet another tool on top of an already crowded stack. With uv, upgrading is built in. Start fresh and upgrade everything at once:
```shell
uv lock --upgrade
```

Or if you want to be more careful and upgrade one package at a time:

```shell
uv lock --upgrade-package boto3
```

Either way, the lockfile updates while pyproject.toml constraints remain intact. Run uv sync after to apply the changes.
## Inline Script Dependencies
For standalone scripts, uv supports PEP 723 inline script metadata. You declare dependencies directly in the script file:
```python
# /// script
# requires-python = ">=3.12"
# dependencies = [
#     "boto3>=1.34",
#     "rich",
# ]
# ///
import boto3
from rich.pretty import pprint

ec2 = boto3.client("ec2")
pprint(ec2.describe_regions())
```

Run it with:

```shell
uv run script.py
```

uv creates an isolated environment with those dependencies, runs the script, and discards the environment. No pip install, no virtual environment setup, no cleanup.
This matters a lot for MCP servers. Many MCP servers are single Python files that need a few packages. With uv, the server’s dependencies travel with the code. The MCP host configuration points to uv run server.py or uses uvx with a published package — and the environment is handled automatically.
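For a single-file server with inline metadata, the host configuration can point straight at uv. A hypothetical sketch — the server name and path here are placeholders, not from any real config:

```json
{
  "mcpServers": {
    "my-server": {
      "command": "uv",
      "args": ["run", "/path/to/server.py"]
    }
  }
}
```

Since the script carries its own PEP 723 dependency block, the host machine needs nothing installed beyond uv itself.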
## The uvx Command
Remember that MCP server config that started this whole journey? It didn’t use uv run — it used uvx. That distinction matters. Where uv run executes code inside your project’s managed environment, uvx is shorthand for uv tool run: it fetches a package into a temporary isolated environment, runs its command, and discards the environment when done. Nothing persists, nothing conflicts with your project.
```shell
uvx ruff check .
uvx black --check src/
uvx aws-lambda-powertools-utilities
```

When the package name differs from the command name:

```shell
uvx --from httpie http https://api.example.com
```

Pin to a specific version:

```shell
uvx ruff@0.3.0 check .
```

This is how MCP server configs use it. The uvx my-mcp-server pattern in a config file means: fetch the package, run it in isolation, keep nothing behind. No global pollution, no version conflicts between tools.

For tools you want permanently available, uv tool install adds them to your PATH:

```shell
uv tool install ruff
uv tool install black
```

## Performance
uv leads with performance on its website, and the numbers are real: the project claims 10–100x faster than pip. The gains come from being written in Rust, using parallel downloads, and aggressive caching with hardlinks or reflinks where the filesystem supports them.
Honestly, performance wasn’t why I adopted uv. Solving the version management and dependency chaos was enough. But the speed difference is noticeable. uv sync on a project with 20 dependencies takes about as long as pip install takes to read its own help text.
For CI pipelines and Lambda packaging workflows, faster installs mean cheaper builds. That’s a real benefit, not marketing.
## Is This Cloud Native?
uv is part of our standard toolchain for Python Lambda functions and MCP servers that integrate with AWS services. Every project that deploys to the cloud starts with uv init. The lockfile goes into version control. The build step is uv sync --frozen — fast, reproducible, no surprises.
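In a CI pipeline, that build step looks roughly like this — a sketch assuming GitHub Actions and the official astral-sh/setup-uv action; the test command is a placeholder, adjust for your runner:

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v5
      # --frozen fails the build if uv.lock is missing or out of date,
      # so CI can never silently drift from the committed lockfile.
      - run: uv sync --frozen
      - run: uv run pytest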
If you write Python and work with cloud infrastructure, uv removes a category of problems that used to require multiple tools and careful configuration. That’s reason enough to make the switch.
## References
- uv documentation — main entry point, includes performance claims and feature overview
- uv: Python version management — installing, pinning, and discovering Python versions
- uv: Project management guide — pyproject.toml, lockfiles, uv add and uv sync
- uv: Scripts guide — PEP 723 inline metadata, running standalone scripts
- uv: Tools guide — uvx, uv tool run, temporary and persistent tool installation
- uv: Benchmarks — performance methodology and comparison with pip and Poetry
- PEP 723 – Inline script metadata — the Python standard that uv’s inline dependency declarations implement