Mehdi Abaakouk
December 15, 2025 · 5 min read

Stop Lying to Your Dependency Resolver: The Real Rules for Python Dependency Management

Your Python app didn’t change: your dependencies did. This post explains why apps must pin dependencies, libraries must declare ranges, dev tools must be locked, and how to use lockfiles correctly with Poetry, PDM, and uv to avoid CI and production surprises.

Every engineer eventually learns this lesson the hard way: your app works locally, CI is green, you deploy… and something random blows up in production because an upstream package released a patch version two hours ago. You didn’t change anything, but your environment did.

This happens because most Python projects use dependency declarations that don’t accurately describe their requirements. And modern tools like PDM, Poetry, and uv are only as safe as the constraints you give them.

This post lays out the fundamental rules for writing dependencies in Python projects:

  • why applications must pin with **==**,
  • why libraries must declare ranges (**>=**, optionally **<**),
  • why dev dependencies must be pinned with **==** for everyone,
  • why lockfiles are non-negotiable,
  • how to install from the lockfile with PDM, Poetry, and uv,
  • and how to make Renovate/Dependabot open clean, meaningful PRs instead of chaotic lockfile churn.

The naïve setup: “Just use >= everywhere”

A lot of Python projects begin like this:

[project]
dependencies = [
    "fastapi>=0.100",
    "httpx>=0.26",
    "uvicorn>=0.30",
]

It looks flexible. It feels modern. You’re “not being too strict.”

But the next time you install this project (even just a week later) you silently get different versions. Your local environment, CI environment, and production environment subtly drift apart. Your dependency bot has nothing meaningful to upgrade. Eventually, something breaks in a way that’s hard to reproduce.

Because >= in an application manifest means:

“Install whatever you want as long as it’s not too old.”

But what you actually want is:

“Install exactly the versions my team tested.”

That requires a lockfile and consistently installing from it.
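One way to catch the naïve setup early is a quick manifest lint. This is a minimal sketch, assuming a PEP 621 `dependencies` list; the `unpinned` helper is hypothetical, not part of any real tool:

```python
# Flag application runtime dependencies that float instead of being
# pinned with ==. Toy check: real tools should parse full PEP 508
# requirements, including extras and environment markers.

def unpinned(dependencies: list[str]) -> list[str]:
    """Return requirement strings that lack an exact == pin."""
    flagged = []
    for req in dependencies:
        spec = req.split(";")[0]  # drop any environment marker
        if "==" not in spec:
            flagged.append(req)
    return flagged

deps = ["fastapi>=0.100", "httpx==0.27.2", "uvicorn>=0.30"]
print(unpinned(deps))  # → ['fastapi>=0.100', 'uvicorn>=0.30']
```

Running something like this in CI makes drift-prone manifests fail fast instead of failing in production.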

Rule #1a: Applications must pin runtime dependencies with ==

Applications (anything you deploy) require deterministic installs. That determinism comes from the lockfile, not from the pins themselves.

For apps:

  • runtime dependencies → pin with **==** so Renovate/Dependabot can open clean, per-package upgrade PRs (reproducibility comes from the lockfile),
  • lockfile → commit to version control,
  • always install strictly from the lockfile.

uv and PDM

[project]
dependencies = [
    "fastapi==0.115.0",
    "httpx==0.27.2",
    "uvicorn[standard]==0.30.6",
]

Poetry

[tool.poetry.dependencies]
fastapi = "0.115.0"
httpx = "0.27.2"
uvicorn = {version = "0.30.6", extras = ["standard"]}

Pinned app dependencies don’t make installs reproducible; the lockfile does. But pins do give you small, readable dependency PRs.

Runtime dependencies, though, are only half the story.

Rule #1b: Dev dependencies must use == too (for apps and libraries)

This is one of the most misunderstood aspects of dependency hygiene.

Your dev environment controls:

  • how pytest discovers tests,
  • how mypy type-checks,
  • what rules ruff enforces,
  • how docs build,
  • how linters behave.

If dev dependencies float, then:

  • CI becomes non-deterministic,
  • formatting/linting change unpredictably,
  • tests fail because tools changed,
  • Renovate/Dependabot cannot create proper upgrade PRs.

Proper dev dependencies look like this:

Poetry

[tool.poetry.group.dev.dependencies]
pytest = "8.2.1"
mypy = "1.12.0"
ruff = "0.6.5"

uv and PDM

[project.optional-dependencies]
dev = [
    "pytest==8.2.1",
    "mypy==1.12.0",
    "ruff==0.6.5",
]

If instead you write:

pytest = ">=8"

then bumping pytest from 8.2 → 8.3 doesn’t change the manifest, so dependency bots have nothing to update. The change gets buried inside the lockfile. That’s how you end up with giant “update lockfile” PRs that update ten packages at once.

Pinned dev dependencies = one PR per tool = clean, predictable CI.
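The mechanics are easy to see in code. This sketch (the `manifest_changes_needed` helper is mine, purely for illustration; real bots do full PEP 440 specifier parsing) shows why only `==` pins give a bot a visible diff to open a PR against:

```python
# Does a new upstream release require editing the manifest?
# Only manifest edits show up as clean one-line dependency-bot PRs.

def manifest_changes_needed(spec: str, new_version: str) -> bool:
    if spec.startswith("=="):
        # Exact pin: every upstream release means a visible manifest diff.
        return spec != f"=={new_version}"
    # A floating range like ">=8" already admits the new version, so the
    # upgrade is buried in the lockfile with no manifest change at all.
    return False

print(manifest_changes_needed("==8.2.1", "8.3.0"))  # → True: clean bot PR
print(manifest_changes_needed(">=8", "8.3.0"))      # → False: lockfile churn only
```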

Rule #2: Libraries use ranges for runtime dependencies (>=, optional <)

Applications need deterministic versions.

Libraries need compatibility ranges.

Good:

dependencies = [
    "requests>=2.32,<3.0",
    "pydantic>=2.5,<3.0",
]

Bad:

dependencies = [
    "requests==2.32.3",  # Breaks downstream apps
]

Library runtime dependencies must:

  • specify the earliest supported version (>=),
  • optionally cap at the next major (<3.0) if following semver,
  • let downstream applications choose exact versions using their lockfile.
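To make the "breaks downstream apps" comment concrete, here is a toy conflict check, assuming the simplest case of exact pins on the same package (the `exact_pins_conflict` name is illustrative, not a real resolver API):

```python
# If two libraries both pin the same package with ==, any app depending
# on both becomes unresolvable: no single version satisfies both pins.

def exact_pins_conflict(requirements: list[str]) -> bool:
    pins: dict[str, str] = {}
    for req in requirements:
        if "==" in req:
            name, version = req.split("==")
            if pins.setdefault(name, version) != version:
                return True  # two incompatible exact pins on one package
    return False

# lib_a pins 2.31.0, lib_b pins 2.32.3 → the app can never satisfy both:
print(exact_pins_conflict(["requests==2.31.0", "requests==2.32.3"]))      # → True
# Ranges leave room for the app's lockfile to choose one version:
print(exact_pins_conflict(["requests>=2.31,<3.0", "requests>=2.32,<3.0"]))  # → False
```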

But library dev dependencies (e.g., pytest, mypy, ruff, sphinx, etc.) must still be fully pinned for reproducibility.

A well‑maintained library should also test against both ends of its supported dependency ranges. That means running the test suite at least once with:

  • the lowest supported version (e.g., requests>=2.32 → test with 2.32.x), and
  • the highest currently allowed version (e.g., <3.0 → test with the latest 2.x).

This ensures that when upstream releases new versions within your declared range, CI catches breakages early, and your dependency constraints remain honest.
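One manual way to build the lowest-bound environment is to rewrite each `>=` lower bound into an exact pin and install that set before running the suite. A minimal sketch, assuming simple `name>=version` specifiers without extras (the `lowest_pins` helper is hypothetical):

```python
import re

def lowest_pins(requirements: list[str]) -> list[str]:
    """Rewrite '>=X' lower bounds as '==X' pins for a minimum-versions job."""
    pinned = []
    for req in requirements:
        match = re.match(r"([A-Za-z0-9_.\-]+)\s*>=\s*([0-9][^,;\s]*)", req)
        if match:
            pinned.append(f"{match.group(1)}=={match.group(2)}")
        else:
            pinned.append(req)  # leave pins, extras, etc. untouched
    return pinned

print(lowest_pins(["requests>=2.32,<3.0", "pydantic>=2.5,<3.0"]))
# → ['requests==2.32', 'pydantic==2.5']
```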

And if you’re using uv, the experience is even better: it provides purpose-built commands for preparing environments at both ends of your supported ranges:

uv sync --resolution lowest
uv sync --resolution highest

These commands resolve dependencies to the lowest or highest versions allowed by your constraints and generate environments from the lockfile accordingly. This makes it trivial for library authors to run:

  • lowest‑supported‑versions test job, and
  • highest‑supported‑versions test job,

ensuring full coverage of your declared compatibility contract.

Rule #3: Always use a lockfile-aware package manager

Python has finally caught up to ecosystems like Node and Rust. Modern tools all support lockfiles:

  • pdm.lock
  • poetry.lock
  • uv.lock

Your workflow should always be:

  1. Change version constraints in pyproject.toml
  2. Run an explicit update command
  3. Commit both the manifest and the lockfile
  4. Install from the lockfile everywhere else

Lockfiles turn your environment from “whatever the resolver picks today” into “exactly what we agreed on.”

How to install strictly from the lockfile

Correct CI / Docker / production installation commands:

PDM: install from lockfile

pdm sync

Poetry: strict lockfile install

poetry install --sync

uv: strict lockfile install

uv sync --frozen

These ensure:

  • exact versions from the lockfile
  • no resolution
  • no modifications to the lockfile
  • deterministic builds and deploys

Why Renovate and Dependabot behave much better with proper pins

Dependency bots rely on your manifest to express intent.

If your manifest contains:

  • only >= in apps, or
  • unpinned dev dependencies

then bots cannot produce clean PRs.

They can only regenerate lockfiles and dump a pile of version changes on you.

But with pinned versions:

fastapi = "0.115.0"
pytest = "8.2.1"

bots can create atomic, reviewable PRs:

chore(deps): bump fastapi from 0.115.0 to 0.115.1
chore(dev-deps): bump pytest from 8.2.1 to 8.2.2

This makes automated merging safe, predictable, and auditable.

Personal reflection

After adopting these rules, I stopped treating dependency declarations as “just metadata.” They are the contract your project has with its ecosystem. Apps demand determinism, libraries demand compatibility, and your dev tooling demands reproducibility.

Now, whenever I add a package, I ask a simple question:

“Is this a pin or a range?”

That one decision prevents CI noise, deployment failures, and mysterious lock file churn. It’s a small habit that pays off every single day.

