November 03, 2025
Warning
This blog post has been written by an LLM for the most part. The image above has been generated by a soulless ghoul called Gemini 2.5. I hope you still find the blog useful!
I've recently revisited building a GitHub CI pipeline for a Python project, including coverage reporting, using only free-as-in-beer tooling, and I've landed in a pretty nice place.
The most interesting parts of the toolchain in this setup are:
- py-cov-action: Coverage reporting without external services like Codecov or Coveralls
- pytest-xdist: Parallel test execution using all available CPU cores
- uv: Package management that's 10-100× faster than pip
Each individual tool is well documented on its own, but the integration has a few nuances that aren't written up anywhere in one place, and the interplay can get a bit tricky. That's what this post focuses on.
Six critical gotchas
These silent failures can easily cost you a couple of hours: each one makes your CI look successful while coverage is actually broken.
Gotcha #1: Using coverage run -m pytest with xdist
Symptom: Coverage shows 0% or drastically low percentages.
Root cause: coverage run -m pytest -n auto bypasses pytest-cov's xdist support. Coverage.py only sees the main process, not the workers.
The fix: Always use pytest -n auto --cov
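For concreteness, here is the broken invocation next to the working one (mypackage is a placeholder for your package):

```bash
# Broken: coverage.py wraps only the main pytest process; the xdist workers escape it
coverage run -m pytest -n auto

# Working: pytest-cov hooks into each xdist worker and combines their data
pytest -n auto --cov=mypackage --cov-report=xml
```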
Gotcha #2: Missing relative_files = true
Symptom: Coverage works locally but shows 0% in CI.
Root cause: Coverage.py uses absolute paths by default (/home/runner/work/myproject/myproject/file.py). These don't match GitHub's file structure.
The fix in pyproject.toml:
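The minimal setting looks like this:

```toml
[tool.coverage.run]
relative_files = true
```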
Without this, py-cov-action can't map coverage data to source files.
Gotcha #5: Missing pytest-cov plugin
Symptom: pytest: error: unrecognized arguments: --cov
Root cause: pytest-cov not installed.
The fix in pyproject.toml:
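One way to declare it, using uv's dev dependency group (version pins are illustrative):

```toml
[dependency-groups]
dev = [
    "pytest>=8",
    "pytest-cov>=6",
    "pytest-xdist>=3",
]
```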
Gotcha #6: E2E tests with subprocesses contribute 0% coverage
Symptom: E2E tests with Playwright or Selenium pass but show 0% coverage for server code.
Root cause: Tests spawn a subprocess. Coverage.py in the parent process can't measure the child.
The problem in practice:
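A sketch of how this typically happens; the module and app names are made up, and the page fixture is the one pytest-playwright provides:

```python
# test_e2e.py -- illustrative only
import subprocess
import time

import pytest


@pytest.fixture()
def server():
    # The app server runs in a *child* process. Coverage.py, running in the
    # pytest process, never sees the lines executed over here.
    proc = subprocess.Popen(
        ["python", "-m", "uvicorn", "myapp.main:app", "--port", "8000"]
    )
    time.sleep(2)  # crude wait for startup
    yield "http://localhost:8000"
    proc.terminate()
    proc.wait()


def test_homepage(server, page):
    page.goto(server)
    assert "Welcome" in page.content()
```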
The fix: set the COVERAGE_PROCESS_START environment variable and launch the subprocess via coverage run -m:
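Adapting the hypothetical fixture from above, the sketch becomes:

```python
# conftest.py -- illustrative only
import os
import subprocess
import time

import pytest


@pytest.fixture()
def server():
    env = os.environ.copy()
    # Point coverage.py at its config so the child process measures itself too
    env["COVERAGE_PROCESS_START"] = "pyproject.toml"
    # Launch through `coverage run -m ...` instead of plain `python -m ...`
    proc = subprocess.Popen(
        ["coverage", "run", "-m", "uvicorn", "myapp.main:app", "--port", "8000"],
        env=env,
    )
    time.sleep(2)
    yield "http://localhost:8000"
    proc.terminate()
    proc.wait()
```

Depending on your setup you will probably also want parallel = true under [tool.coverage.run] and a coverage combine step, so the data files written by the parent and the child get merged.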
That said, take a step back and consider whether you want E2E tests to count towards coverage at all. Some would argue it inflates the numbers, but it can also be useful to look at E2E coverage in isolation. Either way, be aware that any frontend code written in a language other than Python won't be tracked.
Some reasons why you might not want to include coverage for E2E tests:
- Unit tests already cover most backend logic directly
- Integration tests already hit the same API endpoints
- Coverage.py only measures Python code, not JavaScript
- E2E tests primarily verify frontend/backend integration
Complete working example
Here's a production-ready setup you can copy and adapt. These files work together to provide parallel testing, coverage reporting, and fork-safe PR comments.
Note
This example uses the two-workflow pattern for fork PR support described earlier.
.github/workflows/ci.yml
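A sketch of the main workflow. The action versions, artifact names, and py-cov-action inputs are the ones I believe are current, but double-check them against each action's README before copying; the companion workflow_run workflow that posts comments for fork PRs (described earlier) is not repeated here.

```yaml
name: CI

on:
  push:
    branches: [main]
  pull_request:

jobs:
  test:
    runs-on: ubuntu-latest
    permissions:
      contents: write        # lets py-cov-action push the badge/data branch
      pull-requests: write   # lets it comment on same-repo PRs
    steps:
      - uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v5

      - name: Install dependencies
        run: uv sync

      - name: Run tests in parallel with coverage
        run: uv run pytest -n auto --cov=mypackage --cov-report=xml

      - name: Coverage comment
        id: coverage_comment
        uses: py-cov-action/python-coverage-comment-action@v3
        with:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

      - name: Store PR comment for the fork-PR workflow
        if: steps.coverage_comment.outputs.COMMENT_FILE_WRITTEN == 'true'
        uses: actions/upload-artifact@v4
        with:
          name: python-coverage-comment-action
          path: python-coverage-comment-action.txt

      - name: Upload coverage data
        uses: actions/upload-artifact@v4
        with:
          name: coverage-report
          path: .coverage
          include-hidden-files: true
```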
pyproject.toml configuration
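And a matching pyproject.toml sketch; the package name, version pins, and the choice to bake the test flags into addopts are all illustrative:

```toml
[project]
name = "mypackage"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = []

[dependency-groups]
dev = [
    "pytest>=8",
    "pytest-cov>=6",
    "pytest-xdist>=3",
]

[tool.pytest.ini_options]
addopts = "-n auto --cov=mypackage --cov-report=xml --cov-report=term-missing"

[tool.coverage.run]
relative_files = true
source = ["mypackage"]
branch = true

[tool.coverage.report]
show_missing = true
```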
Verifying your setup
After pushing these files, here's how to verify everything works:
Check relative paths
Run tests locally and inspect coverage.xml:
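For example (package name is a placeholder):

```bash
# run the suite and produce coverage.xml
uv run pytest -n auto --cov=mypackage --cov-report=xml

# spot-check the paths recorded in the report
grep -o 'filename="[^"]*"' coverage.xml | head
```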
You should see relative paths like filename="mypackage/cli.py", not absolute paths like /home/runner/work/....
Verify artifact upload
In GitHub Actions:
- Go to your workflow run
- Scroll to "Artifacts"
- Download coverage-report
- Verify .coverage file exists inside
What success looks like
Locally:
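Roughly what a healthy local run looks like; shape only, your numbers will differ:

```bash
uv run pytest -n auto --cov=mypackage
# Expect:
#   - xdist spinning up one worker per CPU core
#   - all tests passing
#   - a coverage table ending in a TOTAL line with your overall percentage
```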
In GitHub Actions, you'll see:
- Test Results check with pass/fail counts
- Coverage comment on PR with diff
- Line-by-line annotations on changed files
- Badge in python-coverage-comment-action-data branch
Migration notes
From Codecov/Coveralls
Replace your codecov step:
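A sketch of the swap; the codecov step shown here is the typical current usage, so adjust it to whatever you actually have:

```yaml
# Before: external service, token required for private repos
- uses: codecov/codecov-action@v5
  with:
    token: ${{ secrets.CODECOV_TOKEN }}

# After: everything stays inside GitHub
- uses: py-cov-action/python-coverage-comment-action@v3
  with:
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```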
No external token needed. Badge URL changes from codecov.io/gh/USER/REPO/badge.svg to github.com/USER/REPO/raw/python-coverage-comment-action-data/BADGE.svg.
From pip to uv
Replace in your workflow:
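Roughly:

```yaml
# Before
- uses: actions/setup-python@v5
  with:
    python-version: "3.12"
- run: pip install -r requirements.txt

# After
- uses: astral-sh/setup-uv@v5
- run: uv sync
```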
Create pyproject.toml with your dependencies and run uv lock to generate the lockfile.
Next steps
- Copy the workflow files above
- Add relative_files = true to your pyproject.toml
- Push to GitHub and watch CI run
- Add the badge to your README from the data branch
For slow tests, mark them with @pytest.mark.slow and run them separately. For coverage gaps, focus on unit tests for business logic.
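If you go the marker route, the usual pattern is to register the marker so pytest doesn't warn about it (the marker name is whatever you choose):

```toml
[tool.pytest.ini_options]
markers = ["slow: long-running tests, excluded from the default CI run"]
```

Then decorate the slow tests with @pytest.mark.slow, run pytest -m "not slow" in the fast job, and pytest -m slow in a separate (for example nightly) job.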