Running Tests¶
This guide covers all methods for running tests in RAG Modulo.
Quick Commands¶
Atomic Tests (Fastest - ~5 seconds)¶
Fast schema and data structure tests. No database required.
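A likely invocation, assuming the Makefile exposes a test-atomic target (the test-all chain below runs the atomic stage first):
# Atomic tests only (schema and data structure checks)
make test-atomic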
Unit Tests (Fast - ~30 seconds)¶
Unit tests with mocked dependencies. No external services required.
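Similarly, assuming a test-unit Makefile target:
# Unit tests only (mocked dependencies)
make test-unit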
Integration Tests (Medium - ~2 minutes)¶
# Local mode (reuses dev infrastructure)
make test-integration
# CI mode (isolated containers)
make test-integration-ci
# Parallel execution
make test-integration-parallel
Tests with real services (Postgres, Milvus, MinIO). Local mode reuses local-dev-infra containers for speed.
End-to-End Tests (Slower - ~5 minutes)¶
# Local with TestClient (in-memory)
make test-e2e
# CI mode with isolated backend
make test-e2e-ci
# Parallel execution
make test-e2e-ci-parallel
make test-e2e-local-parallel
Full system tests from API to database.
Run All Tests¶
# Local: atomic → unit → integration → e2e
make test-all
# CI: atomic → unit → integration-ci → e2e-ci-parallel
make test-all-ci
Coverage Reports¶
# Generate HTML coverage report (60% minimum)
make coverage
# Report available at: htmlcov/index.html
open htmlcov/index.html
Direct pytest Commands¶
Run Specific Test File¶
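For example (the file path here is illustrative, not an actual test module in the repository):
# Run a single test file
poetry run pytest tests/unit/test_example.py -v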
Run Tests by Marker¶
# Unit tests only
poetry run pytest tests/ -m unit
# Integration tests only
poetry run pytest tests/ -m integration
# E2E tests only
poetry run pytest tests/ -m e2e
# Atomic tests only
poetry run pytest tests/ -m atomic
Run with Verbose Output¶
- -v - Verbose output
- -s - Show print statements
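For example, combining both flags:
# Verbose output with print statements shown
poetry run pytest tests/ -v -s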
Run Specific Test Function¶
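Use pytest's file::function syntax; the path and test name below are placeholders:
# Run one test function from one file
poetry run pytest tests/unit/test_example.py::test_example_function -v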
Run with Coverage¶
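With pytest-cov, a direct invocation looks roughly like the following; the package name passed to --cov is a placeholder, as the make coverage target above already applies the project's real coverage settings:
# Run tests with a coverage report
poetry run pytest tests/ --cov=backend --cov-report=html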
Parallel Test Execution¶
Use pytest-xdist for parallel execution:
# Auto-detect CPU cores
poetry run pytest tests/unit/ -n auto
# Specify number of workers
poetry run pytest tests/unit/ -n 4
Test Filtering¶
Run Tests Matching Pattern¶
# Run all tests with "search" in name
poetry run pytest tests/ -k search
# Run tests NOT matching pattern
poetry run pytest tests/ -k "not slow"
Run Last Failed Tests¶
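Uses pytest's built-in --lf (last-failed) flag:
# Re-run only the tests that failed last time
poetry run pytest tests/ --lf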
Run Failed Tests First¶
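Uses the --ff (failed-first) flag:
# Run previously failed tests first, then the rest
poetry run pytest tests/ --ff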
Test Output Options¶
Show Local Variables on Failure¶
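Uses the -l / --showlocals flag:
# Print local variables in failure tracebacks
poetry run pytest tests/ -l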
Show Test Summary¶
- -ra - Show a summary of all test results
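For example:
# Summary of all outcomes (failed, skipped, xfailed, etc.)
poetry run pytest tests/ -ra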
Stop on First Failure¶
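Uses the -x flag:
# Exit as soon as one test fails
poetry run pytest tests/ -x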
Stop After N Failures¶
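Uses --maxfail; the value here is just an example:
# Exit after three failures
poetry run pytest tests/ --maxfail=3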
Debugging Tests¶
Run with PDB Debugger¶
Drops into the Python debugger (pdb) on test failure.
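# Open pdb at the point of failure
poetry run pytest tests/ --pdb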
Show Print Statements¶
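Uses the -s flag to disable output capturing:
# Show print output while tests run
poetry run pytest tests/ -s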
CI/CD Test Execution¶
Tests run automatically in GitHub Actions:
On Every PR¶
On Push to Main¶
See CI/CD Documentation for workflow details.
Test Requirements¶
Before running integration tests:
# Start infrastructure services
make local-dev-infra
# Verify services are running
docker compose ps
Troubleshooting¶
Tests Failing Locally¶
- Clean Python cache
- Restart infrastructure
- Reinstall dependencies
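A sketch of these three steps; the exact cache paths and restart sequence are assumptions, so adapt them to your setup:
# Clean Python and pytest caches
find . -type d -name __pycache__ -exec rm -rf {} +
rm -rf .pytest_cache
# Restart infrastructure (see Test Requirements above)
docker compose down
make local-dev-infra
# Reinstall dependencies
poetry install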
Database Connection Issues¶
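The Postgres service name below is an assumption; check docker compose ps for the actual name in your compose file:
# Check Postgres is running
docker compose ps postgres
# View logs
docker compose logs postgres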
Vector Database Issues¶
# Check Milvus is running
docker compose ps milvus-standalone
# View logs
docker compose logs milvus-standalone
See Also¶
- Testing Strategy - Overall testing approach
- Test Categories - Detailed category descriptions
- Development Workflow - Development process