Running Tests

This guide covers all methods for running tests in RAG Modulo.

Quick Commands

Atomic Tests (Fastest - ~5 seconds)

make test-atomic

Fast schema and data structure tests. No database required.

Unit Tests (Fast - ~30 seconds)

make test-unit-fast

Unit tests with mocked dependencies. No external services required.

Integration Tests (Medium - ~2 minutes)

# Local mode (reuses dev infrastructure)
make test-integration

# CI mode (isolated containers)
make test-integration-ci

# Parallel execution
make test-integration-parallel

Tests with real services (Postgres, Milvus, MinIO). Local mode reuses local-dev-infra containers for speed.

End-to-End Tests (Slower - ~5 minutes)

# Local with TestClient (in-memory)
make test-e2e

# CI mode with isolated backend
make test-e2e-ci

# Parallel execution
make test-e2e-ci-parallel
make test-e2e-local-parallel

Full system tests from API to database.

Run All Tests

# Local: atomic → unit → integration → e2e
make test-all

# CI: atomic → unit → integration-ci → e2e-ci-parallel
make test-all-ci

Coverage Reports

# Generate HTML coverage report (60% minimum)
make coverage

# Report available at: htmlcov/index.html
open htmlcov/index.html
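
To enforce the threshold when invoking pytest directly, pytest-cov's --cov-fail-under flag does the same job (a sketch; assumes the same source path as the make target):

# Fail the run if total coverage drops below 60%
poetry run pytest tests/unit/ --cov=backend/rag_solution --cov-fail-under=60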

Direct pytest Commands

Run Specific Test File

poetry run pytest tests/unit/services/test_search_service.py -v

Run Tests by Marker

# Unit tests only
poetry run pytest tests/ -m unit

# Integration tests only
poetry run pytest tests/ -m integration

# E2E tests only
poetry run pytest tests/ -m e2e

# Atomic tests only
poetry run pytest tests/ -m atomic
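
Markers accept boolean expressions, so suites can be combined or excluded in one run (the slow marker below is illustrative; use whatever markers your project registers):

# Run unit and atomic tests together
poetry run pytest tests/ -m "unit or atomic"

# Run integration tests, excluding any also marked slow
poetry run pytest tests/ -m "integration and not slow"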

Run with Verbose Output

poetry run pytest tests/unit/ -v -s
  • -v - Verbose output
  • -s - Show print statements

Run Specific Test Function

poetry run pytest tests/unit/services/test_search_service.py::test_search_basic -v
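
Parametrized cases are addressable the same way via their bracketed ID (the ID below is hypothetical; quote the path so the shell doesn't expand the brackets):

poetry run pytest "tests/unit/services/test_search_service.py::test_search_basic[empty-query]" -v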

Run with Coverage

poetry run pytest tests/unit/ --cov=backend/rag_solution --cov-report=html
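
To see uncovered line numbers directly in the terminal, pytest-cov's term-missing report works alongside (or instead of) HTML:

poetry run pytest tests/unit/ --cov=backend/rag_solution --cov-report=term-missing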

Parallel Test Execution

Use pytest-xdist for parallel execution:

# Auto-detect CPU cores
poetry run pytest tests/unit/ -n auto

# Specify number of workers
poetry run pytest tests/unit/ -n 4
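
If tests share module- or class-scoped fixtures, pytest-xdist's loadscope distribution keeps each scope on a single worker so fixtures aren't set up repeatedly:

# Group tests by fixture scope across workers
poetry run pytest tests/unit/ -n auto --dist loadscope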

Test Filtering

Run Tests Matching Pattern

# Run all tests with "search" in name
poetry run pytest tests/ -k search

# Run tests NOT matching pattern
poetry run pytest tests/ -k "not slow"
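
Like -m, the -k option accepts boolean expressions ("milvus" below is an illustrative name):

# Combine patterns
poetry run pytest tests/ -k "search and not milvus"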

Run Last Failed Tests

poetry run pytest --lf

Run Failed Tests First

poetry run pytest --ff
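
These combine well with -x for a tight fix loop:

# Rerun only the last failures and stop at the first one
poetry run pytest --lf -x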

Test Output Options

Show Local Variables on Failure

poetry run pytest tests/unit/ -l

Show Test Summary

poetry run pytest tests/unit/ -ra
  • -ra - Show a short summary of all non-passing outcomes (failures, errors, skips, xfails)

Stop on First Failure

poetry run pytest tests/unit/ -x

Stop After N Failures

poetry run pytest tests/unit/ --maxfail=3

Debugging Tests

Run with PDB Debugger

poetry run pytest tests/unit/services/test_search_service.py --pdb

Drops into the Python debugger when a test fails.
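
Combine with -x to stop at the first failing test's debugger session, or use --trace to enter the debugger at the start of each selected test:

# Stop at the first failure and open the debugger there
poetry run pytest tests/unit/services/test_search_service.py --pdb -x

# Step through each selected test from its first line
poetry run pytest tests/unit/services/test_search_service.py --trace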

Show Print Statements

poetry run pytest tests/unit/ -s

CI/CD Test Execution

Tests run automatically in GitHub Actions:

On Every PR

# Runs: atomic + unit tests (~2 min)
make test-atomic
make test-unit-fast

On Push to Main

# Runs: all tests including integration (~5 min)
make test-all-ci

See CI/CD Documentation for workflow details.

Test Requirements

Before running integration tests:

# Start infrastructure services
make local-dev-infra

# Verify services are running
docker compose ps
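
An optional readiness probe before kicking off the suite (the postgres service name matches the compose file used here; the user is an assumption, so adjust to your configuration):

# Returns non-zero until PostgreSQL accepts connections
docker compose exec postgres pg_isready -U postgres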

Troubleshooting

Tests Failing Locally

  1. Clean Python cache

    find . -type d -name __pycache__ -exec rm -r {} +
    

  2. Restart infrastructure

    make local-dev-stop
    make local-dev-infra
    

  3. Reinstall dependencies

    poetry install --with dev,test
    

Database Connection Issues

# Check PostgreSQL is running
docker compose ps postgres

# View logs
docker compose logs postgres
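
A quick connectivity probe (user and database names are assumptions; adjust to your compose configuration):

# One-off query to confirm the server accepts connections
docker compose exec postgres psql -U postgres -c "SELECT 1"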

Vector Database Issues

# Check Milvus is running
docker compose ps milvus-standalone

# View logs
docker compose logs milvus-standalone
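
Milvus standalone also exposes an HTTP health endpoint (9091 is its default metrics port; adjust if your compose file remaps it or doesn't publish it to the host):

# Exits non-zero if Milvus reports unhealthy
curl -f http://localhost:9091/healthz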

See Also