Starter project demonstrating backend API-server best practices: unified REST & GraphQL endpoints, layered architecture, shared SQLModel models, and a modern tooling stack (uv, Alembic, Ruff, pytest).
- REST endpoints: `/health-check`, `/ping`, `/version` (and a base API router at `/api`)
- GraphQL at `/graphql` with GraphiQL UI
- CLI via the `api-server` console command (Typer) and `python -m api_server.main`
- Database layer using SQLModel/SQLAlchemy; migrations via Alembic
- Taskfile-driven workflow (uv venv/install, ruff format/lint, pytest, alembic, run)
- Language: Python 3.13+
- Web Framework: FastAPI
- GraphQL: Strawberry GraphQL (with FastAPI integration)
- ORM: SQLModel (SQLAlchemy-based)
- Database: PostgreSQL (via psycopg)
- Migration: Alembic
- Package Manager: uv (faster alternative to pip)
- CLI Framework: Typer
- Linting & Formatting: ruff
- Testing: pytest
- Environment: python-dotenv
- Logging: loguru
- Task Runner: Taskfile (Go Task)
- API Docs: OpenAPI (Swagger UI), ReDoc
- GraphQL UI: GraphiQL
- Templates: Jinja2
```
python-api-server/
├── src/
│   └── api_server/               # Main application package
│       ├── api/                  # REST API endpoints
│       │   ├── api_router.py     # API router configuration
│       │   ├── health_check.py   # /health-check endpoint
│       │   ├── ping.py           # /ping endpoint
│       │   └── version.py        # /version endpoint
│       ├── graphql/              # GraphQL schema and router
│       │   ├── graphql_router.py # /graphql with GraphiQL
│       │   ├── context.py        # GraphQL context and dependencies
│       │   ├── schema.py         # GraphQL schema definitions (queries, mutations)
│       │   └── types.py          # GraphQL type definitions (Strawberry models)
│       ├── models/               # Data models (SQLModel)
│       ├── services/             # Business logic and services
│       ├── templates/            # HTML templates (index)
│       ├── app.py                # FastAPI application setup
│       ├── database.py           # Database connection and utilities
│       ├── logging.py            # Logging configuration
│       └── main.py               # CLI entry (Typer) and uvicorn runner
├── migrations/                   # Database migrations (Alembic)
│   ├── versions/
│   ├── env.py
│   └── script.py.mako
├── Taskfile.yml                  # Common development tasks (uv, ruff, pytest, alembic)
├── pyproject.toml                # Project metadata, deps, tooling
├── setup.py                      # Package setup shim
├── alembic.ini                   # Alembic configuration
├── Dockerfile                    # Containerization
├── docs/                         # Additional docs
└── README.md
```
- Python 3.13 or higher
- uv package manager (faster alternative to pip)
```shell
# Install dependencies using uv directly
uv pip install -e ".[dev]"

# Or using Task (recommended)
task install
```

The backend uses Task as a unified task runner (simple YAML definitions).
Pre-commit hooks run `task fct` (format, check, test) to enforce style and correctness before commits.

To install the pre-commit hooks:

```shell
# Install pre-commit hooks
task pre-commit:install
```

This will install hooks that run automatically on `git commit` and `git push` operations. You can also run the hooks manually on all files:

```shell
# Run pre-commit hooks on all files
task pre-commit:run
```

The backend uses uv for virtual environment creation and package management. The Taskfile is configured to invoke the Python executable in the virtual environment through uv directly, so there is no need to activate the virtual environment manually. This approach ensures compatibility across different shell environments.
This project's Taskfile provides the following commands:
- `task setup` - Create a Python virtual environment using uv
- `task install` - Install the package and development dependencies using uv
- `task clean` - Clean build artifacts
- `task clean:all` - Clean all artifacts, including the virtual environment
- `task format` - Format code using ruff
- `task lint` - Run the ruff linter (auto-fix enabled)
- `task test` - Run tests with pytest
- `task test:cov` - Run tests with a coverage report
- `task check` - Run format and lint
- `task build` - Build the package
- `task db:setup` - Set up the database and run all migrations
- `task db:migrate` - Generate a new migration
- `task db:upgrade` - Run all pending migrations
- `task db:downgrade` - Roll back the last migration
- `task db:reset` - Reset the database (WARNING: this will delete all data)
- `task run` - Run the application without auto-reload
- `task run:dev` - Run the application with auto-reload for development
- `task fct` - Format, check, test
- `task pre-commit:install` - Install pre-commit hooks
- `task pre-commit:update` - Update pre-commit hooks
- `task pre-commit:run` - Run pre-commit hooks on all files
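For orientation, a few of these tasks might be defined in `Taskfile.yml` along these lines (a sketch; the actual file may differ in task names and commands):

```yaml
version: "3"

tasks:
  format:
    desc: Format code using ruff
    cmds:
      - uv run ruff format src tests

  lint:
    desc: Run ruff linter with auto-fix
    cmds:
      - uv run ruff check --fix src tests

  test:
    desc: Run tests with pytest
    cmds:
      - uv run pytest

  fct:
    desc: Format, check, test
    cmds:
      - task: format
      - task: lint
      - task: test
```

Composite tasks like `fct` simply chain the smaller ones, which is why the pre-commit hooks can reuse them directly.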
Run tasks from the project root, e.g.:

```shell
task run:dev
```

For development with hot reloading (the server restarts automatically when code changes):

```shell
# Start the development server with hot reloading
task run:dev
```

This is the recommended mode for development.
Runtime configuration is centralized in a Pydantic Settings model (`api_server.config.Settings`).

Resolution order (highest precedence last):

1. Default values embedded in `Settings`
2. `.env` file (if present), loaded automatically
3. Environment variables with the prefix `API_SERVER_`
4. CLI overrides (Typer arguments) passed to `python -m api_server.main` or the `api-server` entrypoint
Access the settings instance anywhere:

```python
from api_server.config import get_settings

settings = get_settings()
print(settings.host, settings.port)
```

Inside FastAPI routes you can depend on it:

```python
from fastapi import Depends

from api_server.config import Settings, get_settings


@app.get("/info")
def info(settings: Settings = Depends(get_settings)):
    return {"host": settings.host, "port": settings.port}
```

When the log level is DEBUG or TRACE, settings are logged once at startup (non-secret fields only).
Current `Settings` fields (extend as needed):

| Field | Env Var | Default | Description |
|---|---|---|---|
| `host` | `API_SERVER_HOST` | `0.0.0.0` | Interface to bind |
| `port` | `API_SERVER_PORT` | `8080` | Port to listen on |
| `log_level` | `API_SERVER_LOG_LEVEL` | `INFO` | TRACE / DEBUG / INFO / WARNING / ERROR |
| `sql_log` | `API_SERVER_SQL_LOG` | `False` | Enable SQLAlchemy SQL echo logging |
| `reload` | `API_SERVER_RELOAD` | `True` | Auto-reload in dev (CLI override supported) |
Add a database URL or secrets by appending fields to `Settings`:

```python
class Settings(BaseSettings):
    database_url: str = Field(..., description="Primary database DSN")  # required
    # secret_token: SecretStr | None = None  # example secret
```

Then export `API_SERVER_DATABASE_URL` (and optionally add it to `.env`). Pydantic enforces required fields.
Example `.env` file:

```shell
API_SERVER_HOST=0.0.0.0
API_SERVER_PORT=8080
API_SERVER_LOG_LEVEL=DEBUG
API_SERVER_SQL_LOG=true
API_SERVER_DATABASE_URL=postgresql+psycopg://postgres:[email protected]:5432/postgres
```

Variables can live in `.env` (see `.env.example` for a template).
CLI arguments override environment-driven settings for that invocation:

| Argument | Effect |
|---|---|
| `--host` / `--port` | Override bind interface/port |
| `--log-level` | Override `log_level` (case-insensitive) |
| `--reload` / `--no-reload` | Force enable/disable auto-reload |
| `--sql-log` / `--no-sql-log` | Toggle SQL logging |
Examples:

```shell
api-server --host=127.0.0.1 --port=9000 --log-level=DEBUG --sql-log
python -m api_server.main --no-reload --log-level=WARNING
```

If you add secrets (e.g., `SecretStr`), avoid dumping them verbatim:

```python
if settings.log_level in {"DEBUG", "TRACE"}:
    safe = settings.model_dump(exclude={"secret_token"})
    logger.debug(f"Effective settings: {safe}")
```

```shell
# Run the server with default settings (port 8080)
python -m api_server.main

# Run the server on a specific port
python -m api_server.main --port=8081

# Run the server with a custom host
python -m api_server.main --host=127.0.0.1 --port=8080

# Run the server without auto-reload
python -m api_server.main --no-reload

# Using the installed console script (via Typer)
api-server --host=127.0.0.1 --port=8080 --log-level=INFO
```

Or run via Task:
```shell
# Run with default settings
task run

# Run on a specific port
task run -- --port=8080
```

Alembic handles database migrations:
```shell
# Configure your database connection in the .env file
API_SERVER_DATABASE_URL="postgresql+psycopg://username:password@hostname:port/database"

# Initialize the database with all migrations
task db:setup
```

After making changes to your database models, create a new migration:

```shell
# Generate a migration with a descriptive message
task db:migrate -- "Add patient addresses table"
```

To apply all pending migrations:

```shell
task db:upgrade
```

To roll back the last migration:

```shell
task db:downgrade
```

To reset the database:

```shell
task db:reset
```

Warning: This will delete all data in the database.

API documentation is available at:
- Swagger UI: http://localhost:8080/docs
- ReDoc: http://localhost:8080/redoc
- OpenAPI JSON: http://localhost:8080/openapi.json
- GraphiQL (GraphQL): http://localhost:8080/graphql
Health-related endpoints:

- Health check: http://localhost:8080/health-check
- Basic ping: http://localhost:8080/ping
- Version: http://localhost:8080/version
The backend server can be containerized using Docker with the provided utility scripts:
```shell
# Build the Docker image
./scripts/docker-rebuild.sh

# Run the container
./scripts/docker-run.sh

# Stop the container
./scripts/docker-stop.sh
```

Advantages of the scripts over raw Docker commands:
- Automatic environment variable resolution: variables are resolved from your environment or `.env` files
- Automatic port mapping: exposed ports from the Dockerfile are automatically mapped
- Container name resolution: container names are automatically extracted from the Dockerfile
- Custom environment files: use the `--dot-env` option to specify a custom environment file (e.g., `--dot-env .env.docker`)

```shell
# Run with a custom environment file
./scripts/docker-run.sh --dot-env .env.docker

# Show the command without executing it
./scripts/docker-run.sh --no-exec

# Pass additional arguments to docker run
./scripts/docker-run.sh -- -v /local/path:/container/path --restart always
```

Important: When running in Docker, use `host.docker.internal` as the database host in your `.env.docker` file to connect to services running on your host machine.
Once running, you can access:
- API: http://localhost:8080
- Swagger UI: http://localhost:8080/docs
- ReDoc: http://localhost:8080/redoc
- Health Check: http://localhost:8080/health-check
Dockerfile highlights:
- Exposes port 8080 for the FastAPI application
- Sets environment variables: `API_SERVER_HOST=0.0.0.0`, `API_SERVER_LOG_LEVEL=INFO`
- Required environment variables that must be set: `API_SERVER_DATABASE_URL` (can come from the `.env` file)
- Runs the application with: `api-server --host=0.0.0.0 --port=8080 --no-reload`
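The highlights above correspond to a Dockerfile along these lines (a sketch for orientation, not the actual file):

```dockerfile
FROM python:3.13-slim

WORKDIR /app

# Install uv, then the project itself into the system interpreter
RUN pip install uv
COPY pyproject.toml ./
COPY src/ ./src/
RUN uv pip install --system .

# Defaults baked into the image; override at `docker run` time as needed
ENV API_SERVER_HOST=0.0.0.0
ENV API_SERVER_LOG_LEVEL=INFO

EXPOSE 8080

# API_SERVER_DATABASE_URL must be provided at runtime (e.g. via --env-file)
CMD ["api-server", "--host=0.0.0.0", "--port=8080", "--no-reload"]
```

Auto-reload is disabled in the container because the image is immutable; reload only makes sense against a live source tree.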
This template is designed to be easily adapted for your own projects. Here's how to customize it for your needs:
Let's say you want to rename the project to `my_server`:

1. Rename the source directory:

   ```shell
   # Rename the main package directory
   mv src/api_server src/my_server
   ```

2. Replace all occurrences of `api_server` with `my_server`:
   - Important: this is case-sensitive!
   - Use your IDE's "Replace in Files" feature (in VS Code: Ctrl+Shift+H or Cmd+Shift+H)
   - Search for `api_server` and replace it with `my_server` across all files
   - Make sure to update:
     - Python imports
     - Environment variable prefixes
     - Package names in pyproject.toml
     - References in Taskfile.yml
     - Database connection strings
     - Docker configuration

3. Update the configuration prefix (optional): if you want a different env prefix (e.g. `MY_SERVER_`), edit `env_prefix` in `Settings.model_config` inside `config.py`.

4. Test that everything still works:

   ```shell
   # Run format, check, and tests to verify the project is still functional
   task fct

   # Test database migrations
   task db:setup
   ```

5. Replace the example code with your own implementations:
   - Replace the example models in `src/my_server/models/`
   - Update the services in `src/my_server/services/`
   - Modify the GraphQL schema in `src/my_server/graphql/`
   - Add your API endpoints to `src/my_server/api/`
   - Update database migrations as needed
- After renaming, update `.env` to use the new prefix if changed
- Adjust `Settings` to include project-specific fields early (keeps config discoverable)
- Update the console script name in pyproject.toml if you want to change the CLI command
- Regenerate database migrations if you change the models
- Add secrets as `SecretStr` fields to automatically prevent accidental plain-text printing
- Update or add tests that call `get_settings()` to validate new required fields
This project is licensed under the MIT License - see the LICENSE file for details.