Stacks#

What Is It?#

A stack is a pre-built bundle of apps that work well together. Instead of adding apps one by one, you can install an entire group with a single command.

Think of it like a meal kit: instead of buying each ingredient separately, you get a box with everything you need for a complete recipe. A stack gives you a curated set of apps that form a complete solution — media streaming, home automation, monitoring, and so on.

Why Do They Exist?#

Many apps are designed to work as a team. For example, a media setup isn’t just a video player — it needs apps to find content, download it, organize it, fetch subtitles, and serve it to your TV. Setting that up means adding 7+ apps individually, which is tedious and error-prone.

Stacks solve this by letting you say “I want the media experience” and getting all the right pieces in one go:

psw -C ~/my-project app add @media --target media

That single command adds all the apps you need for a working media server.

Built-In Stacks#

PSW ships with ten stacks:

@media#

The essentials for streaming movies, TV shows, and music at home.

| App | What It Does |
| --- | --- |
| Jellyfin | Streams your media library to any device (like a personal Netflix) |
| Prowlarr | Manages indexers (where to search for content) |
| Sabnzbd | Downloads content from Usenet |
| Sonarr | Automatically finds and organizes TV shows |
| Radarr | Automatically finds and organizes movies |
| Bazarr | Fetches subtitles for your media |
| Lidarr | Automatically finds and organizes music |

@media-full#

Everything in @media, plus extras for power users.

| Additional App | What It Does |
| --- | --- |
| qBittorrent | Downloads content via BitTorrent |
| Audiobookshelf | Manages and streams audiobooks and podcasts |
| FlareSolverr | Bypasses browser challenges for indexers |
| Seerr | Lets users request movies and TV shows |
| Tdarr | Automatically transcodes media to save space |
| Recyclarr | Syncs quality profiles to Sonarr and Radarr |

@download-pipeline#

Just the downloading and organizing apps — no player, no extras.

| App | What It Does |
| --- | --- |
| Prowlarr | Manages indexers |
| Sabnzbd | Downloads from Usenet |
| Sonarr | TV show management |
| Radarr | Movie management |

@observability#

A complete monitoring and alerting setup to keep an eye on your self-hosted solution’s health.

| App | What It Does |
| --- | --- |
| Prometheus | Collects metrics from all your apps and servers |
| Loki | Collects and stores logs from all your apps |
| Grafana | Dashboards and visualizations for metrics and logs |
| Alertmanager | Sends alerts when something goes wrong |
| Alloy | Ships logs from every target to Loki |
| Node Exporter | Exposes system metrics (CPU, RAM, disk) from every target |
| Blackbox Exporter | Probes endpoints to check if apps are reachable |
| ntfy | Delivers notifications to your phone or desktop |
| Uptime Kuma | Monitors uptime and response times with a friendly dashboard |

@home-automation#

Everything you need to make your home smart.

| App | What It Does |
| --- | --- |
| Mosquitto | MQTT message broker (how smart devices talk to each other) |
| Zigbee2MQTT | Bridges Zigbee devices (sensors, lights, switches) to your network |
| Z-Wave JS UI | Bridges Z-Wave devices to your network |
| Home Assistant | The brain — automates and controls all your smart devices |

@ai#

Local LLM chat with no cloud — your prompts and responses stay in the house. Requires an AI-class GPU with at least 8 GB of VRAM on one of your nodes — NVIDIA or AMD ROCm; see ai.md for the full story (GPU plumbing, model storage, recommended models, privacy invariants).

| App | What It Does |
| --- | --- |
| Ollama | The inference engine — holds the AI model weights and answers requests. LAN-only API, no web page. Sweet spot for single-user chat with on-the-fly model swapping |
| Open WebUI | The chat front door at chat.<your-domain> plus the OpenAI-compatible relay other PSW apps (@voice, @office, @photos, @cctv) call into |
| vLLM | High-throughput second LLM engine — wires automatically into Open WebUI’s multi-backend list. Continuous batching + PagedAttention give it 3–5× the concurrent throughput Ollama offers, which matters when Paperless-ngx tagging reprocesses a backlog or agentic workflows fan out 5–10 sub-calls per task. LAN-only |

@office#

Self-hosted document inbox that replaces Evernote / Dropbox-with-tags / paid OCR services. Drop scans into the consume folder (drag-and-drop in the web UI, the mobile app, or any tool that writes into the bind-mounted folder) and Paperless OCRs them with Tesseract, indexes the text, and lets you tag / classify / route documents with workflows. The companion Tika and Gotenberg sidecars handle Office documents (.docx / .xlsx / .pptx / .msg) and email files — Paperless calls Tika to extract the text, then hands off to Gotenberg to render a searchable PDF. Documents live on the DATA-class ZFS dataset (durable, snapshotted, included in Backrest).

Two AI bridges ship as part of the stack and both route through Open WebUI’s /v1 relay with per-consumer service-user keys — the LLM enriches every document with titles, tags, correspondents, types, and custom-field values; a vision-capable LLM also re-runs OCR on documents tagged for it (dramatically better than Tesseract on noisy scans, multi-column layouts, diagrams, and watermarked pages). Paperless plus Tika plus Gotenberg works perfectly well without the bridges — pull either out of your project with psw app remove paperless-ai / psw app remove paperless-gpt if you’d rather not run AI on documents.

| App | What It Does |
| --- | --- |
| Paperless-ngx | The document inbox at paperless.<your-domain>. OCR, full-text search, tag / correspondent / document-type classification, workflows. Mobile apps for iOS / Android upload directly |
| Apache Tika | Stateless sidecar that extracts text + metadata from Office documents and email files Paperless can’t parse natively. LAN-only |
| Gotenberg | Stateless sidecar that converts HTML, Office docs, and images into searchable PDFs via embedded Chromium + LibreOffice. LAN-only |
| paperless-ai | LLM-tagging bridge at paperless-ai.<your-domain>. Routes through Open WebUI’s /v1 so prompts and responses stay on the box. Also exposes a RAG chat for asking questions about your indexed documents |
| paperless-gpt | Vision-LLM OCR bridge at paperless-gpt.<your-domain>. PSW provisions a Paperless workflow that auto-applies the paperless-gpt-ocr-auto tag on every upload, so documents flow through qwen2.5vl:7b (or any vision-capable model loaded on Ollama) without manual tagging — same /v1 relay |

@photos#

Self-hosted photo and video library that replaces Google Photos / iCloud Photos. The mobile app uploads everything from your phone in the background; the web UI gives you a timeline, albums, shared links, location maps, face tagging, and natural-language semantic search (“photos of the dog at the beach”) — all running on your hardware. Embeddings and face vectors live in shared core PostgreSQL via the VectorChord + pgvector extensions; the AI sidecar runs CLIP and ArcFace locally. Nothing leaves the box. See ai.md for the privacy invariants.

| App | What It Does |
| --- | --- |
| Immich | The photo & video library at photos.<your-domain>. Mobile apps for iOS / Android sync from your phone in the background |
| Immich Machine Learning | The AI sidecar that runs CLIP for “photos of X” semantic search and ArcFace for face recognition. CPU-only by default; an AI-class GPU accelerates everything 10–100× |

@cctv#

Self-hosted NVR with neural-network object detection that replaces Ring / Nest / Wyze. Cameras stream RTSP into Frigate; Frigate runs an ONNX detector on your AI-class GPU (NVIDIA or AMD ROCm) and publishes detection events over MQTT to Home Assistant for automations — turn the porch light on when motion is seen, send a phone notification on package detection, etc. All clips and detection results stay on the box; no cloud-camera vendor account, no Frigate+ subscription required, no analytics phoning home.
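The detection events flowing over MQTT are what drive those automations. As an illustrative sketch only — the entity IDs below are assumptions; Frigate’s MQTT auto-discovery creates occupancy sensors like these in Home Assistant, and you should check the actual names your setup registers:

```yaml
# Hypothetical Home Assistant automation: porch light on person detection.
# binary_sensor.porch_person_occupancy and light.porch are assumed entity
# names, not ones provisioned by PSW.
automation:
  - alias: "Porch light on person detection"
    trigger:
      - platform: state
        entity_id: binary_sensor.porch_person_occupancy
        to: "on"
    action:
      - service: light.turn_on
        target:
          entity_id: light.porch
```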

Frigate is the one PSW exception to “use shared Postgres” — its event index lives in a local SQLite file because Frigate’s continuous-write rate is incompatible with a shared DB. Recording clips land on the MEDIA-class ZFS dataset; Backrest backs up the SQLite DB + config but excludes the recordings (huge, replaceable). Authelia gates the web UI through Traefik; Frigate’s own auth is off so the request flow is the same single-sign-on dance every other PSW app uses.

| App | What It Does |
| --- | --- |
| Frigate | The NVR at cctv.<your-domain>. Runs object detection, manages recording retention, exposes RTSP / WebRTC streams (LAN-only) for cameras and Home Assistant to consume |
| Mosquitto | The MQTT broker (already in @home-automation). Frigate publishes detection events to frigate/<camera>/<event> topics; HA subscribes via auto-discovery |

Coral USB is supported by the Frigate image but PSW’s planner doesn’t yet model “Coral OR GPU” as a placement constraint — Coral users override the detector block in services/frigate/defaults.yml after deploy. The schema extension is a Phase 4 follow-up.
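For reference, such an override might look like the following. The `detectors` block uses standard Frigate configuration keys; treating services/frigate/defaults.yml as a file PSW merges into Frigate’s config is an assumption based on the path above:

```yaml
# Coral USB detector override (standard Frigate config keys).
# This replaces the GPU/ONNX detector the planner would otherwise set up.
detectors:
  coral:
    type: edgetpu
    device: usb
```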

@voice#

Fully local voice control. “Hey Casa, turn off the kitchen light” — the audio, the transcription, the LLM reasoning, the synthesized reply, all of it stays inside the box. Builds on @ai (it reuses Ollama + Open WebUI as the brain) plus dedicated speech-to-text and text-to-speech services. See ai.md § Voice for the wiring details.

| App | What It Does |
| --- | --- |
| Whisper | Speech-to-text. Turns audio from your speakers into text Home Assistant understands |
| Piper | Text-to-speech. Turns Home Assistant’s responses back into spoken audio |
| Home Assistant | The pipeline conductor — voice in, action out (turn lights on, query the calendar, ask the LLM) |
| Mosquitto | MQTT broker that smart-home devices talk through |
| Ollama, Open WebUI | The conversation brain (reused from @ai). The OpenAI-compat relay at chat.<domain>/api is what HA’s conversation agent talks to |

How to Use Them#

Adding a Stack#

Use the @ prefix with psw app add:

# Install the full media stack on the "media" target
psw -C ~/my-project app add @media --target media

# Install observability on the "monitoring" target
psw -C ~/my-project app add @observability --target monitoring

Mixing Stacks and Individual Apps#

You can combine stacks with individual app names in a single command:

# Add the media stack plus Vaultwarden
psw -C ~/my-project app add @media vaultwarden --target media

Listing Available Stacks#

psw -C ~/my-project app list --stacks

This shows all available stacks, which apps they contain, and how many apps are in each.

How They Work Behind the Scenes#

A stack is not a special deployment unit — it’s just a shortcut. When you run psw app add @media --target media, PSW:

  1. Looks up the stack definition in psw-apps/stacks/media.yml
  2. Expands @media into its list of individual app names
  3. Adds each app independently, as if you’d typed them all out by hand

After expansion, each app is its own service with its own service.yml file, its own secrets, and its own lifecycle. There’s no “media stack” running on your server — just individual apps that happen to work well together.
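The expansion step amounts to very little code. As a sketch of the logic described above (the function and variable names are invented for illustration; in PSW the stack lists would come from psw-apps/stacks/<name>.yml rather than an in-memory dict):

```python
def expand_args(names, stacks):
    """Expand '@stack' arguments into a flat list of app names.

    `stacks` maps a stack name to its app list. Plain app names pass
    through unchanged, so stacks and individual apps mix freely.
    """
    apps = []
    for name in names:
        if name.startswith("@"):
            apps.extend(stacks[name[1:]])  # step 2: expand the stack
        else:
            apps.append(name)              # individual app, kept as-is
    return apps


# The @media stack from the table above, plus one individual app:
stacks = {
    "media": ["jellyfin", "prowlarr", "sabnzbd", "sonarr",
              "radarr", "bazarr", "lidarr"],
}
print(expand_args(["@media", "vaultwarden"], stacks))
```

After this point each name is handled independently, which is why removing one app later never disturbs the rest of the stack.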

This means you can:

  • Remove individual apps from a stack without affecting the others
  • Add apps to a target that already has a stack deployed
  • Mix apps from different stacks on the same target

Dependencies Between Apps#

Some apps in a stack depend on each other. For example, Sonarr needs Prowlarr for indexers. These dependencies are declared in each app’s metadata (not in the stack definition), and PSW handles them automatically through wiring during convergence.

The stack just makes sure all the right apps are present — the convention system takes care of connecting them together.
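The document doesn’t show the metadata schema, so purely as an illustration, a dependency declaration of that kind might look something like this (every field name here is an assumption):

```yaml
# Hypothetical app metadata for Sonarr. The real schema lives in each
# app's PSW metadata; only the idea (app declares its own wiring,
# stack file stays a plain list) is taken from the text above.
name: sonarr
wires:
  - prowlarr   # indexer source, connected automatically during convergence
```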

Key Ideas#

  • Convenience, not magic — stacks expand into individual apps, nothing more
  • Apps stay independent — each app has its own config, secrets, and lifecycle after installation
  • Curated bundles — stacks are opinionated selections of apps that solve a complete use case
  • Composable — mix stacks with individual apps, or use parts of a stack
  • Dependencies handled automatically — app-to-app connections are managed by conventions, not by the stack itself