Proxmox VE

Proxmox VE 9.1.1 - Intel i9-13900H (20 threads) - 128 GB RAM - Kernel 6.17.2-1-pve

Hardware & Drives
Drive Size Type ZFS Pool Purpose
nvme0n1 1.8 TB NVMe - Boot (PVE root + LVM-thin)
nvme1n1 3.6 TB NVMe nvmepool (stripe) VMs, containers, sync, music, movies, books, photos, video
nvme2n1 3.6 TB NVMe nvmepool (stripe) -
nvme3n1 3.6 TB NVMe nvmepool (stripe) -
sda 465.8 GB NVMe (USB) backups Vzdump backups, ISOs (Crucial P5 500GB in Sabrent USB enclosure, installed Apr 19 2026)
- 2× 18.2 TB HDD (TB3) Biggest mirror-0 Archive/backup mirror (ORICO 9858T3 Thunderbolt 3 enclosure)
- 3× 4 TB HDD (TB3) - 3 free bays in ORICO 9858T3 Thunderbolt 3 enclosure (Birch pool retired Apr 2026)

Retired (Apr 2026): BIGGIE (Seagate 5TB USB), Big (932GB SSD), Birch (3×4TB RAIDZ1; pool destroyed, seedbox sync moved to nvmepool/ingest). Nextcloud removed.

ZFS Pools & Datasets
Pool Size Used Health Key Datasets
nvmepool 10.9 TB ~6.4 TB (59%) ONLINE sync, music, movies, books, photos, video, audiobookshelf, bookshelf, tv, ingest, container-data, vms
Biggest 18.2 TB ~16.2 TB (89%) ONLINE Maple (Amigo, Monte, Ichabod archive data), nvmepool-backup (nightly rsync of nvmepool), Kiwix
Birch - - - RETIRED Apr 2026 (pool destroyed). Seedbox sync moved to nvmepool/ingest. 3 free drive bays available in ORICO enclosure.
backups 464 GB - ONLINE dump, isos. Crucial P5 500GB (CT500P5SSD8, serial 21022FE3A911) in Sabrent USB enclosure (Realtek bridge 0bda:9210). Replaced failed Samsung 980 1TB on Apr 19 2026 (the original Samsung lasted 6 days).
offsite 18.2 TB ~10.8 TB (59%) ONLINE maple (Biggest/Maple mirror), nvmepool-data (nvmepool backup copy), ct100-backups, seedbox

Dataset breakdown (nvmepool):

Dataset Used Mount Purpose
nvmepool/sync 1.87 TB /nvmepool/sync Mac Studio SYNC mirror
nvmepool/music 2.35 TB /nvmepool/music Music library (Navidrome + Plex)
nvmepool/movies 1.83 TB /nvmepool/movies Movie library (Plex)
nvmepool/audiobookshelf 24.7 GB /nvmepool/audiobookshelf Audiobook library
nvmepool/bookshelf 6.24 GB /nvmepool/bookshelf Readarr app data
nvmepool/books 33.2 GB /nvmepool/books Calibre-Web library
nvmepool/photos 1.40 TB /nvmepool/photos Photo library (Plex + Immich external library)
nvmepool/video 27.9 GB /nvmepool/video Video library (Plex)
nvmepool/tv 187 GB /nvmepool/tv TV library (Plex + Sonarr)
nvmepool/ingest varies /nvmepool/ingest Seedbox download landing zone (replaces retired Birch pool)
nvmepool/container-data 38.0 GB /nvmepool/container-data Large container configs (Lidarr, Plex, CWA, Sonarr, Immich DB + uploads); moved off CT100 rootfs Apr 2026
nvmepool/vms 95.4 GB /nvmepool/vms VM/CT disk images

Dataset breakdown (Biggest):

Dataset Used Contents
Biggest/Maple 10.1 TB Amigo (Cell Photos, ISO, TV, Video), Ichabod (Movies, Music, Databases, Podcasts), Monte (Dropbox, Mystuff, PDF, Photos)
Biggest/nvmepool-backup 5.81 TB Nightly rsync mirror of all nvmepool datasets
Biggest/Kiwix 99 GB Offline reference content (Wikipedia, Stack Exchange, Gutenberg), zstd compressed
Biggest/media-staging empty General staging area on mirrored drives

Speedy, TimeMachineOne, Ichabod/Sort, Amigo/delgross, Amigo/Youtube, and Possible Delete were all deleted (Apr 7 and Apr 16, 2026). The special vdev (Optane 110GB) and cache SSD (465GB) were removed from the pool.

Dataset breakdown (offsite):

Dataset Used Contents
offsite/maple 6.23 TB Mirror of Biggest/Maple (irreplaceable archive data)
offsite/nvmepool-data 4.55 TB Backup copy of nvmepool media
offsite/ct100-backups empty CT100 vzdump backup destination
offsite/seedbox empty Seedbox data backup destination

The offsite pool is a single 18.2 TB drive that travels intermittently to the farm for geographic redundancy. A manual sync is run before each departure.

Containers & VMs

CT 100 - docker-host (primary media/apps container)

Setting Value
OS Debian 12 (LXC)
Cores 4
RAM 16 GB
Swap 4 GB
Root disk 48 GB on nvme-data (expanded from 32 GB Apr 2026)
IP 192.168.8.100
Features Nesting, keyctl, privileged (unprivileged: 0); required for stable Docker networking
Autostart Yes

Bind mounts into CT 100:

Host path Container mount Purpose
/nvmepool/ingest /mnt/seedbox Seedbox downloads landing (Music + Books)
/nvmepool/books /mnt/books Calibre-Web library
/nvmepool/music /mnt/music Music library
/nvmepool/audiobookshelf /mnt/audiobookshelf Audiobookshelf data
/nvmepool/bookshelf /mnt/bookshelf Readarr app data
/nvmepool/movies /mnt/movies Movie library
/nvmepool/photos /mnt/photos Photo library
/nvmepool/video /mnt/video Video library
/nvmepool/tv /mnt/tv TV library
/nvmepool/container-data /mnt/container-data Large container configs (Lidarr, Plex, CWA)
/Biggest/Kiwix /mnt/kiwix Kiwix ZIM file storage (offline Wikipedia, etc.)

CT 101 - immich (dedicated Immich photo management host, created Apr 18 2026)

Setting Value
OS Debian 12 (LXC)
Cores 8 (bumped from 4 for faster initial ML scan)
RAM 8 GB
Swap 2 GB
Root disk 32 GB on nvme-data
IP 192.168.8.103 (originally .101, changed Apr 18 due to IP conflict with office-2.lan)
Features Nesting, keyctl
Autostart Yes
MAC BC:24:11:D5:67:E8

Bind mounts into CT 101:

Host path Container mount Purpose
/nvmepool/photos /mnt/photos Immich external library (read-only, 1.4 TB)
/nvmepool/container-data/immich /mnt/immich-data Immich uploads, postgres DB, thumbs, model cache
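
These mounts are defined on the Proxmox host with `pct`; a hedged sketch (the mp0/mp1 indices are assumptions, not necessarily the live config):

```shell
# Sketch: how bind mounts like the above are attached to CT 101.
# mp0/mp1 indices are illustrative; check the real ones with `pct config 101`.
pct set 101 -mp0 /nvmepool/photos,mp=/mnt/photos,ro=1               # external library, read-only
pct set 101 -mp1 /nvmepool/container-data/immich,mp=/mnt/immich-data
```

`ro=1` is what keeps the external photo library read-only at the container boundary, independent of any setting inside Immich.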

Docker-specific notes: IPv6 disabled in /etc/docker/daemon.json (required: ghcr.io was causing connection resets because CT101 has no IPv6 default route). DNS set to 8.8.8.8 + 1.1.1.1.
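
A minimal daemon.json consistent with that note (a sketch; verify against the live file on CT 101):

```json
{
  "ipv6": false,
  "dns": ["8.8.8.8", "1.1.1.1"]
}
```

After editing, `systemctl restart docker` inside the container applies the change.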

Docker Services (inside CT 100)
Service Image Port URL Status
Plex linuxserver/plex 32400 http://192.168.8.100:32400/web Up
Calibre-Web (CWA) calibre-web-automated 8083 http://192.168.8.100:8083 Up
Portainer portainer-ce:lts 9443 https://192.168.8.100:9443 Up
Uptime Kuma uptime-kuma:1 3001 http://192.168.8.100:3001 Up
Gotify gotify/server 8070 http://192.168.8.100:8070 Up
Gotify-Telegram Bridge custom (Python) - - Up
N8N n8n:latest 5678 http://192.168.8.100:5678 Up
Audiobookshelf audiobookshelf:latest 13378 http://192.168.8.100:13378 Up
Navidrome navidrome:latest 4533 http://192.168.8.100:4533 Up
Lidarr lidarr:nightly 8686 http://192.168.8.100:8686 Up
Bookshelf bookshelf:hardcover 8787 http://192.168.8.100:8787 Up
Shelfmark shelfmark 8084 http://192.168.8.100:8084 Up
Radarr linuxserver/radarr 7878 http://192.168.8.100:7878 Up
Sonarr linuxserver/sonarr 8989 http://192.168.8.100:8989 Up
Prowlarr prowlarr 9696 http://192.168.8.100:9696 Up
FreshRSS freshrss 8180 http://192.168.8.100:8180 Up
Kiwix ghcr.io/kiwix/kiwix-serve 8380 http://192.168.8.100:8380 Up
Wallabag wallabag/wallabag 8480 http://192.168.8.100:8480 Up
Wallabag DB mariadb:11 - internal Up
Wallabag Redis redis:7-alpine - internal Up
ConvertX ghcr.io/c4illin/convertx 3100 http://192.168.8.100:3100 Up
Aurral ghcr.io/lklynet/aurral 3002 http://192.168.8.100:3002 Up
Recyclarr ghcr.io/recyclarr/recyclarr - headless Up
Dozzle amir20/dozzle 9999 http://192.168.8.100:9999 Up
Homepage gethomepage.dev 3000 http://192.168.8.100:3000 Up
FlareSolverr flaresolverr 8191 http://192.168.8.100:8191 Up
Watchtower containrrr/watchtower - headless Up
Prometheus prom/prometheus 9090 http://192.168.8.100:9090 Up
Grafana grafana/grafana 3200 http://192.168.8.100:3200 Up
node-exporter prom/node-exporter - internal Up
cAdvisor gcr.io/cadvisor - internal Up
weather-exporter custom - internal Up

Docker Services (inside CT 101)
Service Image Port URL Status
Immich Server ghcr.io/immich-app/immich-server:release 2283 http://192.168.8.103:2283 Up
Immich ML ghcr.io/immich-app/immich-machine-learning:release - internal Up
Immich Postgres ghcr.io/immich-app/postgres:14-vectorchord0.4.3-pgvectors0.2.0 - internal Up
Immich Redis redis:6.2-alpine - internal Up

Immich is a self-hosted photo and video management platform (Google Photos alternative). Deployed as a 4-container stack on CT 101 via Docker Compose at /opt/immich/. External library points at /nvmepool/photos (1.4 TB, ~134K files) in read-only mode so originals are never modified. Immich’s own data (uploads, thumbnails, transcoded video, Postgres DB, ML model cache) lives in /nvmepool/container-data/immich/. Admin account created on first web access. DB password stored in /opt/immich/.env. Image tag locked to :release.

Service Notes (inside CT 100)

Plex serves movies, music, photos, video, and audiobooks from nvmepool. Plexamp (iOS/Mac client) connects to it for music. Uses network_mode: host.

Radarr manages the movie library at /mnt/movies (nvmepool/movies). Searches via Prowlarr indexers, downloads via seedbox, auto-renames and organizes movies for Plex. API key: b117993eb50f465ea485654bc0118861. Compose at /opt/radarr/docker-compose.yml.

Filebot (v5.2.1) is installed as a system package on CT100 (/bin/filebot) for ad-hoc movie/media renaming. Not containerized.

Calibre-Web Automated (CWA) serves the book library from /mnt/books (nvmepool/books). Auto-ingests books dropped into /mnt/books/ingest, auto-converts 28 formats to epub, fetches metadata, detects duplicates. Calibre bundled. Default login: admin / admin123. Image: crocodilestick/calibre-web-automated:latest.

Kiwix serves offline reference content (Wikipedia, Stack Exchange, Project Gutenberg, etc.) from /mnt/kiwix (Biggest/Kiwix, zstd compressed, 5.6TB available). ZIM files are downloaded manually from library.kiwix.org. A cron-based watcher (/usr/local/bin/kiwix-watcher.sh, every 5 min) detects new/changed ZIMs via an MD5 hash of the file list and restarts the container to pick them up. Compose at /opt/kiwix/docker-compose.yml. Starter ZIM: wikipedia_en_simple_all_nopic_2026-02.zim (922 MB).
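
The watcher logic described above reduces to something like this sketch (details of the real kiwix-watcher.sh are assumed):

```shell
# Sketch of the watcher logic in /usr/local/bin/kiwix-watcher.sh (details
# assumed): fingerprint the ZIM file list, restart kiwix-serve on change.
check_zims() {
    local zim_dir=$1 state=$2 restart_cmd=$3
    local current previous
    # Hash names + sizes so adds, removals, and replacements all register.
    current=$(find "$zim_dir" -maxdepth 1 -name '*.zim' -printf '%f %s\n' 2>/dev/null | sort | md5sum)
    previous=$(cat "$state" 2>/dev/null || true)
    if [ "$current" != "$previous" ]; then
        echo "$current" > "$state"
        $restart_cmd
    fi
}

# Cron (every 5 min) would call something like:
#   check_zims /Biggest/Kiwix /var/tmp/kiwix-zims.md5 "docker restart kiwix"
```

Hashing the file list (rather than file contents) keeps each cron run cheap even with multi-GB ZIMs.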

Wallabag is a self-hosted read-it-later service (alternative to Pocket/Instapaper). Stack: Wallabag app + MariaDB 11 (wallabag-db) + Redis 7 (wallabag-redis), all on dedicated wallabag-net bridge network. Compose at /opt/wallabag/docker-compose.yml. Secrets (DB password, Symfony secret) saved in /opt/wallabag/credentials.txt (root-only, chmod 600). Data persisted in named Docker volumes (wallabag-db, wallabag-redis, wallabag-images). Default admin account needs to be created on first visit. Browser extensions for Firefox/Chrome and mobile apps (iOS/Android) support direct capture.

ConvertX is a self-hosted file converter supporting 1000+ formats via FFmpeg, Pandoc, LibreOffice, GraphicsMagick, Inkscape, and more. Compose at /opt/convertx/docker-compose.yml. Data persisted in named volume convertx-data. Account registration disabled after first account creation (ACCOUNT_REGISTRATION=false). Converted files auto-delete after 24 hours (AUTO_DELETE_EVERY_N_HOURS=24). HTTP_ALLOWED=true set for local HTTP access.

SMB Shares
Share Path Access Purpose
Review /Biggest/Maple read/write, user: bee Archive data on mirrored drives (Amigo, Ichabod, Monte)
Sync /nvmepool/sync read-only, user: bee Mac Studio SYNC mirror
Music /nvmepool/music read/write, user: bee Music library (33,654 tracks)
Books /nvmepool/books read/write, user: bee Book library
Movies /nvmepool/movies read/write, user: bee Movie library
Video /nvmepool/video read/write, user: bee Video library
Seedbox - - -
Media Staging /Biggest/media-staging read/write, user: bee Staging area on mirrored drives
backups /backuppool read-only, user: bee Proxmox dumps/ISOs
nvmepool-backup /Biggest/nvmepool-backup read-only, user: bee Nightly nvmepool backup

All shares configured in /etc/samba/smb.conf (no registry shares). valid users = bee, ownership standardized to bee:bee across all datasets. vfs objects = fruit streams_xattr for macOS (Apple) compatibility.
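
A representative stanza matching those conventions (illustrative; not copied from the live smb.conf):

```
[Music]
   path = /nvmepool/music
   valid users = bee
   read only = no
   vfs objects = fruit streams_xattr
```

`testparm -s` validates the file after edits; `smbcontrol all reload-config` applies them without restarting smbd.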

Mac Finder access: smb://192.168.8.221/<share_name> or via Network → PVE (Avahi/mDNS advertised).

Seedbox Pipeline

The seedbox is a remote Usenet server at ismene.usbx.me (IP 46.232.210.50). NZBGet runs on the seedbox and downloads to categorized folders. Two SSH tunnels on Proxmox expose the seedbox UIs locally, and cron scripts pull completed files down.

Data flow:

  1. Lidarr/Radarr request albums/movies → send to NZBGet on seedbox
  2. NZBGet downloads and sorts into completed/Music/, completed/Books/, etc.
  3. seedbox-sync.sh (every 15 min) pulls Music, Books, and general completed downloads to /nvmepool/ingest/
  4. Lidarr/Beets process and move finished files to nvmepool/music
  5. Plex/Navidrome serve from nvmepool
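
Step 3 above is plausibly a lock-guarded rsync pull; a sketch (remote layout, lock handling, and categories are assumptions):

```shell
# Sketch of seedbox-sync.sh (remote layout, lock handling, and categories
# are assumptions): a flock-guarded rsync pull into the ingest dataset.
sync_category() {  # args: source dir, destination dir
    mkdir -p "$2"
    # --partial lets an interrupted transfer resume on the next cron run
    rsync -a --partial "$1" "$2"
}

seedbox_sync() {  # args: remote prefix (user@host:/path), local ingest dir
    exec 9>"$2/.sync.lock"
    # Skip this run if the previous 15-minute run is still transferring.
    flock -n 9 || return 0
    sync_category "$1/Music/" "$2/Music/"
    sync_category "$1/Books/" "$2/Books/"
}

# A real invocation (user and remote path are hypothetical):
#   seedbox_sync "bee@ismene.usbx.me:/home/bee/nzbget/completed" /nvmepool/ingest
```

The flock guard matters because a 15-minute cron interval can easily be shorter than a large transfer.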

Sync & Backup Scripts

Mac Studio Sync:

Script Schedule Source Destination Notes
sync-mac.sh DISABLED (Apr 13, 2026) bee@192.168.8.180:/Users/bee/SYNC/ /nvmepool/sync/ Was failing with rsync protocol error (exit 12). Syncthing may cover this path.

Backups:

Job Schedule Scope Compression Retention Storage
vzdump-daily 2:00 AM All VMs/CTs zstd 3 copies backup-hdd (/backups/dump/dump/)
Docker prune Sundays 4:00 AM CT100 - - Cleans dangling containers, networks, images
Radarr start Midnight CT100 - - Starts Radarr for nightly indexer hits
Radarr stop 5:00 AM CT100 - - Stops Radarr to limit downloads to off-hours
CWA processed cleanup 5:00 AM CT100 - - Clears calibre-web/processed_books
Kiwix ZIM watcher Every 5 min CT100 - - Restarts kiwix-serve when ZIM file list changes (MD5 hash check)

Offsite Backup:

A 20TB Seagate Exos (ST20000NM002C, serial ZXA0FLHC) in an ASMT105x USB 3.2 enclosure serves as the offsite backup drive. It is formatted as ZFS pool offsite with zstd compression, atime=off, xattr=sa, ashift=12, and negotiates USB 3.2 Gen 2 (10 Gbps SuperSpeed Plus) on Bus 6 Port 1. It is critical to plug it into the correct USB-A port: the other USB-A ports on the Minisforum Venus are USB 2.0 and will bottleneck transfers to ~42 MB/s. On the USB 3 port, rsync sustains ~200 MB/s (limited by the spinning disk's sequential write speed).
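
The negotiated link can be read from sysfs (`/sys/bus/usb/devices/6-1/speed` for Bus 6, Port 1); a small helper to interpret the value, with the observed throughputs above as comments:

```shell
# Map the sysfs USB "speed" value (Mbit/s) to the link type; 480 is the
# USB 2.0 trap described above, 10000 is the port the enclosure needs.
usb_speed_label() {
    case $1 in
        480)   echo "USB 2.0 High Speed (bottlenecks rsync to ~42 MB/s)" ;;
        5000)  echo "USB 3.2 Gen 1 (5 Gbps)" ;;
        10000) echo "USB 3.2 Gen 2 (10 Gbps SuperSpeed Plus)" ;;
        *)     echo "unrecognized: $1 Mbit/s" ;;
    esac
}

# On the host: usb_speed_label "$(cat /sys/bus/usb/devices/6-1/speed)"
usb_speed_label 10000
```

`lsusb -t` shows the same per-port speeds in tree form if the sysfs path moves after a replug.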

Dataset Source Contents
offsite/nvmepool-data /Biggest/nvmepool-backup/ Mirror of nvmepool (music, movies, books, sync, etc.)
offsite/maple /Biggest/Maple/ Unique archive data (Amigo, Ichabod, Monte)
offsite/seedbox - Seedbox downloads (placeholder; seedbox now on nvmepool/ingest)
offsite/ct100-backups /backups/dump/ Vzdump CT100 backups

Script: /usr/local/bin/offsite-backup.sh, an rsync wrapper with --delete for incremental updates. Workflow: connect drive → zpool import offsite → offsite-backup.sh → zpool export offsite → disconnect and take offsite.
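
That workflow, condensed into a sketch (dataset mountpoints are assumptions based on the table above; the real offsite-backup.sh may differ):

```shell
# Sketch of the offsite cycle: import, mirror with --delete, export.
offsite_cycle() {
    zpool import offsite || return 1
    # --delete makes each destination an exact mirror of its source.
    rsync -a --delete /Biggest/nvmepool-backup/ /offsite/nvmepool-data/
    rsync -a --delete /Biggest/Maple/           /offsite/maple/
    rsync -a --delete /backups/dump/            /offsite/ct100-backups/
    zpool export offsite    # safe to unplug once this returns
}
```

Exporting before unplugging is what keeps the pool importable cleanly on the next visit; pulling the cable on an imported pool risks a suspended-pool state.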

Health Monitoring (v2, updated Apr 18 2026):

Script: /usr/local/bin/system-health-check.sh runs every 15 min via /etc/cron.d/system-health-check and pushes alerts to Gotify. Checks: root disk space; all 4 active ZFS pools (nvmepool, Biggest, backups, offsite) for health, suspension, capacity, and removed/faulted vdevs; backup age/location; USB hub errors and pool-suspension events; snapshot counts; and key services (pveproxy, pvedaemon, smbd). Daily summary at 7 AM.
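
The pool portion of those checks presumably reduces to parsing `zpool list` and pushing to Gotify; a minimal sketch (the token is a placeholder and the 90% threshold is an assumption; the real script checks more conditions):

```shell
# Sketch of the pool checks in system-health-check.sh (simplified).
# Decide severity from a pool's health string and capacity percentage.
pool_severity() {  # args: health, capacity% -> 8 critical, 5 warning, 0 ok
    if [ "$1" != "ONLINE" ]; then
        echo 8
    elif [ "$2" -ge 90 ]; then   # capacity threshold is an assumption
        echo 5
    else
        echo 0
    fi
}

alert() {  # args: title, message, priority -> push to Gotify
    curl -s -X POST "http://192.168.8.100:8070/message?token=${GOTIFY_TOKEN:-REPLACE_ME}" \
        -F "title=$1" -F "message=$2" -F "priority=$3" >/dev/null || true
}

check_pools() {
    local pool health cap sev
    for pool in nvmepool Biggest backups offsite; do
        health=$(zpool list -H -o health "$pool" 2>/dev/null) || continue  # offsite may be exported
        cap=$(zpool list -H -o capacity "$pool" | tr -d %)
        sev=$(pool_severity "$health" "$cap")
        [ "$sev" -gt 0 ] && alert "ZFS: $pool $health ${cap}%" "see zpool status $pool" "$sev"
    done
    return 0
}
```

Skipping pools that fail `zpool list` (rather than alerting) is what keeps the intermittently-connected offsite pool from paging every 15 minutes while it is away.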

ZFS Maintenance:

Task Schedule Pool
Auto-snapshot Every 15 min (keep 4 frequent, 24 hourly, 31 daily, 8 weekly, 12 monthly) All
Scrub Biggest 1st of month, 3 AM Biggest
Scrub nvmepool 8th of month, 3 AM nvmepool
Scrub backups 22nd of month, 3 AM backups
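
The staggered scrub schedule maps onto /etc/cron.d entries like these (illustrative; the live crontab may differ):

```
# /etc/cron.d/zfs-scrub (sketch) - one pool at a time, 3 AM per the table
0 3 1  * * root /usr/sbin/zpool scrub Biggest
0 3 8  * * root /usr/sbin/zpool scrub nvmepool
0 3 22 * * root /usr/sbin/zpool scrub backups
```

Staggering by weeks keeps only one pool under scrub I/O at a time.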

Security
Service Config
UFW Active โ€” default DROP on INPUT. Allowed: SSH (22), Proxmox (8006), SMB (445, 139), VNC (5900-5999), Spice (3128)
Fail2Ban Active โ€” jails: proxmox, sshd
SSH Key-based auth to seedbox (id_ed25519) and Mac Studio (id_rsa)

Monitoring & Notifications

Uptime Kuma (http://192.168.8.100:3001) runs 61 monitors covering:

Category Monitors Check Interval
Internet connectivity Google, Cloudflare, DNS 8.8.8.8 60s
Network infrastructure Router, CT100 ping 60-120s
CT100 Docker services Plex, Navidrome, CWA, Portainer, Gotify, FreshRSS, N8N, Audiobookshelf, Lidarr, Bookshelf, Shelfmark, Prowlarr, Radarr, Sonarr, Dozzle, FlareSolverr, Homepage, Prometheus, Grafana, Wallabag, Kiwix, ConvertX 120s
CT101 Docker services Immich 60s
Proxmox host Web UI, Cockpit, SMB, Syncthing, NZBGet tunnel, NZBHydra2 tunnel 120-300s
Mac Studio Ping, SSH, Life Archive API, Paperless-NGX, Syncthing, LM Studio, Embed Server, Hugo Bee Hub 120-300s
VPS Ping, Pangolin Dashboard, Bee Hub (VPS), DNS resolution 120-300s
SSL certificates Pangolin, Home Assistant 3600s
Keyword health checks Plex API, Navidrome API, Portainer API 300s
Farm Home Assistant (via Pangolin) 300s
Seedbox SSH 300s

Notification chain: Uptime Kuma → Gotify → Telegram bridge → @beenetworkbot

Gotify-Telegram Bridge (Docker, /opt/gotify-telegram/): Polls Gotify every 10 seconds for new messages and forwards them to Telegram with priority-based emojis (🔴 critical, 🟡 warning, 📢 info). All Gotify sources are forwarded: Uptime Kuma alerts, health-check script alerts, and any other Gotify notifications.
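
The bridge itself is Python; its priority-to-emoji mapping and forwarding step are equivalent to this shell sketch (thresholds, tokens, and chat ID are assumptions, not the live values):

```shell
# Sketch of the bridge's mapping + forwarding (the real bridge is Python).
# Priority thresholds are assumptions based on Gotify conventions.
priority_icon() {
    if [ "$1" -ge 8 ]; then
        echo "🔴"   # critical
    elif [ "$1" -ge 4 ]; then
        echo "🟡"   # warning
    else
        echo "📢"   # info
    fi
}

forward() {  # args: priority, title, body -> relay one message to Telegram
    local icon
    icon=$(priority_icon "$1")
    curl -s "https://api.telegram.org/bot${BOT_TOKEN:-REPLACE_ME}/sendMessage" \
        -d chat_id="${CHAT_ID:-REPLACE_ME}" \
        --data-urlencode "text=$icon $2: $3" >/dev/null || true
}

priority_icon 8   # prints 🔴
```

The 10-second poll loop simply repeats fetch-new-messages / forward / remember-last-ID around these two steps.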

Setting Value
Telegram Bot @beenetworkbot
Telegram Chat ID 5289824155
Gotify App Token ARCkVc0wf001L.e
Gotify Client Token COXHgqAwb_mZdz0

Health Check Script (/usr/local/bin/system-health-check.sh): Runs every 15 min via cron. Monitors root disk space, all 4 ZFS pools (nvmepool, Biggest, backups, offsite: health/suspended/capacity/vdevs), backup age, USB hub errors, snapshot counts, and key services. Daily summary at 7 AM. Alerts via Gotify → Telegram. Updated Apr 18 2026.

Access Reference
Method Command / URL
Web UI https://192.168.8.221:8006
Cockpit https://192.168.8.221:9090
SSH ssh root@192.168.8.221
SMB (Music) smb://192.168.8.221/Music (user: bee)
SMB (Movies) smb://192.168.8.221/Movies (user: bee)
SMB (Books) smb://192.168.8.221/Books (user: bee)
SMB (Seedbox) smb://192.168.8.221/Seedbox (user: bee)
SMB (Review) smb://192.168.8.221/Review (user: bee)
NZBGet UI http://192.168.8.221:16789 (tunneled from seedbox)
NZBHydra2 UI http://192.168.8.221:15076 (tunneled from seedbox)
Plex http://192.168.8.100:32400/web
Calibre-Web http://192.168.8.100:8083