Architecture
How the plugin and the AI service are separated, what talks to what, and why it's designed this way.
Two processes, one contract
┌─────────────────────────────────────┐ ┌───────────────────────────────────┐
│ Jellyfin (dotnet / net9.0) │ │ Docker AI Service (python/fastapi)│
│ │ │ │
│ ┌───────────────────────────────┐ │ HTTP │ ┌──────────────────────────────┐ │
│ │ JellyfinUpscalerPlugin.dll │──┼─────────→┼──│ /Upscaler/ controller-like │ │
│ │ ├─ VideoProcessor │ │ :5000 │ │ /models, /hardware, /metrics │ │
│ │ ├─ VideoAnalyzer (ffprobe) │ │ │ │ /openapi.json, /health │ │
│ │ ├─ VideoFrameProcessor │ │ │ └──────────────────────────────┘ │
│ │ ├─ ProcessingMethodExecutor │ │ │ ┌──────────────────────────────┐ │
│ │ ├─ HttpUpscalerService ─────┼──┘ │ │ ONNX Runtime (CUDA/OV/ROCm) │ │
│ │ └─ 3 ScheduledTasks │ │ │ + TensorRT FP16 path │ │
│ └───────────────────────────────┘ │ └──────────────────────────────┘ │
│ ┌───────────────────────────────┐ │ ┌──────────────────────────────┐ │
│ │ Jellyfin MediaEncoder │ │ │ /app/models/*.onnx volume │ │
│ │ ffmpeg / ffprobe binaries │ │ │ (pulled on demand) │ │
│ └───────────────────────────────┘ │ └──────────────────────────────┘ │
└─────────────────────────────────────┘ └───────────────────────────────────┘
Why two processes?
- Native ABI isolation. ONNX Runtime + CUDA + cuDNN bring in a stack of native libraries that don't play well with the tightly-pinned native set inside Jellyfin's process. By running inference in a sibling container we never touch Jellyfin's memory space.
- Independent upgrade cadence. Bump CUDA / OpenVINO / ONNX independently from Jellyfin's ABI. The plugin targets Jellyfin 10.11.x and speaks HTTP to any service version with a compatible API surface.
- GPU portability. Same plugin DLL works against CUDA, OpenVINO, ROCm, and CPU service images. Pick the image that matches your host.
- Restart safety. Hot-reload models or upgrade the AI service without restarting Jellyfin. Jellyfin will see `/health` as degraded, then recover.
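The degraded-then-recover behaviour above can be sketched as a small classifier over `/health` probe results. This is a minimal sketch, not the plugin's actual code; the response schema (a `status` field with `"ok"`) is an illustrative assumption rather than the documented contract.

```python
from enum import Enum


class ServiceState(Enum):
    HEALTHY = "healthy"
    DEGRADED = "degraded"
    OFFLINE = "offline"


def classify_health(status_code, body) -> ServiceState:
    """Map a /health probe result to a coarse service state.

    status_code is None when the TCP connection itself failed
    (service container stopped or mid-upgrade).
    """
    if status_code is None:
        return ServiceState.OFFLINE
    if status_code == 200 and body and body.get("status") == "ok":
        return ServiceState.HEALTHY
    # Reachable but unhappy: model hot-reload in progress, 503, etc.
    return ServiceState.DEGRADED
```

During a service upgrade the classifier walks OFFLINE → DEGRADED → HEALTHY without Jellyfin ever restarting.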
Jellyfin-side components
VideoProcessor (core orchestrator)
The plugin singleton. Holds references to MediaEncoder (for ffmpeg/ffprobe paths), HttpUpscalerService (HTTP client), and three sub-services described below. Processes jobs via ProcessVideoAsync.
VideoAnalyzer · VideoFrameProcessor · ProcessingMethodExecutor
Three specialised helpers that cache ffmpeg/ffprobe paths. All three shared a cold-start bug (fixed in v1.6.1.16): the captured paths were empty if Jellyfin constructed the plugin before MediaEncoder.EncoderPath/ProbePath finished resolving. The fix: a new EnsureFFmpegReady() in VideoProcessor late-resolves the paths and propagates them to the helpers via new Update*Path() methods.
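The shape of that fix can be sketched as follows. The real code is C#; this Python transliteration is illustrative only, and names like `ensure_ffmpeg_ready`/`update_paths` mirror but do not quote the actual methods.

```python
class FrameProcessor:
    """Stand-in for one of the three helpers that cache ffmpeg/ffprobe paths."""

    def __init__(self, ffmpeg_path: str, ffprobe_path: str):
        self.ffmpeg_path = ffmpeg_path
        self.ffprobe_path = ffprobe_path

    def update_paths(self, ffmpeg_path: str, ffprobe_path: str) -> None:
        self.ffmpeg_path = ffmpeg_path
        self.ffprobe_path = ffprobe_path


class VideoProcessor:
    def __init__(self, media_encoder, helpers):
        self._encoder = media_encoder   # stand-in for Jellyfin's MediaEncoder
        self._helpers = helpers         # analyzer, frame processor, executor

    def ensure_ffmpeg_ready(self) -> bool:
        """Late-resolve ffmpeg/ffprobe and push the paths into every helper.

        Called at every entry point, so helpers constructed before the
        encoder finished resolving still pick up the real paths.
        """
        ffmpeg = self._encoder.encoder_path
        ffprobe = self._encoder.probe_path
        if not ffmpeg or not ffprobe:
            return False  # caller logs a warning instead of failing silently
        for helper in self._helpers:
            helper.update_paths(ffmpeg, ffprobe)
        return True
```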
HttpUpscalerService
Thin wrapper around IHttpClientFactory. Attaches X-Api-Token header when configured. Handles retries, transparent decompression, and per-call timeouts.
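The retry behaviour can be sketched generically. The attempt count and backoff schedule below are assumptions for illustration, not the wrapper's verified policy.

```python
import time


def call_with_retries(fn, attempts: int = 3, base_delay: float = 0.5,
                      sleep=time.sleep):
    """Invoke fn(), retrying on exception with exponential backoff.

    The sleep callable is injectable so the schedule can be tested
    without actually waiting.
    """
    last_err = None
    for i in range(attempts):
        try:
            return fn()
        except Exception as err:  # real code would catch transport errors only
            last_err = err
            if i < attempts - 1:
                sleep(base_delay * (2 ** i))
    raise last_err
```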
ScheduledTasks
- LibraryUpscaleScanTask — nightly scan, fires on Jellyfin's daily trigger (default 03:00).
- LibraryImageScanTask — optional pass for cover/poster upscaling.
- Cache cleanup — prunes the pre-processing cache according to retention config.
Service-side components
FastAPI application (/app/main.py)
Routes: /health, /status, /metrics, /features, /hardware, /gpus, /models, /models/{id}, /models/{id}/download, /models/{id}/load, /models/{id}/unload, /upscale-frame, /benchmark-frame, /face-restore/*, /logs-stream (SSE).
ONNX Runtime wrapper
A single-process Python wrapper around onnxruntime-{gpu,openvino,rocm,cpu}. Hardware detection at startup picks the execution provider. Each loaded model holds a session plus preallocated input/output tensors.
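Provider selection typically reduces to a preference walk over what the runtime reports available (via `onnxruntime.get_available_providers()` in the real service). The preference order below is an assumption; the provider names are ONNX Runtime's actual identifiers.

```python
def pick_execution_provider(available: list[str]) -> str:
    """Prefer GPU-accelerated providers, falling back to CPU.

    `available` mirrors onnxruntime.get_available_providers(); the
    ordering here is an illustrative guess at the service's policy.
    """
    for provider in ("TensorrtExecutionProvider",
                     "CUDAExecutionProvider",
                     "ROCMExecutionProvider",
                     "OpenVINOExecutionProvider"):
        if provider in available:
            return provider
    return "CPUExecutionProvider"
```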
TensorRT path (CUDA only)
On first load of a given (model, resolution) combo on an Ampere+ GPU, the service emits a .plan file for TensorRT FP16. Subsequent sessions pick this up for ~2× speedup.
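The build-once, reuse-thereafter pattern for the `.plan` cache can be sketched like this. The file-naming scheme is hypothetical; only the (model, resolution) keying is taken from the text above.

```python
from pathlib import Path


def plan_path(cache_dir: Path, model_id: str, width: int, height: int) -> Path:
    """Cache location for a TensorRT engine keyed on (model, resolution).

    The naming convention is illustrative, not the service's actual layout.
    """
    return cache_dir / f"{model_id}_{width}x{height}_fp16.plan"


def get_or_build_engine(cache_dir: Path, model_id: str, width: int,
                        height: int, build_fn) -> bytes:
    """Reuse an existing .plan if present, otherwise build and persist it.

    build_fn stands in for the expensive first-load TensorRT compile.
    """
    path = plan_path(cache_dir, model_id, width, height)
    if not path.exists():
        path.write_bytes(build_fn())
    return path.read_bytes()
```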
Model catalog
A declarative manifest at /app/models/catalog.json listing each model's download URL, SHA256, scale factor, and provider compatibility; the catalog is hot-reloadable at runtime.
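Given that each catalog entry carries a SHA256, download verification is straightforward. A minimal sketch; the exact key names in catalog.json are assumptions.

```python
import hashlib


def verify_model(entry: dict, blob: bytes) -> bool:
    """Check a downloaded model file against its catalog SHA256.

    `entry` is one catalog record; the "sha256" key name is an
    assumption about the manifest's schema.
    """
    return hashlib.sha256(blob).hexdigest() == entry["sha256"]
```

A failed check would feed the "Model download fails" row of the failure-mode table rather than silently loading a corrupt model.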
Data flow: nightly library scan
- Jellyfin's scheduler fires `LibraryUpscaleScanTask.ExecuteAsync`.
- Plugin calls `HttpUpscalerService.GetHealthAsync` — aborts if the service is offline.
- Plugin enumerates Jellyfin virtual folders and filters to those in `EnabledLibraryIds` (or all if empty).
- For each item: `VideoAnalyzer.AnalyzeVideoAsync` → resolution, fps, codec, HDR, interlaced.
- If below threshold: `VideoProcessor.ProcessVideoAsync` → `ProcessingMethodExecutor` picks the pipeline → `VideoFrameProcessor.ExtractFramesAsync` dumps PNGs to a temp dir.
- Plugin POSTs each frame to `/upscale-frame`, receives the upscaled PNG, and caches it to disk.
- `VideoFrameProcessor.ReassembleAsync` pipes frames + original audio back through ffmpeg with the selected codec.
- Output lands alongside the source as `<basename>_upscaled.mp4`; Jellyfin's library watcher picks it up naturally.
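Two small pieces of that flow can be made concrete: the "below threshold" gate and the output naming. Both functions are illustrative sketches; the threshold default and the gate's exact criteria are assumptions.

```python
from pathlib import Path


def needs_upscale(width: int, height: int, target_height: int = 2160) -> bool:
    """Illustrative 'below threshold' gate: only videos shorter than the
    target vertical resolution are queued for processing."""
    return height < target_height


def upscaled_output_path(source: Path) -> Path:
    """Output naming from the last step: <basename>_upscaled.mp4,
    placed alongside the source file."""
    return source.with_name(f"{source.stem}_upscaled.mp4")
```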
Data flow: live quick-menu
- User opens a video; the Jellyfin web client loads `player-integration.js` from the plugin.
- The script injects the Upscaler button into the player toolbar.
- On click, the tabbed overlay renders — it fetches `/Upscaler/models` and `/Upscaler/filter-presets` in parallel.
- Filters-tab sliders write directly to `<video>.style.filter` — 60 fps, no round-trip.
- The Models tab calls `POST /Upscaler/models/{id}/load`, which forwards to the AI service's `/models/{id}/load`; the service warms the model in GPU memory.
- Subsequent frames rendered by Jellyfin go through the loaded model when playback seeks (requires the Real-Time AI processing mode).
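Writing to `<video>.style.filter` means composing standard CSS filter functions into one string. A sketch of that composition (in Python for consistency with the other examples; the browser-side code is JavaScript, and the slider names below are assumptions):

```python
def css_filter_string(sliders: dict) -> str:
    """Build the value the Filters tab would assign to <video>.style.filter.

    The filter functions (brightness, contrast, saturate, blur) are
    standard CSS; the slider set and their ordering are illustrative.
    """
    order = ["brightness", "contrast", "saturate", "blur"]
    parts = []
    for name in order:
        if name in sliders:
            unit = "px" if name == "blur" else ""  # blur takes a length
            parts.append(f"{name}({sliders[name]}{unit})")
    return " ".join(parts)
```

Because this string is applied purely client-side, every slider change takes effect on the next paint with no round-trip to the plugin or the AI service.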
Failure modes & recovery
| Failure | Plugin behaviour |
|---|---|
| Service offline | Test Connection shows red; scheduled task logs and skips cleanly; quick-menu shows "Standalone" badge. |
| Model download fails | Model row flips to error state with the HTTP status message. Retry button re-issues the download. |
| GPU OOM during load | Service returns 507 Insufficient Storage; plugin surfaces the service-provided hint verbatim. |
| ffmpeg/ffprobe paths unresolved | v1.6.1.16 EnsureFFmpegReady() re-queries MediaEncoder on every entry point. Logs warn if still unresolved instead of silently failing. |
| Plugin DLL missing transitive NuGet | Jellyfin tombstones the plugin. Recovery: add the missing DLL, call POST /Plugins/{id}/{version}/Enable. |