Ollama Command Bridge
Built for tinkerers who break things on purpose. Run diagnostics, simulate recovery steps,
and pressure-test your local or remote inference path.
systemd · Model Relocation · OpenWebUI Link · Failure Drills
Live Container Lab
This page now launches a real ephemeral container. Practice commands in isolation, then apply the same
workflow on your persistent Ollama host.
Connecting to lab engine...
Checking if real Linux environments are available
Real Container Lab
Launch an ephemeral LXC container for hands-on practice. Auto-destroys after 60 minutes.
Prefer offline mode?
Provisioning a real Linux container on isolated infrastructure
Creating LXC container on Proxmox... Booting Alpine Linux & installing tools... Opening live terminal session...
Failed to create lab
Connected to live container -- this is a real, isolated Linux environment
Simulation Mode -- Could not reach lab engine. Using client-side sandbox.
Session ---
Time Left --:--
Lab Engine Status
Checking
Active sessions --
Capacity --
Template --
Last check --
Waiting for lab engine response...
Command Deck
Tap a check to replay output in the terminal pane.
Service health
Topology Pulse
Linux Workstation
Ollama daemon + local models
Healthy
Remote Inference Host
GPU-backed Ollama endpoint
Check ACL
OpenWebUI
Client over HTTP bridge
Connected
Failure Matrix
No models listed
Likely cause: Incorrect OLLAMA_MODELS path or mount not ready
Fast fix: Verify the systemd override sets OLLAMA_MODELS and RequiresMountsFor, then run daemon-reload and restart the service
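The override referenced above can be sketched as a systemd drop-in like the following; the `/mnt/models` path is an assumption for illustration, so substitute the mount point your models actually live on:

```ini
# /etc/systemd/system/ollama.service.d/override.conf
# (created via: sudo systemctl edit ollama)

[Unit]
# Don't start the daemon until the model volume is mounted.
# Path is illustrative -- replace with your real mount point.
RequiresMountsFor=/mnt/models

[Service]
Environment="OLLAMA_MODELS=/mnt/models"
```

After editing, apply with `sudo systemctl daemon-reload && sudo systemctl restart ollama`, then confirm the models reappear with `ollama list`.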
OpenWebUI cannot connect
Likely cause: Wrong base URL (often includes /api)
Fast fix: Set the base URL to http://host:11434 with no /api suffix, then validate from the OpenWebUI host with curl against /api/tags
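A minimal sanity check for this failure mode, assuming a hypothetical Ollama endpoint at 192.168.1.50:11434 (substitute your own host):

```shell
# OpenWebUI expects the Ollama root URL and appends /api itself,
# so the base URL must not already end in /api.
OLLAMA_BASE="http://192.168.1.50:11434"   # hypothetical endpoint

case "$OLLAMA_BASE" in
  */api|*/api/) echo "Base URL should not end in /api: $OLLAMA_BASE" ;;
  *)            echo "Base URL looks sane: $OLLAMA_BASE" ;;
esac

# From the OpenWebUI host, confirm the endpoint answers
# (uncomment on a machine that can reach the Ollama host):
# curl -s "$OLLAMA_BASE/api/tags"
```

If the curl call returns a JSON model list, the network path is fine and the problem is the URL configured inside OpenWebUI.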
Service exits on restart
Likely cause: Permission mismatch or inaccessible model directory
Fast fix: Check journalctl -u ollama and directory ownership
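The ownership half of that fix can be sketched like this; the directory and the `ollama` service user reflect a typical Linux install and are assumptions to adjust for your host:

```shell
# Inspect recent service logs for the failure reason (run on the Ollama host):
# journalctl -u ollama --no-pager -n 50

# Assumed defaults -- override via environment if your install differs.
MODEL_DIR="${MODEL_DIR:-/usr/share/ollama/.ollama/models}"
SVC_USER="${SVC_USER:-ollama}"

if [ -d "$MODEL_DIR" ]; then
  owner=$(stat -c '%U' "$MODEL_DIR")
  if [ "$owner" = "$SVC_USER" ]; then
    echo "ownership OK: $owner owns $MODEL_DIR"
  else
    echo "ownership mismatch: $owner owns $MODEL_DIR, service runs as $SVC_USER"
  fi
else
  echo "model directory missing or not mounted: $MODEL_DIR"
fi

# Fix a mismatch with (destructive to existing ownership, so check first):
# sudo chown -R "$SVC_USER:$SVC_USER" "$MODEL_DIR"
```

A "missing or not mounted" result here usually points back to the first row of the matrix: the mount was not ready when the service started.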