
Operator Deck

Ollama Command Bridge

Built for tinkerers who break things on purpose. Run diagnostics, simulate recovery steps, and pressure-test your local or remote inference path.

systemd · Model Relocation · OpenWebUI Link · Failure Drills

Live Container Lab

This page now launches a real ephemeral container. Practice commands in isolation, then apply the same workflow on your persistent Ollama host.


Command Deck

Tap a check to replay output in the terminal pane.

Service health

Topology Pulse

Linux Workstation

Ollama daemon + local models

Healthy

Remote Inference Host

GPU-backed Ollama endpoint

Check ACL

OpenWebUI

Client over HTTP bridge

Connected
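The "Check ACL" state on the remote host usually comes down to where the daemon listens: Ollama binds to 127.0.0.1:11434 by default, so a remote OpenWebUI client cannot reach it until the bind address changes. A minimal systemd drop-in sketch — the interface and port here are assumptions, not taken from this page:

```ini
# /etc/systemd/system/ollama.service.d/override.conf (hypothetical drop-in)
[Service]
# Listen on all interfaces instead of loopback only
Environment="OLLAMA_HOST=0.0.0.0:11434"
```

After editing, run `systemctl daemon-reload && systemctl restart ollama`, and make sure your firewall actually permits the client subnet to reach port 11434.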

Failure Matrix

No models listed

Likely cause: Incorrect OLLAMA_MODELS path or mount not ready

Fast fix: Verify the systemd override and its RequiresMountsFor= line, then systemctl daemon-reload and restart the service
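The override-plus-RequiresMountsFor fix can be sketched as a drop-in unit; /mnt/models is a hypothetical relocation path, not one from this page:

```ini
# /etc/systemd/system/ollama.service.d/override.conf (example path assumed)
[Unit]
# Don't start the daemon until the model volume is actually mounted
RequiresMountsFor=/mnt/models

[Service]
# Point Ollama at the relocated model store
Environment="OLLAMA_MODELS=/mnt/models"
```

Apply with `systemctl daemon-reload && systemctl restart ollama`, then confirm the models reappear with `ollama list`.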

OpenWebUI cannot connect

Likely cause: Wrong base URL (often includes /api)

Fast fix: Set the base URL to http://host:11434 exactly (no /api suffix) and validate with curl http://host:11434/api/tags
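A common trip-up is pasting an endpoint that already ends in /api. A tiny shell sketch of that sanity check — `check_url` and the gpu-host URL are illustrative names, not from this page:

```shell
# Hypothetical helper: OpenWebUI wants the bare host:port, not .../api
check_url() {
  case "$1" in
    */api|*/api/*) echo "strip /api from the base URL" ;;
    *) echo "base URL ok" ;;
  esac
}

check_url "http://gpu-host:11434/api"   # common mistake
check_url "http://gpu-host:11434"       # correct form
# Then confirm the endpoint itself answers:
#   curl -fsS http://gpu-host:11434/api/tags
```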

Service exits on restart

Likely cause: Permission mismatch or inaccessible model directory

Fast fix: Check journalctl -u ollama and directory ownership
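The directory side of that check can be sketched in shell; `MODELS_DIR` and the /tmp fallback are placeholders, not real Ollama defaults:

```shell
# Sketch only: verify the model directory is readable and traversable.
# journalctl -u ollama -n 50 --no-pager   # always read the real error first
MODELS_DIR="${OLLAMA_MODELS:-/tmp/demo-models}"
mkdir -p "$MODELS_DIR"
if [ -r "$MODELS_DIR" ] && [ -x "$MODELS_DIR" ]; then
  echo "models dir readable"
else
  echo "fix ownership, e.g.: chown -R ollama:ollama $MODELS_DIR"
fi
```

On a real host, the ownership fix must match whatever User= the ollama unit runs as.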

Hard Mode Missions

0/5 complete