Dashboard Overview
- Total Requests Today: —
- Active Providers: —
- System Status: —
- Uptime: —
Provider Health
| Provider | Status | Error Rate | Models | Priority |
|---|---|---|---|---|
| Loading... | | | | |
Providers
Loading providers...
API Keys
| Key | Usage Today | RPM | Status | Actions |
|---|---|---|---|---|
| Select a provider | | | | |
Request Logs
| Time | Request ID | Provider | Model | Status | Latency | Tokens |
|---|---|---|---|---|---|---|
| Loading... | | | | | | |
Usage Statistics
- Requests Today: —
- Input Tokens: —
- Output Tokens: —
Daily History
| Date | Total Requests | Details |
|---|---|---|
| Loading... | | |
AI Playground
Live Ready
🦙 Ollama Playground
Connection Status
Daemon Bridge
Ollama (11434)
Daemon Logs (Ollama)
Loading...
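The panel above shows the Ollama bridge on port 11434, which is Ollama's default API port. One quick way to confirm Ollama itself is reachable is a call to its standard REST API; this is a generic Ollama request, not an OmniRouteAI-specific one:

```shell
# List locally installed models via Ollama's standard REST API.
# An empty "models" array still confirms the server is up on 11434.
curl -s http://localhost:11434/api/tags
```
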
Settings
Backend Connection
The URL where your OmniRouteAI backend is running
This is the API_KEY set in your backend .env file
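As a minimal sketch, the backend `.env` these two settings point at would contain at least the shared key; only `API_KEY` is named in the text above, so no other variables are shown:

```shell
# backend .env — this value must match the key entered in the dashboard Settings
API_KEY=replace-with-a-long-random-secret
```
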
Local Daemon Connection
Stored at ~/.omniroute/local-cli/token.txt — required for Ollama Playground
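As a sketch, a client authenticating to the local daemon could read that token file and present it as a bearer token; the daemon address and endpoint below are placeholders for illustration, not documented values:

```shell
# Read the stored local CLI token (path from the note above)
TOKEN=$(cat ~/.omniroute/local-cli/token.txt)
# Hypothetical daemon request — substitute your real daemon address/endpoint
curl -s -H "Authorization: Bearer $TOKEN" "$DAEMON_URL/status"
```
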
Quick Actions
Connection Status
Not tested yet
Local CLI Auth Management
Harvested Sessions
The daemon automatically monitors your local config files (Claude Code, Cursor, GitHub gh, etc.) and reuses their active sessions.
| Provider | Method | Status | Expires In | Actions |
|---|---|---|---|---|
| Loading status... | | | | |
Direct OAuth / MITM Settings
Inactive
Enabling MITM_PROXY=true in the daemon env allows capturing tokens from keychain-managed tools like Grok.
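A sketch of setting that flag when launching the daemon; the daemon binary name here is an assumption, not a documented command:

```shell
# Enable MITM token capture for this daemon run (binary name is hypothetical)
MITM_PROXY=true omniroute-daemon
```
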