Total Requests Today
Active Providers
System Status
Uptime

Provider Health

Provider Status Error Rate Models Priority
Loading...
Loading providers...
Key Usage Today RPM Status Actions
Select a provider
Time Request ID Provider Model Status Latency Tokens
Loading...
Requests Today
Input Tokens
Output Tokens

Daily History

Date Total Requests Details
Loading...

Configuration

How can I help you test your AI providers today?

Connection Status

Daemon Bridge
Ollama (11434)

Configuration

Welcome to the local Ollama playground. Select a model and start chatting — all requests go directly to your local Ollama instance.

Daemon Logs (Ollama)

Loading...

Backend Connection

The URL where your OmniRouteAI backend is running
This is the API_KEY set in your backend .env file
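For reference, the backend .env entry this setting refers to might look like the following sketch (only the API_KEY name comes from this page; the value is a placeholder):

```shell
# backend .env (placeholder value -- use your own secret)
API_KEY=your-secret-key
```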

Local Daemon Connection

Stored at ~/.omniroute/local-cli/token.txt — required for Ollama Playground

Quick Actions

Connection Status

Not tested yet

Harvested Sessions

The daemon automatically monitors your local config files (Claude Code, Cursor, GitHub CLI, etc.) and borrows their active sessions.

Provider Method Status Expires In Actions
Loading status...

Direct OAuth / MITM Settings

Inactive
Setting MITM_PROXY=true in the daemon's environment allows capturing tokens from keychain-managed tools like Grok.
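A minimal sketch of opting in, assuming the daemon reads its environment from a .env file (the file location is an assumption; only the MITM_PROXY flag appears above):

```shell
# daemon .env -- off by default; enables token capture from keychain-managed tools
MITM_PROXY=true
```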