About OpenRouter API Status
OpenRouter is a unified AI inference gateway that routes requests across 200+ models from OpenAI, Anthropic, Google, Meta, Mistral, and many others through a single OpenAI-compatible API. This page tracks OpenRouter API outages, degradations, and incidents in real time, automatically updated every 60 seconds from our monitoring infrastructure.
Official status page: https://status.openrouter.ai
Common OpenRouter Outage Symptoms
- HTTP 429 — rate limit exceeded on OpenRouter's own limits or upstream provider capacity
- HTTP 502 / 503 — routing errors when upstream providers are degraded
- Model-specific routing failures when a particular upstream provider is down
- Elevated latency as requests are retried across multiple providers during incidents
- Unexpected model switching when primary providers are unavailable
- Credit balance issues causing requests to be blocked during high-spend periods
What to Do During an OpenRouter Outage
- Honor the Retry-After header on 429 responses and apply exponential backoff.
- Explicitly specify fallback models using the 'models' array parameter in your request to control routing.
- Switch to a BYOK proxy (AI Badgr) for direct provider access, bypassing OpenRouter during its incidents.
- Monitor the official OpenRouter status page at status.openrouter.ai for incident announcements.
- Set the 'route' parameter to 'fallback' in your request to enable OpenRouter's built-in failover across providers.
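The retry advice above can be sketched as a small helper. The Retry-After header is standard HTTP; the base delay and cap below are illustrative defaults, not values recommended by OpenRouter:

```python
import random

def retry_delay(attempt, retry_after=None, base=1.0, cap=60.0):
    """Seconds to wait before retrying a 429/5xx response.

    Honors the server's Retry-After header when present; otherwise
    uses exponential backoff with full jitter, capped at `cap`.
    """
    if retry_after is not None:
        return float(retry_after)
    return random.uniform(0.0, min(cap, base * (2 ** attempt)))
```

Full jitter keeps retries from many clients from synchronizing into waves that prolong the incident.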
OpenRouter Outage FAQ
Is OpenRouter down right now?
This page checks our live monitoring infrastructure (updated every 60 seconds), which tracks the official OpenRouter status page and our own request telemetry. The status badge at the top reflects the current state.
Why is OpenRouter returning errors even though individual providers seem fine?
OpenRouter adds its own routing layer, so even if upstream providers are healthy, routing infrastructure issues can cause errors. Check status.openrouter.ai specifically for OpenRouter platform issues.
How do I handle OpenRouter 502 errors?
HTTP 502 from OpenRouter means the upstream provider returned an error. Use the 'models' array to specify multiple fallback models in priority order. OpenRouter will attempt each in sequence.
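A minimal request body using the 'models' array might look like the sketch below. The model IDs are illustrative examples only; check OpenRouter's model list for current names:

```python
# Fallback routing request body (model IDs are example values).
payload = {
    "models": [                      # tried in priority order
        "openai/gpt-4o",
        "anthropic/claude-3.5-sonnet",
    ],
    "route": "fallback",             # enable built-in failover
    "messages": [
        {"role": "user", "content": "Hello"},
    ],
}
```

The same body is then POSTed to the chat completions endpoint exactly as a single-model request would be.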
Can I automatically failover away from OpenRouter during an outage?
Yes. AI Badgr can proxy requests directly to individual providers, bypassing OpenRouter entirely. Change one line of code (base_url) and route directly to OpenAI, Anthropic, or other providers.
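The failover itself can be as simple as swapping the base URL. The two endpoints below are the public OpenRouter and OpenAI API roots; how you detect an outage is left as an assumption:

```python
OPENROUTER_URL = "https://openrouter.ai/api/v1"
DIRECT_OPENAI_URL = "https://api.openai.com/v1"

def pick_base_url(openrouter_healthy):
    """Route via OpenRouter when healthy, else go direct to the provider."""
    return OPENROUTER_URL if openrouter_healthy else DIRECT_OPENAI_URL
```

An OpenAI-compatible client then takes the chosen URL as its `base_url` argument, which is the one-line change described above.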
Does OpenRouter support all OpenAI API features?
OpenRouter supports the core chat completions and streaming endpoints in an OpenAI-compatible format. Some advanced features (like OpenAI function calling syntax on non-OpenAI models) may behave differently. Check OpenRouter's model-specific documentation.
Never get stuck in an OpenRouter outage again
AI Badgr acts as a transparent proxy for your existing API keys. One line of code change. Zero vendor lock-in. Instant failover when OpenRouter is down.
Get Started Free →