Infux

Open-source AI for every coding agent

One API key. 17 open-weight models. Flat-rate pricing from $15/mo. Works with OpenClaw, Claude Code, Aider, and any OpenAI-compatible tool.

Works with

OpenClaw · Claude Code · Aider · Cline · Continue · Codex CLI · OpenCode · Cursor · Roo Code · Zed · Neovim · VS Code · JetBrains
17 Open Models · 9 Providers · 3 API Formats · <1ms Proxy Overhead
Featured Integration
OpenClaw

Power OpenClaw with Infux

215K+ GitHub Stars

OpenClaw is the fastest-growing open-source AI agent in history. Connect it to Infux for flat-rate access to 17 open-weight models — no per-token billing, no API key juggling. Just set your base URL and go.

Connect OpenClaw to Infux in 2 lines

OPENAI_API_KEY=sk-infux-your_key_here
OPENAI_BASE_URL=https://api.infux.dev/v1
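Any OpenAI-compatible client reads these two variables from the environment. A minimal sketch of how they resolve into a request target (the key and URL are the placeholders from above, not live credentials):

```python
import os

# Placeholder values from the snippet above -- substitute your real key.
os.environ["OPENAI_API_KEY"] = "sk-infux-your_key_here"
os.environ["OPENAI_BASE_URL"] = "https://api.infux.dev/v1"

def chat_completions_target() -> tuple[str, dict]:
    """Build the endpoint URL and auth header an OpenAI-style client would use."""
    base = os.environ["OPENAI_BASE_URL"].rstrip("/")
    headers = {"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"}
    return f"{base}/chat/completions", headers

url, headers = chat_completions_target()
print(url)  # https://api.infux.dev/v1/chat/completions
```

Switching back to another provider is the same move in reverse: change `OPENAI_BASE_URL` and nothing else.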

Why developers switch

Same quality. 70% less.

MiniMax M2.5 scores 80.2% on SWE-bench — within 0.7% of Claude Opus 4.5 — at a fraction of the cost.

-70%: Claude Max 5× ($100/mo) → Infux Pro ($30/mo)
-75%: Claude Max 20× ($200/mo) → Infux Max ($50/mo)
-50%: Cursor Ultra ($200/mo) → Infux Ultra ($100/mo)
-23%: Copilot Pro+ ($39/mo) → Infux Pro ($30/mo)
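The discount figures follow directly from the list prices; a quick arithmetic check:

```python
def savings_pct(competitor: float, infux: float) -> int:
    """Percent saved moving from a competitor plan to an Infux plan, rounded."""
    return round((competitor - infux) / competitor * 100)

comparisons = [
    ("Claude Max 5x vs Infux Pro", 100, 30),    # -70%
    ("Claude Max 20x vs Infux Max", 200, 50),   # -75%
    ("Cursor Ultra vs Infux Ultra", 200, 100),  # -50%
    ("Copilot Pro+ vs Infux Pro", 39, 30),      # -23%
]
for label, them, us in comparisons:
    print(f"{label}: -{savings_pct(them, us)}%")
```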

SWE-bench Verified — Coding Benchmark

Claude Opus 4.5: 80.9% ($100/mo)
MiniMax M2.5: 80.2% ($30/mo with Infux)
GLM-5: 77.8% (included)
Kimi K2.5: 76.8% (included)

Quick Start

Two lines. That's it.

Drop these into any tool that supports the OpenAI API format.

# .env
OPENAI_API_KEY=sk-infux-your_key_here
OPENAI_BASE_URL=https://api.infux.dev/v1

Built for agents

Infrastructure that disappears

3 API Formats

OpenAI, Anthropic, and Responses API — one endpoint serves all three.
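Infux doesn't publish its routing internals; one plausible sketch of how a single endpoint can serve all three formats is to dispatch on the request path (the paths below are the standard OpenAI, Anthropic, and Responses API routes, not confirmed Infux code):

```python
# Hypothetical dispatch table -- illustrative only, not Infux's actual router.
FORMAT_BY_PATH = {
    "/v1/chat/completions": "openai",     # OpenAI Chat Completions
    "/v1/messages": "anthropic",          # Anthropic Messages API
    "/v1/responses": "responses",         # OpenAI Responses API
}

def detect_format(path: str) -> str:
    """Map an incoming request path to the API format it speaks."""
    try:
        return FORMAT_BY_PATH[path]
    except KeyError:
        raise ValueError(f"unsupported API path: {path}")
```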

Zero-Copy Streaming

Raw SSE passthrough. Sub-millisecond proxy overhead. No buffering, ever.
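"Passthrough" here means each server-sent-event chunk is relayed the moment it arrives, rather than being assembled into a full response first. A toy sketch of that pattern over fake chunks:

```python
from typing import Iterable, Iterator

def sse_passthrough(upstream: Iterable[bytes]) -> Iterator[bytes]:
    """Yield each SSE chunk unmodified as soon as the upstream produces it."""
    for chunk in upstream:
        yield chunk  # no buffering, no re-encoding

# Fake upstream chunks in SSE wire format.
fake_upstream = [b'data: {"delta": "Hel"}\n\n', b'data: {"delta": "lo"}\n\n', b"data: [DONE]\n\n"]
relayed = list(sse_passthrough(fake_upstream))
```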

Auto-Fallback

Every model has 1–3 provider backends. Failover in <500ms. You never see outages.
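The failover behavior can be pictured as trying a model's provider backends in order and returning the first success. A toy sketch (the provider callables are stand-ins, not real backends):

```python
def call_with_fallback(providers, prompt):
    """Try each provider backend in order; return the first successful response."""
    last_error = None
    for provider in providers:
        try:
            return provider(prompt)
        except Exception as exc:  # a real gateway would also enforce a timeout
            last_error = exc
    raise RuntimeError("all provider backends failed") from last_error

# Stand-in backends: the first is down, the second answers.
def flaky(prompt):
    raise ConnectionError("backend unavailable")

def healthy(prompt):
    return f"echo: {prompt}"

print(call_with_fallback([flaky, healthy], "hi"))  # echo: hi
```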

Real-Time Dashboard

Track tokens, requests, latency per model. See your quota in real time.

Open Source (MIT)

Gateway proxy is fully open source. Inspect, contribute, or self-host.

Zero Vendor Lock-In

Standard OpenAI API. Switch providers by changing one env variable.

Model Registry

17 open-weight models

All open-weight. MIT & Apache 2.0 licensed. Zero vendor lock-in.

minimax-m2.5: 80.2%
glm-5: 77.8%
kimi-k2.5: 76.8%
glm-4.7: 73.8%
devstral-2: 72.2%
kimi-k2.5-thinking: 76.8%
deepseek-r1: 49.2%
deepseek-v3-0324: 42.1%
llama-4-maverick: 50.1%
llama-4-scout: -
mistral-small: -
qwen3-32b: -
qwen3-235b: -
qwen3-235b-thinking: -
glm-4.7-flash: -
glm-4.7-flashx: -
mistral-large: -

Pricing

Flat-rate. No surprises.

No per-token billing. No overages. Hit the limit? It resets. Never pay extra.

STARTER
$15.00/mo

For individual developers getting started

  • 500 messages per 5 hours
  • 2,500 messages per week
  • All 17 models included
  • SSE streaming
  • Dashboard access
Most Popular
PRO
$30.00/mo

For active developers who need more

  • 2,000 messages per 5 hours
  • 10,000 messages per week
  • All 17 models included
  • Priority support
  • All Starter features
ULTRA
$100.00/mo

Maximum power for heavy usage

  • 5,000 messages per 5 hours
  • 25,000 messages per week
  • All 17 models included
  • Direct support channel
  • All Pro features
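The quotas above are windowed rather than metered: once a window lapses, the count resets and there is nothing extra to pay. A toy rolling-window counter illustrating the idea (limits from the Starter plan; timestamps injected for clarity, and the exact window mechanics are an assumption, not Infux's documented algorithm):

```python
from collections import deque

class RollingQuota:
    """Allow at most `limit` messages within any `window_s`-second window."""

    def __init__(self, limit: int, window_s: float):
        self.limit = limit
        self.window_s = window_s
        self.stamps: deque = deque()

    def allow(self, now: float) -> bool:
        # Drop timestamps that have aged out of the window.
        while self.stamps and now - self.stamps[0] >= self.window_s:
            self.stamps.popleft()
        if len(self.stamps) < self.limit:
            self.stamps.append(now)
            return True
        return False  # over quota: the request is refused, never billed

# Starter plan: 500 messages per 5 hours.
starter = RollingQuota(limit=500, window_s=5 * 3600)
```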

Start building with open models today

Get your API key in seconds. No credit card required for the first 7 days.