Integration Guide
Step-by-step setup for using Infux with OpenCode
Install OpenCode:

```shell
go install github.com/opencode-ai/opencode@latest
# or
brew install opencode
```

Then create an `opencode.json` in your project root:
```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "infux": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Infux",
      "options": {
        "baseURL": "https://api.infux.dev/v1",
        "apiKey": "{env:INFUX_API_KEY}"
      },
      "models": {
        "deepseek-v3-0324": {
          "name": "DeepSeek V3",
          "limit": { "context": 131072, "output": 16384 }
        },
        "kimi-k2.5": {
          "name": "Kimi K2.5",
          "limit": { "context": 262144, "output": 16384 }
        },
        "deepseek-r1": {
          "name": "DeepSeek R1",
          "limit": { "context": 131072, "output": 16384 }
        }
      }
    }
  }
}
```

Then set your API key:

```shell
export INFUX_API_KEY="sk-infux-your_key_here"
```
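Before launching OpenCode, it can be worth sanity-checking the config file. A minimal sketch follows; the required keys (`baseURL`, `apiKey`, `models`, `limit`) are inferred from the example above, not from an official OpenCode schema:

```python
import json

def validate_provider_config(config: dict) -> list[str]:
    """Collect problems with the 'provider' block of an opencode.json."""
    problems = []
    if "provider" not in config:
        problems.append("no 'provider' block")
    for name, provider in config.get("provider", {}).items():
        options = provider.get("options", {})
        # Keys taken from the example config above.
        for key in ("baseURL", "apiKey"):
            if key not in options:
                problems.append(f"{name}: options missing '{key}'")
        models = provider.get("models", {})
        if not models:
            problems.append(f"{name}: no models defined")
        for model_id, model in models.items():
            limit = model.get("limit", {})
            if "context" not in limit or "output" not in limit:
                problems.append(f"{name}/{model_id}: incomplete 'limit'")
    return problems

# Trimmed version of the config above.
sample = json.loads("""
{
  "provider": {
    "infux": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {"baseURL": "https://api.infux.dev/v1", "apiKey": "{env:INFUX_API_KEY}"},
      "models": {
        "deepseek-v3-0324": {"limit": {"context": 131072, "output": 16384}}
      }
    }
  }
}
""")
print(validate_provider_config(sample))  # [] -> no problems found
```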
For quick setup without a config file:
```shell
export OPENAI_API_KEY="sk-infux-your_key_here"
export OPENAI_BASE_URL="https://api.infux.dev/v1"
```
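With those two variables set, any OpenAI-compatible client can talk to the endpoint. As a sketch of what such a client sends (this only assembles the request using the standard OpenAI-compatible wire format; nothing is actually sent, and the fallback URL is the one from the config above):

```python
import json
import os

def build_chat_request(model: str, prompt: str) -> tuple[str, dict, bytes]:
    """Assemble URL, headers, and body for an OpenAI-compatible chat call."""
    base_url = os.environ.get("OPENAI_BASE_URL", "https://api.infux.dev/v1")
    api_key = os.environ.get("OPENAI_API_KEY", "")
    url = f"{base_url}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, headers, body

url, headers, body = build_chat_request("deepseek-v3-0324", "hello")
print(url)
```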
| Use Case | Model | Why |
|---|---|---|
| General coding | deepseek-v3-0324 | Good balance of speed and quality |
| Large codebases | kimi-k2.5 | 256K context for big repos |
| Reasoning tasks | deepseek-r1 | Extended thinking for hard problems |
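If you select models programmatically, the table above can be encoded as a small helper. The model IDs and context windows come from the config above; the selection thresholds are illustrative assumptions, not Infux guidance:

```python
# Context windows (tokens) from the model table above.
CONTEXT_LIMITS = {
    "deepseek-v3-0324": 131072,
    "kimi-k2.5": 262144,
    "deepseek-r1": 131072,
}

def pick_model(prompt_tokens: int, needs_reasoning: bool = False) -> str:
    """Pick a model per the table: reasoning -> deepseek-r1,
    oversized prompts -> kimi-k2.5, otherwise the general default."""
    if needs_reasoning and prompt_tokens <= CONTEXT_LIMITS["deepseek-r1"]:
        return "deepseek-r1"
    if prompt_tokens > CONTEXT_LIMITS["deepseek-v3-0324"]:
        return "kimi-k2.5"
    return "deepseek-v3-0324"
```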
Launch the TUI:

```shell
opencode
```

Select your Infux model from the provider dropdown and start coding.
If models fail to load, ensure `npm` is set to `"@ai-sdk/openai-compatible"` in the provider config. OpenCode uses the Vercel AI SDK under the hood.
Use the `{env:INFUX_API_KEY}` syntax in the config and set the environment variable separately. Verify your key starts with `sk-infux-`.
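To catch a missing or malformed key before OpenCode does, a quick check can help. This is a sketch: the `sk-infux-` prefix is the only format rule stated above, and the allowed character set after it is an assumption:

```python
import os
import re

def check_infux_key(key) -> str:
    """Return a human-readable verdict on an Infux API key string."""
    if not key:
        return "INFUX_API_KEY is not set"
    # Prefix per the guide; the character class after it is assumed.
    if not re.fullmatch(r"sk-infux-[A-Za-z0-9_-]+", key):
        return "key does not match the expected sk-infux-... format"
    return "key format looks OK"

print(check_infux_key(os.environ.get("INFUX_API_KEY")))
```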