
Integration Guide

Using Infux with Codex CLI

Step-by-step setup for using Infux with Codex CLI

Prerequisites

  • Codex CLI installed (npm install -g @openai/codex)
  • An Infux API key
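To confirm the prerequisites are in place, a small shell helper can check that the binary is on your PATH (the `check_cli` function is our own illustration, not part of Codex CLI):

```shell
# check_cli is a hypothetical helper that verifies a command exists on PATH
check_cli() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1 is installed"
  else
    echo "$1 not found" >&2
    return 1
  fi
}

check_cli codex || echo "install it with: npm install -g @openai/codex"
```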

Quick Start

Set the environment variables and run:

terminal
export OPENAI_API_KEY="sk-infux-your_key_here"
export OPENAI_BASE_URL="https://api.infux.dev/v1"

codex "Refactor this function to use async/await"

Persistent Configuration

Create a config.toml for persistent multi-provider setup:

~/.codex/config.toml
model = "deepseek-v3-0324"
model_provider = "infux"

[model_providers.infux]
name = "Infux"
base_url = "https://api.infux.dev/v1"
env_key = "INFUX_API_KEY"
wire_api = "chat"

Then set your key: export INFUX_API_KEY="sk-infux-your_key_here"
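If Codex still seems to ignore the provider, a quick grep can confirm the two lines that matter are actually present in the file. The `check_provider_config` helper below is our own sketch; the sample config it checks mirrors the TOML above:

```shell
# check_provider_config is a hypothetical helper, not part of Codex CLI:
# it verifies the top-level provider selection and the provider table exist.
check_provider_config() {
  grep -q 'model_provider = "infux"' "$1" &&
  grep -q '\[model_providers\.infux\]' "$1"
}

# Demonstrated against a sample file; point it at ~/.codex/config.toml for real use
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
model = "deepseek-v3-0324"
model_provider = "infux"

[model_providers.infux]
name = "Infux"
base_url = "https://api.infux.dev/v1"
env_key = "INFUX_API_KEY"
wire_api = "chat"
EOF

if check_provider_config "$cfg"; then
  echo "infux provider configured"
fi
rm -f "$cfg"
```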

Alternative: Environment Variables Only

~/.zshrc
export OPENAI_API_KEY="sk-infux-your_key_here"
export OPENAI_BASE_URL="https://api.infux.dev/v1"

This overrides the default OpenAI endpoint for every tool that reads OPENAI_BASE_URL, not just Codex. Reload your shell (source ~/.zshrc), then run codex "your prompt".
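To confirm the override is active in your current shell session, you can compare the variable against the Infux endpoint. The `check_base_url` helper is illustrative, not part of any tool:

```shell
# check_base_url is a hypothetical helper: it confirms the current shell
# is pointed at Infux rather than the default OpenAI endpoint.
check_base_url() {
  if [ "${OPENAI_BASE_URL:-}" = "https://api.infux.dev/v1" ]; then
    echo "base URL override active"
  else
    echo "OPENAI_BASE_URL is '${OPENAI_BASE_URL:-unset}'" >&2
    return 1
  fi
}

# Simulates what ~/.zshrc sets; in practice, reload with: source ~/.zshrc
export OPENAI_BASE_URL="https://api.infux.dev/v1"
check_base_url
```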

Recommended Models

| Use Case        | Model            | Why                                 |
| --------------- | ---------------- | ----------------------------------- |
| General coding  | deepseek-v3-0324 | Fast, great code quality            |
| Large codebases | kimi-k2.5        | 256K context for big repos          |
| Reasoning tasks | deepseek-r1      | Extended thinking for hard problems |

Verify

terminal
codex "What is 2+2?"

Troubleshooting

Model not found

Ensure you are using an Infux model ID (e.g., deepseek-v3-0324, not deepseek-chat).
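To see the exact IDs your key accepts, you can list models from the API. Since the base URL is an OpenAI-compatible /v1 endpoint, a GET to /v1/models should return them (this assumes Infux implements that standard route; the JSON below is a made-up sample used to show the parsing):

```shell
# Live check (assumes the OpenAI-compatible /v1/models route):
#   curl -s https://api.infux.dev/v1/models -H "Authorization: Bearer $INFUX_API_KEY"
# Sample response, for illustration only:
response='{"object":"list","data":[{"id":"deepseek-v3-0324"},{"id":"kimi-k2.5"},{"id":"deepseek-r1"}]}'

# Pull out just the "id" fields - these are the exact names to pass to codex
echo "$response" | grep -o '"id":"[^"]*"' | cut -d'"' -f4
```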

Unauthorized

Verify your API key starts with sk-infux- and is active in the dashboard.
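Before blaming the server, a local check of the key's prefix rules out the most common mistake (pasting an OpenAI key instead of an Infux one). The `check_key_format` helper is our own sketch:

```shell
# check_key_format is a hypothetical helper: it only validates the prefix,
# not whether the key is active - that requires the dashboard.
check_key_format() {
  case "$1" in
    sk-infux-*) echo "key format looks right" ;;
    *) echo "key should start with sk-infux-" >&2; return 1 ;;
  esac
}

check_key_format "sk-infux-your_key_here"
```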

Using config.toml but still hitting OpenAI

Set model_provider = "infux" at the top level to route requests through your custom provider.