# Agentsflare Codex Configuration Guide
Codex is OpenAI's official CLI coding tool. It accepts natural-language instructions for writing, debugging, and refactoring code.
## Installing Codex
Install globally via npm:
```shell
npm install -g @openai/codex
```

Or run directly with npx (no installation required):
```shell
npx @openai/codex
```

Verify installation:
```shell
codex --version
```

## Configuring Codex
Codex reads API settings from environment variables or a configuration file. The two options below show how to point it at the Agentsflare proxy.
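For the environment-variable route, note that only *exported* variables are inherited by child processes such as the `codex` binary; a plain assignment stays local to the current shell. A minimal POSIX-shell illustration:

```shell
# An exported variable is visible inside a child process;
# without `export`, the child would see an empty value.
OPENAI_BASE_URL="https://api.agentsflare.com/v1"
export OPENAI_BASE_URL

# Spawn a child shell (standing in for the codex process) and
# read the variable from inside it.
CHILD_SEES=$(sh -c 'printf %s "$OPENAI_BASE_URL"')
echo "$CHILD_SEES"
```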
### Option 1: Environment Variables (Recommended)
Set the following environment variables in your terminal before starting Codex:
```shell
export OPENAI_API_KEY="Your Agentsflare API Key"
export OPENAI_BASE_URL="https://api.agentsflare.com/v1"
```

Then run Codex in the same terminal session:

```shell
codex
```

### Option 2: Configuration File
Create or modify the Codex configuration file:
- macOS / Linux: `~/.codex/config.yaml`
- Windows: `%USERPROFILE%\.codex\config.yaml`
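As a sketch, the file can also be created from a POSIX shell in one step (macOS/Linux path; the values are the placeholders used throughout this guide):

```shell
# Create ~/.codex and write the configuration file non-interactively.
# The quoted heredoc delimiter ('EOF') prevents variable expansion,
# so the placeholder strings are written literally.
CONFIG_DIR="$HOME/.codex"
mkdir -p "$CONFIG_DIR"
cat > "$CONFIG_DIR/config.yaml" <<'EOF'
model: gpt-5.2
api_key: "Your Agentsflare API Key"
base_url: "https://api.agentsflare.com/v1"
EOF
```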
Configuration file content:
```yaml
model: gpt-5.2
api_key: "Your Agentsflare API Key"
base_url: "https://api.agentsflare.com/v1"
```

## API Endpoint Reference
Agentsflare provides two OpenAI-compatible API endpoints. Choose based on your needs:
| Endpoint | URL | Use Case |
|---|---|---|
| Chat Completions | https://api.agentsflare.com/v1/chat/completions | Standard chat, code generation, general tasks |
| Responses | https://api.agentsflare.com/v1/responses | Complex reasoning, tool calling, multi-turn context |
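To make the difference concrete, a raw Chat Completions request to the first endpoint would carry a JSON body like the following. This is an illustration only (the `curl` call is commented out because it needs a valid key), and `gpt-5.2` follows the model name used elsewhere in this guide:

```shell
# Minimal Chat Completions request body for the proxy endpoint.
BODY='{"model":"gpt-5.2","messages":[{"role":"user","content":"Hello"}]}'

# To send it for real (requires OPENAI_API_KEY to be set):
# curl -s "https://api.agentsflare.com/v1/chat/completions" \
#   -H "Authorization: Bearer $OPENAI_API_KEY" \
#   -H "Content-Type: application/json" \
#   -d "$BODY"

echo "$BODY"
```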
### Configure Chat Completions Endpoint
```shell
export OPENAI_API_KEY="Your Agentsflare API Key"
export OPENAI_BASE_URL="https://api.agentsflare.com/v1/chat/completions"
```

### Configure Responses Endpoint
```shell
export OPENAI_API_KEY="Your Agentsflare API Key"
export OPENAI_BASE_URL="https://api.agentsflare.com/v1/responses"
```

## Common Commands
Start an interactive session:
```shell
codex
```

Execute a single instruction directly:
```shell
codex "Refactor utils.js into TypeScript"
```

Launch with a specific model:
```shell
codex --model gpt-5.2
```

View help:
```shell
codex --help
```

## Verify Configuration
After configuration, test the connection with:
```shell
codex "Hi, please briefly introduce yourself"
```

If Codex responds normally, the configuration is correct and you have successfully connected to OpenAI models through the Agentsflare proxy.
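If the test prompt fails instead, a common cause is a missing or unexported variable. As a sketch, a small pre-flight function can fail fast with a clear message before an interactive session is launched (the function name `preflight` is illustrative, not part of Codex):

```shell
# Report any required variable that is unset or empty, and return
# nonzero so the launch can be aborted.
preflight() {
  missing=0
  for v in OPENAI_API_KEY OPENAI_BASE_URL; do
    eval "val=\${$v:-}"
    if [ -z "$val" ]; then
      echo "missing: $v" >&2
      missing=1
    fi
  done
  return "$missing"
}

# Usage: only start Codex when the check passes.
# preflight && codex "Hi, please briefly introduce yourself"
```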