VS Code extension
Use this section when you drive AI coding from inside VS Code through the OpenAI Codex plugin (publisher: OpenAI, extension ID openai.chatgpt), which reads the same ~/.codex/config.toml as the Codex CLI. If you have already set up Codex CLI, the plugin will pick up that configuration automatically. Otherwise follow the steps below.
Launching VS Code from a Finder / Start Menu shortcut inherits whatever
environment the OS process has, which usually does not include
OPENAI_API_KEY. For the Codex plugin to see the key, always launch VS Code
from the shell where the exports were applied (code . from a terminal).
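The inheritance rule is easy to demonstrate: a variable that is merely set in a shell is invisible to child processes (like VS Code); only exported variables are inherited. A quick demo with a placeholder value:

```shell
unset OPENAI_API_KEY                                     # start clean for the demo
OPENAI_API_KEY="your_api_key"                            # set, but NOT exported
sh -c 'echo "child sees: ${OPENAI_API_KEY:-nothing}"'    # → child sees: nothing
export OPENAI_API_KEY                                    # now child processes inherit it
sh -c 'echo "child sees: ${OPENAI_API_KEY:-nothing}"'    # → child sees: your_api_key
```

Launching `code .` from the second state is what makes the key visible to the plugin.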
Prerequisites
- Install VS Code.
- Install the OpenAI Codex plugin from the VS Code Marketplace (publisher OpenAI, extension ID openai.chatgpt), not GitHub Copilot, which is a different product and cannot be routed through WorldRouter.
- Open the plugin so it can write its config file: click the OpenAI icon in the Activity Bar on the left edge of VS Code (or open the Command Palette with Cmd+Shift+P on macOS / Ctrl+Shift+P on Windows / Linux and run ChatGPT: Focus on Chat View), then sign in once with any OpenAI account. The plugin creates ~/.codex/config.toml on first sign-in; that is the file you will edit later in this guide.
- Create a WorldRouter API key in the API Keys dashboard.
- Make sure the team has available credits on the Credits page.
- If the Codex plugin is already signed in with another provider, sign out first before switching the endpoint.
Environment (macOS / Linux)
Add this export to your shell profile (~/.zshrc or ~/.bashrc), or run it directly in your current session:
```shell
export OPENAI_API_KEY="your_api_key"
```

If you edited your shell profile, reload it so the new value takes effect in the current terminal (or just open a fresh one):

```shell
source ~/.zshrc   # or: source ~/.bashrc
```

This export overrides any OPENAI_API_KEY already set in your shell. If you also use the official OpenAI API directly, consider using a project-level .env file or a tool like direnv to scope the override to specific directories.

Environment (Windows PowerShell)
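The direnv suggestion can be sketched as a project-level .envrc (hypothetical file and key value; run `direnv allow` once after creating it). direnv then exports the variable only while your shell is inside that directory tree:

```shell
# .envrc at the project root (hypothetical example)
# direnv applies this export only inside this directory, so a global
# OPENAI_API_KEY used elsewhere stays untouched.
export OPENAI_API_KEY="your_worldrouter_key"
```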
Set the variable for the current session. To persist it across sessions, run setx OPENAI_API_KEY "your_api_key" instead and restart PowerShell:
```powershell
$env:OPENAI_API_KEY = "your_api_key"
```

Setting this overrides any OPENAI_API_KEY already set in your shell. If you also use the official OpenAI API directly, consider using a project-level .env file or a tool like direnv to scope the override to specific directories.

Codex config
Open ~/.codex/config.toml (Windows: %USERPROFILE%\.codex\config.toml; the plugin creates this file on first sign-in) and add a provider block pointing at WorldRouter:
```toml
# Tell Codex which provider and model to use by default
model_provider = "worldrouter"
model = "gpt-5.4"
model_reasoning_effort = "high"

[model_providers.worldrouter]
name = "WorldRouter"
# Base URL must include the /v1 suffix
base_url = "https://inference-api.worldrouter.ai/v1"
# Codex reads the API key from this environment variable
env_key = "OPENAI_API_KEY"
```

base_url must include the /v1 suffix. The Codex plugin sends requests directly to the URL you paste; copy it verbatim from the Base URL section of the Quickstart. No trailing slash.
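The two misconfigurations called out above (missing /v1 suffix, trailing slash) are easy to pre-check. This helper is a sketch of our own, not part of Codex or WorldRouter:

```shell
# Classify a base_url value against the rules above.
check_base_url() {
  case "$1" in
    */v1)  echo "ok" ;;
    */v1/) echo "remove the trailing slash" ;;
    *)     echo "missing the /v1 suffix" ;;
  esac
}

check_base_url "https://inference-api.worldrouter.ai/v1"   # → ok
check_base_url "https://inference-api.worldrouter.ai/v1/"  # → remove the trailing slash
check_base_url "https://inference-api.worldrouter.ai"      # → missing the /v1 suffix
```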
Restart VS Code and the Codex plugin after editing config.toml to apply changes.
Launch
Launch VS Code from the shell with OPENAI_API_KEY exported, so the Codex plugin picks it up:
```shell
cd your-project
code .
```

Verify
Open the Codex panel inside VS Code and send a short test prompt, for example: Reply with 'ok' if you can see this. A reply confirms WorldRouter is reachable and the key is valid. The dashboard usage chart should also tick up once the response returns.
Troubleshooting
- 401 Unauthorized: confirm echo $OPENAI_API_KEY is non-empty in the shell that launched VS Code. If the exports were added after launching VS Code, relaunch it from the updated shell.
- Model not found: the model in config.toml must match a model id from the Models catalog. Model ids are case-sensitive.
- Plugin still hits the default endpoint: Codex resolves model_provider to the [model_providers.<name>] table of the same name. Confirm both say worldrouter and restart VS Code.
- 404 on the request: base_url is missing the /v1 suffix. See the warning above.
Other extensions
Third-party extensions like Cline and Continue configure models in their own settings panel, but the values are the same: base URL = https://inference-api.worldrouter.ai/v1, API key = your WorldRouter key, model id from the Models catalog. Extension-specific walkthroughs are planned for the next docs pass.