OpenAI’s Codex CLI is a coding agent that runs locally and shells out to other tools. Authsome plugs in cleanly: wrap the agent invocation with `uvx authsome run --` and the proxy injects fresh credentials for every service the agent calls.
Run Codex under the auth proxy by wrapping its launch command.
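As a minimal sketch of the wrapper, assuming Codex is started with a `codex` binary (substitute however you normally launch the agent):

```shell
# Launch Codex under the Authsome proxy: every outbound call the agent
# makes is authenticated at the proxy layer, so no raw keys live in
# your shell profile or Codex's config.
uvx authsome run -- codex
```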
## What you get
- One login per service. Codex never sees raw `client_secrets`, tokens, or API keys.
- Automatic refresh. OAuth tokens stay fresh between Codex sessions.
- Multi-account support. Run Codex against your work GitHub one minute, personal the next.
- The same vault. Connections created from Codex live alongside connections from the regular CLI.
## Recommended pattern: run Codex under the proxy
- `OPENAI_API_KEY=authsome-proxy-managed` is set in Codex’s environment, so its SDK initializes.
- Outbound HTTPS calls to `api.openai.com`, `api.github.com`, and any other matched provider are intercepted and authenticated at the proxy layer.
- Codex’s own subprocess calls inherit the same `HTTP_PROXY`/`HTTPS_PROXY` variables, so tools Codex spawns are also covered.
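The subprocess coverage in the last bullet is plain environment inheritance, independent of Authsome; a small Python sketch demonstrating that a spawned child sees the same proxy variable:

```python
import os
import subprocess
import sys

# Simulate the proxy having set HTTPS_PROXY in the agent's environment
# (7998 is the proxy port used elsewhere in these docs).
env = dict(os.environ, HTTPS_PROXY="http://127.0.0.1:7998")

# A child process spawned with this environment inherits the variable,
# which is why tools Codex shells out to are also routed via the proxy.
out = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ['HTTPS_PROXY'])"],
    env=env,
    capture_output=True,
    text=True,
)
print(out.stdout.strip())  # → http://127.0.0.1:7998
```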
## Alternative: env export before launch
If Codex is running in an environment where the proxy CA cannot be trusted, fall back to env-var export. Re-run the eval after each token refresh, or use the proxy pattern above to avoid that step.
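A sketch of that fallback, assuming a hypothetical subcommand that prints `export` lines for the current tokens (check `uvx authsome --help` for the real name):

```shell
# Hypothetical export subcommand: print shell export lines for the
# current tokens and eval them into this shell, then launch Codex.
# Tokens go stale, so this must be re-run after every refresh.
eval "$(uvx authsome env)"
codex
```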
## Multiple accounts
To switch the account Codex uses, re-authenticate with `--force` first. See Multiple connections per provider.
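A sketch of the switch, where the subcommand and provider name are assumptions (only `--force` comes from the text above):

```shell
# Hypothetical subcommand and provider name: --force re-authenticates
# instead of reusing the stored connection, letting you swap from the
# work GitHub account to the personal one before launching Codex.
uvx authsome login github --force
uvx authsome run -- codex
```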
## Embedding the library
If you’re orchestrating Codex from a larger Python program and need explicit per-call control, drop below the proxy into the library instead of juggling `.env` files. See Python library.
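The library’s API isn’t reproduced here; as a shape-only sketch of what per-call control looks like, with `fetch_token` standing in for whatever call the real library exposes:

```python
import os


def fetch_token(provider: str) -> str:
    """Placeholder for the Authsome library call that returns a fresh
    access token for `provider`; here it just reads an env var."""
    return os.environ.get(f"{provider.upper()}_TOKEN", "demo-token")


def auth_headers(provider: str) -> dict[str, str]:
    # Per-call control: build the Authorization header explicitly for
    # each request instead of letting the proxy inject it.
    return {"Authorization": f"Bearer {fetch_token(provider)}"}


print(auth_headers("github"))
```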
## Troubleshooting
| Symptom | Fix |
|---|---|
| Codex starts but every API call fails with TLS errors | The mitmproxy CA isn’t trusted. See Proxy networking. |
| Codex doesn’t see the env var even though `authsome run` set it | Verify with `uvx authsome run -- env \| grep OPENAI`. |
| Authsome login hangs | Run `uvx authsome doctor` and check that port 7998 is available. |
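The port check in the last row needs no extra tools; a small Python sketch that reports whether anything is already listening on 7998:

```python
import socket


def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something already accepts connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) == 0


# True means another process holds the proxy port and login will hang.
print(port_in_use(7998))
```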
## What’s next
- Run agents with the proxy: the injection model in detail.
- OpenAI integration: set up the OpenAI key Codex will use.