
OpenCode is an open-source AI coding agent. It runs as a CLI, calls LLMs over HTTPS, and shells out to local tools. Authsome plugs in through the same two patterns used throughout these docs: wrap the agent under authsome run --, or call the library directly from your own code.
uvx authsome login github
uvx authsome login openai
uvx authsome run -- opencode
OpenCode sees placeholder env vars (OPENAI_API_KEY=authsome-proxy-managed) but never the real key. Outbound HTTPS to matched provider hosts is authenticated by the proxy.
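From inside the wrapped process, the provider key in the environment is only the placeholder string, never the real credential. A minimal sketch of checking for this, assuming the placeholder value shown above (the key_is_proxy_managed helper is illustrative, not part of Authsome):

```python
import os

PLACEHOLDER = "authsome-proxy-managed"

def key_is_proxy_managed(name: str = "OPENAI_API_KEY") -> bool:
    # Under `authsome run`, the provider key env var holds only a placeholder;
    # the proxy attaches the real credential to outbound HTTPS instead.
    return os.environ.get(name) == PLACEHOLDER

# Simulate the environment the wrapper would set up:
os.environ["OPENAI_API_KEY"] = PLACEHOLDER
print(key_is_proxy_managed())  # True
```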

Embedding the library

If you’re orchestrating OpenCode from a larger Python program and need explicit per-call control, drop below the proxy:
from authsome.server.dependencies import create_auth_service

auth = create_auth_service()
key = auth.get_access_token("openai", connection="default")
See Python library.
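The token returned by get_access_token is a plain string, so per-call control means attaching it to requests yourself. A minimal sketch of that pattern, assuming a standard bearer-token header (the authorized_request helper is illustrative, not part of Authsome):

```python
from urllib.request import Request

def authorized_request(url: str, token: str) -> Request:
    # Attach the per-call token explicitly instead of relying on the proxy
    # to inject credentials on outbound HTTPS.
    return Request(url, headers={"Authorization": f"Bearer {token}"})

# In real use: token = auth.get_access_token("openai", connection="default")
req = authorized_request("https://api.openai.com/v1/models", "sk-example")
print(req.get_header("Authorization"))  # Bearer sk-example
```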

Multi-account

uvx authsome login github --connection personal
uvx authsome login github --connection work
uvx authsome login github --connection work --force
uvx authsome run -- opencode
The proxy uses each provider’s default connection. To switch which connection the proxy uses, set it as default with --force. See Multiple connections per provider.
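If you are scripting the switch, the two steps above can be built and run in sequence from Python. A sketch using the flags shown above (switch_commands is an illustrative helper, not an Authsome API):

```python
import subprocess

def switch_commands(provider: str, connection: str) -> list[list[str]]:
    # First make the chosen connection the provider's default (--force),
    # then launch OpenCode under the proxy wrapper.
    return [
        ["uvx", "authsome", "login", provider, "--connection", connection, "--force"],
        ["uvx", "authsome", "run", "--", "opencode"],
    ]

cmds = switch_commands("github", "work")
# for cmd in cmds:
#     subprocess.run(cmd, check=True)
```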

Troubleshooting

| Symptom | Fix |
| --- | --- |
| OPENAI_API_KEY rejected as invalid | OpenCode is running outside authsome run. Wrap it. |
| TLS errors against api.openai.com | Trust the mitmproxy CA. See Proxy networking. |

What’s next

Run agents with the proxy

Full proxy walkthrough.

Generic Python agent

The library pattern for any Python script.