
Documentation Index

Fetch the complete documentation index at: https://authsome.agentr.dev/docs/llms.txt

Use this file to discover all available pages before exploring further.

The OpenAI Agents SDK reads `OPENAI_API_KEY` from the environment by default. Authsome supplies it either through the proxy (recommended) or by exporting it directly.

```shell
uvx authsome login openai
uvx authsome run -- python my_agent.py
```

The Agents SDK initializes with `OPENAI_API_KEY=authsome-proxy-managed`. Outbound requests to api.openai.com are intercepted and authenticated at the proxy layer; the agent's process never sees the real key.
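The interception can be pictured as a host-based header rewrite at the proxy. A toy sketch of the idea only, not Authsome's actual implementation (`REAL_KEYS` and `inject_auth` are illustrative names):

```python
# Held by the proxy process only; the agent never reads this mapping.
REAL_KEYS = {"api.openai.com": "sk-real-example"}

PLACEHOLDER = "authsome-proxy-managed"

def inject_auth(host: str, headers: dict) -> dict:
    """Swap the placeholder bearer token for the real key on managed hosts."""
    key = REAL_KEYS.get(host)
    if key and headers.get("Authorization") == f"Bearer {PLACEHOLDER}":
        return {**headers, "Authorization": f"Bearer {key}"}
    # Unmanaged hosts (or non-placeholder credentials) pass through untouched.
    return headers
```

The agent-side process only ever holds the placeholder; requests to any host outside the managed set are forwarded unchanged.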

Alternative: pass the key explicitly

```python
import asyncio

from authsome.server.dependencies import create_auth_service
from openai import AsyncOpenAI
from agents import Agent, Runner, set_default_openai_client

auth = create_auth_service()
client = AsyncOpenAI(api_key=auth.get_access_token("openai"))
set_default_openai_client(client)

async def main() -> None:
    agent = Agent(name="my-agent", instructions="...")
    result = await Runner.run(agent, "do the thing")
    print(result.final_output)

asyncio.run(main())
Use this when you need to control which connection the SDK uses, or when the proxy isn’t an option.

Multi-account workflows

```shell
uvx authsome login openai --connection personal
uvx authsome login openai --connection team
```

Through the proxy, set the default connection with `--force`. In code, pass `connection=`:

```python
key = auth.get_access_token("openai", connection="team")
client = AsyncOpenAI(api_key=key)
```

Tools that hit other providers

The Agents SDK supports tools that call external APIs, and each tool typically reads its own credential. Authsome covers both bundled providers (OpenAI, GitHub, etc.) and custom ones (see Custom providers). For an MCP-style tool that calls api.github.com, just log in once with `uvx authsome login github`; the proxy will inject the GitHub token automatically when the tool makes its request.
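As a concrete sketch, a tool can call the GitHub API with no Authorization header of its own and rely on the proxy to add one when run under `uvx authsome run`. The function names here are illustrative, not part of Authsome or the Agents SDK:

```python
import json
import urllib.request

GITHUB_API = "https://api.github.com"

def issues_url(owner: str, repo: str) -> str:
    """Build the endpoint URL for a repository's issue list."""
    return f"{GITHUB_API}/repos/{owner}/{repo}/issues"

def list_repo_issues(owner: str, repo: str) -> list:
    # Note: no Authorization header. Under the proxy, requests to
    # api.github.com get the GitHub token injected automatically.
    req = urllib.request.Request(
        issues_url(owner, repo),
        headers={"Accept": "application/vnd.github+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Because the credential lives at the proxy layer, the same tool code works unchanged whichever connection is active.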

What’s next

OpenAI integration

Set up the underlying OpenAI key.

Run agents with the proxy

The injection model in detail.