

LangChain reads provider credentials from environment variables or constructor arguments. The recommended setup: run the chain under authsome’s proxy and let LangChain’s SDKs read the placeholder env vars unchanged.

Run the whole chain under the proxy

uvx authsome login openai
uvx authsome login github
uvx authsome run -- python my_chain.py
LangChain's SDKs read the placeholder env vars (e.g. OPENAI_API_KEY=authsome-proxy-managed) and initialize normally. The proxy substitutes the real key on outbound HTTPS to api.openai.com, api.github.com, and other matched hosts. Your chain needs no code changes.

Tools that call out to providers

Many LangChain tools (GitHubToolkit, SerpAPIWrapper, BraveSearch, etc.) read their secret from an env var. With the proxy already wrapping the process, the existing wrappers just work:
from langchain_community.utilities import SerpAPIWrapper

# reads SERPAPI_API_KEY from env, which is the proxy placeholder
search = SerpAPIWrapper()
For tools that authsome doesn’t bundle, add a custom provider so the proxy can match by host.

Embedding the library

If you’re building a larger orchestrator around LangChain and need to pass tokens explicitly (different connection per call, or TLS-pinned SDKs that bypass the proxy), drop down to the library:
from authsome.server.dependencies import create_auth_service
from langchain_openai import ChatOpenAI
from langchain_community.tools.github.tool import GitHubAction

auth = create_auth_service()

llm = ChatOpenAI(
    api_key=auth.get_access_token("openai"),
    model="gpt-4o-mini",
)

github = GitHubAction(
    github_access_token=auth.get_access_token("github"),
)
Token refresh is transparent. If your tokens have short TTLs, construct the clients once per request rather than caching them across a long-running process.

Async clients

Async usage works identically: ChatOpenAI serves async callers through ainvoke and astream, and other async-capable wrappers behave the same. Authsome's auth layer is synchronous; if you do use the library, call get_access_token once at client construction and pass the value in.

Multi-account workflows

work = auth.get_access_token("github", connection="work")
personal = auth.get_access_token("github", connection="personal")
See Multiple connections per provider.

What’s next

Python library

The full library surface.

OpenAI integration

Set up the OpenAI key LangChain will use.