LangChain reads provider credentials from environment variables or constructor arguments. The recommended setup: run the chain under authsome’s proxy and let LangChain’s SDKs read the placeholder env vars unchanged.
Run the whole chain under the proxy
Set each provider’s env var to a placeholder (e.g. OPENAI_API_KEY=authsome-proxy-managed) and initialize the chain as usual. The proxy substitutes the real key on outbound HTTPS to api.openai.com, api.github.com, and other matched hosts, with no code changes to your chain.
Tools that call out to providers
Many LangChain tools (GitHubToolkit, SerpAPIWrapper, BraveSearch, etc.) read their secret from an env var. With the proxy already wrapping the process, the existing wrappers just work:
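For example, with the proxy in place every tool-specific secret can hold the same placeholder, and real keys never enter the Python process. The env var names below are illustrative; each wrapper documents its own:

```python
import os

# One placeholder for every tool secret; authsome's proxy rewrites it
# per matched host, so the wrappers read their env vars unchanged.
for var in ("SERPAPI_API_KEY", "BRAVE_SEARCH_API_KEY"):
    os.environ[var] = "authsome-proxy-managed"

# e.g. SerpAPIWrapper() would now initialize with no extra config.
```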
Embedding the library
If you’re building a larger orchestrator around LangChain and need to pass tokens explicitly (a different connection per call, or TLS-pinned SDKs that bypass the proxy), drop down to the library:

Async clients
AsyncChatOpenAI and other async wrappers work identically. Authsome’s auth layer is synchronous; if you do use the library, call get_access_token once at client construction and pass the value in.
Multi-account workflows
What’s next
- Python library: the full library surface.
- OpenAI integration: set up the OpenAI key LangChain will use.