LlamaIndex connects LLMs to your data through loaders, indexes, and retrievers. Each of those typically needs a credential (an LLM API key or a data-source token). Run the script under authsome's proxy and the loaders just work.

## Documentation Index
Fetch the complete documentation index at: https://authsome.agentr.dev/docs/llms.txt
Use this file to discover all available pages before exploring further.
## Run the whole script under the proxy
Set `OPENAI_API_KEY=authsome-proxy-managed` and similar placeholders. The proxy injects the real value into outbound requests to matched provider hosts; no code changes are needed in your script.
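A minimal sketch of the setup, assuming a script named `query_docs.py` (the script name is illustrative):

```shell
# Export the placeholder; the proxy substitutes the real key on outbound
# requests to matched provider hosts (e.g. api.openai.com).
export OPENAI_API_KEY=authsome-proxy-managed

# The script itself only ever sees the placeholder:
echo "$OPENAI_API_KEY"

# Run unchanged LlamaIndex code under the proxy:
# uvx authsome run -- python query_docs.py
```

Because substitution happens at the network layer, the real key never appears in your process environment or logs.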
## Data-loader credentials
LlamaIndex’s data readers cover dozens of services (Notion, Confluence, Slack, Google Drive, etc.). For each one:

- Check `uvx authsome list` for the matching provider name.
- If bundled, log in: `uvx authsome login <provider>`.
- Run your script under `uvx authsome run --`. The reader picks the credential up from its standard env var.
## Embedding the library
If you’re building a larger orchestrator around LlamaIndex and need explicit per-call control, fetch credentials with `get_access_token`, and call it again if your process lives long enough to outlast a token’s TTL.
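For long-lived processes, the re-fetch can be sketched as a small cache around the token call. Everything below is illustrative: the fetch callable stands in for `get_access_token`, and the TTL value and refresh skew are assumptions:

```python
import time

class TokenCache:
    """Cache a token and re-fetch it shortly before its TTL expires."""

    def __init__(self, fetch, ttl_seconds: float, skew: float = 30.0):
        self._fetch = fetch          # e.g. lambda: get_access_token("notion")
        self._ttl = ttl_seconds
        self._skew = skew            # refresh this many seconds early
        self._token = None
        self._expires_at = 0.0

    def get(self) -> str:
        now = time.monotonic()
        if self._token is None or now >= self._expires_at - self._skew:
            self._token = self._fetch()
            self._expires_at = now + self._ttl
        return self._token

# Usage: the first call fetches, later calls within the TTL reuse the token.
cache = TokenCache(lambda: "tok-1", ttl_seconds=3600)
print(cache.get())  # → tok-1
```

This keeps per-call control (you decide exactly when a request uses which token) without hammering the auth layer on every call.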
## Multi-account
## What’s next
- **Python library**: `AuthService` and the auth-layer API.
- **Custom providers**: add a data-source provider authsome doesn’t ship.