How Arlopass works
import Callout from "../../../components/docs/Callout.astro";

Understanding the architecture — extension, SDK, bridge, and adapters

### The problem

Web applications increasingly want AI capabilities — chat, summarization, code generation. But connecting directly to AI providers from frontend code means embedding API keys in JavaScript bundles, managing credentials in localStorage, and trusting every dependency in your supply chain with those secrets. That's not viable.

### The wallet analogy

Arlopass works like MetaMask does for Ethereum. MetaMask sits between your web app and the blockchain — it holds your private keys, mediates every transaction, and asks for consent. Your app never touches the keys directly.

Arlopass does the same for AI. The browser extension holds API credentials, mediates every request, and the web application never sees a single key. Your app talks to the SDK. The SDK talks to the extension. The extension talks to providers.

<Callout type="info" title="Key insight">
The extension is the trust boundary. Everything on the web app side is untrusted. Everything on the extension side is controlled by the user.
</Callout>

### Architecture layers

The system has six layers, each with a clear responsibility:

1. **Web App** — your application, using the React SDK (`@arlopass/react`) or Web SDK (`@arlopass/web-sdk`). It calls hooks or client methods. It never manages credentials.
2. **SDK → Extension** — the SDK communicates via `window.arlopass`, an `ArlopassTransport` object injected by the extension's content script. Every message is wrapped in a canonical envelope with timestamps, nonces, and correlation IDs.
3. **Extension** — mediates consent, manages sessions, validates origins, attaches credentials, and enforces rate limits. This is where the user's API keys live.
4. **Bridge** — the extension routes requests through the bridge to the appropriate provider adapter. The bridge handles protocol translation and connection management.
5. **Vault** — an encrypted file owned by the bridge. It stores credentials (API keys), provider configurations, app connections, and token usage. The extension reads and writes vault data through native messages — it never stores credentials in `chrome.storage`. Because the vault lives on the filesystem, one setup works across Chrome, Edge, and Firefox.
6. **Providers** — Ollama, Claude, OpenAI, Gemini, Amazon Bedrock, Azure, Perplexity, and more. Each has an adapter that normalizes its API into the Arlopass protocol.

```tsx title="SDK usage at each layer"
// React SDK — the extension is detected automatically
import { ArlopassProvider } from "@arlopass/react";

function App() {
  return (
    <ArlopassProvider appId="my-app">
      <Chat />
    </ArlopassProvider>
  );
}

// Web SDK — you pass the injected transport explicitly
import { ArlopassClient } from "@arlopass/web-sdk";

const client = new ArlopassClient({ transport: window.arlopass });
await client.connect({ appId: "my-app" });
```

### Data flow

A typical Arlopass session follows a connect → discover → select → chat → disconnect flow. The state machine enforces this order — you can't chat before connecting, and you can't connect twice.

```ts title="Typical session flow"
// 1. Connect — establish a session with the extension
await client.connect({ appId: "my-app" });
// state: disconnected → connecting → connected

// 2. List providers — discover what's available
const providers = await client.listProviders();
// [{ providerId: "ollama", models: [...] }, { providerId: "claude", ... }]

// 3. Select a provider
await client.selectProvider({ providerId: "ollama", modelId: "llama3" });

// 4. Chat — send messages, stream responses
// (convo is a conversation handle obtained from the client; creation elided here)
for await (const event of convo.stream("Explain monads")) {
  // token-by-token streaming
}

// 5. Disconnect — clean up
await client.disconnect();
// state: connected → disconnected
```

### The security boundary

Credentials never touch the web application.
The SDK sends a request envelope through `window.arlopass`. The extension validates the envelope, then asks the bridge to read the API key from the vault and forward the request to the provider. The response comes back through the same channel — stripped of any credential material. The web app and extension popup never see the raw API key. Even if your web app is compromised, the attacker gets access to nothing beyond what the user has already consented to in the current session.

When a user adds a provider, the credential is encrypted and stored in the vault on the bridge — not in `chrome.storage` or any browser-accessible location. This means credentials are isolated from the browser process entirely, surviving extension updates and working across every browser the bridge is registered with.

### Session lifecycle

Calling `connect()` creates a session with a unique `sessionId`. All subsequent operations — listing providers, selecting a model, sending messages — are scoped to that session. Calling `disconnect()` ends it. The extension cleans up resources, and the state machine returns to `disconnected`.

If the connection degrades (e.g., the bridge goes down), the state machine transitions through `degraded` and `reconnecting` states automatically. If recovery fails, it lands in `failed`, which can either attempt reconnection or disconnect cleanly.

```ts title="Session and state machine"
// Sessions are scoped and ephemeral.
// connect() creates a new session with a unique sessionId.
// All operations are tied to that sessionId.
// disconnect() ends the session — the extension cleans up.

// The state machine enforces this:
//   disconnected → connecting → connected → disconnected
//                                   ↓
//                               degraded → reconnecting → connected
//                                               ↓
//                                            failed → reconnecting | disconnected
```

### Vault setup

On first use, the user sets up the vault with a master password or OS keychain integration.
The bridge creates the encrypted vault file and all subsequent credential storage flows through it. On subsequent browser opens, the vault is already unlocked for the duration of the bridge process — or prompts once for the master password if the bridge was restarted.

<Callout type="tip" title="Related">
See [Transport Model](/docs/concepts/transport-model) for how the SDK-to-extension communication works, or [Security Model](/docs/guides/security) for a deep dive into credential isolation and envelope security.
</Callout>
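The session states described earlier can also be modeled as a small transition table. This is a simplified sketch of the documented states — the SDK's actual internals are not shown here, and the legal-transition sets are assumptions drawn from the flow diagram above:

```ts title="Sketch: session state transitions"
// The documented session states, as a lookup table of legal transitions.
type SessionState =
  | "disconnected" | "connecting" | "connected"
  | "degraded" | "reconnecting" | "failed";

const transitions: Record<SessionState, SessionState[]> = {
  disconnected: ["connecting"],
  connecting:   ["connected"],
  connected:    ["disconnected", "degraded"],
  degraded:     ["reconnecting"],
  reconnecting: ["connected", "failed"],
  failed:       ["reconnecting", "disconnected"],
};

// A guard like this is why you can't chat before connecting,
// and can't connect twice.
function canTransition(from: SessionState, to: SessionState): boolean {
  return transitions[from].includes(to);
}

console.log(canTransition("disconnected", "connecting")); // legal: connect()
console.log(canTransition("connected", "connecting"));    // illegal: already connected
```

Modeling lifecycle rules as data rather than scattered `if` checks is a common pattern; it makes "can't connect twice" a single table lookup instead of a special case.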