HUMΛN
Engineering

One org, many Companions: the deployment model that powers context-aware embeds

HUMΛN Team · 8 min · Builders & platform engineers

The problem with one-size-fits-all chat widgets

A generic chat widget on a marketing site is different from a customer support Companion on a B2B portal. They need different KB scopes, different personalities (via prompt references), different branding, and critically — they need to know where they are. A Companion on the support portal should know the user is looking at ticket TKT-12345 "Login loop issue", not just "some page on the internet."

Most platforms solve this by asking developers to write bespoke backends for each deployment. HUMΛN doesn't.

What companion_deployments gives you

Each row in companion_deployments is a fully configured Companion instance:

-- One per deployment, not one per org
id              text   -- dep_abc123 — stable deploy ID
org_did         text   -- owner org
name            text   -- machine-readable slug
surface_label   text   -- 'customer-support', 'developer-portal', ...
display_name    text   -- widget title (default: 'HUMΛN')
target          text   -- agent URI
allowed_origins text[] -- CORS whitelist for the proxy
prompt_id       text   -- optional prompt registry reference
delegation_token text  -- scoped JWT — server-side only
token_expires_at timestamptz
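For reference, that row shape maps naturally onto a TypeScript type. This is an illustrative sketch derived from the schema above, not an official SDK type, and the `target` URI is a made-up example:

```typescript
// Illustrative shape of a companion_deployments row (field names taken from
// the schema above; a sketch, not an official HUMΛN SDK type).
interface CompanionDeployment {
  id: string;                // stable deploy ID, e.g. 'dep_abc123'
  org_did: string;           // owner org
  name: string;              // machine-readable slug
  surface_label: string;     // 'customer-support', 'developer-portal', ...
  display_name: string;      // widget title (default: 'HUMΛN')
  target: string;            // agent URI
  allowed_origins: string[]; // CORS whitelist for the proxy
  prompt_id?: string;        // optional prompt registry reference
  delegation_token: string;  // scoped JWT, server-side only
  token_expires_at: string;  // ISO timestamp
}

const exampleDeployment: CompanionDeployment = {
  id: 'dep_abc123',
  org_did: 'did:org:example',        // placeholder org DID
  name: 'support-portal',
  surface_label: 'customer-support',
  display_name: 'HUMΛN',
  target: 'agent://companion',       // placeholder agent URI
  allowed_origins: ['https://portal.yourcorp.com'],
  delegation_token: '<server-side JWT>',
  token_expires_at: '2026-01-01T00:00:00Z',
};
```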

When you create a deployment, HUMΛN automatically:

  1. Inserts the record
  2. Mints a scoped delegation token (companion:chat, kb:read:public, constrained to riskLevel: standard)
  3. Stores the token server-side — never in the browser
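In pseudo-TypeScript, those three steps look roughly like this. `createDeployment`, `mintDelegationToken`, and the Map-based store are hypothetical stand-ins for HUMΛN internals, not real APIs:

```typescript
// Sketch of the create-deployment flow; mintDelegationToken and the in-memory
// store are assumptions standing in for HUMΛN internals.
function mintDelegationToken(opts: { scopes: string[]; riskLevel: string }): string {
  // A real implementation mints a scoped JWT; this stub just encodes the scopes.
  return `jwt:${opts.scopes.join(',')}:${opts.riskLevel}`;
}

function createDeployment(
  db: Map<string, Record<string, unknown>>,
  orgDid: string,
  name: string,
  surfaceLabel: string,
): string {
  const id = `dep_${Math.random().toString(36).slice(2, 8)}`;
  // 1. Insert the record
  const record = { id, org_did: orgDid, name, surface_label: surfaceLabel };
  // 2. Mint a scoped delegation token
  const token = mintDelegationToken({
    scopes: ['companion:chat', 'kb:read:public'],
    riskLevel: 'standard',
  });
  // 3. Store the token server-side; it is never returned to the browser
  db.set(id, { ...record, delegation_token: token });
  return id; // only the deployment ID is safe to expose client-side
}
```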

When you ask for the embed snippet, you get back ready-to-paste HTML that wires up proxy mode automatically.

The proxy pattern: why tokens stay on the server

Direct mode (putting a delegation token in the HTML) works for low-risk public deployments, but for anything customer-facing you want the token to stay on your server:

Browser widget → POST /api/companion/ask { text, deployment_id, surface_context }
Your server    → validates Origin against allowed_origins
               → looks up companion_deployments record
               → POSTs to HumanOS with Authorization: Bearer {token}
HumanOS        → Companion Agent → response
Your server    → streams response back

The token never touches the browser. If someone scrapes your embed snippet, they get a deployment ID and a surface label — not the credentials.
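A minimal, framework-agnostic sketch of that proxy endpoint might look like this. The handler shape and `lookupDeployment` are assumptions about your server, not HUMΛN APIs; the key points are the Origin check and the server-held bearer token:

```typescript
// Sketch of the proxy endpoint: validate Origin, look up the deployment,
// forward with the server-held token. lookupDeployment is a stand-in for
// your own companion_deployments query.
type AskRequest = { text: string; deployment_id: string; surface_context: object };
type Deployment = { allowed_origins: string[]; delegation_token: string; target: string };

function isAllowedOrigin(origin: string | undefined, allowed: string[]): boolean {
  // Reject requests whose Origin header is missing or not on the whitelist
  return !!origin && allowed.includes(origin);
}

async function handleAsk(
  req: { origin?: string; body: AskRequest },
  lookupDeployment: (id: string) => Deployment,
): Promise<{ status: number; body: string }> {
  const dep = lookupDeployment(req.body.deployment_id);
  if (!isAllowedOrigin(req.origin, dep.allowed_origins)) {
    return { status: 403, body: 'origin not allowed' };
  }
  // Forward to HumanOS with the token; the browser never sees it
  const upstream = await fetch(dep.target, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${dep.delegation_token}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ text: req.body.text, surface_context: req.body.surface_context }),
  });
  return { status: upstream.status, body: await upstream.text() };
}
```

A production version would also stream the response back rather than buffering it, per the flow above.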

SurfaceContext: the Companion knows where it is

Every message sent from an embedded Companion includes a surface_context block:

{
  "surface": "customer-support",
  "page": "/support/tickets/TKT-12345",
  "page_title": "Login loop issue — Support Portal",
  "entity_type": "support_ticket",
  "entity_id": "TKT-12345",
  "entity_name": "Login loop issue",
  "deployment_id": "dep_abc123"
}

The Companion Agent receives this prepended to the user message:

[Context: Surface — customer-support, page: /support/tickets/TKT-12345,
viewing support_ticket 'Login loop issue' (id: TKT-12345), deployment: dep_abc123]
Can you summarise this ticket and suggest next steps?

Now the agent can answer questions that are grounded in what the user is looking at — not just KB docs in general.
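The bracketed prefix above can be assembled mechanically from the surface_context block. This is a sketch of the observed format, not HUMΛN source:

```typescript
// Assemble the bracketed context prefix from a surface_context block
// (a sketch matching the format shown above, not HUMΛN source code).
type SurfaceContext = {
  surface: string;
  page: string;
  entity_type?: string;
  entity_id?: string;
  entity_name?: string;
  deployment_id: string;
};

function contextPrefix(ctx: SurfaceContext): string {
  const parts = [`Surface — ${ctx.surface}`, `page: ${ctx.page}`];
  if (ctx.entity_type && ctx.entity_id) {
    // Entity fields are optional; include them only when the page has one
    parts.push(`viewing ${ctx.entity_type} '${ctx.entity_name}' (id: ${ctx.entity_id})`);
  }
  parts.push(`deployment: ${ctx.deployment_id}`);
  return `[Context: ${parts.join(', ')}]`;
}
```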

One org, many deployments

A single org might run:

Deployment         Surface            Prompt                  Allowed origins
marketing-site     marketing-site     (default)               builtwithhuman.com
developer-portal   developer-portal   portal-docs-prompt      docs.haio.run
customer-support   customer-support   support-agent-prompt    portal.yourcorp.com
internal-tools     admin-console      internal-brief-prompt   app.yourcorp.com

Each gets its own token, its own allowed origins, and its own surface_label that gives the Companion its grounding. Managing them takes seconds from the CLI:

human companion deployment list
human companion deployment create --name developer-portal --surface-label developer-portal
human companion deployment rotate developer-portal   # rotates token, revokes old grant

The embed snippet in 5 seconds

Create a deployment, get the snippet, paste it:

human companion deployment create --name support-portal --surface-label customer-support
human companion deployment snippet support-portal

Output:

<script src="https://api.haio.run/cdn/companion-widget.js"></script>
<script>
  HUMAN.Companion.init({
    agentsCallUrl: window.location.origin + '/api/companion/ask',
    buildAgentInput: () => ({
      deployment_id: 'dep_abc123',
      surface_context: {
        surface: 'customer-support',
        page: window.location.pathname,
        page_title: document.title,
      },
    }),
    ui: { theme: 'auto', position: 'bottom-right' },
  });
</script>

Add your proxy endpoint, paste the snippet, and your Companion is live — scoped to your org's KB, aware of what page it's on, and with a token that never leaves your server.

What's next

The deployment model is the foundation. On top of it:

  • Custom chrome — onExpand, onMinimize, onMessage, onAuthChange hooks let you wire any UI effect without forking the widget
  • Entity context — inject entity_type, entity_id, entity_name for grounded ticket/invoice/record conversations
  • Prompt registry — point prompt_id at a prompt in the registry for per-deployment personality and directives
  • KB scoping — future: per-deployment KB visibility policies
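For the entity-context piece, one way to wire it is to derive the entity fields inside buildAgentInput from the current URL. The /support/tickets/ route shape and the title convention here are assumptions about your app, not HUMΛN conventions:

```typescript
// Derive surface_context, including optional entity fields, from the page.
// The route pattern and 'Title — Site' convention are app-specific assumptions.
function buildSurfaceContext(pathname: string, title: string) {
  const ticket = pathname.match(/^\/support\/tickets\/([A-Z]+-\d+)$/);
  return {
    surface: 'customer-support',
    page: pathname,
    page_title: title,
    // Spread entity fields only when the page is a ticket view
    ...(ticket && {
      entity_type: 'support_ticket',
      entity_id: ticket[1],
      entity_name: title.split(' — ')[0],
    }),
  };
}
```

In the embed snippet, you would call this as `buildAgentInput: () => ({ deployment_id: '…', surface_context: buildSurfaceContext(window.location.pathname, document.title) })`.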

The architecture is sound. Every deployment is a first-class object in the system.