ZeroTrace Companion

AI Settings

Provider, model, system prompt, MCP servers, and Companion's own AirLeak MCP server.

The AI settings section controls every aspect of Companion's AI assistant. It is split into five subsections: the master switch, provider configuration, the system prompt, MCP-client servers, and Companion's own AirLeak MCP server.

Master switch

| Setting | What it does |
| --- | --- |
| AI enabled | Master switch for the assistant. When off, the drawer is hidden and no AI calls fire. |

Off by default for fresh installs. Turn on after configuring a provider.

Provider

| Setting | What it does |
| --- | --- |
| Provider | One of: Ollama, LM Studio, OpenAI, OpenRouter, Anthropic, Custom |
| Base URL | Where the model server is reachable. Each provider has a default. |
| API key | Required for cloud providers. Stored in plaintext in the local settings file. |
| Model | The model name to use. Provider-specific; see providers. |
| System prompt | Optional custom instructions the model receives at the start of every conversation. |

When you switch provider, Companion remembers the previous provider's Base URL, API key, and model. Switching back restores the prior configuration.
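The retention behaviour can be pictured as a small per-provider cache. The sketch below is a hypothetical illustration only — the class and field names are assumptions, not Companion's actual internals:

```python
from dataclasses import dataclass, field

@dataclass
class ProviderConfig:
    base_url: str = ""
    api_key: str = ""
    model: str = ""

@dataclass
class AISettings:
    # One config per provider; switching never discards the others.
    configs: dict = field(default_factory=dict)

    def switch(self, provider: str) -> ProviderConfig:
        # Returns the stored config for this provider, creating an
        # empty one on first use. Switching back restores prior values.
        return self.configs.setdefault(provider.lower(), ProviderConfig())

s = AISettings()
s.switch("OpenAI").model = "gpt-4o-mini"   # illustrative model name
s.switch("Ollama").model = "llama3.1"
print(s.switch("OpenAI").model)  # gpt-4o-mini -- restored on switch-back
```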

For the per-provider setup walkthrough see setup.

API keys are stored in plaintext in Companion's local settings file. Anyone with access to your user account can read them. Treat the file the way you'd treat any local credential cache.
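One practical mitigation is to restrict the settings file to your own user. The snippet below demonstrates owner-only permissions on a temporary file; the real settings path varies by platform, so apply the same `chmod` to wherever your install keeps it:

```python
import os
import stat
import tempfile

# Demo on a throwaway file; substitute your actual settings path.
fd, path = tempfile.mkstemp()
os.close(fd)

os.chmod(path, stat.S_IRUSR | stat.S_IWUSR)  # owner read/write only (0o600)
print(oct(stat.S_IMODE(os.stat(path).st_mode)))  # 0o600

os.remove(path)
```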

System prompt

| Setting | What it does |
| --- | --- |
| System prompt | Optional custom instructions the model receives at the start of every conversation. |

The default system prompt is tuned for Companion's tool-calling and conversational use. Override it when you want:

  • A different tone or voice.
  • Specific domain knowledge to be assumed.
  • A non-English default response language.
  • Restricted scope (e.g., "only answer questions about wireless devices").

Leave blank to use the default.

For multilingual users: setting the system prompt to "Always respond in <your language>" is the easiest way to localise the assistant. Note that smaller models are typically less fluent in non-English languages.
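Combining the tips above, a custom system prompt might look like the following (wording is illustrative, not a recommended default):

```text
You are Companion's assistant. Always respond in French.
Only answer questions about wireless devices; if asked about
anything else, say the topic is out of scope.
```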

MCP servers (client)

This subsection lets you connect Companion to external Model Context Protocol servers. Each connected server's tools merge into the assistant's catalog.

For each server:

| Field | What it does |
| --- | --- |
| Name | Display name and dedupe key |
| Enabled | Toggle the server on / off |
| Transport | stdio (local subprocess) or http (HTTP endpoint) |
| Command + args | (stdio) Executable to launch and its arguments |
| Environment | (stdio) Environment variables in KEY=VALUE format |
| Working directory | (stdio) Optional working directory |
| URL | (http) The endpoint URL |
| Headers | (http) Optional HTTP headers (e.g. for bearer auth) |
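Put together, a stdio entry and an http entry might look like this. The server names, command, path, and URL are illustrative, and Companion's on-disk format may differ:

```json
{
  "mcpServers": [
    {
      "name": "filesystem",
      "enabled": true,
      "transport": "stdio",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/me/notes"],
      "env": { "LOG_LEVEL": "info" },
      "workingDirectory": "/home/me"
    },
    {
      "name": "internal-api",
      "enabled": true,
      "transport": "http",
      "url": "https://mcp.example.com/mcp",
      "headers": { "Authorization": "Bearer <token>" }
    }
  ]
}
```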

For each connected server, the live status shows:

  • Connected / Connecting / Disabled / Error: <reason>.
  • Server name and version (when connected).
  • Tool count and the list of tool names.

For full guidance see MCP client.

AirLeak MCP server

This subsection controls Companion's own MCP server — the one that exposes AirLeak data to external agents.

| Setting | What it does |
| --- | --- |
| Expose AirLeak as MCP | Master toggle |
| Listen address | Default :8765. Bind to localhost only or a network interface. |

When enabled, status shows:

  • Running — yes / no.
  • Address — the URL external agents connect to.
  • Snapshot age — how stale the underlying capture data is.

External agents (Claude Desktop, Cursor, the claude CLI, custom code) connect to this URL and get access to 16 AirLeak tools.
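As an example, a client that supports HTTP MCP servers could be pointed at the listen address with an entry along these lines. The exact schema depends on the client and its version, and the server name "airleak" is just a label:

```json
{
  "mcpServers": {
    "airleak": {
      "url": "http://localhost:8765"
    }
  }
}
```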

For full guidance see MCP server.

Conversation behaviour (read-only)

These behaviours are not user-configurable in the current release:

  • Conversation persistence — conversations are not stored across launches.
  • Tool-call confirmation — state-changing tools always prompt for confirmation.
  • Auto-context — the assistant has access to active session and connected device metadata.

The behaviours follow the privacy-first defaults documented in AI privacy.

Disabling AI

When the master switch is off:

  • The assistant drawer is not accessible (Ctrl+Shift+A does nothing).
  • No AI calls fire.
  • No connection to any model server is attempted.
  • MCP-client connections are torn down.
  • Companion's own AirLeak MCP server is also stopped (it depends on AI being enabled).
  • All other Companion features remain fully functional.

For sensitive deployments where AI is not appropriate, simply turn it off.

Diagnostics

Settings → AI shows a status badge per subsystem:

  • Provider — green / yellow / red for the active provider.
  • MCP client — per-server connected count.
  • MCP server — running / stopped indicator.

Click any badge for diagnostic details. Common issues and fixes are in troubleshooting.

What's not in AI settings

  • Per-conversation context window size — fixed by the model.
  • Temperature / sampling parameters — Companion uses sensible defaults; not exposed.
  • Custom built-in tool definitions — the built-in tool catalog is built into the application. To add new tools, configure an MCP server.
  • TLS / authentication for Companion's own MCP server — currently localhost-only by default; advanced users can place an HTTP gateway in front.
