AI Integration
Live in AirLeak Desktop & OSINT

Plug in any AI. Local-first, by default.

Pick OpenAI, Anthropic, Ollama, LMStudio, or a custom endpoint inside the app, or connect ChatGPT, Claude Desktop, or Cursor through the built-in MCP server. Built into AirLeak desktop and the OSINT app.

The Demo

Your favourite AI client. AirLeak's data.

Turn on the MCP server inside AirLeak and any MCP-capable client speaks to it. ChatGPT Desktop, Claude Desktop, Cursor — they connect, AirLeak hands over devices, alerts, sessions, OSINT entities.

Inspect AirLeak session status
MCP connected
Using AirLeak MCP, check the current session status. In a few sentences, tell me what's unusual.
Worked for 18s

Session is live in monitor mode on COM31, snapshot ~2s old. Detector raised a red tracking posture — driven by three approaching devices (a Samsung TV, an iPhone, a Windows PC), not by a persistent-tracker pattern.

Alert mix is unusual: 66 total — overwhelmingly nearby_phone_unlocked (46) and probe_flood (19). No unsafe Wi-Fi flagged. Reads more as a busy, device-dense RF environment than an obvious stalking case.

Show me details on an iPhone you see.
Worked for 16s

One live iPhone is 65:FD:FB:E6:B1:E5 from Apple, Inc. Very close — current RSSI −45 dBm, best −27 dBm, est. distance 0.4 m.

Classified as iphone with 83% confidence, observed 1,444 times this session, currently locked, randomised MAC (rand: true). Fingerprint 0AE43113 — no matching note in the known-device library yet.

Ask for follow-up changes

Real conversation. AirLeak's MCP returned the device classifications, RSSI values, and alert types — the AI wrote them up.

Providers

Five paths to a working AI.

Configure inside the app's settings. Switch any time. Local-first by default — cloud is right there if you want it.

Cloud
OpenAI
GPT-4o, GPT-4 Turbo, GPT-4.5 — anything OpenAI exposes through their Chat Completions API.
Cloud
Anthropic
Claude Opus, Sonnet, Haiku — direct via the Messages API. Tool-calling enabled out of the box.
Local
Ollama
Run Llama, Qwen, Mistral, Phi, Gemma, DeepSeek — anything in your Ollama library, on your hardware.
Local
LMStudio
Point AirLeak at your LMStudio local server. Same model variety as Ollama, GUI-managed.
BYO endpoint
Custom
Any OpenAI-compatible endpoint URL. Self-hosted vLLM, llama.cpp server, internal proxy, you name it.
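For the custom option, "OpenAI-compatible" means the server accepts the standard Chat Completions request shape. A minimal sketch of that payload, assuming a hypothetical self-hosted server URL and model name (substitute your own):

```python
import json

# Placeholder values -- assumptions, not AirLeak defaults.
BASE_URL = "http://localhost:8000/v1"  # e.g. a self-hosted vLLM server
MODEL = "llama-3.1-8b-instruct"

def build_chat_request(model: str, user_message: str) -> dict:
    """Build a Chat Completions payload in the shape any
    OpenAI-compatible endpoint must accept at POST {BASE_URL}/chat/completions."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
        "temperature": 0.2,
    }

payload = build_chat_request(MODEL, "Summarise the current alerts.")
print(json.dumps(payload, indent=2))
```

If your server answers that request with a standard `choices[0].message` response, AirLeak can treat it like any other provider.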
MCP Server

One toggle. Then any MCP client speaks AirLeak.

AirLeak ships with a Model Context Protocol server bound to localhost. Add the URL to your AI client of choice and the same tools used by the in-app assistant become available to it — list devices, query alerts, inspect sessions, pull OSINT entities.

  • Off by default — opt-in per session
  • Bound to 127.0.0.1, no LAN exposure
  • Tool surface mirrors the in-app assistant
  • Works with ChatGPT Desktop, Claude Desktop, Cursor, Continue, and any MCP-compliant client
~/.config/claude/mcp.json
Example
{
  "mcpServers": {
    "airleak": {
      "url": "http://127.0.0.1:7891/mcp",
      "transport": "http"
    }
  }
}
Tools exposed: list_devices · get_alerts · session_status · query_ble · osint_lookup
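Under the hood, MCP clients talk JSON-RPC 2.0 to that URL. A sketch of the message a client POSTs to invoke one of the listed tools; the empty-arguments call is an assumption, so check each tool's schema via `tools/list` first:

```python
import json

MCP_URL = "http://127.0.0.1:7891/mcp"  # matches the config example above

def tool_call(request_id: int, name: str, arguments: dict) -> dict:
    """Build a JSON-RPC 2.0 tools/call message per the MCP spec."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# Ask AirLeak for the live session state, as in the demo above.
msg = tool_call(1, "session_status", {})
print(json.dumps(msg))
```

Your MCP client builds and sends these for you; the sketch only shows what crosses localhost.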
Privacy

Your data only leaves the device if you choose.

AI integration that respects the rest of the ZeroTrace stack: no logs, no telemetry, no surprise.

Local-first by default
AirLeak ships configured for Ollama on your machine. Your captures, alerts, and queries never leave the hardware unless you explicitly point a cloud provider at them.
No vendor proxy
When you use OpenAI or Anthropic, the request goes straight from the AirLeak app to the provider's API — same as if you'd called it directly. ZeroTrace doesn't touch it.
You own the prompt
System prompts, tool definitions, and model parameters are inspectable in the app. Audit what's being sent before you send it.
MCP is opt-in
The MCP server is off by default. Turn it on when you want AirLeak exposed to your AI client — turn it off the moment you don't.

Ship-ready AI integration. No subscription, no cloud lock-in.

Included with AirLeak desktop and the OSINT app. Bring whatever model you want.
