
Authorized Testing Is A Workflow Problem

May 9, 2026 · 6 min read · 1,183 words
OSINT · Authorized Testing · Workflow

The frame nobody is using

Most cybersecurity tooling marketing sounds the same. A list of features. A comparison table. A bullet point about AI. A claim about "the leading platform." If you sit through a few of these in a row, you cannot remember which was which by the end.

That is not because the products are bad. It is because the buyer has been trained, by the marketing pattern as much as by anything else, to compare on the wrong axis. The axis used in nearly every category — which vendor has more features — is a holdover from the era when authorized testing was a craft activity. One operator. One device. One engagement. Pick the device with the most features and the most cult brand around it.

That is not the world the buyer lives in any more.

What changed

Three things. Each one shifted the buying problem; together they changed the category.

NIS2, DORA, and the EU Cyber Resilience Act moved authorized testing from "nice to have" to "document or be fined." A cybersecurity team that runs an annual penetration test now has procurement-relevant evidence to produce — engagement reports, scope documentation, signed authorized-testing language. The deliverable is the audit trail, not the witty payload.

Schrems II and supply-chain incidents made data-residency a buying criterion. A security team that runs testing data through a US-jurisdiction cloud is now answering a question they were not asked five years ago: where does my testing data go, and who logs it?

And subscription fatigue has reached the developer-tooling adjacent categories. Buyers with budgets are increasingly impatient with recurring fees on capabilities that should have been a one-time purchase.

Each of those changes, on its own, is a small thing. Together, they reframe what professional means in this category. The professional buyer of 2026 is not the operator who has the most exotic features. It is the team lead who can hand the procurement officer a clean answer to three questions: where is my testing data, what are the recurring costs, and what do I have if the vendor disappears.

Why this is a workflow problem, not a tool problem

If you frame authorized testing as a tool problem, the answer is "buy the tool with the most features." If you frame it as a workflow problem, the answer is different.

A workflow has multiple steps. It has artefacts that move between steps. It has handoff points where one operator (or one shift, or one cohort) hands the engagement to the next. It has evidence that survives the engagement and gets to a client, an editor, or an auditor.

A tool can be impressive. A workflow has to be repeatable. The repeatability is what an institutional buyer is paying for. The features are what the operator gets to enjoy.

Most of the loudest vendors in this category are still selling tools. They put their best feature first; they compare against the alternative tool's features second; they treat the workflow as a downstream problem the customer will solve themselves.

The customer can no longer solve the workflow themselves at any scale that matters. A boot camp running ten cohorts a year. A university running a 30-student lab. A consultancy running engagements across three industries. An internal red team that hands work between four operators across two timezones. Every one of these has a workflow problem hiding behind the tool problem the marketing is selling.

What the workflow needs

Three things, as far as we can tell from working with operators and watching how engagements actually fall apart:

The workflow needs ownership. If a tool can be revoked between engagements — by a subscription lapse, by a cloud-account outage, by a vendor pivot — the workflow has a fragility the buyer cannot defend against. Operators who have arrived at an engagement only to discover their tooling had expired know exactly why ownership matters.

The workflow needs artefacts. Investigation profiles, engagement reports, scope letters, signed authorized-testing language. These are what handoffs run on; these are what the auditor looks at; these are what the client is paying for. A tool that produces no durable artefacts is a tool that has to be wrapped in someone else's documentation tax.

The workflow needs consistency across operators. Two new analysts hired into the same team should be able to run the same investigation pattern within a week. If onboarding takes a month, the workflow has a tax that compounds with every hire. The right tool collapses that tax; the wrong tool inflates it.

The category move

We are not arguing for a new category name. The category exists; it is called professional pentest and OSINT tooling. We are arguing for a different criterion inside it. The buyer who frames the decision around features picks a different vendor than the buyer who frames the decision around workflow. Both buyers are real; the workflow buyer is the one whose share of the budget is growing.

The vendors who win this decade in this category will be the ones who let the workflow buyer point at them and say: that one. That one solves the problem I actually have.

That is not a marketing claim. It is a structural one. It is settled by what the tooling does in a procurement review, in an ethics committee, in a five-year-out vendor-survival question, in a cohort handoff. The marketing follows; it does not lead.

Where to look

If you are evaluating tooling for an authorized testing function — your own team, your consultancy, your cohort — three questions sharpen the conversation faster than any feature list:

What testing data does this tool produce, and who logs it? (If the answer is "the vendor logs it," the workflow has a residency problem.)

What does my team have if this vendor disappears tomorrow? (If the answer is "we lose access to our scripts and our payload library," the workflow has a fragility problem.)

How does an engagement get handed from one operator to the next? (If the answer is "screenshots and Slack threads," the workflow has a handoff problem.)

These three questions are the workflow problem in compact form. The vendor that has good answers to all three is selling you a workflow stack. The vendor that has good answers to one or two is selling you an impressive tool with workflow problems. The marketing will not always make the difference clear; the procurement review will.
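The three questions above can be sketched as a checklist. This is a minimal illustration, not a real procurement framework — the field names (`vendor_logs_data`, `assets_survive_vendor`, `structured_handoff`) are hypothetical labels for the answers each question produces:

```python
from dataclasses import dataclass

@dataclass
class VendorAnswers:
    vendor_logs_data: bool       # Q1: does the vendor log your testing data?
    assets_survive_vendor: bool  # Q2: do your scripts and payload library survive the vendor?
    structured_handoff: bool     # Q3: are engagements handed off via durable artefacts?

def workflow_problems(a: VendorAnswers) -> list[str]:
    """Map each 'bad' answer to the workflow problem it implies."""
    problems = []
    if a.vendor_logs_data:
        problems.append("residency")
    if not a.assets_survive_vendor:
        problems.append("fragility")
    if not a.structured_handoff:
        problems.append("handoff")
    return problems

# Good answers to all three: a workflow stack.
print(workflow_problems(VendorAnswers(False, True, True)))   # []
# Good answers to one or two: an impressive tool with workflow problems.
print(workflow_problems(VendorAnswers(True, True, False)))   # ['residency', 'handoff']
```

An empty result is the "workflow stack" outcome; anything else names the problem the procurement review will eventually surface.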

Disclosure

We build ZeroTrace. We have a position in this argument. The argument is the one we are willing to be cited on, against any vendor in the category, including ours — the right axis is workflow, not features, and the vendor that handles the three questions above wins the institutional decision. If we ever stop handling all three, the same argument applies to us, and our customers should switch.

This is the front-and-back commitment. The workflow buyer recognises it. The feature buyer does not need to. Both are valid customers; each is asking a different question.

The point of writing this down is to surface the question.
