How legacy security companies succeed
Be more AI-friendly
Disclaimer: Opinions expressed are solely my own and do not express the views or opinions of my employer or any other entities with which I am affiliated.

I’ve spent the last year writing about why I believe large cybersecurity companies are destined to fail. The narrative is familiar: they will be overtaken by AI-native startups with modern architectures, agent-friendly cores, and GTM motions that actually match the speed of an AI-transformed enterprise. These newcomers are designed for a growing, dynamic market, while the incumbents are fighting for scraps in a stagnating one, a recipe for high GTM costs and dwindling margins.
But failure isn't a foregone conclusion. If you are a legacy giant with thousands of customers and decades of data, do you just wait for the "software apocalypse" to claim you? Not necessarily. But the path to survival isn't just "adding AI features." It requires a fundamental transformation of what the product actually is.
The Palo Alto Networks route: Acquisitions as a strategic wedge
If you want to understand how an “older” company survives a paradigm shift, look at Palo Alto Networks. When the industry shifted to the cloud, PANW could easily have settled into life as a legacy firewall company. Instead, they took a high-risk, high-reward route: aggressive, strategic acquisitions and the enforcement of “platformization.”
Nikesh Arora and his team didn’t just buy companies; they integrated them into a unified vision. By acquiring players like RedLock and Twistlock early on, they built Prisma Cloud. More recently, their 2026 strategic moves, including the $25 billion acquisition of CyberArk to own the identity primitive and their deep integration with Chronosphere for cloud-native observability, show they understand that the next battle isn’t over “tools,” but over the infrastructure of the agentic era.
This strategy is risky. It only works if your company has either substantial cash flow from a legacy business (like their hardware firewalls) or the ability to raise significant debt. Most “second-tier” incumbents don’t have that luxury. Cybersecurity has become so competitive that operating margins have been thinned out. PANW succeeded where others failed because they moved past “stitching” products together. Most incumbents tried “platformization” by simply slapping a single UI over ten different acquired backends. In the AI era, this is a death sentence.
The “middleware” trap: Why legacy consolidation usually fails
An AI agent doesn’t care about a pretty dashboard; it cares about semantic integrity. If an agent queries an “identity” module and a “cloud security” module within the same legacy platform and gets conflicting schemas or mismatched data models, the agent breaks.
The “middleware trap” is the belief that you can hide technical debt behind a single UI. Incumbents must unify their data models at the API level. PANW’s “platformization” worked because they enforced a unified data lake (Cortex Data Lake) early on. To survive 2026, incumbents must stop being a collection of tools and start being a consistent data primitive.
For twenty years, security companies competed on who had the best “single pane of glass.” We built massive, complex dashboards designed to keep a human analyst staring at a screen for eight hours a day.
In the agentic era, the “single pane of glass” is the LLM Context Window.
AI agents are becoming the primary “users” of security software. According to Gartner’s 2026 Trend Report, 40% of enterprise applications will feature AI agents by year-end. These agents don’t want to log into a portal; they want to query a Model Context Protocol (MCP) server. Legacy giants shouldn’t be fighting to be the interface where the human makes a decision. Instead, they should win by becoming the “truth layer,” the high-fidelity context provider that an agent queries to understand the environment.
By exposing their features through MCP servers, as Datadog and Cloudflare did in early 2026, incumbents allow agents to “use” the tool directly. This turns a legacy liability (complexity) into an asset (depth). A feature-rich platform that is MCP-native offers more “levers” for an agent to pull than a lightweight startup.
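A toy illustration of that “levers for an agent to pull” idea, in plain Python rather than a real MCP SDK (the tool name and return shape below are invented for the sketch): the vendor registers named, described tools, and the agent discovers and invokes them directly, with no dashboard in the loop.

```python
# Toy tool registry mimicking the shape of what an MCP server exposes:
# named tools with descriptions that an agent can discover and call.
# (Real MCP servers speak the Model Context Protocol; this is a sketch.)
TOOLS: dict[str, dict] = {}

def tool(name: str, description: str):
    """Decorator that registers a function as an agent-callable tool."""
    def wrap(fn):
        TOOLS[name] = {"description": description, "fn": fn}
        return fn
    return wrap

@tool("get_open_alerts", "List open alerts for a given asset.")
def get_open_alerts(asset_id: str) -> list[dict]:
    # In a real product this would query the unified data lake.
    return [{"asset": asset_id, "severity": "high", "rule": "exposed-s3"}]

# The agent first discovers which levers exist...
assert "get_open_alerts" in TOOLS
# ...then pulls one directly.
alerts = TOOLS["get_open_alerts"]["fn"]("vm-42")
assert alerts[0]["severity"] == "high"
```

Every additional feature a platform exposes this way is one more tool in the registry, which is how depth becomes an asset rather than UI clutter.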
The death of the “enterprise feature” moat
Historically, legacy companies relied on a “feature-rich” platform as their moat. They built niche features that a startup couldn’t replicate. But AI is destroying this moat. Development cycles have compressed, and startups can now replicate legacy “enterprise features” in months using agentic coding.
Switching costs are also dropping. We are moving toward a world of Forward Deployed Engineers (FDEs) who help integrate custom AI pipelines. According to recent 2026 hiring data, job listings for FDEs have surged by 800%. If an FDE can use an agent to migrate your entire policy set to a new vendor in a weekend, your “stickiness” is gone.
This transition fundamentally changes the business model. Seat-based pricing is a relic. In an agentic world, “per-seat” usage is a tax on productivity.
Legacy companies must move to an infrastructure model: pricing based on API calls, tokens, or autonomous actions. This shift might actually work in their favor. Vendors can argue that the total ends up close to what the customer would have paid under seat-based pricing, but the cost now aligns with business outcomes rather than headcount.
The per-seat charge is an artifact of the 2010s SaaS era, which abstracted APIs away behind nice-looking UIs and, in the process, entrenched a problematic business model. With AI, customers can build their own interfaces and interact with the product in ways that make sense to them, not just in the ways the vendor has defined. That also makes the pricing fairer for both sides: pay only for what you use. Anyway, I digress.
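A back-of-the-envelope sketch of the “similar total, better alignment” claim; every number here is an illustrative assumption, not a real vendor price:

```python
# Compare an illustrative seat-based bill with a usage-based one.
SEAT_PRICE = 150.0          # $/analyst/month (assumed)
ANALYSTS = 20

PRICE_PER_ACTION = 0.002    # $ per autonomous agent action (assumed)
ACTIONS_PER_MONTH = 1_500_000

seat_bill = SEAT_PRICE * ANALYSTS
usage_bill = PRICE_PER_ACTION * ACTIONS_PER_MONTH

# The totals can land in the same range, but the usage bill scales with
# work the agents actually do, not with human headcount.
print(f"seat-based:  ${seat_bill:,.0f}/mo")   # $3,000/mo
print(f"usage-based: ${usage_bill:,.0f}/mo")  # $3,000/mo
```

Under these assumptions the bills match, but cutting analyst headcount in half no longer halves the vendor's revenue, and doubling agent activity doubles it, which is exactly the alignment the infrastructure model promises.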
This creates a new kind of stickiness: the incumbent becomes the “operating system” of security, while the startup is just a single app. Customers can wire up their own workflows without waiting on a solutions engineer.
The “service-as-software” model
Beyond MCP servers, there is a third path: leaning into the last mile of security. AI models are incredibly powerful, but in security, “90% accurate” is a failure. There is still a massive gap between an AI suggestion and a production-ready fix.
Legacy companies, with their existing professional services arms and massive historical datasets of customer tickets, are uniquely positioned to provide the human-in-the-loop services that make AI safe for the enterprise. This is the “service-as-software” model. By embedding their experts alongside their AI agents (the FDE model), incumbents can provide a level of “white-glove” security that a pure-software startup cannot match. You aren’t just selling a license; you’re selling the guaranteed outcome.
What happens to the “tool babysitters,” the analysts who mastered the “jank” of legacy UIs? We have to retrain them. I’ve argued before that we need more security generalists.
When the tool is no longer the skill, the professional focus shifts to problem-solving. Analysts design the scenarios, and the agents execute them. The mastery moves from the tool to the discipline. This shift will initially be painful, but it is better for the business long-term. It creates a segment of users who solve problems and thwart real threats rather than spending their time mastering a tool well enough to work around it.
The “software apocalypse” is a filter. Large cybersecurity companies that continue to hide behind “enterprise features” and “seat-based” moats will be automated out of existence.
But those that take the Palo Alto Networks route, that embrace the infrastructure pivot and become the reliable, agent-accessible primitives of the security world, will thrive. They will bridge the gap between their legacy data gravity and the new agentic workflows. They won’t just be the survivors of the AI era; they will be the foundation it’s built on.



