Open question no. 2: cybersecurity costs as a tax
I got your sandwich right here. You want it on bread? Oh, that's extra.
Enterprise software usually is delivered in multiple flavors. By default, products deliver the user-facing features that a company’s marketing department feels potential customers will find valuable. The features that the security or privacy team wants to see, everything from database encryption to activity logs to user control of data usage, are typically an added cost rather than a native part of the product. … But why are these an extra cost? Are security features today’s seatbelts? We pay for breaches, breach insurance, vulnerability detection, and upgrade and patch disruption, all post-purchase. Let’s move those costs into coding and architecting secure products.
A few years ago I was at one of those “executive briefings” that vendors engage in. If you haven’t had the pleasure, executive briefings are quite nice: a firm invites you someplace, fêtes you with snacks and lunch and a nice dinner at an expensive restaurant, and spends an hour listening to you pontificate and pretending that your initiatives and campus issues are unique. All sandwiched between 3-4 extended presentations on some of their products. You get your ego stroked, they get to give you an extended sales pitch, they gather intelligence that helps them tailor their future marketing to you, and it’s a pleasant break from the typical workday.
At this briefing they were trying to sell us on their threat intelligence services, and as I was leaving I casually asked how many people worked in the threat intelligence unit. “500” was the answer. Let’s put this in perspective. My team at the time was ~25-30 people. When I worked at a smaller private university the entirety of the combined Library and IT staff was 110. A rough count of the non-central academic IT staff at a major R1 is around 1000. For those of us not working at mega-international corporations, 500 is a huge number; my being startled by this, however, had less to do with its scale than the degree to which I found it comforting. For far less than the cost of adding a single analyst to my team I could add the distilled threat intelligence of hundreds of analysts leveraging three orders of magnitude more network data than even my large university produced. It seemed to be a no-brainer[1].
In this post I’d like to tackle the theme of open question no. 2, that cybersecurity is treated like a tax on IT, rather than a feature. Ironically, I think this practice extends into the space of purely cybersecurity products and services as well. But before getting too deep into that, let’s take a brief detour through the question of why our corporate masters are so underwhelming.
Essentially, we need to talk about Cisco. Or VMware. Or Amazon. Or Microsoft. Or Microsoft. Or Microsoft. How is it that these giant companies, which command enormous pools of IQ and resources, keep failing so miserably? It’s tempting to defend them by arguing (and it is true) that their sheer size, and thus surface area, not only makes them targets for attacks but also exposes them to a kind of scrutiny that is unmatched in the industry. Hack VMware and you open 41% of the world to your agents and activities[2].
It’s not for want of resources. Cisco made $10-15 billion in 2024 (GAAP/non-GAAP). Microsoft made $88.136 billion net in 2024[3]. I see two issues here. First, is there something intrinsic in the nature of massive organizations that puts a cap on their capabilities[4]? Does complexity constrain production? Or is it more a question that needs to be put in the context of late-stage capitalism, with market demands, shareholders, and meme-driven leadership? Second, have these giant companies gotten to the point where putting additional resources into building rigorous products is simply ineffective? That is, will throwing more designers, architects, and engineers at the problem have no effect because the engineering-for-quality pipeline is basically at capacity? Or as I like to say, perhaps it’s a little from pile A and a little from pile B. (Pile being the operative word.)
This quote from the original Cyber Safety Review Board report on the MS 2023 Exchange compromise is telling, “The Board finds that this intrusion was preventable and should never have occurred. The Board also concludes that Microsoft’s security culture was inadequate and requires an overhaul, particularly in light of the company’s centrality in the technology ecosystem and the level of trust customers place in the company to protect their data and operations…. To drive the rapid cultural change that is needed within Microsoft, the Board believes that Microsoft’s customers would benefit from its CEO and Board of Directors directly focusing on the company’s security culture and developing and sharing publicly a plan with specific timelines to make fundamental, security-focused reforms across the company and its full suite of products.”
An inadequate security culture. Think about how you might respond to this comment if it were directed at your own organization. It’s not a call for more investment in tools or staff; it’s fundamentally a question of leadership. It calls for leadership to focus directly on the company’s security culture. I’ve always believed that a smart leader expresses their ‘focus’ through asking questions, because only the naïve or narcissistic leader believes their job is giving orders or issuing mandates. An organizational culture that is founded on thoughtfulness and ownership will respond to a leader’s questions and expectations[5].
One should ask: how does an organizational leader demonstrate what constitutes an organization’s culture? For a company with as broad a portfolio as Microsoft, what does it mean to foster an effective security culture? Sticking with the notion that the fish rots from the head, is it really the role of the CEO who is faced with a thousand competing demands for a thousand separate product lines to shape and focus on the security culture of the organization? Yes, yes it is.
As I’ve discussed before, cybersecurity is not a product or a deliverable, but a process that is woven throughout an organization, creating resilience in the culture and its products. This is one of the essential tensions for the cybersecurity office located within IT: cybersecurity (or I should say Information Assurance) is a component of every aspect of an organization, in no way limited to centrally provided technologies or services. As such, cybersecurity within even as large an organization as a Microsoft or Cisco requires the attention of the most senior leadership; attention that deliberately articulates and shapes the broader ‘security culture’.
The connection between this observation and my earlier point, that market demands and our modern meme-driven leadership model[6] may be to blame, is that in the largest organizations there simply is no incentive structure that rewards quality. Hell, given the apparent decoupling of company performance from CEO compensation, in many cases it’s unclear precisely what drives compensation other than the deeply embedded fallacy that quality leadership is tightly coupled to it[7]. Essentially, “We suck! We’d better pay the people in charge more!” is the engine driving leadership in much of modern America. Is it any wonder the inventor of the Cybertruck is the richest man in history?
The operative corollary here is my second supposition: most product production lines have simply exceeded their capacity to absorb and benefit from additional resources. I’ve always assumed, to stick with software development, that any specific methodology can only support so many developmental components. That is, only so many coders, so many designers, or so many QA activities are possible before the tasks are so decomposed as to be unworkable by more than one person, and the overhead of managing these elements eclipses the work done toward product improvement. Worse, as this level of decomposition is approached we destroy the ability to view any component in context. Could a million programmers and a million QA analysts work on a million-line program and have it actually function? It’s hard enough to split a check with 6 people. Or pick a movie to stream with two[8].
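This intuition about coordination overhead is an old one (it echoes Brooks’s law), and a toy model makes it concrete. The sketch below is my own illustration, not anyone’s real data: assume each person contributes fixed capacity while every pair of people imposes a small, fixed coordination cost, so overhead grows quadratically while raw capacity grows only linearly. The per-person and pair-cost constants are arbitrary assumptions chosen to make the curve visible.

```python
# Toy model of diminishing returns from team growth.
# Assumption (mine, not the post's): each person contributes fixed
# capacity, and every *pair* of people costs a little coordination
# overhead, so overhead grows as n*(n-1)/2 while raw capacity grows as n.

def net_capacity(n: int, per_person: float = 1.0, pair_cost: float = 0.01) -> float:
    """Net productive capacity of an n-person team under the toy model."""
    raw = n * per_person                    # linear gains from headcount
    overhead = pair_cost * n * (n - 1) / 2  # quadratic coordination cost
    return raw - overhead

if __name__ == "__main__":
    for n in (10, 50, 100, 150, 200):
        print(f"{n:>4} people -> net capacity {net_capacity(n):6.1f}")
```

With these made-up constants, net capacity peaks around a hundred contributors and then declines; past that point, adding people actively subtracts from the product, which is exactly the “pipeline at capacity” failure mode described above.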
Which drives me to believe that while many companies surely do under-resource the cybersecurity fabric of their products, the core problem these mega-companies face is not a lack of resources but a systemic cultural failure by their leadership. They continue to confuse the product-nature of cybersecurity as sold by their companies with the more fundamental process-nature of cybersecurity. They’ve confused chemistry with physics, to stick with an analogy from an earlier post. Thus we end up with fundamental failures such as we saw this week from Cisco[9].
I’d like to return now to the original topic of this post: why we suffer through the treatment of cybersecurity as a tax on ordinary IT. Let’s see if I can tie this to the previous diversion. By ‘tax on ordinary IT’ I mean two things. Primarily I think of products that are sold without all the elements necessary to ensure the product can function securely: full-featured DB encryption, sufficient logging to detect abuse or respond to an incident, or features that target specific attack types such as BEC anti-phishing functionality. In addition, we’re required to buy and maintain an entire suite of services to compensate for the lack of security-by-design in ordinary IT. The obvious example is the vulnerability management programs we all tool, staff, and deploy because of flawed software products, but it’s tempting to include tooling such as intrusion detection and prevention systems that are additional bumps in your network wire (despite our spending a fortune on the network infrastructure itself), or the endpoint suites we’re required to buy because operating systems lack mature native tools.
To some degree none of this is surprising or remarkable. We found a way to enable/disable a feature, so we’re going to keep it disabled until it’s paid for. We did an analysis and found that a feature contributed N dollars to the product, so we’ll add 2N to the retail price. Now frame this in the context of how it contributes to an organization’s security culture. Security is a product, only as valuable as what we can charge for it. A product doesn’t need to be secure; it needs to be a platform off of which we can charge for rigor. A network isn’t a secure ecosystem; it’s something intrinsically insecure until you buy additional features. A sandwich isn’t a sandwich until you pay extra for bread.
While I’m calling this ‘a tax’ in these posts, it would be more accurate to call it exploitation. It’s also worth pointing out that we should not point fingers at the people who design, make, and test these products. Many of the finest and most responsible engineers, coders, and architects I know work in these corporations and are brutally sincere in their work to improve and deliver these products. But to repeat what I said earlier, this is not a resource problem but a systemic cultural failure by their leadership. The message the leadership of these organizations is sending is unmistakable, and without any significant feedback loop in the system it’s not self-correcting. Indeed, as mentioned earlier, the incentives work against cultural improvement by rewarding the failing CEOs[10].
So in the final analysis, the question is what can we do about it? Obviously what we need is legislation that requires a base level of cybersecurity rigor and features in products. Why is anything permitted to be sold without full encryption of data at rest? Why are products allowed to make MFA optional? Why aren’t passphrases the default? Why are operational logs held hostage until the right tier of some subscription is purchased? Of course, the US has rejected any form of central planning or regulation, because, you know, “communism,” so this is less likely than ever.
But we can engage in the fight to denormalize these practices. As we negotiate agreements we need to insist on both compensation for vulnerabilities and the right to disclose any uncovered flaws. We can push back on the lie that a bundled service is expensive to support and so must be charged for separately. No more product license restrictions: if it’s built in, it should be enabled. Infrastructure represents an enormous sunk cost and comes with impossible change costs. Will anyone pivot away from Cisco due to last week’s announcement? Unlikely. But everyone should be calling their company reps and threatening to do so - not to negotiate a discount, but to apply pressure on the failure of corporate culture.
Ultimately, the “cybersecurity tax” is just a symptom of something larger: a technology economy that values novelty over durability, and optics over substance. Until leadership learns that resilience is not a feature but a foundation, we’ll keep rebuilding the same fragile architectures, one breach at a time. Maybe the market will never correct itself. Maybe the incentives are too warped, and the margins too seductive. But if that’s the case, then the least we can do is stop pretending this is normal. Call it what it is - exploitation - and start making that a reputational liability for the companies that profit from it. Culture changes when shame becomes more expensive than integrity.
1. This analysis ignores the obvious: without that extra analyst it was unlikely my team had the capacity to consume, integrate, and react to all this threat intelligence. As I said in an earlier post, “every introduction of new timesaving technology increases the staffing requirements of a team using it by 20%.”
3. Cisco’s CEO made $39.2M in total compensation in 2024, up 23% over 2023. Microsoft’s CEO made $79.1 million, up 63% over 2023.
4. It’s interesting to consider how the Federal Government handles this. While federal research labs exist, the bulk of research is done by giving grants to non-federal labs and individuals. That is, the government generally doesn’t let its bulk interfere with research; it outsources it to smaller and more nimble researchers. The challenge is largely one of creating just enough regulation around these grants to ensure they’re used properly and toward the goal they were directed. Of course, with the current attack on those researchers and research institutions, it appears we’re going to be stuck relying on the mega-corporations, who don’t really do fundamental research at scale. The long tail of this will be felt for decades after the collapse of the American experiment.
5. “Because I freak’n said so” should always be the last leadership card you play. Sometimes it is necessary, but it is brutally corrosive to trust, and a lack of trust destroys the scaffolding of any organization’s culture. Look, this issue is highly nuanced: balancing the bluntness of directives with the patient work of cultivating partnership with senior managers is one of the trickiest elements of leadership. But remember what got us here - the CSRB report doesn’t challenge the business direction of Microsoft, but the culture supporting it.
6. Which may be two sides of the same coin.
7. https://cooleypubco.com/2016/07/25/new-study-shows-inverse-correlation-between-ceo-pay-and-performance-over-the-long-term/. Note this is from 2016! The two HBR references in the sidebar are truly worth examining.
8. I can already hear some of my friends with CS backgrounds spinning up their emacs-based email clients to gatling-gun words at me: geez, we studied this problem in CS 101. I, however, have the benefit of having had a career in IT without any formal education in it.
9. It is tempting to wax poetic about the personality makeup and background of the modern CEO, and why it almost precludes success at managing culture, or at least cultural change, in their organizations. But that sounds like a separate future post.
10. Last year, because the market was up, CEO compensation also soared. No doubt product vulnerabilities and compromises disappeared. 🙄