Misaligned Incentives and Information Insecurity
Security professionals recognize that technology-based solutions are a necessary but insufficient response to the hard problem of information security. Yet the implied assumption behind much security practice is that the underlying problem is technology. The late Ross Anderson argued that the root cause of many security problems is not technology but perverse incentives. The White House recently noted that poor incentives create a national security threat [1]. A closer look at misaligned incentives supports a better understanding of our current security problems and suggests better ways to mitigate them.
Ross Anderson was a well-known security academic, perhaps best known for his seminal book on security engineering. He was among a group of thought leaders who understood that examining the why of the security problem is an important antecedent to diagnosing the what (the manifested security problem) and prescribing how to treat it (the mitigation strategy). He was also early to recognize that the underlying issues were often not technological but microeconomic. His 2001 essay Why Information Security Is Hard — An Economic Perspective was an early contribution to the field of security economics [2].
Microeconomics is fundamentally about making optimal decisions under conditions of uncertainty and constraint. Several core concepts follow from that framing. People and firms typically face trade-offs rather than pure solutions, and a rational actor makes decisions about those trade-offs at the margin, weighing marginal benefit against marginal cost. Those decisions also, in many cases, carry unintended consequences. Another ubiquitous economic idea is that incentives matter: people, organizations, and markets respond to them.
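To make the marginal reasoning concrete, here is a minimal sketch in Python, using entirely hypothetical numbers, of how a rational firm might size a security budget: keep spending while each extra dollar removes more than a dollar of expected breach loss, and stop at the margin where it no longer does.

```python
# A minimal sketch of marginal-benefit vs. marginal-cost reasoning for
# security spending. All numbers are hypothetical illustrations.

def expected_loss(spend: float) -> float:
    """Expected annual breach loss: falls as spending rises,
    but with diminishing returns (a common stylized assumption)."""
    baseline_loss = 1_000_000  # expected loss with zero security spend
    return baseline_loss / (1 + spend / 50_000)

def optimal_spend(step: float = 1_000) -> float:
    """Increase spend while each extra step removes more expected
    loss than it costs (marginal benefit > marginal cost)."""
    spend = 0.0
    while True:
        marginal_benefit = expected_loss(spend) - expected_loss(spend + step)
        if marginal_benefit <= step:  # next step no longer pays for itself
            return spend
        spend += step

if __name__ == "__main__":
    s = optimal_spend()
    print(f"Rational spend: ~${s:,.0f}; "
          f"residual expected loss: ${expected_loss(s):,.0f}")
```

Notice that the rational stopping point still leaves substantial expected loss on the table; that residual is the trade-off, not a flaw in the reasoning.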
Perverse Incentives
Incentives are the underlying arrangements that influence the behavior of people and organizations. Bad incentives inhibit optimal outcomes, fuel dysfunction, and undermine good technology. They typically are not resolved by adding more technology to the security stack. They can, however, provide tremendous insight into why things are the way they are, and perhaps into what better governance might look like.
Misaligned Security Incentives
1. Negative Externalities — An externality is a side effect. A negative externality is the “imposition of a cost on a party as an indirect effect of the actions of another party. Negative externalities arise when one party, such as a business, makes another party worse off, yet does not bear the costs from doing so.” [3] For example, a company that collects and monetizes personal information on individuals may also inadvertently leak that information, increasing the risk of identity theft. While there may be subsequent mechanisms to place some cost on the data aggregator, there is no way to eliminate the cost placed on the party whose sensitive personal information is leaked.
2. Tragedy of the Commons — When individuals have uncontrolled access to a public resource, there is a propensity to act in their own individual interests and not in the long-term interest of the resource [4]. The incentive of individual benefit outweighs the interest in protecting the resource, and the result is usually a diminished common good. In the digital age, the internet became an information superhighway for almost everyone and enabled great financial gain for many. But as everyone acted in their own interests, there was little incentive to prevent the emerging cybersecurity risks to economic prosperity and social stability (a toy model of this dynamic appears after this list).
3. Transaction Costs — Some costs are less about the goods or services being purchased and more about the effort and expenditure of executing the purchase (i.e., searching for and researching a product, negotiating a deal, and then enforcing the terms of the agreement). In practice, there are often substantial costs associated with switching technologies, including training, the effort required to implement the technical change, and in some cases the replacement of equipment that still has economic value. This creates a vendor lock-in advantage for the incumbent and can substantially influence decisions on security tooling. It also creates an economic incentive not to make a change, even though, absent the transaction costs, the change would clearly be the optimal choice (see the second sketch after this list).
4. Winner-Take-All Market Structures — In some markets, there are remarkable advantages associated with economies of scale, network effects, and shaping the playing field by establishing the features and standards of a product category. In some cases, such as operating systems or search engines, that position can then be leveraged to lock in market advantage. This creates a strong incentive to race for market share (and ideally dominance) ahead of competitors rather than invest in more robust security features. This is particularly true in digital markets, where the marginal cost of additional units approaches zero.
5. Hidden Effort/Attribute Problems — When one party in a transaction has more complete, accurate, and relevant information than the other, the result can be sub-optimal decisions. This information asymmetry takes different forms, including hidden effort problems and hidden attribute problems. For example, proponents of open systems argue that open software allows many eyeballs to examine the code and identify security flaws (in addition to avoiding vendor lock-in). In theory, true. In practice, a hidden effort problem makes it difficult to determine whether that examination actually happens, and there is growing evidence that vulnerabilities frequently go unmitigated [5]. Similarly, every piece of software has flaws, and companies are not incentivized to disclose every known vulnerability, as doing so would diminish the value proposition. This is a hidden attribute problem, and the buyer's lack of full information can lead to sub-optimal decisions.
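To see the tragedy-of-the-commons logic from the second item in miniature, here is a toy Python model with entirely hypothetical payoffs. Each of ten firms either pays for security hygiene or free-rides; every free-rider erodes a shared resource (think of it as aggregate trust in the ecosystem) whose value all firms share.

```python
# A toy tragedy-of-the-commons model. All payoffs are hypothetical.
# Each of N firms either pays for security hygiene or free-rides;
# every free-rider destroys DAMAGE units of a shared resource whose
# value is split evenly across all N firms.

N = 10             # number of firms
HYGIENE_COST = 5   # cost to one firm of maintaining hygiene
DAMAGE = 30        # shared value destroyed per free-riding firm
BASE_VALUE = 100   # each firm's share of a healthy commons

def payoff(i_invest: bool, others_investing: int) -> float:
    """One firm's payoff given its choice and how many others invest."""
    free_riders = (N - 1 - others_investing) + (0 if i_invest else 1)
    share = BASE_VALUE - DAMAGE * free_riders / N  # per-firm share left
    return share - (HYGIENE_COST if i_invest else 0)

# Free-riding pays the individual more no matter what the others do...
for others in (0, N - 1):
    print(f"{others} others invest: "
          f"invest={payoff(True, others):.0f}, "
          f"free-ride={payoff(False, others):.0f}")

# ...yet all-invest beats all-free-ride for every firm collectively.
print(f"all invest: {payoff(True, N - 1):.0f} each; "
      f"none invest: {payoff(False, 0):.0f} each")
```

Free-riding dominates for each individual firm regardless of what the others do, yet everyone ends up worse off when all firms follow that individually rational logic. That gap is exactly the misaligned incentive.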
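And here is the switching decision from the third item, again with hypothetical figures: the challenger product is cheaper to run and, let us assume, better, yet once the one-time transaction costs of switching are counted, staying with the incumbent is the rational choice over the planning horizon.

```python
# Hypothetical figures for an incumbent security tool vs. a cheaper
# challenger, over a fixed planning horizon. Transaction costs
# (training, migration effort, written-off equipment) hit only the
# switching option.

YEARS = 3
incumbent_annual = 120_000   # incumbent license + operations per year
challenger_annual = 95_000   # challenger is genuinely cheaper to run
switching_cost = 150_000     # one-time training, migration, write-offs

stay = incumbent_annual * YEARS                      # 360,000
switch = challenger_annual * YEARS + switching_cost  # 435,000

print(f"stay with incumbent: ${stay:,}")
print(f"switch to challenger: ${switch:,}")
print("rational choice:", "stay" if stay < switch else "switch")
```

Absent the 150,000 of transaction costs, switching (285,000 versus 360,000 over three years) would clearly win. The transaction costs alone flip the decision and hand the incumbent its lock-in advantage.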
Parting Thoughts (and more on Ross Anderson)
These five examples of misaligned incentives are not mutually exclusive and certainly not collectively exhaustive. They demonstrate that incentives matter, and that misaligned incentives are counterproductive to good information security. The examples are a small part of the growing body of knowledge in security economics.
We can thank, in no small part, Ross Anderson. The world of security lost a true thought leader on March 28th, 2024. Bruce Schneier’s thoughtful words provide insight into the breadth and depth of Ross’s contributions and the deep impact he made during his lifetime. Others have remembered him as a “pragmatic visionary”, “famed author”, “security economics pioneer”, and “venerable computer scientist and information security expert”. All such accolades are true.
[1] In a recent example, the White House cyber policy director called out how the lack of proper incentives led Microsoft down an undesirable path with national security implications. https://www.theregister.com/AMP/2024/04/21/microsoft_national_security_risk/
[2] The field of security economics is relatively new and traces back, in part, to Ross Anderson's seminal 2001 essay Why Information Security Is Hard — An Economic Perspective.
[3] https://www.britannica.com/topic/negative-externality
[4] The Tragedy of the Commons is often credited to William Forster Lloyd, who in 1833 explained the overuse and relatively poor condition of shared pastures. However, recognition of the concept goes back to (at least) Aristotle.
[5] Bruce Schneier makes this point in "Open Source Does Not Equal Secure." Indeed, recent open-source vulnerabilities have been widespread and substantial.