Sunday Security Maxim
Methodist Maxim: While vulnerabilities determine the methods of attack, most vulnerability or risk assessments will act as if the reverse were true. Compiled by Roger G. Johnston, Ph.D., CPP, Argonne National Laboratory
I am Spartacus Maxim: Most vulnerability or risk assessments will let the good guys (and the existing security infrastructure, hardware, and strategies) define the problem, in contrast to real-world security applications where the bad guys get to.
Catastrophic Maxim: Most organizations mistakenly think about and prepare for rare, catastrophic attacks (if they do so at all) in the same way as for minor security incidents.
Rigormortis Maxim: The greater the amount of rigor claimed or implied for a given security analysis, vulnerability assessment, risk management exercise, or security design, the less careful, clever, critical, imaginative, and realistic thought has gone into it.
Success Maxim: Most security programs “succeed” (in the sense of there being no apparent major security incidents) not on their merits but for one of these reasons: (1) the attack was surreptitious and has not yet been detected, (2) the attack was covered up by insiders afraid of retaliation and is not yet widely known, (3) the bad guys are currently inept but that will change, or (4) there are currently no bad guys interested in exploiting the vulnerabilities, either because other targets are more tempting or because bad guys are actually fairly rare.
Better to be Lucky than Good Maxim: Most of the time when security appears to be working, it’s because no adversary is currently prepared to attack.
Blind-Sided Maxim: Organizations will usually be totally unprepared for the security implications of new technology, and the first impulse will be to try to mindlessly ban it. Comment: This only increases the cynicism regular (non-security) employees have towards security.
Accountability Maxim #2: Organizations that talk a lot about holding people accountable for security will never have good security. Comment: If all you can do is threaten people, rather than develop and motivate good security practices, you will not get good results in the long term.
Accountability Maxim #1: Organizations that talk a lot about holding people accountable for security are talking about mindless retaliation, not a sophisticated approach to motivating good security practices by trying to understand human and organizational psychology, and the realities of the workplace.
Michener’s Maxim: We are never prepared for what we expect. Comment: From a quote by author James Michener (1907-1997). As an example, consider Hurricane Katrina.
Galileo’s Maxim: The more important the assets being guarded, or the more vulnerable the security program, the less willing its security managers will be to hear about vulnerabilities. Comment: The name of this maxim comes from the 1633 Inquisition where Church officials refused to look into Galileo’s telescope out of fear of what they might see.
Thursday Maxim: Organizations and security managers will tend to automatically invoke irrational or fanciful reasons for claiming that they are immune to any postulated or demonstrated attack. Comments: So named because if the attack or vulnerability was demonstrated on a Tuesday, it won’t be viewed as applicable on Thursday. Our favorite example of this maxim is when we made a video showing how to use GPS spoofing to hijack a truck that uses GPS tracking. In that video, the GPS antenna was shown attached to the side of the truck so that it could be easily seen on the video. After viewing the video, one security manager said it was all very interesting, but not relevant for their operations because their trucks had the antenna on the roof.
Just Walk It Off Maxim: Most organizations will become so focused on prevention (which is very difficult at best), that they fail to adequately plan for mitigating attacks, and for recovering when attacks occur.
Tabor’s Maxim #2 (Cost Maxim): Security is practically achieved by making the cost of obtaining or damaging an asset higher than the value of the asset itself. Comment: Note that “cost” isn’t necessarily measured in terms of dollars.
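The cost-versus-value comparison at the heart of this maxim can be sketched in a few lines of code. This is a minimal illustration, not anything from the original maxims; the function names and the idea of a single numeric "cost" score are assumptions made here for clarity (per the comment above, cost can stand in for effort, time, risk of capture, and so on, not just dollars):

```python
def is_practically_secure(attack_cost: float, asset_value: float) -> bool:
    """Tabor's cost framing (illustrative sketch): an asset is
    'practically secure' when a rational adversary's expected cost
    to obtain or damage it exceeds what the asset is worth to them.
    Both quantities are unitless scores, not necessarily dollars."""
    return attack_cost > asset_value

# A vault whose contents are worth 5 but cost 10 to crack deters
# a rational adversary; one that costs 3 to crack does not.
print(is_practically_secure(attack_cost=10.0, asset_value=5.0))  # True
print(is_practically_secure(attack_cost=3.0, asset_value=5.0))   # False
```

The sketch also hints at the maxim's limits: real adversaries are not always rational, and both "cost" and "value" are estimates made by the defender, not known quantities.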
Tabor’s Maxim #1 (Narcissism Maxim): Security is an illusory ideal created by people who have an overvalued sense of their own self-worth. Comment: This maxim is cynical even by our depressing standards, though that doesn’t make it wrong.
Redundancy/Orthogonality Maxim: When different security measures are thought of as redundant or “backups”, they typically are not. Comment: Redundancy is often mistakenly assumed because the disparate functions of the two security measures aren’t carefully thought through.
Depth, What Depth? Maxim: For any given security program, the amount of critical, skeptical, and intelligent thinking that has been undertaken is inversely proportional to how strongly the strategy of "Security in Depth" (layered security) is embraced.
Takes One to Know One Maxim: The fourth most common excuse for not fixing security vulnerabilities is that "our adversaries are too stupid and/or unresourceful to figure that out." Comment: Never underestimate your adversaries, or the lengths to which people will go to defeat security.
Hopeless Maxim: The third most common excuse for not fixing security vulnerabilities is that "all security devices, systems, and programs can be defeated". Comment: This maxim is typically expressed by the same person who initially invoked the Mermaid Maxim, when he/she is forced to acknowledge that the vulnerabilities actually exist because they’ve been demonstrated right in front of him/her.
Onion Maxim: The second most common excuse for not fixing security vulnerabilities is that "we have many layers of security", i.e., we rely on "Security in Depth". Comment: Security in Depth has its uses, but it should not be the knee-jerk response to difficult security challenges, nor an excuse to stop thinking and improving security, as it often is.