I am finishing up a textbook on elementary information security. Unlike other books, this one targets freshmen and sophomores, and eschews memorization for problem-solving.
Sprinkled here and there are concepts we all should recognize as "basic principles" of information security: ideas that transcend programming, network design, and system administration. Now that I'm finished, here is a summary of the ones I covered. I've also noted how they compare to Saltzer and Schroeder's classic list from 1975 and, briefly, the NIST principles in SP800-14.
Here are the eight basic principles that survived three years of writing, review, and rewriting:
- Continuous Improvement - continuously assessing how well we achieve our objectives and making changes to improve our results.
- Least Privilege - providing people or other entities with the minimum number of privileges necessary to allow them to perform their role in the system.
- Defense in Depth - building a system with independent layers of security so that an attacker must defeat multiple independent security measures for the attack to succeed.
- Open Design - building a security mechanism whose design does not need to be secret.
- Chain of Control - ensuring that either trustworthy software is being executed, or that the software's behavior is restricted to enforce the intended security policy.
- Deny by Default - granting no accesses except those specifically established in security rules.
- Transitive Trust - recognizing that if A trusts B, and B trusts C, then A also trusts C.
- Separation of Duty - decomposing a critical task into separate elements performed by separate individuals or entities.
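Two of these principles, Least Privilege and Deny by Default, show up directly in everyday access-control code. Here is a minimal sketch of both; the roles, actions, and permission sets are invented for illustration:

```python
# A minimal access-control sketch illustrating Deny by Default and Least
# Privilege. The roles, actions, and permission sets are hypothetical.

# Each role is granted only the permissions its job requires (Least Privilege).
ROLE_PERMISSIONS = {
    "auditor": {"read_logs"},
    "operator": {"read_logs", "restart_service"},
}

def is_allowed(role: str, action: str) -> bool:
    """Permit an action only if a rule explicitly grants it (Deny by Default)."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("auditor", "read_logs"))        # True: explicitly granted
print(is_allowed("auditor", "restart_service"))  # False: no rule grants it
print(is_allowed("guest", "read_logs"))          # False: unknown roles get nothing
```

The key design choice is the last line of `is_allowed`: an unknown role or unlisted action falls through to an empty permission set, so anything not explicitly established in the rules is denied.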
I used two general guides for the topics covered in this textbook: the Information Security section of ACM's IT 2008 curriculum, and the CNSS curriculum standards required for Information Assurance Courseware certification, particularly NSTISSI 4011 for Information Security Professionals.
Saltzer and Schroeder
Many of us like to go back to the classic Saltzer and Schroeder paper "Protection of Information in Computer Systems" (copies are available from MIT and U. Va.), since they produced a list of "Basic Principles" that has inspired a lot of good work. I looked at their list from time to time as I wrote the book, but I never intentionally tried to incorporate it. Here is their list of principles, with a note on how I did - or did not - cover the same or similar ground:
- Economy of mechanism (NO) - This is the mantra, "Keep It Simple" as applied to security. While I agree with it 100%, I never found a good place to present and illustrate it as an elementary concept. Economy and simplicity seem lost in today's rush for improved functionality.
- Fail-safe defaults (YES) - I talk about "Deny by Default," which I've used more often in my own work. The phrase "fail-safe" does seem slightly more general.
- Complete mediation (NO) - I don't present it as a basic principle. As with economy of mechanism, I never found a good place to present and illustrate it as an elementary concept. I discuss it when I cover reference monitors.
- Open design (YES) - I even used the same phrasing.
- Separation of privilege (YES) - I talk about this as "Separation of Duty." These aren't exactly the same thing, but I've encountered "Duty" more often, and it seems more general.
- Least privilege (YES) - Their phrasing is still used today in the industry.
- Least common mechanism (YES, sort of) - This concept refers to the notion that mutually suspicious users or entities should rely on a minimum number of shared mechanisms. Every case of sharing increases the risk of undesired information flow, or of misplaced trust. This concept, or at least its inverse, is somewhat captured under Transitive Trust.
- Psychological acceptability (NO, not like that) - This isn't really a specific, pithy principle in the sense of the others, at least not as stated in the paper. The description actually covers two or three issues related to usability and user acceptance. I couldn't construct any basic principles from it, since typical systems offer too many obvious examples of such guidance being ignored.
There were also two principles that Saltzer and Schroeder noted as familiar in physical security and possibly relevant to information security:
- Work factor (YES, sort of) - This principle argues that security measures must at least make attacks more challenging even if they don't always work. I capture this with Defense in Depth.
- Compromise recording (YES, sort of) - This principle argues that there should be a mechanism to record security incidents even if the system can't proactively block their occurrence. I capture this implicitly through Continuous Improvement: if you monitor the system and track what happens, you could detect violations that you don't block. The textbook includes examples of that sort.
I was also going to compare this list with the principles published by NIST in SP800-14, "Generally Accepted Principles and Practices for Securing Information Technology Systems," but my principles sit at a completely different conceptual level. In any case, the textbook clearly covers the issues and concepts in the NIST publication, even if it doesn't grant their "Principles" so grand a title.