Design principles help students learn the essentials of technical design tasks. Students apply simple-sounding rules to achieve desired design features. The trick is to present students with design principles they can see in real life, understand, and apply in their own efforts. Here are my observations on how to write design principles to teach technical design.
Many cybersecurity textbooks discuss security design principles. Some present classical lists, like that in a famous tutorial “The Protection of Information in Computer Systems” (Saltzer and Schroeder, 1975). Others, like my Elementary Information Security (Smith, 2012), develop a custom list of basic security principles. Lists also appear in other contexts, like UI design.

Developers have occasionally tried to make design principles part of standardized development processes. While I haven’t been part of the software development community for a long time, I’m skeptical of their long-term success there.
As a software engineering instructor (now retired), I believe these lists of principles are best used in classrooms. The principles are rarely applied uniformly in practice, but they do highlight fundamental design issues.
I wrote a review of security design principles several years ago. Part of the review identified some “design principles for design principles,” focusing on principles used for education. A friend pointed those out to me a while back and noted that they were interesting in and of themselves (thanks, Tim!).
Four Principles for Design Principles
- Use Memorable Phrases
- Reflect the State of the Practice
- Introduce When Significant
- Use Repeatedly in Teaching
The following discussion uses the nine security design principles in my textbook Elementary Information Security as examples. While the details of those principles aren’t required to follow the discussion, I provide a summary of the nine principles at the end of this article.
Use Memorable Phrases
We need to coin a memorable phrase to describe each design principle. The phrase should embody its essentials as briefly as possible. Here are the names of the nine principles used in Elementary Information Security:
- Continuous Improvement
- Least Privilege
- Defense in Depth
- Open Design
- Chain of Control
- Deny by Default
- Transitive Trust
- Trust, but Verify (added to the Second Edition)
- Separation of Duty
Most of these phrases should sound familiar to cybersecurity professionals. “Chain of Control” isn’t traditionally stated as a design principle, but it highlights typical behavior in computer viruses and other malware.
Reflect the State of the Practice
Each principle should reflect the current state of the practice, and not simply a “nice to have” property. Otherwise there won’t be useful examples of the principles in existing systems.
For example, Saltzer and Schroeder’s 1975 list included “Complete Mediation,” which means that the system should fully assess access rights every time a component attempts to access a resource. This has proven impractical in layered networking environments. When we apply access control at a lower layer, we lack the information needed to mediate access restrictions at higher layers. Packet-level filtering can’t assess message-level access controls. No single point in the system can perform complete mediation, so it isn’t a useful security principle.
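To make the layering problem concrete, here is a minimal Python sketch (all names and rule formats are invented for illustration). A packet-level filter sees only addresses and ports, while a message-level rule needs the authenticated user’s role, so no single check point has enough context to mediate both:

```python
def packet_filter(packet):
    """Network-layer check: only addresses and ports are visible here."""
    return packet["dst_port"] in {80, 443}   # allow web traffic

def app_access_check(request, user):
    """Message-level check: needs the authenticated user's role,
    which the packet filter never sees."""
    if request["path"].startswith("/delete"):
        return user["role"] == "admin"
    return True

packet = {"src": "10.0.0.5", "dst_port": 443}
request = {"path": "/delete/records"}
user = {"name": "alice", "role": "staff"}

print(packet_filter(packet))            # True: the packet looks fine
print(app_access_check(request, user))  # False: denied at the higher layer
```

The lower-layer check passes while the higher-layer check fails, which is exactly why neither layer alone can perform complete mediation.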
Useful security principles also help us critique publicized security failures. Real systems often reflect these principles, and we can frequently describe a successful breach as a failure to apply one or more of them.
It is often impossible to apply a specific security principle everywhere in every context. Practical systems may use a principle like “Defense in Depth” to defend a component that lacks sufficient “Separation of Duty,” or vice versa.
Introduce When Significant
Each principle is introduced when it plays a significant role in a new topic, and no sooner. Students aren’t required to learn and remember principles that don’t apply to the current topics.
In Elementary Information Security, the principles were introduced as follows:
- Chapter 1: Principles 1, 2, and 3
- Chapter 2: Principles 4 and 5
- Chapter 3: Principles 6 and 7
- Chapter 4: Principle 8
- Chapter 8: Principle 9
Use Repeatedly in Teaching
Each principle had to be important enough to appear repeatedly as new material was covered. Here are two examples:
The second principle, “Least Privilege,” appears first in Chapter 1, where we use physical restrictions (walls, containers, doors, and locks) to restrict access to resources. In Chapter 2, the examples rely on protection mechanisms implemented in the central processing unit to keep trustworthy software separate from less-trustworthy software. In Chapters 3, 4, and 5, the operating system enforces access restrictions on processes to files or other system resources. Chapter 8 applies the principle to cryptographic keys.
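The file-level version of “Least Privilege” can be sketched in a few lines of Python (the scratch file stands in for a sensitive resource; the function name is hypothetical): a task that only needs to read a resource is given a read-only handle, so a write attempt fails by construction rather than by policy enforcement after the fact.

```python
import io
import os
import tempfile

def read_report(path):
    """Open with read-only access: the minimum privilege for this task."""
    with open(path, "r") as f:
        data = f.read()
        try:
            f.write("oops")              # not a granted privilege
        except io.UnsupportedOperation:
            return data, "write denied"
    return data, "write allowed"

# Scratch file standing in for a sensitive resource.
fd, path = tempfile.mkstemp()
os.write(fd, b"secret")
os.close(fd)

print(read_report(path))   # ('secret', 'write denied')
os.remove(path)
```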
Chapter 3 introduces the seventh principle, “Transitive Trust,” as it applies to vulnerable programs and malware. Chapter 8 applies the principle to shared encryption keys via crypto nets and webs of trust. Chapter 11 illustrates it on the Internet.
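One way to frame “Transitive Trust” for students is as reachability in a trust graph: if A trusts B and B trusts C, then A effectively trusts C whether it intended to or not. A minimal sketch (the graph itself is invented for illustration):

```python
# Who each party trusts directly; transitive trust is everything
# reachable from a starting point.
TRUSTS = {
    "A": {"B"},
    "B": {"C"},
    "C": set(),
}

def trusted_by(subject):
    """Everyone the subject trusts, directly or transitively."""
    seen, stack = set(), list(TRUSTS.get(subject, ()))
    while stack:
        peer = stack.pop()
        if peer not in seen:
            seen.add(peer)
            stack.extend(TRUSTS.get(peer, ()))
    return seen

print(sorted(trusted_by("A")))  # ['B', 'C']: A inherits B's trust in C
```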
The Textbook’s Nine Principles
Here’s a summary of the nine cybersecurity principles introduced in Elementary Information Security:
- Continuous Improvement – continuously assess how well we achieve our objectives and make changes to improve our results. Modern standards for information security management systems, like ISO 27001, are based on continuous improvement cycles. Such a process also implicitly incorporates compromise recording from 1975 and “design for iteration” from 2009. Introduced in Chapter 1, along with a basic six-step security process to use for textbook examples and exercises.
- Least Privilege – provide people or other entities with the minimum number of privileges necessary to allow them to perform their role in the system. This literally repeats one of the 1975 principles. Introduced in Chapter 1.
- Defense in Depth – build a system with independent layers of security so that an attacker must defeat multiple independent security measures for the attack to succeed. This echoes “least common mechanism” but seeks to address a separate problem. Defense in depth is also a well-known alternative for stating NIST’s Principle 16. Introduced in Chapter 1.
- Open Design – build a security mechanism whose design does not need to be secret. This also repeats a 1975 principle. Introduced in Chapter 2.
- Chain of Control – ensure that either trustworthy software is being executed, or that the software’s behavior is restricted to enforce the intended security policy. This is an analogy to the “chain of custody” concept in which evidence must always be held by a trustworthy party or be physically secured. A malware infection succeeds if it can redirect the CPU to execute its code with enough privileges to embed itself in the computer and spread. Introduced in Chapter 2.
- Deny by Default – grant no accesses except those specifically established in security rules. This is a more-specific variant of Saltzer and Schroeder’s “fail safe defaults” that focuses on access control. The original statement is less specific, so it also applies to safety and control problems. Introduced in Chapter 3.
- Transitive Trust – If A trusts B, and B trusts C, then A also trusts C. In a sense this is an inverted statement of “least common mechanism,” but it states the problem in a simpler way for introductory students. Moreover, this is already a widely-used term in computer security. Introduced in Chapter 3.
- Trust, but Verify – ensure correct system operation by assessing the system’s compliance with security objectives and by recording and monitoring its ongoing operations. The first edition of the textbook mentioned this concept in Chapter 13; the second edition introduced it as a design principle in Chapter 4.
- Separation of Duty – decompose a critical task into separate elements performed by separate individuals or entities. This reflects the most common phrasing in the security community. Some writers phrase it as “segregation of duty” or “separation of privilege.” This was Principle #8 in the first edition. Introduced in Chapter 8.
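Several of these principles translate directly into code. As one example, here is a hedged Python sketch of “Deny by Default” (the rule format and names are invented for illustration): the access check grants nothing unless an explicit allow rule exists, so forgetting a rule fails closed rather than open.

```python
# Explicit allow rules: (subject, resource, action) triples.
ALLOW_RULES = {
    ("alice", "payroll.db", "read"),
    ("alice", "payroll.db", "write"),
    ("bob", "payroll.db", "read"),
}

def access_allowed(subject, resource, action):
    """Grant access only when an explicit rule exists; deny otherwise."""
    return (subject, resource, action) in ALLOW_RULES

print(access_allowed("alice", "payroll.db", "write"))  # True: rule exists
print(access_allowed("bob", "payroll.db", "write"))    # False: no rule, so denied
```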
References
- Saltzer, Jerome H., and Michael D. Schroeder, 1975. “The Protection of Information in Computer Systems,” Proceedings of the IEEE 63(9), September 1975.
- Smith, Richard, 2012/2016/2021. Elementary Information Security, Burlington, MA: Jones and Bartlett.
