
Software Engineering Principles (Collection)

At a Glance

The canonical collection of named software engineering laws, heuristics, and principles — from Brooks's Law and Conway's Law to YAGNI, DRY, Hyrum's Law, and the Testing Pyramid — that form the shared vocabulary of software practitioners for reasoning about complexity, quality, and team dynamics.

Type
open-source
Pricing
free
License
N/A
Adoption fit
small, medium, enterprise

What It Does

“Software engineering principles” refers to the accumulated body of named heuristics, laws, and theorems that practitioners use as cognitive shortcuts when reasoning about software complexity, team dynamics, system design, and quality. They are not formal specifications or standards — they are experiential patterns that have been named, replicated, and refined across decades of practice.

The canonical modern compilation is Dr. Milan Milanovic’s lawsofsoftwareengineering.com (56 principles) and accompanying book (63+ entries). Other notable compilations include Hacker Laws (github.com/dwmkerr/hacker-laws) and Laws of Software (laws-of-software.com). The underlying principles span 50+ years of software engineering history, from Fred Brooks’s 1975 “The Mythical Man-Month” to Hyrum Wright’s 2012 API observation.

This catalog entry covers the collection as a whole, plus high-value individual principles that do not warrant dedicated entries. For detailed entries see: Conway’s Law, CAP Theorem, Technical Debt, SOLID Principles.

Key Features

Teams domain:

  • Brooks’s Law (1975): Adding engineers to a late project makes it later. Mechanism: training overhead + increased communication paths outweigh productivity gain. Supported by NASA SEL data and 7,200-project studies.
  • Dunbar’s Number (~150): Cognitive limit on stable social relationships. Applied to software teams: beyond ~150 people, informal coordination mechanisms (trust, shared norms) require explicit process scaffolding.
  • Bus Factor: The minimum number of team members who would have to be “hit by a bus” (leave abruptly) before the project stalls from lost knowledge. A bus factor of 1 is a critical project risk.
  • Conway’s Law: See dedicated catalog entry.
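The combinatorial mechanism behind Brooks’s Law is easy to quantify. A minimal sketch (the function name is illustrative):

```python
# Pairwise communication paths in a team of n people grow quadratically:
# n * (n - 1) / 2 -- the coordination overhead behind Brooks's Law.
def communication_paths(n: int) -> int:
    return n * (n - 1) // 2

# Growing a late project from 5 to 10 engineers doubles the headcount
# but more than quadruples the coordination channels (10 -> 45).
assert communication_paths(5) == 10
assert communication_paths(10) == 45
```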

Planning domain:

  • Hofstadter’s Law: Tasks always take longer than expected, even accounting for Hofstadter’s Law (recursive). Applied: assume estimates remain optimistic even after you have padded them.
  • Parkinson’s Law: Work expands to fill the time available for its completion. Applied: unbounded iterations produce waste; time-boxing is a tool against it.
  • Ninety-Ninety Rule (Tom Cargill): The first 90% of code takes 90% of development time. The remaining 10% takes the other 90%. Accurately describes why software projects routinely overshoot estimates.
  • Goodhart’s Law: When a measure becomes a target, it ceases to be a good measure. Applied to engineering metrics: once velocity is tracked as a performance metric, it stops correlating with output.

Architecture domain:

  • Gall’s Law: A complex system that works has invariably evolved from a simple system that worked. Applied: do not design complex systems from scratch; grow them from working simple ones.
  • Hyrum’s Law: With sufficient API users, all observable behaviors become implicit contracts. Applied: every undocumented behavior your API exposes will be depended on by someone.
  • Law of Leaky Abstractions (Joel Spolsky): All non-trivial abstractions leak. Applied: every abstraction layer that claims to hide complexity eventually exposes that complexity under pressure.
  • Tesler’s Law (Conservation of Complexity): Every application has an irreducible core of complexity; it can only be moved, not eliminated. Applied: moving complexity from the user to the developer is a valid design choice, but complexity never disappears.
  • Second-System Effect (Brooks): Engineers’ second systems are typically over-engineered rewrites because their first success bred overconfidence. Applied: greenfield rewrites of working systems are high-risk.
  • CAP Theorem: See dedicated catalog entry.
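Hyrum’s Law is easiest to see in miniature. A hypothetical sketch (all names are illustrative, not from any real library):

```python
# Hyrum's Law in miniature: a caller that depends on the exact wording of an
# error message -- behavior that is observable, but never documented.
def lookup(table: dict, key: str):
    if key not in table:
        raise KeyError(f"key '{key}' not found")  # wording is not part of the API
    return table[key]

def client_has_key(table: dict, key: str) -> bool:
    try:
        lookup(table, key)
        return True
    except KeyError as e:
        # Fragile: string-matches the message. If the maintainer rewords the
        # error to "missing key", this caller breaks -- the implicit contract.
        if "not found" in str(e):
            return False
        raise
```

Rewording that error message is, per Hyrum’s Law, a breaking change for this caller even though no documented behavior changed.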

Quality domain:

  • Technical Debt (Cunningham 1992): See dedicated catalog entry.
  • Boy Scout Rule (Robert C. Martin): Always leave the campground (codebase) cleaner than you found it. Applied: per-commit micro-refactoring as an alternative to scheduled refactoring sprints.
  • Kernighan’s Law: Debugging is twice as hard as writing code. Therefore, if you write code as cleverly as you can, you are by definition not smart enough to debug it. Applied: write for the future reader, not the current writer.
  • Linus’s Law: Given enough eyeballs, all bugs are shallow. Applied: open-source scrutiny reduces defects, but only to the extent that reviewers actually read the code (large projects with low contributor engagement do not benefit).
  • Testing Pyramid: Unit tests at the base (many, fast), integration tests in the middle (fewer, slower), E2E/UI tests at the top (minimal, expensive). A practical heuristic for test portfolio allocation.
  • Pesticide Paradox: Every method used to prevent or find bugs leaves a residue immune to that method. Applied: diversify your test types and refactoring approaches — a stable test suite stops finding new bugs.
  • Lehman’s Laws of Software Evolution (1970s): Evolving software must be continually adapted or quality declines; as software grows, complexity increases unless actively reduced; functional content grows over time regardless of planned releases.

Design domain:

  • YAGNI (You Ain’t Gonna Need It): Do not implement features before they are needed. Applied: oppose speculative generalization and premature abstractions.
  • DRY (Don’t Repeat Yourself, Hunt & Thomas 1999): Every piece of knowledge should have a single authoritative representation. Applied: eliminate duplication of logic, not just duplication of code (the Pragmatic Programmer’s formulation).
  • KISS (Keep It Simple, Stupid): Prefer the simplest solution that works. Applied: resist the engineer’s instinct toward elegant over-engineering.
  • Law of Demeter (1987): A unit should only communicate with its immediate dependencies — do not reach through objects to call their collaborators’ methods. Applied: “only talk to your friends.”
  • Principle of Least Astonishment: A system component should behave in a way that most users expect it to. Applied: API design, UI affordances, and function naming should minimize cognitive surprise.
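The Law of Demeter’s “only talk to your friends” is clearest in code. A sketch with hypothetical classes:

```python
# Law of Demeter: callers talk only to their immediate collaborator and
# never reach through it to its internals.
class Wallet:
    def __init__(self, balance: float) -> None:
        self.balance = balance

class Customer:
    def __init__(self, balance: float) -> None:
        self._wallet = Wallet(balance)  # internal collaborator

    def pay(self, amount: float) -> None:
        # Compliant: the customer manages its own wallet.
        if self._wallet.balance < amount:
            raise ValueError("insufficient funds")
        self._wallet.balance -= amount

    def balance(self) -> float:
        return self._wallet.balance

# Violation would be: customer._wallet.balance -= price  (reaching through)
customer = Customer(balance=20.0)
customer.pay(15.0)
assert customer.balance() == 5.0
```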

Decisions domain:

  • Pareto Principle (80/20): 80% of effects come from 20% of causes. Applied: 20% of code causes 80% of bugs; 20% of features get 80% of usage. Focus optimization effort on the load-bearing 20%.
  • Hype Cycle / Amara’s Law: People overestimate short-term technology impact and underestimate long-term. Applied: calibrate adoption timing against where a technology actually sits on the hype cycle.
  • Lindy Effect: Technologies that have survived a long time are more likely to survive than new entrants. Applied: prefer boring, proven infrastructure for load-bearing systems; reserve experimental tools for non-critical paths.
  • Dunning-Kruger Effect: Low-competence individuals overestimate ability; high-competence individuals underestimate it. Applied to engineering hiring and team communication: novice confidence and expert hedging both mislead.
  • Sunk Cost Fallacy: Past investment in a failing approach is not a rational reason to continue it. Applied: rewrite decisions, vendor lock-in exits, and technology strategy all benefit from ignoring sunk costs.
  • Goodhart’s Law: See Planning domain above.

Scale domain:

  • Amdahl’s Law: Maximum theoretical speedup from parallelization is limited by the sequential fraction of the program. Applied: a program that is 5% sequential cannot be sped up more than 20x regardless of how many processors are added.
  • Gustafson’s Law: Counterpoint to Amdahl’s — by scaling the problem size with available parallelism, larger speedups are achievable. Applied: embarrassingly parallel workloads (batch ML training, map-reduce) benefit from Gustafson framing.
  • Metcalfe’s Law: The value of a network is proportional to the square of its users. Applied to platform strategy: network effects compound quadratically, so early-mover advantages in two-sided marketplaces grow far faster than user counts.
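Amdahl’s and Gustafson’s framings can be compared numerically. A minimal sketch, with p the parallel fraction and n the processor count:

```python
# Speedup under Amdahl's Law (fixed workload) vs Gustafson's Law
# (workload scaled with processor count).
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

def gustafson_speedup(p: float, n: int) -> float:
    return (1.0 - p) + p * n

# The 95%-parallel (5% sequential) program from the text: no processor
# count pushes Amdahl speedup past 1 / 0.05 = 20x, while the Gustafson
# framing keeps improving as the problem grows with the machine.
assert amdahl_speedup(0.95, 1_000_000) < 20.0
assert gustafson_speedup(0.95, 1024) > 900.0
```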

Use Cases

  • Code review vocabulary: Using shared principle names (“this violates DRY,” “we’re accumulating technical debt here,” “Kernighan’s Law applies to this function”) to communicate design concerns without lengthy explanations.
  • Architecture decision records (ADRs): Citing relevant principles as justification for design choices (“we chose eventual consistency over strong consistency because partition tolerance is non-negotiable per CAP Theorem”).
  • Engineering onboarding: Structuring new-hire technical education around named principles provides a navigable conceptual map of the craft.
  • Post-mortem analysis: Identifying which principles were violated in an incident (“the second-system effect on the rewrite” or “we hit Hyrum’s Law when we changed the error message format”).
  • Stakeholder communication: Translating engineering concerns into accessible language (“Parkinson’s Law suggests we need a time-box on this exploration” or “Brooks’s Law means hiring three engineers this sprint will slow us down before it helps”).

Adoption Level Analysis

Small teams (<20 engineers): High value per unit of effort. A single afternoon reading the key principles (Brooks, Conway, DRY, YAGNI, KISS, Boy Scout Rule, technical debt) provides a shared vocabulary that improves team communication for years.

Medium orgs (20–200 engineers): Include in engineering culture documents (CLAUDE.md equivalent for engineering norms), onboarding curricula, and architecture review checklists. The principles help align distributed team decision-making without mandating process overhead.

Enterprise (200+ engineers): Enterprise architecture functions often maintain formal design principle documentation. The risk at this scale is principles becoming compliance checkboxes rather than reasoning tools. Keep the list short and actionable.

Alternatives

| Alternative | Key difference | Prefer when… |
| --- | --- | --- |
| Pragmatic Programmer (Hunt & Thomas) | Book-length treatment of pragmatic software practice with specific techniques, not just named principles | Deep education rather than quick reference |
| A Philosophy of Software Design (Ousterhout) | Single coherent framework around deep modules and cognitive load reduction | You want strong opinions on class design; challenges SOLID’s SRP directly |
| Team Topologies (Skelton & Pais) | Operationalizes Conway’s Law into a prescriptive team structure framework | Active org design work, not just principle reference |
| Hacker Laws (github.com/dwmkerr/hacker-laws) | Community-curated alternative compilation, broader scope, GitHub-hosted | You want a freely editable, community-maintained reference |

Notes & Caveats

  • Not all principles are equally well-evidenced. Conway’s Law, CAP Theorem, and Amdahl’s Law have formal proofs or rigorous empirical studies. Zawinski’s Law (“every program attempts to expand until it can read mail”), Cunningham’s Law (“the best way to get the right answer is to post the wrong answer”), and the Dilbert Principle are folk wisdom or satire.
  • Principles are context-dependent. DRY applied to data schemas produces JOIN-heavy normalized databases that hurt read performance. SOLID applied to a 200-line script produces 20 files of boilerplate. Principles require judgment about applicability.
  • The “laws” framing is aspirational — these are not physical laws. Counterexamples exist for nearly every named principle. Their value is as thinking tools, not prescriptions.
  • Dr. Milan Milanovic’s compilation is the most accessible modern reference but is a practitioner’s synthesis, not a peer-reviewed survey. The academic counterpart is Lehman’s 1970s work on software evolution laws, which remains more rigorously grounded.
