adaptation vs. stability
building systems that evolve with context rather than seeking unchanging order.
A working glossary — terms, not claims. Each links to the passages that ground it in Quotes and the propositions it underwrites in Arguments.
a system's or organisation's ability to adjust its behaviour, structures, and responses to cope with novel or unanticipated conditions; the operational expression of resilience
governance shaped by algorithmic systems, where power can accrue through data, metrics, and optimization.
a risk management strategy based on predicting and preventing potential dangers before damage is done; relies on centralised control, precautionary regulation, and compliance frameworks; Wildavsky argues that overinvestment in anticipation erodes the adaptive capacity needed to cope with dangers that prediction cannot reach
a chronological record of system activities, decisions, and data changes that enables reconstruction of events and accountability; essential for traceability in regulated environments
distinct operational states in automated systems where control authority shifts between human operators and automated functions; can be hidden or partially legible, creating confusion during anomalies
loops that counteract change and stabilize the system (e.g., mutualism and scarcity damping inequality).
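A balancing loop can be sketched in a few lines of Python; this is an illustrative toy (the names and the `rate` parameter are invented), showing that a correction proportional to the gap always pulls the level back toward its target:

```python
def step_balancing(level, target, rate=0.3):
    """One tick of a balancing loop: the correction is proportional to the
    gap, so it always pushes the level back toward the target."""
    return level + rate * (target - level)

# a level perturbed far above its target is drawn back down
level = 100.0
for _ in range(30):
    level = step_balancing(level, 50.0)
# level is now very close to 50.0
```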
spare capacity, time, or resources between system components that absorb variation and create space for intervention; eliminated by optimization pressures, potentially reducing system resilience
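The role of slack can be made concrete with a toy pipeline (all names and numbers are invented for illustration): a stage with steady capacity faces bursty arrivals, and the buffer is the slack that absorbs the bursts — shrink it, and variation turns directly into lost work:

```python
from collections import deque

def run_line(arrivals, capacity, buffer_size):
    """Feed bursty work through a stage that processes `capacity` items
    per tick; the buffer is slack that absorbs bursts, and overflow
    beyond it is lost work."""
    buffer, done, lost = deque(), 0, 0
    for burst in arrivals:
        for _ in range(burst):
            if len(buffer) < buffer_size:
                buffer.append(1)
            else:
                lost += 1  # no slack left: variation becomes failure
        for _ in range(min(capacity, len(buffer))):
            buffer.popleft()
            done += 1
    return done, lost

arrivals = [0, 5, 0, 0, 5, 0, 0, 0]  # bursty demand, steady capacity of 2
with_slack = run_line(arrivals, capacity=2, buffer_size=10)  # (10, 0)
optimized = run_line(arrivals, capacity=2, buffer_size=2)    # (4, 6)
```

Average demand (10 items over 8 ticks) is well within average capacity; only the buffer decides whether the bursts are absorbed or dropped.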
organizational capacity to maintain or rapidly resume critical functions after disruption; focuses on recovery time objectives and continuity strategies rather than prevention alone
the process by which administrative legibility, imposed without awareness of local knowledge, destroys the complexity it cannot represent; Scott's term for the large-scale failures that result when simplified state models override the adaptive systems they were meant to describe
designed interruption points that allow systems to pause, isolate, or halt cascading failures.
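In software this idea appears as the circuit-breaker pattern; a minimal sketch (class and method names are hypothetical) that stops calling a failing dependency after repeated errors, isolating the fault instead of letting it cascade:

```python
class CircuitBreaker:
    """Minimal circuit breaker: opens after max_failures consecutive
    errors, after which further calls are blocked rather than attempted."""
    def __init__(self, max_failures=3):
        self.max_failures = max_failures
        self.failures = 0
        self.open = False

    def call(self, fn, *args):
        if self.open:
            raise RuntimeError("circuit open: call blocked to halt cascade")
        try:
            result = fn(*args)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.open = True  # isolate the failing dependency
            raise
        self.failures = 0  # a success resets the count
        return result
```

Production implementations add a timed "half-open" state that periodically probes whether the dependency has recovered; the sketch keeps only the interruption logic.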
premature lock-in to a course of action based on superficial planning, creating escalation dynamics that make reversal nearly impossible
a condition in which everything is documented but nothing can be found; a totality that prevents navigation and obscures the essential
the structural substitution of procedural conformity for genuine trust; produces brittle systems because the capacity to pass audit is mistaken for the right to be accepted, and legibility is allowed to stand in for legitimacy
ongoing observation of system states, behaviors, and performance indicators to detect anomalies, degradation, or emerging risks in real-time rather than through periodic audits
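One common mechanical form of this is a rolling-window anomaly detector; the sketch below (window and threshold values are arbitrary) flags readings that deviate sharply from recent behaviour as the stream arrives, rather than waiting for a periodic review:

```python
import statistics
from collections import deque

def detect_anomalies(stream, window=5, threshold=3.0):
    """Flag indices whose reading deviates from the rolling mean by more
    than threshold * rolling standard deviation."""
    recent = deque(maxlen=window)
    flagged = []
    for i, x in enumerate(stream):
        if len(recent) == window:
            mean = statistics.mean(recent)
            sd = statistics.pstdev(recent) or 1e-9  # avoid zero division
            if abs(x - mean) > threshold * sd:
                flagged.append(i)
        recent.append(x)
    return flagged

# a single spike in an otherwise steady signal is caught at its index
spikes = detect_anomalies([10.0] * 10 + [100.0] + [10.0] * 5)
```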
the idea that "there are no separate systems" and everything is connected across gradients rather than hard divides.
the practitioner's relationship with systems they did not build and cannot fully comprehend; not mere maintenance but maintenance animated by purpose — tending inherited architectures with care, owning the decision to operate them despite their failure modes, and keeping them oriented toward the people they were built to serve
the assurance that data is attributable, legible, contemporaneous, original, accurate, complete, consistent, enduring, and available (ALCOA+); captures the record but not the contextual knowledge that makes the record meaningful
a mode of operation in which a system continues functioning with reduced safeguards, redundancy, or visibility; dangerous because often invisible to operators, making it a governance failure rather than merely a design flaw
the lag between action and its observable effect, which can distort understanding and control.
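The distortion is easy to demonstrate in a toy control loop (all parameters invented): the controller corrects a level toward a target, but acts on a reading that is `lag` steps stale. With no lag it converges smoothly; with a long lag the same correction rule overshoots and oscillates:

```python
def regulate(target, lag, steps, gain=0.5):
    """Correct a level toward `target`, but base each correction on a
    reading that is `lag` steps old. Long lags turn a stable rule into
    overshoot and oscillation."""
    level, history = 0.0, [0.0]
    for _ in range(steps):
        observed = history[-1 - lag] if len(history) > lag else history[0]
        level += gain * (target - observed)  # correction uses stale data
        history.append(level)
    return history

prompt_feedback = regulate(10.0, lag=0, steps=50)  # settles at the target
stale_feedback = regulate(10.0, lag=4, steps=50)   # overshoots past it
```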
prioritizing mechanisms that learn and adapt over attempts at perfect foresight.
biases and outright errors in information that distort decisions and weaken responsiveness.
the hardened end-state of ideals that were once living and adaptive; resists critique and change, narrows discourse, and blocks the information flows a system needs to remain healthy
a shifting balance maintained through ongoing feedback and adjustment, not static harmony.
acknowledgment that our knowledge and our models are partial, that the limits of a framework are invisible from inside the framework, and that what the system cannot represent does not thereby cease to exist
the instinct to reduce complex reality to a single defining property, cause, or explanation; in governance, it manifests as the demand for a single root cause, the belief that one fix can resolve systemic tension, and the habit of treating compliance as certainty
a posture of engagement with complex systems grounded in moral and epistemic humility rather than abstract principle; ethics precedes method — it begins with the acknowledgment of interdependence and participation rather than detached judgment
design features that default to a safe state when failures occur; rely on predictable failure modes and may not address system accidents arising from interactive complexity
probability distribution where extreme outcomes occur far more frequently than normal (Gaussian) models predict; implies standard contingency buffers are inadequate
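A back-of-envelope comparison makes the gap vivid (the Pareto parameters here are illustrative, not calibrated to any real data): under a Gaussian model a "10x" event is essentially impossible, while under a heavy-tailed power law it is routine:

```python
import math

def normal_tail(k):
    """P(Z > k) for a standard normal variable."""
    return 0.5 * math.erfc(k / math.sqrt(2))

def pareto_tail(x, x_m=1.0, alpha=1.5):
    """P(X > x) for a Pareto (power-law) variable with minimum x_m."""
    return (x_m / x) ** alpha if x > x_m else 1.0

gaussian_p = normal_tail(10)   # on the order of 1e-23: "never happens"
pareto_p = pareto_tail(10)     # about 3%: happens all the time
```

A contingency buffer sized from the Gaussian estimate would be exhausted routinely by the heavy-tailed reality.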
information returning to a system about its own behaviour, enabling correction or amplification; the mechanism through which systems maintain contact with reality — its suppression, distortion, or structural prevention is the common pathway to fragility
a condition in which a system cannot adapt to changing circumstances, typically arising from suppressed feedback, inability to verify assumptions, or the confusion of documented stability with genuine health; risks failure when the gap between the system's model and its reality grows too wide
an ideology of supreme confidence in scientific and technical mastery that, when combined with state power and administrative legibility, can override local knowledge and produce catastrophic simplifications
cyber-physical systems that monitor and control industrial processes in critical infrastructure (energy, water, manufacturing); tightly coupled and increasingly complex as IT/OT converge
the process by which minor failures, deviations, or anomalies combine and amplify into more serious events; particularly dangerous in tightly coupled systems where response windows narrow rapidly
organizational capability to detect, interpret, contain, and recover from system failures or security events; effectiveness depends on traceability, clarity of system states, and ability to intervene without full understanding
the movement of data, signals, and stories through a system; the lifeblood of adaptation and change, and the primary axis along which governance power operates — who controls information flows controls collective understanding of the system's purpose and constraints
engaging as both observer and participant within systems, listening as much as acting.
the formal and informal structures — markets, institutions, feedback mechanisms, accumulated learning — that enable resilience to operate beyond bare survival
a property of systems where components interact outside normal or intended production sequences, producing outcomes that no individual component was designed to create; one of Perrow's two structural conditions (with tight coupling) that make serious accidents inevitable rather than exceptional
the condition in which parts of a system do not merely rely on each other but constitute each other; the recognition that our decisions reverberate through networks we cannot fully see, yet always participate in
a failure analysis approach that locates responsibility at the final human or technical action closest to the accident; produces closure without learning because it stops at the proximate cause and prevents examination of the systemic conditions that made the failure possible
favoring iterative improvement and openness over rigid command-and-control structures.
the process of making complex realities visible to administrative power by simplifying them into structures that can be read, measured, audited, and governed; necessary for governance but destructive when mistaken for a complete representation of reality
the quality of being accepted by the people subject to a system as genuinely serving the purpose it claims to serve; distinct from legibility in that it requires earned consent rather than documented compliance
places in a system where small shifts yield large effects; counterintuitive by nature — the deepest leverage lies not in adjusting parameters but in shifting the paradigm from which the system's goals and rules arise, and we typically use them backward
systems designed to hear themselves — through active, diverse feedback that informs adaptation, adjustment, change, and renewal.
a warning that administrative simplifications do not merely describe reality but, when backed by institutional power, actively remake it; the model ceases to be a tool and becomes the authoritative guide to action, erasing the knowledge it cannot represent
practical, embodied, contextual intelligence that cannot be codified in manuals or captured in databases; the tacit knowledge of experienced practitioners that operates below the threshold of formal representation and that administrative systems structurally cannot capture
the risk that quantitative indicators substitute for substantive judgment or values.
acknowledgment that moral decision-making operates within limits, interdependencies, and consequences one cannot fully foresee; the existential awareness that freedom means participation in systems, not mastery over them
a cause-and-effect relationship that does not unfold in simple, straight chronological lines but is instead indirect, delayed, or emergent.
an accident that is structurally inevitable in certain systems due to their interactive complexity and tight coupling.
cognitive tendency to underestimate risks and overestimate one's ability to manage them; a primary driver of megaproject failure
reinforcing feedback that privileges efficiency and stability over adaptability.
the productive movement between opposing states or poles in a system; distinct from mere fluctuation — in systems thinking, oscillation describes the dynamic tension that sustains equilibrium rather than indicating instability
the process by which ideals harden into dogma, narrowing discourse and choking feedback.
the question of who controls how information circulates and which voices are encoded in systems.
the underlying worldview or set of assumptions from which a system's goals, rules, and feedback loops arise.
transformation at the level of the underlying worldview or set of assumptions that reconfigures the whole system's behavior.
systematic tendency to produce forecasts that are indistinguishable from best-case scenarios, ignoring base rates and historical evidence
duplication of critical components, processes, or data to maintain function if one fails; effective for component failures but less so for system accidents where multiple redundant elements can fail simultaneously through unexpected interactions
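The arithmetic behind this caveat is simple (component probabilities below are illustrative): independent redundancy multiplies small failure probabilities together, but a single shared cause — a common power feed, a shared software bug — bypasses all copies at once and dominates the total:

```python
def independent_failure(p, n):
    """All n redundant copies must fail independently."""
    return p ** n

def common_mode_failure(p, n, p_common):
    """Either a shared cause takes out every copy at once, or the copies
    fail independently."""
    return p_common + (1 - p_common) * p ** n

# triple redundancy with a 1% per-component failure rate:
ind = independent_failure(0.01, 3)          # ~1e-6
corr = common_mode_failure(0.01, 3, 0.001)  # ~1e-3: shared cause dominates
```

Even a 0.1% common-mode probability makes the redundant design roughly a thousand times less reliable than the independence calculation promises.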
estimation method that treats a project as "one of those" — a member of a class whose outcomes are already known — rather than as unique
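Mechanically, the method replaces the inside view with a percentile of the reference class's observed outcomes; a minimal sketch (the overrun ratios below are invented for illustration):

```python
def reference_class_forecast(class_overruns, base_estimate, percentile=0.8):
    """Uplift a base estimate using the distribution of cost-overrun
    ratios observed in a reference class of similar past projects."""
    ordered = sorted(class_overruns)
    idx = min(int(percentile * len(ordered)), len(ordered) - 1)
    return base_estimate * ordered[idx]

# hypothetical overrun ratios from ten comparable past projects:
overruns = [1.0, 1.1, 1.2, 1.3, 1.4, 1.5, 1.7, 2.0, 2.5, 3.0]
budget = reference_class_forecast(overruns, base_estimate=100.0)  # 250.0
```

Choosing the 80th percentile rather than the mean is the point: the forecast budgets for how projects of this kind actually turn out, not for how this one is hoped to turn out.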
the capacity to examine one's own behaviour, assumptions, and frameworks from within; in governance, the active capability — distinct from reaction — that enables systems and practitioners to question whether their models still match reality
loops that amplify change (e.g., wealth begets power, power begets wealth).
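The wealth–power example can be written directly as a two-variable toy model (gains are arbitrary): each stock feeds the other, and the coupled pair grows exponentially rather than settling:

```python
def step_reinforcing(wealth, power, w_gain=0.1, p_gain=0.1):
    """One tick of a reinforcing loop: wealth buys power, and power
    extracts wealth, so each amplifies the other."""
    return wealth + w_gain * power, power + p_gain * wealth

wealth, power = 1.0, 1.0
for _ in range(20):
    wealth, power = step_reinforcing(wealth, power)
# both stocks have grown several-fold — exponential amplification
```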
a strategy for managing risk through the capacity to cope with unanticipated dangers after they become manifest, rather than predicting and preventing them in advance; operates through trial and error, distributed experimentation, and adaptive capacity; requires institutional scaffolding and surplus capacity to function beyond bare survival
the limited time available for operators to detect, interpret, and intervene before escalation becomes irreversible.
the ability to reverse system changes, configurations, or transactions to a previously known acceptable state; provides intervention capability when full understanding is unavailable during incidents
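The core mechanism is a snapshot of a known-good state taken before a change, restored if the change goes wrong; a minimal sketch (class and method names are hypothetical):

```python
import copy

class Rollback:
    """Hold snapshots of known-good state; restore the most recent one
    when a change fails."""
    def __init__(self, state):
        self.state = state
        self._snapshots = []

    def snapshot(self):
        """Record the current state before attempting a change."""
        self._snapshots.append(copy.deepcopy(self.state))

    def rollback(self):
        """Discard the failed change and restore the last snapshot."""
        if self._snapshots:
            self.state = self._snapshots.pop()
        return self.state
```

Real systems implement the same idea with database transactions, versioned configuration, or immutable deployment artifacts; what matters is that the restore path exists and works without requiring a diagnosis of what went wrong.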
investigation method that seeks to identify the fundamental reason for a failure; runs the risk of defaulting to last-link thinking in complex systems where multiple interacting factors combine to produce accidents
the insight that safety is not a state to be achieved but a process that degrades over time and must be continuously reaccomplished through active engagement with risk
a governance pathology in which compliance activities become their own justification, losing contact with the operational purpose they were designed to protect; arises when the capacity to pass audit is treated as evidence of quality rather than evidence of auditability, and legibility is accepted as a substitute for legitimacy
a system encompassing both technical components (machines, software, infrastructure) and social elements (people, organizations, practices)
pattern where visionary ambition exceeds structural capacity, producing systems that survive only through continuous rescue and perpetual intervention
deliberate distortion of forecasts, costs, or benefits to win approval or contracts; distinct from optimism bias in that it is intentional rather than cognitive
the global level of general resources — wealth, slack, reserve capability — that makes resilience possible by enabling a society or organisation to absorb shocks and tolerate the costs of trial and error
the conversion of activity into monitorable data; transforms the relationship between observer and observed, and when embedded in governance systems, can substitute visibility for understanding and tighten control loops at the expense of adaptive capacity
selection effect in project approval where cost underestimation and benefit overestimation increase approval likelihood, causing the most deceptively forecasted projects to proceed
a pattern of relationships, not a mere collection of parts; a dynamic whole shaped by stocks, interactions, flows, and feedback loops.
a failure arising from the unanticipated interaction of multiple normal or minor failures within a tightly coupled, complex system.
conceptual lines used for modeling; incomplete and permeable because systems interpenetrate.
the quality of a system's functioning, indicated by resilience, openness to feedback, adaptability, and learning.
systematic, codifiable, teachable knowledge — the kind of knowing that can be written down in manuals, encoded in databases, and transferred through formal instruction; powerful but incomplete without the contextual intelligence of mētis
a condition in which system components are linked with little or no slack, buffer, or delay, causing effects to propagate rapidly.
the ability to reconstruct system states, decisions, data, and changes across time and organizational boundaries.
a decentralised process of experimentation, failure, learning, and adaptation through which societies develop the capacity to cope with the unexpected; Wildavsky's core mechanism for resilience
conviction that one's project or situation is special and exempt from base rates that govern similar endeavors