A
- adaptation vs. stability
- building systems that evolve with context rather than seeking unchanging order.
- algorithmic governance
- governance shaped by algorithmic systems, where power can accrue through data, metrics, and optimization.
- audit trail
- a chronological record of system activities, decisions, and data changes that enables reconstruction of events and accountability; essential for traceability in regulated environments (a minimal sketch follows this section).
- automation modes
- distinct operational states in automated systems where control authority shifts between human operators and automated functions; can be hidden or partially legible, creating confusion during anomalies.
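
As an illustration of the audit-trail entry above, here is a minimal Python sketch of an append-only, hash-chained log; the `AuditTrail` class, its field names, and the SHA-256 chaining are illustrative assumptions rather than anything specified in this glossary:

```python
import hashlib
import json
import time

class AuditTrail:
    """Illustrative append-only log (class and field names are assumptions).
    Each record is hash-chained to its predecessor, so later tampering
    or deletion breaks the chain and becomes detectable."""

    def __init__(self):
        self._records = []

    def append(self, actor: str, action: str, detail: dict) -> None:
        prev_hash = self._records[-1]["hash"] if self._records else "genesis"
        record = {
            "timestamp": time.time(),
            "actor": actor,
            "action": action,
            "detail": detail,
            "prev_hash": prev_hash,
        }
        record["hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        self._records.append(record)

    def verify(self) -> bool:
        """Recompute the whole chain to detect altered or deleted records."""
        prev_hash = "genesis"
        for record in self._records:
            body = {k: v for k, v in record.items() if k != "hash"}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if body["prev_hash"] != prev_hash or digest != record["hash"]:
                return False
            prev_hash = record["hash"]
        return True
```

Hash-chaining is one common way to make a log tamper-evident, supporting the reconstruction-of-events role the entry describes.
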
B
- balancing (negative) feedback loops
- loops that counteract change and stabilize the system (e.g., mutualism and scarcity damping inequality); sketched alongside reinforcing loops at the end of this section.
- buffers / slack
- spare capacity, time, or resources between system components that absorb variation and create space for intervention; eliminated by optimization pressures, potentially reducing system resilience.
- business continuity
- organizational capacity to maintain or rapidly resume critical functions after disruption; focuses on recovery time objectives and continuity strategies rather than prevention alone.
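
A minimal numeric sketch of the two loop types, balancing (here) and reinforcing (under R); the function names, rates, and starting values are illustrative assumptions:

```python
# Balancing loop: the gap between state and goal drives a correction,
# so the state converges toward the goal instead of drifting.
def balancing(state: float, goal: float, rate: float, steps: int) -> list[float]:
    path = [state]
    for _ in range(steps):
        state += rate * (goal - state)  # correction shrinks as the gap closes
        path.append(state)
    return path

# Reinforcing loop: the state feeds its own growth, so change compounds.
def reinforcing(state: float, gain: float, steps: int) -> list[float]:
    path = [state]
    for _ in range(steps):
        state += gain * state           # growth proportional to current size
        path.append(state)
    return path

print(balancing(30.0, 20.0, 0.5, 5))  # 30.0, 25.0, 22.5, ... converging on 20
print(reinforcing(1.0, 0.5, 5))       # 1.0, 1.5, 2.25, ... compounding away
```
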
C
- circuit breakers
- designed interruption points that allow systems to pause, isolate, or halt cascading failures (sketched at the end of this section).
- compliance vs. trust
- the risk that procedural conformity is treated as equivalent to genuine trust, which typically produces brittle systems.
- continuous monitoring
- ongoing observation of system states, behaviors, and performance indicators to detect anomalies, degradation, or emerging risks in real time rather than through periodic audits.
- continuum
- the idea that “there are no separate systems” and everything is connected across gradients rather than hard divides.
- custodianship
- a governance stance that emphasizes responsibility for maintaining, limiting, or retiring risky systems rather than mastering them.
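
As an illustration of the circuit-breaker entry above, a minimal Python sketch of the pattern as it is commonly implemented in software; the class name, thresholds, and half-open trial behavior are illustrative assumptions:

```python
import time

class CircuitBreaker:
    """Illustrative breaker (names and thresholds are assumptions).
    After max_failures consecutive failures it opens and rejects calls
    immediately; after reset_after seconds it lets one trial call
    through, closing again only if that call succeeds."""

    def __init__(self, max_failures: int = 3, reset_after: float = 30.0):
        self.max_failures = max_failures
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at = None

    def call(self, func, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_after:
                raise RuntimeError("circuit open: call rejected")
            self.opened_at = None  # half-open: allow one trial call
        try:
            result = func(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()  # open: isolate the fault
            raise
        self.failures = 0  # success closes the breaker again
        return result
```

By refusing calls while open, the breaker turns a potential cascading failure into a local, visible one, which is the interruption-point role the entry names.
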
D
- degraded state
- a mode of operation in which a system continues functioning with reduced safeguards, redundancy, or visibility.
- delay
- the lag between action and its observable effect, which can distort understanding and control.
- design for feedback (not prediction)
- prioritizing mechanisms that learn and adapt over attempts at perfect foresight.
- distortions
- biases and outright errors in information that misguide decisions and weaken responsiveness.
- dogma
- fixed belief that resists critique and change, blocking the flow of information.
- dynamic equilibrium
- a shifting balance maintained through ongoing feedback and adjustment, not static harmony.
E
- epistemic humility
- acknowledgment that our knowledge and our models are partial and that systems will surprise us; a precondition for learning.
- essentialism
- reducing reality to a single defining property or value.
- ethical stance
- a position taken on the grounds of a particular conception of what is ethical or moral.
F
- fail-safe mechanisms
- design features that default to a safe state when failures occur; rely on predictable failure modes and may not address system accidents arising from interactive complexity (a minimal sketch follows this section).
- feedback
- information returning to a system or person about its behavior, enabling correction or amplification.
- fragility
- a state in which a system cannot adapt and risks failure when conditions within it, or in its wider environment, change.
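
A minimal sketch of the fail-safe entry above: a controller that defaults to a safe state (power off) on any fault; `read_temp`, `set_power`, and the thresholds are hypothetical stand-ins, not a real API:

```python
def run_heater(read_temp, set_power, limit: float = 80.0) -> None:
    """Illustrative fail-safe step; read_temp and set_power are hypothetical
    callables standing in for real sensor and actuator interfaces."""
    try:
        temp = read_temp()
        if not (-40.0 <= temp <= 200.0):  # implausible reading: treat as a fault
            raise ValueError(f"sensor reading out of range: {temp}")
        if temp >= limit:
            set_power(0.0)                # over the limit: shut down
        else:
            set_power(min(1.0, (limit - temp) / limit))
    except Exception:
        set_power(0.0)                    # any fault defaults to the safe state
        raise
```

Consistent with the caveat in the entry, this only covers the failure modes it anticipates; it does nothing about unanticipated interactions between otherwise-normal components.
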
H
- hidden modes
- system states or automation conditions that are active but not easily visible or legible to operators.
I
- ICS (Industrial Control Systems)
- cyber-physical systems that monitor and control industrial processes in critical infrastructure (energy, water, manufacturing); tightly coupled and increasingly complex as IT/OT converge.
- incident escalation
- the process by which minor failures, deviations, or anomalies combine and amplify into more serious events; particularly dangerous in tightly coupled systems where response windows narrow rapidly.
- incident response
- organizational capability to detect, interpret, contain, and recover from system failures or security events; effectiveness depends on traceability, clarity of system states, and ability to intervene without full understanding.
- information flows
- the movement of data, signals, and stories through a system; the lifeblood of adaptation and change.
- inside-out participation
- engaging as both observer and participant within systems, listening as much as acting.
- interactive complexity
- a property of systems where components interact in unexpected, non-linear ways outside normal or intended sequences.
- interdependence
- mutual reliance among parts of a system.
L
- last-link mindset
- a failure analysis approach that locates responsibility at the final human or technical action closest to the accident.
- learning vs. control
- favoring iterative improvement and openness over rigid command-and-control structures.
- leverage points
- places in a system where small shifts yield large effects.
- listening systems
- systems designed to hear themselves, through active, diverse feedback that informs adaptation, adjustment, change, and renewal.
M
- map vs. territory
- a warning that models and metrics can colonize reality when they are mistaken for it.
- metrics vs. meaning
- the risk that quantitative indicators substitute for substantive judgment or values.
- moral humility
- recognition of one's limits, interdependence, and participation when facing moral decisions.
N
- nonlinear causality
- cause-and-effect relationships that do not unfold in simple, straight chronological lines but are instead indirect, delayed, or emergent.
- normal accident
- an accident that is structurally inevitable in certain systems due to their interactive complexity and tight coupling.
O
- optimization loops
- reinforcing feedback that privileges efficiency and stability over adaptability.
- oscillation
- movement back and forth between states or poles.
- ossification
- the process by which ideals harden into dogma, narrowing discourse and choking feedback.
- ownership of feedback loops
- the question of who controls how information circulates and which voices are encoded in systems.
P
- paradigm
- the underlying worldview or set of assumptions from which a system’s goals, rules, and feedback loops arise.
- paradigm shift
- transformation at the level of the underlying worldview or set of assumptions that reconfigures the whole system’s behavior.
R
- redundancy
- duplication of critical components, processes, or data to maintain function if one fails; effective for component failures but less so for system accidents where multiple redundant elements can fail simultaneously through unexpected interactions.
- reflection
- the capacity to examine one’s own behavior and assumptions, beyond reactive control.
- reinforcing (positive) feedback loops
- loops that amplify change (e.g., wealth begets power, power begets wealth); sketched alongside balancing loops under B.
- resilience
- a system’s capacity to absorb shocks, adapt, and keep functioning; lost when feedback is suppressed or distorted.
- response window
- the limited time available for operators to detect, interpret, and intervene before escalation becomes irreversible.
- rollback
- the ability to reverse system changes, configurations, or transactions to a previously known acceptable state; provides intervention capability when full understanding is unavailable during incidents (a minimal sketch follows this section).
- root cause analysis
- investigation method that seeks to identify the fundamental reason for a failure; runs the risk of defaulting to last-link thinking in complex systems where multiple interacting factors combine to produce accidents.
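
A minimal sketch of the rollback entry above: checkpoint, apply, validate, revert; the helper's name and signature are illustrative assumptions:

```python
import copy

def apply_with_rollback(state: dict, change, validate) -> dict:
    """Illustrative helper: checkpoint the current state, apply a change,
    and restore the checkpoint if the change fails or fails validation."""
    checkpoint = copy.deepcopy(state)  # known acceptable state
    try:
        new_state = change(state)
        if not validate(new_state):
            raise ValueError("post-change validation failed")
        return new_state
    except Exception:
        return checkpoint              # revert without diagnosing the fault

config = {"timeout_s": 5}
config = apply_with_rollback(
    config,
    change=lambda c: {**c, "timeout_s": -1},  # a bad change
    validate=lambda c: c["timeout_s"] > 0,
)
assert config == {"timeout_s": 5}             # reverted to the checkpoint
```

Reverting to a known-good state is exactly the kind of intervention the entry describes: it contains the fault even when its cause is not yet understood.
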
S
- socio-technical system
- a system encompassing both technical components (machines, software, infrastructure) and social elements (people, organizations, practices).
- surveillance
- data collection and monitoring that can tighten control loops and skew incentives.
- system
- a pattern of relationships, not a mere collection of parts; a dynamic whole shaped by stocks, interactions, flows, and feedback loops.
- system accident
- a failure arising from the unanticipated interaction of multiple normal or minor failures within a tightly coupled, complex system.
- system boundaries
- conceptual lines used for modeling; incomplete and permeable because systems interpenetrate.
- system health
- the quality of a system’s functioning, indicated by resilience, openness to feedback, adaptability, and learning.
T
- tight coupling
- a condition in which system components are linked with little or no slack, buffer, or delay, causing effects to propagate rapidly (contrasted with buffered slack in the sketch at the end of this glossary).
- traceability
- the ability to reconstruct system states, decisions, data, and changes across time and organizational boundaries.
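
As a closing illustration of tight coupling versus buffers/slack (see the entries above and under B), a single-threaded Python sketch in which a queue plus a bounded retry budget stand in for slack between stages; the function names and retry limit are illustrative assumptions:

```python
from collections import deque

# Tightly coupled: each stage calls the next directly, so a consumer
# failure halts the producer on the very same step.
def run_tight(items, consume) -> None:
    for item in items:
        consume(item)  # no slack: effects propagate immediately

# Buffered: the queue and a bounded retry budget act as slack; transient
# consumer failures cost retries rather than an immediate outage.
def run_buffered(items, consume, max_retries: int = 3) -> None:
    pending = deque(items)
    retries = 0
    while pending:
        try:
            consume(pending[0])
        except Exception:
            retries += 1
            if retries > max_retries:
                raise      # slack exhausted: the failure propagates after all
            continue
        pending.popleft()
        retries = 0        # each success restores the retry budget
```
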