This article, originally presented to the Cross-Border Data Forum, expands upon arguments first set forth by the authors in “Flat Light: Data Protection for the Disoriented, From Policy to Practice,” The Hoover Institution, November 20, 2018.
Flat light is the state of disorientation, feared among pilots, in which all visual references are lost. The effects of flat light “completely obscure features of the terrain, creating an inability to distinguish distances and closure rates. As a result of this reflected light, [flat light] can give pilots the illusion that they are ascending or descending when they may actually be flying level.”
This is the state of information security today.
Four years ago came Heartbleed, a common-mode failure among products that relied on a shared implementation of a networking standard (the OpenSSL library), products that were vulnerable to attack by way of that shared dependency itself. Early in the discussion of it here on Lawfare, this paragraph appeared:
Science tends to take us places where policy cannot follow. Policy tends to take us places where science cannot follow. Yet neither science nor policy can be unmindful of the other. Here I will confine myself to six points where I see science, including applied science, asking us to look ahead (The following is necessarily short; for a longer treatment of the science of security, per se, see "T.S.
I begin with a paragraph from Wikipedia:
Self-organized criticality is one of a number of important discoveries made in statistical physics and related fields over the latter half of the 20th century, discoveries which relate particularly to the study of complexity in nature. For example, the study of cellular automata, from the early discoveries of Stanislaw Ulam and John von Neumann through to John Conway’s Game of Life and the extensive work of Stephen Wolfram, made it clear that complexity could be generated
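The canonical illustration of self-organized criticality is the Bak-Tang-Wiesenfeld sandpile: grains are dropped one at a time, and any cell that accumulates four or more grains "topples," shedding one grain to each neighbor, sometimes triggering cascading avalanches. The following is a minimal sketch of that model, not drawn from the works cited above; the function names and parameters are illustrative choices.

```python
import random

def topple(grid, n, threshold=4):
    """Relax the grid: any cell holding `threshold` or more grains
    sheds one grain to each of its four neighbors (grains that fall
    off the edge are lost). Repeats until every cell is stable."""
    unstable = True
    while unstable:
        unstable = False
        for i in range(n):
            for j in range(n):
                if grid[i][j] >= threshold:
                    grid[i][j] -= threshold
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < n and 0 <= nj < n:
                            grid[ni][nj] += 1
                    unstable = True

def drop_grains(n=20, grains=2000, seed=42):
    """Drop `grains` grains onto random cells one at a time,
    relaxing the pile after each drop. Returns the relaxed grid."""
    random.seed(seed)
    grid = [[0] * n for _ in range(n)]
    for _ in range(grains):
        i, j = random.randrange(n), random.randrange(n)
        grid[i][j] += 1
        topple(grid, n)
    return grid
```

The point of the model is that no parameter is tuned toward a critical state: the simple local rule alone drives the pile to a configuration in which one more grain may do nothing, or may trigger an avalanche of any size, which is the sense in which complexity emerges from simple local interactions.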