Last updated on November 13, 2014.

The book "Robust Control System Networks" by Ralph Langner, the well-known German Stuxnet analyst, should be required reading for every ICT security professional: first, to develop a better understanding of the industrial control system environment, and second, because it also offers numerous lines of thought for a sustainable ICT and cyber environment.

Some of its statements are reproduced here in excerpts:

So, rather than doing more of the same, it might be a better idea to look for alternatives: alternative concepts and methods that do not necessarily replace the existing school of thought, which largely borrows from IT security, but that complement it.

Insufficient control system reliability can and will be caused by unstructured growth of system complexity – even without the presence of adversaries such as hackers, crackers, or malware.

Nevertheless, the driving factor for this development is not an increased presence of cyber threats, it is growing system complexity, or, to be more specific, growing entropy.

According to the second law of thermodynamics, the more entropy, the more effort is required to maintain order.

The statistical probability that a fragile system will function (its reliability) is disproportional to time: It decreases exponentially.
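One standard way to make this concrete (a reliability-engineering convention, not a formula given in the book) is the exponential reliability function with a constant failure rate \lambda:

    R(t) = e^{-\lambda t}, \qquad t_{1/2} = \frac{\ln 2}{\lambda}

Read this way, growing complexity acts as a growing failure rate \lambda, which shortens the half-life t_{1/2} after which only half of such systems are still expected to function.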

This once again establishes the connection to complexity and to Black Swans.

The logical model of risk has more often been abused to argue risk away rather than to plan and implement mitigation strategies.

Instead of high risk and low risk, the terms fragility and robustness will be used. → cyber fragility and cyber robustness; it is actually no longer necessary to use the term risk at all. The benefit of talking less about risk (and more about robustness, reliability, and maintainability) is that decision makers will argue less that the whole problem might be only hypothetical. While risk and security are hypothetical, fragility and robustness are factual.

Robustness is determined by negative testing. A system that is only tested positively may be functional but fragile. This observation extends beyond control logic. Systems that have become so complex that they are not fully testable any more because of combinatorial explosion can hardly be robust.
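A minimal sketch of the combinatorial-explosion argument, with illustrative function names that are not from the book: with n independent binary options, exhaustive testing requires 2^n cases, and only negative tests reveal how a system behaves outside its specification.

    # With n independent binary options (features, flags, interfaces),
    # exhaustive testing needs 2**n cases: combinatorial explosion.
    for n in (10, 20, 40, 64):
        print(f"{n} binary options -> {2**n:,} configurations")

    # Positive test: confirms specified behavior on expected input.
    def divide(a: float, b: float) -> float:
        return a / b

    assert divide(10, 2) == 5

    # Negative test: probes behavior on invalid input. A system that is
    # only tested positively may work, yet fail unpredictably off-spec.
    try:
        divide(10, 0)
    except ZeroDivisionError:
        pass  # the failure mode is known and contained
    else:
        raise AssertionError("no defined failure mode for invalid input")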

The most prominent indication of a fragile system or installation is inadequate system understanding, evidenced by lack of documentation.

Robustness is not about using specific technology; it is about the proper use of technology.

A system that is not fully understood cannot be reliable and poses a major problem for maintenance.

The additional options, or delta, that are not required provide for fragility without adding value. If this delta is reduced, ideally to zero, then robustness is increased without decreasing a system's value.

Every software application, service, and interface, and the network connectivity associated with it, introduces a new degree of freedom to the system, and therefore adds potential for fragility.
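A hedged illustration of this delta (the service names below are hypothetical, not taken from the book): every capability that is enabled but not required is a degree of freedom that can fail or be misused.

    # Capabilities the installation actually requires:
    required = {"modbus", "historian", "engineering-station"}

    # Capabilities actually enabled on the system:
    enabled = {"modbus", "historian", "engineering-station",
               "web-admin", "smb", "remote-desktop", "telnet"}

    # The delta: enabled but unrequired options, i.e. fragility without
    # value. Robustification means driving this set toward empty.
    delta = enabled - required
    print(f"delta = {sorted(delta)}")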

Human users present one of the biggest challenges for system reliability, because the bandwidth of human behavior when interacting with cyber systems is extreme – from beneficial troubleshooting to intentional malicious attacks.

The benefit of reducing network exposure cannot be overstated. In essence, it is twofold: First, reduced network exposure reduces the number of potential sources of problems. Second, it reduces the number of affected systems, and therefore the potential for damage.
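One rough way to quantify the twofold benefit, as a sketch under an assumed metric (not Langner's): count the reachable client/service pairs in the network; segmentation shrinks both the set of potential problem sources and the set of systems affected by a failure.

    # Illustrative exposure metric (an assumption): the number of
    # client/service pairs that can reach each other on the network.
    def exposure(clients: int, services: int) -> int:
        return clients * services

    flat = exposure(clients=500, services=40)   # everything reachable
    cell = exposure(clients=12, services=4)     # segmented control cell

    print(f"flat network: {flat} reachable pairs, segmented: {cell}")
    # Fewer reachable pairs means fewer potential problem sources
    # (first benefit) and fewer affected systems when something does
    # go wrong (second benefit).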

Many more problems have been caused in IACS environments by deliberate change than by dramatic, malicious cyber attacks.

There may be individuals, groups, or departments with a good understanding of parts of the system; however, experience shows that such groups – for example, networking and operations – don’t communicate well with each other. Furthermore, these systems may be more complex than they have to be, and more fragile than should be tolerable.

One of the worst characteristics of cyber is that, unlike mechanics, there is no predictable degradation. In mechanics, problems usually build up gradually over time. Not so in cyber. Many cyber effects are not obvious and catch operators by surprise. The surprise may even exacerbate technical problems, especially when operators misinterpret system behavior based on the – inaccurate – model they have in mind. It then becomes obvious that the notion of having full control over a process was illusionary, even if the illusion worked for many years. Hidden, unanticipated cyber effects that suddenly pop up and surprise operators and maintenance engineers challenge the operators' ability to operate the process reliably and safely. Surprise is proof of insufficient predictability.

While a robust mechanical design often can be identified even by laypeople (a metal cover is used instead of plastic, thick wiring is used, etc.), cyber robustness is invisible in most of its aspects.

Fragile systems may function flawlessly for many years, but do so only as long as several undocumented (and perhaps unknown) parameters are met.

Cyber robustness and cyber security may be viewed as complementary paradigms, addressing similar, sometimes even identical problems. Many cyber security incidents may be attributed to cyber fragility, and increasing cyber robustness will reduce risk. The fundamental difference is that the security approach teaches to be careful, whereas the robustness approach teaches to be strong.

Fragility should be seen as a problem in itself. Therefore, fragility should be minimized even without being able to specify the potential causes of variation and change.

Robustification is not about defense and mitigation. It is not primarily against anything. The logic of the “against”, which implicitly assumes some form of external antagonist, is out of scope for a robustness perspective.

For fragile systems, determining external (specific/assignable) cause of failure, or hypothesizing about potential external cause of failure is misleading, because potential causes for failure are all around in the typical environment of the target system and may occur just randomly. The ultimate cause of problems is not external; it is internal.

These are statements that deserve full endorsement. Especially given the current drive to create ever more complexity (buzzwords such as "Smart", "Internet of Things", "Systems of Systems"), these lines of thought should be held up against it.