White swans: Along with ducks spray-painted black

When I received “The Black Swan: The Impact of the Highly Improbable” by Nassim Nicholas Taleb (Random House, New York, 2007), I was intrigued by the promise of a view into the world of high finance. In the interest of saving you time, this review looks at how to apply Taleb’s ideas and techniques to cyber warfare, along with my perceptions of the book. Taleb has some very specific thoughts on academics in general, practitioners of particular trades, and statisticians specifically. Not many people will be amazed to find me gravitating to a thought process or book that pulls the short hairs of the statisticians with a firm yank every page or two.

Nassim Taleb describes himself as a skeptical empiricist and voracious reader, which, along with his career, gives him an interesting perspective on science and more specifically on calculating risk. Or, as he might be more willing to say, on failing to calculate risk correctly. He starts out with extensive anecdotes about risk and reward. As an example, he asks how recording technology has disenfranchised the musician of today (page 29), and considers the secondary or even tertiary effects of such a change on the economic landscape and thereby globalization.

To understand effects Taleb describes “mediocristan” and “extremistan” and carries those through the book as his two landscapes of discussion (page 35). Basically, the elements of the bell curve and standard deviation are found in mediocristan, while outliers live in extremistan. As my research and discipline wade around all day long in extremistan, I was instantly interested. Cyber warfare, like most conflict, resides in extremistan. In technology and information assurance and security we look at tightly coupled traffic flows and use cases. Very much bell-curve oriented, but when something bad occurs traffic can drop to zero (an extreme) or saturate the system (as in a distributed denial of service). The normal peaks and ebbs are of little interest. It is the extremes that get my attention.
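To make the distinction concrete, here is a minimal sketch of my own (not from the book) contrasting the two landscapes with synthetic traffic data: Gaussian request rates stand in for mediocristan, heavy-tailed Pareto rates for extremistan. The rate of 1000 requests and the tail exponent are illustrative assumptions only.

```python
import random

random.seed(42)

# Mediocristan: request rates cluster around a mean; no one sample dominates.
mediocre = [random.gauss(1000, 50) for _ in range(10_000)]

# Extremistan: heavy-tailed (Pareto) rates; a rare sample can dwarf the rest.
extreme = [1000 * random.paretovariate(1.1) for _ in range(10_000)]

def share_of_max(xs):
    """Fraction of the total contributed by the single largest observation."""
    return max(xs) / sum(xs)

print(f"mediocristan: max sample is {share_of_max(mediocre):.4%} of the total")
print(f"extremistan:  max sample is {share_of_max(extreme):.4%} of the total")
```

In the Gaussian world the biggest single observation is a rounding error on the total; in the heavy-tailed world one observation can be a meaningful chunk of everything ever seen, which is exactly why averages mislead there.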

There is an extensive discussion thread through the chapters on missing evidence and on inductive versus deductive logic. The survivors tell the story while the victims of the shipwreck are silent (page 100). This silent evidence has a tendency to be overlooked. Leaping from “I have no evidence of ghosts” to “there are no ghosts” feels logical but is not, and making that leap is a common human trait. This narrative fallacy has a tendency to hide reality and camouflage what Taleb calls Black Swans.

This exact issue of missing evidence fills the computing systems of the world and is why information assurance and security cannot get traction in the minds of business. The reason we do not do computer security well can be found in the minds of business professionals suckled on the bell curve as a business tool. I can never prove the absence of bugs in software; I can only prove the ones I have found and, sometimes, that I fixed them. The absence of knowledge regarding the totality of the environment is not knowing what I don’t know, a trait missing from many or covered over by the enthusiasm to embrace the narcissism of falsely perceived knowledge. I will never say a computer system is secure, and never that a network is secure; I can only assert that I have used best practices and will monitor for new methods of exploitation.

When meeting with military and security people there is an obvious shared camaraderie: as Taleb finds in “Lunch at Lake Como” (page 125), those who deal with life-and-death risk are aware at a visceral level of the risk quotients. Whereas bankers and businessmen have a tendency to categorize security and apply projections (projections themselves being an issue), the security professional builds processes that are flexible and defensible. The unknown unknown (page 127) is a factor the security professional is willing to admit. Though Taleb does not state it as such, this axiom is found in the military cliché “No plan survives contact with the enemy.”

Consider the heterogeneous versus homogeneous network security paradigm. In other words, do you have all the same systems, so that one exploit affects them all the moment it is created, or do you have many different systems, so that the likelihood of some exploit being found increases but only some systems will be affected? In business or government the front-end purchase of servers and desktop systems is likely the result of much negotiation. The far-ranging effect (the outlier) is not considered. If the entity can afford to operate without computing systems for some long period of time, it likely does not matter. If the entity must operate through any kind of vulnerability and exploit, then the only choice is high-level, wide-ranging heterogeneity in the systems. That likely won’t come up in the vendor negotiations.
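The trade-off can be put in back-of-the-envelope terms (my own toy model, not Taleb’s): assume each platform independently ships with an exploitable flaw with some probability, and an exploit takes down every machine running that platform. The expected loss is the same either way; what heterogeneity buys you is a cap on the worst single-exploit outcome, which is the Black Swan that matters.

```python
def fleet_risk(p_exploit: float, platforms: int):
    """Toy model: each of `platforms` platforms independently has an
    exploitable flaw with probability p_exploit, and an exploit takes
    down every machine on its platform (fleet split evenly).
    Returns (expected fraction of fleet lost, worst-case fraction lost
    to any single exploit)."""
    # Each machine's platform is exploited with probability p_exploit,
    # so the expected fraction lost is p_exploit regardless of platforms.
    expected = p_exploit
    # Worst case for one exploit: homogeneous loses everything,
    # heterogeneous loses only one platform's share of the fleet.
    worst_single_exploit = 1.0 / platforms
    return expected, worst_single_exploit

for k in (1, 2, 5):
    exp_loss, worst = fleet_risk(0.1, k)
    print(f"{k} platform(s): expected loss {exp_loss:.0%}, "
          f"a single exploit can take down {worst:.0%} of the fleet")
```

The averages (what the bell-curve-minded negotiator sees) are identical; only the tail differs, which is precisely the quantity the purchasing process ignores.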

The window of purchasing risk is tightly constrained by the perceived risks and incentives. The consideration of catastrophic exploitation, with its cascading failures and costs, is ignored for cost savings that are minimal to negligible next to the “Black Swan” event. I am sure there is some pithy homily to be inserted here, but the reality is that those who push for homogeneous networks and systems acquisitions should not be in the decision loop. There is no “fighting through the attack” when all you have is infantry and they have artillery and armor.

This inability to predict left me wondering for a few pages how it would be addressed. Popper (as in Karl) rises from the page to discuss the limitations of historical and soft “science” that rely on the narrative rather than the empirical (page 171) as predictive. Creating empiricism from evidence in a narrative would seem to be dark voodoo to Taleb. Of course it was on page 171 that Taleb threw out the challenge that he wanted to see a well-thought-out consideration of his work titled “The White Swan,” based on the insults traded between Algazel and Averroes. I’ll get to my pans of his work later, but now you know where my title came from.

No field of study is beyond the withering pen of Taleb. In his mind the tenured academician feels that he owns the knowledge. I would say that in eastern elitist institutions with ivy and withering intellectualism that may be true. In the middle of America, where the yeoman work of research is done, the hidden and less-noticed academics study, and knowledge is freely shared. Though the university itself looks at the intellectual capital of the mavens it finds as monetary patents in waiting, the intellectual still shares. In the instantiation of mid-western agronomic values there is a certain feeling of civic duty to ensure purity, and failing that, perspiration is inherent in the learning task. To be honest, it is harder to publish, get grants, or achieve notoriety or even acknowledgement when you work at a university in the mid-west. We get called blue-collar universities where we “do stuff” rather than real research. Which brings me to the other point. Those of us who study like crazy to accomplish something, only to be denigrated by the word Nerd, have kind of come to adopt the term as our own. Using it as a derogatory term, as Taleb does in various places, challenges his own assertion that doing is better than thinking. One nose poke to Taleb for forgetting that many “Black Swans” are the work of Nerds.

I was constantly wondering about positive “Black Swan” events: improbable, with high impact, but not detrimental. Some examples, though minimal, are publishing and venture capital (page 207). In the computer security world positive “Black Swans” are few and far between. Some of the ones I came up with when thinking about this section of the book were firewalls, Linux, and wireless networking. Why wireless networking, which is notoriously bad for a security posture? Because it drove development of end-to-end secure communication. Taleb says there is a tendency for those who admit that “Black Swans” exist to stop attempting to predict and to invest in preparedness instead, as the military does (page 208). That preparedness over prediction could end up being a positive “Black Swan.” There could likely be an entire academic conference trying to find positive “Black Swans,” especially in information technology.

For examples of bad “Black Swans,” using financial examples, we get warned not to trust government: governments attempt to predict using the Gaussian bell curve, are about perpetuating their own existence, and when mixed with business can create misunderstandings of the risks (pages 209-210), all of it digested through narrative. Not exactly an empirical or logical basis, no matter what they call it. This opens the topic of asymmetry (page 210). Unfortunately Taleb doesn’t really get the totality of this expression of asymmetrical abstraction. He falls into the trap of narrative that he so often warns the reader about. He needed only look at the concept of conflict to find better explanations.

The views brought out on globalization ring true. The fragility of the differing economic systems and the absolute lack of resiliency in world trade likely can create the devastating Black Swans predicted (page 225). Then, presumably in hopes of not making Nassim Taleb mad, he predicts:

“We have never lived before under the threat of global collapse. Financial institutions have been merging into a smaller number of very large banks. Almost all banks are now interrelated. So the financial ecology is swelling into gigantic incestuous, bureaucratic banks (often Gaussianized in their risk measurement) when one falls they all fall. The increased concentration among banks seems to have the effect of making financial crisis less likely, but when they happen they are more global in scale and hit us very hard” (pages 225-226).

Well, among other similarly reasoned arguments dealing specifically with finance, Taleb years ago (according to his other book) seems to have reasoned out the threat we are going through now. I realize that evidence after the fact could just mean even a blind squirrel can find a nut, but the arguments put forward were based on several large-scale events in the past. The reality is that systemic pressures have been visible in our current crisis for a long time. Though some would wish it, I’m going to stay away from high finance, which I don’t necessarily understand (I know what I don’t know about that, as the set of what I do know is so small).

The book at the end spends quite a bit of time talking about economic theory, but I want to jump back to where the author discusses triangles. In computer and information technology the little becomes big (one packet, called a ping of death, can kill a server). There is a beautiful allegory to the financial system through the divination and mathematical prowess of Mandelbrot (page 257). Fractal theory is how computers work. Though computer graphics is used as an example, the foundation of computer systems is very much fractal. The fractal starts as something and maintains its geometry (go with this: substance) relative to scale. So the pebble resembles the boulder resembles the mountain.
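The pebble-boulder-mountain idea can be sketched with the classic Koch curve, a standard fractal example of my choosing (not one from the book), where the same geometry recurs at every scale: each refinement replaces every segment with four segments a third as long.

```python
import math

def koch_measurements(iterations: int):
    """Measure a Koch curve at successively finer rulers.
    Each refinement multiplies the segment count by 4 and shrinks
    the ruler by 3. Returns (segment count, ruler length, total length)
    for each iteration."""
    segments, scale = 1, 1.0
    out = []
    for _ in range(iterations + 1):
        out.append((segments, scale, segments * scale))
        segments *= 4
        scale /= 3
    return out

for n, (count, ruler, length) in enumerate(koch_measurements(5)):
    print(f"iter {n}: {count:5d} segments of {ruler:.4f} -> total length {length:.3f}")

# The scaling law (4 pieces at 1/3 scale) gives the fractal dimension:
print("fractal dimension:", math.log(4) / math.log(3))
```

The measured length grows without bound as the ruler shrinks, yet the shape at every zoom level is the same; the only invariant is the scaling exponent, Mandelbrot’s coastline observation in miniature.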

A packet as a representation (abstraction) of data becomes the representation in memory, becomes the picture or data on the screen. The changes are primarily in scale rather than qualitative difference. I expect many to cry foul at that, but the substance of the packet is the same as the signal on the network that presents the abstraction on the screen. The display is a human process; the computer is no longer involved. That leads to some interesting thoughts on how heuristics could be used and adapted to fit the security task, but that is a discussion for another time.

There is an interesting piece of advice that is shared near the end of the book:

“I don’t run for trains” (page 297)

I did much of what the author talks about. I quit my high-paying job, I looked at life goals, and I considered my children before my career. Being a tenured professor will never pay as well as being an information technology/security “fixer” for major corporations. Being at a small regional university attached to an R1 Midwestern engineering powerhouse means I will never have the reputation that others might achieve. Then again, I never engage in something so pitiful as running for a train.

I wanted to mention that Taleb says book reviewers borrow from each other. Though other reviews have already hit the street, I have not read any of them. I don’t read book reviews. I review books that I am reading, or that have been sent to me and that interest me. Much like Signore Umberto Eco’s library, mentioned in Taleb’s introduction, there is so much to be found in the books that are unread. The purpose of my reviews is not to gain acclamation but to write down my perceptions of books I’ve read, so that I can own that knowledge/perception and go on to the next.
