Bracing against the wind  

Monday, July 19, 2004

Distributing Risk

Is this house a good deal, or am I going to lose money on it? Should I get a new job, or is it too risky? Analysis of risk is essential to our daily lives. When we buy a car seat for our baby, or read the daily news, we are assessing risk. In fact, risk assessment can be seen as a model for a large portion of our thought processes. Our brains are constantly assigning probabilities of failure, and utilities to the possible outcomes, of every event and action. This process goes on naturally, without our conscious mind ever being aware of it.

Although the basics of risk theory could be taught to high school students, many of us go through life without any training in risk analysis.

Any introduction to classic risk analysis quickly reveals some of its problems: it makes many assumptions and seems unnecessarily complex. Perhaps this is why people instinctively lose interest in it. Or perhaps it is that we know, instinctively, that our experience alone cannot account for all of the factors influencing the outcome of a given event. We also know from experience that when we consciously analyze things, our performance tends to worsen in the short term, not improve.

But what if a very small, unknown piece of information can cause the decision you make to be wrong? What if you buy a house and everything seems perfect, but something unexpected causes the house to become valueless? Isn't it worth our effort to consciously understand risk?

Mathematics is riddled with equations whose behavior is unpredictable. The study of such unpredictability is known as chaos theory. We've all seen pictures of bridges whipping about wildly in the wind and snapping apart. And we've heard the doomsday predictions of Jeff Goldblum in Jurassic Park.

It is not the stuff we know that causes upsets in our lives, nor is it the stuff we know that we don't know. We don't know whether there are termites in a house, but we can buy termite insurance to account for that. It's the stuff that is entirely unaccounted for in our own, unconscious, risk models of our environments that causes breakdowns.

How many "unknown risk factors" are there? The answer is obvious. It's unknown. Most probably, it's infinite.

Can you account for this deep level of unknown? Can you take actions in accordance with principles that will impact risk in this area of chaos and uncertainty?

Suppose you are assembling a secure communications device. You could hire one company to do it, but if one of their employees is a spy, then you've lost the security of the device. So, instead, you hire three companies to build it. Each company builds its own encryption algorithm, and the final device is designed so that all three layers must be broken for communication to be compromised. If each company independently harbors a spy with probability p, you have roughly tripled the likelihood of encountering a spy (from p to about 3p). But you have also reduced the odds that communication will be compromised from p to p^3.

We don't actually know the likelihood of there being a spy. In fact, assuming we hire experts who know more than we do, we don't know much about the algorithms used by the three companies or whether they're secure at all. We believe we've reduced our risk by some unknown factor (p - p^3), so we pat ourselves on the back.
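As a sketch of the arithmetic (assuming, hypothetically, that each company independently harbors a spy with probability p, and that the device is compromised only when all three layers are broken):

```python
# Hypothetical numbers: p is the per-company probability of a spy.
# These values are illustrative, not from the original post.

def single_vendor_risk(p):
    # One company builds everything: one spy compromises the device.
    return p

def three_vendor_risk(p):
    # Three independent layers, all of which must be broken.
    return p ** 3

def spy_encounter_odds(p):
    # Chance of hiring at least one spy across three companies;
    # for small p this is approximately 3p.
    return 1 - (1 - p) ** 3

p = 0.01  # hypothetical
print("one vendor, compromise risk:  ", single_vendor_risk(p))
print("three vendors, compromise risk:", three_vendor_risk(p))
print("three vendors, spy encountered:", spy_encounter_odds(p))
```

With p = 0.01, the chance of encountering a spy rises to about 0.0297 (near 3p), while the chance of a full compromise drops to p^3 = one in a million.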

Conclusion: Hierarchically organizing and distributing tasks reduces risk emanating from the edge of the hierarchy.

Suppose you are operating a hosting business. You hand out two servers to each customer, so that if one server goes down, the other takes its place. You've got ten servers named [a-j]. Do you hand them out in fixed pairs: a/b, c/d, e/f, g/h, i/j? Or do you hand them out in any combination: a/b, a/c, b/e, etc.? There are 45 combinations of two servers to hand out. Suppose two random servers go down. Under the pairs system, there's a good chance nobody will notice. Under the random system, some customers will always have a problem. Statistically, over time, the two systems produce the same total downtime.

So, if the expected downtime is guaranteed to be equal, which is the better system?

In the "pairs" system, your outages will be much rarer, but each one will affect a large share of your customers at once. In the random system, outages will happen much more often, but will affect far fewer customers each time. The more distributed system, even though it doesn't improve the expected amount of downtime, is the better system. Small outages can be dealt with by a smaller customer service staff. If you lose 1% of your customers one day over a small outage, you'll survive. But if you lose 10% of your customers in one day, the resulting press and company reorganization could trigger a collapse of the whole firm.
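The trade-off above can be checked with a small Monte Carlo sketch (assuming, hypothetically, ten servers, customers spread evenly over each scheme's server pairs, and two random servers failing per bad day; a customer has an outage only when both of their servers are down):

```python
import itertools
import random

servers = list("abcdefghij")

# Pairs scheme: five fixed pairs, customers spread evenly across them.
pairs_scheme = [("a", "b"), ("c", "d"), ("e", "f"), ("g", "h"), ("i", "j")]

# Random scheme: customers spread across all C(10,2) = 45 combinations.
random_scheme = list(itertools.combinations(servers, 2))

def outage_fraction(scheme, failed):
    # Fraction of customers whose two servers are both down.
    down = sum(1 for pair in scheme if set(pair) <= failed)
    return down / len(scheme)

random.seed(1)
trials = 100_000
pairs_total = random_total = 0.0   # accumulated customer downtime
pairs_days = random_days = 0       # days with at least one outage
for _ in range(trials):
    failed = set(random.sample(servers, 2))
    f_pairs = outage_fraction(pairs_scheme, failed)
    f_random = outage_fraction(random_scheme, failed)
    pairs_total += f_pairs
    random_total += f_random
    pairs_days += f_pairs > 0
    random_days += f_random > 0

print("avg customers down: pairs", pairs_total / trials,
      "random", random_total / trials)
print("days with an outage: pairs", pairs_days / trials,
      "random", random_days / trials)
```

Both schemes average the same customer downtime (1/45 of customers per bad day), but the pairs scheme concentrates it: outages strike only about one bad day in nine, yet each one takes down 20% of customers, while the random scheme loses 1/45 of customers every bad day.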

Conclusion: Reduced organization and decentralization works to smooth out the bumps and prevent the possibility of a systemic collapse.

In the first scenario, we saw that a central organizing force reduced risk by organizing a project. In the second scenario, risk did not decrease, but we did decrease the likelihood of a systemic collapse. Both tasks are essential. On one hand, a system cannot survive if it isn't organized. On the other hand, an organizer can purposefully "disorganize" as much as possible to reduce the possibility of a system-wide breakdown.




(C) 2002 Erik Aronesty/DocumentRoot.Com. Right to copy, without attribution, is given freely to anyone for any reason.
