TMC – Too Much Connection

You ever wonder about the first person who bought a fax machine? The first person to buy a cell phone? How about the first person who set up an email account? These early pioneers must have had some difficulty demonstrating the benefits of these technologies because of the very small base of users. It wasn’t until a critical mass of users adopted them that the fax, cell phone, email, and other networked technologies demonstrated their true value. This is the network effect – the more people who use a networked technology, the more valuable it becomes.
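The intuition that value grows faster than the user count is often formalized as Metcalfe's law, which counts the possible pairwise connections in a network. This is my illustration, not Davidow's framing, but it makes the critical-mass point concrete:

```python
def network_value(n: int) -> int:
    """Metcalfe's law: the number of distinct pairwise connections
    among n users, n * (n - 1) / 2 — a rough proxy for network value."""
    return n * (n - 1) // 2

# A lone fax machine can reach no one; the value only takes off
# once a critical mass of users has adopted the technology.
for users in (1, 2, 10, 100, 1000):
    print(f"{users:>5} users -> {network_value(users):>7} possible connections")
```

Doubling the user base roughly quadruples the possible connections, which is why those early adopters had such a hard sell and why adoption accelerates once the base is large.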

I remember when I set up my Commodore 64 with a 300 baud modem to connect to my high school friend across the small town of Winchester, Kentucky so that we could type text messages to each other. I spent an entire weekend typing in machine code from RUN Magazine to create a bulletin board system (BBS). My parents didn’t understand why I went to all this trouble when I could have just called Steve, mailed him a letter, or just seen him at school the next day. Today, billions of texts and tweets are sent daily, all thanks to the network effect.

The Internet has fundamentally changed our world because it has helped us connect on a level never seen before in human history. I don’t believe we can go back to a time before we had the Internet because so much of our current economic and societal systems depend on this connectivity. And, according to William Davidow, we are realizing new dangers as we go from highly-connected to overconnected.

Davidow argues that as complex dynamic systems (such as economies) become more and more connected, they shift from stability to instability. There is a cultural lag as organizations’ and societies’ practices fall behind technological advances. Institutions begin to falter because they are not flexible enough to keep up with the rapid changes and increasing demands of more and more connections. We enter a vulnerability sequence as positive feedback from the connections leads to more specialization and network lock-ins.

Davidow gives numerous examples of the dangers of overconnection, such as Three Mile Island, the decline of the American steel industry, and the 2008 mortgage meltdown. His best example is a two-chapter examination of how Iceland’s attempt to be an Internet banking superpower led to the collapse of the Icelandic economy and government. Here we can see how positive feedback driven by the Internet led to riskier investments by Icelandic banks and citizens that made them very vulnerable to an external event – the collapse of Lehman Brothers. Thanks to a chain of connections from New York to London to Paris and so on, the ripple effects from Lehman Brothers’ collapse were magnified until the ripple became a tsunami that led to a massive devaluation of Iceland’s currency.

That is the secondary danger of overconnection – the magnification of small events into greater dangers. You may have heard of the black swan theory, in which Nassim Taleb describes events so improbable that they are hard to foresee yet have significant impact when they occur. Thanks to overconnection, we are subject to more black swan events, with their effects magnified by the positive feedback of overconnections.
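The magnification effect can be sketched with a toy positive-feedback chain (my illustration, not a model from the book): if each link in a chain of connections passes a shock along with a gain above 1, a small event grows at every hop; a gain below 1 acts as a buffer and the shock dies out.

```python
def propagate(shock: float, gain: float, hops: int) -> float:
    """Size of a shock after passing through `hops` links in a chain,
    each link multiplying the shock by `gain`."""
    for _ in range(hops):
        shock *= gain
    return shock

# The same small disturbance through 10 amplifying links (gain 1.5)
# versus 10 buffered links (gain 0.8).
amplified = propagate(1.0, 1.5, 10)  # the ripple becomes a tsunami
damped = propagate(1.0, 0.8, 10)     # buffers absorb the shock
print(f"amplified: {amplified:.2f}, damped: {damped:.2f}")
```

The gain values here are arbitrary, but the asymmetry is the point: tightly coupled chains amplify black swan events, while buffers (Davidow's first remedy below) dampen them.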

So what does this mean for government agencies? As agencies rush to increase social networking inside and outside of their organizations, they are in danger of becoming overconnected. Can the agency’s culture deal with the increasing demands of the connections? Is the agency flexible enough to deal with the unexpected events that will come from being more open to the world? Will management even realize when a black swan event has occurred?

To combat the effects of overconnection Davidow describes three things organizations must do:

  1. Provide buffers to mitigate the increasing positive feedback.
  2. Develop more robust systems that can better handle system accidents.
  3. Restructure organizations to be more effective and adaptable.

As we embrace Open Government, we must realize that increasing transparency, openness, and collaboration have great benefits but can also lead to major unintended consequences. We need to strike that delicate balance between highly-connected and overconnected by moving at a pace where we transform agencies into more effective and adaptable organizations without entering a vulnerability sequence.

Davidow, W.H. (2011). Overconnected: The promise and threat of the Internet. Harrison, NY: Delphinium Books.


John Bordeaux

This touches on two related fields: Wicked problems and resilience engineering. Suggest we expand the conversation beyond Davidow…

Rittel, H. W. J., & Webber, M. M. (1973). Dilemmas in a General Theory of Planning. Policy Sciences, 4, 155-169.

Hollnagel, E., Woods, D. D., & Leveson, N. (2006). Resilience Engineering: Concepts and Precepts. Surrey, UK: Ashgate Publishing.

Bill Brantley

@John – Agreed. To paraphrase the classic dilemma: what happens when the overconnected organization meets the wicked problem? This is what I envision:

1) The parts of the overconnected organization (OC) try to understand the wicked problem (WP) and communicate their perspectives to the rest of the organization. Soon, the OC is clogged with confusing messages and starts to shift toward instability.

2) Responses to the WP become erratic as control and coordination break down under the communication overload.

3) One of the OC’s responses looks promising, which triggers positive feedback, and the OC enters the vulnerability sequence.

4) The OC continues to hammer the WP with the increasingly specialized solution, which leads to increasingly random unintended consequences.

5) The WP morphs into a completely different problem that the OC can no longer understand, let alone manage.

6) The OC collapses, disintegrating under the weight of overspecialization.

7) A new OC attempts to resolve the new WP.

Thanks for the suggestions. I am seeing a fascinating link to complexity economics here.

Steve Richardson

I’m glad to see that I’m not the only one who sees risk and burden where some see only collaboration. I’m also not surprised to see John’s comment here; it is no coincidence that he wrote on complex systems a day or two earlier (Job-Killing Processes). BTW, John, thanks for the references. As I mentioned in my comment on his post, I just published a book on this subject (The Political Economy of Bureaucracy). My work focuses on relationships and the folly of organizing under an assumption of cooperation when conflict is more likely. Failure occurs not just because of territorial attitudes but because of the natural behavior of complex systems. As you point out, when everyone is connected, we all move together or not at all. Learning and adaptation are impossible under such conditions. We’re humans, not machines!