Civil unrest, regardless of cause, creates unexpected risks to lives and property. Predicting the timing and scale of these events would allow for better tactical management and a more effective training process. However, theoretical work by complex systems scientists and real-world experiences of first responders make a strong case that such forecasting may be impossible. Still, recent advances in understanding complex systems can provide profound contributions to the preparedness community.
Many researchers have come to focus on the concept that some natural and social systems are complex and adaptive and that these systems share universal properties regardless of their character. Complex adaptive systems include markets, national and local economies, political movements, ecosystems, revolutions and insurgencies, crowds, and civil unrest and riots. Several institutions have emerged in recent decades to study complexity – prominent examples include the Santa Fe Institute, the Center for the Study of Complex Systems at the University of Michigan, and the Center for Social Complexity at George Mason University. The patterns in complex systems are more than the sum of their parts. In fact, the pattern that emerges – a recession, a riot, or a revolution – cannot be understood or predicted in the traditional way: by breaking the system down into its individual parts.
Moreover, the adaptive character of individuals makes such systems impossible to predict. In traditional systems like a car, a factory, or a computer, past performance can be analyzed statistically to predict future behavior. Given the same inputs, the system (most of the time) produces the same output. The adaptivity in complex adaptive systems means that people learn, plan, anticipate, adjust, and react. Thus, when the same inputs, the same policies, the same interventions are applied, the system will not respond in the same way.
Expecting the Unexpected
For emergency preparedness planners, the most important implication – and one that any good police department understands – is the need to expect the unexpected, managing each situation in its own context. Emergency preparedness planners have to think more like stock traders than like airline pilots. Their “system” – a neighborhood of people with many backgrounds, cultures, and ages – is never going to be predictable. For that reason, it is probably impossible to create a “flight simulator” for a civil disturbance response situation. Constant adaptation also means that people’s attitudes toward the police change daily. The same event that triggered no public response in March might create intense, destructive protests in May, and vice versa.
There are many other situations in which public planners must operate without the ability to predict. The precise timing and location of large earthquakes, or the track of winter storms more than a few weeks out, are good examples. The precursors that do exist are unreliable, producing more false alarms than accurate warnings. As in these examples, propagating a false alarm about impending civil unrest – among the public or even only in official channels – can cause more harm than good. It creates “numbness” and an unwillingness to act decisively in the face of a real violent incident.
Strategic Planning Teams, Information Sources & Simulations
Planners still need to plan even if traditional prediction is not appropriate for the onset of civil unrest. One of the most effective tools is the inclusion of a broad, diverse range of thinkers in the strategic planning team. University of Michigan researcher Scott Page has generated an excellent body of thought that helps guide the creation of more adaptive and effective management teams – especially in the face of complex adaptive systems. If all members of a community engagement team share the same background, the same education, the same knowledge base, it may be beneficial to draw in people who think differently and who draw from different experiences.
Another key element in managing the unknown is the need for constantly evolving sources of information. Decision-makers often get “channel-locked” into looking at particular indicators that, while appropriate for a time, become ineffective sensors of the overall situation. A police department, for example, may have a process to monitor the spread of graffiti “tags” or specific segments of social media. Months later, however, this source of community knowledge goes stale: everybody realizes these things are monitored, and gangs and other threats to the community move to alternative channels. The only solution is a constant reanalysis of how the community moves information around, and an expectation that it will always change.
The third component that complexity scientists have found effective is the use of simulation – sometimes called a “computational” approach because of the leverage provided by modern computer systems and object-oriented software. Crowds and communities are far too complex to simulate precisely, but much progress has been made by a “divide and conquer” approach. By simulating simple components of human behavior, such as the belief in the legitimacy of government or the level of “grievance,” scientists have been able to replicate the unpredictable bursts of violence.
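The decision rule at the heart of such simulations can be sketched in a few lines. The following is a minimal illustration loosely patterned on Joshua Epstein’s well-known civil violence model; the function names, the 2.3 constant, and the activation threshold are illustrative assumptions for this sketch, not the model the text describes.

```python
import math

# Illustrative constant - not calibrated to any real-world data.
THRESHOLD = 0.1  # net grievance an agent must feel before acting out

def grievance(hardship: float, legitimacy: float) -> float:
    """Grievance rises with hardship and falls with perceived legitimacy."""
    return hardship * (1.0 - legitimacy)

def arrest_probability(cops_nearby: int, actives_nearby: int) -> float:
    """Perceived risk of arrest, driven by the local ratio of police to rioters."""
    return 1.0 - math.exp(-2.3 * cops_nearby / max(1, actives_nearby))

def is_active(hardship: float, legitimacy: float, risk_aversion: float,
              cops_nearby: int, actives_nearby: int) -> bool:
    """An agent joins the unrest when grievance outweighs perceived risk."""
    net = (grievance(hardship, legitimacy)
           - risk_aversion * arrest_probability(cops_nearby, actives_nearby))
    return net > THRESHOLD
```

Even this caricature reproduces the qualitative behavior described above: with no police nearby, perceived risk collapses to zero and moderately aggrieved agents act, and a small shift in perceived legitimacy can tip many agents across the threshold at once.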
Using the “Wrong” Civil Unrest Model
Prominent statistician George Box famously wrote in 1978 that “all models are wrong, but some are useful.” An agent-based model of civil unrest is shown in Figure 1. It is certainly “wrong” in the sense that it omits many of the factors that constitute a neighborhood. Streets and buildings, for example, are not depicted and do not constrain the movement of people. Such models cannot determine the precise optimum number of police patrols necessary to suppress violence. They can, however, help guide an assessment of risk. If properly validated, models and simulations can show the impact of doubling police patrols or cutting them in half. They can also suggest whether there is a point of diminishing returns, where adding or reducing patrols makes no difference.
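One way such a “wrong but useful” model supports risk assessment is a parameter sweep: run the same simulation at several patrol levels and watch how participation responds. The self-contained sketch below is entirely illustrative – the Epstein-style decision rule, the constants, and the toy population of 500 agents are assumptions for demonstration, not a validated civil unrest model.

```python
import math
import random

def active_count(n_agents: int, n_cops: int, seed: int = 42) -> int:
    """Count agents who act out in one round of a toy civil-unrest model.

    Each agent compares grievance (hardship discounted by legitimacy)
    against risk-weighted arrest probability; all numbers are illustrative.
    """
    rng = random.Random(seed)  # fixed seed so sweeps are comparable
    arrest_prob = 1.0 - math.exp(-2.3 * n_cops / max(1, n_agents))
    active = 0
    for _ in range(n_agents):
        hardship = rng.random()
        legitimacy = rng.random()
        risk_aversion = rng.random()
        grievance = hardship * (1.0 - legitimacy)
        if grievance - risk_aversion * arrest_prob > 0.1:
            active += 1
    return active

# Sweep patrol levels to look for a point of diminishing returns.
for cops in (0, 25, 50, 100, 200, 400):
    print(cops, active_count(n_agents=500, n_cops=cops))
```

Because the seed is fixed, the only thing that changes across the sweep is the patrol level, so the output isolates the marginal effect of each additional increment of police presence – the kind of comparative, rather than predictive, question the text argues these models can answer.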
Complex adaptive systems, therefore, require a complex adaptive response. Diversity in thinking and experience in the strategic planning team helps to harness the power of creativity and evolving ideas. An adaptive situational awareness process is vital to avoid drawing information from obsolete or misleading sources. Just as important is the need to apply this creativity and adaptivity to the response system. Methods of community engagement are vital to maintaining good police-community relations, but they lose their effectiveness over time. This requires a process of constant innovation and adaptation. Such a process is intuitive and difficult to formalize, train for, or even define. It is part of the “blue sense” that comes from front-line experience.
The good news is that the strategic approach of first responders has evolved to encompass many of these concepts. Successful police leaders realize the simple fact that they are never “done.” There is no permanent solution – no “best practice” that settles matters once and for all. Instead, they must constantly innovate, avoid unproductive mindsets, and question assumptions. New ideas can come from a variety of sources: within the force, the community, other emergency preparedness professionals, academia, the military, or even international sources.
Future Development
These are the early days in understanding how to manage complex adaptive systems. The techniques presented here, and others, are under active development by economists, market analysts, the insurance industry, public policy designers, traffic engineers, the military, and cybersecurity professionals – just to mention a few who recognize the unique challenges of complexity. New concepts and ideas surface all the time, and many communities learn from one another. The cross-pollination is enabled by a growing community of academic researchers, who experiment with new ways to analyze the massive data associated with these large-scale systems. In the coming years, it is highly likely that emergency preparedness planners will be able to draw from these successes and significantly improve the ability to plan for and adapt to the challenge of “black swan” events.
Kenneth Comer
Kenneth W. Comer, Ph.D., is an associate professor in the Volgenau School of Engineering at George Mason University as well as a decision sciences expert with Booz Allen Hamilton. His current research centers on quantitative analysis and modeling of social systems and self-organizing social networks. In August 2012, he retired as a senior executive from the Department of Defense (DOD), where he had a number of assignments, culminating as deputy director of the Joint IED Defeat Organization (JIEDDO). His work was featured in an article entitled “Web of War” in the 31 March 2011 issue of Nature. He also served five years as a U.S. Navy nuclear submarine officer and 22 years as an analyst at the Central Intelligence Agency. He holds a Ph.D. in systems engineering and operations research from George Mason University, master’s degrees from The George Washington University and Georgetown University, and a B.S. from Cornell University.