An event happens, a thorough and careful investigation is initiated, and policy changes as a result of careful assessment and policy design
By Dr Caroline Nabuzale
According to a 2010 report from the Ministry of Disaster Preparedness and Refugee Affairs, disasters affect more than 200,000 Ugandans every year on average. Natural disasters, such as drought, flooding and landslides, affected 3.6 million Ugandans between 2000 and 2009. In 1987, drought affected 600,000 people, and epidemic diseases killed 156 people two years later.
In 2005, drought affected 600,000 people, and the following year (2006) epidemic diseases killed 100 more. In 2007, floods affected 718,045 people, while epidemic diseases killed 67 people and landslides killed 5. In 2012, landslides buried hundreds of people in Bududa in Eastern Uganda.
The institutional framework for disaster risk management in Uganda comprises the Ministry of Disaster Preparedness and Refugee Affairs together with the Office of the Prime Minister. From a disaster communication perspective, the quality of crisis communication content is a critical influence on natural disaster prevention and recovery in communities.
The Perspective of Disasters in the USA
According to Dr Thomas A. Birkland, a Professor of Public Policy at North Carolina State University in the USA, sometimes an event happens, and then change happens with little or no effort devoted to learning from the event. A major example is the USA Patriot Act, which was enacted very soon after the September 11 attacks, without any real effort expended to see whether the policy tools contained in that act would really be the most effective in preventing terrorist attacks.
The same author states that sometimes an event happens, and an investigation is undertaken that is agency-serving, is incomplete, or states the obvious, without any evidence of a serious attempt to learn. An example is the Executive Office of the President's lessons-learned report on Hurricane Katrina, the point of which is as much rhetorical as analytical. Such reports simply hope to contain the scope of conflict by creating the appearance of learning or reform. Of course, there may well be some real learning reflected in such reports, but their primary function, ultimately, is public reassurance, not internal evaluation.
Furthermore, sometimes an event happens, and an investigation is initiated that leads to a policy change, but that policy change cannot be linked to the investigation, or policy changes are made without reference to the changes recommended in the post-event investigation. For example, there were many different attempts to investigate September 11, but it is not clear whether the creation of the Department of Homeland Security was a direct outcome of these investigations, particularly given the thin evidence that such an agency was really necessary. Indeed, DHS was created two days before the major investigation, popularly known as the September 11 Commission, was established. The Commission's final report was submitted in 2004.
However, we might still find the learning process to be functional if the crisis was so anomalous that no intervention could improve policy performance, such as the unforeseen ‘freak accident', or if the remedy for the problem would create more problems than the original problem itself. For example, we know that some number of people may be trapped in cars by seat belts in accidents, and may perish in a fire if the car catches fire. We also know that some very small fraction of people who are vaccinated against diseases may react badly to the vaccine, resulting in illness or death. But we do not generally contemplate removing seat belts or halting vaccinations because the broader social good these things do far outweighs the small potential harms.
An event happens, and a thorough and careful investigation is initiated, which leads to policy change as a result of careful investigation, assessment, and policy design. An example is the Columbia Accident Investigation Board, which probed the 2003 space shuttle accident. There were changes at NASA as a result of this report, including a much closer inspection of heat shields and, in particular, of potential damage to wings from falling foam debris from the external fuel tank.
However, one must not make too much of 'successful' learning, because these lessons can decay over time, as they did between the loss of Challenger and Columbia. On the other hand, the second shuttle accident has led to fundamental rethinking about spacecraft design, with new craft being simplified and designed to put the crew far forward of the dangerous fuel tanks; this focus on safety and survivability is a function of double-loop learning.
Many careful investigations yield single-loop learning that produces operational and regulatory change without being elevated to the legislative level.
From a political perspective, organizations can themselves be barriers to effective learning from disasters, and general theory on learning from extreme events helps explain this phenomenon. 'Fantasy documents' are not generally about the real causes of and solutions to disasters; rather, they are generated to prove that some authoritative actor has 'done something' about a disaster.
Because it is difficult to test whether learning happened after an extreme event, these post-disaster documents are generally ignored after they are published.
The writer is a postdoctoral research fellow at the Department of Communication, North Carolina State University.