
The Man-Made Disaster: Chernobyl

Gulin Langbroek

“It is one of history's ironies that the worst nuclear accident began as a test to improve safety,” states Snell (1988). The Soviets wanted to find out how the Chernobyl power plant would cope with a sudden power loss, so the experiment tested how long a spinning turbine could provide electric power to certain systems in the plant. Like many accidents, the Chernobyl accident resulted from a combination of human error and weaknesses in the design of the plant.

The man-made disaster occurred at Unit 4 of the Chernobyl nuclear power plant in the former Ukrainian Republic of the Union of Soviet Socialist Republics, near the borders of Belarus and the Russian Federation. Following a short explanation of the health and social impacts of the accident, this essay will discuss the errors in judgment and biases that occurred while running the Chernobyl power plant. As a result of the accident, tons of radioactive material were released into the air, and they still pose a threat to living beings in that region.

The radioactive doses caused long-term health effects ranging from thyroid cancer to leukemia. The Chernobyl area was also connected directly with the river systems of the Ukrainian Republic, causing the destruction of biological life in rivers as well as the deaths of people who had consumed river water. Cleaning the area was just as dangerous for the people who had to do it, as they were exposed to higher doses of radiation. Agricultural regions near Chernobyl also produced foods such as milk and vegetables contaminated with radioactive material.

Many people were forced to migrate from contaminated areas to uncontaminated ones, creating social problems such as loss of staff, lack of jobs and many other difficulties that made everyday life miserable. Overall, the Chernobyl accident caused great distress and casualties in the USSR and in European countries.

There are some errors which should be mentioned before going into detail on the errors in judgment. One error which might have contributed to the accident was that the experiment was rushed.

The test was scheduled to be carried out just before a reactor shutdown that occurred only once a year, so the operators felt under pressure to complete it promptly rather than wait another year. This probably did not trigger the accident directly, but it was perhaps one of the factors that led to the necessary measures and precautions not being taken. The test was also thought to be a purely electrical test, so it was observed by turbine manufacturers instead of reactor specialists. Thus, the effects on the reactor were not weighed fully.

Finally, the Chernobyl plant was one of the most developed and technologically advanced power plants ever constructed, so the operators running it felt as if they were an exclusive, elite crew and became overconfident, failing to foresee possible disasters. To be specific, some biases can be named and analyzed further. Perhaps the most crucial bias to examine in all man-made disasters is the neglect of probability, the tendency to omit the probability of failure when making a decision.

This also ties in with the overconfidence bias, since if the managers had doubted the reactor in the first place, more precautions would have been taken. In this case, such a massive disaster had never happened before in the Soviet Union, and since the power plant, as stated before, was regarded as reputable and exceptional, the managers of the plant neglected any probability of the experiment going wrong. According to Kletz (2001), “The managers do not seem to have asked themselves what would occur if the experiment was unsuccessful.

Before every experiment we should list all possible outcomes and their effects and decide how they will be handled.”

The second biggest bias of the owners and constructors of the plant which contributed to the accident was functional fixedness. As stated in Wikipedia (“List of Cognitive Biases,” 2012), “This bias limits a person to using an object only in the way it is traditionally used.” The reactor was operated in a rule-based manner, meaning that the operators were informed of what tasks they should complete but not told why it was so important to complete them.

This caused them to operate the plant with what Kletz (2001) describes as “process feel rather than theoretical knowledge.” Before the Chernobyl accident, reactors were designed on the assumption that rules would be obeyed and instructions would be followed, so there was no need to set up extra protective facilities. This was arguably the worst possible approach to building a nuclear plant, considering that the workers were not trained to the best of their abilities.

Instead of relying on the traditional method of assuming operators would follow the rules, the reactor should have been built in a way that made the rules impossible to ignore. That way the workers would not have been limited to their insufficient knowledge of how to run a power plant, and technology would have done this job for them. In short, the traditional reliance on human decisions should have been abandoned and reliance on automatic equipment adopted. Assuming operators would obey rules brings another issue to light: the projection bias.

The projection bias is defined as unconsciously assuming that one’s personal emotions, thoughts and values are shared by others. The lack of communication between the managers of the power plant and the operators about how seriously safety measures should be taken is among the biggest causes of the disaster. According to Kletz (2001), the managers of Chernobyl “talked about getting things done without any mention of safety, leaving the operators with the impression that safety is less important.

Managers should remember, when giving instructions, that what you don’t say is as important as what you do say.”

Last but not least, the biggest error in judgment on the part of the operators was caused by the ostrich effect. This bias is the act of ignoring an obvious negative situation. The big question is: why would any operator ignore situations that could cause the deaths of many people, including their own? The answer lies in how the management system was established.

Because the reactor relied on the decisions of higher authorities and not on protective safety equipment, every little detail of the power plant had to be referred to the managers. As Kletz (2001) states, “Everything had to be referred to the top so it was necessary to break the rules in order to get anything done.” Running a power plant should not have relied on this kind of system, since operators were more likely to take shortcuts, withhold information from the managers or simply ignore problems so that they could get things done quickly. Had these biases and errors in judgment not occurred, the accident might never have happened.

In operating systems as intricate as a power plant, one must keep two crucial things in mind: always have protective equipment installed and never let workers neglect safety rules. Unfortunately, as humans, only after this disaster have we begun to take these precautions, making us victims of the normalcy bias. In any case, we must always watch for human errors that might lead to irreversible damage.

RESOURCES

Marples, D. R., & Snell, V. G. (1988). The social impact of the Chernobyl disaster. London: The Macmillan Press.

Kletz, T. (2001). Learning from accidents (3rd ed.). Retrieved from ftp://193.218.136.74/pub/anon/ELSEVIER-Referex/1-Chemical%20Petrochemical%20and%20Process%20Collection/CD1/KLETZ,%20T.%20A.%20(2001).%20Learning%20from%20Accidents%20(3rd%20ed.)/Learning_from_Accidents_3E.pdf

European Commission, International Atomic Energy Agency & World Health Organization. (1996). One decade after Chernobyl: Summing up the consequences of the accident. Austria: IAEA.

List of Cognitive Biases. (2012). In Wikipedia. Retrieved November 16, 2012, from http://en.wikipedia.org/wiki/List_of_biases_in_judgment_and_decision_making
