
Chapter 5 – Case Studies

Case Study – Space Shuttle Challenger

Please see the earlier section in this textbook that gives more background on this disaster.

Introduction to the Case

On January 28, 1986, seven astronauts were killed when the space shuttle they were piloting, the Challenger, exploded just over a minute into the flight. The failure of the solid rocket booster O-rings to seal properly allowed hot combustion gases to leak from the side of the booster and burn through the external fuel tank. The O-ring failure was attributed to several factors, including faulty design of the solid rocket boosters, insufficient low-temperature testing of the O-ring material and of the joints the O-rings sealed, and poor communication between different levels of NASA management.

Here is a list of case studies relating to this incident:

Engineering.com

posted this case study in The Engineer on October 24, 2006

https://www.engineering.com/Library/ArticlesPage/tabid/85/ArticleID/170/The-Space-Shuttle-Challenger-Disaster.aspx

Adapted from material by the Department of Philosophy and Department of Mechanical Engineering at Texas A&M University, NSF Grant Number DIR-9012252

How does the implied social contract of professionals apply to this case? What professional responsibilities were neglected, if any? Should NASA have done anything differently in its launch decision procedure?

Texas A&M University Case Studies

https://ethics.tamu.edu/case-studies/

American Society for Engineering Education 2014 Case Study

https://www.asee.org/file_server/papers/attachment/…/2013-Paper-ASEE-Shuttle.pdf

The Challenger Disaster: A Case of Subjective Engineering from the IEEE

From the IEEE archives: NASA’s resistance to probabilistic risk analysis contributed to the Challenger disaster

https://spectrum.ieee.org/tech-history/heroic-failures/the-space-shuttle-a-case-of-subjective-engineering

To the extent possible under law, Jennifer Kirkey has waived all copyright and related or neighboring rights to Engineering and Technology in Society - Canada, except where otherwise noted.


Illustration: Barry Ross

Editor’s Note: Today is the 30th anniversary of the loss of the space shuttle Challenger, which was destroyed 73 seconds into its flight, killing all onboard. To mark the anniversary, IEEE Spectrum is republishing this seminal article, which first appeared in June 1989 as part of a special report on risk. The article has been widely cited both in histories of the space program and in analyses of engineering risk management.

“Statistics don’t count for anything,” declared Will Willoughby, the National Aeronautics and Space Administration’s former head of reliability and safety during the Apollo moon landing program. “They have no place in engineering anywhere.” Now director of reliability management and quality assurance for the U.S. Navy, Washington, D.C., he still holds that risk is minimized not by statistical test programs, but by “attention taken in design, where it belongs.” His design-oriented view prevailed in NASA in the 1970s, when the space shuttle was designed and built by many of the engineers who had worked on the Apollo program.

“The real value of probabilistic risk analysis is in understanding the system and its vulnerabilities,” said Benjamin Buchbinder, manager of NASA’s two-year-old risk management program. He maintains that probabilistic risk analysis can go beyond design-oriented qualitative techniques in looking at the interactions of subsystems, ascertaining the effects of human activity and environmental conditions, and detecting common-cause failures.

NASA started experimenting with this program in response to the Jan. 28, 1986, Challenger accident that killed seven astronauts. The program’s goals are to establish a policy on risk management and to conduct risk assessments independent of normal engineering analyses. But success is slow because of past official policy that favored “engineering judgment” over “probability numbers,” resulting in NASA’s failure to collect the type of statistical test and flight data useful for quantitative risk assessment.

This Catch-22 (the agency lacked appropriate statistical data because it did not believe in the technique requiring that data, and so never gathered it) is one example of how an organization’s underlying culture and explicit policy can affect the overall reliability of the projects it undertakes.

External forces such as politics further shape an organization’s response. Whereas the Apollo program was widely supported by the President and the U.S. Congress and had all the money it needed, the shuttle program was strongly criticized and underbudgeted from the beginning. Political pressures, coupled with the lack of hard numerical data, led to differences of more than three orders of magnitude in the few quantitative estimates of a shuttle launch failure that NASA was required by law to conduct.

Some observers still worry that, despite NASA’s late adoption of quantitative risk assessment, its internal culture and its fear of political opposition may be pushing it to repeat dangerous errors of the shuttle program in the new space station program.

Basic Facts

System: National Space Transportation System (NSTS)—the space shuttle

Risk assessments conducted during design and operation: preliminary hazards analysis; failure modes and effects analysis with critical items list; various safety assessments, all qualitative at the system level, but with quantitative analyses conducted for specific subsystems.

Worst failure: In the January 1986 Challenger accident, primary and secondary O-rings in the field joint of the right solid-fuel rocket booster were burnt through by hot gases.

Consequences: loss of $3 billion vehicle and crew.

Predictability: long history of erosion in O-rings, not envisaged in the original design.

Causes: inadequate original design (booster joint rotated farther open than intended); faulty judgment (managers decided to launch despite record low temperatures and ice on launch pad); possible unanticipated external events (severe wind shear may have been a contributing factor).

Lessons learned: in design, to use probabilistic risk assessment more in evaluating and assigning priorities to risks; in operation, to establish certain launch commit criteria that cannot be waived by anyone.

Other outcomes: redesign of booster joint and other shuttle subsystems that also had a high level of risk or unanticipated failures; reassessment of critical items.

NASA’s preference for a design approach to reliability to the exclusion of quantitative risk analysis was strengthened by a negative early brush with the field. According to Haggai Cohen, who during the Apollo days was NASA’s deputy chief engineer, NASA contracted with General Electric Co. in Daytona Beach, Fla., to do a “full numerical PRA [probabilistic risk assessment]” to assess the likelihood of success in landing a man on the moon and returning him safely to earth. The GE study indicated the chance of success was “less than 5 percent.” When the NASA Administrator was presented with the results, he felt that if made public, “the numbers could do irreparable harm, and he disbanded the effort,” Cohen said. “We studiously stayed away from [numerical risk assessment] as a result.”

“That’s when we threw all that garbage out and got down to work,” Willoughby agreed. The study’s proponents, he said, contended “‘you build up confidence by statistical test programs.’ We said, ‘No, go fly a kite, we’ll build up confidence by design.’ Testing gives you only a snapshot under particular conditions. Reality may not give you the same set of circumstances, and you can be lulled into a false sense of security or insecurity.”

As a result, NASA adopted qualitative failure modes and effects analysis (FMEA) as its principal means of identifying design features whose worst-case failure could lead to a catastrophe. The worst cases were ranked as Criticality 1 if they threatened the life of the crew members or the existence of the vehicle; Criticality 2 if they threatened the mission; and Criticality 3 for anything less. An R designated a redundant system [see “How NASA Determined Shuttle Risk”]. Quantitative techniques were limited to calculating the probability of the occurrence of an individual failure mode “if we had to present a rationale on how to live with a single failure point,” Cohen explained.
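The criticality categories just described amount to a simple classification rule. As a rough sketch (an illustration of the scheme as the article describes it, not NASA’s actual process or tooling):

```python
# Illustrative sketch of the FMEA criticality categories described above
# (hypothetical code, not part of NASA's actual FMEA process).
def criticality(threatens_crew_or_vehicle: bool,
                threatens_mission: bool,
                redundant: bool) -> str:
    """Return the criticality label for a worst-case failure mode."""
    if threatens_crew_or_vehicle:
        level = "1"   # threatens crew or vehicle
    elif threatens_mission:
        level = "2"   # threatens the mission
    else:
        level = "3"   # anything less
    return level + ("R" if redundant else "")

print(criticality(True, False, False))   # Criticality 1
print(criticality(True, False, True))    # Criticality 1R (redundant system)
```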

The politics of risk

By the late 1960s and early 1970s the space shuttle was being portrayed as a reusable airliner capable of carrying 15-ton payloads into orbit and 5-ton payloads back to earth. Shuttle astronauts would wear shirtsleeves during takeoff and landing instead of the bulky spacesuits of the Gemini and Apollo days. And eventually the shuttle would carry just plain folks: non-astronaut scientists, politicians, schoolteachers, and journalists.

NASA documents show that the airline vision also applied to risk. For example, in the 1969 NASA Space Shuttle Task Group Report, the authors wrote: “It is desirable that the vehicle configuration provide for crew/passenger safety in a manner and to the degree as provided in present day commercial jet aircraft.”

Statistically an airliner is the least risky form of transportation, which implies high reliability. And in the early 1970s, when President Richard M. Nixon, Congress, and the Office of Management and Budget (OMB) were all skeptical of the shuttle, proving high reliability was crucial to the program’s continued funding.

OMB even directed NASA to hire an outside contractor to do an economic analysis of how the shuttle compared with other launch systems for cost-effectiveness, observed John M. Logsdon, director of the graduate program in science, technology, and public policy at George Washington University in Washington, D.C. “No previous space programme had been subject to independent professional economic evaluation,” Logsdon wrote in the journal Space Policy in May 1986. “It forced NASA into a belief that it had to propose a Shuttle that could launch all foreseeable payloads ... [and] would be less expensive than alternative launch systems” and that, indeed, would supplant all expendable rockets. It also was politically necessary to show that the shuttle would be cheap and routine, rather than large and risky, with respect to both technology and cost, Logsdon pointed out.

Amid such political unpopularity, which threatened the program’s very existence, “some NASA people began to confuse desire with reality,” said Adelbert Tischler, retired NASA director of launch vehicles and propulsion. “One result was to assess risk in terms of what was thought acceptable without regard for verifying the assessment.” He added: “Note that under such circumstances real risk management is shut out.”

‘Disregarding data’

By the early 1980s many figures were being quoted for the overall risk to the shuttle, with estimates of a catastrophic failure ranging from less than 1 chance in 100 to 1 chance in 100 000. “The higher figures [1 in 100] come from working engineers, and the very low figures [1 in 100 000] from management,” wrote physicist Richard P. Feynman in his appendix “Personal Observations on Reliability of Shuttle” to the 1986 Report of the Presidential Commission on the Space Shuttle Challenger Accident.

The probabilities originated in a series of quantitative risk assessments NASA was required to conduct by the Interagency Nuclear Safety Review Panel (INSRP), in anticipation of the launch of the Galileo spacecraft on its voyage to Jupiter, originally scheduled for the early 1980s. Galileo was powered by a plutonium-fueled radioisotope thermoelectric generator, and Presidential Directive/NSC-25 ruled that either the U.S. President or the director of the office of science and technology policy must examine the safety of any launch of nuclear material before approving it. The INSRP (which consisted of representatives of NASA as the launching agency, the Department of Energy, which manages nuclear devices, and the Department of Defense, whose Air Force manages range safety at launch) was charged with ascertaining the quantitative risks of a catastrophic launch dispersing the radioactive poison into the atmosphere. There were a number of studies because the upper stage for boosting Galileo into interplanetary space was reconfigured several times.

The first study was conducted by the J. H. Wiggins Co. of Redondo Beach, Calif., and published in three volumes between 1979 and 1982. It put the overall risk of losing a shuttle with its spacecraft payload during launch at between 1 chance in 1000 and 1 in 10,000. The greatest risk was posed by the solid-fuel rocket boosters (SRBs). The Wiggins author noted that the history of other solid-fuel rockets showed them as undergoing catastrophic launches somewhere between 1 time in 59 and 1 time in 34, but that the study’s contract overseers, the Space Shuttle Range Safety Ad Hoc Committee, made an “engineering judgment” and “decided that a reduction in the failure probability estimate was warranted for the Space Shuttle SRBs” because “the historical data includes motors developed 10 to 20 years ago.” The Ad Hoc Committee therefore “decided to assume a failure probability of 1 × 10⁻³ for each SRB.” In addition, the Wiggins author pointed out, “it was decided by the Ad-Hoc Committee that a second probability should be considered… which is one order of magnitude less,” or 1 in 10,000, “justified due to unique improvements made in the design and manufacturing process used for these motors to achieve man rating.”

In 1983 a second study was conducted by Teledyne Energy Systems Inc., Timonium, Md., for the Air Force Weapons Laboratory at Kirtland Air Force Base, N.M. It described the Wiggins analysis as consisting of “an interesting presentation of launch data from several Navy, Air Force, and NASA missile programs and the disregarding of that data and arbitrary assignment of risk levels apparently per sponsor direction” with “no quantitative justification at all.” After reanalyzing the data, the Teledyne authors concluded that the boosters’ track record “suggest[s] a failure rate of around one-in-a-hundred.”

When risk analysis isn’t

NASA conducted its own internal safety analysis for Galileo, which was published in 1985 by the Johnson Space Center. The Johnson authors went through failure mode worksheets assigning probability levels. A fracture in the solid-rocket motor case or case joints (similar to the accident that destroyed Challenger) was assigned a probability level of 2, which a separate table defined as corresponding to a chance of 1 in 100,000 and described as “remote,” or “so unlikely, it can be assumed that this hazard will not be experienced.”

The Johnson authors’ value of 1 in 100 000 implied, as Feynman spelled out, that “one could put a Shuttle up each day for 300 years expecting to lose only one.” Yet even after the Challenger accident, NASA’s chief engineer Milton Silveira, in a hearing on the Galileo thermonuclear generator held March 4, 1986, before the U.S. House of Representatives Committee on Science and Technology, said: “We think that using a number like 10 to the minus 3, as suggested, is probably a little pessimistic.” In his view, the actual risk “would be 10 to the minus 5, and that is our design objective.” When asked how the number was deduced, Silveira replied, “We came to those probabilities based on engineering judgment in review of the design rather than taking a statistical data base, because we didn’t feel we had that.”
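Feynman’s point is easy to verify with a quick arithmetic check (added here as an illustration; the failure rates are the ones quoted above):

```python
# Expected number of losses for one launch per day over 300 years, at the
# two failure rates quoted above: management's 1 in 100,000 versus the
# "pessimistic" 1 in 1,000. Illustrative check only.
launches = 300 * 365   # one launch per day for 300 years: 109,500 launches
for p in (1e-5, 1e-3):
    print(f"p = {p:g}: expected losses in {launches} launches = {launches * p:g}")
```

At 1 in 100,000 the expected loss count over 109,500 launches is about 1.1, matching Feynman’s “expecting to lose only one”; at 1 in 1,000 it is about 110.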

After the Challenger accident, the 1986 presidential commission learned the O-rings in the field joints of the shuttle’s solid-fuel rocket boosters had a history of damage correlated with low air temperature at launch. So the commission repeatedly asked the witnesses it called to hearings why systematic temperature-correlation data had been unavailable before launch.

NASA’s “management methodology” for collection of data and determination of risk was laid out in NASA’s 1985 safety analysis for Galileo. The Johnson space center authors explained: “Early in the program it was decided not to use reliability (or probability) numbers in the design of the Shuttle” because the magnitude of testing required to statistically verify the numerical predictions “is not considered practical.” Furthermore, they noted, “experience has shown that with the safety, reliability, and quality assurance requirements imposed on manned spaceflight contractors, standard failure rate data are pessimistic.”

“In lieu of using probability numbers, the NSTS [National Space Transportation System] relies on engineering judgment using rigid and well-documented design, configuration, safety, reliability, and quality assurance controls,” the Johnson authors continued. This outlook determined the data NASA managers required engineers to collect. For example, no “lapsed-time indicators” were kept on shuttle components, subsystems, and systems, although “a fairly accurate estimate of time and/or cycles could be derived,” the Johnson authors added.

One reason was economic. According to George Rodney, NASA’s associate administrator of safety, reliability, maintainability and quality assurance, it is not hard to get time and cycle data, “but it’s expensive and a big bookkeeping problem.”

Another reason was NASA’s “normal program development: you don’t continue to take data; you certify the components and get on with it,” said Rodney’s deputy, James Ehl. “People think that since we’ve flown 28 times, then we have 28 times as much data, but we don’t. We have maybe three or four tests from the first development flights.”

In addition, Rodney noted, “For everyone in NASA that’s a big PRA [probabilistic risk assessment] seller, I can find you 10 that are equally convinced that PRA is oversold… [They] are so dubious of its importance that they won’t convince themselves that the end product is worthwhile.”

Risk and the organizational culture

One reason NASA has so strongly resisted probabilistic risk analysis may be the fact that “PRA runs against all traditions of engineering, where you handle reliability by safety factors,” said Elisabeth Paté-Cornell, associate professor in the department of industrial engineering and engineering management at Stanford University in California, who is now studying organizational factors and risk assessment in NASA. In addition, with NASA’s strong pride in design, PRA may be “perceived as an insult to their capabilities, that the system they’ve designed is not 100 percent perfect and absolutely safe,” she added. Thus, the character of an organization influences the reliability and failure of the systems it builds because its structure, policy, and culture determine the priorities, incentives, and communication paths for the engineers and managers doing the work, she said.

“Part of the problem is getting the engineers to understand that they are using subjective methods for determining risk, because they don’t like to admit that,” said Ray A. Williamson, senior associate at the U.S. Congress Office of Technology Assessment in Washington, D.C. “Yet they talk in terms of sounding objective and fool themselves into thinking they are being objective.”

“It’s not that simple,” Buchbinder said. “A probabilistic way of thinking is not something that most people are attuned to. We don’t know what will happen precisely each time. We can only say what is likely to happen a certain percentage of the time.” Unless engineers and managers become familiar with probability theory, they don’t know what to make of “large uncertainties that represent the state of current knowledge,” he said. “And that is no comfort to the poor decision-maker who wants a simple answer to the question, ‘Is this system safe enough?’”

As an example of how the “mindset” in the agency is now changing in favor of “a willingness to explore other things,” Buchbinder cited the new risk management program, the workshops it has been holding to train engineers and others in quantitative risk assessment techniques, and a new management instruction policy that requires NASA to “provide disciplined and documented management of risks throughout program life cycles.”

Hidden risks to the space station

NASA is now at work on its big project for the 1990s: a space station, projected to cost $30 billion and to be assembled in orbit, 220 nautical miles above the earth, from modules carried aloft in some two dozen shuttle launches. A National Research Council committee evaluated the space station program and concluded in a study in September 1987: “If the probability of damaging an Orbiter beyond repair on any single Shuttle flight is 1 percent—the demonstrated rate is now one loss in 25 launches, or 4 percent—the probability of losing an Orbiter before [the space station’s first phase] is complete is about 60 percent.”
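The committee’s figure comes from compounding a constant per-flight risk over repeated flights. The sketch below reproduces that arithmetic; the per-flight rates are the ones in the NRC quote, while the flight counts are illustrative assumptions, not the committee’s actual planning numbers:

```python
# Probability of at least one orbiter loss in a series of flights, assuming
# independent flights with constant per-flight failure probability p.
# Rates (1% and 4%) are from the NRC quote; the flight counts are
# illustrative assumptions.
def cumulative_loss(p: float, flights: int) -> float:
    """P(at least one loss) = 1 - P(no loss on every flight)."""
    return 1.0 - (1.0 - p) ** flights

for p in (0.01, 0.04):
    for n in (25, 100):
        print(f"p = {p:.0%}, {n} flights: {cumulative_loss(p, n):.0%}")
```

Either assumption lands near the committee’s number: a 1 percent per-flight rate compounds to roughly 60 percent over about 100 flights, and a 4 percent rate does so over about 25.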

The probability is within the right order of magnitude, to judge by the latest INSRP-mandated study completed in December for Buchbinder’s group in NASA by Planning Research Corp., McLean, Va. The study, which reevaluates the risk of the long-delayed launch of the Galileo spacecraft on its voyage to Jupiter, now scheduled for later this year, estimates the chance of losing a shuttle from launch through payload deployment at 1 in 78, or between 1 and 2 percent, with an uncertainty of a factor of 2.

Those figures frighten some observers because of the dire consequences of losing part of the space station. “The space station has no redundancy—no backup parts,” said Jerry Grey, director of science and technology policy for the American Institute of Aeronautics and Astronautics in Washington, D.C.

The worst case would be loss of the shuttle carrying the logistics module, which is needed for reboost, Grey pointed out. The space station’s orbit will subject it to atmospheric drag such that, if not periodically boosted higher, it will drift downward and within eight months plunge back to earth and be destroyed, as was the Skylab space station in July 1979. “If you lost the shuttle with the logistics module, you don’t have a spare, and you can’t build one in eight months,” Grey said, “so you may lose not only that one payload, but also whatever was put up there earlier.”

Why are there no backup parts? “Politically the space station is under fire [from the U.S. Congress] all the time because NASA hasn’t done an adequate job of justifying it,” said Grey. “NASA is apprehensive that Congress might cancel the entire program”— and so is trying to trim costs as much as possible.

Grey estimated that spares of the crucial modules might add another 10 percent to the space station’s cost. “But NASA is not willing to go to bat for that extra because they’re unwilling to take the political risk,” he said—a replay, he fears, of NASA’s response to the political negativism over the shuttle in the 1970s.

The NRC space station committee warned: “It is dangerous and misleading to assume there will be no losses and thus fail to plan for such events.”

“Let’s face it, space is a risky business,” commented former Apollo safety officer Cohen. “I always considered every launch a barely controlled explosion.”

“The real problem is: whatever the numbers are, acceptance of that risk and planning for it is what needs to be done,” Grey said. He fears that “NASA doesn’t do that yet.”

In addition to the sources named in the text, the authors would like to acknowledge the information and insights afforded by the following: E. William Colglazier (director of the Energy, Environment and Resources Center at the University of Tennessee, Knoxville) and Robert K. Weatherwax (president of Sierra Energy & Risk Assessment Inc., Roseville, Calif.), the two authors of the 1983 Teledyne/Air Force Weapons Laboratory study; Larry Crawford, director of reliability and trends analysis at NASA headquarters in Washington, D.C.; Joseph R. Fragola, vice president, Science Applications International Corp., New York City; Byron Peter Leonard, president, L Systems Inc., El Segundo, Calif.; George E. Mueller, former NASA associate administrator for manned spaceflight; and Marcia Smith, specialist in aerospace policy, Congressional Research Service, Washington, D.C.

This article first appeared in print in June 1989 as part of the special report “Managing Risk In Large Complex Systems” under the title “The space shuttle: a case of subjective engineering.” 

How NASA Determined Shuttle Risk

At the start of the space shuttle’s design, the National Aeronautics and Space Administration defined risk as “the chance (qualitative) of loss of personnel capability, loss of system, or damage to or loss of equipment or property.” NASA accordingly relied on several techniques for determining reliability and potential design problems, concluded the U.S. National Research Council’s Committee on Shuttle Criticality Review and Hazard Analysis Audit in its January 1988 report Post-Challenger Evaluation of Space Shuttle Risk Assessment and Management. But, the report noted, the analyses did “not address the relative probabilities of a particular hazardous condition arising from failure modes, human errors, or external situations,” so did not measure risk.

A failure modes and effects analysis (FMEA) was the heart of NASA’s effort to ensure reliability, the NRC report noted. An FMEA, carried out by the contractor building each shuttle element or subsystem, was performed on all flight hardware and on ground support equipment that interfaced with flight hardware. Its chief purpose was to identify hardware critical to the performance and safety of the mission.

Items that did not meet certain design, reliability, and safety requirements specified by NASA’s top management, and whose failure could threaten the loss of crew, vehicle, or mission, made up a critical items list (CIL).

Although the FMEA/CIL was first viewed as a design tool, NASA now uses it during operations and management as well, to analyze problems, assess whether corrective actions are effective, identify where and when inspection and maintenance are needed, and reveal trends in failures.

Second, NASA conducted hazards analyses, performed jointly by shuttle engineers and by NASA’s safety and operations organizations. They made use of the FMEA/CIL, various design reviews, safety analyses, and other studies. They considered not only the failure modes identified in the FMEA, but also other threats posed by the mission activities, crew-machine interfaces, and the environment. After hazards and their causes were identified, NASA engineers and managers had to make one of three decisions: to eliminate the cause of each hazard, to control the cause if it could not be eliminated, or to accept the hazards that could not be controlled.

NASA also conducted an element interface functional analysis (EIFA) to look at the shuttle more nearly as a complete system. Both the FMEA and the hazards analyses concentrated only on individual elements of the shuttle: the space shuttle’s main engines in the orbiter, the rest of the orbiter, the external tank, and the solid-fuel rocket boosters. The EIFA assessed hazards at the mating of the elements.

Also to examine the shuttle as a system, NASA conducted a one-time critical functions assessment in 1978, which searched for multiple and cascading failures. The information from all these studies fed one way into an overall mission safety assessment.

The NRC committee had several criticisms. In practice, the FMEA was the sole basis for some engineering change decisions and all engineering waivers and rationales for retaining certain high-risk design features. However, the NRC report noted, hazard analyses for some important, high-risk subsystems “were not updated for years at a time even though design changes had occurred or dangerous failures were experienced.” On one procedural flow chart, the report noted, “the ‘Hazard Analysis As Required’ is a dead-end box with inputs but no output with respect to waiver approval decisions.”

The NRC committee concluded that “the isolation of the hazard analysis within NASA’s risk assessment and management process to date can be seen as reflecting the past weakness of the entire safety organization” —T.E.B. and K.E.


Engineering Ethics Case Study: The Challenger Disaster


Credits: 4 PDH

PDH course description:

  • Common errors to avoid in studying the history of an engineering failure: the retrospective fallacy and the myth of perfect engineering practice
  • Shuttle hardware involved in the disaster
  • Decisions made in the period preceding the launch
  • Ethical issue: whether NASA gave first priority to public safety over other concerns
  • Ethical issue: whether the contractor gave first priority to public safety over other concerns
  • Ethical issue: whistle blowing
  • Ethical issue: informed consent
  • Ethical issue: ownership of company records
  • Ethical issue: how the public perceives that an engineering decision involves an ethical violation

To take this course:

Setup Menus in Admin Panel

No products in the cart.

E – 1142 – Engineering Ethics Case Study: The Challenger Disaster

Profile Photo

  • Course No E – 1142
  • PDH Units 3.00

space shuttle challenger disaster ethics case study

Intended Audience: all types of engineers

Pdh units: 3, learning objectives.

  • Common errors to avoid in studying the history of an engineering failure
  • Retrospective fallacy and the myth of perfect engineering practice
  • Shuttle hardware involved in the disaster
  • Decisions made in the period preceding the launch
  • Ethical issues related to: NASA and contractors giving first priority to public safety over other concerns; whistle blowing; informed consent; ownership of company records; and when and why the public can perceive an engineering decision as involving an ethical violation (and what to do about it ).

Course Reviews


Great course. I especially liked the reminders that engineering does not happen in a vacuum and that things that are clear after the fact may not seem out of the ordinary during the course of day to day work.



Space Shuttle Challenger Disaster and Ethical Issues Case Study


The case study details the procedures that took place before the launch of the Challenger space shuttle. The author notes that NASA was under pressure from politicians and competing space agencies, which is why management pushed for the launch despite insufficient testing and the faulty design of the O-rings. Delays due to weather conditions put further pressure on management and the team. As a result, the O-rings did not seal properly, causing the Challenger to explode soon after launch. Seven astronauts were killed in the blast.

The key ethical issues evident in the case were the lack of communication between managers and a poor safety culture, which prevented adequate testing of the technology before the launch. The primary ethical issue was the lack of a proper safety culture. It manifested in the management's decision to launch the shuttle despite insufficient testing and the faults in the design of the O-rings. Since both of these problems increased the risk of an explosion during launch, the management's decision violated the ethical principles of public good and human well-being. Upholding these principles is among the key professional duties of engineers, because failure to do so may result in terrible accidents. Had the shuttle fallen on a populated area rather than exploding in the air, even more people could have been harmed. Hence, the case illustrates the consequences of unethical decision-making in engineering.

Nevertheless, the case could have progressed differently. Better decision-making by management could have prevented the explosion. For example, management could have ordered further testing and delayed the launch until it was complete. Alternatively, the managers could have canceled the launch to fix the faults in the shuttle's design, likewise avoiding the explosion. Their subordinates could also have influenced the outcome. For instance, drawing the attention of higher management to the problem could have helped resolve the failure in communication, which in turn could have led to a delay or cancellation of the launch. All in all, both the managers and their subordinates should have prioritized safety by insisting on further testing and design corrections. A stronger safety culture and ethical decision-making would have helped NASA avoid the disaster.


IvyPanda. (2021, August 25). Space Shuttle Challenger Disaster and Ethical Issues. https://ivypanda.com/essays/space-shuttle-challenger-disaster-and-ethical-issues/

Failure Case Studies


Space Shuttle Challenger

The Challenger exploded 73 seconds after launch on January 28, 1986. It was the 25th shuttle mission. The seven astronauts killed included a high school teacher, Christa McAuliffe, who was to broadcast lessons from space. Government investigations and redesign efforts grounded the shuttle program for the next two years (Freiman and Schlager 1995a, p. 161).

The Challenger explosion was investigated by a presidential commission, as well as the U.S. Senate and House of Representatives. This disaster represented a rare engineering failure where a number of people knew the cause immediately.

The Rogers Commission, appointed by President Ronald Reagan, recorded 15,000 pages of testimony and reviewed 170,000 pages of documents and hundreds of photographs. More than 6,000 people were interviewed as part of the three-month investigation. The commission’s findings were released on June 6, 1986, in a 256-page report. The House Committee on Science and Technology also conducted hearings and issued a report. After the incident, a number of astronauts and other key personnel resigned from NASA (Freiman and Schlager 1995a, pp. 169-171).

The U.S. House of Representatives Committee on Science and Technology report (Committee on Science and Technology 1986), “Investigation of the Challenger Accident, October 29, 1986,” is available online at http://history.nasa.gov/rogersrep/51lcover.htm and http://www.gpoaccess.gov/challenger/64_420.pdf . A summary is provided in Freiman and Schlager (1995a, pp. 161-172). Two ethics websites feature this case study: http://onlineethics.org (Case Western Reserve University 2007b) and Texas A&M University’s http://ethics.tamu.edu/ethics/shuttle/shuttle1.htm (Space Shuttle Challenger Disaster 2007).


What the Challenger Disaster Proved

We take the workings of wide, complicated technological systems on faith. But they depend on people—and, sometimes, people fail.


The modern world runs on a kind of secular faith. Most of us turn on the faucet and expect water, enter an elevator and expect it to take us to our destination, drive over a bridge and expect it to hold up beneath us. Airplanes make this conviction especially visible. Although American aviation is incredibly safe, fear of flying is a widespread, well-known anxiety: Some people just can’t quite stomach hurtling unnaturally through the air, seven miles above everything familiar. But for the large majority of those who travel on planes, trust supersedes these fears—or they just don’t think that hard about it.

In January, though, a door plug that appears to have been improperly installed flew off at 16,000 feet, tearing a hole in the side of a plane. That same month, two aircraft collided on a Japanese runway, resulting in a huge fire. Social media and news articles described how a landing-gear tire fell from a plane and crashed in a parking lot; an engine cowling blew away during takeoff; two weeks ago, one flight’s violent turbulence resulted in a death and dozens of injuries. Suddenly, passengers unaccustomed to thinking about how planes stay up began to panic. Although aviation experts and journalists were quick to reassure the public that planes are built with multiple safeguards and that pilots are trained for emergency scenarios, a bit of the magic had vanished.

This same stunning disillusionment happened on the morning of January 28, 1986, when the American public watched the Space Shuttle Challenger rise into the sky and then disappear in a cloud of white vapor. But first came the confidence, which had been inspired and stoked by two decades of human spaceflight preceding this mission. This belief was also undergirded by a trust in American ingenuity and in the unstoppable march of technological progress. And no one seemed to embody this faith better than the American teacher and astronaut Christa McAuliffe as she lay in her place on the orbiter’s middeck, waiting for liftoff.

She was one of seven astronauts on the doomed flight, but McAuliffe’s name is the enduring symbol of the disaster, because she was not a scientist or an engineer; she was a regular person chosen specifically to be the inaugural “Teacher in Space”—our ambassador to the stars. Of the six other crew members, one, Gregory Jarvis, was a civilian brought along to conduct experiments on fluid dynamics. The rest were career astronauts; among them were Ronald McNair, Ellison Onizuka, and Judith Resnik, pioneers in diversifying NASA’s astronaut corps. In addition to McAuliffe, two others on board, Jarvis and Michael Smith, would be going to space for the first time; McNair and the mission commander, Dick Scobee, had flown on Challenger before.

Although NASA had successfully ferried people to space and back nearly five dozen times, there was still a significant amount of risk in what they were doing—each crew member knew that. But in July 1985, McAuliffe sat down with Johnny Carson on The Tonight Show . Two days before, Challenger had had a thorny launch; while it blasted off, a malfunction had forced one engine to shut down and threatened a second. Without enough thrust to make it to their planned height, the crew was forced to bail out to a lower orbit of Earth. “Are you in any way frightened of something like that?” Carson asked. “Because just the other day … they had a frightening lift.” “Yes,” McAuliffe replied. “I really haven’t thought of it in those terms, because I see the shuttle program as a very safe program.” When the British journalist Adam Higginbotham relates this anecdote in his stunning new book, Challenger , he notes that she was answering as she was expected to: Part of her deal with NASA was to make calm, composed media appearances to spread the good word about the American space agency. But she was also game. The night before everything went wrong, waiting to hear if her mission would go ahead, she told her friend, “I still can’t wait.”


She had faith, essentially. Examined closely, it could be described as a belief that the engineers who’d designed the ship’s complex components knew what they were doing; that the manufacturers responsible for assembling them did so correctly; that the crews responsible for repairs and maintenance—fueling and insulating the external tank, repairing the orbiter’s heat-resistant tiles—performed their jobs fastidiously; that Scobee and Smith in the cockpit would fly with experience and precision; that Mission Control would give commands wisely and safely; that any problems found would be openly discussed and rectified. Everything depended on a wide, complicated system of human beings—and on the last point, they failed.

By the time the crew was on the launchpad on the morning of the 28th, their mission, officially deemed STS-51-L, had been scheduled and scratched multiple times in the previous six days for suboptimal conditions. “They don’t delay unless it’s not perfect,” McAuliffe’s husband had told TV reporters. This would be the 25th mission of a NASA space shuttle, and Challenger’s sister shuttle Columbia had just returned from a six-day journey 10 days before. In classrooms across the country, approximately 2.5 million schoolchildren were watching satellite broadcasts of McAuliffe’s voyage. Routine launches promised to fulfill a long-held dream of a sort of taxi service to space: In 1972, President Richard Nixon had signed off on a “Space Transportation System,” which gave the missions their acronym, saying the U.S. should work to “transform the space frontier of the seventies into familiar territory, easily accessible for human endeavor in the eighties and nineties.” As a result, shuttle seats were no longer limited to just NASA’s rarefied crews. Senator Jake Garn made his way onto a 1985 flight, with the controversial goal of overseeing what the government was paying for; NASA was mulling sending a journalist on a mission (Walter Cronkite was considered a front-runner). Space travel was, it seemed, on the cusp of becoming routine.

But the reader, reaching the moment 340 pages into the book when the crew is finally sealed into the orbiter, knows this dream won’t be fulfilled. And Higginbotham includes a horrifying moment of McAuliffe’s faith being shaken: The astronaut assisting them into place and finishing final preflight checks “looked down into her face and saw that her Girl Scout pluck had deserted her,” he writes. “In her eyes he saw neither excitement nor anticipation, but recognized only one emotion: terror.”

She would fly for 73 seconds before the shuttle broke apart in a fireball and a cloud of smoke. After that gut-wrenching instant, and more seconds of stunned silence, a NASA public-affairs officer would speak the understatement that would become famous: “Obviously a major malfunction.”


Higginbotham’s book, like his previous one, Midnight in Chernobyl , is a gruesome and meticulous reconstruction of a 1986 disaster. Challenger’s failure is a story of advanced technology that breaks down not because of an unforeseeable act of God, but because of utterly human failures. In this case, according to the Rogers Commission , hot gases snuck out of a joint of one of the shuttle’s two solid rocket boosters before synthetic rubber seals, called O-rings, could expand to close the gap. Flames from the booster burned the surface of the main fuel tank, where half a million gallons of liquid hydrogen and oxygen waited to ignite. The booster wrenched itself from the assembly and tore the ship apart. This happened because the O-rings were slow to expand and inflexible in cold weather. (The physicist Richard Feynman would show this in a televised post-event hearing through a devastatingly simple demonstration: dunking the material in a cup of ice water.) The morning of January 28 was below freezing in Cape Canaveral, Florida, and Challenger had sat on the launchpad in such weather overnight.

The double O-rings had long been a problematic fix to a technical snag where two pieces of the rocket fit together. The joint had been inspired by a missile used by the Air Force; facing budget pressure for the first time in its existence, the space agency was compelled by “the imperative to invent nothing new,” as Higginbotham explains. Engineers at Morton Thiokol, the firm that designed and manufactured the rockets, added a second O-ring to enhance the original joint, among other modifications. Because they’d changed an existing technology, the company and their supervisors at the Marshall Space Flight Center, in Huntsville, Alabama, felt, according to Higginbotham, that they had only taken a known joint and made it safer—but they were really embarking on a dangerous trial of an untested technology, and the peril quickly made itself known. Issues were documented for years; multiple flights had potentially catastrophic damage that became apparent only after the boosters were recovered and examined. By January 1985, a year before the Challenger explosion, Roger Boisjoly, a Thiokol engineer who “knew the O-rings better than anyone,” was telling his colleagues that low temperatures were likely to cause leaks that could cause a total loss of mission, crew, and vehicle.

So the O-ring problem was known at Morton Thiokol. It was also known at Marshall, home of NASA’s rocketry hub. And it was certainly known at the Kennedy Space Center, in Florida, where the launch would happen: The night before the tragedy, there was a three-way conference call among Thiokol in Utah, specialists in Huntsville, and the NASA team in Cape Canaveral, where Thiokol engineers laid out a step-by-step case against going ahead the next morning, outlining the dangers of launching with O-rings colder than 53 degrees Fahrenheit. But NASA pushed back. The team argued that the data weren’t strong enough to establish that air temperature on its own was a significant contributor to seal problems, Higginbotham details; they pointed at a flight with particularly bad leakage launched in warmer temperatures, and four test motors that fired in the cold without issue. Morton Thiokol’s leaders took a caucus. It was time to make a “management decision,” they said. Half an hour later, they got back on the call and told the group they’d changed their minds. The company was asked to—literally—sign off on the launch, the book explains, contravening the typical convention of an oral poll on whether or not to move forward. It had been decided that the launch lay within the boundary of acceptable risk.

The engineers, including Boisjoly, felt crushed. They’d assembled a last-minute presentation to try to avert a crisis; they had been overruled without the gathering of any new evidence to contradict their finding. Perhaps individual Thiokol representatives had a desire to please the firm’s billion-dollar client or to keep the shuttle schedule on track. Maybe they just didn’t want to make waves. But the astronauts had no idea that this had even happened when they made their way into the orbiter that morning. Like the American public watching at home, they were convinced that their spaceship would fly.

Higginbotham’s book is full of heart-stopping moments like these—the kind that make a reasonable person shout, “Oh, God, how did they let this happen?” Such events begin decades before Challenger, going back to 1967, when Apollo 1 caught fire on a launchpad, killing three astronauts … after NASA leaders had been warned about the capsule’s faulty wiring and the huge amount of flammable nylon and Velcro inside. More than a decade later, efforts to add an emergency-escape system to the space shuttle fizzled, in part to avoid the public perception that the shuttle was unsafe, the book alleges. Higginbotham later suggests that some of the Challenger crew may have been alive for about two minutes as the crew compartment plunged toward the ocean—but with no way to eject, even though that may not have saved them after such a dramatic failure.

In just the five years that the space shuttle had been operational, the window of tolerated risk inside NASA and among its contractors kept getting wider and wider, Higginbotham exhaustively shows. A great and lurching bureaucracy was trying to match the promise and expectations of the 1960s under the slimmed budget and pro-privatization, anti-government attitudes of the ’70s and ’80s. So rockets that had serious flaws were marked safe for human flight. The burden of proof in flight-readiness reviews seemed to shift in the period before Challenger, Higginbotham suggests, from having to show that a given flight was safe to proceed to having to convincingly demonstrate that it wasn’t —as shown in that disastrous meeting the night before launch, when Thiokol could not make its data prove that the launch would fail.

And it’s stomach-churning to read about the fragility of the orbiter’s heat-resistant tiles, or for Higginbotham to casually reference the foam insulating the shuttle’s external tank, knowing what would happen nearly two decades after Challenger. In 2003, a chunk of foam fell off the tank of the Space Shuttle Columbia as it was lifting off, and hit the orbiter’s wing. Like the O-rings, “foam strikes” were a known problem during launches, documented since the program’s first flights, and the delicacy—and importance—of the heat shield was equally well known. But by this point, loose foam seemed fairly routine, and the impact failed to drum up much alarm at Mission Control. More than a week later, Columbia tried to return to our planet, but the hole that chunk had made in the heat tiles was fatal. The orbiter came apart, killing everyone on board.


These issues—faulty O-rings, foam strikes—were understandable. Theoretically, with study and ingenuity, they were fixable. The problem was not really a lack of technical knowledge. Instead, human fallibility from top to bottom was at issue: a toxic combination of financial stress, managerial pressure, a growing tolerance for risk, and an unwillingness to cause disruption and slow down scheduled launches.

Challenger is a remarkable book. It manages to be a whodunit that stretches hundreds of pages, a heart-pounding thriller even though readers already know the ending. The passion and ideals at the heart of human spaceflight come through, which only adds to the tragedy of understanding how many chances there were to save the astronauts aboard. Our faith in the systems that run our world is really faith in our fellow man—a chilling reality to remember.


The Space Shuttle Challenger Disaster: A Study in Organizational Ethics

Related Papers

MidHath Nigar
Human beings have long been curious to understand their surroundings and to use that knowledge to their benefit. That curiosity extends beyond the immediate environment: there has always been the urge to look up at the sky, to uncover the mysteries of the universe, and to reach for the stars. Driven by the exponential growth of science and technology, humanity undertook the twentieth-century “Space Race,” which pushed the imagination and knowledge of engineers and scientists worldwide toward one of the boldest objectives ever proposed: sending people into space. The National Aeronautics and Space Administration (NASA) could claim victory in that race when, in 1969, the Apollo 11 mission landed the first men on the Moon. The space agencies did not stop there. In this context, NASA announced the ambitious Space Shuttle program: a reusable crewed spacecraft able to make repeated trips into space. The Shuttle’s first flights, though carried out amid uncertainties and details still to be worked out, were promising, and NASA soon dared to fly continuous missions at relatively short intervals. But tragedy struck on Challenger’s tenth flight, an event that changed the history of space exploration forever.
The catastrophe took place on January 28, 1986, when the Challenger was destroyed only 73 seconds after launch, before the eyes of the entire organization and a large part of the American public. The mission, designated STS-51-L [2], was to carry into orbit the second Tracking and Data Relay Satellite for American communication services, along with SPARTAN-Halley, an astronomical platform that would observe Comet Halley, then close to the Earth. The accident claimed the lives of all seven crew members, including a schoolteacher who was to teach children about space upon returning from the mission [3]. Its impact on the public and on the scientific community was so great that many media outlets called it “the largest accident in the conquest of space” [4]. In this essay, we make a thorough study of the technical and administrative factors that contributed to the failure of the Challenger mission, from the planning stage through implementation to the consequences and subsequent investigations, in order to identify the lessons to be learned in both areas. By completing this task, we hope to recognize the critical factors that we, as students who will develop projects of our own, must pay particular attention to.

The two Space Shuttle tragedies, Challenger and Columbia, have led to many papers on case studies on engineering ethics. The Challenger disaster in particular is often discussed due to the infamous teleconference that took place the night before the launch in which some engineers tried to postpone the launch. However, the space shuttle program itself is worthy of study as it relates to the engineering design process, and the details of the Challenger and Columbia disasters are worthy of discussion as they relate to a variety of sub-disciplines, including material science, thermodynamics, fluid mechanics, and heat transfer. This paper summarizes the major technical findings of the Rogers Commission and the Columbia Accident Investigation Board (CAIB). An overview of the history of the space shuttle program, going back to the end of the Apollo program, is presented, including some of the design compromises that were made in order to get political support for the space shuttle program. A detailed bibliography is given that will aid instructors in finding additional material they can tailor to their particular class needs.

Junichi Murata

One of the most important tasks of engineering ethics is to give engineers the tools required to act ethically and to prevent the disastrous accidents that can result from engineers' decisions and actions. The space shuttle Challenger disaster is cited as a typical case in almost every textbook. It is seen as a case from which engineers can learn important lessons, as it shows impressively how engineers should act as professionals to prevent accidents. The Columbia disaster came seventeen years later, in 2003. According to the report of the Columbia Accident Investigation Board, the main cause of that accident was not individual actions that violated certain safety rules but rather the history and culture of NASA: a culture that desensitized managers and engineers to potential hazards as they dealt with problems of uncertainty. This view of the disaster is based on Diane Vaughan's analysis of the Challenger disaster, where in...



Further Case Study Excerpts

  1. The Space Shuttle Challenger Disaster

    Student Handout - Synopsis. On January 28, 1986, seven astronauts were killed when the space shuttle they were piloting, the Challenger, exploded just over a minute into flight. The failure of the solid rocket booster O-rings to seat properly allowed hot combustion gases to leak from the side of the booster and burn through the external fuel tank.

  2. PDF Engineering Ethics Case Study: The Challenger Disaster

    The purpose of case studies in general is to provide us with the context—the technical details—of an engineering decision in which an ethical principle may have been violated. Case Study of Challenger Disaster On January 28, 1986, the NASA space Shuttle Challenger was destroyed in a disastrous fire 73

  3. Space Shuttle Challenger Disaster: Ethics Case Study No. 1

    Allan J. McDonald, former director of the Space Shuttle Solid Rocket Motor Project for Morton Thiokol, discusses the events surrounding the destruction of th...

  4. PDF The Space Shuttle Challenger: A Case Study in Engineering Ethics

    Jan. 24, 1985 - O-Ring blow-by discovered after flight 51-C. Aug. 19, 1985 - NASA management briefed on o-ring issues. By the end of 1985, there have been 24 successful Shuttle flights. Jan. 27, 1986 - Telecon for Challenger flight go/no-go. Jan. 28, 1986 - Challenger disaster kills 7 astronauts.

  5. Case Study

    Introduction to the Case. On January 28, 1986, seven astronauts were killed when the space shuttle they were piloting, the Challenger, exploded at just over a minute into the flight. The failure of the solid rocket booster O-rings to seal properly allowed hot combustion gases to leak from the side of the booster and burn through the external ...

  6. The Challenger Disaster: A Case of Subjective Engineering

    28 Jan 2016. 14 min read. Illustration: Barry Ross. Illustration: Barry Ross. Editor's Note: Today is the 30th anniversary of the loss of the space shuttle Challenger, which was destroyed 73 ...

  7. Space Shuttle Case Studies: Challenger and Columbia

    The two Space Shuttle tragedies, Challenger and Columbia, have led to many papers on case studies on engineering ethics. The Challenger disaster in particular is often discussed due to the ...

  8. Space Shuttle Case Studies: Challenger and Columbia

    Abstract. The two Space Shuttle tragedies, Challenger and Columbia, have led to many papers on case studies on engineering ethics. The Challenger disaster in particular is often discussed due to the infamous teleconference that took place the night before the launch in which some engineers tried to postpone the launch.

  9. PDF The Space Shuttle Challenger Disaster

    On January 28, 1986, the space shuttle Challenger exploded in midair, sending six astronauts and schoolteacher Christa McAuliffe to their deaths. The initial public reaction was shock and disbelief. Americans had come to expect routine flights from NASA. Well before the shock had eased, the public wanted to know why the accident took place.

  10. PDF ENGINEERING ETHICS

    prove the Space Transportation System's cost effectiveness and potential for commercialization. This prompted NASA to schedule a record number of missions in 1986 to make a case for its budget requests. The shuttle mission just prior to the Challenger had been delayed a record number of times due to inclement weather and mechanical factors.

  11. The Challenger Space Shuttle Disaster: A Case Study in the Analysis of

    I am grateful to Jenny Lye for helpful comments on an earlier version of the article. I am also grateful to the Office of National Archives in Washington DC, for permission to reprint material from the 1986 Report of the Presidential Commission on the Space Shuttle Challenger Accident, Volume 1. Search for more papers by this author

  12. Engineering Ethics Case Study: The Challenger Disaster

    This course provides instruction in engineering ethics through a case study of the Space Shuttle Challenger disaster. The minimum technical details needed to understand the physical cause of the Shuttle failure are given. The disaster itself is chronicled through NASA photographs. Next the decision-making process—especially the discussions ...

  13. E

    On-line; Text Course. Course No E - 1142. PDH Units 3.00. TAKE THIS COURSE. $ 75.00. Intended Audience: all types of engineers. PDH UNITS: 3. What led to the failure of the Space Shuttle Challenger? Gain a comprehensive understanding of the relationship between engineering ethics and technical knowledge through the lens of the case study, the ...

  14. Space Shuttle Challenger Disaster and Ethical Issues Case Study

    Space Shuttle Challenger Disaster and Ethical Issues Case Study. The case study details the procedures that took place before the launch of the Challenger Space Shuttle. The author notes that NASA was under pressure from politicians and competing space agencies, which is why the management pushed for the launch despite insufficient testing and ...

  15. PDF Engineering Ethics Case Study: The Challenger Disaster

    Engineering Ethics Case Study: The Challenger Disaster 2020 Instructor: Mark P. Rossow, Ph.D, PE Retired PDH Online | PDH Center 5272 Meadow Estates Drive Fairfax, VA 22030-6658 ... On January 28, 1986, the NASA Space Shuttle Challenger burst into a ball of flame 73 seconds

  16. Space Shuttle Challenger

    The Challenger exploded 73 seconds after launch on January 28, 1986. It was the 25th shuttle mission. The seven astronauts killed included a high school teacher, Christa McAuliffe, who was to broadcast lessons from space. Government investigations and redesign efforts grounded the shuttle program for the next two years (Freiman and Schlager ...

  17. What the Challenger Disaster Proved

    This would be the 25th mission of a NASA space shuttle, and Challenger's sister shuttle Columbia had just returned from a six-day journey 10 days before. In classrooms across the country ...

  18. The Space Shuttle Challenger Disaster A Study in Organizational Ethics

    The two Space Shuttle tragedies, Challenger and Columbia, have led to many papers on case studies on engineering ethics. The Challenger disaster in particular is often discussed due to the infamous teleconference that took place the night before the launch in which some engineers tried to postpone the launch.

  19. The Space Shuttle Challenger Disaster

    The Space Shuttle Challenger Disaster Author(s) Anonymous Year 1992 Description A case study looking at the explosion of the Challenger Space Shuttle. Abstract On January 28, 1986, seven astronauts were killed when the space shuttle they were piloting, the Challenger, exploded just over a minute into the flight. The failure of the

  20. Engineering Ethics Case Study The Challenger Disaster R1

    Course Content This course provides instruction in engineering ethics through a case study of the Space Shuttle Challenger disaster. The course begins by presenting the minimum technical details needed to understand the physical cause of the Shuttle failure. The disaster itself is chronicled through NASA photographs.

  21. PDF Space Shuttle Case Studies: Challenger and Columbia

    Abstract. The two Space Shuttle tragedies, Challenger and Columbia, have led to many papers on case studies on engineering ethics. The Challenger disaster in particular is often discussed due to the infamous teleconference that took place the night before the launch in which some engineers tried to postpone the launch.

  22. PDF Engineering Ethics Case Study: The Challenger Disaster

    The purpose of case studies in general is to provide us with the context—the technical details—of an engineering decision in which an ethical principle may have been violated. Case Study of Challenger Disaster On January 28, 1986, the . NASA . space . Shuttle Challenger was destroyed in a disastrous fire 73

  23. 12 Ethicscasestudies

    This document contains an outline and summaries of 11 engineering ethics case studies presented in a course on introduction to engineering design. The case studies cover topics such as murder, speeding, software piracy, safety issues, truth in public statements, the Challenger disaster, and Toyota unintended acceleration. For each case, the document discusses the legal, moral, and ethical ...

  24. PDF Engineering Ethics Case Study The Challenger Disaster

    The purpose of case studies in general is to provide us with the context—the technical details—of an engineering decision in which an ethical principle may have been violated. Case Study of Challenger Disaster On January 28, 1986, the NASA space Shuttle Challenger was destroyed in a disastrous fire