The U.S. is once again sending people to the moon.
The world met the crew of the planned Artemis II mission in early April and celebrated an upcoming 10-day voyage that should both stir nostalgia and fuel a new generation’s love of crewed spaceflight.
But after multiple space catastrophes in the past 60 years, Project Artemis needs to exemplify NASA’s commitment to safety when taking humans out of Earth’s atmosphere. In the aftermath of these catastrophes, NASA has repeatedly shifted its approach to safety, which is commendable, but often the agency’s people have ignored red flags and reports that could have prevented astronaut deaths.
To that end, NASA needs to make it possible for agency employees and contractors to point out possible program weaknesses without fear of reprisal. The agency needs to ensure that reporting mechanisms are in working, responsive order and that managers can and will act on safety concerns. It’s our hope that NASA will step up to the task. Still, some NASA staffers, including a whistleblower to whom we have spoken, believe the agency has a long way to go.
NASA was not yet nine years old when a fire in the Apollo 1 command module killed three crew members in 1967. The accident surprised the American public and many within NASA and exposed how the agency was unprepared to build more complex spacecraft. Several people had raised concerns about the work quality of prime contractor North American Aviation (NAA) and the risk of fire.
These included people who had strong influence in the space program, such as Wernher von Braun, then director of the Marshall Space Flight Center, and Air Force General Sam Phillips, then director of Project Apollo. In one stunning example from documents from the National Archives and the NASA History Division, a subcontractor on the project had warned a NASA manager that the risk of fire would be “better considered now than by the Monday morning quarterbacks.”
The decision-makers within Project Apollo were focused on their deadline, and no one dared do anything to delay the program. Groupthink set in: communication among NASA directorates broke down, and critical engineering milestones failed to account for the fact that three people would be flying an entirely new spacecraft. Engineers ignored warning signs, and managers dismissed concerns. Before the disaster, Joseph Shea, head of the Apollo Spacecraft Program Office, asserted that a crew member smoking in the cabin was the only way a fire could occur.
As he recounted in a 1969 interview, NASA Administrator James Webb called the fire a “failure of management” a few months after the incident and created groups to supervise and report on the progress of the project. He forced out program heads, including Shea and NAA space division head Harrison Storms. The administrator looked to shake up the Project Apollo management structure, as the fire had shaken his faith in senior managers.
Webb characterized his actions after the tragedy as “saving the system by correcting the procedures.”
By all appearances, NASA became a safety-first agency. But then the explosion of shuttle Challenger in 1986 shattered that assumption. Rubber O-rings that separated sections of the shuttle’s solid-rocket boosters contracted in cold weather and malfunctioned, causing a nightmarish explosion moments after launch.
In another failed whistleblowing episode, Roger Boisjoly and Allan McDonald of contractor Morton Thiokol had warned NASA not to launch in below-freezing temperatures. Joe Sutter, a member of the Rogers Commission that investigated the cause of the accident, concluded NASA’s organizational structure “was a mess, with competing fiefdoms, tangled reporting lines—and no top-level leader focused solely on safety.”
NASA halted the shuttle program for more than two years while it examined how to better identify safety risks and how to better manage safety concerns. The agency, in response to a recommendation from the Rogers Commission, set up an office of safety, reliability and quality assurance. Even so, it took a third disaster for the agency to be shaken enough to consider formalizing its safety culture.
In 2003 the shuttle Columbia broke up during reentry, killing the seven-member crew. The accident was traced to insulating foam that had separated from the shuttle’s external tank during launch, striking the leading edge of a wing and breaching the thermal protection that shields the ship during reentry. The loose foam issue had been known for years.
Again, NASA worked to bolster safety measures, with Tracy Dillinger, a member of the Columbia Accident Investigation Board, concluding, “NASA [didn’t] have a systematic way of getting feedback.” In 2009, more than four decades after the Apollo fire, NASA finally created an official safety culture program.
Perhaps complacency played a role in all three accidents, particularly those involving the shuttle, which had become a routine method of space travel. Regardless, these tragedies raised a pressing question that remains: How and why are red flags so often ignored or dismissed?
This remains an open question, a NASA safety engineer tells us. The engineer flagged a possible fire concern several times through NASA’s reporting systems and believes management is more interested in appearing to prioritize safety than in ensuring it.
In 2015 the whistleblower reported the possible launchpad fire risk to a manager. Nothing changed. The NASA engineer reported the concern again through NASA’s official safety reporting system and to NASA’s Office of Inspector General. The engineer says instead of action, their manager, who knew about the concerns, offered only sharp criticism. As retaliation and career security became a concern, the NASA employee submitted a complaint that eventually was referred to the federal government’s Office of Special Counsel. They also submitted their concerns to the Occupational Safety and Health Administration.
The engineer told us it was unclear whether the agency was doing anything about the issue reported through the NASA safety reporting system: because the system preserves anonymity for those who choose it, there is no formal communication process even for those who don’t. A whistleblower has no way to actively communicate with those reviewing the concern to offer context or suggestions, and the system offers minimal feedback or status updates.
Moreover, they tell us, it wasn’t until last year that antiretaliatory provisions for protected safety disclosures were included.
It has been 20 years since the Columbia accident, and NASA’s previous accidents each occurred a little less than two decades apart. That cadence cannot be ignored as Artemis II is scheduled to launch in November 2024. Our hope is that, this time, two decades will have been enough to ensure the astronauts onboard return home safely.
Any weaknesses in NASA’s current reporting structure must be rooted out now, as we enter a new space race and as pressures to compete with China and other spacefaring nations come to the surface.
This requires not only mechanisms that effectively pass red flags along to the appropriate managers but also a top-down culture shift in which managers will not retaliate against or hinder the career of anyone who speaks out.
To offer a minor edit to James Webb’s remarks, NASA should strengthen the system by constantly correcting the procedures.
Viewing the official portrait of the Artemis II crew released by NASA, it’s impossible not to think of the incredible feat they will risk their lives to achieve. It’s also impossible not to think of the ones they are leaving here on Earth and how imperative it is that we bring these loved ones home safely.
This is an opinion and analysis article, and the views expressed by the author or authors are not necessarily those of Scientific American.