
There’s definitely audio from all those meetings. You can hear the mission manager plow right through the issues with bureaucratic ease. They’d seen something similar before, and even though it was never explained, the mere fact that they’d seen it made them treat it as a known issue.

You want to blame her for it and shake her out of it as you hear her say it, but it wasn’t malicious. I believe the mission manager even had a spouse who was an astronaut, so it’s obviously not like they don’t care. I’ve always found it fascinating how organizational structure and pressure can take really brilliant and motivated people and beat them into making such poor decisions.

It’s really important to remember that these people are not idiots. It’s literally a room of rocket scientists and space shuttle mission managers. It’s easy for us all to say here that it was obvious what was happening and that anyone who didn’t ask the right question was a moron, but these structures take on a life of their own. If I had it to do all over again, studying that would make a fascinating and potentially rewarding career.



For the past two years I have worked in the local emergency room as a technician to get some clinical experience. Let me tell you, in the constant face of life and death, you really just develop a kind of "danger fatigue" and even the most critical moments become somewhat prosaic.

You begin to develop a false sense of security after nobody really dies from a gunshot wound. Then someone septic comes in, "seems" fine, and they're dead in a few hours.

I'm not sure what this kind of logical fallacy is called, but I suspect it's similar in a government environment where you're constantly at RED ALERT. The risk of danger just seems overstated, even when it isn't.


A very similar phenomenon happens in aviation—we call it complacency. Thousands of successful takeoffs in a row make it hard sometimes to remember that each one is a completely independent event.

The way I fight it is by explicitly reminding myself that just because something worked yesterday, that doesn't mean I can skip a step today or let my guard down at any point.

It really does take deliberate thought though. Funny how the brain works.

(The upside is that every successful takeoff becomes a delightful surprise!)


Same thing for rock climbing, and complacency born from routine is exactly the problem: you tie knots that your life will depend on every day, thousands of times, and then stuff like this happens to some of the most skilled and experienced people:

- get distracted while tying the rope to your harness and leave the knot unfinished, fall 20 meters from the top of the climb (Lynn Hill, by sheer luck only broke her foot and elbow)

- use a slightly unusual rope setup and when preparing to be lowered, tie in on the wrong side of the anchor, fall 14 meters onto rock (Rannveig Aamodt, broke her spine, pelvis and ankles)

- have your partner point out damage on your harness, shrug it off because there's plenty of safety margin, continue climbing for 3 days in a manner that puts repetitive abrasion on exactly that part of the harness, have it snap while rappelling and fall 150 meters to your death (Todd Skinner)


Also on the roads. So many people drive round a blind corner at high speed because they do the journey every day and the road is always empty. Until one day it isn't.


Sometimes there are comparatively simple lifehacks you can establish to prevent complacency from leading to problems.

A little example from my life is forgetting to secure the buckle of my motorcycle helmet. Once I've done everything else and gotten my gloves on, it's a pain to take them off again to buckle it up if I forget, and a few times I just shrugged it off and rode anyway.

Then an instructor mentioned that he always follows exactly the same procedure each time, to the point that he even puts the same glove on first every time. It made me wonder if I should do that, and in thinking it through I realised that if I changed my order to always buckle up before putting my glasses back on, I'd never forget. There's no way I'd ride off without the glasses (because I can't see without them!), so if that step can only come _after_ buckling up, then there's now no way I'd ever not buckle it up either.


Not quite what you're talking about, but very much related is normalization of deviance [1]:

What these disasters typically reveal is that the factors accounting for them usually had “long incubation periods, typified by rule violations, discrepant events that accumulated unnoticed, and cultural beliefs about hazards that together prevented interventions that might have staved off harmful outcomes”. Furthermore, it is especially striking how multiple rule violations and lapses can coalesce so as to enable a disaster's occurrence.

[1] https://danluu.com/wat/


I hear what you are saying and agree with the danger fatigue observation, but it is different for the managers at NASA and Boeing who are reviewing the slides.

By the nature of your situation, you are facing these life and death decisions on a constant basis. The NASA and Boeing managers do not. I can't imagine they are part of life and death scenarios very often, if at all.

Critical thinking failed them that day.


I don't think he's that far off the mark. Take the seals. This wasn't a one-off decision. This was a problem that was known for years and repeatedly ignored at lower and lower temperatures (thus lowering the allowed limits far below the original specifications) until the disaster. The reasoning being "well, you warned us last time, but everything went fine, so let's just keep going, it'll keep being fine".


True, it is easy to be wise after the event. However, the management culture at NASA, which was also responsible for the loss of Challenger and which Richard Feynman criticized, did not change.

This led to managers ignoring engineers' warnings about the foam strikes on Columbia, and also rejecting requests for high resolution images.

This is exactly the same culture which ignored engineers' warnings about the O-rings on the SRBs.

Linda Ham, the mission manager who rejected these requests, left the space shuttle program after the Columbia disaster and was moved to other positions at NASA. Not firing or disciplining these managers will cause similar disasters in the future. https://en.wikipedia.org/wiki/Linda_Ham#Columbia_disaster_an...


> Not firing or disciplining these managers will cause similar disasters in future.

I think the quote is something like "Why would I fire them, we just lost X lives and dollars training them?"


That doesn't seem to quite square with them being reassigned. In Ham's particular example:

> "Ham's attitude, and her dismissal of dissenting points of view from engineers, was identified as part of a larger cultural problem at NASA.[2] After the report's release, Ham was demoted and transferred out of her management position in the space shuttle program."


That's the best option if you want to be able to rebuild confidence in the chain of command. The people who have been ignored before will never trust her again anyway. In such situations, you either rebuild the chain of command or the whole troop if you want to have a functioning group.


For sure they are not "idiots", but these are people who usually have done well in school/academia, where saying "stop, don't do it!" is never the right answer; these are people who did not get promoted to where they are now by saying "stop, don't do it!".

Recently I saw a product owner trying to push us into a two-week sprint where we needed to complete 100 story points worth of tasks, while in the previous two sprints we had only completed ~30 story points. Despite my protest, the Scrum master sided with the product owner in saying we should go ahead and commit to the goal.

These are not stupid people; they were under pressure to deliver. Unfortunately the incentives are not to listen to reasonable people, but to pushers: when the shit hits the fan, they make a big fuss and obtain even more resources to push even harder, and they usually deliver -- with 3X the costs.

Very rarely have I seen reasonable and smooth-sailing managers get to the top; usually it's the die-hard/hard-pusher/busy-appearing types who do.

I think it's a curse of modern society with its unlimited resources; this shit would never fly in the times of Sun Tzu or Caesar, precisely because limited resources would prevent such smart "idiots" -- they'd succumb to the elements, or their subordinates would do them in in their sleep for being so detached from reality.


With the obvious warnings about limited sample size, anecdotal evidence and so on...

I've encountered a few people with this trait, and the general trend I've noticed is an excess of optimism and a hint of complacency.

"I have the utmost faith that my team can do 100 hours of work in 25!"

"We did 75 scheduled hours of work in 15. That means we can easily do 100 in 25." (completely overlooking that the team - by that point - had been working from 7am until 9pm for the entire week, ordered food in and had to take several days PTO to recover)

Curiously these are usually the same managers who demand crunch-time, but "have a prior engagement" when the team asks if they'll be helping too.


I don't think anyone is saying that it was malicious, but it was absolutely negligent. For people with decisional authority, organizational pressure is not an excuse. For people with responsibility, lapsed vigilance is not acceptable. They weren't forced to make a split-second decision, and they weren't fatigued. While there were contributing systemic factors, it was still a mistake to dismiss the information so quickly. They're professionals, not amateurs.


It's easy for us to pass judgement in hindsight, but what is an acceptable failure rate? For Apollo, I believe it was around 7-8%. There's always going to be some risk; what you don't see is all the times those administrators correctly identified a risk as acceptable, or scrubbed a mission due to an inaccurate risk assessment.

A space program that demands better-than-six-sigma risk, for example, is unlikely to ever get off the ground.
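
For a rough sense of the gap, here's a back-of-the-envelope comparison (my own illustrative numbers, not figures from any report; the six-sigma rate is the commonly quoted ~3.4 per million, and the Shuttle figure is simply 2 losses over 135 flights):

    # Rough comparison of per-mission loss probabilities (illustrative only)
    six_sigma = 3.4e-6        # commonly quoted six-sigma defect rate
    apollo_estimate = 0.075   # ~7-8%, per the parent comment
    shuttle_actual = 2 / 135  # two vehicles lost in 135 flights, ~1.5%

    print(f"Shuttle vs six sigma: ~{shuttle_actual / six_sigma:,.0f}x riskier")
    print(f"Apollo estimate vs six sigma: ~{apollo_estimate / six_sigma:,.0f}x riskier")
    # Both come out orders of magnitude above a six-sigma threshold, which is
    # the point: demanding six-sigma safety would rule out the programs we had.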


Although I'm technically judging in hindsight since I wasn't there, I don't think I'm being unfair in my judgement. I have experience in safety and I've seen better and worse safety reviews than this one. Although there were plenty of external factors that pushed people to act the way they did, the people we're talking about were the responsible authority for vetting these issues. It was their job to take ownership of the issues and their inquiries, and they failed to conduct themselves appropriately.

Just because their mistake didn't push the failure rate past the acceptable limit doesn't absolve them of anything.


> I’ve always found it fascinating how organizational structure and pressure can take really brilliant and motivated people and beat them into making such poor decisions.

I get your sentiment, but there are more stories to tell about NASA events. Robert Trivers wrote a book [1] about his research on human behavior, specifically how self-deception plays a role in such events.

[1] https://www.amazon.com/Folly-Fools-Logic-Deceit-Self-Decepti...


> There’s definitely audio from all those meetings

Are these accessible publicly? Could someone link to them, if available online? It's not easy to get a deep view into organizational decision making. The least we could do is to learn from mistakes.


It's important to remember that it was their job to be mindful of the risks. These weren't folks working at a bake sale or something; these were trained professionals who had been entrusted with significant and weighty responsibilities, and they did not take those responsibilities seriously enough. No, they weren't idiots, and that makes it far worse. Negligence is a common human failing; we should remember what it looks like and how it happens, not pretend that it doesn't exist or that it should be excused.


They acted irresponsibly. They should answer for their actions. They had a responsibility and failed miserably.

I have no pity for such lack of professionalism. You are correct in saying that it is not so easy and that these structures take on a life of their own, but this shouldn't be an excuse, and we see disasters happening over and over because people don't act as professionals.

Anyone in the room who saw such mistakes and was capable of standing up should have, even if it meant jeopardizing their careers by doing so.

* I'm not asking for a hero, and I understand that failure is part of human nature, but we should have [professional] respect where it is due.


> They acted irresponsibly. They should respond for their actions.

It's satisfying to point to a handful of people to place blame, but it isn't terribly scientific. We should be asking why they acted irresponsibly. The folks making these decisions didn't act in a vacuum; they were a product of NASA, systems engineering at large, a bureaucratic institution, and our own societal norms.

It's likely that others who went through similar training and operated in a similar environment would have made the same decision. But even if the bad call was due to a few bad apples that inserted themselves into the decision making process, we should ask how we allowed them to get there.

Punishing an individual may or may not be warranted in this case, I suspect that the guilt they must live with is punishment enough. However, what's clear is that punishment won't be enough to prevent similar problems from arising again.


> We should be asking why they acted irresponsibly.

> I suspect that the guilt they must live with is punishment enough.

We should ask this for the sake of science. However, accident prevention is not the only thing at stake here. They signed off and should be held responsible. I'm not even talking about punishment here but about repairing the damage done.

This is a question of justice first of all. Punishing for the sake of punishment is not the way forward. It is true that people will think twice [before dismissing such complaints] if they see that you can't get away with [unintended] murder, but this should be a secondary, positive side-effect only. If you try to punish to send a message, it might be unjust, send the wrong message (like silencing whistleblowers who have doubts, or discouraging people from fixing their own mistakes), or both.


> This is a question of justice first of all.

No it's not. This is a question of optimizing spaceflight for safety.



