CNN reported on Tuesday that United Airlines grounded 96 Boeing 757 airliners for "unscheduled maintenance". The grounding meant that some of the airline's flights were canceled or delayed Tuesday night and Wednesday. The Federal Aviation Administration said United's action involved its compliance with an airworthiness directive from 2004, but that it was “a voluntary action on United's part”. One can wonder why a safety directive from 2004 is only being addressed today; the fact is that someone within United was bold enough to ground the planes immediately and start implementing the directive. In your projects you will also encounter situations where risks could have been identified earlier but, for various reasons, weren't. A bold project manager might continue to push forward, arguing that this is what is expected of him. But would you, like United's management, be bold enough to delay a major milestone to address a risk?

It took United 24 to 36 hours to bring the fleet of Boeing 757 airplanes back to full operational status – resulting in cancelled flights and, no doubt, a serious loss of revenue. My only conclusion is that United deemed the fallout associated with the risk (of an incident or accident) too high, and as a result decided to address the situation immediately. As you can imagine, this cannot have been an easy decision. In situations where I do not know the likelihood of a risk occurring, I use the theory of Type I and Type II errors to help me understand the potential consequences of a decision.

A Type I error is the error of rejecting a hypothesis when it is actually true. A Type II error is failing to reject a hypothesis when it is actually false. Combine this with asking yourself “what is the worst that can happen if I make the wrong decision?” and you've got a technique to aid your decision making. I would not be surprised if the United executives followed the same strategy. Let me explain, using United's case as an example. Consider the following hypothesis: “Grounding the 757 fleet is necessary to prevent accidents”. If this statement turns out to be true but was rejected (a Type I error), the consequences would be astronomical. A preventable plane crash would cost unforeseeable amounts of money, not to mention years of investigations, negative publicity and legal proceedings. If the statement turns out to be false but was accepted (a Type II error: we grounded the planes, but nothing would have happened anyway), the consequences are limited – the costs may be substantial, but they are certainly manageable. And after a week, everyone will have forgotten about it. As a United executive, I'd rather make a Type II error than a Type I error in this situation. And as a passenger, I will happily board a United flight next time, knowing they're not afraid to take measures when required.
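The reasoning above boils down to comparing the worst-case cost of each error type and preferring the cheaper one. Here is a minimal sketch of that comparison in Python; the `worse_error` helper and both cost figures are my own illustrative assumptions, not real data from the United case.

```python
# Illustrative decision aid: compare the downside of a Type I error
# (rejecting a true hypothesis) with that of a Type II error
# (accepting a false one). All cost figures below are hypothetical.

def worse_error(cost_type_1: float, cost_type_2: float) -> str:
    """Return which error type carries the larger worst-case cost."""
    return "Type I" if cost_type_1 > cost_type_2 else "Type II"

# Hypothesis: "Grounding the 757 fleet is necessary to prevent accidents."
# Type I error:  hypothesis true but rejected -> a preventable crash.
# Type II error: hypothesis false but accepted -> unnecessary grounding.
cost_crash = 1_000_000_000   # assumed order of magnitude only
cost_grounding = 10_000_000  # assumed order of magnitude only

print(worse_error(cost_crash, cost_grounding))  # -> Type I
```

Since the Type I error is the one you can least afford here, you accept the hypothesis and ground the fleet, deliberately risking the cheaper Type II error instead.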

I’ve been in a similar situation (okay, the impact was slightly smaller…) when my project team was ready to launch a website giving advice to voters (based on a Q&A game). Just before go-live, I received feedback that the advice was heavily biased – not intentionally – but answering all the questions in line with the election platform of one party would score that party significantly lower than doing the same for another party. The team’s hypothesis was: “going live will not generate negative publicity”. Accepting this hypothesis when it is actually false (a Type II error) would lead to a lot of negative publicity for our company. Rejecting it when it is actually true (a Type I error: not launching the site when no negative publicity would have been generated) would mean my team had spent weeks of work in vain. Based on this view I decided I’d rather make the Type I error, and I pulled the plug on the launch. As a result we missed our launch window, and others beat us to it. Needless to say, my executives were not thrilled, but they did compliment me on my decision and ended up giving me more responsibility. Which brings me back to my original question: would you be bold enough?