Another Lesson From Boeing: Silos

Boeing’s missteps with the 737 Max jet offer many powerful lessons for corporate compliance, audit, and risk executives. Our latest lesson comes from an in-depth article in the New York Times, examining the decisions Boeing made about the jet’s design and subsequent pilot training, and the terrible consequences that followed.

Every compliance and audit professional should read the article for themselves, but in brief: Boeing engineers originally designed the jet’s automated MCAS flight control system to be far safer, and to activate far more rarely. MCAS was supposed to rely on data from two sensors, and to turn itself on only while flying at high speeds. Under those circumstances, the MCAS system wouldn’t pose any unusual safety risk.

Then the engineering team decided to use the MCAS system much more extensively. That included letting the system push the nose of the plane down more sharply — and, critically, relying on data from only one sensor rather than two.

The 737 Max jet.

Why did the engineers do that? Because they wanted MCAS to activate at low speeds as well, not only at high speeds. Of the two original sensors, one measured speed and the other measured the plane’s angle. Once MCAS relied only on that “angle of attack” sensor, a false reading from it could push the jet nose down, and the pilots might not be able to pull it back up.
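To see why that single design decision mattered so much, here is a minimal sketch of the redundancy principle at work. This is emphatically not Boeing’s flight software; the function names, units, and thresholds are all invented for illustration.

```python
# Hypothetical illustration of single-sensor vs. cross-checked activation
# logic. Names, units, and thresholds are invented for this example; this
# is not MCAS code, just the redundancy principle at work.

AOA_LIMIT_DEGREES = 15.0      # assumed activation threshold
DISAGREEMENT_LIMIT = 5.0      # max tolerated gap between redundant sensors

def should_activate_single(aoa_sensor: float) -> bool:
    """Relies on one angle-of-attack reading. A single faulty sensor
    reporting, say, 40 degrees triggers activation with no sanity check."""
    return aoa_sensor > AOA_LIMIT_DEGREES

def should_activate_crosschecked(aoa_left: float, aoa_right: float) -> bool:
    """Requires two independent readings to agree before acting. A lone
    bad sensor causes a disagreement, and the system stands down
    (in practice it would also alert the crew)."""
    if abs(aoa_left - aoa_right) > DISAGREEMENT_LIMIT:
        return False  # sensors disagree; don't trust either one
    return min(aoa_left, aoa_right) > AOA_LIMIT_DEGREES

# One failed sensor reading 40 degrees while the plane is actually level:
print(should_activate_single(40.0))              # True  -- nose pushed down
print(should_activate_crosschecked(40.0, 2.0))   # False -- fault caught
```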

We all know the result. That’s what happened with the Lion Air crash last October and the Ethiopian Airlines crash in March. Those engineering decisions created the conditions for that outcome.

And how did we get from those MCAS changes, adopted in 2016, to 346 deaths more than two years later? For compliance and audit professionals, this is the article’s crucial paragraph:

Test pilots aren’t responsible for dealing with the ramifications of such changes. Their job is to ensure the plane handles smoothly. Other colleagues are responsible for making the changes, and still others for assessing their impact on safety.

In other words, Boeing had a highly compartmentalized chain of decision-making — and nobody could see the whole picture. Nobody could see how the consequences of a decision in one compartment could create adverse effects in another part of the enterprise. Within each individual compartment, the decision seemed reasonable. Nobody assessed how immaterial risks in each compartment might accumulate into one collectively material safety risk.
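A toy calculation makes that last point concrete. Suppose, purely hypothetically, that five compartments each sign off on a change that looks immaterial from inside the silo. The numbers below are invented (and real risks are rarely independent), but the arithmetic is the point:

```python
# Invented numbers: five compartments each accept a change that, viewed
# locally, leaves failure odds comfortably low. Compounded, the enterprise-
# wide failure probability can cross a threshold no single silo ever saw.

local_failure_odds = [0.01, 0.02, 0.01, 0.03, 0.02]  # each seems immaterial

# Probability that at least one compartment's risk materializes,
# assuming (simplistically) that the risks are independent:
p_no_failure = 1.0
for p in local_failure_odds:
    p_no_failure *= (1.0 - p)

print(f"Enterprise-wide failure probability: {1.0 - p_no_failure:.1%}")
# -> roughly 8.7%, even though no single compartment exceeded 3%
```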

That’s how Boeing, the Federal Aviation Administration, and two airlines got 346 people killed.

The Risks of Silos

Compliance and audit executives talk about this sort of risk all the time. We simply call them “silos” rather than compartments — but make no mistake, that’s what happened at Boeing.

I’m torn about siloed organizations. On one hand, silos allow division of labor and greater efficiency. The concept has been around at least since Adam Smith described his pin factory in The Wealth of Nations in 1776, and silos always will be with us. They should always be with us. They’re indispensable to business.

On the other hand, the success of silos depends on good communication within the business — and we’re getting worse at that, because modern IT lets us create ever more complex organizations. We’re creating complexity faster than we’re improving communication to manage the silos that complexity creates.

That’s the risk chief compliance officers and chief auditors need to worry about. It’s subtle, and can manifest in any number of ways, but in one form or another, that’s the risk that worries CEOs and boards the most: that the organization’s complexity overwhelms decision-making ability, because individual groups don’t relay the right information to the right colleagues.

I worry because email, cloud-based services, and the rest of modern business technology allows us to fall into this trap so easily. It’s not so much that modern technology allows us to create more silos; better to say that modern technology allows us to create organizations that are more silo-able, if we can use that adjective.

That is, we’re creating a world where organizations are split into silos by design, rather than by necessity. That approach can work, but it drives up the importance of communication and governance among mid-level and senior executives, so they all have an “enterprise-wide awareness” as they make decisions within their silos.

Once you scale up the organization to hundreds of thousands of workers, operating in dozens of countries, as a mix of employees, contractors, and other third parties, in highly specialized silos while making precise products — that’s quite the communication challenge.

Careful Risk Assessments

So if we define the risk as “a highly compartmentalized organization that works against seeing enterprise-wide risks” — how exactly might you perform a risk assessment for something like that? What specific problems would a compliance or audit executive want to identify, problems tangible enough that you could recommend solutions?

I would look for problems in the communication process, where individual compartments do speak up, but under the mistaken belief that everything is within acceptable norms. That’s what happened at Boeing. It happens at large organizations everywhere.

For example, when Boeing changed the design of the MCAS system, the FAA had already approved the original version. FAA rules required a second look at the new MCAS system only if the new version changed how the jet performed in “extreme situations.” The updated system, however, was understood not to change how the jet performed in extreme situations — which meant the FAA team in charge of pilot training procedures didn’t require new training on the new MCAS. And when Boeing asked the FAA to remove MCAS from the jet’s pilot manual, that FAA team said sure. Because, from their perspective, why wouldn’t they?

If the New York Times article casts aspersions on any one person, that person is Mark Forkner, Boeing’s chief technical pilot on the Max jet. He was the one who asked the FAA about cutting MCAS from the pilot manual.

Is this disaster Forkner’s fault? We don’t know enough facts to say. But again, consider compartmentalization. Prior to, and separate from, Forkner’s work on the Max jet, Boeing had decided that technical pilots would work only in simulators, rather than in actual planes. That decision later left Forkner with less understanding of the Max’s risks.

You see where I’m going with this: one could easily construct a scenario where the whole organization does have a good corporate culture, and everyone makes reasonable decisions within their own purview — but nobody understands how those decisions affect other decisions down the line, so miscommunication multiplies, and misunderstanding of risk multiplies with it.

Questions to Ask

Perhaps another way to assess your compartmentalization risk is to reverse-engineer the disaster. What’s the worst operational or compliance risk your company might face? How could poor communication among different silos of the enterprise create the conditions for that disaster?

That exercise is especially valuable as your organization embraces new technology, outsources business processes, or flattens management oversight as part of a restructuring. Those ideas always sound feasible on their own, especially to the MBAs blurting them out with little operational know-how. That’s when you want to ask, essentially: “How would we lose sight of the bigger picture? How could we prevent that?”

Risk dashboards might help — but if they only summarize operational data for whoever is reading the screen, that person still needs to know how to put the data in context. No wonder audit executives keep saying, in poll after poll, that the most important trait in an audit staffer is “knows the business.” Risks like this are why.

You can also consider escalation procedures for when someone flags a risk. Again, however, the crucial point is understanding the context — so the challenge isn’t really escalating up to senior executives; it’s carrying information across from one part of the enterprise to another.

(Principle 14 of COSO’s internal control framework: “The organization internally communicates information, including objectives and responsibilities for internal control, necessary to support the functioning of internal control.”)
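What might that “carrying across” look like in practice? Here is a hypothetical sketch, not any particular GRC tool (every name in it is invented), of a risk flag that routes laterally to the functions a change could affect, as well as vertically to senior management:

```python
# Hypothetical sketch: a risk flag that routes laterally to affected
# functions as well as vertically to senior management. Field names and
# the routing map are invented for illustration.

from dataclasses import dataclass, field

CROSS_SILO_MAP = {
    # which other functions need to hear about a flag raised in each area
    "flight_controls": ["pilot_training", "certification", "safety_analysis"],
    "pilot_training": ["flight_controls", "documentation"],
}

@dataclass
class RiskFlag:
    raised_by: str
    description: str
    notified: list = field(default_factory=list)

def escalate(flag: RiskFlag) -> RiskFlag:
    # Vertical escalation: senior management always sees the flag.
    flag.notified.append("senior_management")
    # Lateral escalation: every silo whose decisions the change could
    # affect gets the same context, not a summary filtered through layers.
    flag.notified.extend(CROSS_SILO_MAP.get(flag.raised_by, []))
    return flag

flag = escalate(RiskFlag("flight_controls",
                         "System now relies on a single sensor input"))
print(flag.notified)
# ['senior_management', 'pilot_training', 'certification', 'safety_analysis']
```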

Maybe by the end of the 2020s, artificial intelligence will be able to do far more sophisticated risk analysis that solves a lot of these compartmentalization problems for us. For now, however, organizational complexity and compartmentalization are racing ahead, so compliance and audit executives need to think about the communication policies and practices that can keep all that complexity well governed.

That, more than anything else, is what your bosses want.
