Lessons on Algorithms, Ethics, and Equity

Compliance professionals searching for cutting-edge policy management mishaps, turn your gaze toward Stanford University. The mess that the medical school endured last week with its bungled distribution of covid vaccines to employees offers lessons to us all. 

What happened? According to press reports and many unhappy people on Twitter, Stanford Medicine started rolling out its vaccines to several thousand employees — and thanks to a faulty algorithm, excluded almost all of the hospital system’s medical residents who provide direct care to covid patients. Instead, vaccination slots went to more senior doctors who typically don’t have close contact with patients, or to nurses and other staff in similarly covid-less lines of care.

The plan had been for the algorithm to assign vaccination slots for the first 5,000 employees, who would begin receiving the covid vaccine on Dec. 18. That group had to include doctors, residents, nurses, orderlies, food service workers, and other staff; the initial supply was never going to be enough. (Stanford Medicine, which encompasses the medical school and two hospital systems, has many thousands of employees involved in patient care.) 

Apparently, however, the algorithm assigned only seven residents to that priority list of 5,000 — out of more than 1,300 residents across the whole system, many of them exposed to the virus on a daily basis as part of their jobs. Meanwhile, the algorithm assigned at least some of those valuable slots to Stanford Medicine pathologists and radiologists, who can do most of their work from a lab, or to staff who work in elective surgery units, where patients must be covid negative before they ever reach the operating room. 

The residents, of course, were furious. They fired off a letter to Stanford Medicine’s leadership last Thursday, demanding swift corrective action. Among their demands:

  • A full and specific timeline for staff vaccination;
  • Full transparency into the algorithm that Stanford used to assign vaccination slots, including “an explanation of what checks were in place to ensure that the list was indeed equitable as intended”;
  • A stronger voice for residents within Stanford Medicine’s leadership councils, including a standing monthly meeting between the chief resident and Stanford Medicine’s CEO.

According to an email from Stanford Medicine executives to residents obtained by the Washington Post, management blamed the algorithm and quickly apologized. 

“The Stanford vaccine algorithm failed to prioritize house staff,” the email said. “This should not have happened — we value you and the work you do so highly. We had been told that residents and fellows would be in the first wave. This should never have happened nor unfolded the way it did.”

Precisely how Stanford will rectify this remains unclear. That’s fine; compliance and audit professionals have plenty to consider already.

Algorithms and Equity

The first lesson here is to use algorithms with caution. Presumably Stanford wanted to reduce the influence of subjective human judgment in a decision as difficult as who would get the first doses of covid vaccine — but thanks to poor oversight and design of that algorithm, everything backfired. 

For example, residents say the algorithm put them at a disadvantage because they don’t have an assigned “location” within the hospital system, which was one of the variables the algorithm used to assign vaccination slots. Residents also tend to be in their 20s and 30s, so they might seem at low risk statistically even though their job duties put them at high risk. The algorithm missed that nuance, too.
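
Stanford has not published the algorithm itself, but the reported flaw is easy to illustrate. The sketch below is purely hypothetical: the field names, weights, and scoring formula are assumptions invented for illustration. It shows how a score built on an assigned unit and age, with no job-duty variable, silently ranks a rotating resident last.

```python
# Hypothetical reconstruction of the kind of scoring the reports describe.
# Stanford has not published its algorithm; every field name, weight, and
# formula below is an assumption, invented purely to illustrate the flaw.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Employee:
    role: str
    age: int
    location: Optional[str]  # assigned unit; rotating residents have none

# Assumed exposure weights by assigned unit.
LOCATION_RISK = {"covid_ward": 1.0, "emergency": 0.9, "pathology_lab": 0.1}

def priority_score(e: Employee) -> float:
    # A resident with no assigned location silently scores zero exposure:
    # the exact blind spot the residents complained about.
    exposure = LOCATION_RISK.get(e.location, 0.0) if e.location else 0.0
    age_risk = max(0.0, (e.age - 50) / 50)  # older staff rank higher
    return round(0.6 * exposure + 0.4 * age_risk, 3)

resident = Employee("resident", age=28, location=None)
pathologist = Employee("pathologist", age=60, location="pathology_lab")
print(priority_score(resident))     # 0.0  -> bottom of the list
print(priority_score(pathologist))  # 0.14 -> ahead of the resident
```

Nothing in that code is malicious; the humans who chose the inputs simply never asked whether every high-risk employee had a value for each input.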

To be clear, other staff might still belong ahead of residents. For example, an attending physician who works at the hospital only once a week but is black, asthmatic, and 68 years old may indeed deserve vaccination before a healthy, white, 28-year-old resident who is there every day. These decisions depend on multiple variables, and algorithms can be a great vehicle to make them as quickly and fairly as possible — if the algorithms are designed properly. 

In Stanford’s case, the algorithm wasn’t properly designed. It ignored several factors critical to assigning the correct risk profile for residents. So nobody should be surprised that it gave bad results. 
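
To make that concrete, here is one assumed fix within the same hypothetical sketch: measure exposure from actual job duties, such as direct covid-patient shifts per week, rather than from an assigned unit. Again, the variable names and weights are illustrative assumptions, not Stanford’s design.

```python
# Continuation of the hypothetical sketch above: exposure is now derived
# from actual job duties (an assumed measure: covid-patient shifts per
# week) instead of an assigned unit. Names and weights are illustrative.

def priority_score_v2(age: int, covid_shifts_per_week: int) -> float:
    exposure = min(1.0, covid_shifts_per_week / 5)  # direct-care frequency
    age_risk = max(0.0, (age - 50) / 50)            # age-based medical risk
    return round(0.6 * exposure + 0.4 * age_risk, 3)

# A resident treating covid patients daily now outranks a lab-based
# pathologist, matching how most people would intuitively order the risk.
print(priority_score_v2(age=28, covid_shifts_per_week=5))  # 0.6
print(priority_score_v2(age=60, covid_shifts_per_week=0))  # 0.08
```

The point isn’t the specific weights; it’s that someone with domain knowledge chose a variable that reflects real exposure.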

That’s a powerful point about algorithms: they don’t make decisions; they crunch numbers according to pre-existing criteria that humans selected. Those criteria are a reflection of people’s business objectives, ethical priorities, and cultural predispositions. 

So if you want algorithms to give results that will be accepted by large groups of people — say, a diverse workforce wondering who gets dibs on a small number of covid vaccines — you need to be sure those pre-existing criteria make sense.

That brings us to the other important lesson in this spectacle: equity. 

If you read the residents’ letter to Stanford Medicine leaders, you see complaints about equity surface multiple times. They say it literally in the second demand quoted above: “an explanation of what checks were in place to ensure that the list was indeed equitable as intended.” They demand a stronger voice in management councils. Those are complaints from an important part of the organization that feels excluded and disempowered from decisions that affect their lives. 

Are the residents correct to believe that? I don’t know — but regardless, that’s what they believe. Any time management encounters resentment and distrust like that, it’s a huge, waving red flag that something is amiss with the corporate culture. 

The broader lesson for compliance and audit professionals is to consider those issues of equity and oversight before letting an algorithm take over critical business processes. 

You need to ensure that the people selecting the assumptions an algorithm will use don’t have ethical or cultural blind spots that could ruin the outcomes. You do that by asking pointed questions: Is this algorithm properly designed? How do we know that? Who was involved in designing the algorithm? Who wasn’t? How transparent is our oversight process? How are we communicating that transparency? 

The residents’ complaint letter draws a direct connection between equity and algorithms: “We believe that to achieve the aim of justice, there is a human responsibility of oversight over such algorithms to ensure that the results are equitable.” 

They raise a good point — one that will hold true long after we’re all back in the office with vaccination marks on our arms. 
