
Could the Right Algorithms Help Fight the Opioid Crisis?

By Anna Fiorentino

The distinguished lecture series at Northeastern’s Institute for Experiential AI (EAI) showcases individuals within the college and beyond who are introducing AI to our lives in meaningful and transformative ways. Last fall, as part of its “Expeditions in Experiential AI Series,” Angela Kilby, assistant professor of economics at Northeastern, sat down with EAI’s Inaugural Executive Director Usama Fayyad to discuss what Fayyad refers to as a “topic of mystery.” Her lecture, titled “Algorithmic Fairness in Predicting Opioid Use Disorder Using Machine Learning,” underscores the value of responsible AI in healthcare delivery.

Better Algorithms for Prescribers

A new wave of AI is rolling in, permeating the institutional and industrial landscape with innovative breakthroughs and financial gains. But without ethical implementation and accurate machine learning algorithms, the AI experiment could leave some organizations, and in this case patients, in deep water. Kilby sheds light on the importance of modeling the most accurate data by telling the story of one very high-stakes application, where an algorithm could be the difference between life and death: predicting opioid use disorder with machine learning.

Her timely presentation comes on the heels of recent reports citing the highest drug overdose rates in history during the pandemic. Kilby asks: could those deaths have been prevented? Her research indicates it’s possible, with fairer algorithms trained on individual patient populations.

“There’s considerable uncertainty now in the medical space over appropriate use of opioids for the treatment of pain,” says Kilby. “Has an increase in prescribing led to a host of social problems with abuse? Has a more recent pull back in prescriptions caused harm to patients in terms of treatment for what can be debilitating pain? Is there undertreatment of pain?”

Kilby’s research, now awaiting publication, untangles dozens of commercial and academic algorithms designed to predict opioid use disorder (OUD) and steer clinicians away from prescribing opioids for pain to patients at risk of drug addiction. To test whether machine learning (ML) can accurately identify the patients most likely to develop an OUD, or, worse, die from one, she built an algorithm that mirrors those now in use and identified their biases and inconsistencies.
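To make the setup concrete, here is a minimal, purely illustrative sketch of the kind of risk model at issue: a classifier trained on claims-style patient features that outputs an OUD risk score. The feature names, synthetic data, and model choice are assumptions for demonstration, not Kilby’s methodology or any vendor’s product.

```python
# Illustrative sketch only: a toy OUD risk classifier of the kind described
# above. All features and data are synthetic; this is not Kilby's model.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000

# Hypothetical claims-style features: age, days of prior opioid supply,
# number of pain diagnoses, and a benzodiazepine co-prescription flag.
X = np.column_stack([
    rng.integers(18, 90, n),   # age
    rng.poisson(30, n),        # prior_opioid_days
    rng.poisson(2, n),         # pain_dx_count
    rng.integers(0, 2, n),     # benzo_flag
])
# Synthetic outcome loosely tied to the features, for illustration only.
logits = -4 + 0.02 * X[:, 1] + 0.3 * X[:, 2] + 0.8 * X[:, 3]
y = rng.binomial(1, 1 / (1 + np.exp(-logits)))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

# Scores like these are what prescribing software surfaces to clinicians.
risk = model.predict_proba(X_test)[:, 1]
print(f"Held-out AUC: {roc_auc_score(y_test, risk):.2f}")
```

A score above some threshold would flag the patient as high risk; everything that follows is about whether such flags are accurate and fair.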

What Kilby found, and outlined in the lecture, is that the ML algorithms now in use do not properly inform clinical decision-making about appropriate opioid prescriptions for the general population. On the contrary, they could be directing providers to deny treatment to the patients who need it most and to prescribe opioids to others at risk of developing OUD.

The algorithms “don’t provide any value to what clinicians are already doing,” says Kilby, and her results even “suggest that if doctors realign their prescribing to be more concordant with algorithm risk scores, prevalence of OUD might rise, which is a concerning result.”

Data Biases and Restrictions

Today, before prescribing a patient an opioid, clinicians consult a computerized system that runs an ML algorithm, similar to Kilby’s, to flag patients at risk for OUD. The problem is that these algorithms are trained on patient-history databases that carry biases and are unavoidably limited by privacy restrictions. They fail to consider variables like class and race, and they raise glaring concerns about unfair treatment of individuals with disabilities, who likely experience the most chronic pain and are therefore considered most at risk of developing an addiction.
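One way those biases surface is in error rates that differ across patient groups. The audit below, a standard fairness check run on entirely hypothetical data rather than Kilby’s analysis, compares how often each group’s eventual OUD cases were missed by the risk flag.

```python
# Illustrative fairness audit on hypothetical data: compare false-negative
# rates of a risk flag across patient subgroups.
import pandas as pd

df = pd.DataFrame({
    "group":     ["A", "A", "A", "A", "B", "B", "B", "B"],
    "flagged":   [1, 0, 1, 0, 0, 0, 1, 0],  # algorithm flagged as high risk
    "developed": [1, 1, 0, 0, 1, 1, 1, 0],  # patient actually developed OUD
})

# False-negative rate per group: OUD cases the algorithm failed to flag.
fnr = (
    df[df["developed"] == 1]
    .groupby("group")["flagged"]
    .apply(lambda s: 1 - s.mean())
)
print(fnr)  # a gap between groups means one bears more missed cases
```

The point is simply that a single overall accuracy number can hide which patients a model systematically fails.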

Right off the bat, Kilby also found selection bias: the models are trained solely on limited records of patients who’ve already received opioid therapy at least once. And, most concerning, they fail to estimate differences in treatment effects for individuals within specific populations, like those without chronic pain.

While the algorithms succeed at directing doctors to cease opioid prescriptions for patients with the most chronic pain, they do not pick up on the adverse effects of opioid prescriptions among patients without chronic pain. Both groups saw similar reductions in OUD after prescriptions ended, and were therefore equally at risk, but the data told a different story.

“That implies that reallocating opioids away from patients with a high risk of developing OUD and toward patients with low risk could even worsen the situation, potentially,” she says.
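Kilby’s point about treatment effects can be made concrete with a small calculation. The sketch below, on made-up numbers, estimates the change in OUD incidence after discontinuing prescriptions separately for patients with and without chronic pain, the kind of subgroup comparison she argues the deployed models skip; a real analysis would also need to adjust for confounding.

```python
# Illustrative subgroup treatment-effect estimate on made-up data: change in
# OUD incidence when prescriptions end, split by chronic-pain status.
import pandas as pd

df = pd.DataFrame({
    "chronic_pain": [1, 1, 1, 1, 0, 0, 0, 0],
    "discontinued": [1, 1, 0, 0, 1, 1, 0, 0],
    "oud":          [0, 0, 1, 1, 0, 0, 1, 0],
})

# Naive per-subgroup effect: OUD rate (discontinued) minus OUD rate (continued).
rates = (
    df.groupby(["chronic_pain", "discontinued"])["oud"].mean()
      .unstack("discontinued")
)
rates["effect"] = rates[1] - rates[0]
print(rates)
# If both subgroups show similar effects, reallocating prescriptions from
# one group to the other does not reduce overall risk.
```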


The Opioid Crisis Meets AI

To realize the value of using AI to predict opioid addiction, one must acknowledge the origin of a highly publicized and controversial epidemic that dates back 30 years, Kilby notes in her lecture. Concern had been mounting that pain was being undertreated, and in the 1990s the medical community began encouraging the liberalization of opioids to treat pain. That’s when OxyContin was first approved.

But by 2011, overdose deaths from opioid misuse had roughly tripled. Five years later, over a decade into the crisis, the Centers for Disease Control and Prevention (CDC) finally reversed its guidelines, no longer deeming opioids a preferred treatment for chronic pain. But the decision came at a cost. The CDC drew blowback from the American Medical Association, which said the guidance lacked a patient-centered view, and most of us know what happened next: the death rate rose from misuse of an illegal alternative, heroin, which was eventually contaminated with an even deadlier drug, fentanyl.

According to Kilby, society has by now learned to accept the general notion that the cost to the patient of not receiving an opioid for chronic pain is low, while the cost to society of prescribing one is high.

“But that relates back to the motivation of this talk, which is that it is quite in dispute. I have work that shows that a reduction in prescribing causes significant reductions in one’s ability to work and other negative outcomes for pain patients,” she says, adding that the issue isn’t fully settled and the tradeoffs are fraught, given the negative and sometimes deadly effects of developing OUD.

Today, with the opioid crisis at its height and still little to no evidence-based research delineating the benefits and harms of opioid therapy, experts are shifting the fight against the crisis to ML algorithms trained to predict and prevent OUD. Data scientists and medical scientists are teaming up to examine variables like a patient’s medical background, including previous opioid use. And, says Kilby, we still have a long way to go, particularly when it comes to training models to investigate treatment effects within specific populations.

“We’re motivated by a search for ways to identify patients who can be prescribed opioids safely, without prescribing opioids to people who may experience harm,” she says.

Read a recent story in Wired highlighting Kilby’s research and the stories of people who’ve been in chronic pain for decades because they couldn’t get an opioid prescription.

To learn more about how AI can solve problems in business and the real world, visit our website, which is chock-full of articles and faculty lectures demystifying these applications in AI.