By Chris Myers
“Who Needs Doctors When You’ve Got Algorithms?”
In a Kansas courtroom, a storm is brewing—one that should have every doctor, hospital, and patient in the country paying close attention.
AdventHealth Shawnee Mission Medical Center has filed a lawsuit against Blue Cross and Blue Shield of Kansas City (Blue KC), accusing the insurer of refusing to pay over $2 million in claims. The insurer's tactic? Using opaque “clinical validation audits” powered by artificial intelligence to invalidate physicians’ diagnoses—often without transparency, clinical justification, or any approval from the providers themselves.
If that sounds dystopian, that’s because it is.
AI Over MD
At the heart of this case is Blue KC’s use of third-party vendors like Cotiviti and Apixio—companies that boast about their AI-driven capabilities to root out allegedly “invalid” diagnoses. But let’s be clear: this is not innovation in the service of care. It’s automation in the service of denial.
According to the lawsuit, these vendors regularly acknowledge that a medical condition has been diagnosed by a licensed clinician, documented in the medical record, and falls within the provider’s scope of practice. And yet—they still override the diagnosis. Why? Because it doesn’t meet a set of secret, often outdated, and non-standard criteria that no clinician ever agreed to.
In other words, AI is now acting as judge, jury, and insurer-friendly executioner, retroactively rewriting medical decisions to save payers money.
The Business of Invalidating Care
AdventHealth claims that more than 350 legitimate diagnoses—many involving comorbidities or complications that increase payment rates—were declared invalid by Blue KC’s auditors. And not just invalid—invisible. Gone. Erased from the record as if the patient were never sicker than a generic template suggested.
Even worse, these audits are often conducted under the assumption that insurers like Blue KC get to define what qualifies as a diagnosis—superseding not only hospital agreements but also the treating physicians themselves.
This is not medical judgment. This is financial manipulation wrapped in tech jargon.
Apixio’s 60% Invalidation Rate: A Feature, Not a Bug
Apixio’s own marketing touts that 60% of hospital stays it reviews include a “clinically invalid” diagnosis. Think about what that means: They’re effectively claiming that doctors—real, credentialed physicians—are wrong more than half the time. And based on this premise, they deny payment.
How convenient for payers. How dangerous for care.
The lawsuit also notes that appeals are often rejected instantly—hardly the behavior of a system giving thoughtful review to complex medical decisions. This is a machine built to say “no”—fast, final, and with no room for nuance.
What’s at Stake
This is not just a Kansas story. It’s a national warning. As insurers rush to integrate AI under the banner of “efficiency,” they are erecting an invisible firewall between doctors and reimbursement—and patients will pay the price.
The use of AI to undermine diagnoses threatens to:
- Erode trust between clinicians and payers
- Delay or deny care that depends on diagnosis-based coding
- Bankrupt hospitals already operating on razor-thin margins
- Punish patients with surprise bills or denied claims
And perhaps worst of all: it devalues human medical judgment in favor of algorithms built to maximize denial rates.
The Bottom Line
Technology should support care—not second-guess it in secret.
Blue KC may claim it’s simply following process. But the process is flawed, opaque, and built to prioritize profit over patients. If a physician’s diagnosis, backed by medical records, can be invalidated by an insurer’s black-box algorithm, we’ve crossed a dangerous line.
It’s time for state regulators and federal lawmakers to step in. Require transparency in audit criteria. Prohibit retroactive denial of clinically appropriate diagnoses. And restore the primacy of medical professionals over AI vendors working on commission.
Because if we don’t act soon, the next time you’re in the hospital, your care might be documented by a doctor—but your insurer will be listening to a bot.