
The use of artificial intelligence in the enforcement of health care regulations


Artificial intelligence (AI) is a virtual construct that uses algorithms, data, and computational power to simulate human intelligence in machines. Physicians and health care professionals often imagine AI as a futuristic, benevolent, childlike humanoid with the unique ability to love, as depicted in Steven Spielberg's 2001 film A.I. Artificial Intelligence. The United States Department of Justice (DOJ) sees it differently. In 2022, the DOJ announced the widespread deployment of AI in the civil and criminal enforcement of health care regulations, an enforcement regime I consider deeply unjust. The DOJ warned health care professionals of severe penalties, including stiff fines and criminal charges, for using AI in perceived malfeasance.

The DOJ's current stance mirrors another Spielberg film, Minority Report (2002), in which a specialized police department known as Precrime uses AI and the foreknowledge of psychics to apprehend criminals before they commit their crimes. The movie foreshadows how AI could enforce health care laws and regulations in the near future. Imagine a physician merely considering coding a clinical note at a higher level for additional compensation: DOJ agents could be summoned on the spot, or worse, the physician could be whisked away in a predawn raid for asserting his or her constitutional rights.

Physicians might scoff, thinking, “Enough of this nonsense.” But consider this: in 2015, the Centers for Medicare & Medicaid Services (CMS) contracted Safeguard Services LLC (SGS) as a Unified Program Integrity Contractor (UPIC) for the Northeast region. SGS uses predictive analytics and AI to audit physicians and recover funds for CMS. Physicians who have encountered UPICs know that nothing good comes of these interactions—CMS demands its “pound of flesh,” and resistance is futile. Even flawless documentation can’t prevent a recoupment if an SGS “precog” targets you. SGS proudly celebrates its recoupment of billions on its website, portraying itself as a defender against fraudulent physician activity.
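To make the targeting concrete, here is a minimal, purely hypothetical sketch of how such a predictive-analytics system might score billing outliers. SGS's actual models are proprietary; the physician names, coding rates, and threshold below are invented for illustration only.

```python
import statistics

# Purely hypothetical sketch: how a predictive-analytics contractor
# might flag billing "outliers" for audit. The real models are
# proprietary; every number here is invented.

# Fraction of each physician's E/M visits billed at the highest level.
high_level_rates = {
    "Dr. A": 0.12,
    "Dr. B": 0.15,
    "Dr. C": 0.11,
    "Dr. D": 0.58,  # heavy use of the top-level code
    "Dr. E": 0.14,
}

mean = statistics.mean(high_level_rates.values())
stdev = statistics.stdev(high_level_rates.values())

# Flag anyone well above the peer mean. The threshold is tuned to this
# toy five-physician sample; real systems use far richer features.
for doc, rate in high_level_rates.items():
    z = (rate - mean) / stdev
    if z > 1.5:
        print(f"{doc}: rate={rate:.2f}, z={z:.1f} -> flagged for audit")
```

Run as written, this flags only Dr. D. Note what the sketch makes plain: the model never reads a chart or weighs medical necessity; it flags statistical deviation from peers and leaves the physician to prove the deviation was justified.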

A skeptical colleague might argue, "Dr. Rifai has lost his marbles." Then consider this: SGS's parent company, Peraton, is one of the largest government intelligence contractors. Peraton's contracts cover AI, predictive analytics, and national security work for agencies such as the FBI, CIA, NSA, and DIA. Its website boasts: "Fearlessly solving the toughest national security challenges." If this doesn't give you pause about the AI capabilities at CMS's disposal, then carry on. But remember, as Elon Musk said, the government "doesn't need to follow normal laws." It will acquire AI by any means necessary.

In recent years, CMS has expanded SGS’s contract to include the Southern and Western U.S. jurisdictions. CMS is even considering giving SGS control over auditing 50 percent of U.S. Medicare and Medicaid regions. If you treat Medicare patients, beware: CMS, through SGS, will target you for delivering what they consider “unnecessary” care, and you won’t know when your number is up.

This is not fantasy. Peraton is also involved in space research, surveillance satellites, and unconventional weapons, and it is owned by Veritas Capital, which also owned the electronic health record company athenahealth. Veritas is rumored to be eyeing an acquisition of Epic. Peraton proclaims, "Together we do the can't be done." But it's not "together" with physicians; it's together with Uncle Sam. In this dystopian future, physicians who even contemplate upcoding would face immediate repercussions, perhaps a microwave beam from an orbiting satellite. Even patients could be punished for "unnecessary" care, their cell phones overheating to end their doctor visits.

Science fiction? Think again. Consider the recent coordinated explosions of pagers and other electronic devices targeting a political group plotting violence in the Middle East.

Back on Earth, a Pew Research Center survey conducted in December 2022 found that 60 percent of Americans would be uncomfortable with their health care provider relying on AI in their own medical care. Physicians share this discomfort, worrying about malpractice liability and flawed algorithms. Despite AI's proficiency in language processing, pattern recognition, and problem-solving, many pitfalls remain in its use for patient care.

AI is only as reliable as the data it receives: corrupt the data, and the conclusions will be corrupted too. Over the coming weeks and months, I plan to share more articles about AI's role in health care and its enforcement, drawing on my real-life experiences. I'll keep writing, unless Tom Cruise comes after me, as in Minority Report.
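To illustrate that first point with the same toy audit model sketched above (again with invented numbers), note how a single corrupted record can both create a false flag and hide a genuine outlier:

```python
import statistics

# Same toy model as before, but a data-entry error has corrupted
# Dr. B's record: a wrong claims denominator inflates 0.15 to 0.95.
high_level_rates = {
    "Dr. A": 0.12,
    "Dr. B": 0.95,  # corrupted input; the true value was 0.15
    "Dr. C": 0.11,
    "Dr. D": 0.58,
    "Dr. E": 0.14,
}

mean = statistics.mean(high_level_rates.values())
stdev = statistics.stdev(high_level_rates.values())

for doc, rate in high_level_rates.items():
    z = (rate - mean) / stdev
    if z > 1.5:
        print(f"{doc}: z={z:.1f} -> flagged")

# One bad record now flags Dr. B falsely, while Dr. D, the genuine
# outlier, drops to a z-score of about 0.5 and escapes notice.
```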

Muhamad Aly Rifai is a practicing internist and psychiatrist in the Greater Lehigh Valley, Pennsylvania. He is the CEO, chief psychiatrist, and internist of Blue Mountain Psychiatry. He holds the Lehigh Valley Endowed Chair of Addiction Medicine. Dr. Rifai is board-certified in internal medicine, psychiatry, addiction medicine, and psychosomatic medicine. He is a fellow of the American College of Physicians, the Academy of Psychosomatic Medicine, and the American Psychiatric Association. He is the former president of the Lehigh Valley Psychiatric Society.

He can be reached on LinkedIn, Facebook, X @muhamadalyrifai, YouTube, and his website. You can also read his Wikipedia entry and publications.





