As healthcare organizations move to integrate artificial intelligence (AI) into clinical workflows, there is a significant gap between the perceived capabilities of the technology and its practical reality, according to industry experts interviewed by HealthTech magazine.
While AI is often touted as a revolutionary force, experts from major academic institutions warn that persistent misconceptions regarding the replacement of physicians, data automation, and cost savings could hinder effective adoption.
One of the most pervasive fears surrounding AI is that it will eventually render human doctors obsolete. However, experts emphasize that AI is designed to augment clinical expertise rather than replace the nuanced judgment of a human professional.
“The first [misconception] is that AI is smarter than doctors and will replace them,” said Dr. Lee Schwamm, senior vice president and chief digital health officer at Yale New Haven Health System. “That’s just not an accurate understanding of where AI will contribute value in healthcare.”
Hongfang Liu, vice president of the learning health system at UTHealth Houston, noted that the technology remains tethered to human oversight for both legal and functional reasons. “AI actually depends on humans for data generation and interpretation, and AI alone cannot function as an agent independently, due to the legal liability,” Liu said.
Another common misunderstanding involves the “normalization” of data—the process of standardizing various formats from different sources. While AI can accelerate this process, it is not yet a fully automated solution.
“A lot of people think AI is so powerful that it can automate and perfectly normalize different formats from all the different sources, and without much human oversight. No — totally not there yet,” said Xiaoyan Wang, a research professor at Tulane University. She noted that it remains “the most challenging problem in healthcare on the data end.”
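To see why normalization resists full automation, consider a minimal sketch (not drawn from the article, with made-up formats and records): rule-based or learned mappings handle the formats they have seen, but values from an unanticipated source must still be routed to a human rather than guessed at.

```python
# Toy illustration of normalizing dates pulled from three hypothetical
# source systems: known formats are standardized automatically, while
# unrecognized values are flagged for human review instead of guessed.
from datetime import datetime

# Hypothetical formats observed in the source systems.
KNOWN_FORMATS = ["%Y-%m-%d", "%m/%d/%Y", "%d-%b-%Y"]

def normalize_date(raw: str):
    """Return an ISO 8601 date string, or None if the value is unrecognized."""
    for fmt in KNOWN_FORMATS:
        try:
            return datetime.strptime(raw, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    return None  # unknown format: route to human review, don't guess

records = ["2024-03-05", "03/05/2024", "5-Mar-2024", "05.03.2024"]
normalized = [normalize_date(r) for r in records]
# The last record uses a format the system has never seen, so it comes
# back None -- exactly the kind of case that still needs human oversight.
```

Scaling this idea across every field, vocabulary, and source system in a health network is what makes normalization, in Wang's words, so challenging on the data end.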
From a business perspective, many organizations expect AI to deliver immediate cost reductions. Experts argue that while AI can improve efficiency and reduce the burden on providers, it is not a guaranteed money-saver for the healthcare system at large. Most current applications focus on “back-office processes” and “coding support” rather than direct patient care.
Dr. Schwamm clarified that so-called “hallucinations”—where AI generates incorrect information—are often misinterpreted as a sign of a broken system. He suggested these should instead be viewed as “inaccurate predictions” inherent to any prediction model.
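Schwamm's framing can be made concrete with a toy sketch (not from the article; the corpus and drug names are invented): a model that predicts the most likely continuation will confidently emit whatever its training data makes most frequent, whether or not it is true in the case at hand.

```python
# Toy next-word predictor: it asserts the statistically most common
# continuation, not verified knowledge -- the same mechanism that
# produces confident but wrong "hallucinations" in larger models.
from collections import Counter

# A tiny invented training corpus.
corpus = [
    "aspirin treats pain",
    "aspirin treats fever",
    "aspirin treats pain",
]
words = " ".join(corpus).split()
bigrams = Counter(zip(words, words[1:]))

def predict_next(word):
    """Return the most frequent word seen after `word`, or None."""
    candidates = {b: n for (a, b), n in bigrams.items() if a == word}
    return max(candidates, key=candidates.get) if candidates else None

# Whatever drug name precedes "treats", the model will complete it with
# its most frequent continuation -- a prediction, not a checked fact.
```

In this framing, an LLM hallucination is not a malfunction but the expected failure mode of prediction: the model extrapolates from frequency where it lacks grounded knowledge.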
He added that chatbots, while well suited to giving patients quick and easy answers, cannot match the full capabilities of a physician.
“Chatbots will outperform physicians in certain circumstances, but they don’t have the ability to absorb context that is not directly provided to them,” Schwamm added.
According to the experts, the path forward requires a shared responsibility model involving clinicians, IT leaders, and vendors to ensure AI is deployed safely and effectively.