While AI has comfortably nestled into administrative corners, trimming inefficiencies and enhancing operational flow, its march into clinical territories is met with a starkly different reality. This isn’t about automating paperwork; it’s about influencing diagnoses, treatment plans, and even patient trust. What happens when AI crosses the line from assistant to authority? When it prescribes medication or becomes the voice patients heed over their own physicians?
The HealthLeaders 2024 AI in Clinical Care Mastermind program, sponsored by Microsoft-Nuance, convened clinical trailblazers from 10 health systems to grapple with these very questions. Their candid discussions reveal not just early victories in radiology and patient communication, but also the raw, unvarnished struggles: navigating governance without stifling innovation, integrating AI without alienating clinicians, and chasing ROI amidst ethical gray zones.
Moving beyond administrative tasks, AI is now influencing diagnoses, treatments, and patient trust—raising critical ethical and operational questions.
Striking the right balance between robust AI governance and fostering clinical innovation is crucial to sustainable success.
Early involvement of clinicians in AI development ensures smoother adoption, minimizes resistance, and enhances patient outcomes.
Success with AI isn’t just about financial returns—it’s about scalability, sustainability, and ethical deployment within clinical care.
During the Mastermind roundtables, several executives expressed a desire to work with colleges and universities to make sure the next generation of doctors and nurses enters the workforce with a knowledge of AI. In some cases that means creating partnerships to give students hands-on training, particularly in clinical situations. But it also means ensuring that educators are up to date on the legal and ethical issues behind using AI in clinical care.
“We have the imperative to swim upstream, into undergraduate medical education, in nursing schools and medical schools, and insert some of this foundational education,” Sheahan says.
Community Health Network, which is based in Indianapolis, operates 10 hospitals and employs more than 17,000 staff members. The health system reported 2.9 million outpatient visits and 7,600 births at its hospitals last year.
For those already in the workforce, executives should create opportunities for doctors and nurses to learn about AI on the side, with the understanding that this will improve not only their grasp of how AI will impact care delivery but also their ability to adjust to new workflows and opportunities.
“How do we get our people, our nurses, our doctors, our clinicians, our revenue cycle teams, to understand how the technology supports the future of their work?” Sheahan adds. “We can't just focus on deploying the fun new technology and expect that organically it's going to change your business, right?”
In addition, clinicians should be included from the very beginning in developing new programs or using new technology that includes AI. Once again, the benefits are twofold: They’re learning how to use AI properly so that they’ll be able to hit the ground running when the program or tool is integrated into care pathways, and those programs will have a better chance of succeeding, with less provider pushback.
“You must include the clinicians in AI adoption from the beginning,” says Patrick McGill, MD, MBA, executive vice president and chief transformation officer of Community Health Network.
In short, forward-thinking health systems are putting doctors and nurses on the front lines of testing out AI tools and platforms because they’ll be the ones who ultimately make those new tools work. And a health system’s AI strategy and expertise will grow and mature as its providers expand their knowledge.
This is a tricky concept at present, as many health systems and hospitals are just getting into AI. A number of them are creating their own AI governance committees, separate from other innovation and technology committees. The reasoning is that AI is so significant and rapidly evolving that it needs to be addressed on its own.
NewYork-Presbyterian is an academic health system that operates eight campuses. The health system has 50,000 staff members and more than 10,000 affiliated physicians. NewYork-Presbyterian is affiliated with two medical schools: Weill Cornell Medicine and Columbia University Vagelos College of Physicians and Surgeons.
Not everyone agrees with that idea.
Health systems and hospitals need to understand the outcomes and goals they are trying to accomplish with AI, says Robert Bart, MD, chief medical information officer at UPMC.
“We are past the time that people are saying they want to use AI because it is AI,” Bart says. “You need to make sure as you are examining the use of AI in your health system that it drives the type of clinical outcomes you are trying to achieve. In addition, AI models should have the scale that your organization needs.”
Bart says the process for identifying the desired goals and outcomes of an AI tool depends on the problem you are trying to solve.
“You want to identify your goals and desired outcomes early on when you are evaluating whether you want to adopt an AI tool, whether it impacts clinical care, the efficiency of care delivery, or other aspects that a health system has targeted,” he says.
For that reason, some healthcare leaders prefer to include AI as part of the organization’s innovation strategy, and to treat AI as just another technology. Setting AI apart creates extra levels of governance that can slow down the process to a point where clinicians might get frustrated and look to avoid using AI.
Some executives point out that AI is so ingrained in the healthcare enterprise, integrating with so many different departments and functions, that it can’t be set aside and treated on its own.
Again, this is not a tried-and-true strategy just yet. Those that do create AI governance committees and/or C-level AI executives are doing so because AI is new and disruptive, and has the potential to do serious harm if misused or misunderstood.
OSF HealthCare is based in Peoria, Illinois. The health system, which reported $4.6 billion in total net revenue in Fiscal Year 2024, operates 17 hospitals. OSF HealthCare's physician network employs more than 2,215 primary care doctors, specialists, and advanced practice providers.
Providence, which is based in Renton, Washington, owns 51 hospitals and reported $28.7 billion in operating revenue in 2023. The health system employs 122,000 caregivers, including 34,000 physicians and 38,000 nurses.
Establishing a governance strategy should be done early in the process of adopting AI tools, according to Eric Alper, MD, chief quality officer and chief clinical informatics officer at UMass Memorial Health.
“There is so much happening so fast in this space, including regulations, guidelines, and tools that are being released,” he says. “That is going to continue to evolve. So, you must have your set of experts lined up as more AI tools are released and there are changes in the environment.”
UMass Memorial has set up an AI governance committee. The panel has several representatives from key areas at the health system, including clinicians, IT staff, legal team members, risk management staff, ethicists, and people who are focused on health equity.
“Choose wisely and do your due diligence,” Alper says.
Providence has established AI governance to get ahead of several issues raised by AI models, according to Hoda Asmar, MD, MBA, executive vice president and system chief clinical officer at the health system.
“Providence proactively assembled an AI governance structure to ensure alignment around priorities and strategy, and ensure safety, privacy, security, equity, and the ethical use of AI,” Asmar says. “This governance structure will evolve as our experience and knowledge around AI deepens.”
The AI governance structure has several elements, Asmar explains.
“Providence has put together an AI guardrails workgroup led by our system’s chief data officer; an information protection committee led by our chief information security officer; and a data ethics council, led by our chief ethicist,” Asmar says. “The work of these three teams feeds into the Generative AI Leadership Council that oversees our responsible use of AI and advances our AI strategy.”
AI tools are playing an assistive role for care teams, and it is unlikely that they will replace doctors and nurses, according to Asmar.
MedStar Health is a $7.7 billion not-for-profit, regional healthcare system based in Columbia, Maryland, and it’s one of the largest employers in the region with more than 34,000 associates.
Sutter Health, which reported $15.8 billion in operating revenue in 2023, is based in Sacramento, California. The health system employs more than 57,000 staff members, including more than 12,000 physicians and more than 15,000 nurses.
UMass Memorial Health, which is based in Worcester, Massachusetts, operates five hospitals. The health system employs 16,000 staff members, including 2,100 clinicians and 3,100 registered nurses.
UPMC, which is based in Pittsburgh, operates more than 40 hospitals and 800 doctors’ offices. The health system employs 100,000 staff members, including more than 5,000 physicians.
“Early indicators suggest these tools have a high level of engagement and satisfaction among care teams by allowing clinicians to spend more time with their patients, reducing stress and administrative task burdens, and allowing clinicians to focus on what matters most to them and their patients,” Asmar says.
Others aren’t so sure. AI is seen as a key to improving workflows for clinicians and helping them address the challenges that are driving doctors and nurses out of the workforce. But the fact remains that healthcare needs more providers, and there aren’t enough coming up through the system to replace those that hospitals and health systems are losing.
Some have suggested that AI can indeed replace the doctor or nurse in tightly regulated and highly monitored instances where consumers need access to care and providers just aren’t available. Think of a direct-to-consumer telehealth platform that addresses non-acute concerns, such as a nagging cough or rash. Could an AI bot at some point take care of that health concern?
When asked to look into the future, Mastermind participants focused on AI as a tool for clinicians and others in the healthcare ecosystem, rather than as a replacement. And they see AI integrating with the technological backbone of the hospital and, in effect, becoming the hospital operating system.
Sheahan envisions an AI-enabled operating system serving the enterprise, responding to queries from pretty much anyone within the health system. The platform might help doctors map the best care plan for patients, give nurses direction on inpatient care, help the revenue cycle management team deal with a prior authorization or denial, or even map out the best route for a patient to a specialist appointment across town.
“We want to show the many groups that can benefit from AI why and how to use it to make themselves more efficient,” Sheahan said. “Ultimately, that is what is going to deliver ROI over time. We can make our business more efficient if we're all in this together.”
The HealthLeaders Mastermind series is an exclusive series of calls and events with healthcare executives focusing on pain points that matter most to you. This Mastermind series features ideas, solutions, and insights on excelling in your AI programs. Please join the community at our LinkedIn page.
To inquire about participating in an upcoming Mastermind series or attending a HealthLeaders Exchange event, email us at anorris@healthleadersmedia.com.