Do health care organizations need a chief AI officer?
Baptist Health Medical Group CMIO Brett Oliver, MD, says it may be too early to routinely add this new role to the health care C-suite. Find out why.
The audience at the HLTH health innovation conference in Las Vegas didn’t hear the answer they were expecting when members of an augmented intelligence (AI) expert panel were asked whether health organizations should welcome a designated AI executive into their C-suite.
“I think folks were maybe a little surprised that the entire panel was like ‘No,’” said HLTH panelist Brett Oliver, MD, a family physician and chief medical information officer for Louisville, Kentucky-based Baptist Health Medical Group.
With a multitude of vendors on the convention floor marketing AI-powered services and AI-themed discussions taking place all over the conference, the opposite answer seemed all but expected.
Dr. Oliver argues that before creating such a role, health systems and medical groups first need to raise their overall “AI literacy” so that everyone understands the basics of how AI works.
“If you have just that one person, if you have an AI officer, human nature says, ‘Oh great, that's Brett, he's taking care of it,’” Dr. Oliver said in an interview.
He explained that, by having a chief AI officer, others may feel relieved of any responsibility for AI developments within the organization.
To illustrate his point, Dr. Oliver gave a hypothetical example of a device vendor reaching out to the end users of a hospital’s cancer-treating linear accelerator and saying: “Guess what? In two weeks, we’ll give you an AI upgrade. You don’t have to do anything. It’s going to be great.”
“We need a level of AI literacy so that the physicians using the device say: ‘Whoa, wait a minute. Has it gone through our AI evaluation process? What AI are you talking about?’” Dr. Oliver explained. “Relatively small conversations like that can have a big impact to make sure that the AI that's coming in gets evaluated and is monitored.”
Some prominent health systems have, in fact, chosen to appoint a chief AI officer, and Dr. Oliver added that it was not the intent of the panel, which was convened to discuss “integrating AI into your tech strategy,” to “disparage those that have chosen to do it differently.” It was just that he and the other panelists felt it was not the right direction for their own systems at this particular time.
Baptist Health Medical Group is a member of the AMA Health System Program, which provides enterprise solutions to equip leadership, physicians and care teams with resources to help drive the future of medicine.
Democratizing AI governance
Instead of establishing a chief AI officer position, Baptist Health Medical Group created an AI oversight committee with representatives from the organization’s clinical, operational, legal, finance, marketing and other divisions.
The committee went on a “12-week journey” to develop an AI governance structure, which included hiring a consultant who “interviewed about 50 people and probably surveyed another 50,” holding workshops and learning the goals and intentions departments had for AI.
“On the other end of that, we came up with a plan to stand up what we're calling an ‘AI Enablement Center’ that reports to that oversight committee, and it’s where all the work gets done,” Dr. Oliver explained. “It's that central repository, where all the policies and everything goes through.”
The center includes department representatives called “portfolio leads,” and it is where people come forward with a business case for something they would like to accomplish using AI.
Dr. Oliver described the center’s governance and the processes it uses as “democratizing,” and it is where people go if they “have an idea, problem or challenge” that AI may be able to address.
The AMA has developed new advocacy principles that build on its existing AI policy. These principles address the development, deployment and use of health care AI, with particular emphasis on:
• Health care AI oversight.
• When and what to disclose to advance AI transparency.
• Generative AI policies and governance.
• Physician liability for use of AI-enabled technologies.
• AI data privacy and cybersecurity.
• Payor use of AI and automated decision-making systems.
Clinical uses for AI
For the time being at least, radiology is the clinical realm where the most AI is applied, and Dr. Oliver notes that, of the 950 or so AI algorithms cleared for clinical use by the Food and Drug Administration, more than 90% are for imaging.
“There's a lot of acceptance already in that specialty,” he said. “They're not afraid of it. They've been waiting for it. They've seen it now for several years.”
AI’s main use for imaging is in helping to confirm a diagnosis.
“I haven't heard of anybody that doesn't have this sort of principle in place where there's a human in the loop with all of this,” Dr. Oliver said. “No one is getting a diagnosis or a treatment plan straight from [radiology] AI.”
Easing administrative burdens
An area where AI is catching on quickly is in applications that ease administrative burdens.
Dr. Oliver noted that this doesn’t mean front-office tasks so much as the documentation load that can weigh on physicians.
This includes ambient AI scribes, which transcribe patient-physician interactions and then, using machine learning and natural-language processing, summarize the clinical content of the discussion and produce a clinical note documenting the visit, though physician review remains a critical step in the process.
Dr. Oliver said that ambient AI scribes have been in use among physicians with Baptist Health Medical Group for about four years, but recent improvements in the technology’s speed and accuracy have led to a significant surge in adoption.
“It can be a career-changer for physicians,” he said. “I’ve heard so many testimonials and it seems to be saving careers.”
Physicians have told him they have delayed retirement plans, been relieved of a significant documentation burden and are able to spend more time with their families and do other things they enjoy, said Dr. Oliver, who uses the technology himself.
“When I’m done seeing the patient, I hit ‘Stop,’ and I get my note in seconds,” he explained. “When that note appears, what also appears are prescription or imaging orders queued up for me to review and sign so I don't have to go find them administratively.”
Dr. Oliver noted that he was impressed by how the technology can distinguish between small talk and clinical discussions. So a question such as “Didn’t I see you on the golf course last week?” is ignored, but a question about a possible shoulder injury that the patient first noticed while playing golf is included.
“It's also freeing you up from that keyboard and it’s a behavioral change too, to be able to just sit and look at the patient and listen to them,” Dr. Oliver said.
“We know multitasking doesn't work, right? You're not supposed to text and drive, the CEO at a meeting does not take the minutes,” he added. “And yet I'm supposed to listen to your concerns, develop a differential diagnosis, determine what tests, what physical exam we’re going to do. And then ‘Oh, by the way, Brett, please take some notes because we need to document for billing.’”
The new large language model has sped up the process, allowing the technology to fit better into more physicians’ workflows, and adoption is growing by word of mouth, Dr. Oliver said. He noted that about 250 physicians and non-physician providers are now using the ambient AI scribe in a medical group that includes 1,600 doctors and other health professionals.
Other AI administrative applications include drafting prior authorization letters and responses to general questions from patients.
Baptist Health Medical Group is conducting a small pilot with about 15 physicians where AI is helping draft responses to patients.
“The AI generates a draft response that then comes to me for review to either send out, edit, delete, or do my own thing,” Dr. Oliver said. “It’s taking some time to get used to, but I’d say about 20% of the time, the responses can be sent out unedited.”
He cited a 2023 JAMA Internal Medicine study that found patients prefer the AI-generated replies.
“The patient's perception of the empathy of the reply is better with the AI—like not even close,” Dr. Oliver said.
“You can imagine it's the end of the day, or the middle of a busy clinic day, and you've got 20 messages, my replies are going to be short and to the point: ‘Please have an office visit. No? Yes?’” he added. “But then the AI says: ‘Oh, Brett, I understand a cough can be really frustrating. I hope we can get you in here soon.’”