
How Can AI Augment Clinical Judgement?

Practice Accelerator
September 1, 2023

Editor's Note: In this interview, Nancy Morgan, RN, BSN, MBA, WOC, reviews several artificial intelligence (AI) modalities that can augment wound and health care, addresses providers' fears about adopting these new technologies, and discusses considerations for integrating them into one's practice.




Transcript

How can AI help or augment clinical decision-making in a wound care practice?

AI speeds up the decision-making process by rapidly processing data and providing relevant insights, so clinicians can focus more on patient care and less on manual data analysis. AI also gives us a kind of early warning system: it can identify potential complications or risks, allowing clinicians to intervene sooner.

Consistency and standardization: AI technology can ensure consistency in wound assessment and documentation, reducing variability among clinicians and promoting a consistent standard of care.

For providers who are hesitant to incorporate AI systems into their practice, what advice would you give?

My advice to other providers and clinicians in regard to AI is to start slowly, with baby steps. I want you to be comfortable, but this is a direction we all have to go. If you want to have a career and you still have several years of work ahead of you, it's time to jump in the game; don't be afraid. Remember when the computer came out years ago? Some of you probably don't.

Well, we all had to get with the program then and start working on computers, and look what that has done for us; it changed our lives. Think of AI the same way. Start slowly, and educate yourself. Take time to understand how different AI systems work.

Involve your patients. Some of your patients are pretty tech savvy as well, and if they're interested, involving them brings them into the whole care plan. I would also talk to them about data privacy. Assure patients that their personal and medical data will be handled with the utmost care and in adherence to privacy regulations, and address any concerns they might have about data security.

Another thing I would tell clinicians is to integrate AI technologies gradually. Learn one AI tool, get comfortable with it as a team, gather feedback from your team and your patients, and then maybe it's time to add another tool. Roll out the integration in gradual waves so it isn't overwhelming for you and your office.

I would also keep a feedback loop open for my staff so they can share feedback or point out experiences we could do a better job on. Keep an open door for your employees, and if patients want to comment on the technology, document that as well.

How do you ensure patient trust in AI?

Transparent communication. Be upfront with your patients about the role of AI in their care. Explain that it is a tool that assists you; it is not there to replace you or their doctor, but to help address their individual needs and preferences. That's why we use these tools: to help customize care for each patient.

We should also demonstrate AI's value, not only to our patients but also to colleagues who may be hesitant. Maybe they don't want to learn something new, or they're retiring soon and think, "I don't want to get into something new." This is easy. If I can do it, you can do it.

How can clinicians ensure the AI technology they incorporate into their practice does not put them at risk?

Do a thorough evaluation before adopting any AI technology. Consider its purpose, its benefits, and its potential risks. Is it compatible with your practice workflow? Does it make sense for the nature of your business?




Vendor due diligence. Research the AI technology provider extensively. Check their reputation, experience, client reviews, and whether they adhere to medical regulations.

Clinical validation. Ensure that the AI technology has been clinically validated and tested in real-world scenarios relevant to your specialty. Published research and peer-reviewed studies can provide valuable insights.

Integration with the EHR. Ensure that the AI system can integrate with your electronic health record, so it facilitates efficient information exchange and reduces the risk of errors.

Training and education. Invest in proper training for yourself and your team so you can use the AI technology effectively; this training helps prevent misuse or misinterpretation of its outputs. Regularly assess the AI system's performance and outcomes, and if you notice any inconsistencies or errors, address them promptly and document your actions.

Collaboration. Involve your IT department, tech experts, and hospital administration, who can offer insights into implementation, security, and risk management.

About the Speaker 

Nancy Morgan, RN, BSN, MBA, WOC, is an experienced clinician, successful business leader, and accomplished educator in the field of wound management. She is the co-founder of the Wound Care Education Institute (WCEI®) and Wild on Wounds Productions. A distinguished wound care educator, Nancy has delivered nearly 1200 lectures, conference keynote addresses, seminars, webinars, and bedside consultations over more than a quarter-century.

The views and opinions expressed in this blog are solely those of the author, and do not represent the views of WoundSource, HMP Global, its affiliates, or subsidiary companies.