AI gets good marks, and some edits, when transcribing patient encounters

Physicians seek the best ways to reduce the documentation burden that contributes to burnout.

Artificial intelligence (AI) is good – but not perfect – for documenting visits between physicians and patients.

A new study by hand surgeons examined the best ways to log the information exchanged between doctors and patients, in hopes of reducing the computerized charting that can contribute to burnout.

“While AI proved to be a promising tool, some verification and correction is necessary for accuracy,” said a news release from the American Academy of Orthopaedic Surgeons (AAOS).

"In our practice, we created a task force to better understand and correct physician burnout to study what we know to be the top reason for burnout ­– patient documentation," Rothman Institute orthopaedic surgeon Michael Rivlin, MD, FAAOS, said in the news release. Rivlin is associate professor at Thomas Jefferson University in Philadelphia.

"We wanted to look at ways to maximize the physician's workload at the maximum level of their license and remove burdens that can lead to burnout by finding methods to outsource certain tasks, such as documentation, as this can be time consuming and redundant," Rivlin said.

Taking notes for doctors

In the study, three orthopaedic hand surgeons evaluated 10 standardized patients with prewritten clinical vignettes. Clinical documentation was performed and evaluated using four modalities:

  • AI-based virtual scribe service, running on a tablet and recording everything said in the room.
  • Medical scribe, a person who is present physically or virtually during the office visit to transcribe the patient encounter.
  • Transcription service, in which a third-party company transcribes audio files that physicians record using Dictaphones.
  • Voice recognition mobile (VRM) application, available on an electronic medical record platform, that transcribes spoken words.

"Our physicians who were not involved in the documentation acted out these vignettes and each scenario contained an element of distraction to determine if the AI would be thrown off by various nuances that might occur during a clinical visit – such as a parent and a minor sharing their thoughts, or a patient interjecting a story about a friend's experience with hand surgery in the middle of providing an update on their own surgery," Rivlin said.

In total, 118 clinical encounters were documented: 30 AI scribe, 30 VRM, 28 transcription service, and 30 medical scribe notes. Clinical notes were deemed acceptable or unacceptable and assigned a letter grade – A, B, C or F – using an eight-point scoring system. An attorney reviewed all notes for medicolegal risk.
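To picture the grading step, the short Python sketch below tallies the reported note counts and maps an eight-point quality score to a letter grade. It is purely illustrative: the score-to-grade cutoffs are assumptions, since the report states only that an eight-point system produced grades of A, B, C or F.

    # Illustrative sketch only; the score-to-grade cutoffs are assumed,
    # not taken from the study.
    from collections import Counter

    # Note counts as reported: 118 documented encounters in total.
    notes_by_modality = Counter({
        "AI scribe": 30,
        "VRM": 30,
        "Transcription service": 28,
        "Medical scribe": 30,
    })
    assert sum(notes_by_modality.values()) == 118

    def letter_grade(score: int) -> str:
        """Map an eight-point quality score to a letter grade (assumed cutoffs)."""
        if score >= 7:
            return "A"
        if score >= 5:
            return "B"
        if score >= 3:
            return "C"
        return "F"

    print(letter_grade(6))  # "B" under these assumed cutoffs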

Exam room findings

Overall, all modalities performed well, producing similar documentation output, according to the AAOS report. Specific findings include:

  • The AI scribe scored significantly lower than the other modalities on one specific question: "Is the plan correct?" The AI captured most of the verbal and implied elements of medical documentation, but parts of the plan were sometimes deficient compared with the human-generated notes, and manual edits were required.
  • Documenting clinical encounters through transcription services and voice recognition mobile applications took more time than the auto-populated AI-based notes. The average time per note was 3.48 minutes for VRM and 3.22 minutes for the transcription service.
  • Unlike a human scribe, the AI-based scribe service relies on a verbalized narrative throughout the entire encounter for accurate documentation, and some verification and correction are still needed.

"The AI-based virtual scribe service is a promising tool to help decrease documentation burden without significantly lowering the quality of documentation compared to transcription and voice recognition software services," Rivlin said. "While AI has some limitations, it continues to improve as the technology advances. These results create a palette of options for physicians to compare outputs should they want to explore new modalities."

The study, "Use of Artificial Intelligence for Documentation in Orthopaedic Hand Surgery," was presented at the annual meeting of the AAOS.
