Majority of patients OK with artificial intelligence helping with diagnosis or treatment options

Many patients worry about privacy and accuracy

Artificial intelligence is being incorporated into many areas of life, and health care is no exception. But what do patients think about AI intruding into their diagnosis and treatment?

A patient survey found that 66% of respondents considered it very important to be told when AI plays a big role in their diagnosis or treatment, and another 29.8% rated it as somewhat important.

However, not everyone was comfortable with AI playing a big part in their health care. Of those surveyed, 31% said they would be very uncomfortable, and another 40.5% somewhat uncomfortable, receiving a diagnosis from an AI algorithm that was accurate 90% of the time but incapable of explaining its rationale.

The survey appeared in JAMA Network Open.

Overall, most patients believed AI would make health care much better (10.9%) or somewhat better (44.5%), compared with 4.3% who thought it would make care somewhat worse and 1.9% who thought it would make it much worse. The remaining 19% did not know one way or the other.

Among respondents who answered "don't know," 59.7% wanted to be told when AI played a small role in their diagnosis or treatment, and they were very uncomfortable receiving an AI diagnosis that was accurate 98% of the time but could not be explained.

Comfort levels also varied by application. For example, 55% were very or somewhat comfortable with AI reading chest radiographs, but that figure dropped to 31.2% when the task was making a cancer diagnosis.

Most respondents were concerned about misdiagnosis (91.5%), privacy breaches (70.8%), less time with clinicians (69.6%), and higher costs (68.4%). Respondents who identified as racial or ethnic minorities ranked these concerns higher than white respondents did.

The researchers recommended that clinicians, policy makers, and developers be aware of patients' views regarding AI. Educating patients on how AI is being incorporated into their care, and the extent to which clinicians rely on it to assist with decision-making, may be necessary.
