Many patients worry about privacy and accuracy
Artificial intelligence is being incorporated into many areas of life, and health care is no exception. But what do patients think about AI intruding into their diagnosis and treatment?
A patient survey reveals that 66% of respondents considered it very important to be told when AI played a big role in their diagnosis or treatment, and another 29.8% rated such disclosure as somewhat important.
However, not everyone was comfortable with AI playing a big part in their health care. Of those surveyed, 31% said they would be very uncomfortable, and another 40.5% somewhat uncomfortable, receiving a diagnosis from an AI algorithm that was accurate 90% of the time but incapable of explaining its rationale.
Overall, most patients believed AI would make health care much better (10.9%) or somewhat better (44.5%), compared with 4.3% who thought it would make care somewhat worse and 1.9% who thought it would make care much worse. The remaining 19% did not know one way or the other.
Even among respondents who answered "don't know," 59.7% wanted to be told when AI played a small role in their diagnosis or treatment, and this group was very uncomfortable receiving an AI diagnosis that was accurate 98% of the time but could not be explained.
Comfort level also varied by application. For example, 55% were very or somewhat comfortable with AI reading chest radiographs, but that figure dropped to 31.2% when the task was making a cancer diagnosis.
Most respondents were concerned about misdiagnosis (91.5%), privacy breaches (70.8%), less time with clinicians (69.6%), and higher costs (68.4%). Respondents who identified as racial or ethnic minorities rated these concerns higher than white respondents did.
The researchers recommended that clinicians, policy makers, and developers be aware of patients' views regarding AI. Educating patients about how AI is being incorporated into their care, and the extent to which clinicians rely on AI to assist with decision-making, may be necessary.