
Patients trust physicians over AI for diagnosis but see a role in cancer detection, survey finds
Key Takeaways
- Most Americans are cautious about AI diagnosing independently but are optimistic about its role in cancer detection assistance.
- Familiarity with AI tools increases comfort and trust, with exposure leading to more positive perceptions.
Most Americans remain cautious about AI diagnosing them on its own, but views improve when the tech assists clinicians.
Most Americans are hesitant to let artificial intelligence make a diagnosis without a physician involved, according to a new survey.
Only 17% of respondents said they
The study was led by Michael Sobolev, Ph.D., a behavioral scientist at the Schaeffer Institute for Public Policy & Government Service at the University of Southern California, and Patrycja Sleboda, Ph.D., a psychologist and assistant professor at Baruch College, City University of New York.
Their research specifically looked at public attitudes toward an AI-assisted diagnostic tool for cervical cancer that analyzes digital images of the cervix, a technology called automated visual evaluation.
Familiarity builds comfort
People with personal experience using AI tools felt more positive about AI's use in medicine. In the surveys, 55.1% of respondents said they had heard of ChatGPT but not used it, while 20.9% said they had both heard of and used it.
“We were surprised by the gap between what people said in general about AI and how they felt in a real example,” Sobolev said. “Our results show that learning about specific, real-world examples can help build trust between people and AI in medicine.”
Sleboda added that familiarity appears to shift how people weigh risks.
“Our research shows that even a little exposure to AI — just hearing about it or trying it out — can make people more comfortable and trusting of the technology,” Sleboda said. “We know from research that familiarity plays a big role in how people accept new technologies, not just AI.”
What people value most
When participants were asked to evaluate the cervical cancer tool on a 1 to 5 scale across five measures — understanding, trust, excitement, fear and potential — potential received the highest average rating. It was followed by excitement, trust, understanding and fear, which ranked last.
Identifying as male and having a college degree were associated with higher levels of trust, excitement and perceived potential, and lower levels of fear about AI in health care overall.
A separate analysis of the survey results found that reactions to the specific AI use case were more positive than broad attitudes toward AI-driven diagnosis in general, the researchers said. Sobolev also leads the Behavioral Design Unit at Cedars-Sinai Medical Center in Los Angeles, where he works on human-centered innovation.
An organizational release summarizing the findings noted that participants who learned about an AI tool for spotting early cancer signs were more likely to describe the technology as having “great potential” and express excitement rather than fear.