HIMSS23 roundup

Medical Economics Journal, June 2023
Volume 100, Issue 6

Talk of AI dominated the health conference, but there were plenty of other featured technology innovations and strategies

The annual Healthcare Information and Management Systems Society (HIMSS) Global Health Conference & Exhibition took place April 17-21, 2023, in Chicago, Illinois. Artificial intelligence (AI) was a dominant theme this year as health care experts discussed how it could make both patients’ and physicians’ lives easier, often by combing through massive amounts of data to surface relevant facts or assist physicians with a diagnosis. And although AI may have been the star of the show, plenty of other technology-related innovations and strategies were presented. Here’s a roundup of some of the topics covered.

HIMSS23: ©HIMSS

Interconnected data and AI

Physicians aim to benefit from advances in data and artificial intelligence

Imagine how helpful it would be to a physician if a nonverbal patient could somehow communicate their pain level or if that 200-page electronic health record (EHR) could be quickly mined for just the laboratory results.

Both things are already happening, and more innovations like them are on the way, according to the panelists at a presentation at HIMSS. Don Woodlock, head of global health care solutions at InterSystems, led the discussion about how technology is changing care delivery.

Philip Daffas, CEO and managing director of PainChek, talked about how AI was helping caregivers understand the pain levels of nonverbal patients. “The AI tool analyzes facial expressions. The care person observes them and then creates a score that is documented into the EHR,” Daffas said. “It allows the caregiver to assess their pain, administer controls and then reassess how it is working.”

Daffas pointed out that the AI component is not about taking away jobs or trying to be better than a physician but about providing information for the physician or nurse in ways that are not always humanly possible. For example, the tool completes its facial expression analysis in three seconds.

“Can a human do it? Yes, if they are really well trained,” Daffas said. “The challenge is to pick up the minor expressions not visible to the naked eye: up to nine different ones at a time. Once the AI is trained properly, you know you have a reliable data set that can help take a pain assessment.”

Daffas said the technology, initially developed for nonverbal patients with dementia, is being expanded to analyze preverbal infants, helping physicians assess the pain levels of children who can’t yet communicate. This AI application can assist physicians in accurately assessing pain levels, reduce reliance on pain-killing drugs and help the patient.

Jay Nakashima, executive director of eHealth Exchange, spoke about how data sharing and interoperability can help physicians understand the patient in front of them.

In a care setting, when a patient shows up and is asked about what medications they take, the response might be “a red one, a blue one and a green one” — an answer not particularly helpful to a physician. But when a physician can see not only what medications were prescribed but also which prescriptions were picked up, they get a fuller picture of the patient’s total health.

As data exchange becomes more standardized, more data will follow the patient regardless of where they are, and those data will be more useful and accessible to the physician. For example, Nakashima pointed out that instead of a physician having to sift through 200 pages of a medical record, the laboratory results or medication list can be quickly accessed.
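
As an illustration of the kind of targeted retrieval Nakashima described, the sketch below pulls only a patient’s active medication list from a standards-based FHIR R4 server. The endpoint, the patient ID and the use of FHIR here are illustrative assumptions, not a description of eHealth Exchange’s actual interface.

```python
# Minimal sketch (hypothetical FHIR R4 endpoint and patient ID): fetch only the
# active medication list instead of paging through a 200-page chart.
import requests

FHIR_BASE = "https://fhir.example-hospital.org/r4"  # hypothetical server
PATIENT_ID = "12345"                                # hypothetical patient

resp = requests.get(
    f"{FHIR_BASE}/MedicationRequest",
    params={"patient": PATIENT_ID, "status": "active"},
    headers={"Accept": "application/fhir+json"},
    timeout=30,
)
resp.raise_for_status()

# Each bundle entry holds a MedicationRequest resource; print the medication names.
for entry in resp.json().get("entry", []):
    med = entry["resource"].get("medicationCodeableConcept", {})
    print(med.get("text", "unknown medication"))
```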

AI, when combined with these standards, has the potential to offer help with diagnosis. An AI tool can scan through pages of notes and data, not just doing a keyword search but also looking at intent, and form a hypothesis about what the patient may be experiencing. This hypothesis would then be presented to the physician for review, Nakashima said. This isn’t meant to replace the physician but to supplement the physician’s decision-making process.

Nakashima said it is critical to maintain patient and provider trust, and every organization should have clearly defined governance on how data are used and protected and whether they will be sold.

Health care AI going forward

Dreams of AI are nothing new, but the risks require careful consideration

Does artificial intelligence (AI) offer unlimited potential for health care, or will the risks ultimately outweigh the benefits? The keynote panel discussion at HIMSS, moderated by Cris Ross, chief information officer at Mayo Clinic, focused on AI in health care and how the risks must not be ignored in the rush to reap the benefits. Panelists were Andrew Moore, CEO of Lovelace AI; Kay Firth-Butterfield, CEO of the Centre for Trustworthy Technology; Peter Lee, PhD, corporate vice president of Microsoft; and Reid Blackman, CEO of Virtue.

Ross spoke about how human beings have dreamed of intelligent machines dating back to Homer’s “Odyssey,” which referenced autonomous ships that could navigate through mist and fog and avoid hazards. Using the story of Icarus, who flew too close to the sun and crashed to the earth, Ross posed the question: Just because we can do something with AI, should we do it?

Moore said that organizations should embrace AI, even on a small scale, and push forward with it. There should be a group that looks at the general technology platform and makes sure physicians and others can use it to meet the needs of patients. “Don’t wait to see what happens with the next iteration; start now,” Moore said. “Don’t wait for a small number of experts in Silicon Valley to do it for you.”

Lee said that although generative AI’s capabilities are advancing quickly, it is important to understand its implications for health care. If AI is 93% correct when asked a medical question, what about the other 7%?

Although AI can help with physician note-taking, writing justification text for prior authorizations or even role-playing a patient for medical students, Lee said there are also serious risks, including some we may not know about yet. “It is the health care community that needs to … decide whether or how to use the technology,” Lee said.

Blackman voiced concerns about what’s behind technology such as ChatGPT, saying it may be tricking people into thinking they are dealing with an intelligent, reasoning device when it is more of a word predictor that may not be giving results that can be explained.

“It’s magic that works, but for a cancer diagnosis, you need to know how you came to that diagnosis,” Blackman said. “ChatGPT doesn’t give reasons. It looks like it, but it doesn’t. It just predicts the next set it thinks makes the most sense. It is a word predictor, not a deliberator.”

Firth-Butterfield pointed out that ChatGPT poses serious equity concerns. Not everyone has the internet access to use it, which is one problem. Add in any bias built into the model and concerns about informed consent, and you have a bigger issue. And what about accountability when something goes wrong with it?

“Who do you sue?” Firth-Butterfield asked. “Maybe you can pass on the liability to someone else, but what do you have to do to prove that? Do you have to prove you did your own due diligence?”

Firth-Butterfield also said that if organizations are going to use generative AI systems, they need to think about what data are going to be shared with those systems. One company shared confidential material with an AI system, and that information came out as an answer to someone outside the company’s system. “That’s the sort of thing you have to think about very carefully,” she said.

Lee said that AI needs to be examined in each sector, such as health care or education, and not globally, because usage will vary.

Moore said that big companies have been irresponsible with their use of AI and that it will lead to big problems. He said a good use of AI is to help guide users to sites or sources of information that can help them solve the problem at hand. He added that although generative AI has been quickly adopted by many and has made great strides in a short time, perfecting it will take much longer.

He used the example of autonomous cars: they could drive themselves 93% of the time, but each incremental 0.25% gain after that took a year of work. AI, he said, will likely follow a similar development path.

Firth-Butterfield, a signatory of a letter urging a six-month pause on further AI development until it is better understood, said ethics are being overlooked. She doesn’t believe AI will destroy the world, but she said designers need to ensure it is built in a way that benefits humanity and gives the maximum number of people access to the tools, to avoid exacerbating health inequity.

“Make sure everyone in the company understands AI,” she said. “Know what you want and that the risks are there. You don’t want to negatively affect your brand or lose patients. Don’t get blinded by ChatGPT.”

Create an engaged workforce

How to convince employees to stay at your practice

Every organization would like to have an engaged workforce that is excited to come to work each day and give 100% effort, but to achieve that, you have to commit to it, according to Jonathan Goldberg and Tejal Desai, who presented at HIMSS. Goldberg is the former chief information officer and Desai is assistant vice president of information technology applications, both at Edward-Elmhurst Health.

Goldberg says that to engage your workforce, you have to create a culture where the organizational lines are blurred so that everyone feels like a leader. Employees want to have fun and believe they are in control. If you can do that, they will be more committed to their employer. “If you can create an environment where people are not looking for jobs, they won’t leave for jobs,” Goldberg says.

Desai says to be aware of the difference between engagement and satisfaction. The qualities of engagement are enablement, energy, empowerment and encouragement. She says the goal for leaders is to create an environment where employees love their job, but it takes an intentional effort by leadership to accomplish that.

To achieve this end, Goldberg and Desai created employee advisory councils, drawing on a wide range of employees, where staff could give honest feedback in an informal environment. They also sent out employee surveys on a regular basis.

“Everyone has probably given them, but it’s about looking at the results,” Desai says. “Many of us are guilty of sending surveys but not acting on the results.”

After the team had transitioned to working at home during the COVID-19 pandemic, she said they were surprised to find that survey results showed employees still didn’t have everything they needed for optimal performance. The team worked to fix that, but without the survey, they would have thought employees were set.

Goldberg and Desai also held informal town hall meetings that lasted less than an hour, combining company updates with time for employees to give feedback. They also redesigned meetings to be intentional and built on collaboration, making sure everyone had an equal voice.

The final piece was the CrushIT award, where employees could nominate teammates for a quarterly award. Winners were recognized by leadership in front of the entire staff. Goldberg also sent out a weekly email on Friday with a fun tone, addressing anything from leadership ideas to musings while standing in line for coffee.

Goldberg and Desai also spent time developing a holiday card, signed by management, with a message for each employee. In addition, they tried to do something special on random days, such as a “do something nice day” to recognize staff for their contributions.

Goldberg and Desai both say that staff parties went a long way toward developing camaraderie and respect among staff, because the more time people spent together, the more relationships were built.

“Someone will be more likely to be forgiving if they like someone, and that helps build a culture of respect,” Desai says.

Goldberg says that developing an engaged workforce is not easy and that leaders should expect to put in effort to get results. But the results are worth the effort: They saw a turnover rate of 2% over two years when the industry average is 25.9%.

With the cost of hiring a new employee above $100,000, he says it’s well worth the effort to retain employees.

AI in action

AI can help measure and manage health disparities

When ChristianaCare, a three-hospital network in Delaware, set out to close health disparities, it turned to artificial intelligence (AI) for help. The network had a diverse patient population, had observed disparities in outcomes and wanted to address the problem, Yuchen Zhang, data scientist at ChristianaCare, said at the 2023 HIMSS conference.

ChristianaCare started by gathering race/ethnicity data, initiating community outreach and putting a focus team in primary care offices to help people get enrolled in an assistance program. But data were lacking on health disparities among different outcomes, and resources were stretched thin.

The challenge was to develop one metric that would measure all the disparities that could change outcomes. The goal was to have quality of care be consistent regardless of race, gender, ethnicity, geographic location, language, payer or socioeconomic status. They wanted to make sure quality of care could not be predicted by any of these factors.

The team at ChristianaCare built a machine-learning model that looked at various health equity factors and produced data to show how likely it was for each patient to have a negative outcome.

Once the data were accumulated, the team had to decide which primary care practices to focus on. They compared expected outcomes with actual outcomes, identified the practices that performed better and shared their findings.
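
The general pattern Zhang described, scoring each patient’s risk and then comparing expected with actual outcome rates by practice, can be sketched roughly as below. This is an illustrative example with made-up equity factors and data, not ChristianaCare’s actual model.

```python
# Minimal sketch (hypothetical data and features, not ChristianaCare's model):
# estimate each patient's risk of a negative outcome from equity-related factors,
# then compare expected vs. actual outcome rates for each primary care practice.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical patient-level data: equity factors, the patient's practice,
# and whether a negative outcome occurred (1) or not (0).
df = pd.DataFrame({
    "age": [67, 54, 71, 45, 60, 58],
    "language_barrier": [1, 0, 0, 1, 0, 1],
    "medicaid_payer": [1, 1, 0, 0, 1, 0],
    "practice": ["A", "A", "B", "B", "C", "C"],
    "negative_outcome": [1, 0, 1, 0, 1, 1],
})

features = ["age", "language_barrier", "medicaid_payer"]
model = LogisticRegression().fit(df[features], df["negative_outcome"])

# Expected risk per patient, rolled up to the practice level.
df["expected_risk"] = model.predict_proba(df[features])[:, 1]
by_practice = df.groupby("practice").agg(
    expected=("expected_risk", "mean"),
    actual=("negative_outcome", "mean"),
)
# Practices whose actual rate is lower than expected are outperforming.
by_practice["outperformance"] = by_practice["expected"] - by_practice["actual"]
print(by_practice.sort_values("outperformance", ascending=False))
```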

The health system then allocated additional resources to help underperforming practices, such as adding a texting system between the primary care office and the patients to improve communication.

By measuring the data and taking action, ChristianaCare was able to improve outcomes across all racial groups. However, the gap between White and Black patients increased, even though outcomes for Black patients also improved.

Zhang said it shows that some interventions are more effective than others. For example, the texting system helped, but fewer Black patients had smartphones than White patients, so the lack of a device limited how effective the program was for Black patients.

Are they patients or customers?

How the patient experience differs from the customer experience, and why medical practices must cater to both

In health care, there has been a shift from thinking about people as patients to thinking of them as customers.

But Amy Goad, managing director of Sendero Consulting, said during a presentation at HIMSS that practices need to see them as both, depending on where the person is in their care journey.

When a person is in proactive mode looking for episodic care, such as a physical examination, it is more of a customer experience because they are looking for convenience. On the other hand, if they need reactive care, such as after a cancer diagnosis, it is more of a traditional patient experience. In this phase, the person is more interested in finding the highest-quality care, and factors such as wait times matter less if they come with a better outcome.

For example, a person would be a customer if she is in good health and only sees providers for routine care or checkups. If she becomes pregnant, her experience shifts to that of a patient as she spends more and more time in the health system as the baby grows. After the baby is born, she might shift back to a customer experience as the care for herself and the baby moves back to routine.

Goad says both experiences are distinctly different but equally important. The medical groups that can build long-term commitment through the customer experience will benefit when those people need more than routine care, Goad says.

There are three questions to ask yourself:

  • Do I understand what the end-to-end care journey is?
  • Are my patient and customer experiences balanced? (Recent trends show too much effort being put into the customer experience, and the patient experience may be declining.)
  • How do I know whether efforts are effective?

The obstacle to overcome is resource availability, and there are often competing priorities. Additionally, people are tired of the amount of change sweeping through health care.

But a full team effort to address the person in the right way at the right time builds long-term loyalty.

“You have to figure out what is most important to that individual at that point,” Goad says. “Take (a) cancer patient, for instance. If they are in the middle of treatment, they probably don’t want to hear about how you can help with their bills. You need to know what the person needs at that point and what resources you have in system.”

The burden does not have to fall on physicians. Nonmedical staffers can handle many of the customer relations tasks.

One physician in the session pointed out that he doesn’t see customers but only patients and that it’s not about selling people something. The counterargument is that with the competition in health care from Amazon, Walmart and CVS, health care has transitioned to be more customer focused, with an emphasis on convenience.

“That’s why I make the argument that the patient and customer experiences are different, but we must not lose sight that the doctor-patient relationship is unique,” Goad says. “The patient is vulnerable and needs your expertise. You want the person to feel empowered in their journey.”
