Nearly every day, it seems, the media report a massive cyberattack on a healthcare organization. Nevertheless, most physician practices still don’t safeguard their electronic patient information properly.
News accounts of security breaches tend to focus on big healthcare systems, but that doesn’t mean that small and medium-sized practices are safe. In fact, cyber thieves view poorly protected medical records in these practices as easy pickings.
“Lots of hackers target smaller businesses because they won’t have the necessary expertise on staff to fully secure their system,” says Gerard Nussbaum, JD, an independent healthcare consultant based in Chicago.
Practices often lack basic security policies and procedures, allow staff members to share passwords, and fail to turn on or properly configure the security features of their electronic health record (EHR) systems. In addition, many practices fail to perform security risk assessments, despite a requirement to do so under the Health Insurance Portability and Accountability Act (HIPAA).
Here are 10 steps that experts say can help practices defend their protected health information (PHI) and their businesses from cyber criminals.
Do a security risk assessment
Security risk assessments are not only required by the HIPAA security rule; they must also be performed annually to meet the criteria of the Meaningful Use EHR incentive program and Medicare’s new Merit-based Incentive Payment System (MIPS).
If a consultant is required, a security risk assessment can cost several thousand dollars, says Lee Kim, JD, director of privacy and security for the Healthcare Information Management and Systems Society (HIMSS). There are also costs for security risk mitigation, Nussbaum points out. For example, practices might have to buy extra software to supplement their EHR’s security tools, which may cover only some aspects of security.
Online guides from HIMSS and the Office of the National Coordinator for Health IT (ONC) can help practices perform security risk assessments. Even small and medium-sized practices can do this, says Mike Sacopulos, JD, president of the Medical Risk Institute in Terre Haute, Indiana. However, he advises hiring a consultant for the initial evaluation if the practice has never done one before.
Encrypt patient data
Under the HIPAA security rule, patient data should be encrypted whenever possible. Any current certified EHR can perform this task.
Experts agree that while encryption is essential, practices should not rely on this approach alone—or on other technical fixes such as antivirus programs and firewalls—to defend the privacy and security of data. The weak point of encryption is that it relies on protecting access to the system, says Mac McMillan, chief executive officer of security firm CynergisTek. If a password is stolen, for example, the thief can use that password to access data, whether or not it is encrypted.
Train staff on security
More than 80% of security breaches, Sacopulos says, result from human factors. While few practice staffers would steal PHI, they could unwittingly introduce malware into a practice network by falling for phishing emails or other tactics.
In many cases, security training can prevent those kinds of breaches, he says. Practices can buy online HIPAA security training or get free training from some hospitals and medical societies.
Control system access
Access control, a key component of security, takes different forms depending on a practice’s network and how its EHR and practice management system are hosted.
In a client-server network, where the server that stores the EHR is located on-site, the providers and staff access the EHR through their computer network. If a practice uses a cloud-based EHR, in contrast, the application and the data are stored on a remote server, and individual workstations and other computers reach the EHR through a web browser.
Cloud-based EHR vendors configure the security features for their servers, “so it’s more seamless and you don’t have to worry about that [part of security],” says Kim. It can be tricky for practices to configure those features properly in client-server networks, she notes; they may have to hire outside help (see sidebar).
On the other hand, the cloud vendor’s security protects only the remote server, not the practice IT infrastructure, she points out. If somebody steals a password, or if malware gets into the network, the practice’s data security is still at risk.
Other deficiencies in basic security can also leave data vulnerable, notes Nussbaum. Many practices, for example, don’t apply security patches to their computer operating systems. Moreover, many groups are still using outmoded operating systems such as Windows XP, which Microsoft no longer supports and which is an open invitation to hackers, he adds. No practice concerned about security should stick with these older operating systems, he says.
Most EHRs authenticate users with a login name and a password. Experts say practices should frequently change passwords and make them complex enough to foil hackers.
Kim suggests changing passwords every 60 to 90 days. Even then, hackers using a “brute force” attack—in which the attacking computer runs through a large number of combinations—may be able to figure out a password, she says.
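The arithmetic behind that advice is simple: every extra character, and every additional character type, multiplies the number of combinations a brute-force attacker must try. A rough sketch (the character-set sizes and the guess rate below are illustrative assumptions, not measurements of any real attack):

```python
def search_space(charset_size: int, length: int) -> int:
    """Number of candidate passwords a brute-force attack must consider."""
    return charset_size ** length

def years_to_exhaust(charset_size: int, length: int, guesses_per_sec: float) -> float:
    """Worst-case time to try every combination, in years."""
    seconds = search_space(charset_size, length) / guesses_per_sec
    return seconds / (365 * 24 * 3600)

# Assuming an attacker who can test 10 billion guesses per second:
weak = years_to_exhaust(26, 8, 1e10)    # 8 lowercase letters: cracked in seconds
strong = years_to_exhaust(94, 12, 1e10) # 12 mixed printable characters: millennia
```

Under those assumptions, an eight-character lowercase password falls almost instantly, while a twelve-character mixed password holds out for geologic time, which is why length and complexity matter more than frequent rotation alone.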
A better approach, she says, is two-factor authentication. This method couples a password with a second factor that only authorized users can provide, such as biometric identification (a thumbprint, for example) or a text message the user must respond to.
Provide remote access securely
Providers may need remote access from home or other locations to do their work. If a practice has a cloud-based EHR, users with remote privileges can access the EHR in the usual way through their web browser.
In a practice with a client-server network, however, remote users must access the network to get to the EHR. If a user’s home computer is infected with malware, a robust firewall coupled with antivirus and intrusion detection software will help keep the malware out of the system. But cyber thieves can still steal data during remote user sessions if the link with the network is insecure.
To prevent this, experts recommend using a virtual private network (VPN), a secure, temporary computer-to-computer connection over the public internet that encrypts all data in transit and disappears after the session is over. Practices can easily download VPN software, Kim says, but need someone with technical expertise to install and configure it.
Adopt role-based access
Most EHRs allow practices to configure their software so that each part of the system is available only to the employees who need to use that portion of the application and view the associated data.
For example, in an EHR integrated with a practice management system, a receptionist may only need to use the scheduling application; role-based access would not let that person access any clinical or financial data.
This approach helps protect privacy and prevent the use of PHI to commit fraud, Kim notes. In addition, she says, if a user’s password is stolen and that person has only partial access to the EHR, it limits how much damage the thief can do.
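The idea reduces to a simple table mapping each role to the modules it may use. The role and module names below are hypothetical illustrations, not drawn from any particular EHR:

```python
# Hypothetical mapping of practice roles to the EHR modules each may use
ROLE_PERMISSIONS = {
    "receptionist": {"scheduling"},
    "biller":       {"scheduling", "billing"},
    "clinician":    {"scheduling", "clinical"},
}

def can_access(role: str, module: str) -> bool:
    """Grant access only if the user's role includes the requested module."""
    return module in ROLE_PERMISSIONS.get(role, set())
```

With a table like this in force, a receptionist’s stolen password exposes the schedule but not a single clinical or billing record.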
Don’t store data on user devices
“What small providers have in place in terms of security isn’t typically there to keep information confidential, but to protect their access to it,” notes Kim.
That could explain why many small practices allow users (staff, physicians and anyone with access to the system) to store PHI on their desktop computers, laptops and even mobile devices. But doing so makes the information more vulnerable to hackers, experts say.
Practices should prohibit the storage of PHI on end-user devices, says Nathan Gibson, chief information officer for Charleston, West Virginia-based Quality Insights. Centralized storage of PHI on a practice server is safer, he notes, but staff education and training about the risks of local storage are still necessary.
“If the employees aren’t aware of the importance of PHI being centralized, and they start creating PDFs and performing print screens, that means information is still being stored locally,” he points out. Staff should be instructed to access information from the network and not store any data on their own devices, says Gibson.
Many doctors and nurses use tablets and smartphones at work. These devices can pose security threats if they’re allowed to connect to the network or if they can download EHR data.
Some healthcare organizations install remote “wiping” software on laptops and mobile devices employees use at work. In theory, that would allow a practice to erase data on those devices if they were lost or stolen.
However, Gibson points out that the device must be connected to the internet, either via Wi-Fi or the cellular phone network, to receive the signal from the practice. The device may connect to the cellular network automatically when it’s turned on, but that signal can be blocked, he says.
Use and scan audit logs
All certified EHRs have audit logs that record which user did what in the EHR and when.
However, practices often don’t turn these logs on or configure them correctly. According to Kim, a recent HIMSS survey showed that fewer than half of hospitals and practices were using their audit logs or another feature designed to prevent people from tampering with the logs to erase the signs of an intruder. Practices that don’t know how to activate and configure audit logs should ask their vendors or IT consultants about it.
Beyond that, she notes, practices need software that automatically scans their audit logs to detect anomalies that might indicate a cyberattack, such as an unfamiliar user or a known user logging on at an unusual time of the day.
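At its core, that kind of scan is just a set of rules applied to each log entry. A toy sketch (the entry fields and the 7 a.m. to 7 p.m. “business hours” window are illustrative assumptions, not features of any commercial product):

```python
from datetime import datetime

def flag_anomalies(entries, known_users, work_start=7, work_end=19):
    """Flag audit-log entries from unknown users or made outside business hours.

    entries: iterable of (username, timestamp, action) tuples,
             where timestamp is a datetime.
    """
    flags = []
    for user, ts, action in entries:
        if user not in known_users:
            flags.append((user, ts, action, "unknown user"))
        elif not (work_start <= ts.hour < work_end):
            flags.append((user, ts, action, "off-hours access"))
    return flags

log = [
    ("drsmith", datetime(2017, 5, 1, 9, 30), "view chart"),   # normal
    ("outsider", datetime(2017, 5, 1, 10, 0), "export data"), # unknown user
    ("drsmith", datetime(2017, 5, 1, 3, 0), "view chart"),    # 3 a.m. login
]
suspicious = flag_anomalies(log, known_users={"drsmith"})
```

A real monitoring tool applies many more rules and learns each user’s normal pattern, but the principle is the same: the log is only useful if something reads it.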
Back up data offsite
Practices that use a client-server system should have onsite backup, such as a mirrored server that can replace the main server if it goes down, Nussbaum says.
In addition, he says, all practices should have offsite backup, both for security purposes and in case of natural disasters. Practices should also maintain offsite copies of their financial data, including data from billing systems, general ledgers and payroll systems, he says.
A cloud-based EHR vendor or hosting firm will back up EHR data, says Nussbaum. Practices that have client-server systems should back up their data on a tape and move it offsite at least daily, he adds. It’s essential to keep these backups offline in case a hacker takes over your network, he notes.
Also, he says, backups should be encrypted. Otherwise, a lost backup tape is considered a security breach under HIPAA.
Get business associate agreements
HIPAA requires practices to sign business associate agreements (BAAs) with all outside parties with which they share PHI.
These agreements obligate the business associates to safeguard the PHI. Organizations covered by HIPAA do not have to evaluate the security procedures of their business associates, but some experts suggest that practices question business associates about their security practices in general to help safeguard data.
Hashey does this before he signs a BAA, mainly to ensure that outside firms understand the importance of protecting patient information.
Sacopulos agrees this is a good idea, but cautions against including business associates in security risk assessments. It’s impractical because it involves too many entities, he says. Also, if a practice signs off on a business associate’s security practices, it’s assuming a legal duty that it’s not obligated to take on.