Many private practices lack written policies and procedures for data security and haven’t done a security risk assessment, health IT consultants say.
These omissions are a mistake for several reasons, the observers note. First, both the Health Insurance Portability and Accountability Act (HIPAA) security rule and the meaningful use criteria require periodic security risk assessments, and HIPAA mandates written policies and procedures. If you’re subjected to a HIPAA audit and found to be in violation of the rules, you could be facing a stiff fine. If your meaningful use attestations are audited, you might have to return your electronic health record (EHR) incentive payments to the government.
Security breaches can also open you up to lawsuits from patients and damage your reputation in the community. Moreover, if the breach is large enough to require you to report it immediately to the Office for Civil Rights (OCR) in the U.S. Department of Health and Human Services (HHS), OCR may investigate your security procedures.
Most physicians are at least vaguely aware of these perils. So why don’t they pay more attention to data security? Some doctors are unaware of the need for security risk assessments because they’re too busy to keep abreast of compliance requirements, says David Zetter, a consultant in Mechanicsburg, Pennsylvania. Others know the rules but figure there’s only a slim chance they’ll be caught if they ignore them, he adds.
While it is difficult to keep track of all the government requirements, data security is not an area you can afford to ignore or remain ignorant of. In either case, you’re putting your practice, your patients, and your own financial security at risk. Here are some basics to consider as you evaluate your current security posture.
Security approaches differ by practice setting. Large medical groups and healthcare systems have their own IT staffs and can afford to hire security consultants. Small and medium-sized practices, in contrast, usually depend on their EHR vendors and local computer service companies to implement the security options they have chosen.
You need your IT vendors to establish data security, but you can’t rely on them to protect you. While they must all sign business associate agreements under the latest iteration of the HIPAA rules, their liability is limited to the security breaches they cause directly, Zetter notes.
For example, if the EHR or network vendor made a mistake in configuring the system, and protected health information (PHI) was exposed as a result, that vendor would be responsible. But if a practice chose not to encrypt its data or didn’t secure its mobile devices, the practice would be liable. Theoretically, an EHR developer would be liable if a software design flaw led to the unauthorized release of PHI; but none of the experts we consulted had heard of that happening.
Employed physicians must follow the security policies and procedures of their healthcare system or group. If an employed doctor violates HIPAA rules, the healthcare organization is responsible. But those physicians may face a range of sanctions from their employer. In fact, HHS requires that organizations have a sanctions policy for employees who violate HIPAA, notes Ron Sterling, CPA, a health IT consultant in Silver Spring, Maryland.
The type of liability a physician has may depend on the nature of his or her relationship with a hospital, says Mac McMillan, chief executive officer of the security firm CynergisTek and chair of the privacy and security policy task force of the Healthcare Information and Management Systems Society (HIMSS). “In some cases, they’re autonomous; in other cases, they’re almost like an employee; in other cases, they manage their staff in their own practice locations, but they get other services from the hospital, and those are governed by the hospital policies,” McMillan says.
But regardless of their hospital relationship, he adds, non-employed physicians are responsible for complying with HIPAA rules.
Most practices have an on-site client-server system or use a cloud-based EHR. If you have the latter, the EHR vendor is responsible for the security of the server that stores your application and data, as well as for data backup. If you have an on-premises server, that’s your responsibility.
The physical security mandated by HIPAA includes having a locked room or closet where your server resides. In addition, off-site data backup is required. You must have policies governing the receipt and removal of hardware and electronic media containing PHI to and from a facility, and you must implement policies to protect PHI from improper alteration or destruction.
McMillan strongly advises that small and medium-sized practices consider outsourcing their health IT to remote hosting companies. “For the physician, it’s like buying a service: he’s buying an EHR, email, network support, workstations, file servers and data storage, and it’s all hosted in a virtual environment. So he doesn’t have the headaches of having to understand how to secure the system. He’s buying it as a service.”
From a security standpoint, McMillan adds, “the only thing practices are responsible for are their own employees and their physicians, and how they interface with that system and what they do with the information once they have access to it. That’s much easier for them to manage.”
Some of the larger EHR vendors, including Epic, Cerner, McKesson, Allscripts, and eClinicalWorks, offer this kind of soup-to-nuts hosted solution, McMillan notes. Alternatively, he says, a practice could use a third-party hosting firm that understands HIPAA requirements. The total cost of ownership for running your own client-server network, he says, is probably greater than the fees you’d pay to a remote hosting service.
David Boles, D.O., who leads a 12-provider practice in Clarksville, Tennessee, says his practice recently decided to switch to remote hosting “because keeping up with the security requirements got to be more than I wanted to deal with.”
While it’s too soon to evaluate the results, he notes that he made the switch after a cloud-based EHR offered by his group’s longtime vendor failed to work as promised. The group went back to the EHR’s client-server version; but rather than invest in new servers, Boles decided to hire the remote hosting company.
Regardless of how your system is set up, there are certain security basics that you need to be familiar with.
To start with, the experts say, you should encrypt all of your data. Encryption is a strong defense against thieves and is considered nearly unbreakable, note McMillan and Sterling. It is possible that a “brute force attack” could be used to obtain a user password, which would sidestep the encryption, Zetter says. Questioned on that point, McMillan replies, “It’s certainly possible, but encryption is still a sound risk mitigation and liability management response.”
Encryption is especially important on laptops, smartphones and computer tablets, because these devices can easily be lost or stolen. In fact, lost or stolen mobile devices account for 39% of the security incidents in healthcare, and for 78% of the records compromised in security breaches, according to one study.
One way to prevent theft of mobile devices is to prohibit providers and staff from taking them out of the office or facility, Zetter notes. If a physician goes to the hospital, he points out, that doctor can use a hospital laptop and connect to the office network from that device.
If a laptop or other mobile device is lost, and PHI is on it, the incident should be reported, Zetter says, even if the data is encrypted. “Because if you fail to and the government finds out, you’re going to be in bigger trouble,” he says.
Sterling takes a different view. “If data is properly encrypted, it’s not considered PHI,” he says. “If I lost a thumb drive with all kinds of encrypted information on it, that wouldn’t be considered a breach.”
What constitutes a security breach under HIPAA is discussed later in this article. At this point, it’s just important to understand that encryption greatly reduces the possibility of such a breach.
Another strategy that many practices have adopted is to set up their computer systems in such a way that PHI is stored only on their servers or in their cloud-based EHRs. Desktops, laptops and other mobile devices that doctors and staff members use are not allowed to store PHI.
Some practices have “thin-client” networks, where the desktops in the office are dumb terminals that cannot store programs or data. Other practices can’t use that approach because the physicians have to carry their laptops with them when they travel to other practice settings. They keep the EHR applications on their laptops but don’t store any data on them.
For example, Jeffrey Kagan, MD, an internist in Newington, Connecticut and a Medical Economics editorial consultant, and his partner use laptops when they visit patients in nursing homes and when they travel. Several years ago, they stored all of their patient records on their laptops, synching with the office server every day.
Then, because their laptops didn’t have enough disk space, they stopped storing PHI on them and began using remote access to the network when they needed to see their records.
Boles’ practice discourages providers from taking laptops out of the office, but allows remote access to the system from home computers. “We’d never get through with the paperwork if we didn’t let people work at home, too,” he says.
Security experts advise caution when using personal computers, because they can be infected with malware or used as conduits to break into a network. If you do use a personal computer, McMillan says, remote access should include a proxy server or a virtual private network to ensure you don’t store any PHI on the personal computer and to shield the network from unauthorized intrusions.
Good access controls are critical, McMillan notes, because thieves impersonating users can gain access to EHRs. Besides having strong passwords, practices should deploy “two-factor authentication,” he says. Under this approach, which he says is very affordable, the practice can use a biometric tool, such as thumbprint authentication, or a proximity badge to confirm the user’s identity. Alternatively, users might be asked a personal question when they log on.
To make two-factor authentication less onerous, he adds, you can set up the system so that the password has to be entered only once a day. “You use some second factor associated with the person so they only have to put their username and password in once. Then the system might time out, but I can touch it with my badge or my fingerprint and it comes right back up,” McMillan says.
Two-factor authentication also can be used for remote access, he says. iPhone users, for example, can download a free app that enables this kind of identity access, while Google Mail provides options for encryption and two-factor authentication.
What should you do if you have a security incident? That depends on whether it’s regarded as a security breach and how many patients are involved.
As noted earlier, experts disagree over whether the loss of encrypted data constitutes a breach. The HIPAA breach notification rule says that an impermissible use or disclosure of PHI is presumed to be a breach unless the HIPAA-covered entity or business associate shows there is a low probability that the PHI has been compromised, based on a risk assessment of these factors: the nature and extent of the PHI involved; the unauthorized person who used the PHI or to whom it was disclosed; whether the PHI was actually acquired or viewed; and the extent to which the risk to the PHI has been mitigated.
“If there’s a low probability that the PHI was compromised, you don’t have to report it,” Sterling maintains. “But you have to maintain the documentation.”
If the records of 500 or more patients are breached, you are required to notify the patients and HHS within 60 days. If fewer than 500 patients are involved, you don’t have to tell the government right away, but you must notify the patients. If 10 or more patients can’t be reached, you have to make a public announcement that a breach has occurred, Sterling says. You must document all security breaches, regardless of size, and report them to HHS annually.
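For practices that keep a written incident-response checklist, the reporting thresholds above can be expressed as a simple decision procedure. The following is an illustrative Python sketch based on the rules as this article describes them; the function and step names are our own, and it is no substitute for legal advice:

```python
def notification_steps(num_affected, num_unreachable=0):
    """Return the notification steps for a confirmed breach of unsecured PHI.

    Thresholds follow this article's summary of the HIPAA breach
    notification requirements; illustrative only, not legal advice.
    """
    steps = []
    if num_affected >= 500:
        # Large breach: patients and HHS must be notified within 60 days.
        steps.append("notify affected patients and HHS within 60 days")
    else:
        # Smaller breach: notify the patients now; report to HHS annually.
        steps.append("notify affected patients")
        steps.append("document the breach and include it in the annual HHS report")
    if num_unreachable >= 10:
        # Substitute notice when ten or more patients cannot be reached.
        steps.append("make a public announcement of the breach")
    return steps
```

For example, a breach affecting 600 patients triggers the 60-day notice to patients and HHS, while one affecting 20 patients requires patient notification plus documentation for the annual report.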
If a laptop is stolen in a practice where PHI can be accessed only through the network, Zetter advises consulting an attorney. The practice should tell the attorney what it believes was on the laptop and when it was taken, then ask whether it needs to notify HHS or the patients immediately.
Kagan says his practice has had a couple of minor HIPAA security issues over the years, but they affected only a few patients. “We jumped up proactively and paid for identity protection for those people for a couple of years,” he says. “If somebody broke into our server, with 22,000 patients’ records in it, we’d have to send them all a letter.”
Templates for security policies and security risk assessments are available for free from a variety of sources, but must be adapted to the specifics of the practice situation, consultants say. HIMSS and the Office of the National Coordinator for Health IT (ONC) have security risk assessment tools online, McMillan notes. Sterling specifically cites ONC’s Security Risk Assessment Tool.
Sterling admits that the first time a practice does such an assessment, “it’s complicated.” But subsequent annual updates are much easier. A group that’s never done it before might want to get some advice from a security consultant, he says.
McMillan, whose company doesn’t work with small practices, echoes Sterling’s point. A few thousand dollars for a security risk assessment, he says, is “small potatoes” compared to the amount that a practice might have to refund to the government if its meaningful use attestation is ever audited.
If a practice can’t afford to hire a consultant, there are vendors who can walk you through the process using online software. “There’s a whole group of security vendors now that cater to the small practice. And there are some good ones,” McMillan says.
Zetter agrees, noting that one vendor he knows will help practices perform a security risk assessment for $350.
Your practice can do an adequate job of safeguarding your PHI. But it takes some dedicated effort to find out what you need to do and to make sure that it gets done. That could prove challenging. Boles and some of his colleagues, for example, did their own security risk assessment this year, having laid off the in-house IT technician who used to do it.
“We go through it the best we can,” he says, “but it’s like the IRS code.” Hiring a consultant, however, would be too expensive, he adds.
Kagan says he’s concerned about security risks, “but I’ve got so many concerns going on simultaneously. I’m more worried about the quality of patient care, malpractice suits, and my reputation in the community. Cybersecurity and HIPAA issues just get a lower priority for most doctors.”
That’s all true, until the HIPAA police come knocking at the door. Then you’ll be glad you did your due diligence on data security.
$21,906,500 - Monetary settlements, as of June 19, 2015, involving HIPAA Privacy, Security and Breach Notification Rules
$4.3 million - The lone civil money penalty issued by OCR for violations of the HIPAA Privacy Rule
115,929 - Number of complaints received by OCR since compliance date of HIPAA Privacy Rule in April 2003, as of May 31, 2015
1,216 - Compliance reviews initiated over that same time period
15 - Resolutions of cases involving the HIPAA Breach Notification Rule, as of May 31, 2015
$15,581,000 - Monetary settlements tied to those resolution agreements
549 - Number of referrals made by OCR to the U.S. Department of Justice for criminal investigation tied to the knowing disclosure or obtaining of protected health information in violation of HIPAA
23,580 - Number of cases investigated and resolved by OCR that required changes in privacy practices and corrective actions by, or technical assistance to, HIPAA covered entities and their business associates, as of May 31, 2015.
Source: HHS’ Office for Civil Rights
HIPAA rule violations: Categories and penalty amounts
The Health Insurance Portability and Accountability Act Omnibus Rule establishes four “tiers” of violations, based on what it terms “increasing levels of culpability,” with a range of fines for each tier.
Violations of the same requirement or prohibition for any of the categories are limited to $1.5 million per calendar year.
The language of the rule states that actual dollar amounts will be based on “the nature and extent of the violation, the nature and extent of the resulting harm, and other factors…includ[ing] both the financial condition and size of the covered entity or business associate.”
|Violation category||Penalty range per violation|
|Did not know of breach||$100 to $50,000|
|Had reasonable cause to know||$1,000 to $50,000|
|Willful neglect, corrected||$10,000 to $50,000|
|Willful neglect, not corrected||$50,000|