Presented by Sarah Badahman, Oct. 30, 2019
Download the PowerPoint presentation here.
- So why, exactly, is healthcare vulnerable?
- The best defense is a robust security program.
- Wrapping it up.
Today’s big topic is “security over compliance,” wrapping up National Cybersecurity Awareness Month.
Our webinar today is being sponsored by ThrottleNet. ThrottleNet is a partner of HIPAAtrek, and we’re very excited they partnered with us this month for our webinar. At the end of the webinar, I’ll give you a little bit of information about ThrottleNet and what they can possibly do for you guys as well.
Let’s go ahead and get started:
We all know that healthcare is under attack. In fact, we won an award last year, but not an award that we want to win. We are actually the most hacked industry in the United States. A lot of this is because hackers know that we are not taking security as seriously as we should, as well as the value of all the medical records. On average, a medical record is worth ten times the value of a stolen credit card on the black market, which leaves us extremely vulnerable.
In fact, hacking has surpassed all other types of breaches in healthcare. The majority of times, a hacking event will affect more than one location because we’re being phished. We’ll get the attack delivered to us through an email and then it can spread throughout our networks from there.
It is also important to note that most cybersecurity attacks are preventable. We’re being attacked, but there are things we can do to help protect ourselves, and we’re going to talk about a couple of those today.
One of the biggest things that might surprise you is the majority of our attacks are coming from overseas. So, a simple blocking of foreign IP addresses can actually help do a lot of good to protect our networks.
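As a rough sketch of what that kind of geo-blocking check looks like in code (the CIDR ranges below are placeholder documentation ranges standing in for foreign blocks; a real firewall would pull country ranges from a maintained GeoIP feed and enforce them at the network edge):

```python
import ipaddress

# Hypothetical example ranges. In practice these come from a maintained
# GeoIP feed, not a hard-coded list.
BLOCKED_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def is_blocked(ip: str) -> bool:
    """Return True if the source address falls inside any blocked range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in BLOCKED_RANGES)

print(is_blocked("203.0.113.45"))  # True
print(is_blocked("192.0.2.10"))    # False
```

The same membership test is what a firewall rule does far more efficiently in hardware; the point is simply that the logic is this mechanical, which is why geo-blocking is such low-hanging fruit.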
The number one reason healthcare is vulnerable is that security is not viewed as critical to patient care. This means we are only spending, on average, about 6% of our operating budget on security practices. But would you still be able to treat your patients if you didn’t have access to your network? To your EMR? Or if your entire system was locked down by ransomware?
So, we can see that security really is critical to patient care and we should be taking it a lot more seriously.
Another reason why we’re so vulnerable is that shortcuts in the adoption of technology are seen as culturally OK in healthcare. Meaning it’s OK if we don’t take the necessary steps to properly vet vendors. So, we use vendors that won’t sign business associate agreements or won’t answer security questionnaires, or we’re not sure what their practices are.
We also will allow our clinicians to share passwords so that their medical assistants can log into the EMRs and workstations for them to save a few seconds when they’re going from room to room. And this is a big problem.
It’s expensive to adopt; it’s expensive to maintain. Healthcare’s budgets are shrinking, and the expenses are getting larger. So, we’re cutting expenses, and technology is unfortunately one of the places a lot of healthcare organizations do cut expenses, which ultimately leaves us vulnerable.
These are the most common ones we see when we come in and do security risk analyses. We see:
Lack of authentication. We have weak password policies, and your passwords are your weakest form of protection. A password can be easily breached or hacked. There is password-cracking software out there that hackers use to get in through passwords, and if we don’t adopt stronger password policies, we are left vulnerable.
Lack of adoption of two-factor authentication. Two-factor authentication is when you have a password that you enter in and then you have to swipe a card, or you’ll receive a text message and enter a code. It’s essentially two ways to authenticate yourself into a system that has access to PHI.
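The code you receive in that second step is usually a time-based one-time password (TOTP, defined in RFC 6238). As an illustrative sketch, the whole mechanism fits in a few lines of standard-library Python:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, for_time=None, digits=6, step=30) -> str:
    """RFC 6238 time-based one-time password: HMAC over the current
    30-second time step, truncated to a short numeric code."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((for_time if for_time is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# RFC 6238 test vector: the ASCII secret "12345678901234567890" at T=59
# produces the 8-digit code 94287082.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", for_time=59, digits=8))
```

Because the code depends on a shared secret *and* the current time, a stolen password alone isn’t enough to log in, which is the whole point of the second factor.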
We’re also not properly encrypting our data at rest. We’ll talk a little later on about some encryption practices but encrypting your data at rest – not just your data in transit – is super important. In fact, it could actually benefit you if you are hacked or you do have a breach if you can prove your data at rest was encrypted. Essentially that means the hacker would have little access, if any, to PHI.
Insecure email. This is mostly in clinics, where we see free or personal email accounts being used to share PHI. These are your Yahoo, Gmail, and AOL accounts. We’ll also see shared email accounts, where one practice has a single email for everybody to share, which also leads into unencrypted email. When you have PHI you’re sharing back and forth, you are required to have it encrypted.
Email access on mobile devices. This goes across the board for organizations of all sizes and types because most of us have smartphones with our work email on them. If we’re not encrypting our smartphones, or we don’t have heftier security on the personal devices that access our work email, it leaves us vulnerable to those types of attacks.
Lack of comprehensive inventories. And this is really important. This is not just your hardware. This is a comprehensive inventory of all of the software you use that has access to, can transmit, stores or creates PHI. It’s really important for you to have a solid understanding of everything you use that can have access to PHI, and that it’s all documented.
Lack of basic security protocols. SSL and TLS on websites and applications transmitting PHI is super important. One of the first things we always do when we do a security risk analysis is go onto the website. What we see a lot of times (and again, this is especially in clinics) is that the clinic website has no SSL or TLS, and we can tell it’s built on WordPress or Joomla.
And it has these “Contact Us” forms on there. Even if your website has SSL or TLS, those web forms are actually plugins that are provided by Joomla or WordPress, and so they’re going through a third party. So, all that data is going to that third party where there’s no business associate agreement and then back to the provider or to the clinic or hospital, and it’s all transmitted in clear text. We never want to have those types of things on our websites because that’s a huge security breach, and we see it all over the place in healthcare.
Lack of data backup and disaster recovery planning. If you have a breach, and you’re only backing up your data at the end of every day, can you survive losing an entire day’s worth of work? If you’re at a larger clinic or hospital, you have to think about the fact that you’re losing so many people’s days’ work. So, it’s really important that you have incremental backups throughout the day.
At HIPAAtrek, we back up our data every 15 minutes, so we’ll only ever lose 15 minutes of work. And then we have a tested disaster recovery plan, which is also really important.
If you just have a disaster recovery plan written down but you’ve never tested it, how do you know it works? How do you know you’ll be able to recover your data and get back to work within an appropriate amount of time, losing as little revenue and patient care as possible? The only way to tell is to test your disaster recovery plan frequently.
Lack of auditing and monitoring procedures. This is probably what every compliance officer hates the most – going in and auditing everybody’s access to make sure nobody’s accessing things they shouldn’t be accessing, and that we’re properly monitoring our systems. Some of this can be set up to be automated, and I strongly suggest that because it reduces some of the work.
But unfortunately, some of it’s still going to be manual labor, going through your EMR especially and making sure nobody’s accessing things they shouldn’t be. That is your responsibility, and it’s a huge part of being able to protect yourself.
This is one of my favorite pictures. It encapsulates everything that drives our IT departments crazy.
Our IT departments put every piece of data security in place. Dave is our human error, and he causes all kinds of trouble for us, because no matter what we do, Dave is always there.
Computing habits. One of the things we need to teach Dave is proper computing habits. It’s called a workstation – it’s not your PlayStation. When we have improper web browsing, we’re using our personal email, or we go on social media at work – these are all things that can cause damage to your work environment. At about 80% of the places where we do our security risk analyses, we see employees actually looking at pornography at work.
Make sure you’re blocking everything you can to protect yourself against Dave in that situation. Not only does it waste his productivity, but it puts your security at work at risk.
Physical security. Dave is also really bad at physical security. He leaves his workstation unlocked and unattended when he gets up to get a cup of coffee or go to the restroom or just run into somebody else’s office. That can actually cause some damage, too, because Dave is still responsible for all the activity that happens under his login. If he forgets to log out, anybody can look through that.
We know that mobile devices containing PHI are probably our biggest risks. These are our laptops, tablets, and cellphones. When we leave these in vulnerable areas, they will be swiped. So, we need to make sure we’re keeping those under lock and key.
Security practices. We don’t have proper passwords in place. We may have an internal firewall, but we’ve neglected to put in an external firewall. And again, those pesky audit procedures will keep coming back to us. We have to do those.
We also tend to ignore our non-technical vulnerabilities. This is probably one of the biggest problems we see in security, because we think of security as an IT function. There are a lot of non-technical vulnerabilities we also need to consider when we’re considering the security of our electronic protected health information.
We need to make sure we have cross-functional teams, because when you tell your IT department that they’re responsible for security, they’re only going to think about technical vulnerabilities.
So what are non-technical vulnerabilities? They are the physical security of everything: our portable devices, any storage media, and all our maintenance records.
Employees. We also have our employees. Again, this goes back to Dave. If we’re not paying attention to him, he can cause us some really big headaches. So, we need to make sure we’ve trained all our employees before we give them access to our protected health information. We need to make sure they’re trained periodically, throughout the year, and that they receive training every time one of our security or privacy practices changes so they have the most up-to-date information.
We also need to make sure we’re properly hiring – this means we’re training our employees before we give them access to PHI. And we need to make sure we’re properly terminating our employees. Whether they resign or they’re being fired, we need to make sure we have an accurate list of everything they have access to and, before they leave our facilities, that all of that has been terminated.
That’s not only the technical side – passwords and computer systems they need to be locked out of – it’s also physical locations: Do we need to change punch codes to doors? Do we need to change locks? Do we need to get a key back? Some of that’s going to depend on the level of the employee being terminated, as well as whether you feel they have any malicious intent toward you after the termination.
Policies and procedures. We also need to make sure our policies and procedures are more than just a binder. You need to make sure your policies and procedures are living, breathing documents that you’re actually implementing, because if you just have policies and you’re not following them, that’s a huge vulnerability.
Third parties. And finally, we need to look at our third parties. For any vendor we have, we need to evaluate whether or not we need a business associate agreement with them. Can we do a security assessment of our third parties, or get some other sort of assurance that our vendors are secure? A third of all cybersecurity attacks involve third parties – and that’s across all industries, not just healthcare. That’s alarming. So, we need to make sure we’re taking it seriously and properly vetting the third parties we utilize.
If we focus on security, compliance will follow.
We need to understand that HIPAA is not a security framework. It is a regulation. It does have some requirements that fit into security frameworks. However, [HIPAA] doesn’t go in-depth. It’s left purposefully flexible to fit organizations of all sizes.
It is our strong recommendation that you adopt a security framework. You can choose NIST, from the National Institute of Standards and Technology. It is a government-written security framework, and it’s the framework recommended for the DoD. There is no NIST framework specific to healthcare, although there is a crosswalk from NIST to HIPAA.
There’s also a HITRUST security framework and there’s ISO, but I tend to like the NIST one better. It’s the easiest one to understand. We’re going to go through a few of these sections today. But this is what the NIST security framework looks like:
One of the things we really need to make sure of is that we’re including all of our devices that are connected to the network in our cybersecurity plan. That’s the first thing we need to do, regardless of which cybersecurity framework you’re looking at.
At the last HIPAA work group meeting last year, one of the biggest recommendations was that we start looking at our connected devices.
All the monitoring devices in hospitals that are connected to the internet or our networks – the ones we typically don’t think of when we’re doing our security risk analyses – are the next entry point for hackers to get into our systems. Our MRI machines, any patient monitoring systems, any imaging systems we have connected to the internet that contain PHI – we really need to start looking at those as the next entry point for cybercriminals.
Run on unsupported operating systems. One of the biggest reasons these connected devices are a growing security concern is that a lot of them run on unsupported operating systems. For example, we were just doing an assessment at a pharmacy that uses a piece of McKesson software to run some of its machines. They can’t get off of Windows XP because that’s what McKesson requires, and Windows XP is no longer supported by Microsoft. They have an unsupported operating system that they have to have, and there’s no alternative out there. So, they have this huge liability and risk, all because that system is on an unsupported operating system.
McKesson is a lot bigger than some of these hospitals and pharmacies, so we have to make sure we’re all pulling together to say, “Hey, we need you to use supported operating systems to protect our cybersecurity.”
Aren’t included in risk analyses. We also know our connected devices aren’t all included on our risk analysis and security evaluations. Not understanding that risk means we can’t communicate that concern back to our vendor and we can’t start taking the necessary steps to move away from that vendor or find some other workaround or alternative to help address that gap.
Don’t support common security protocols. We also know that these connected devices don’t necessarily support common security protocols. A lot of times you can’t have a unique user ID and password to authenticate yourself into some of these systems, so it’s a shared password or, worse yet, there’s no security at all.
Contain rich PHI and are easily breached. We know that these devices store rich electronic protected health information. It’s got all the patients’ data. It has everything a hacker would need to steal a patient’s identity. They’re so easily breached with a low likelihood of immediate discovery, which means that attacker can just keep farming from you without you even knowing.
One of the things we can do is make sure those connected devices are in our asset management. Here are some action steps we need to take when we think about how we’re going to manage our assets:
- We need to make sure we’re identifying all our devices: workstations, mobile devices, personal devices, laptops, tablets – anything that accesses, stores, or transmits electronic protected health information.
- We also need to make sure we’re identifying all the software that we use to create, access, store, or transmit electronic protected health information.
- We really need to make sure we have our Application and Data Criticality Analysis. That sounds like a mouthful, but it’s actually super simple to put together. Again, I have a great template I’d be happy to share with you guys – just let me know if you want it. The Application and Data Criticality Analysis is required – it’s part of the Contingency Plan of the HIPAA Security Rule – so we know we have to do it. It’s simply listing out which software you have and what that software needs to run on. So, if you’re using Microsoft Office, you need an operating system to support it.
Question: Does this mean that we need Windows 10 by January 2020 to be secure?
That is correct. Windows 7 is no longer going to be supported. It’s being sunset.
- We also need to make sure we have all the proper controls in place. This is all your policies and procedures around your assets.
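The steps above, especially the Application and Data Criticality Analysis, lend themselves to a simple data structure. This is a minimal sketch with hypothetical system names; a real analysis would list every application that creates, accesses, stores, or transmits PHI, and the dependency walk below is one way to derive the order systems must come back up in:

```python
# Hypothetical entries; "depends_on" captures what each application needs to run.
criticality = {
    "EMR":            {"depends_on": ["Windows Server", "SQL Database"], "priority": 1},
    "SQL Database":   {"depends_on": ["Windows Server"], "priority": 1},
    "Billing System": {"depends_on": ["EMR"], "priority": 2},
    "Windows Server": {"depends_on": [], "priority": 1},
}

def recovery_order(apps):
    """Order systems so each one comes up only after everything it depends on."""
    ordered, seen = [], set()

    def visit(name):
        if name in seen:
            return
        seen.add(name)
        for dep in apps[name]["depends_on"]:
            visit(dep)  # bring dependencies up first
        ordered.append(name)

    # Walk highest-priority applications first (priority 1 = most critical).
    for name in sorted(apps, key=lambda a: apps[a]["priority"]):
        visit(name)
    return ordered

print(recovery_order(criticality))
# ['Windows Server', 'SQL Database', 'EMR', 'Billing System']
```

Even kept in a spreadsheet rather than code, the same two columns – what the software is and what it needs to run on – answer the recovery-ordering question that comes up later in the disaster recovery discussion.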
GI Joe was right: Knowing is half the battle. The best way for us to know what’s going on is to conduct a security risk analysis. These are the steps that will need to be taken to do a security risk analysis. We did a webinar a few months ago on security risk analyses.
The biggest thing you need to know is to take a multi-disciplinary approach to doing your security risk analysis and to your security program as a whole. Most of us think of security as being in our IT departments, but our IT departments are unable to tell us the financial impact a breach could have on our organization – that needs to come from Finance and Operations. They can’t tell us our legal risk – that needs to come from Legal or Compliance. So, you need to make sure that you have a multi-disciplinary approach to not just your security risk analysis but to your security program as a whole.
Some risk management action steps that we need to make sure we’re doing are:
- Prioritize all our identified vulnerabilities. When we run our risk analysis, we need to prioritize which vulnerabilities we’re going to address first.
- Create a project management plan for each vulnerability that will be mitigated. This is incredibly important because just knowing what your vulnerabilities are without a plan to mitigate them doesn’t mean all that much.
- Use that multi-disciplinary approach to mitigation.
- And document everything!
You also need to remember that risk management is an ongoing process, not a one-time thing.
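As a sketch of how the prioritization step can work (the scoring scheme here is a common convention, not something prescribed by HIPAA, and the findings are hypothetical): score each vulnerability by likelihood times impact and address the highest scores first.

```python
# Hypothetical findings from a security risk analysis; scores run 1 (low) to 5 (high).
findings = [
    {"vulnerability": "No encryption at rest on laptops", "likelihood": 4, "impact": 5},
    {"vulnerability": "Shared EMR passwords",             "likelihood": 5, "impact": 4},
    {"vulnerability": "Untested disaster recovery plan",  "likelihood": 2, "impact": 5},
]

def prioritize(items):
    """Rank vulnerabilities by risk score (likelihood x impact), highest first."""
    return sorted(items, key=lambda f: f["likelihood"] * f["impact"], reverse=True)

for f in prioritize(findings):
    print(f["likelihood"] * f["impact"], f["vulnerability"])
```

Each ranked finding then gets its own project management plan and owner, and the list is re-scored as mitigations land – which is exactly the “ongoing process” point above.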
This is another big area that gets us into trouble. We need to make sure we’re establishing access for our employees and workforce members correctly.
- Establishing work groups so we know that nurses need access to certain things and management needs access to other areas.
- Making sure we review those periodically and that they’re still proper is really important, because we have to comply with the Minimum Necessary Rule: you can only have access to the minimum amount of PHI you need to do your job. This means we may need to set up periodic reviews of access, because if someone’s job has changed, we may need to modify their access to ensure they still have exactly what they need.
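A periodic access review like the one described can be sketched as comparing each user’s current access against a baseline for their role. The roles and system names below are hypothetical:

```python
# Hypothetical role baselines for the Minimum Necessary review.
role_baseline = {
    "nurse":   {"EMR", "scheduling"},
    "billing": {"EMR", "claims", "billing"},
}

def review_access(user_role, current_access):
    """Return anything the user can reach beyond the baseline for their role."""
    return set(current_access) - role_baseline[user_role]

# A nurse who changed jobs but kept old billing-side access:
print(review_access("nurse", {"EMR", "scheduling", "claims"}))  # {'claims'}
```

Anything the review flags is either revoked or documented as a justified exception; an empty result means the user is at minimum necessary.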
This is my favorite thing to talk about! A lot of times, we know we need to encrypt all of our data in transit, and for the most part we do a pretty good job: our emails are encrypted, we’re using encrypted electronic medical records, and our cloud-based data backups are all encrypted in transit. But sometimes we forget about encrypting our data at rest. This is the data sitting on our workstations or on our servers, and we need to make sure it’s encrypted.
There are two ways most of us will encrypt: full-disk encryption or file-level encryption.
- Full-disk encryption is by far the easiest one. People love it because it doesn’t require long keys to access files, and a lot of times it’s free because it comes with Windows Professional. The problem is that with full-disk encryption, the data is only encrypted while your machine is turned off. The moment you boot your machine, it’s no longer encrypted. Files you’re moving are also not encrypted; if you’re moving a file from one folder to another, it is decrypted.
- File-level encryption is the strongest level of encryption. A file stays encrypted regardless of where it’s stored, for as long as it’s at rest – even if the computer is on, the file is encrypted unless you’re actively using it.
I wouldn’t say all your machines need file-level encryption or all your machines need full-disk encryption. You need to come up with a system that works best for your hospital, clinic, or organization to address this.
My strong recommendation is that every mobile device – all of your laptops – should have file-level encryption. It helps protect the system better; it helps protect the data better; it ensures all those files are encrypted. Full-disk encryption is perfect for your workstations or your servers. Try to work with your IT team and come up with a hybrid of how you can address encryption.
One of the biggest “don’ts” is don’t send unencrypted communications containing ePHI! This is text messaging – one of the most popular questions we get is: “Is iMessaging encrypted? Can I use it? Is it HIPAA compliant?” The answer is, no. It is not. Apple is not signing a business associate agreement with you. You cannot text patient information at all.
Email – you need to make sure that you’re using an encrypted email and that you have a business associate agreement in place with your email provider before you send PHI through email.
Outdated technology, on average, costs the health industry $8.3 billion a year. This is an insane number. We could treat so many patients with that dollar amount that we’re wasting on outdated technology. A lot of this is because we don’t want to update our machines. It’s expensive and it’s time-consuming, and – let’s face it – nobody likes to change. But we have to make sure we’re doing that.
Patch management. It’s going beyond just updating our operating systems or physical hardware; it’s even as simple as patch management. These are all of the updates we have to do to our current operating system. You get those alerts, those popups, that say “You need to update your system.” Microsoft and Apple send those out a lot. Those patches are super important.
If you have on a pair of jeans and you have a big hole in those jeans, that patch is covering that hole – it’s covering that vulnerability to protect yourself. And a patch for your computer is the same. It’s patching up that hole in the operating system to protect your systems and networks from attacks.
So, it’s really important we have strong patch management and we automate it as much as possible, and we assess any legacy systems. We know there are some systems out there that still require us to utilize outdated technology, so we just need to make sure we’re constantly reviewing those.
Cleaning machines. We also need to make sure we’re cleaning out our machines: temporary files, especially your email temporary files, and your recycle bins. Attackers nowadays come at us through rootkits and botnets, and they hide all that information in our temporary files and recycle bins.
You can set rules on your machine, or have your IT department set rules, to have those automatically cleaned out periodically. But we have to make sure they’re getting cleaned:
- Your machine will operate faster because you’re cleaning up memory.
- You’re helping to protect yourself in case you have any type of malware hiding in those temporary files or recycling bin.
Older technology. Older technology runs slower, which costs us productivity and revenue; it has a higher prevalence of cyberattacks; and it’s less likely to be supported by the manufacturer. We really need to review and update all our technology as necessary so we’re not running on a bunch of old technology.
Early detection saves data! Just like early detection saves lives in our cancer, diabetes, and renal screenings, early detection within your software can save your data, because your data is valuable and we need to protect it. So, we need to make sure we’re using proper detection software.
- Never use home versions of detection software. If you have Norton antivirus, never use the home version of it. Always use an enterprise version of a detection software.
- Also, make sure you’re keeping the detection software libraries up-to-date and that you review your quarantines frequently.
- Disallow users from disabling your detection software. We see this more frequently than I thought we would, where people will turn off their antivirus and antimalware because they get those little popups and they’re annoying, and they don’t like them. So, they disable it altogether. You need to make sure your IT department has complete control over that, and that individual users aren’t able to disable the antivirus software.
- You also need to make sure it can scan your root folders. This might mean you’ll have more than one type of detection software to make sure it’s scanning everything, because no detection software catches 100% of everything. But if you run two or three different types of detection software, you’ll likely be able to catch the majority of threats out there.
We need to make sure we’re training Dave. We need to make sure he knows how to recognize a potential attack.
Slow-moving machines. One of the first things we want to look at is slow-moving machines. When our computers start to slow down, we might have a trojan or other type of worm or virus that is slowing our machines down because our machines are having to work harder than normal. If people are complaining about slow machines, we need to take that seriously because it could be a potential attack. People need to let us know when their machines run slowly.
Executables. Or if an executable starts running out of the blue. Most of the time, that happens after we open an email attachment, it’s some sort of ransomware, and that executable starts running. We need to educate our staff not only on recognizing that attack but on what to do. They need to be trained to pull the plug on their machine immediately so it doesn’t spread too far into your network.
Popups. Unwanted popups are another sign of a potential attack.
After we train them on how to recognize potential attacks, we need to instruct them on what to do if they suspect an attack. We need to tell them how to disconnect from the network, tell them to unplug the machine or power it off, and then call IT. The whole point is to stop the bleed immediately, and then get IT involved.
Once IT does get involved, one of the biggest things we need to do is not to panic. If you have an attack and start to panic, it’s going to make it so you can’t think through everything that needs to be done because you’re being very reactive. This is what we plan for. We have everything planned out prior to an attack so you know exactly what to do.
The first thing you’ll do is assemble your multi-disciplinary task force. You’ll have your IT team, your Operations, your Finance, your Legal, your Compliance. They’ll sit down together and start working through the breach. One of the first things IT will need to do is contain the breach or attack and assess the severity or extent of the breach. Meanwhile, everybody else needs to start notifying the patients, staff, and management of the breach. And make sure you’re documenting everything along the way.
We also need to make sure we have a healthy disaster recovery plan, so everything is in place. This goes back to when I was speaking previously about your data backup and disaster recovery plan and making sure you test everything.
- Recovery time objective. Make sure you have that in place, that you’ve identified a recovery time objective: how long will it take you to recover? When you ask operations or clinicians, “How long can you do without this software?” they’ll tell you 30 seconds. But that’s not reasonable. So maybe it’s 48 or 72 hours before everything is properly recovered. You need to know what that recovery time objective is.
- Recovery point objective. You also need to make sure you know the recovery point objective. A lot of time it’s going to be based on how often you’re doing your backups. A lot of this information is going to be on your Application and Data Criticality Analysis. It’ll tell you which software you need to bring up first for that other software to work. It’ll also tell you your alternative workarounds in the event that that piece of software is down. You need to make sure you’re testing this process frequently.
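A back-of-the-envelope check like the following (with hypothetical systems and recovery times) can tell you whether the ordering in your Application and Data Criticality Analysis actually fits inside the recovery time objective you’ve agreed on:

```python
# Hypothetical recovery times (hours) per system, in the order the
# Application and Data Criticality Analysis says they must come up.
recovery_plan = [
    ("Network/servers", 8),
    ("SQL Database", 4),
    ("EMR", 12),
    ("Billing System", 6),
]

RTO_HOURS = 48  # recovery time objective agreed on with operations

def total_recovery_time(plan):
    """Systems that depend on one another come up sequentially, so times add up."""
    return sum(hours for _, hours in plan)

total = total_recovery_time(recovery_plan)
print(f"Estimated recovery: {total}h; within RTO: {total <= RTO_HOURS}")
```

If the estimate blows past the RTO on paper, it will certainly blow past it during a real incident, which is another argument for testing the plan before you need it.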
We also strongly recommend a separate ransomware response plan. Do you have a plan specific to ransomware? The first question we have every time people ask about ransomware is: To pay the ransom or not to pay the ransom?
Paying the ransom. The FBI warns us against paying the ransom because we’re dealing with criminals. There’s no guarantee you’re going to actually get your data back if you do pay the ransom, and it’s encouraging criminal behavior.
About 80% of all hospitals that have been the victims of a ransomware attack have paid and have gotten access back to their data. But that’s going to depend on how strong your data backup and disaster recovery plan is. How sure are you that you can actually recover everything that’s been locked down? You really need to understand what you will do in the event of a ransomware attack.
Simulating an attack. And have you simulated a ransomware attack? This doesn’t have to be where you actually lock down one of your machines. You can do this on paper. You can sit down in a conference room and have a roundtable with your multi-disciplinary response team and work through what you do if you have an attack.
Work through different scenarios. Work through a scenario where the employee did not unplug the machine, but they called IT and it took IT five minutes to get there, so now your entire network is locked down. Then work through if they did unplug the machine right away. Try to work through as many scenarios as possible so you know exactly how to respond.
Employee training. Employee training on ransomware is not an option! You have to train them on recognizing what it could look like once that executable starts running. You need to train them on how to respond. You need to train them on how to prevent. Ransomware is so prevalent. We have to make sure we’re training on that.
One of the biggest reasons to take all of this seriously is that your data is so incredibly valuable. Again, a medical record is worth ten times the value of a credit card on the black market. Your cybersecurity and compliance programs are the only things you have to protect that data.
Remember, compliance is a journey, not a destination. You have to make sure you’re staying on top of all your HIPAA requirements, as well as all your cybersecurity goals to make sure you’re staying on track and doing everything you need to. You cannot look at compliance or security as a destination. It’s not a checkbox. This is a journey that we’re on, and we’re on it together.
Our sponsor for today’s webinar was ThrottleNet. ThrottleNet is a managed services provider with over 20 years of experience in providing IT and security services to the healthcare industry. ThrottleNet believes that cookie cutter solutions belong in the bakery and not in healthcare environments. They provide customized IT solutions to all their clients.
Q1: You spoke about file-level encryption earlier, which I think we need. How can I get my leadership to take HIPAA security more seriously?
That is a big question! File-level encryption is really important, especially for your mobile devices. Getting your leadership to take HIPAA security more seriously is going to depend. Have you done a security risk analysis lately? Maybe you can show them the vulnerabilities you have that file-level encryption would help solve. A lot of times with leadership, you have to show a financial reason to change something, because nobody likes change. Making sure they budget correctly and see the value in what you’re going to save them and how you’re going to help them avoid risk – that’s the most important thing.
Q2: As a privacy officer, I do not receive cooperation from my security officer when it comes to working and communicating together.
We see this quite a bit actually. It is important. When we go right back to that multi-disciplinary approach to compliance, you as a privacy officer are trying to work with your security officer, which I’m guessing is probably in IT. When you’re having trouble receiving that type of cooperation, it’s probably because you lack a clear plan. Having a clearly documented plan and clearly documented goals with deadlines and objectives to meet will help you guys to be able to communicate better and be more effective and efficient in your program.