Dr. Jeremy Sharp (00:34)
Hey everyone, welcome back. We are closing out our technology pillar with a topic that's a bit of a shield, I think, for your entire career, which is cybersecurity and data sovereignty. How much more boring could we get? Not much, I'll be honest. But this is super important. Super important. Cybersecurity and data sovereignty. So in 2026, health care data
is the number one target for cyber attacks. As we have moved more to the cloud using digital testing and remote video and AI tools, we have created kind of a massive digital footprint, even within our small practices. And so today I’m not just talking about like how to not get hacked. I’m talking about actual sovereignty. So what does that mean? That means who actually owns the data?
once you hit save in a cloud-based tool. There are a lot of questions about this that come up in the Facebook group and in consulting and so forth. There's a lot of fear, a lot of curiosity around, you know, do I use a cloud-based tool? Do I need a hard drive backup? Should I just keep paper notes? With good reason, you know, there's a lot to be concerned about here, but…
Today we're going to look at the new 2026 HIPAA mandates, specific red flag phrases to look for in your software contracts, and the exact questions that you need to ask a vendor before you trust them with your patients' most sensitive information. So I'm going to talk with you, I think this is the last time, at least in the March marathon, about Crafted Practice, which is the in-person business retreat for testing psychologists. It's the only one. It's the only one.
So this is a small group, 20 people. We come together for three or four days in Colorado over the summer and we do small group coaching. We work on your businesses. We actually implement some ideas. The hope is that you get the space to actually think about your practice and work on it a little bit instead of just moving from day to day and hoping that something will change. There's a chance you could form relationships that will last literally years, if
past events are any indication. Love to see you there. It's all inclusive. Just get yourself here and we'll take care of the rest. So you can get more info at thetestingpsychologist.com slash crafted practice. Okay, folks, let's close out this March marathon with an episode on cybersecurity and data sovereignty.
Dr. Jeremy Sharp (03:16)
OK, I’m going to keep this, again, relatively short. Try to just hit the high points. As always, there’s much more to say, but these are meant to be relatively quick, actionable episodes. So I’m going to start with a little bit of a HIPAA overall discussion. So in 2026, there’s a little bit of a change in HIPAA.
So this is one of the first things you need to know. The 2026 HIPAA Security Rule update has eliminated the distinction between required and addressable safeguards. So every technical measure is now mandatory regardless of the size of your practice. What does this concretely mean for you and your work? It means you need to demonstrate three things in an audit.
One, you have to have mandatory multi-factor authentication. That's now required for every system accessing electronic PHI, protected health information, including your EHR, your patient portals, even your administrative tools. MFA is multi-factor authentication. That's just the process where you log in with a password, but then you also have to get a code texted to your phone, or you have to put your finger
on the fingerprint scanner on your MacBook or something like that. So multi-factor authentication is now required for every system accessing PHI. The second point is encryption by default. So data must be encrypted both at rest and in transit. Now, this is largely not something you have to implement yourself. You just have to worry about using software that does do it. So you are not responsible for encrypting your own data. Let me be clear.
But again, you are responsible for using software that does encrypt the data. And pretty much, you know, anything that's going to sign a BAA with you (which you should be looking for in any software that touches your protected health information, your client information) is going to encrypt the data both at rest and in transit. But you want to make sure. The third point is a 72-hour recovery rule. So
What does this mean? This means that you have to prove that you have a backup system capable of restoring critical clinical data within 72 hours of a cybersecurity event. Again, this is not necessarily your responsibility, but you do need to check your vendor contracts and software contracts and those lengthy, boring terms of service that none of us read for this little detail to make sure that the software that you use
does have a 72-hour recovery rule. So, like I mentioned earlier, this is where that business associate agreement, or BAA, becomes your primary legal shield of sorts. In 2026, the standard off-the-shelf BAA is probably not enough. Okay. At this point, you need AI-specific clauses that explicitly address data training opt-outs and subcontractor usage.
If your vendor uses a third-party cloud host like Amazon Web Services or Microsoft Azure, the BAA must confirm that those downstream partners are also meeting these new 2026 mandates. And this is something that we've been thinking about at Reverb, which is my AI report-writing software. We just had to update our BAA and some of our other documents to match these updates to HIPAA.
So let's transition to data sovereignty for a moment. The term data sovereignty is a little opaque. Many clinicians will confuse this with data residency, which is where the data is physically stored. Sovereignty is more about control. And if I'm being honest, I don't know that anyone I've talked to has ever called it data residency. But we do get a lot of questions around where is this data stored, and that's what I was talking about.
But again, we are discussing sovereignty, which is more about who controls the data. So under HIPAA, you as a clinician generally own the physical or electronic record, but the patient has a right to access and control its disclosure, right? Like, this is pretty straightforward. That said, many of the digital tools try to kind of blur these lines in their terms of service. So they might claim ownership of "anonymized" data for
"product improvement." Those are air quotes, if you couldn't tell.
So professional consensus generally emphasizes that clinicians should retain full control over their data. But it's uncertain exactly how this data sovereignty issue is going to be adjudicated in cross-state licensure compacts, like PSYPACT, for example, when state laws conflict. So many stakeholders would say that the safest path is to use what's called private instance cloud storage, where your data is
sequestered from the vendor's other users to ensure that your data doesn't leak into a global model for training. Now, you might think, okay, is the data being used to train a model? That feels like a very AI question, and it's certainly relevant in the AI context. But it's also relevant for other software platforms. For one thing, because nearly every software platform these days has integrated AI. So AI is gonna
be a part of it and you have to be conscious of whether your data is being used to train models or not. But also just because, you know, like I said earlier, like software terms of service that claim ownership of anonymized data for product improvement, even if it’s not training a model, we do have to be really careful that the data is truly anonymized and that it is not, like I said, leaking into other parts of the software, certainly into
other user accounts or anything like that. So, very quick, very straight to the point, and it's a lot to take in. So let's do a quick pause there just to recap. Essentially, what you need to know is that HIPAA has been updated for 2026 and a few things are now required. Like I mentioned: mandatory multi-factor authentication, making sure you're using software that is encrypted by default,
and then making sure that your software also has a 72-hour recovery rule. You also have to be careful about reading the terms of service, making sure that your data, or rather your clients' data, is not being used to train other models. Okay, so how do you actually vet software and make sure this is happening? Essentially, stop looking at the features pages and start looking at the security or legal page
for the software. Okay, so here’s a quick little three point vetting checklist for your software. Number one, you got to ask the training question. So is my data used to fine tune your global base model? And the answer needs to be a hard no on this. The data should only be used for your specific instance or not stored at all. The second thing is basically demanding third party validation. So don’t take a vendor’s word for it.
look for SOC 2 Type II or HITRUST CSF certifications. I'll say those again: SOC 2 Type II, or HITRUST CSF. We are right at the tail end of the SOC 2 certification process at Reverb, and it's a process. So this is just a third-party verification or validation of compliance, data security, and data sovereignty.
So these two standards are kind of the gold standards for proving that a third party has audited the security of the software over a long period. The third thing is checking for data portability. So if you decide to leave your software and want to take your data with you, can you actually do that? Can you export your data in a usable format? A lot of vendors will lock you in by making it impossible to get your records out without paying an extraction fee or otherwise
compensating them. So, a three-point checklist when you're vetting your software. Now, on to terms of service red flags. Let's watch out for some red flags in the terms of service. Look, y'all, nobody's reading these terms of service. Nobody is reading the terms of service. So what you can do here, it does take a little extra effort, but I think it's worth it. So before you click, you know, "accept" or sign a terms of service,
See if you can get a text-based version, right? So you should be able to, at the very least, copy and paste the text of the terms of service into a Word document or an AI language model and do a search or ask these questions. So you can either search for these terms or ask these questions. These are red flag phrases to look for.
Quote unquote, best efforts or commercially reasonable.
These are the kind of like get out of jail free cards for vendors. They provide no actual guarantee of uptime or security for your data. Another one is quote unquote sole discretion. If a vendor can change their terms or delete your data or really do anything with it at their sole discretion, then you have zero sovereignty over that data. And then the third one is perpetual license to de-identify data.
This is a huge red flag. It just means that they can keep and sell a version of the patient data essentially forever. So these are just three red flag phrases you want to look for in your terms of service. Like I said, it's going to take a little extra time, but I think it is worth it. And yeah, this is all honestly a huge pain in the ass. We should not have to worry about these things, but this is where we're at. And I think it's a consequence of moving to
digital record storage and the use of these digital tools. It is just where we're at. And I think we're headed in that direction. It's happening slowly but surely, where keeping paper notes is gonna be, if not a thing of the past, it's gonna become really, really difficult to do with any amount of efficiency or ease. Even literally yesterday, I was talking with someone who said,
"I still take notes, you know, with paper and pencil during my interviews." But then this person went on to say that they're actually doing it on a reMarkable tablet, which is, you know, a digital writing tablet. So we're just moving in that direction. All right. I'm going to end with a brief discussion about cross-state compliance and something called the Delete Act. Okay. So what is this? We have to consider the geography of data, apparently.
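[Editor's note: the copy-paste-and-search step for terms of service described a moment ago can be sketched as a short script. This is a minimal sketch, assuming you've saved the terms of service into a plain-text file or string; the phrase list comes from the episode and is not exhaustive.]

```python
# Scan terms-of-service text for the red-flag phrases from the episode.
# Matching is case-insensitive; extend RED_FLAGS with your own phrases.

RED_FLAGS = [
    "best efforts",
    "commercially reasonable",
    "sole discretion",
    "perpetual license",
    "de-identified",
]

def find_red_flags(tos_text: str) -> list[str]:
    """Return every red-flag phrase that appears in the text."""
    lowered = tos_text.lower()
    return [phrase for phrase in RED_FLAGS if phrase in lowered]

if __name__ == "__main__":
    sample = (
        "Vendor will use commercially reasonable efforts to maintain uptime "
        "and may modify these terms at its sole discretion."
    )
    for hit in find_red_flags(sample):
        print(f"Red flag found: {hit!r}")
```

Pasting the text into a Word document and using Find, or asking an AI assistant the same questions, gets you the same result; the script is just the repeatable version.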
So in 2026, there are new state laws in Indiana, Kentucky, and Rhode Island that have lowered the thresholds for compliance. So if you practice across state lines, like via PSYPACT, for example, you might inadvertently fall under restrictions from another state based just on where the data geographically lives. Okay, yet another headache, right? But to stay compliant,
you need to have a technology asset inventory. This is just a document that lists every tool that you use, the version number, vendor details, and a map of how the data flows. So if, for some reason, you know, the attorney general from one of these states asks where your geolocation data is stored, you need to be able to point to that inventory and say, we don't collect it, or it's encrypted here. Now,
If you’re like me, I mean, even saying this out loud, I’m like, what even is this and how do I do this? This doesn’t make any sense. I’m just like one person. You can get this from your vendors. So it’ll take a request, but you can get that information. You don’t have to like make this up or dig around and really find it. You should be able to contact customer support and get this information.
And so the tools that I'm talking about, like any tool, this would be your EHR, Google Workspace, anything that is touching your client data.
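[Editor's note: as a concrete illustration, a technology asset inventory can be as simple as a small structured file. This is a hypothetical sketch; the column names and entries are illustrative, and you would fill in your practice's actual tools, versions, and data flows from vendor documentation.]

```python
# Build a minimal technology asset inventory and emit it as CSV.
# Entries below are illustrative placeholders, not real vendor details.
import csv
import io

INVENTORY = [
    {"tool": "EHR", "version": "unknown", "vendor": "Your EHR vendor",
     "data_flow": "Clinical notes and PHI stored in vendor cloud; encrypted at rest"},
    {"tool": "Google Workspace", "version": "n/a (SaaS)", "vendor": "Google",
     "data_flow": "Email and scheduling; BAA signed; no geolocation collected"},
]

def to_csv(rows: list[dict]) -> str:
    """Render the inventory rows as CSV text with a header line."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["tool", "version", "vendor", "data_flow"]
    )
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

if __name__ == "__main__":
    print(to_csv(INVENTORY), end="")
```

A spreadsheet works just as well; the point is having one document you can hand over when someone asks where a given category of data lives.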
Okay, I will stop there. I want to emphasize before we totally wrap up that these are steps. Okay, like this is a process to put these things in place. And if you don’t want to do it yourself, that’s okay. Nobody really does. There are folks out there who can help you. You could certainly go the attorney route. You could look at something like person centered tech, which specializes in HIPAA and security and compliance for mental health practices.
You can get support with this, and you can go slow. So the likelihood that you're going to get busted for any of this is pretty low. You just want to kind of focus on these highlights, like I've mentioned, some of the most important things, and change step by step. So these are good quarterly projects, making these changes over three, six, nine months. Protecting your data, unfortunately, is not just a technical task;
it's a core part of running a practice. And we'll touch back on this in the next episode, which will be the final, final episode, kind of the capstone episode of this March sprint. We're going to be talking about an assessment practice audit based on all the content that we've talked about in these three pillars over the last three weeks. So make sure to catch the next one, and you'll get a nice little checklist
to go over your practice and make sure that everything is working the way that it should be.
