
An Ounce of Prevention: Keys to Understanding and Preventing AI and Cybersecurity Risks

 

Published:

May 01, 2024

Related Industry:

Healthcare 

Related Service:

Hospice & Palliative Care 
 
Podcast

    

In this episode, Meg Pekarske is joined by Jody Rudman, the leader of Husch Blackwell’s White Collar, Internal Investigations & Compliance group, where they explore the enforcement and privacy issues surrounding artificial intelligence. Jody shares insights on the most common types of cybersecurity issues and how to prevent them. A cutting-edge conversation to help ground you in this emerging area.

Read the Transcript

This transcript has been auto-generated.

00;00;05;01 - 00;00;53;26

Meg Pekarske

Hello and welcome to Hospice Insights: The Law and Beyond, where we connect you to what matters in the ever-changing world of hospice and palliative care. An Ounce of Prevention: Keys to Understanding and Preventing AI and Cybersecurity Risks. Jody, I'm so glad that you're here with me yet again; you're like a repeat guest. So thank you for sharing your wisdom. I don't know if this is a happier topic than the False Claims Act; maybe they both cause heartburn for people. But we're talking about AI and cybersecurity risk, and I know you recently did a webinar for the firm and some of the staff, and

00;00;53;26 - 00;01;21;12

Meg Pekarske

so we wanted to bring that to the podcast, because what you did was really focused on health care. And I think, you know, people hear, oh, ChatGPT, I hear about that, but then they don't really think about, well, how is that maybe changing my business, and what risks does that open me up to?

00;01;21;22 - 00;01;41;25

Meg Pekarske

And so I guess maybe we start with this, Jody: how do you see AI getting used? And maybe, too, in terms of your background, because I always think of you as a white collar crime, you know, False Claims Act person. So how did you get involved in this space?

00;01;42;12 - 00;02;16;14

Jody Rudman

Oh, well, first of all, it's great to be here with you again, so thank you so much. How I got involved in this: it's kind of a natural segue from so much of what I do, which is kind of retroactive. Okay, what went wrong, or at least what is somebody accusing my client of having done wrong? And turning that into a prospective view of how do we prevent, and then, you know, how do we deal with or minimize the what-went-wrong situations.

00;02;16;14 - 00;02;44;14

Jody Rudman

So that they create the least amount of risk and fallout on the back end. And I'll tell you, Meg, I'm not naturally a computer and software and STEM kind of oriented person. And so when I first started hearing about AI in the health care space, I was a little resistant even to rolling up my sleeves and digging in, because it just felt like, oh, that just doesn't sound like something I'd do.

00;02;44;29 - 00;03;09;16

Jody Rudman

The truth is, it's fascinating. And you mentioned this a little bit in your opening remarks: it's a bit of a double-edged sword, because as amazing as the frontier is that we're currently on, there is a flip side to it, and that flip side is that there's risk. And that presents an opportunity to figure out, prospectively, how to keep risks at a minimum.

00;03;09;19 - 00;03;53;01

Meg Pekarske

Yeah, well, I know some of our clients, some of our hospices, use predictive analytics and certain tools that they buy, things like that. But what kinds of AI do you see some of our health care clients utilizing? I know this could be a whole other podcast, but I know there is now more chatter about payers using AI to review claims that are in prior authorization and deny them, which is a whole other can of worms, you know, as someone who does a lot of claims defense on whether people are eligible for hospice.

00;03;53;01 - 00;04;09;10

Meg Pekarske

And if you feel like, well, AI is making that decision, would that make people feel really good? But anyway, that's beside the point. What kinds of things are you seeing coming across your desk in terms of how health systems are using AI?

00;04;09;15 - 00;04;44;27

Jody Rudman

Yeah, really exciting uses of AI, early uses particularly in cancer diagnosis and treatment and in the radiological space, because AI machine learning can read and detect and find, you know, nuances in pictures and X-rays and other radiographic images even faster than the human eye when it is trained. And of course, training an AI tool is a process in and of itself.

00;04;44;27 - 00;05;46;06

Jody Rudman

But the ability of trained AI tools to spot malignant growths that are not as evident to the naked eye, or to spot changes in pictures and images and X-rays, is pretty phenomenal. And then in cancer detection and treatment, it is presenting opportunities for more targeted treatment of certain cancers or certain growths, certain lesions. It is reviewing symptoms that may seem, at least at first blush, to be unrelated to each other, but over a series of population inputs, where people presenting with seemingly unrelated symptoms turn out ultimately to have the same diagnosis, it permits more proactive intervention and more opportunities to say, oh, guess what, when we combine A plus

00;05;46;06 - 00;06;41;00

Jody Rudman

B plus X, it turns out we, you know, have a higher percentage risk of Y. So those are some exciting uses of AI over the last year or so. And most of the big companies who are really jumping on board to develop AI tools in health care have made exciting announcements. For example, Google trained an AI machine to spot and diagnose and grade diabetic retinopathies faster than humans were doing it. MIT researchers developed a machine learning algorithm that registers brain scans and other 3D images about a thousand times more quickly than they were previously analyzed.

00;06;41;00 - 00;07;15;09

Jody Rudman

And so that enables more novel learning and diagnosis techniques. Stanford, for example, announced that its researchers developed an algorithm that reviewed X-rays and could detect 14 pathologies in a matter of seconds. So, you know, that's all pretty exciting. You mentioned something that was really interesting in your comments that I want to circle back to, though, and that is: humans should be treating humans, right?

00;07;15;09 - 00;07;36;28

Jody Rudman

And so all of this is incredibly exciting, but at the end of the day, I think the best opportunity for AI is as an adjunct to humans making medical decisions and treating humans, and not just ceding our health care to machines. And I think that's a pretty common fear, and, yeah, a reasonable one.

00;07;37;26 - 00;07;54;02

Meg Pekarske

So what are the unique risks that are posed when you use machine learning, risks that maybe are different or exacerbated by the fact that we're using AI tools to do some of this stuff?

00;07;54;17 - 00;08;34;02

Jody Rudman

Yeah. So, you know, consent: are patients being fully informed, and are they fully consenting to the use of AI in their, you know, diagnoses or treatment modalities? Is there a human involved in the decision-making? And what kind of disclosure are we making to patients to ensure that they are knowledgeable of it and comfortable with it? And then, of course, if there is an issue where somebody believes that malpractice has occurred, you know, where did that happen?

00;08;34;02 - 00;09;04;05

Jody Rudman

Did it happen in a machine process or a human process? Those are some risks. Other ones: you know, I think for those in the legal field, there may be some familiarity, even just anecdotally, with hallucinated output by some machine learning. It's kind of famous, or infamous, now, but there was a lawyer who filed a brief.

00;09;04;05 - 00;09;07;09

Meg Pekarske

Oh, yeah, yeah.

00;09;07;09 - 00;09;33;03

Jody Rudman

And the AI tool had just made up a bunch of case law and cited things that never existed, and that's incredibly frightening. So, you know, we want to make sure that our machines are not hallucinating any sort of output, diagnoses, decision-making and so on. And then there are the usual concerns: physician-patient relationships.

00;09;33;03 - 00;10;10;20

Jody Rudman

And wanting to protect that environment and that relationship. And then there has been some exploration of whether AI uses in health care might (and certainly this is not anyone's desire on the front end of things, of course) produce output that reflects some gender or socioeconomic or other biases, racial biases, for example, because of the input data.


00;10;11;19 - 00;10;40;24

Jody Rudman

The output is only as good as the input. If it so happens that all of this information was gained from elderly and fairly wealthy white males, well, what are we leaving out of the output if that is really the only input? So those are some of the, you know, balancing factors and risks that need to be considered whenever we're talking about all the wonderful benefits of AI in health care.

00;10;41;20 - 00;11;25;06

Meg Pekarske

So, I know in your webinar you talked about breach risks. Are there unique risks in terms of breaches when you're using these tools? I mean, obviously, the HIPAA Security Rule has been around for a long time, and people have dealt with breaches like, oh, I sent an email to the wrong person. But when you're using these tools, which are obviously all electronically based, what kinds of breaches are you thinking might be a possibility that maybe people aren't thinking about?

00;11;25;26 - 00;11;55;17

Jody Rudman

Yeah, of all the risks that I mentioned a few seconds ago, I didn't mention this one, so I'm so glad you brought it up. This is the biggest one, because mass quantities of protected health information, and all of the fields that are populated within a health care database, are a very rich target for folks who want to misuse other people's private information.

00;11;56;02 - 00;12;38;04

Jody Rudman

So, without question, access to that level of information, at that quantity and scale, presents an extraordinary risk to the folks housing that information and to the security of it. And I'll tell you, Meg, it's really interesting: the way most data privacy issues have arisen is a bad email, a phishing scam email, and somebody, bless them all, because we all make the mistake, clicking on an email thinking it's valid.

00;12;38;13 - 00;13;11;16

Jody Rudman

And you know what AI has done? AI has created better phishing schemes, more accurate-looking emails, emails that make any of us so easily prone to that click that we regret. And, you know, one bad click can create this phishing problem, and suddenly all of this incredibly rich data could wind up in terribly wrong hands. So that whole data privacy risk is really quite significant.

00;13;12;02 - 00;13;34;18

Jody Rudman

And so, you know, for companies that are excited about using AI or that are hosting this information, a lot of their thought process really should be centered around protection, avoidance of risk, and then immediate reaction if indeed there is some kind of breach.

00;13;34;28 - 00;14;14;10

Meg Pekarske

Yeah, it's interesting, the whole phishing thing, because we had a hospice client recently have someone click on an email, and we're in that process of evaluating, you know, what happened here, was anything compromised. And, yeah, whoever is generating these phishing schemes, AI or otherwise, you know, sends out some email and the like, and then you get to deal with all the passwords, like, oh, the password that I used to log in here is the same one I used there.

00;14;14;11 - 00;14;48;06

Meg Pekarske

You know, we're also humans, and we don't like remembering all these passcodes, all of this hygiene and vigilance we should have around how we use our passwords and security at work and all that stuff. And so one of the things we're thinking about: I mean, typically you think people who are trying to scam are trying to get money, right?

00;14;48;06 - 00;15;21;25

Meg Pekarske

They want to sell data or something. But then I was also wondering: obviously hospices, you know, prescribe a lot of narcotics, so would there also be schemes trying to get opioids or something? Would there be some other reason someone might target a hospice for some type of phishing? Or is it just, oh, it happens to be like all the rest of health care, they can sell the data or something like that.

00;15;21;25 - 00;15;57;11

Meg Pekarske

But anyway, it is interesting. I think the breaches that we're starting to work on are very different than the breaches of ten years ago, where I sent the erroneous email or fax, or, you know, someone left their laptop in their car and it was stolen, things like that. It's just become very, very different. But I guess, as you've worked on these issues, how do you see people changing their incident response to some of this stuff?

00;15;57;11 - 00;16;07;18

Meg Pekarske

And are there ways that you see people being able to also use technology to essentially have a better response?

00;16;08;03 - 00;16;50;06

Jody Rudman

Yeah. So believe it or not, as often as AI can be misused in the data breach context, it can also be used defensively. There are ways to utilize AI tools to build up better protections within information programs and databases. I think it's a really good idea for any host or, you know, owner of this kind of data to have not only training, but an incident response plan in place that is immediate.

00;16;50;16 - 00;17;39;11

Jody Rudman

We hear about a breach, or we get knowledge of a breach: firewalls go up, systems shut down. We have an information security officer, internal or external, and we turn to that person immediately. Crisis response is critical, because the very early moments of a data breach are the most critical moments. And then, as the hospice or health care entity or organization works through the incident, one of the most important things to do before, let's say, turning it all back on again is to make sure that all the bad guys and worms and other tiny threads of misuse are gone from the system.

00;17;39;18 - 00;18;10;23

Jody Rudman

So part of incident response is making sure that everything is clean and clear before we go back up again, and minimizing the amount of time between those two points. How quickly we can do all of that, and how effectively, is just so important. Obviously, there's going to be reporting, and there are going to be things that need to be done that are sort of public-facing when something like that happens.

00;18;11;00 - 00;18;29;26

Jody Rudman

But the internal mechanisms are really critical. So training, I think, especially, you know, the phishing scam kind of training we get right here at the firm. We see test emails every so often, and when we spot one that looks like phishing and don't click, we get a "great job."

00;18;30;05 - 00;18;30;14

Meg Pekarske

Yeah.

00;18;30;22 - 00;19;04;24

Jody Rudman

Those are really good ideas. The password thing that you mentioned, you know, unique passwords, and don't put them on a Post-it on your desk just because they're hard to remember. All those things are really important. But I do think that thinking about crisis response and incident response, what we're going to do if something like this happens and how quickly we can get it done, is key. That ounce of prevention is really a great idea these days.

00;19;05;07 - 00;19;57;01

Meg Pekarske

So I'm going to ask you this question, reading the tea leaves, because you deal with a very scary side of the government oftentimes, because you're doing criminal defense kind of work. And I don't know if you covered this at all in the webinar, but how do you see U.S. attorneys or other officials getting involved? I mean, obviously, people are used to reporting things to OCR from a privacy standpoint, but where do you think the line is where a U.S. attorney might get interested in a breach? Does it rise to some type of larger negligence in how you're maintaining certain things that exposed data? Or

00;19;57;03 - 00;20;07;02

Meg Pekarske

what do you think the government, not just regulators, but actually U.S. attorneys, might be thinking in this area?

00;20;07;15 - 00;20;53;23

Jody Rudman

Right, good question. HHS, you know, Health and Human Services, keeps a breach portal and makes reports to Congress. And, you know, the FBI is very involved in cybersecurity and cyber threats. But there really is not federal enforcement in the way that you might be thinking of. Where we see a lot of the current enforcement activity is actually with state attorneys general, and the National Association of Attorneys General involves a bunch of state agencies who will often work in tandem with one another in coordinated cases and investigations that arise in their states.

00;20;53;23 - 00;21;30;07

Jody Rudman

The states all join together because these data breaches involve so many of them, and they'll act in a coordinated fashion to move into the investigation and enforcement side of things. A lot of legislation at the state level has already been passed, or is in process in various legislative sessions, to protect data privacy. And I really think that data privacy and cybersecurity are going to be a function of state enforcement, and coordinated state enforcement, for the next several years.

00;21;30;18 - 00;21;55;00

Jody Rudman

Certainly, the federal government, and particularly the FBI and its cybersecurity task forces, will want to go after the bad guys who are doing the hacking and infiltrating. But the companies, the hospices, the health care clients who maintain the hacked-into databases are probably going to see action at the state level.

00;21;55;09 - 00;21;55;24

Meg Pekarske

Has any of this gone to...

00;21;55;24 - 00;22;40;21

Jody Rudman

Criminal? You know, not as much. But it's expensive; data breaches are very expensive. And so monetary fines and so forth will be driven by how robust your enforcement measures were, and then how robust your response and incident plan was. Another reason to have a really good one. And the plaintiffs' bar has started to get involved pretty heavily in data breach class action litigation, less so with hospice, more so with other types, but I'm not sure it'll be long before they start looking at this as an opportunity for plaintiffs' litigation.

00;22;40;21 - 00;23;18;01

Meg Pekarske

Yeah. Well, the new frontier, right? But I think it's a lot to sort of think through. I guess in some ways it's really new, but at its core it's the same stuff we've always been dealing with in some way. It's just that the scale is different. And all of this cybercrime, like being able to sell data, I mean, why do people hack people?

00;23;18;01 - 00;23;47;29

Meg Pekarske

That's sort of a new thing. But human vulnerability, you know, now it's just I click on an email, but there's always been some of this; it's just playing out, I guess, in a different way and at a different scale. Human frailty, I guess, is at the middle of it, because those phishing things, like the ones we get at the firm, do create this urgency: oh my God, someone's trying to reach me.

00;23;47;29 - 00;24;22;25

Meg Pekarske

And then, you know, you're always multitasking and trying to do ten things at the same time. And just like you said, I think about the skills you need on your team and how much you spend on IT professionals. I mean, think of what your budget was ten years ago versus what you need now.

00;24;22;25 - 00;24;54;11

Meg Pekarske

And you don't always have to employ that in-house; it can be contracted out. In the matter we're dealing with right now, you know, the client has a really good outside firm that they use that is a huge resource. And obviously, having insurance for these kinds of incidents is super important too. And I'm sure that's getting sorted out as well, what you have coverage for and what you don't.

00;24;54;11 - 00;25;23;27

Meg Pekarske

And, you know, that is a whole other podcast series, probably: insurance coverage of some of these burgeoning areas, because what you thought you had coverage for and what you actually do may be a whole different ball of wax. But anyway, really fascinating area. So thank you for learning it and taking up the charge, because law is an area for constant curiosity, right?

00;25;23;27 - 00;25;41;03

Meg Pekarske

If you ever want to learn new stuff, there's constantly stuff to learn in the law, and in applying our skills to new areas. Really interesting. Thank you, Jody. I'm sure everyone's thoroughly terrified now.

00;25;41;03 - 00;25;59;18

Jody Rudman

So I know that you want to communicate that it's a terribly exciting frontier. We're at a very exciting time; we just need to go into it with caution. But it's a pleasure to talk with you, always, and about this in particular. So I hope it was helpful.

00;25;59;26 - 00;26;02;26

Meg Pekarske

Yeah. Until next time. Thank you.

00;26;02;27 - 00;26;05;08

Jody Rudman

Have a nice day.

00;26;06;03 - 00;26;23;17

Meg Pekarske

Well, that's it for today's episode of Hospice Insights: The Law and Beyond. Thank you for joining the conversation. To subscribe to our podcast, visit our website at huschblackwell.com or sign up wherever you get your podcasts. Until next time, may the wind be at your back.

Professionals:

Jody L. Rudman

Office Managing Partner