
Weslen Lakins - Liability Landmines: Lax Security Measures Can Put Your Business in Legal Crosshairs

BSides Knoxville · 59:41 · 33 views · Published 2025-07 · Watch on YouTube ↗
About this talk
In an era of increasing data breaches, inadequate security isn't just a technical gap; it's a legal ticking time bomb. Learn how seemingly small oversights in access control and data protection can expose businesses to crippling liability, and discover practical steps to minimize your risk.

Introduction

Businesses of every size are under mounting pressure to protect sensitive data. When an organization's security posture is weak, especially in areas like access control or policy enforcement, the legal consequences can be swift and severe. In this talk, we'll dissect the intersection of cybersecurity and liability, revealing the most common pitfalls that can leave businesses exposed to lawsuits, regulatory fines, and reputational damage.

Why This Topic Matters

The growing complexity of cybersecurity means that many organizations overlook the legal implications of their security posture. Executives often view security as a purely technical concern until an incident happens and the legal fallout is severe. This session aims to illuminate how specific security failures, especially around access control and privilege management, can translate directly into legal liability.

What You'll Learn

1. Legal Frameworks & Regulations: An overview of relevant U.S. and international laws, from data protection regulations (GDPR, HIPAA) to newer legislation targeting corporate accountability (such as the SEC's cybersecurity rules).
2. Common Failure Points: How poorly enforced access policies, insider threats, and deficient incident response protocols create liability.
3. Case Studies & Lessons Learned: Real-world examples showcasing the severe financial, operational, and reputational consequences for companies that failed to protect sensitive data.
4. Risk Mitigation Tactics: Best practices for building robust access controls, continuous monitoring, and governance frameworks that stand up in court and in regulatory investigations.

Who Should Attend?

This talk is geared toward anyone responsible for or interested in cybersecurity risk management: security engineers, IT managers, CISOs, compliance officers, and legal professionals. By integrating both the technical and legal viewpoints, attendees will gain a holistic understanding of the steps required to protect not just their data, but their entire organization, from crippling liability.
Transcript [en]

My name is Weslen Lakins. I'm an attorney, and these days I'd say about 50% of my practice encompasses cybersecurity, privacy law, and contracts: all the things relevant to what we're talking about here today. I'm with Lewis Thomason, P.C., based here in Knoxville. We have a little over 130 attorneys across Tennessee, with offices in Memphis, Nashville, Sevierville, and right here in Knoxville. Our firm handles a lot of data breach response, regulatory compliance, and class action litigation, primarily on the defense side of things.

Today's talk is titled "Liability Landmines," and that's for a good reason: the modern cybersecurity landscape is quite literally littered with them. What does this really mean, though? It means every missed patch, every unencrypted database, every mismanaged vendor contract is potential legal exposure. It also means that security isn't just about keeping attackers out; it's about proving that you did everything you reasonably could to prevent them from getting in. And lastly, whenever a breach happens, everything gets scrutinized: your policies, your response times, your communications, your logs, anything you can think of.

So let's go ahead. It wouldn't be a legal presentation without a good old legal disclaimer first, so blame modern society for that. Seriously, though: everything I'm about to say is informational only and should not be taken as specific legal advice. I'm not your lawyer; we do not have an attorney-client relationship. And keep in mind that some of this is state specific, so always be wary of that.

All right. First we're going to walk through a few case studies, looking at some high-profile breaches and incidents that illustrate just how security failures can turn into a legal crisis. We won't just focus on the big names, either; I tried to include some lesser-known breaches that didn't make national headlines. Then we're going to break down the most common failure points that lead to legal exposure. This is where we shift from specific incidents to patterns: what are the common threads running through these breaches? After that, we'll move on to best practices and mitigation strategies, because knowing where companies go wrong isn't always enough; avoiding those mistakes takes proactive measures.

Then we'll have a little bonus material on a few things I've seen come up a lot recently, specifically cyber insurance, or trying to outsource the liability or responsibility for data security, along with some interesting trends I've noticed popping up in actual data breach lawsuits as more and more of them come about. Finally, we'll go through some key takeaways to wrap up the lessons from today. So that's the road map: start with the cautionary tales and case studies, look at the patterns that emerge from those incidents, transition to practical steps companies can take to mitigate those risks, and then finish with the bonus materials.

All right, so first the case studies, and I honestly think these are the most important thing you can have: learning from other people's mistakes. It's really important to grasp that security isn't just an IT problem anymore. It's a corporate liability issue, and it quite literally does bankrupt many companies, small businesses included. Here's an interesting statistic: in 2024, IBM reported that the average global cost of a breach was right at about $4.88 million. That figure does include the major breaches, the Equifaxes and Ubers, the few we're going to talk about here today, but it also includes small businesses.

Moving on to the first case study, we're going to talk a little bit about Equifax. If there is a single case that shows how a basic technical oversight can snowball into catastrophic liability, it is Equifax in 2017. Some brief facts: about 147 million people were affected. That's names, Social Security numbers, birth dates, addresses, and in some cases driver's licenses. There was $700 million in total settlement obligations, $425 million of which went to consumer restitution, so a lot of people joined in there. It also involved suits brought against the company by all 50 state attorneys general.

There were 76 days of undetected data exfiltration; the attackers had quite free rein inside Equifax's network for over two months. And why is that, you might ask? It was simply an expired TLS inspection certificate that prevented Equifax's own systems from detecting the exfiltration in real time. Pretty simple stuff, even back in 2017. The last thing that came out of this was 20 years of mandated security audits, which is quite costly; but thankfully they are a large company.
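The expired-certificate blind spot is easy to guard against with basic monitoring. Here's a minimal sketch of my own (not anything Equifax used) that checks how many days remain on a server's TLS certificate, using only Python's standard library; the 30-day warning threshold is an arbitrary choice:

```python
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(not_after: str) -> int:
    """Days remaining given a certificate's notAfter field,
    e.g. 'Jun  1 12:00:00 2026 GMT'."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)
    return (expires - datetime.now(timezone.utc)).days

def check_host(host: str, port: int = 443, warn_days: int = 30) -> int:
    """Fetch the live certificate for `host` and return days remaining,
    warning well before expiry so monitoring never silently goes blind."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    remaining = days_until_expiry(cert["notAfter"])
    if remaining < warn_days:
        print(f"WARNING: certificate for {host} expires in {remaining} days")
    return remaining
```

Run something like this on a schedule against every certificate in the inspection path, not just public-facing ones, and the "our detection tooling quietly stopped working" scenario becomes much harder to stumble into.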

Moving on to the legal fallout: patching known vulnerabilities is a legal duty. That's an important thing that came out of this case, because the breach wasn't just expensive; it was honestly pretty precedent-setting at the time, in the sense that the FTC, the Federal Trade Commission, framed Equifax's failure to patch as an "unfair practice" under Section 5 of the FTC Act, for any of you legal nerds out there. The logic was pretty simple, though: they knew about the vulnerability, they knew the patch was available, they failed to apply it, and it resulted in a massive amount of people's personal information getting lost. The takeaway here is that if a patch is released and you don't apply it, that's not just a technical oversight; it is indeed going to be a legal one.

Next, monitoring blind spots are liability multipliers. The expired TLS certificate wasn't just a technical footnote; it became the cornerstone of the FTC's case. Why? Because Equifax had systems in place that indeed could have detected the breach, and those systems were effectively rendered useless because of a problem that went unaddressed.

The next point is that documentation matters. That's everything; documentation is everything. Equifax could not produce records proving that all of their systems had been patched, and they also could not demonstrate that internal controls had been enforced: two big things. Finally, the big thing that came out of this case is that the board's accountability is discoverable. What that means is that after a lawsuit is filed, it typically goes through a process where you depose potential witnesses to elicit evidence and produce various documents throughout the case. This case also stands for the fact that the meeting minutes of Equifax's board actually showed that the security concerns had been raised multiple times but were either minimized or deprioritized, and that lack of follow-through became a key point in the shareholder lawsuits that followed.

Okay, so now we're going to talk about Capital One. This is one of my favorites. If Equifax was the cautionary tale about missed patches, Capital One is the cautionary tale about cloud misconfigurations and the consequences of ignoring known risks. Some quick facts: 106 million people were affected. There was an $80 million OCC fine; the OCC is the Office of the Comptroller of the Currency, for those of you who don't know. There was a $190 million class action settlement, with consumers suing on the claim that Capital One failed to adequately secure their personal information. And finally, audit findings were ignored pre-breach, which means they had knowledge that their current measures were not sufficient, yet they just did not care.

So what are the legal lessons here? Known risks equal increased liability. Capital One didn't just have a cloud misconfiguration. They had a documented, reported, and acknowledged cloud misconfiguration, and that distinction does matter. If your auditors are telling you that your access controls are weak and you do nothing, it's not just a technical oversight anymore; it's now a governance issue.

Next, misconfigurations are not bugs. In court, there is a big, big difference between a novel zero day exploited by a sophisticated hacking group taking control of your systems, as opposed to a preventable misconfiguration. Misconfigurations aren't sophisticated attacks; they're evidence of negligence. They're not about cutting-edge hacking techniques; that's not what's litigated in these circumstances.

The next big point that comes out of the Capital One case is that public cloud does not mean shared liability. This is actually a big misconception that comes up: just because your data might be stored on AWS or Azure or Google Cloud doesn't mean they're responsible for securing it. Now, they do have certain basic requirements, and that is actively changing, but it's in fact a shared responsibility model that explicitly states that cloud providers typically handle the infrastructure, while you handle your own data, configurations, and access policies. That can create some interesting issues down the line. Finally, audit trails must include remediation. It's not enough to just find the problem; you actually have to fix it and prove that you fixed it. This comes up more in the realm of regulatory enforcement and consent decrees, where the government is trying to ensure that businesses are doing what they say they do.

Okay, so now we're going to move on to Uber. I included this one because it's a pretty interesting situation where the CSO actually got criminally convicted. If Equifax was a missed patch and Capital One was about a cloud misconfiguration, Uber is about what happens when a security failure becomes a cover-up. This case took a breach response straight from the boardroom to the courtroom, and ultimately to federal criminal charges. The personally identifiable information of 57 million users and drivers was exposed, and over 600,000 US driver's license numbers were also lost; that was a big thing in the regulators' eyes. There was a $148 million attorney general settlement; again, all 50 state attorneys general came together in a unified settlement. They're not typically going to do that if it's a small company, but again, this is Uber, and the breach wasn't just about unauthorized access. It was about unauthorized silence: Uber actively concealed the breach for over a year, and that delay really became the crux of the AGs' argument.

Also interesting in this whole scenario is that $100,000 was actually paid to the attackers to keep quiet, right as the breach was happening. Uber's security team paid them $100,000, labeled it a bug bounty, ran it through, I think it was HackerOne, and had the attackers all sign NDAs, which they of course abided by. The problem is that this wasn't a vulnerability discovered by a legitimate researcher; obviously, it was hackers. And as I was saying, Joe Sullivan, Uber's chief security officer at the time, became the first security executive to be criminally convicted over a data breach response.

So what's the legal lesson that comes from this case? Delay does not equal discretion. Under the GDPR, companies have 72 hours to report a breach to regulators. In the US, state laws vary, but the range is typically 30 to 60 days. As I just said, Uber waited a little over a year, and that delay was treated as a deliberate attempt to obstruct regulatory oversight. The lesson from that is that cover-ups multiply liability. I already touched on the bug bounty thing, so we won't go back through that again, but I do want to reiterate that security leadership now carries personal risk, because this is the most critical takeaway of the Uber case: the Sullivan conviction set a new precedent for personal accountability. When security executives choose concealment over disclosure, they're not just risking the company; they're also risking every one of their customers' personal well-being. Why does this matter? Uber wasn't just fined; they were criminally prosecuted.

Okay, so now we're going to talk about the Marriott Starwood breach. It's a little bit different of a situation, but it taught the world a painful lesson: you don't just inherit assets whenever you acquire a company, you also inherit its security problems. 339 million guests were affected. The breach began in 2014 and was discovered in 2018. The fine was £18.4 million, under the GDPR, I believe; yes, so that's the UK. And finally, what this case really stands for is the failure to detect an ongoing breach post-acquisition. Marriott wasn't just penalized for the breach itself; it was penalized for failing to discover the breach during and after the acquisition process, despite going through pretty extensive due diligence on all of Starwood's records at the time.

So what are the legal lessons here? M&A deals must include cybersecurity due diligence. Whenever you acquire a company, or sell your company, be prepared: you're not just buying or selling assets, you're also transferring exposure to the new company. In this situation, if Starwood has a malware infection that persisted for years, Marriott inherits that infection along with the hotels.

Due diligence isn't just a checkbox. This wasn't a superficial oversight; it was a systemic failure to adequately assess Starwood's cybersecurity posture. It was more of a situation where they almost looked the other way because they really liked the property. Also, you are responsible for inherited systems. Marriott tried to argue that the breach actually started before they even contemplated acquiring the company, so they shouldn't be held fully accountable. The ICO disagreed and said that once the acquisition closed, Marriott was responsible for securing Starwood's systems, including any lingering malware or back doors. The implication: once you own it, you own its security risks, and ignorance is not a defense. Finally, regulatory timelines do matter. Marriott's argument that the breach originated pre-GDPR again didn't hold up.

Okay, so now we're going to talk about Morgan Stanley. If Equifax was about missed patches, Capital One was about cloud misconfiguration, and Uber was about concealment, Morgan Stanley is about a different kind of negligence: asset disposal. There were no attackers and no sophisticated exploit, just poor vendor management and careless data disposal. The big points about this one: first, no encryption. The servers and laptops contained sensitive customer data, but the data was stored in plain text, completely unencrypted. Had it been encrypted, Morgan Stanley might have been able to claim a safe harbor that exists under most data breach laws; that is a beautiful thing. Instead, the plain-text storage became a pretty critical factor in the SEC's finding that Morgan Stanley had violated the Safeguards Rule.

Second, no verification, no defense. Morgan Stanley tried to rely entirely on the vendor's assurances that the data wiping had been completed. There was no independent verification, no chain-of-custody documentation, no forensic spot checks. Basically, they took their word for it and weren't going to ask questions about how much it cost. So what's the bottom line here? If you're decommissioning assets, you need to verify that the data wiping procedures are actually being followed. If you're expecting it to cost $10,000 and it costs $1,000, that's a pretty big red flag. That's the lesson from Morgan Stanley.

Okay, so now we're going to talk about Anthem, and this gets into healthcare and HIPAA, which is an even higher standard with higher safeguard requirements. If Morgan Stanley's breach was about failing to dispose of assets carefully, Anthem's breach was about failing to secure the assets in the first place. A few quick facts: 78 million records were breached, the OCR settlement was $16 million, and a $115 million class action settlement came out of this. The two big things that really got Anthem were that they were missing multi-factor authentication and that they had no enterprise-wide risk analysis. Those two things allowed the attackers to gain access with a simple spear-phishing email campaign that ended up compromising a lot of admin credentials.

So what are the legal lessons here? The HIPAA Security Rule is not optional. The Security Rule requires covered entities to implement administrative, physical, and technical safeguards to protect ePHI. That's pretty standard; hopefully, if you're in the healthcare industry, you're following it. Enterprise-wide means comprehensive: conducting risk assessments at the department level doesn't satisfy HIPAA. The OCR said they want a comprehensive, enterprise-wide analysis that maps out all the systems, all the data flows, and all the access points. OCR, by the way, is the Office for Civil Rights. I know there are a lot of acronyms here; I'm trying to make sure I point them out as they come up.

Oh, I forgot one there: civil liability compounds with regulatory fines. That's an important thing that comes out of this as well. $16 million went to the Office for Civil Rights, and $115 million went to consumers. So it's not just the government you have to worry about now; it's also plaintiffs' attorneys. Bottom line: if you're handling PHI, every system, every database, every admin account is a potential liability.

Okay, so now we're going to move on and talk about some common failure points and the broad-strokes legal frameworks that apply to cybersecurity specifically. I didn't include too much about privacy, even though privacy law and cybersecurity typically go hand in hand and can lead to some further unintended consequences that I'm not discussing here, because inherently a data breach is a breach of privacy.

So, failure isn't random. It's repetitive. Here are the six most common failure points that, at least at my firm, I think we see lead to the most risk developing within a business. First up, weak access control. This is the front door to your entire network, and yet time and time again, people leave it wide open. With Anthem, admin credentials were compromised through a spear-phishing email and there was no multi-factor authentication; with Uber, it was credentials left exposed, I believe, in a private GitHub repository. Weak access control is definitely, I would say, number one.

Next, unpatched or misconfigured systems. This is the easy one that I hate to see happen, but they're hidden landmines in every network: vulnerabilities that are known and documented and often left unresolved. Then, missing monitoring or logs. You can't detect what you're not monitoring. Attackers had four years of persistent access in the Starwood Marriott breach, so the argument that they missed that in their logs or in their monitoring efforts within the organization was not persuasive, to say the least.

Then there's poor incident response and delayed disclosure. This has actually gotten a lot better. I think that has to do with the fact that we as attorneys, and our government, have been guilty of not sufficiently advising everyone of their responsibilities in these circumstances, but the disclosure requirements after a breach are a big thing that you have to follow, and that really does hurt you if you don't.

A breach is always bad, but a botched breach is even worse; the prime example of this would be Uber. Absent governance or follow-through is another big one. A lot of companies have policies on paper, but nowhere else. With Capital One, internal audit reports flagged serious security weaknesses, but remediation was non-existent. It's not enough to be told about the problems; you have to fix them. And finally, vendor and supply chain gaps. This has become pretty prominent recently, and I think that's primarily because your vendors are almost always your weakest link, and attackers know that. Morgan Stanley is the best example of this: they relied on a pretty unqualified asset disposal vendor, failed to verify its data wiping, and ended up paying about $155 million in regulatory fines and settlements.

Why does this matter? Because most of these things are not sophisticated, highly technical failures. They are basic, preventable failures, and that's what you hope, and ideally will, stop. So what's the bottom line? These failure points aren't typically isolated incidents, either; they result from systemic patterns, the same patterns we see breach after breach, lawsuit after lawsuit. If you just need six things to think about in terms of what's going to create the most risk for your business, these six are, I would say, the underlying problem in 90 to 95% of the data breach cases I have seen.

Okay, so now we're going to talk about the legal frameworks that define "reasonable security." We won't spend too much time on this.

Okay. First, the GDPR. Depending on whether you're doing international business, this is the one that's often viewed as the gold standard for data protection. Article 32 of the GDPR is where the rubber meets the road, so to speak. It requires companies to implement, quote, "appropriate technical and organizational measures," unquote, to protect data. And what does "appropriate" mean? That's pretty broad, right? Subsequent litigation has fleshed out what it means to be appropriate: access controls based on user roles, encryption of sensitive data at rest and in transit, and regular penetration testing and vulnerability assessments. Those are all very appropriate things to do.

Next, HIPAA. This is healthcare specific, but its Security Rule requires safeguards, audit logs, and risk analysis. This is the baseline rule for reasonable security, and HIPAA does spell it out nicely for you. The key requirements include administrative safeguards, physical safeguards, and technical safeguards. What are those? Administrative safeguards would be things like risk analysis, employee training, and incident response. Physical safeguards are pretty self-explanatory: facility access controls and workstation security. Technical safeguards are unique user IDs, automatic logoff, encryption, and audit controls. Those are some big things there.

The SEC cyber rule is kind of new. Not too new, but it mainly applies to public companies, and it requires them to report to their shareholders within four business days of determining that a cybersecurity breach was material. What does this entail? A lot of extensions, because almost no one can figure out what happened in four days. But the main thing here is just to make sure you're telling your shareholders that something material and adverse has happened to your company. That's basic securities law.

The FTC Safeguards Rule is also relatively new, and it targets non-bank financial institutions. I know that sounds weird, but that's everything from mortgage brokers to payday lenders. It's also not really just about protecting customer data; it requires a documented program that can withstand regulatory scrutiny. And what does that documented program include? Risk assessments, multi-factor authentication, incident response plans, vendor management: all things we've been talking about here.

Finally, the 50 state breach laws. This is what I've been talking about as well: we're operating under a patchwork system right now, with no broad governmental entity in control. There's no, I forget what they call them in Europe, I'm losing the term right now, but essentially no single authority that handles all things related to privacy and cybersecurity. That has resulted in 50 different laws, one in every state, which is hard to follow, especially if you operate your business in multiple states. These laws usually describe reasonable security in vague terms, but they all require timely notice to affected consumers. That's the 30 to 60 days; that's typically the time frame.
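Those notice windows are easy to track programmatically once you log the discovery date. As a minimal sketch (the 30- and 60-day figures are the general ranges mentioned above, not any specific state's statute, and when the clock starts varies by jurisdiction):

```python
from datetime import date, timedelta

# Illustrative notice windows only; real deadlines depend on the
# jurisdiction and on when the clock starts (discovery vs. determination).
NOTICE_WINDOWS_DAYS = {
    "GDPR regulator notice": 3,   # 72 hours
    "State law, short end": 30,
    "State law, long end": 60,
}

def notice_deadlines(discovered: date) -> dict:
    """Map each regime to the last calendar day notice can go out."""
    return {name: discovered + timedelta(days=days)
            for name, days in NOTICE_WINDOWS_DAYS.items()}

for name, deadline in sorted(notice_deadlines(date(2025, 7, 1)).items(),
                             key=lambda kv: kv[1]):
    print(f"{name}: notify by {deadline.isoformat()}")
```

Even a toy tracker like this makes the point of the Uber case concrete: the deadlines are measured in days, not months, so incident response plans need a notification clock built in from day one.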

Could you say that again? Yeah, I actually hadn't heard about that until recently. One of my partners is actually the one who told me about it, and I haven't had a chance to look into it. I know it's definitely going to be something that impacts the whole international data flow exchange thing, which I don't know why I'm blanking on.

Yeah, people are asking questions. Yeah, with the Cyber Resiliency Act, you'll lose your CE stamp on any products that are manufactured that don't meet it. And there are timeliness requirements that as a framework. And then the other question I had was with the riskmanagement framework um which also has notification requirements. Okay. Interesting. And that's you said that's a new EU act. EU is Cyber Resiliency Act. It was published back in 2016 and it goes into full effect in 2027. If you don't meet the Cyber Resiliency Act, you cannot sell products in the EU and you'll lose your CE stamp. Okay. There's also if with the in the US there's the risk management framework there's notification with continuous

monitoring, where you have to notify your authorizing organization if there are any breaches. [Speaker:] Yeah, like I was saying earlier, not every company that I deal with is international or sells things internationally, or, sorry, collects data of EU citizens, which typically seems to be the linchpin there. Though I'm sure they wish they did. So, to get back into things: now we're going to talk a little bit about mapping failure points to legal exposure. This is kind of a nice,

helpful chart that shows what each failure point results in. An unpatched system: Equifax, $700 million. No risk analysis: Anthem, I believe, under HIPAA, $16 million.

Okay. So, access control isn't just an IT function; it's a legal requirement. And what does that mean? It's one of the most heavily scrutinized areas in post-breach investigations, and it's explicitly mandated by the GDPR, HIPAA, the FTC Safeguards Rule, and state breach laws. Let me get these up.

So, enforcing least privilege and role-based access. The principle of least privilege, or PoLP, means that every user, process, or service gets only the minimum access necessary to perform its role, and I'm sure many of you know that. But what does this mean in practice? Number one, it means implementing role-based access controls across all of your systems, not just your most important ones. Number two, deploy multi-factor authentication for all privileged and remote accounts. This seems pretty standard nowadays, but you'd still be surprised at the number of employers with hybrid employees who do not have multi-factor authentication. So they're accessing their

employer's assets on unsecured Wi-Fi, a lot of times without that second authentication factor. That's a pretty big risk. Another thing that helps a lot is automating offboarding and access reviews, and this is about what happens whenever someone leaves a company. If your offboarding process isn't automated, you're gambling with open access. When someone leaves your company, it's pretty industry standard to have all of their access automatically removed. Dormant accounts were actually a key factor in Uber's breach. Number four is to log and monitor all admin activity. It's not
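To make least privilege concrete, here's a minimal role-based access check in Python. The roles and permission names are hypothetical, purely for illustration; the point is the default-deny shape, where anything not explicitly granted is refused.

```python
# Minimal role-based access control sketch: each role maps to the
# smallest set of permissions needed for that job function.
ROLE_PERMISSIONS = {
    "payroll_clerk": {"read:payroll"},
    "hr_admin": {"read:payroll", "write:payroll", "read:employee_records"},
    "contractor": set(),  # default-deny: no access until explicitly granted
}

def is_allowed(role: str, permission: str) -> bool:
    """Default-deny: unknown roles and unlisted permissions are refused."""
    return permission in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("payroll_clerk", "read:payroll"))    # True
print(is_allowed("payroll_clerk", "write:payroll"))   # False
print(is_allowed("former_employee", "read:payroll"))  # False
```

Note that the offboarding point falls out of the same structure: removing a departed employee's role mapping immediately revokes everything, which is why dormant accounts with standing grants are so dangerous.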

always enough to just control access; you also need to monitor and log it. Every administrative action, every login attempt, every failed password reset. And why is that? Because when breaches occur, regulators and plaintiffs ask for evidence that you were monitoring your critical systems. Did you log when admin accounts accessed sensitive data? Do you have records of privilege escalations? Can you produce a comprehensive log of login attempts, IP addresses, and session durations?
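A sketch of the kind of structured admin-activity logging described above. The field names are assumptions for illustration; what matters is that each record captures who, what, when, from where, and whether it succeeded, in a form you can later produce as evidence.

```python
import json
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO, format="%(message)s")
audit_log = logging.getLogger("audit")

def log_admin_action(user, action, resource, source_ip, success=True):
    """Build and emit one structured audit record per administrative action."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "action": action,          # e.g. "login", "privilege_escalation"
        "resource": resource,
        "source_ip": source_ip,
        "success": success,
    }
    audit_log.info(json.dumps(record))
    return record

log_admin_action("admin_jane", "login", "billing-db", "203.0.113.7")
log_admin_action("admin_jane", "password_reset", "billing-db",
                 "203.0.113.7", success=False)
```

In a real deployment these records would ship to a tamper-resistant store rather than stdout, since logs an administrator can edit are worth little in front of a regulator.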

Okay, mitigation tactic number two is monitoring and detection that closes the gap. This is pretty technical stuff here about security information and event management. A SIEM isn't just a dashboard; it's your single source of truth during post-breach investigations. The key function of a SIEM is normalization: collecting logs from endpoints, firewalls, databases, cloud services, and identity providers, then correlating those logs to identify patterns. Then there's user and entity behavior analytics, or UEBA. If the SIEM tells you what happened, UEBA tells you what's unusual; at least that's my understanding. So, if a payroll clerk who usually downloads five megabytes

a month suddenly pulls 50 gigabytes of customer data over the weekend, that's going to be an outlier, and it's going to look bad if you do not catch it. Why does this matter legally? Because it speaks to foreseeability. Next is data loss prevention. DLP is really your last line of defense when SIEM and UEBA fail. It monitors data at rest, in transit, and on endpoints for unauthorized transfer. An example I have here: if a user tries to upload one gigabyte of encrypted data to a personal Dropbox account, your DLP should flag that and block it. If not, it's not going

to hold up in court, and it's not a successful mitigation tactic. Oh, and log retention and review. That was the last one there, and it's pretty self-explanatory: retain your logs, review them, and analyze what they say.
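The payroll-clerk example boils down to baselining normal behavior and flagging large deviations, which is the core of what UEBA tooling does. Here's a toy version of that check; the thresholds and numbers are arbitrary, purely to illustrate the idea, and real UEBA products model far more than a single statistic.

```python
from statistics import mean, stdev

def is_outlier(history_mb, current_mb, sigmas=3.0):
    """Flag a transfer that exceeds the user's historical mean by more
    than `sigmas` standard deviations (with a 1 MB floor on the spread
    so a very flat baseline doesn't flag trivial variation)."""
    mu, sd = mean(history_mb), stdev(history_mb)
    return current_mb > mu + sigmas * max(sd, 1.0)

# A payroll clerk who normally downloads about 5 MB a month...
history = [4.0, 5.0, 6.0, 5.0, 4.5, 5.5]
print(is_outlier(history, 5.2))      # False: in line with the baseline
print(is_outlier(history, 50_000))   # True: a 50 GB pull gets flagged
```

The legal point the speaker makes is exactly this: if a 50 GB weekend pull sails past a threshold like that unflagged, the harm starts to look foreseeable.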

Mitigation tactic number three: vendor and supply chain risk management. We've talked a little bit about this throughout the entire presentation, but what does this really entail, and what are some good tactics to employ? Security assessments pre-engagement: before you sign the contract, know who you're dealing with. This is where you require SOC 2 Type 2, I believe, ISO 27001, or similar security attestations from critical vendors. If they're handling PII, PHI, or financial data, you can even go further and demand recent penetration test reports, CAIQ assessments, and executive risk summaries. But it's not necessarily enough to just collect reports. You do have

to act on them. Another important thing here, a really useful mitigation tactic, is security clauses in contracts. This is number one for reducing supplier and vendor liability. First of all, if it's not in writing, especially in Tennessee, it doesn't exist. I'm trying to speed up a little bit, but what this really means is requiring vendors to maintain, quote-unquote, "industry standard" controls, which includes encryption, multi-factor authentication, regular security audits, those types of things. Another important thing that I've seen recently, and that I've been instructing all my clients to do, is to

insist on 24-hour breach notification clauses, and also indemnification for security failures and audit rights. Those three things can help in the future, whenever you have a two- or three-year contract up for renewal, to actually assess how the vendor has done. Next, ongoing oversight. Really, that's just: watch your contractors the entire time they're working for you. Don't let them go off and operate on their own in their own worlds; ask what they're doing and how they're doing it. Then, asset disposal and data retention. We've also talked about that a little bit, but it's more than just wiping a drive;

it's confirming that it was wiped, and this is in fact a legal requirement now. Sorry, lost my place there. One thing that is a nice mitigation tactic here is a certificate of destruction, which is typically where the third party has an independent auditor come in and say, "Yes, we've done what we say we've done." That's a good way to de-risk yourself a little bit more if you don't want to check the work of your vendors yourself.

Oh, this is actually kind of an important one: the shared responsibility matrix. That's what I was talking about earlier with the cloud platforms. It's really important to clarify who is responsible for each security control, whether that's going to be the vendor or your client, or whoever. Are you going to be responsible for data encryption, or is the vendor? Who's going to manage patching? If you're breached, regulators are basically going to ask: whose job was it to do this?
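One lightweight way to capture the matrix being described, so the "whose job was it?" question always has an answer. The controls and owners below are made up for illustration; the useful property is that an unassigned control fails loudly instead of being silently assumed to be someone else's problem.

```python
# Shared responsibility matrix: security control -> accountable party.
RESPONSIBILITY = {
    "data_encryption_at_rest": "vendor",
    "os_patching": "customer",
    "access_reviews": "customer",
    "physical_datacenter_security": "vendor",
}

def owner_of(control: str) -> str:
    """Return the accountable party for a control. Unassigned controls
    raise, forcing an explicit decision up front rather than a gap
    discovered during a post-breach investigation."""
    if control not in RESPONSIBILITY:
        raise KeyError(f"No owner assigned for control: {control}")
    return RESPONSIBILITY[control]

print(owner_of("os_patching"))  # customer
```

In practice this lives in the contract and its security exhibit, not in code, but the same rule applies: every control gets exactly one named owner.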

Okay. So, mitigation tactic number four: incident response and breach disclosure readiness. This one's pretty self-explanatory; it's pretty much just making sure you're aware of all your timelines and disclosure requirements whenever a breach actually occurs. And it's never going to be sufficient to say "ongoing investigation" and use that as an excuse for why you didn't disclose what you were supposed to in the time frame allotted. Okay, two more mitigation tactics. Number five is board governance and security accountability. This goes back to the fact that courts are often going to ask: what did the board do? What were they aware

of? What types of policies did they try to disseminate throughout the organization, those sorts of things. Some helpful things to tell your CEOs and CISOs to do, or to do if you are the CISO: quarterly cyber briefings; a board-approved cyber budget, which is something new that I've seen companies doing here, even smaller companies; audit remediation tracking; and risk appetite statements. I'm a little conflicted on those, because it's not entirely settled or played out in the courts whether or not they would be discoverable. The last thing you want is to write

something out and sign your name to it saying how willing you are to lose your customers' data. That's not going to look good in front of a jury. So, mitigation tactic number six: documentation as a legal defense. This is honestly, I think, the most important mitigation tactic you can have, not only for data breaches but for literally everything, and that is to keep records. Keep records of everything. Don't ever throw them away. Though I think there's a seven-year retention requirement in Tennessee, but that's kind of beside the point. Documentation is your best friend, especially in a lawsuit. So,

what does that include? What documentation am I talking about? Security policies and procedures. It's not just word of mouth, and it's also not emails. I've seen that before, where their policy is to email things around. That's typically newer startups; their policies are changing a lot, so they don't want to go through the trouble of putting together a corporate or employee handbook with all the different policies and procedures. So they have a constantly evolving document that everyone contributes to, which, again, is not the best thing. But it

is important to document those security policies and procedures. Another important thing is security awareness training logs. It's not enough to just say that you told all your employees what to do and what to look out for. It seems kind of ridiculous and like handholding, but you need to have them actually sign something saying, "I attended this training and I learned what I needed to learn." Next, incident response runbooks and debriefs. What is that? It's really a postmortem report, or a debrief, that outlines the timeline, actions taken, and lessons learned from a security incident. I'm always hesitant to put this kind of

stuff in writing as well, but for the bigger institutions this is a pretty standard documentation procedure.

Then, GRC systems and board reporting. GRC is governance, risk, and compliance, and these are platforms that track security initiatives, audit findings, and risk mitigation efforts. Why is it important? Because the SEC's 2023 cybersecurity rule explicitly requires disclosure of board oversight of cybersecurity risk.

Yeah, I think that'd be pretty reasonable. I guess it's also pretty important to make sure that everyone has access to the most up-to-date version, and that if you do have version control, you're consistent, without error, in updating it. Just to repeat the question: for security policies, would you keep a hard copy with version control? Yes, I think that's a good idea and acceptable. A lot of people do that: version 1.0, 1.1, 1.2. But real quick here, I'm just going to get

into these bonus materials. This is actually, I think, pretty helpful, considering the prominence of cyber insurance these days. So, why can't you just insure it away? The limits of cyber insurance.

There are coverage limits in almost every policy, and they almost always exclude coverage for regulatory fines, or for fines imposed due to negligence or willful misconduct. So it doesn't give you a free pass to have no security in place, essentially. That's a big point there. Then there's the plaintiff's dilemma: the difficulty of proving data breach damages. This is something interesting that I've seen recently. As these lawsuits are getting to their final stages, it's been pretty hard for plaintiffs to prove that they've actually suffered damages.

Like I was saying, in the Equifax breach, they lost almost half of the United States population's data. So who's to say that someone on the dark web hasn't already gotten your data from a subsequent breach? That's the argument that defense attorneys are making right now to limit companies' liability. And obviously there are ways to counter that. Oh, and this is kind of the last thing here: we've got a legal liability landmine checklist.

This is just something useful to think through whenever you're making your own policies, a general guide that I try to give my clients, at least. Oh, I forgot governance.

We'll see if this works. This is just a nice little fun poll. So, choose the top area where you feel your organization is most exposed. Feel free to send in a response, or just yell it out. "Is 'all of the above' an option?" Right. Oh, I don't see it.

All right, I think that is all the time we've got. Yes, that was the last slide there. Excellent. All right, big round of applause. Thank you, guys.