
Nathan Wenzler - Mastering the Three Levels of Risk

BSides St. John's · 39:19 · Published 2025-05
About this talk
BSides 2023

Chris will introduce our next speaker.

All right, everyone. Welcome back. Next up we have Nathan Wenzler with Mastering the Three Levels of Risk. Thank you. Appreciate that. All right. Hello everyone. Good afternoon. Hopefully everyone is fed and not quite in a food coma yet. Gonna try to keep everybody awake here for a little while, so hopefully this works out. So before I jump into the talk, I want to talk a little bit about the talk. I work for a company called Tenable, which happens to be up on the screen there. Many of you probably know Tenable as the creators of Nessus. You know, we're a company that's been in the vulnerability assessment business for 25 years now.

I'm not here to talk about that. I get to live in kind of a unique position in the company where they kind of just let me talk about whatever I want, which is kind of cool. So what I'd like to do today is talk a little bit about risk communication, right? It's a major, major problem for a lot of organizations. Everyone seems to struggle with it, and I've had a lot of opportunity in my career to fail at this in many, many ways. So I thought what I'd do today is talk a little bit about tips and tricks on how to communicate about risk up the org, down

the org, laterally, with everybody you're trying to deal with in your organization. Offer some tips, some strategies; maybe they help you, hopefully. And I'll show you some examples at the end which, in full disclosure, yes, come from a Tenable product, but it's only because that's the tool I have access to. So think of it as an example. I'm not going to talk about product. We have people here who will talk to you all day long about that out in the hall. That's not my job here today. So if you have questions, thoughts, please speak up. I'm happy to make this a bit of a conversation. And with that, we'll talk a little bit

more about, well, me. I hate slides like this. I really do. I don't generally talk about myself in these things, but I do it here because, again, I've failed a lot at this. And I've been in security over 25 years. I started in government in the United States, working for state and local agencies, building out security programs, quite frankly, at a time when security didn't exist. We were the IT department, and often we were the people that kind of sat in the corner of the cubicle farm, every now and then raising our hands saying, hey, you know, not sure it's a really good idea that we're letting all the government

employees download porn to the taxpayer systems. Can we not do that anymore? We were trying to do the right thing. We were trying to protect these systems. We weren't very popular with all the users, for some reason. But that's what we were trying to do, right? And that's kind of how security formed in a lot of agencies. We came out of technology, trying to do the right thing to mitigate risk. We didn't think of it that way, but that's what we did. I did that for about 15 years. Got out of the public sector, went into the private sector, started managing teams, became a CISO for a few

companies, so I ran security programs overall. A whole different world, as you might imagine. Got a little tired of that. Went into consulting, primarily at sort of the management level. I did a lot of work with C-suite players; often got brought into, like, big financial institutions who just wanted to understand why their security program wasn't working. Spoiler alert: it's usually the C-suite. Which, by the way, if you ever get the opportunity to consult like that, man, it is a great job. Like, coming into a place where you've already been paid. Well, your company's been paid. You don't get the money, but your company gets the money. And you sit down and

tell a room of people in the C-suite who think they've been doing it all right, and you go, "No, no, you haven't been doing it right." There's something kind of satisfying about that, I have to tell you. Don't do that to your own C-suite. That will limit your career opportunities. But consult for it. It's good. So, I've been kind of all over the place, right? I've been in the trenches. I've been in management. I've been an adviser. I've kind of seen it done a lot of different ways. That's why this is up here. Not to tell you that I've done it right or better, but just to tell you that I have worked in just

about every industry or alongside it: law enforcement included, manufacturing, shipping, restaurants, finance, healthcare. I've kind of seen a lot of stuff. And so that's where a lot of this comes from: my experience with this. When we talk about risk communication, you know, it's a little bit different for every organization, but I find that there's a lot of similarities. And the similarities usually start with how organizations look at security inside their company. So, like, for me, I'll give you an idea of kind of what my life looked like as a CISO. Now look, this is just, sorry for the small font, this is just an example. Don't take this to heart. And this is

representative of teams, not people. So, like, right here, I have the CISO here, but this is really the security team. I didn't have enough font space for everything, sorry. But in a lot of organizations, one of the ones I used to work for at least, this was common: the CISO answers to the CIO, or the IT department essentially. Now, this gets different in some orgs. Sometimes it's the CEO. I've seen CISOs answer to the CFO, sometimes general counsel. It depends on the business model, depends on how the organization looks at risk, how they think about this stuff. But for me, this is where I lived. Now, for my organization, I'm based

on the west coast of the US; the company I worked for was all headquartered on the east coast. Time is a challenge sometimes, and so, like, for me, on a given day, my day always started with a phone call from that person. And unfortunately, my CEO was not only a little bit of a panicky person who took newspaper headlines a little too seriously, he also had no respect for those time zones. So I would get the call at 5:00 in the morning, panicked: "I just read in the Wall Street Journal that one of our competitors has been hit with ransomware. Are we next?" Like, dude, I'm asleep. What are you asking me

these questions for? You know, I'm not looking at anything. Come on. But hey, that was the start of my day. Whether I liked it or not, that's now my day. I'm going to get my team up whenever I get online and say, "Listen, we had a plan for today, but the CEO wants to know if we're next." And so a lot of effort is going to go into answering the question of: are we vulnerable in those places? Did we patch? How does this particular piece of ransomware work? And so you spin off trying to solve that problem. But of course, five minutes after you start that process, I get a call from the CFO, who then says, "Well,

hey, I got your budget request. You want another full-time employee and a bunch of new tools. Didn't we give you money like six years ago? Come on." Okay, fine. So now I've got to have that conversation. I've got to stop the ransomware conversation. I'm going to talk to the CFO about why we need to justify another person, why we need to justify the expenditure for new tools, and why it's not really that much money. I have to have that conversation. Of course, as I come out of that, then legal gets me. "Hey, we're moving into the EU. Do we have to worry about that GDPR privacy something? There's, like, something they're doing there. Do we

care about that?" Yes. But now I've got to stop and answer that question, right? I've got to pull out the regulation and find out: are we really going to be beholden to it? And I've got to start becoming a legal expert. And you see how this is going, right? I worked for a software company at the time. So, like, you know, my sales team would come get me. "Hey, can you talk to this customer and tell them about our SOC 2 report? Can you help tell them we're super secure and everything's great?" Sure, when I get done with all the other stuff I'm doing. Fine. Compliance: we had to do PCI compliance. So, of course, I

have those folks I've got to deal with on a regular basis to make sure the audits are done, the reports are in order, and we're not going to get fined. And that's a whole different conversation, because they don't care about the rest of it, but they care there. Then, of course, as a tech company, product management loved getting me involved, because I'd been in the trenches for a long time, so I was a good internal person to ask about whether this will solve a problem for our customers. Like, dude, I've got my own problems I'm trying to solve, but okay, we can have that conversation. I've got to get development involved. Marketing wants me

to do things like this. You know, I'm happy to do that. Development teams, I've got to work with those guys, make sure that they're actually coding everything we do in a secure way, because, as we all know, developers are awesome at secure coding practices. Why are some of you laughing? That's not funny. No. Okay. And then, of course, just day-to-day work, right? When you're doing some of the basic security technical stuff, you've got to work with the IT folks, the admins, and everybody to make sure patches are getting deployed, firewalls are configured correctly, all the stuff we do on a given day. You're talking to administrators all throughout the company, helping them understand why

they actually need to do the stuff you're asking them to do. And really, it's kind of the whole IT department. Whoever is there: DBAs, web application experts, cloud security architects, whoever. Now, this is a mess, but I bet any of you who work in security, which is all of you, this probably looks pretty familiar. On a given day, we get pulled in a lot of different directions. And the interesting thing about this problem is it is reflective of something we don't talk about openly in the industry, right? Which is: though we come from technology, and a lot of us built our careers coming out of IT, today security is not an IT function. It's a

risk-management function. And all of these asks, all of this, is happening because everybody sees us as experts in understanding risk, and they have questions. That's why we get asked about all this stuff. But think about almost any other job in a company. If you work accounts receivable, you know, you process POs and push money from one account to another. You can pretty much focus on that job all day. I'm simplifying, but you understand. This is really hard to focus on, but this is what we're being asked to do, because we're trying to have these risk conversations with all these different business units in our organization. People have different needs and different levels of

understanding about what risk is. The lawyer doesn't see risk the same way a CFO does. Close, but not quite. But we're in the middle. Whether we like it or not, this is where we are. And it presents for us, really, an opportunity to be the translators between all of these groups when it comes to risk. And that's what we're going to talk about today a little more: how do we structure this so it's not so chaotic, and how can we help everybody kind of speak the same language, talk about risk in the organization the same way, and ideally help everybody make better decisions about what to do and not be running around like

this. Make sense? Okay, I see some nodding heads. Good. All right, so here's my question for all of you. If you're looking at this list, on a given day, one single day, how many of you in your job do at least half of these things in a single day? Hands. A couple. Keep them up. I want to see some hands. How many of you do? Really? Some of you don't do more than half of this in a given day? You guys have great jobs. For those of you with hands up, look around. You're not alone, right? There are a lot of practitioners out here who do all of these things in a given day. It's hard to focus. It's really hard to be

able to do any one of these things, let alone six or seven or eight of them. And that's what we're up against, right? We have to try to get our arms around this, narrow this down a little bit so it's not quite as complicated. So the challenge, of course, and this is why my talk is what it is: we have to be all these experts. Now, I'm only wearing one hat today because I'm not a CISO anymore. I don't have to. But this was kind of my life, and I suspect this is a lot of your lives. We're expected to be experts in a lot of different areas, not just technology. But we've got to find a way

to talk about this in a way where everybody understands us, from a level playing field. So one of the things that I've done, that I've put together, is a very, very simple framework. Not reinventing the wheel here. This is not, like, a doctoral thesis. Just trying to make my life easier. This is something that I used to build when I was a CISO, when I was trying to communicate in my organization: the types of risk decisions that get made in every company can really be broken down into three levels. Now, what I want to highlight here is that these jobs are not absolute in these roles, and it's fluid.

Sometimes somebody who's, like, a director of security may be called upon to make executive-level decisions because they're the person who has the expertise. So it's not so much about the positions. I have them here just as sort of a frame of reference, so you kind of understand the people you might run into who make these decisions. But it's really about this: there are some risk decisions that are made at the executive level. There are different ones made at a strategic level, kind of a program level. And then there's the stuff that happens tactically, every single day, from your operations folks. If you can break the risk conversation down into those three levels, suddenly I don't have to have 12

different conversations. I can have three different conversations. It's not perfect. It's way better than 12. And if you build something like this, you can start to help the other folks understand how this relates. You can get the rest of the organization kind of in the same mode and let them almost talk amongst themselves, which is, like, the ideal world, right? If you can get them to do it on their own. I know, I know: humans. I've met them, too. But this is where we want to go, right? Think about the risk decisions that happen at these three levels. If you can do that, and you can build metrics around this, if you can build communication strategies around

this, the most important thing you get out of it is a lot more buy-in. Because if you can help the people at each of these levels, who are trying to make decisions in the organization, in a way that's meaningful just to them, you win. So, questions, thoughts, before I explain a little bit more about this? Any thoughts, questions? This is where the food comas kick in. Go ahead. What type of metrics? I'm going to talk about that, at least. I'm going to show you an example. So, for those of you who didn't hear, the question was: what kind of metrics do you present? Metrics are a big part of this, and there's a lot of ways to

do it. I'm going to tell you right now, there are a lot of ways to do it, and there is no single right answer. I'm going to talk about some strategies at each level, about metrics you can use, and I'll show you an example just to visualize it. That's the product thing. Sorry, I have to do a little bit of that. Come on. But I'll show you a little bit at the end, an example. But really, you have to tailor this to your work, to what's meaningful to you. So you'll hear me say that a lot today. Anything else? Cool. Let's dive in. So, at the executive level, you know, CISOs have kind of a dual problem. They themselves

are typically an executive in the organization. They have a whole business unit they have to manage, and they make executive-level decisions about the security program. But they're also usually the adviser to the rest of the C-suite, and so they have to be able to help empower those folks to make good risk decisions as well. The challenge, again, is that if we go back to this idea that a lot of us come up out of technology and not necessarily out of risk management, we often approach this problem with what's familiar, which is technical metrics, right? And that can be a big problem, because the people you're talking to at this level, number

one, don't understand the context of what you're talking about, and, number two, don't really have time to hear it. These are people that have a lot of decisions they've got to make on a given day, not just about you and your security program. They've got to make other business decisions that affect the course of the whole company. And it's one of the reasons why you see a lot of things like stoplight-color graphics for metrics for executives, right? You guys know what I'm talking about: red, yellow, green. I see some of you laughing about that. Look, we use those colors for the C-suite not because they're idiots. Well, okay, sometimes they're idiots, but it's not why we use those

colors. We use them because they're easy. They're simple. And if I have 500 decisions to make today, I don't want to spend half an hour on your one decision. Just help me understand: are we good or are we not? If we're good, keep doing your job; we'll carry on. If we're not, what do you need to make it right? At this level of the organization, I don't need to get into the weeds. I just need to know: is the company in danger? Are we at risk or are we not? Stoplight colors are fine for that, right? So, finding ways to make this very simple and intuitive, where I can take one glance at the metric and get it, is really critical

when you're talking to these folks. It's not because they're stupid. I can't emphasize that enough. I've had this argument for too many years of my career. Trust your C-suite a little bit here, I promise you. But you've got to make these easy and quick. And this can backfire if you don't. In my first role as a CISO, I got brought into an organization that had a security program that wasn't really doing anything. We had five people on the team, six after I got hired. We were doing nothing. And my CEO said, "Go forth and make it happen. Whatever it takes." Oh, those are good marching orders. So we did. I told the

IT department, everybody else, like, this is going to be a no-excuses session. We're doing it. And I told my team: 30 days. First 30 days, we're going to go through a patching exercise, because it's low-hanging fruit, right? There had been no patching for two years. Let's just get it done. You have any complaints, any problems, you tell people to come talk to me. That's my job now, to deal with all the database admins who tell me, like, "Oh, you can't reboot my server." Those are fun conversations, by the way. Really? You didn't build in resiliency for your database? I would have thought for sure you would have built in resiliency on that. "I absolutely did." Oh, so you

are good at your job. Why can't we reboot one node, then? It's funny how database admins go, like, "Oh yeah, I guess you can. Okay, go ahead and patch it." Anyway, those are whole other strategies. If you guys ever want to talk about those kinds of things, I can help you there. But the team went off for 30 days. We went after it. At that time, my organization had about 300,000 open vulnerabilities in our environment, and by the time we got to the end of the month, we had resolved about 100,000 of them, mostly in the critical and high ranges. I see a couple of you nodding, which is good. That's a good sign.
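As a quick aside, the arithmetic behind these figures is worth keeping straight; this sketch uses only the numbers given in the story:

```python
# Back-of-the-envelope arithmetic for the 30-day patching push described above.
open_at_start = 300_000   # open vulnerabilities at the start of the month
resolved = 100_000        # vulnerabilities resolved during the push

remaining = open_at_start - resolved
reduction_pct = 100 * resolved / open_at_start

print(f"{resolved:,} resolved of {open_at_start:,} "
      f"({reduction_pct:.0f}% reduction), {remaining:,} still open")
# → 100,000 resolved of 300,000 (33% reduction), 200,000 still open
```

Note that the same data supports both framings the speaker describes: a percentage reduction, and a count of what remains.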

100,000 vulnerabilities in 30 days is pretty good, I've got to tell you. And look, we celebrated. I took the team out to dinner, like, they busted ass. It was good. Might have been some whiskey involved with that, because, you know, you celebrate when that happens. There was definitely whiskey involved in that. But, yeah, so I went back to the board after that. I wanted to report to the board and tell them what a great job my team did, right? So what did I do? Got my PowerPoints all ready to go. I built a graph. I was like, "Hey, I'm even going to do this by a percentage. Do a trend."

They'll understand. And I added my little trend line at the top. Started the month with 300,000 vulnerabilities, dropped down to 200,000 at the end of the month. A 33% reduction in risk. Awesome, right, guys? Isn't that great? Boy, some of you are laughing. You already know where this is going, right? My general counsel at this company, the lawyer, of course, looks over at me and says, "So what? You only did one third of your job?" He's not wrong. From that perspective, from a legal perspective, what did I just tell him? There are 300,000 places where the organization is at risk and could be harmed, and I left 200,000 of them behind. That's what he heard. So, look, any of us who live in

this industry, we know that was a massive step forward. But to the lawyer, it just sounds like they need a new CISO. You have to find a better way to talk about this stuff other than volume-based metrics and big technical numbers. We love our volume-based metrics, right? We stopped 95,000 probe attempts on the perimeter. Yeah, that's great. What does that mean to a CFO? Nothing. Because is 95,000 bad? Would 95 million be bad, or 95? We don't know. They don't know the context. So stay away from volume-based metrics if you're going to talk to the board. They don't need to understand that stuff. That's workload management. That's what the ops people need to understand when they're

solving the problem. Keep it simple. Keep it intuitive. Make sure that with whatever you're doing, I can tell what you're talking about at a glance. If I can't, refine the metric. And again, I don't care what it is. Use stoplight colors: great. Use letter grades: I don't care. Use a score if they're happy with that. They like percentages? I don't care. Use what makes sense to them and what's easy. And if any of you are about to raise your hand and say, "Well, how do I know which one to use?" Just ask them. I know this is going to be shocking, but I'm serious. As the CISO, sit down with other members of the

C-suite and just ask them: how can I present this information to you in a way that makes the most sense? Would you like a letter grade? Do you like a number? What would make it easier for you? They'll tell you. They'll give you an idea. "Yeah, colors are fine." Great, I'll do stoplight colors. So, ask them. It doesn't have to be rocket science. Just talk to them. And then keep it simple. Good. Any questions about the executive level, what's going on here, and how to talk about risk with these folks? Okay. I'll come to more stories about this in a minute. Let's talk about the strategic level. Now, the strategic level is really about kind of the day-to-day

program management of your security effort. This can sometimes come from the CISO, can be your directors, or it can sometimes be, you know, analysts and engineers. We're all involved in trying to make sure that we understand: is the program working or is it not working? Are the rules and guidelines and processes we're doing today actually reducing risk, or are we just wasting time? A lot of the decisions we're making at the strategic level are answering those questions. Are we efficient? Are we moving in the right direction? Are we seeing SLAs, as an example, shrink over time? Good questions. We're trying to answer the question of whether everything is

actually working. Now, this is the weird level, because the output of a lot of this is going to go up to the executives and down to the tactical people who have to actually do the work. So this is also the place where we make decisions about prioritization, right? We all talk about this today. This is where it lives. And when you're talking to people who are making decisions about what to prioritize over something else, which problem in the org we want to fix today, having a kind of unified way of prioritizing that's aligned with the business is really, really key here. And again, this may be something where you have to ask, and that's okay. The people

you're dealing with at the C level, they want to do this stuff right, right? They don't want to see the company go bankrupt, or fined, or, you know, indicted by a grand jury for negligence of public funds if you're in government. That is not a fun time. There's a lot of places where they want to help. Ask them. The other thing that I want to emphasize about this is looking for outliers in the program. Now, look, as security people, our whole job is built around finding anomalies and patterns. If you think about it, right, we look at log files, we're looking for events that are unusual. If we're looking for what systems to patch, we're looking for the

ones that don't have the patch. We're looking for places in a pattern where something isn't quite right. We're really, really good at finding outliers. What we're really bad at, though, is finding positive outliers. And at this level, if you're going to make decisions about what's working and what's not, and identify the processes and procedures that do or don't work, you have to adjust your mindset away from "who are my admins who don't patch?" You still need to find that out, but also find the ones that are absolutely killing it, because they may be doing something that will help you everywhere else. One of my consulting stints was for a very large auto manufacturer and their

loan division, their financial services company; a separate company. They act as a bank, and they've got something like 114 countries that they operate in as banks. Every one of those countries has its own security team. They all have to report up to the global team in the US, and they're all required to meet certain SLAs and do their thing. And they found they had a bunch of countries that just were not patching at all. So I got brought in to help them figure out why. And I sat with the CISO and the deputy CISO, a couple of the managers, a few other people, and they actually had built a pretty nice

manual program. Did it in Excel. They had this whole process; it looked pretty good. A lot of manual work, but they had it all there, country by country: red, yellow, green. And they could tell you exactly who wasn't doing it, who wasn't meeting SLA. Now, these guys had a 30-day SLA for patching. And we sat down and I said, "Okay, show me the problem." They said, "No problem. Look here: Germany. Germany, 66 days, more than double the SLA." That's pretty bad. Okay, who else? "France. France here. Look: 64 days. They're not patching at all." All right. And they went down the list: 10 countries, 12 countries. They had about 38 or 39 countries that were out of

compliance, right? And we went down every one of them. And I finally asked them the question. I said, "Well, guys, what's going on in Argentina?" "Argentina? What are you talking about? They have an SLA compliance of nine days. They kill it. We never have to worry about Argentina. Argentina just gets it done. We never even talk to them." Like, great. But what are they doing different? Blank stares. They'd never asked the question. They didn't even look. So we called them up. We got Argentina on the phone and said, "Guys, what are you doing down there?" He said, "Oh, you've got to talk to our... We've got this admin in the corner that, like, you

know, he doesn't talk to people, but we'll bring him on the phone. You should talk to him." And he gets on the phone and he's like, "Yeah, I don't know. I was bored. I like PowerShell. So I wrote this PowerShell script that just, like, grabs the Microsoft patches and throws them out to my lab. And if it doesn't break anything in 24 hours, I wrote this other script, and I throw it out to some people I know in the company who are friends of mine. And then I give them 24 hours and let them tell me if it breaks anything. And then I just let it sit for

about seven days, because I figure, you know, they use applications and stuff. And at the end of seven days, if I don't hear from anybody, I wrote a third script: I just deploy it to everything." 24 hours, 24 hours, 7 days. There's your 9-day SLA, right? An automated process. And they did it. And I was like, "Guys, great. Can we get those scripts, please?" Right? A simple question that no one had ever asked. We got those scripts and off we went. If you're curious, by the way, they deployed those out everywhere. Every country started using it. Within 60 days after implementing that, they went from 38 or 39 countries out of

compliance to one. In 60 days they turned their whole SLA program around, just because they went after the good outlier. So look, strategically, look for the places where it works, because it can change everything about your program, and you can communicate those successes in really meaningful ways. All right, I'm running a little short on time. Hold on. I'm going to blame the drawing. Sorry, guys, I'm going to blame you guys for the drawing. Let me talk about the last part, the tactical part. Now look, this one's pretty straightforward. If you've been doing everything else right, if you align with the business, if you prioritize, your tactical people should know what to do. You should be able to give them a list

of things and say, "Here's what I need you to fix." Fix, patch, write better code, whatever it is: fix the problem. That should be okay. Where this goes wrong, in my experience, and this might be a hard one for some of you to hear: even though you're dealing with engineers and really hardcore technical people, when they resist, when they don't want to patch and have all these excuses for why it can't be done, the problem almost never has anything to do with anything technical. I've heard the same six excuses from people about why they can't patch something. And I've defeated all of them pretty easily, because once you start poking at it, like the database admin guy I was telling you

about a moment ago, there's no real technical problem. What I often hear is this alignment part. What I've heard from engineers is, "Look, I have a full-time job. I work for IT. Security comes along, they dump this report on my desk, and they say, go deploy these 200 patches to your thousand systems. And even if I do that work, the security team goes back up to the board and goes, 'Hey, our security program's awesome. Check us out. We did the work.'" And the engineer goes, "No, you didn't. I did the work. How come I'm not getting credit for the work I did? And if I'm not going to get credit for it, and I don't get compensated for it, screw

you guys. I'm not doing it. I've got a full-time job." Get past the technical stuff. I cannot tell you how often I hear that these folks just want to be validated. They want to know that they're part of the team, that the work they're doing matters, and that they're aligned with what the business is trying to accomplish. So have some empathy. If you're going to work with these folks, and you will, you've got to understand they want to feel part of the team. Now, some of you might be those engineers in this room. And if you are, and you feel that way, tell your security team you want some help here, because they may not know.
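The three levels the talk keeps returning to can be sketched as three views over the same set of findings. This is a minimal, hypothetical illustration only, not any product's scoring model: the finding categories, severity weights, and grade cutoffs below are invented for the example.

```python
# One data set, three audiences: a minimal sketch of the three-level framing.
# All categories, weights, and grade cutoffs here are invented for illustration.

findings = [
    {"area": "cloud",       "severity": "high",     "owner": "cloud-team"},
    {"area": "credentials", "severity": "critical", "owner": "iam-team"},
    {"area": "credentials", "severity": "high",     "owner": "iam-team"},
    {"area": "web-apps",    "severity": "medium",   "owner": "app-team"},
]

WEIGHT = {"critical": 10, "high": 5, "medium": 2, "low": 1}

def area_scores(findings):
    """Strategic view: weighted risk per technology area."""
    scores = {}
    for f in findings:
        scores[f["area"]] = scores.get(f["area"], 0) + WEIGHT[f["severity"]]
    return scores

def letter_grade(total):
    """Executive view: one glanceable grade, no volume numbers."""
    for cutoff, grade in [(5, "A"), (10, "B"), (20, "C")]:
        if total <= cutoff:
            return grade
    return "D"

def work_queue(findings):
    """Tactical view: a concrete, owner-assigned list of things to fix."""
    order = {"critical": 0, "high": 1, "medium": 2, "low": 3}
    return sorted(findings, key=lambda f: order[f["severity"]])

scores = area_scores(findings)
print("Executive:", letter_grade(sum(scores.values())))
print("Strategic:", scores)
print("Tactical :", [(f["owner"], f["area"], f["severity"]) for f in work_queue(findings)])
```

The point of the sketch is that all three views derive from the same data: the executive view carries no volume numbers, the strategic view shows where risk concentrates, and the tactical view is a concrete, owner-assigned work list.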

But this is where we can start to work together in a more meaningful way. And if you can show the engineers that what they do matters up the rest of these levels, you can align them with the business. You can get them to a place where they can see that the work they do is represented properly up the chain. Now, what does that look like? Can I take five more minutes? Is that all right? Thanks, guys. Let me show you an example of what I'm talking about, this kind of alignment of the work and everything else. Again, I cannot stress this enough: do this the way that

makes sense for your business. All right, this is the tool I have, like I said, but find what works for you, what represents risk appropriately for your organization, and do it that way. But at least try to understand what I've been talking about here. So look, if I'm going to talk to the board at the executive level, I need this to be easy. I need to be able to look at this and go, "Oh, we're at a C. That sucks. How do we get to a B?" I can make that decision pretty quickly. And colors, not because we're idiots, but look: here's my cloud environment, here's my IT environment, here's my credentials and identity stuff, here's my

web applications. I don't have to be a technical expert to know, hey, our cloud's doing pretty well. Good job, cloud people. Thank you. But how come our credentials situation sucks? What do we need to do here? And as a CISO, I'm going to have an answer for that question, right? It's kind of a risk heat map: a really easy way for an executive to see this and go, "Oh, I see what's good and I see what's not. What do we need to do?" You'll notice there are no volume numbers here. No vulnerability counts, no missing-patch counts, no asset counts, nothing else. This is just how much risk these technology areas contribute to the environment.
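You don't need any particular product for this view; a few lines can turn per-area risk scores into the letter-grade picture I'm describing. A toy sketch, with invented scores and invented grade cutoffs (nothing here comes from a real tool):

```python
# Toy executive view: one risk score per technology area (0-100, higher =
# more risk), rolled up to letter grades. All numbers are invented.

def letter(risk):
    """Map a risk score to a grade; cutoffs are illustrative only."""
    for cutoff, grade in [(20, "A"), (40, "B"), (60, "C"), (80, "D")]:
        if risk <= cutoff:
            return grade
    return "F"

areas = {"cloud": 18, "IT": 35, "identity": 72, "web apps": 44}

for area, risk in areas.items():
    print(f"{area:<10} {letter(risk)}")

# Overall grade: average risk across areas, same scale.
overall = sum(areas.values()) / len(areas)
print("overall   ", letter(overall))
```

With these made-up numbers, cloud grades out at an A, identity at a D, and the organization overall lands at a C, which is exactly the "good job, cloud people; what's wrong with credentials?" conversation, with no volume metrics in sight.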

That's really all I want to talk about at the executive level. As a CISO, I have a plan, and that plan comes from the strategic analysis. So the strategic analysis is where we take this and dive down. Maybe I break this out a little more by team, or by admin, or by technology area, or by business function, whatever I want to do. And obviously, we're always trying to make decisions about where to start. Most of us use Excel for that. At least I did; I'm sure a lot of you do too. But if I can just look at something here and go, well, wait a

second, how come the United Federation of Planets is sucking it right now? Sorry, Trekkies out there, that one's for you. All right. But hey, that takes me two seconds to figure out that that team isn't patching. I don't have to try to guess who's not doing it; I know who's not doing it. So let's go find out why, because maybe they need help. Maybe they're short-staffed. Maybe they don't have the tools. Maybe, you know, there's been a death in the family and someone's not there. Who knows? But I don't have to spend a week figuring it out; I can spend a second figuring it out. And as a CISO, this is

powerful, because if I'm going up a level, I need to be able to tell the executives: we have it under control. We have a plan in place. We know which team the risk is coming from, and we're working on it. And as an executive, that's all I want to hear. Tell me you've got it, and next month I want to see progress. Easy, right?

The same thing could be said for SLAs. I'm a big fan of looking at SLAs. When you're looking at SLA compliance, do me a favor: whatever your number is, 30 days as an example, look at what it would be hypothetically at 21 days, 14 days, and seven days. And the reason I say that is the number of times I see organizations who say, "Ah, yeah, we have 30-day compliance at like 96%. We're killing it. We're awesome." Well, we're still giving the attackers 30 days, right? So you say to yourself, "Okay, hypothetically, what do our numbers look like if the SLA were 21 days?" One company I worked with was at 12% compliance at 21 days. Wait: you're at 96% compliance on day 30, but 12% compliance on day 21? What I hear there is that you can do it all in nine days, because why is it that at day 21 your teams haven't done anything, but by day 30 you've caught up 80-something percent? What's going on? Why has that happened? This is program optimization, right? This is how I can look at what I thought was a good policy, 30 days, and say maybe we need to talk about making it 14, because we can. The team is demonstrating we can do it in nine days. Let's do it. The more we optimize the program, the better we can make security work, right? That's the goal: reduce risk. So look at your SLAs. Make sure you're looking at that kind of stuff.
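If you can export per-finding remediation times from whatever tool you use, that 30/21/14/7-day drill-down is a few lines of script. A minimal sketch, with invented data (the function and numbers are illustrative, not from any product):

```python
# Sketch: recompute SLA compliance at tighter hypothetical windows.
# `findings` holds days-from-discovery-to-fix per finding; the data
# below is made up for illustration.

def sla_compliance(days_to_remediate, window_days):
    """Fraction of findings remediated within `window_days`."""
    if not days_to_remediate:
        return 0.0
    met = sum(1 for d in days_to_remediate if d <= window_days)
    return met / len(days_to_remediate)

findings = [28, 29, 30, 5, 27, 26, 29, 30, 12, 28]

for window in (30, 21, 14, 7):
    pct = sla_compliance(findings, window) * 100
    print(f"{window:>2}-day window: {pct:.0f}% compliant")
```

A result like 100% at 30 days but only 20% at 21 days is exactly the "you can do it all in nine days" signal: almost everything lands in the last stretch before the deadline.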

Now we get down to the tactical level. Y'all are now at the bottom; sorry if any of you are at the tactical level, I apologize, that's not what I meant. You are the foundation of this thing, though. We can break this out by individual, right? Look at the numbers individually and start to say, okay, this is pretty obvious: this admin is not cutting it. They only represent a small percentage of the overall environment, and the assets they manage contribute a small amount of risk, but for some reason they're not patching. What's going on? We don't want to give the attackers even one

opportunity. So let's go talk to that person and find out what's happening. And of course, it's all trending in scores, and we're using this number down here to represent how much risk they contribute. By the way, you can see I also want to look at my good outlier. How come they're doing well? Let's go talk to them and find out what's going on. Same kind of thing; it's a really easy way to see at a glance. And the nice thing about this is that, whoever it is, we're using the same kind of metrics to represent a risk score, the same letter grades, all the same numbers. Which means I can show an admin this thing at the bottom and say, "Listen, I need your help. If you can fix these 20 problems in your servers, you move the overall score of the organization by 42 points." So I can directly tie an engineer's work to the number that the executives see. And if I'm responsible as a security leader, I'll give them that credit, because I can go back and say, "Hey, you want to know why we trended down 10% last month?" Not "because Nathan finally did his job and stopped being a slacker," but because Nathan put in a lot of work and patched some servers, right?

This is simple alignment. It's not crazy math, but you can see how a really simple use of unified metrics means I can talk to the engineers and say, "Can you just move the number?" We're really just talking about risk contribution: contribute a little less risk here, and we can take that all the way up to the executives. It's an alignment technique, and it means the engineer who feels like they're not getting validated goes, "Oh, security won't take credit for my work; I get to take credit for my work." Yeah, hell yeah you do. That's what we want. And now everybody also talks about risk the same way.
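That rollup really is simple math. Here's a toy sketch of the idea, with invented scores and grade cutoffs (any real product computes this very differently); the point is only that one scale connects an admin's patching work to the number the executives see:

```python
# Toy model of a unified risk score: each admin's assets carry risk
# points, the org score is their sum, and the same letter-grade scale
# applies at every level. Numbers and cutoffs are invented.

def grade(score, worst=1000):
    """Map a risk score (lower is better) to a letter grade."""
    pct = 1 - min(score, worst) / worst          # 1.0 means no risk
    for cutoff, letter in [(0.9, "A"), (0.8, "B"), (0.7, "C"), (0.6, "D")]:
        if pct >= cutoff:
            return letter
    return "F"

admins = {"Nathan": 140, "cloud team": 40, "web team": 100}

org_score = sum(admins.values())
print("org:", org_score, grade(org_score))     # one number, one grade

# Nathan patches 20 problems and removes 42 points of risk;
# the org score moves by exactly those 42 points.
admins["Nathan"] -= 42
org_score = sum(admins.values())
print("org:", org_score, grade(org_score))
```

Because the admin-level and org-level numbers are on the same scale, you can tell Nathan precisely how many points his patching moved the board-level score, which is the whole alignment trick.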

We see scoring mechanisms here where people are like, "Oh, dude, how do I get to an A? I want to move my servers from a B to an A." Cool, I can help you do that. But you know who else is asking that question? These people. Because a CFO wants to know: how do we move that from a C to an A? What is it going to take for the whole organization to move in that direction? You get people aligned, talking about this stuff the same way, and suddenly you, as the security person, don't have to translate 12 different conversations. We can talk about risk in a really unified way that everybody understands without getting technical. We're not bombarding them with big volume numbers, right? No volumes, no probe attempts, no event counts; we're trying to help them make good decisions about how to mitigate risk. That's what we're here to do, and we all should be doing that. Okay, I went a little over time. I apologize. Any questions, thoughts, anything? Okay. Wow. All right, guys. Well, thank you for the time today. Appreciate it.