← All talks

Cyberpsychology - Managing Human Bias in Cyber Risk Decisions

BSides Joburg · 2025 · 40:17 · 43 views · Published 2025-09 · Watch on YouTube ↗
About this talk
Cyber risk management is often treated as a purely technical discipline, when in reality human psychology and biases shape risk perception, prioritisation and management. This talk explores some of these human factors and offers guidance on potential management strategies.

Cyber risk management is often approached through a technical lens, yet it is profoundly influenced by human psychology and cultural factors. This talk explores the impact of cognitive biases, such as optimism bias, anchoring, confirmation bias, and the availability heuristic, on the identification, prioritisation, and mitigation of cyber threats. It examines how personality traits common among cybersecurity professionals, such as introversion and analytical thinking, shape a preference for technological controls over human-centric strategies. Organizational culture, psychological safety, and group dynamics are shown to play critical roles in shaping risk perception and response. The analysis pays particular attention to the African cybersecurity context, highlighting how resource constraints, informal digital systems, and generalist decision-making amplify bias-related vulnerabilities. The talk presents the viewpoint that effective cyber risk management must integrate psychological insight, cultural awareness, and technical proficiency to build truly resilient digital defenses.

About Samresh Ramjith: Samresh is a Partner in Deloitte Africa’s Risk Advisory practice based in Johannesburg and is the current leader of the Africa Cyber practice. His cyber experience spans more than 20 years in the Sub-Saharan cybersecurity industry. His core experience spans deep technical roles through to executive business and people management, across market sectors. He qualified as an Electronic Engineer before moving into software development and system engineering roles. He then qualified as a firewall engineer before moving into pre-sales, cybersecurity consulting, and business management roles.
He holds a master’s degree in digital business, as well as several cybersecurity-specific certifications, most notably CISSP (2009) and CISM (2010). He was the Deputy CISO at ABSA Group, where he led the Cyber Security Consulting Service Group, which comprised Lead Security Consultants, Security Architects and Research teams. He also chaired the SABRIC Cybersecurity Forum, driving banking sector-wide engagement on topical cybersecurity matters. His prior work experience includes leadership roles at EY Africa, IBM Global Security Services, Dimension Data MEA (now NTT Data), the SA Reserve Bank and Siemens Telecommunications.

About BSides Joburg:
Website: https://www.bsidesjoburg.co.za
Twitter: https://www.x.com/bsidesjoburg
Instagram: https://www.instagram.com/bsidesjoburg
Mastodon: https://infosec.exchange/@bsidesjoburg
LinkedIn: https://www.linkedin.com/company/bsides-joburg
Transcript [en]

[Music] So, morning folks, great to have you here. And we'll just dive straight into it; I think we've done all the intro. So anybody who's been in cyber for any amount of time knows that we've always spoken about people, process, technology, right? If we think about technology, there are more than 2,000 cybersecurity vendors globally. There's no shortage of tooling. In fact, that's half the problem: there's too much tooling. We talk about process. There's no shortage of frameworks. Just think about ISO, NIST, SANS, whichever framework you can shake a stick at. There is no shortage of frameworks. There are like 450 different certifications that one can do in cyber. We don't have a knowledge problem. We

have a people problem. Because if you think about it, and the whole premise of today is: why is it so hard to have good cyber? If you were going to buy a network today, you'd go out, you'd buy Cisco or you'd buy Arista, one or two brands, and you'd pretty much have a good result, right? You can't really go wrong. If you're going to buy an operating system, you buy Microsoft or Linux, maybe macOS. Your results will vary, but you can't go too wrong. Cyber, though? If you want to go buy an endpoint solution, oh my god, right? You're going to have to start an RFI, an RFP process. If you want to buy a

firewall, oh god, you're going to have holy wars, battles about this versus that versus this. Why is it so hard? It's because, if we think about every cyber program, what do we start with? We start with the risk assessment, right? Everything else in cyber is very objective, quantified: KRIs, KPIs. We've got data points and stats about it. But we start with a risk assessment, which means we get a group of people together and ask them what they think the risks are. And there's the problem: think. So if we start off on the premise that the risk assessment is flawed in some way, we're automatically going down a rabbit hole, and this is why there are so

many gaps in the cyber process, right? So what I'm going to talk about today is human bias and how we become aware of it. We can't eliminate it, but how do we become aware of it and maybe try to normalize for it as we go through our risk assessments as well, right? And I'll start off with some of the things about what a bias is and how biases manifest. The first thing: what is a bias? It refers to a systematic deviation in how risks and threats are perceived by people, and biases arise from human cognitive tendencies, organizational culture and algorithmic processes. Jason spoke about the algorithmic processes earlier on; you understand how that works. But the

algorithmic process is actually modeled after cognitive processes. And what he was talking about there were biases and risk perceptions. If you think about reality, the nature of reality: we're sitting here together today and we think this is real. Is it real? Our brain is encased in a skull, surrounded by bone. Your brain can't see, feel, touch, hear. It's receiving inputs from sensory devices, processing them, and projecting reality outward. We are creating reality as we see it. It's a hallucination. And when we agree that the hallucination is what we're all hallucinating, we call that reality. But everybody has a different sense of smell, touch, taste, color. Some people are color blind, some people don't hear

as well. Your brain naturally gravitates to different aspects of what you're hearing. Even now, right? You're immediately synthesizing information and putting it into a framework, into a pattern that you understand. And every one of us does it differently. We are hardwired from a bias perspective, right? And human cognitive biases come in many shapes and forms; we'll talk about some of that today. Organizational bias: organizations are just groups of people. So if you're bringing people with bias into an organization, guess what? The organization is going to have a particular bias, a particular posture. Anyway, we talk about organizational culture. It's not a thing. It's the people that create the culture, right? We are not bystanders. We are actively

creating the culture in the organizations we're in. And we've already spoken about algorithmic bias, right? So, what causes some of these cognitive biases? It's emotions. People, you know, we like to think we'll bring facts. And if you've ever seen a husband and wife arguing, you know what I'm talking about, right? The guy comes with rational facts: let's talk about this, if you would just calm down. And then all hell breaks loose, right? Because you are trying to bring rationality to an irrational, feelings-driven conversation. Right, guys? You know what I'm talking about, right? Human beings are naturally selfish. We are, even if we think we're not, right? We are naturally selfish and we look after our own motivations, and those

motivations start with survival. The most basic instinct is to survive. But if you think about it, every human only has two modes of operation, with a little bit of a link in between: pain avoidance or pleasure gain. If you've had young children, you will know that sometimes telling them "don't do that or you're going to get smacked" will definitely get them to do it, and then they look at you like, what now, old man? You know, are you really going to smack me? We know that's not going to work. But if you tell them, "if you don't do that, I'll give you 10 bucks": suddenly, behavior change, just like that, on the

spot. Why? Because that child is motivated by pleasure gain, not pain avoidance. Now, think about every phishing campaign we've ever run in the history of the world, right? What is it? It's pain avoidance. Don't click that link or you'll go on the naughty list. Don't click that link or I'm going to dock your bonus. Don't click that link because of some punitive measure: you're violating policy. Where's the pleasure gain? You're not tapping into the reward center of the brain to say, I'm motivated to do something because there's something personally in it for me. I will go on a list that says I was a really good boy, gold star, right? And that's very important, that gamification

and tapping into people's motivations, right? And if we talk about things like age and all these different factors, that also comes into play. But the other one is limited information processing capacity. What does that mean? The part of your brain that is listening to me right now and processing in real time is called the prefrontal cortex. It's a tiny part of your brain, somewhere here, really small. It was one of the last parts of your brain to evolve, and it is easily overloaded and overwhelmed. And you'll see this: if you're driving in a parking lot and looking for a parking, you will turn the radio down. Why? Why do you do that? Because your

brain can't handle all this input at the same time. It needs to declutter and focus: I'm focused on finding the parking. Why are you exhausted when you learn something new for the first time? It's because your brain is actively processing it. This tiny part of the brain is working like a CPU flat out, 100% processing time, trying to synthesize and make sense of it so that it can process it back into your muscle memory. So your brain is easily overwhelmed when you have too much information. Think cyber: too much information, all the time. We focus on the stuff that we can easily solve for. We switch off our active processing and we use more of the emotional and ingrained, hardwired

responses. We do that all the time. We call it muscle memory when it's learned. But the biases and so on that you're born with, and that you grow and develop from your environment, from education, from society: those become hardwired into your brain as well, into how you process the world around you. So it's easy to overload that. And think about one of the other aspects of how this comes in: have you ever noticed that when somebody points out, like, a yellow car or a green car, you will spend days noticing every freaking yellow car and green car on the road? And it's like, wow, I never knew there were so many yellow cars and green cars in the world. That is a

phenomenon called the Baader-Meinhof phenomenon. It's actually two biases at play simultaneously. One is called selective attention. Your brain can only focus on certain things at a time; it's not a massive multiprocessing engine, right? So now that you've highlighted a yellow car or a green car, your brain is paying attention to it. And so every time you see a yellow car, it feeds into something called confirmation bias. Like, there's a yellow car. Oh yes, I saw it. Wow, another yellow car. So you confirm what you knew, and your brain then says, okay, let's look for another yellow car. And you don't even think about this, but you go into this loop of noticing it, and

because you notice it you say, wow, really common; you're confirming your own belief, essentially. That is what's happening here. It's two phenomena, two biases happening together. This is exactly what happens on social media, what Jason was talking about. You'll likely be clicking on something that's rage bait or something that is gory or whatever it is. And the more you click those things, the more the algorithm confirms this to you. The algorithm is actively playing you and how your brain works, tapping straight into you. So apart from the neuroscience part of it, how does human bias play out in cyber risk? Because that's what we're talking about today, right? We know that human factors

contribute to most of the breaches that we see: people clicking links because they didn't see it, they didn't pay attention, they didn't know what they were clicking. More than 90% of breaches come from people doing something they shouldn't do, whether active phishing or misconfigurations and so on, right? We don't really address psychology, the people and the psychology of the organization, as part of our cyber programs. That's almost seen as, like, an HR thing: you go deal with the people and their feelings. But people and their feelings are in the organization. People bring their whole selves to the table, right? It's not like when you walk in the door, you forget that you had a bad day in

traffic, that you, you know, got some pressure, that there's somebody ill. You don't leave all of that at the door and just come in and be a work robot. That's not how we work. It's not how people work. You need to address all of that. So when you're starting a risk assessment or a simulation, or you're starting to have a conversation about strategy, be cognizant of people's states: read the vibe, read the room, understand who is in it, how they are, how they're bringing themselves, right? And you need to bring that psychological angle into how you deal with large corporations. So in cyber, we often want to go and tell the execs, tell the board, we want

a seat at the table. What are we going to say? Firewall rules are cool. AV coverage is 96%. What the heck does that mean? You know what it means, right? But the exec is not synthesizing it the same way you are. And we have a bit of a language gap here, and we'll talk about that now, right? That plays into the personality dynamics of cyber teams. If you think about it, cyber teams in general, if you do a Myers-Briggs analysis: ISTJ or INTP. So basically introverted, thinking, intuition, sensing. That's kind of what most cyber people are like: introverted, right? We're really good at absorbing information, thinking about it,

reflecting on it. But ask them to go have a conversation with someone new? No. You can be online and be an elite hacker, but real world, maybe not. You know, you can go and headshot things in Call of Duty to your heart's content. But can you just go over there and ask that guy for something? No, that's not me, dude. Not me. That's just how we are. That's human. That's how we are; you're not going to change that, right? It's pointless to try. So we've just got to understand what the strengths and the challenges are when we are actually bringing ourselves to the table as well, right? One is that we're incredibly

precise, detail oriented, and we take pride in tech mastery. Going in and learning something to the nth degree, being the best certified person, having the knowledge: that is what motivates cyber people. You want to be the best at whatever it is, right? You want to be better than the bad guys. You want to know everything. But where that becomes a challenge is that instead of talking to people and having conversations, we rely on technology. And that's why, if you look at many enterprise security programs, a lot of it deals with what tools we are going to buy this year. What is the next shiny thing? Because last year's EDR was cool, but

this year's EDR is even better. It's got AI, guys. We've got to have AI, right? Because we're comfortable in that; we can control the tech. It listens to us. It doesn't react violently when we tell it to calm down. You know, just reboot it and it's fine, most times. This leads to a communications gap, as they said, because we communicate in technical ways. It's like, you know, could you check that, and then this guy did this, and then I launched a command shell and I did this cool stuff. And the other person is like, what the hell are you talking about, dude? I have no idea what any of this means. Right? And

that's something you can overcome, but you've got to be conscious of it first before you go into learning how to change that communications gap. And we're also pretty non-confrontational. That doesn't mean non-confrontational in the sense that if a taxi cuts you off, you don't want to beat them up. But non-confrontational in the sense that we won't often go in and challenge someone's decisions, especially when there's a power gap. You might say, "I'm just an engineer. That dude's an exec. You know, I'm just going to let it go." Or it'll be like, "That person said something completely wrong, but it's fine. You know, I will deal with it my way and we'll sort it

out and then we will go back and report results" instead of challenging things. And what does that mean? It means that you can become very frustrated in your job very, very quickly, because you feel like a square peg in a round hole. You are trying to have a conversation, but the communication is just not happening. The person is just not understanding what you're saying. Right? And then you feel frustrated, and because we're introverted, we internalize the frustration, and then you go and kill a whole lot of things in Call of Duty and you feel a bit better for a while. But then after that it's like, geez, I've got to go to work tomorrow and these people are

such a bunch of morons, I really can't. And that's why the average tenure of a security person in cyber is about two years. Just long enough for you to get there, find the coffee machine, hit the coffee, turn around, walk out the door. It's how it is, right? So, what are some of these biases playing out? These are some of the communications gaps that we will see coming through in many organizations. One is an optimism bias, where the organization is always thinking about things in a positive light. You know, let's be positive. One of those happy-clappy organizations. It's other people's cyber incidents, you

know, the ones we read about in the news; it's other people's stuff. It won't happen to us. We're too small, or we're not a target, or we're not a bank. There's an optimism bias because people like to see the world through rose-colored glasses in general, right? Then there are availability heuristics, where the board will overweight some recent activity. Did you see those guys got attacked by ransomware? Okay, what's our ransomware strategy? We need a ransomware task force. Everything becomes about ransomware. Meanwhile, you might have a major access management problem in your organization, and ransomware is like a medium priority, but we don't worry about that. We've got to go and look at the stuff

that's making headlines. Easily diverted, right? Anchoring biases are also very dangerous. You might have an initial risk assessment, and we see this a lot: you do a pen test, you get a result that's a blood-soaked report, right? It goes to an exec. The exec says, your okes are useless, look at this, it's a terrible report. And even if you improve the report over time, it gets less blood-soaked and gets more orange and eventually even green, the perception is still that cyber is not where it needs to be, because you've anchored that executive's mind on the initial report. It's actually what happens in negotiations as well. If you're sitting around a table negotiating with somebody, they'll

ask you, what's your price? How much does this cost? Why are they pushing you for a price when you don't have all the information? It's because they're anchoring the price at a certain point. So every negotiation will be anchored there; even if you come back with more information and say it's double the price: but you told me it was the other price, what are you trying to pull here? So these biases play out in lots of different ways. It's not just stuff that lives out there; these are actively exploited by people who want to manipulate others in their own situations. Groupthink is another common one, and this plays back into the sense that we are non-confrontational in

some cases. There's often a sense that if there's consensus, everybody agrees and we all think the same: cool, that's the right direction. It might not be. In many countries, African countries, Asian countries, for example, junior people are not empowered to go off and challenge authority. It's not seen as a thing you do. You challenge it through your layers of management, and hopefully at the end of the day it results in some sort of meaningful interaction, right? And if you think about it, people in the ops space, people at the coalface, are seeing the problems. You're seeing the issues. You know what is happening on the ground, right? But it gets sanitized into this Word document that

goes to a risk committee eventually, through 15 layers of sunshine, right? Because the organization doesn't want to challenge the norms with what you're seeing, right? So think about the optimism bias: don't tell me stuff that doesn't play into my narrative, especially when the CISO has gone to the board and said, "Cyber is cool. I sleep really well at night." And then there's a horrible data breach and it's like, "Yeah, but you know, that wasn't really what we were focusing on in our cyber program. We were looking at ransomware, and I told the board that." So you have that sort of thing playing out, right? We also have things like framing effects. Executives, and not

just executives, people in general, are not really good with stats. If I told you there's a 2% chance that the earth will get hit by an asteroid today: I don't know what that means, right? 2% of what? The hour, the day, the what? What does it mean? But if you say there's a good chance that you're going to lose 15 million rand today, that will make you pay attention, right? And how many times do we go into our cyber programs and say, yeah, you know, there's a 50% chance of this happening, and 60% of that? It's like the weather report: so what? You know. But if you tell somebody there's a good chance it's

going to rain, carry a jacket or an umbrella, chances are that's a bit more meaningful, right? So there's a way that you frame the information so that it's consumable by the mere mortals outside of cyber as well. The other one that Jason was talking about this morning was automation bias. We've seen this a lot, where people are now going to DeepSeek and ChatGPT and whatever, generating code and just assuming that the code is solid, right? And they look at the DeepSeek one and look at the other one and they'll be like, okay, which one do we like more? And that goes into production. We blindly believe that the tooling is producing the right outcome.
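One cheap guardrail against this kind of automation bias is to never accept generated code on confidence alone, but to run it against cases you already know the answer to. A minimal sketch; the helper name and the password example are hypothetical, purely illustrative:

```python
# Hedged sketch: a tiny guardrail against automation bias.
# Instead of eyeballing generated code, run it against known cases first.
# "check_generated_function" and the password example are made up for illustration.

def check_generated_function(fn, test_cases):
    """Return (input, expected, got) triples for every failing case."""
    failures = []
    for args, expected in test_cases:
        got = fn(*args)
        if got != expected:
            failures.append((args, expected, got))
    return failures

# Pretend an assistant generated this password-strength check for us.
generated_is_strong = lambda pw: len(pw) >= 12  # plausible-looking, but incomplete

cases = [
    (("CorrectHorseBatteryStaple!",), True),
    (("aaaaaaaaaaaa",), False),  # 12 characters long, but trivially weak
]

# A non-empty failure list means: don't trust the generated code yet.
print(check_generated_function(generated_is_strong, cases))
```

The point is not the password check itself but the habit: the human stays accountable, so the human supplies the known answers before the tool's output ships.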

So the thing with GenAI that you've really got to be aware of is that it's 80% accurate and 100% confident. So it's like the greatest salesperson: don't worry, this can do whatever you want it to do. It's on the roadmap. You know those roadmap conversations: don't worry, it's coming. When's it coming? I don't know. But don't worry about that, we've got this now. So automation bias is a massive thing, especially in our tech-enabled spaces, right? We rely a lot on technology. Analysts and SOC people will just go in and look at what's coming out as an alert and say, that's real. And you could waste a lot of time and energy chasing

things that don't exist. The other bit is confirmation bias. This plays back into the social media piece, where we look for information that affirms our worldview. I think Polo drivers are terrible. So what happens? I notice every single Polo driver driving badly, because it's confirming my worldview that Polo drivers are terrible drivers. (They are, but that's because I've got a lot of data.) And we will actively discount information that challenges our worldview. And this is why, like what Jason was talking about in the US, when you say let's bring facts to the table: people don't care about facts. People care about how they feel. If your facts challenge my worldview and make me feel

uncomfortable, I'm going to discount your facts. Facts don't really matter. They matter in some spaces, if you're going to the doctor, for example, right? But that's a different conversation. In most conversations: you brought facts to that rational conversation you were having with your wife? No. It's how people feel. Don't challenge their worldview. Right. And the sunk cost fallacy. We see this a lot. You've spent money on some sort of tooling in cyber. You bought whatever the greatest, shiniest thing Gartner said. You bought it and immediately realized this thing doesn't really work. But now you've got a three-year timeline with this thing. You can't just chuck it out. So what do you say? Well, we've

already bought it, might as well use it. It's sunk cost; we're already doing it. It's like waiting in a queue at Checkers. If you've been waiting for 15 minutes and the queue is still an hour long, you'll be like, ah, you know what? I've already been here for 15 minutes, what's a bit longer? I've already sunk the time. When it might have actually been easier to just drop the thing at that 15-minute mark, go and do something else, and come back later when the queue's gone. But we don't think like that. We think about what we've invested already. And that works in relationships and all of that stuff as well, right?
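The queue example maps onto a simple forward-looking decision rule: money or time already spent is gone either way, so it should never enter the comparison. A minimal sketch; the function name and all the rand figures are made up for illustration:

```python
# Hedged sketch of a sunk-cost-free decision: only costs still ahead matter.
# All figures below are hypothetical.

def should_switch(remaining_cost_current: float,
                  switch_cost: float,
                  remaining_cost_alternative: float) -> bool:
    """True if abandoning the current tool is cheaper going forward.

    Note what is NOT a parameter: whatever was already spent.
    """
    return switch_cost + remaining_cost_alternative < remaining_cost_current

# We already paid R2m for the shiny tool: irrelevant to the decision.
# Staying costs R3m more in licences and ops; switching costs R0.5m once
# plus R1.5m ongoing, so cutting losses wins here.
print(should_switch(3_000_000, 500_000, 1_500_000))  # True
```

The debiasing trick is structural: by leaving sunk spend out of the function signature entirely, the comparison cannot be anchored on it.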

I've already put so much time and effort in; it's hard to let go because you've invested yourself in it. So, how does this play out from a CISO perspective? What can we do now that we know we have these kinds of challenges in our space, right? The impact of optimism bias is that quite often you end up with underfunded cybersecurity programs. You will only get funding when something goes wrong, because everyone's optimistic: hey, life's good, we're not a target. Then you get taken off at the knees and it's like, oh, now what? Right? So the way to deal with that from a CISO perspective is: bring in benchmarks, bring

in data points that talk about what the downsides are. So it's like: cool, you want to cut cyber budgets? That's not a problem. But then remember, we have got critical exposures in these areas, those are the risks, and there's a potential loss of X amount of rands. Because then you're putting it on the table. You're not saying, oh, but I need this tool. You say: well, cool, that's your decision, Mr. Customer or Mr. Exec, but these are the downside risks, and then they need to accept them. It becomes conscious; they have to consciously accept that there's a 15 million rand loss expectancy, or that the plant's going to go down for a month,

or that something horrible is going to happen from an audit perspective, right? So that's how you deal with that. From a recency and availability perspective, you've got to look at the threat of the month versus the overall threat landscape. What's changing now? What's the state on the ground now? Because we always start off, and you've probably seen this, right, there are millions of these articles on the top 10 threats for 2025. 2025 is a very long time; it's an entire year. You know, I'm sure at the start of 2024, no one was talking about the top threat of 2024 being CrowdStrike, right? But that's what took everybody

down, and it wasn't on anyone's radar. So you've got to adapt and move as new information becomes available, especially in an environment that's changing fast. From an anchoring perspective, you need to re-anchor as you go along. So as you make micro-increments and make changes, you've got to shift that narrative. People are slow to change, so you've got to guide that ship all the time, like a tugboat with a giant ship in the harbor. You've got to keep nudging it in the right direction. And over time people start seeing that, okay, we are actually improving, and you get the drift into the right space. From an automation

perspective, this is where red teaming and purple teaming come in. You can't just blindly rely on software. If you build a tool that goes in and starts creating havoc from a bias perspective on its own, or is riddled with vulnerabilities, you are still accountable. You're the cyber guy. I can't sue ChatGPT, right? I can't say, naughty, get in the corner. You know, it's not going to work. You, as the human in the system, will get the brunt of the repercussions, because you should know better. You're the techie, right? So we've got to bring some sanity into that space. From a confirmation perspective, you've got to actively challenge yourself, because if you're

only confirming what you know, you're creating blind spots. Actively creating blind spots. So you've got to say: as much as we're looking at this stuff, these are the things we ignored, this was the parking lot. How is that moving? Is it increasing, decreasing? What's the risk happening in that space? What's the information? You know, that SharePoint vulnerability that was out: I'm sure people knew SharePoint was vulnerable for a while, but it didn't fit into our risk framework and our narratives. There might have been a patch there and we were like, ah, it's a medium, I've got other things to do. And then, when all hell broke loose, it was: let's go back and fix SharePoint,

because we had to. So you've got to look at those blind spots and really think about what that means going forward. Groupthink and power distance: it's really about anonymous reporting. I don't think it's the whole answer, because culturally there's a lot happening in that space, but you've got to create anonymous channels and blameless retrospectives. I think that's important. And it's very hard in some organizations, because a lot of organizations are looking for the one throat to choke, the punitive who-gets-fired-today. And that might not be a great organization to work for. Maybe you should look on LinkedIn for another job if that's the case, because I

wouldn't want to work in an organization like that. Some organizations are unable to grow and develop and think and challenge themselves. There's a power dynamic and you're at the bottom of the food chain, unfortunately. So: sorry for you, I don't care what you have to say. Not great, right? But then you've got to create anonymous channels. You've got to create a way to get that message across. Like, I think Ian, Shack, and I were talking this morning about somebody trying to report a vulnerability in someone's e-commerce website in South Africa to the company, and they want to take legal action against him. So he's trying to do the right thing by disclosing, and they're like, "No, you're

the problem." That's not a mature result, right? And then you've got that sunk cost fallacy. You've got to look at it: even though we spent that money, if we cut our losses now and move on, we could actually save, or have a better outcome, or improve the risk posture. Highlight the opportunity of reallocating budget or ending a contract rather than just grinning and bearing it for three years. Not great, right? Same thing goes for bad girlfriends. And the other issue that comes in here: we've spoken a lot about the psychology and the bits and pieces, but if we hold a mirror to our own lives, a cyber team is a very, very

stressful place to be, right? Think about a SOC: alert fatigue. You're bombarded by information 24/7; it's just alert hell, especially with a badly tuned SOC or badly tuned devices. You're going to get thousands of alerts a day, and it's very easy to just stop paying attention. You're exhausted, because all those alerts need some sort of remediation and reporting; you can't just ignore them. So you're exhausted because there's just not enough time in the day for everything, and even weekends go to changes, moves, all that sort of stuff. People who are tired are 40% more likely to make mistakes. What do mistakes lead to? Misconfigurations. What do the misconfigurations lead to?

Cyber incidents. And what do cyber incidents do? They make you tired. It's a vicious cycle, right? So wellness in cyber teams is a real issue as well. You're always stressed. Think about yourselves: even while you're sitting here, half your mind is on what's happening. Did anything break this morning? You're thinking about the WhatsApp group; your phone beeps and you're on it, right? Your wife wishes you were that attentive, but it just doesn't work that way. It's chronic stress, and what happens? Chronic stress releases a hormone called cortisol into your body. Cortisol makes you gain weight, makes you crave junk food, triggers a whole bunch of emotional responses, gives you heart

disease. Reducing your cortisol levels is critical, because we're way too young for this kind of drama in our lives. So stress is definitely a terrible thing. And it sounds terrible, right? We all want to be loyal to our companies, loyal to our brands. But look at Microsoft: first chance they got, 5,000 people, bye; and then a few months later, another 5,000 people. So think about misplaced loyalty and misplaced stress as well. Be a little bit selfish with your time sometimes. And then there's low psychological safety. When I was at the bank: 600 cyber people against 45,000 staff;

your ratio of cyber people to staff is just off the charts, right? There's no way you're going to be able to create enough buffer to deal with that, and when anything went wrong, it came back to cyber: "cyber guys, that's what they do". I've been in board meetings and exco meetings where, when you start talking about cyber, everyone just points to the CISO and says, "That's his problem. Why are you telling me about this? Speak to him." And we all say cyber is everyone's problem. "No, it's that guy's problem. He gets paid for it." So do you think he's going to come to the table and say, "Guys, we've got a big cyber

problem"? No, because they'll fire him. "It's your problem. You can't do the job? Is that what you're telling me, you can't fix it? Then we need another CISO." So there's very low psychological safety, and you end up in this loop of stress, poor health, poor eating habits, no time to exercise, no balance. And then what happens? You burn out after two years: "I hate this company, I hate everything, I'm going to go do something else." This is why we have this kind of churn in cyber as well. So we've got to really think about how we deal with wellness in cyber, because it

doesn't make for long, healthy lives, right? Bringing it all together: where do we embed this? It's great that we now have some conscious awareness of it, but how do we bake it into a cyber programme? Ultimately it has to become programmatic so that you can change, right? Remember anchoring bias. First, you've got to think about the board from a governance perspective: how do you incorporate bias awareness around cyber risk and risk management at board and executive level? How do you tell them nicely, "Guys, you see the world through rose-tinted glasses, and that's not actually how the world works"? You know,

just because they got breached doesn't mean we should smile that it wasn't us; it just means we might be next. Or we might already be breached and not know it, also a nice thing to say, right? So bring that visibility in. Look at the environment objectively. And how do you do that? You conduct an assessment, you conduct an audit; you get a third party to come in, look at your environment, and give you a view, good or bad. Part of our business, from a Deloitte perspective, is audit, and I can't tell you how many times we have, let's say, arguments, because there's a

misalignment between what the report says and what the exec has presented to his boards and his audit and risk committees, and for him to go back and walk that narrative back, that everything was cool, is massively damaging for his personal brand. So it's not that he doesn't care that there's an access management problem; it's that he's told everyone for three years that there is no access management problem, and now "these stupid auditors are telling me something stupid and different, and the stupid damn report, and I hate them all". We see this all the time, because we went with our guts. That's the Dunning-Kruger effect: going with your gut when you

think you know something but you really don't. Then we've got to pair this with how we protect: we mentioned gamified phishing simulations, the carrot-and-stick kind of thing, so bring that in. On detection, user and entity behaviour analytics (UEBA) is really useful for looking at stress-driven anomalies. Something we also did when I was at the bank was try to predict fraud, and there were actually 20 psychological dimensions of a fraud event about to happen, about how people's personality and behaviour changed: they started contacting people they didn't contact before, they came to work earlier, they started looking at and paying attention to systems that weren't theirs, they tried

to understand what the thresholds around these things were, covertly asking questions. Those 20 dimensions actually went into our UEBA solution, and it was quite a good predictor of when somebody had been compromised by a syndicate, or was acting alone and about to commit fraud. So those are the kinds of things you can look at. From a response perspective: playbooks and automation, getting some of the grunt work out of what we've got to do. There is a place for gen AI, automation and agentic AI, and that's taking out the continuous, repetitive work so that

we can actually lift our heads out of the mud and do something more interesting. Then recovery is about bias awareness, blameless postmortems and so on. We want to learn. Everyone says "we're a learning organisation, we're a family", but how do you actively learn from incidents? Do you kick people out of your family, or do you say, "We'll do better next time, and this is how we're going to do it"? Those are the areas we can look at. And then, in conclusion: cybersecurity is not just about tech. A lot of people come in and ask, "What technologies do I

need to know?" I don't know; what risk are you trying to mitigate? It's really about the people in the system. The people in the system drive the system, and often I see that the people are treated as a byproduct of the system. We talk about the cybersecurity skills gap, but we've got the bystander effect, where we're all waiting for everyone else to solve the problem. How are we investing in talent, culture and people, and bringing in juniors, right? I'm getting ready to retire, I can't wait, but who's the next me that's going to come along and tell people about bias? So each of us can look at how, in our own lives, we are

bringing the humanity and the people into the system, and into the process as well, right? You cannot succeed in a cyber programme without addressing the human element. After desktop support, we are the second biggest user-facing piece of the organisation. Think about every piece of technology we have on the user's desktop: endpoint detection, privileged access management, all of those things. The user interacts with cyber all the time. If you create a high-friction environment because you didn't consider the human in the system, they're going to reject your tooling. You will find PAM passwords being saved as local variables on a server because the admins couldn't be bothered to use your PAM system.
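The friction point about hardcoded PAM passwords can be sketched in code. This is purely illustrative: the `get_secret` helper is a hypothetical stand-in for whatever PAM or vault client an organisation actually runs, and the point is only that the sanctioned path has to be at least as easy as the insecure one, or admins will take the insecure one.

```python
# Illustrative sketch only: the anti-pattern the speaker describes versus a
# low-friction alternative. get_secret() is a hypothetical stand-in, not a
# real PAM API; here it reads from the environment.

import os

# The anti-pattern: a password "saved as a local variable" on the server,
# which then ends up in backups, logs and version control.
HARDCODED_DB_PASSWORD = "hunter2"

def get_secret(name: str) -> str:
    """One call, no ceremony: a low-friction stand-in for a PAM/vault lookup.

    In practice this would wrap the organisation's actual PAM client; the
    environment variable is just a placeholder for that integration.
    """
    value = os.environ.get(name)
    if value is None:
        raise KeyError(f"secret {name!r} has not been provisioned")
    return value

# Provisioned by the platform at deploy time, not written into the code.
os.environ["DB_PASSWORD"] = "s3cret"
password = get_secret("DB_PASSWORD")
print(password)
```

The design point is behavioural, not technical: if fetching a secret is one boring function call, nobody bothers hardcoding it; if it takes a ticket and three approvals per lookup, the hardcoded variable wins every time.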
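The earlier detection idea, feeding behavioural dimensions into a UEBA tool to flag stress-driven anomalies, can also be sketched as a toy. Everything here is assumed for illustration: the two dimensions, the baselines and the z-score averaging are a minimal caricature of what a real UEBA product (or the bank's 20-dimension model) would do.

```python
# Toy behavioural anomaly scorer, loosely inspired by the talk's idea of
# tracking dimensions like "came to work earlier" and "contacted people they
# never contacted before". Features and numbers are invented for illustration.

from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class Baseline:
    """Per-user baseline for one behavioural dimension."""
    mu: float     # historical mean
    sigma: float  # historical standard deviation

def zscore(value: float, base: Baseline) -> float:
    """Distance of today's value from baseline, in standard deviations."""
    if base.sigma == 0:
        return 0.0
    return abs(value - base.mu) / base.sigma

def anomaly_score(today: dict, baselines: dict) -> float:
    """Average z-score across all tracked dimensions; higher = more unusual."""
    scores = [zscore(today[k], baselines[k]) for k in baselines]
    return sum(scores) / len(scores)

# Build baselines from ten days of (invented) history for two toy dimensions.
history = {
    "login_hour": [9, 9, 8, 9, 10, 9, 9, 8, 9, 9],
    "new_contacts_per_day": [1, 0, 2, 1, 0, 1, 1, 0, 1, 1],
}
baselines = {k: Baseline(mean(v), stdev(v)) for k, v in history.items()}

# A day matching the talk's warning signs: in much earlier than usual,
# suddenly contacting many people they never spoke to before.
suspicious_day = {"login_hour": 5, "new_contacts_per_day": 12}
normal_day = {"login_hour": 9, "new_contacts_per_day": 1}

print(anomaly_score(normal_day, baselines))      # small
print(anomaly_score(suspicious_day, baselines))  # much larger
```

A real deployment would use far more dimensions, per-user seasonality and proper models, but the shape is the same: baseline normal behaviour, then score deviation from it.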

You will find your SailPoint being bypassed because people won't onboard their applications. Those are the things that break organisations: not that we didn't buy the right tools, but that we couldn't get people to use them. If you merge some psychology with the enterprise cyber parts of it, you're probably going to improve your resilience a little bit. Try it on your wife tonight: don't say "calm down", say "I hear you, let's talk", and see what she does, right? Maybe bring some flowers and a bandage; you never know how it can go. So lean into the crazy sometimes. And cognitive bias is not

an embarrassing thing. We are all thinking, breathing, living machines; we all have biases around what we like and what we don't like. But be aware that we are biased in some way or another, and bring that to the table so that you can actually neutralise it. And observe the people around you the next time you're in your organisation: listen to what they're saying, how they say it, and where their mindset is, because chances are you'll pick up on some of the biases they bring in as well. And that's it for me. Thank you so much.

I think we've got two minutes for questions. Any questions? Any thoughts?
>> Yes.
>> Could you just go through those detection steps, the behaviour ones?
>> Oh, ja, for sure. Happy to have a discussion with you about that; I think there's a break now, so we can have that chat, because it's a bit of a longish discussion. Anybody have a question that needs a one-word answer? No? Cool. Guys, I'm around, we're at the Deloitte stand, so just come say hi and we can chat. Thanks a lot.