
Hi, so I'm Martin Sols, and I'm going to be talking about what IT security professionals can learn from safety literature. I've personally been doing IT security for about twenty-something years, and nowadays I work as a researcher at the Ethereum Foundation.

So what is this talk going to be about? Let me try to make this slide a little bigger so I can see it myself as well. Basically, I think safety engineering has a lot to teach us IT security professionals, and I hope that by highlighting some of the commonalities with what safety science has to offer, we can become better IT security professionals. This talk is not by any means comprehensive; the safety literature is immense, and clearly I cannot squeeze it down into a 30-minute talk. But this is meant to pique your interest. I'll have plenty of references at the end, so you can have a look at whatever you're interested in and try to bring some of these ideas into your practice as an IT security engineer.

I would also note that, obviously, security is not the same as safety. When you're in a mine, there's nobody trying to release methane gas on purpose and light it on fire; that's just what happens in a mine. But at the same time, there are miners who will turn off the methane gas detectors because they annoy the hell out of them. There's going to be management who set production targets that cannot possibly be met, given the tick-box exercise of safety that they mandate without any inventive idea of how to actually meet the safety goals those tick boxes are supposed to serve. And there are going to be people who've been doing this for a while, feel uncomfortable, and leave the mine, just stop working there, because they don't feel comfortable doing it. In some sense, I think we IT security professionals have all been in this mine, right?
We've all seen these issues in our daily work, and we feel, hmm, is there something we can learn from these accidents and issues that have happened in safety?

I also want to talk a little bit about what this is not. This talk is not going to be some kind of prescriptive thing that you're supposed to follow as mandatory. This is a bunch of ideas that other people have thought about and put quite a lot of effort into, for you to think about and see how you can incorporate some of them into your daily practice. You need to find your own way; there's no prescribed way to do this. And I think, as IT security professionals, you know this inherently: when you go to a new company, or a new consultancy gig, you have to understand their context and where they're coming from, and incorporate your knowledge and know-how into their processes so you can help them the best you can. But I think it's important to realize that a lot of time, effort, and money has gone into safety science, and that, unfortunately, plenty of people died, which eventually resulted in massive amounts of work to try to understand why, and how we can prevent it from happening again.

So I want to start with something called down-and-in versus up-and-out, and I'll begin with a scenario that I think we're all familiar with, or let's hope we can all try to be familiar with. Let's say there's an attacker that got into your company with some kind of phishing attack, and they pivoted around in your infrastructure. This was noticed because somebody was just looking around their AWS account, let's say, and they saw something suspicious that didn't feel right, and then they notified
the people they knew best, which was the sysadmin team: hey, something is off, is this your account? And the sysadmin team was like, oh wait a second, this is definitely not okay, and they stopped the attacker in their tracks. And there was no real leak of information that you would need to report. Okay, so this is a scenario that I think we can all agree can happen in a large enterprise.

Now let's look at the classic interpretation you would get within an enterprise. Well, the user clicked the bad email, right? Therefore the user needs to be trained; I think we've all heard this explanation. We lacked preventative measures against pivoting, so let's add the preventative measures we lacked, since the attacker could pivot around the infrastructure. We lacked detective measures, because clearly the SOC didn't see this happening, somebody else did, so let's add those detective measures that were somehow missing, because when the system was conceived, security wasn't even asked whether they had any requirements that needed to be part of the requirements analysis. And the sysadmin team had to go in and manually fix things, so there were clearly no reactive measures; we were lacking reactive controls.

This is the kind of classical explanation you'd see at almost all IT security places I've worked at. They go through this interpretation and say, that's what happened, that's the thing, right? In safety literature this would be called down-and-in, and I'll talk a little bit about what I mean by up-and-out. There are two ends of this instrument. There's the sharp end: the people who are actually writing the code, the people and systems that are at the end of what happened, right?
So the user who clicked the email is clearly at the sharp end; there's the admin team that reacted, and there's the system that was compromised. These are all very close to the action; we all understand that they are intimately close to what actually took place in the scenario I just described.

But there's also the blunt end, and I think very often we forget the blunt end and forget to ask questions about it. If you have a look at the safety literature, they talk about the blunt end basically non-stop, and if you start talking about the sharp end they look at you like you've lost your mind, because it's just not interesting to them. They understand that if you fix what's at the blunt end, the stuff at the sharp end will not only be fixed, but all the other issues will be fixed as well.

So what is at the blunt end? This is what they call the distal causes, the things that are further away from the direct thing that actually happened. One is the lack of priority for the detective measures. There are, let's say, 50 people in your security organization; this is a large organization. What were they doing? They were doing something else, right? They weren't just sitting at their desks twiddling their thumbs, so they must have been doing something else, and they must know that detective measures are important. So what else was important? And more importantly, if the other thing was more important, who made that decision, and why was it made? You shouldn't try to blame the person who made the decision; instead, we should all sit down together and decide how we make decisions about priorities, because you haven't actually thought about the way you make that decision, given that you
clearly arrived at something that wasn't what you really wanted, right? So maybe the people who make these kinds of decisions need to do that differently. It could be various other things, but maybe there's something to be done there.

Now, let's say there's this system that was built without the involvement of IT security. They built this system and somehow forgot to talk to IT security about the requirements during the requirements analysis phase, so the preventative measures were never included. Why? Why did this happen? Why was IT security never informed? And I don't mean going there with a baton and hitting people over the head; you should be genuinely curious about why they were completely unaware that we exist. Do they know we exist? Do they know our email addresses? Do they know that for every new system they're building that is anything major, they probably should be including these kinds of things? And if they don't, then maybe that says something about your communication towards the enterprise itself, because you're part of this enterprise and you haven't communicated that you exist.

We can talk the same way about the phishing email. This phishing email was likely observed by other employees; why didn't they notify us? Do we actually have good rapport with people? Are they afraid to send us this email? I have seen organizations where people are actually afraid to report security incidents, because then the security personnel come down on them and it's a huge pain, so they say, let's just keep it internal, let's not talk to IT security. But if that's happening, then maybe your attitude towards what's happening around you is the problem, and you can try to fix that,
right? And I think you all understand: if we get that thing right, the question of the user clicking the wrong email becomes almost a secondary one. Of course it's interesting and I do want to understand it, but the point is, you start here and you end up there; the problem is that people start here and end here.

Okay, so the next thing is: who do we blame? Of course, if you blame the thing at the sharp end, what will you get? Well, you will get trainings, you will get some detective systems, and you'll get some preventative measures, and that's more or less what probably happens right now in most enterprises. Notice that it's always easy to blame the things at the sharp end, because they're obviously connected to what just happened; they were obviously there, the person obviously clicked the email. But it's not that obvious to ask: okay, who else got this email, and why didn't they notify us? That's a more complicated question, both to ask and to answer.

So instead, what other things can you do? Well, as I just explained, you can try to understand why security was not involved. Or you can notice that something really interesting happened here, something we sort of forgot halfway through this scenario discussion: somebody did something that was clearly not their job role. It was not part of their policy or guideline or ways of working or whatever; they simply noticed something that was not right. So there was some kind of positive slack in the system, some kind of positive control, that actually helped you get out of this sticky situation. And if you think about it, for the sysadmin team, that's not their role, it's not the thing they normally do, but they actually did the right thing. So again, there was some kind of
positive slack in the system that worked in your favor to get rid of this problem. And actually, if you have a look at all the things on the left-hand side, it's all negative slack: you look at the thing and you find the hole, you find something that was missing, something that just wasn't there even though you really hoped it would be. But on the right-hand side, if you look at the things that actually did go right, those were the positive slack in the system. You never trained the sysadmin team to do this; you never trained this person to look around for things that weren't right; that wasn't the kind of stuff they were expecting, yet they did the right thing. Now, this scenario isn't something that literally happened; it's more an amalgam of things I've seen and read about. But it's quite classical that the things that do go right are the positive slack in your system, and they are the ones that will save you, not the holes you've been plugging all day long.

So there is positive slack and negative slack in the system. On the left-hand side you see the things that are missing, the holes you want to plug, the things you need to fix because they're broken, and all you're looking at are the things that are broken, instead of looking at the things that did go right, which are actually kind of interesting and which you could do even better. For example, you could try war-gaming with the sysadmin team: clearly they are quite capable of doing the thing you wanted them to do, so maybe you can train them to do it better. Maybe they can be part of the team, and we
together can try to make this enterprise more secure. And this person who noticed something unusual, maybe you can try to recruit them into the IT security team. Maybe they could be the IT security champion within their department, and help project the knowledge and know-how that you have into that department, so that the next time they build something, they won't be missing the preventative measures.

Okay, so we're going to jump topics a little bit here. I think it's also important to recognize that security can be looked at, at least, as a hierarchical control system. In a normal control system you have a control loop: say you want to heat, I don't know, 100 liters of water, so you put some heating into it and you measure the temperature. Eventually it gets too hot and you turn off the heating, and then it gets too cold and you turn the heating on again. That's a typical control loop for heating water.

Now, IT security has a similar structure. You can look at IT security from this perspective and say: there's the public, who are unhappy about certain things, and they're going to pressure the lawmakers to make laws. So the lawmakers make laws, the regulators try to enforce these laws on the company, and the company then tries to enforce them through their managerial controls. Management eventually pushes this control down to, often, you, the IT security professional, who will write policies and guidelines and ways of working and all
that sort of stuff, and eventually the person writing the code at the sharp end will try to keep in mind that the things they write need to meet all those obligations up the hierarchy, right? And then there's a reporting line that goes all the way back up: we monitor the people writing the code, we write risk reports and compliance reports, management is happy about them, the company reports back to the regulators, and the regulator reports back to the government about how we managed to implement the laws that have been set by the public. It's this kind of loop. The question is not whether this exists; I think you all understand that it exists, and if you've ever had the chance to talk with a regulator, you know that it exists.

I think it's also worth thinking about where you are on this chart, because you are somewhere in this control loop, and it might be interesting to consider where you are and where you want to be. Often an IT security professional will be in multiple places within this loop. You're of course part of the public, so you can influence public opinion, and some people, you might say, have more sway in pushing public opinion in some direction than others. But you're also writing code; you're also frontline personnel in some ways. You're doing things that get executed, potentially even live, and that affect the overall security of the system directly. And of course you have indirect control in other ways: you write risk reports that get propagated up and might eventually even reach the regulator, and then of course the government. So
it's just interesting to think about, and something to keep in mind: this is one way of looking at security, as a control loop.

Okay, so now I want to talk about something I think is quite interesting, and you might see some similarities: how does frontline work get adapted in the centralized mode of safety? The centralized mode of safety is something I'm sure you have seen; it's basically the centralized mode of security. We all write policies and guidelines and such, and we beat people over the head so that they will meet those guidelines, right? So what happens? There are going to be these guidelines and plans and notes and requirements that people are supposed to follow as part of their work; this is the plan we set forth for them. And then these people actually need to get some work done, and clearly the policies and guidelines were never consulted with them, so they need to smooth over some of the rough edges. They will do what's called fluency; within the safety literature this is called fluency, an activity that smooths over these kinds of contradictions. They're supposed to meet the policy requirements, but clearly they also have work to do, and the work doesn't quite seem to fit with what is set forth in the policy, so they try to smooth it over. And if you look at it from the outside, it sort of looks smooth; they seem to have figured it out. What happens is that as they keep smoothing over these contradictory requirements, they start discounting the stuff they've smoothed over. What they'll say is: yes, we know it's not exactly according to policy, but it's kind of not
too far away, and th
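The water-heating control loop mentioned earlier can be sketched in a few lines of code. This is a minimal illustrative sketch, not anything from the talk; the function name, thresholds, and physical constants are all invented for the example.

```python
# Minimal sketch of the on/off ("bang-bang") control loop from the
# water-heating example: measure the temperature, turn the heater on
# when it's too cold, off when it's too hot. All numbers are invented.

def simulate(setpoint=60.0, band=2.0, start=20.0, steps=500):
    temp = start
    heater_on = False
    history = []
    for _ in range(steps):
        # the controller: compare the measurement against thresholds
        if temp < setpoint - band:
            heater_on = True
        elif temp > setpoint + band:
            heater_on = False
        # a crude model of the plant: the heater adds heat while the
        # environment slowly cools the water toward 20 degrees
        temp += (1.5 if heater_on else 0.0) - 0.02 * (temp - 20.0)
        history.append(temp)
    return history

tail = simulate()[-100:]
print(min(tail), max(tail))  # oscillates around the 60-degree setpoint
```

The analogy in the talk maps this shape onto the hierarchy: the public sets the setpoint, regulators and management act as nested controllers, and risk and compliance reports are the measurements flowing back up the loop.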