
All right. So, next up we have Ally from Net Scope. NetScout. Sorry, too many "Net" things in my mind. Close enough, right? Hi, folks. Before we start, we're going to do a quick draw based on some of the questions you've probably seen on the screen. Hopefully you have a good memory. Starting off: what year was the first BSides event ever held in St. John? Who said that? Next question: where was the St. John event held? Who said that? There you go, you get a prize.
Third question: what year was the first ever BSides event held anywhere?
Okay, second-to-last question: where was the first ever BSides event held? Yes, great answer, the gentleman over there. And the final question, an easy one I think: name another city where BSides has been held. Vegas. [Laughter]
That's it. All right, let's begin. Hello everyone, my name is Ali Zadisa. I'm a cybersecurity specialist at NetScout. We're the scouts of the net. The title of the presentation is "A Convenient Truth." It's kind of a joke, because the convenient truth is that the network is something everybody today relies on and needs in one way or another, and the network depends on packets. You're going to see that as a theme throughout my presentation. The packets themselves, whether you want to admit it or not, are the absolute source of truth. Packets do not lie and cannot be altered or modified. That's the bottom line of my presentation.

A lot of things have changed since I started working in cybersecurity. A lot of trends are affecting the market right now: cloud, ease of access, the global network, new laws in Quebec, Europe, and elsewhere, punitive measures. All of these things affect the effectiveness of cybersecurity programs and, by definition, put assets at risk. So how do you define risk, and how can you measure it in real time? It's fairly easy to sit down and, from a security perspective, declare your risk tolerance. But having clear, real-time indicators of whether you're deviating from that level is very hard to do, and if anybody tells you otherwise, they're not being honest with you. That is something a lot of people have struggled with. In my opinion, one of the reasons is that from the beginning we've used different and separate metrics to measure the effectiveness of cybersecurity versus operations, which is ironic, because security is there to support operations. We use different tools, different metrics, different teams who don't talk to each other, and that silo in itself creates opportunities for the bad guys to win. So the question I ask you is: what is the acceptable level of risk? Think about that as I go through my presentation.

The threats themselves have evolved as we have evolved. I used to work in a SOC, and when I started there was no such thing as cloud. Everything lived in the data center, and that made it easy: you have a perimeter, you defend it. But now you're everywhere, consuming everything from on-prem to cloud, and sometimes you discover new services you weren't even aware of. The threats have evolved along with operations and production, and they're persistent, depending on the sophistication of the bad actors. Sometimes they're nation states; sometimes they're your average kid. But the threats have evolved in such a way that the detection period takes a lot longer. The incursion happens well before detection. There are active reconnaissance attempts every single day. So yes, the attacks have evolved and they're more advanced, but our defensive strategy has not changed in response. At least based on every statistic I've looked at, and all the numbers we've all seen, it seems the bad guys are winning and the good guys are not. Some of the stats I've seen out of Canada, the US, and Europe put the success rate for advanced threats at 60% and above. And these are numbers that you can validate
through the Mandiant report, for example. The cybersecurity strategy has for a long time relied on things such as logs. They're great, they're convenient. The reason we chose logs back then is that they were less costly than packets, with less management overhead. But logs by themselves are not succeeding against these new threats. If logs were sufficient, your incident response team would not need to come in and start capturing packets, because you would already have the logs. So there's a need for packets, and yet the technology we've been using to implement our strategy for so long has been logs. The question I ask my team, and that you should ask yourself, is: has that strategy been working for you? Are you really getting your money's worth? In the cyber defense world, as you know, there's no shortage of good vendors and solutions. In my opinion, we should stop chasing the magic bullet, the one solution, the buzzword of the day, and focus on process, procedures, and overall visibility. Defense in depth is old, but it actually works. The challenge is knowing which layers of your defense in depth need more love or attention, how you go about finding that, and where you focus your energy. There are better technologies out there, and the security sector, especially around packet analysis and deep packet inspection, has evolved tremendously. Today we can, with ease, analyze packets in real time and create metadata on those packets, which can be used from security to NetOps to DevOps: different contexts, different use cases. A lot of benefits are available today that were not available before. One other thing has changed: these new, easy-to-consume features hide the complexity behind them. It's easy to access a service, but that creates a bigger attack surface, and this complexity creates more dependency on cybersecurity to maintain situational awareness in real time. It's not enough to say, "I have this IP and I see it's talking to, I don't know, Emotet-level malware." What is the impact behind that malware? Which servers are involved? What are the use cases of those servers? What is the context of that service within your environment? Are they transactional? Are they test systems? What do you come away with? Because if you cannot have that information in real time and superpose security on top of operations, you're not focusing on the things that matter. You're not focusing on your crown jewels. That is what's missing, and that is the security gap we have in the market today.
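To make "analyzing packets in real time and creating metadata" concrete, here is a minimal toy sketch of my own (not any vendor's product; the field names are assumptions) that parses the L3/L4 headers of a raw Ethernet frame into flow metadata of the kind a DPI engine would emit:

```python
import struct

def packet_metadata(frame: bytes) -> dict:
    """Extract L3/L4 metadata from a raw Ethernet frame (IPv4/TCP only).
    A toy stand-in for what a real DPI engine does at line rate."""
    eth_type = struct.unpack("!H", frame[12:14])[0]
    if eth_type != 0x0800:                      # not IPv4: skip
        return {}
    ip = frame[14:]
    ihl = (ip[0] & 0x0F) * 4                    # IPv4 header length in bytes
    proto = ip[9]
    src = ".".join(str(b) for b in ip[12:16])
    dst = ".".join(str(b) for b in ip[16:20])
    meta = {"src": src, "dst": dst, "proto": proto}
    if proto == 6:                              # TCP: add ports
        sport, dport = struct.unpack("!HH", ip[ihl:ihl + 4])
        meta.update(sport=sport, dport=dport)
    return meta

# Build a synthetic frame: Ethernet header + IPv4 header + TCP header
eth = b"\x00" * 12 + b"\x08\x00"
ip_hdr = bytes([0x45, 0, 0, 40, 0, 0, 0, 0, 64, 6, 0, 0,
                10, 0, 0, 1,          # src 10.0.0.1
                10, 0, 0, 2])         # dst 10.0.0.2
tcp_hdr = struct.pack("!HH", 44123, 443) + b"\x00" * 16
print(packet_metadata(eth + ip_hdr + tcp_hdr))
# → {'src': '10.0.0.1', 'dst': '10.0.0.2', 'proto': 6, 'sport': 44123, 'dport': 443}
```

Real DPI goes much further (application classification, session reassembly), but even this thin slice of metadata is enough to answer "who talked to whom, on which port, when" across every team that needs it.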
So, going back to the question: which layers of your defense in depth need more love? My answer to you is network packets, something you already have. The enterprise networks of today have evolved tremendously, from SD-WAN to direct cloud access, and it seems there's always some kind of network project underway. The network, for me, is a foundational element, and this ever-changing entity creates complexity and blind spots. If you ask your average customer or security architect, "Do you know how everything is connected? Do you truly know, in real time, the dependencies between those services, the user, the session, the middleware?", they can't answer. It's an unfair question to ask, because we don't have the right data set to give those honest answers in real time. Imagine if you could see, in real time, your dependencies on your cloud provider's availability zone. If you could see that it's becoming saturated and you have to migrate to a different zone, that's a real-time risk to your cybersecurity strategy, and we often neglect that. We consume services such as UC and collaboration, but those are targets too, and most vendors don't focus on UC, as if it were a thing apart. It's actually a thing. You need proper visibility, and that can come only by leveraging the packets you already have in your environments. Why packets? Because they give you comprehensive visibility. What do I mean by that? To put things in context: traditionally the approach was threat-based. Everybody, as I said, was using logs, and that approach has not been working, based on the number of incident responses companies are doing and the success of the bad actors. Which means, by default, that you have to change your strategy from a threat-based approach to a risk-based approach, and to be aware of all the risks, you have to leverage every single piece of data you already have in your environment. Packets, being the absolute source of truth as I mentioned, cannot be modified, altered, or erased the way logs can, and they give you that
context. Using deep packet inspection, or DPI, in real time gives you a multitude of benefits for situational awareness, for putting things in context. More importantly, it allows you to eliminate the silos that exist between teams, because the benefit you get from the network can be layered, like the layers of a cake, on top of security and DevOps. So, with regard to when to act: the time is now. Ask yourself, are you truly seeing what you need to see, based on everything you have? Is the data you're acting on sufficient? Are you using the right data set? Obviously, you have a lot of solutions and vendors in place, but are you seeing the proper context between LDAP, DNS, DHCP, the session, and the user? Do you know what's normal in real time for that specific user and that specific application, versus abnormal, so that you can say, "I'm seeing jitter, or a different packet entropy, for that application; that's not normal, let's investigate"? From a network perspective that could be just performance or availability, but from a security perspective, as you may all be aware, a sudden change in packet entropy or jitter for an application can be a clear indicator of compromise.
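To make the "packet entropy" signal concrete, here is a small self-contained sketch of my own (an illustration, not a product feature) that computes Shannon entropy in bits per byte; a flow whose payload entropy suddenly jumps toward 8 may have switched to encrypted, packed, or exfiltrated content:

```python
import math
from collections import Counter

def shannon_entropy(payload: bytes) -> float:
    """Shannon entropy in bits per byte: ~0 for repetitive data,
    approaching 8.0 for encrypted or compressed data."""
    if not payload:
        return 0.0
    counts = Counter(payload)
    n = len(payload)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

plaintext = b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n"
random_ish = bytes(range(256)) * 4   # uniform byte distribution

print(round(shannon_entropy(plaintext), 2))   # low: typical of text protocols
print(round(shannon_entropy(random_ish), 2))  # prints 8.0
```

In practice you would baseline entropy per application and per flow, then alert on deviations from that baseline rather than on any absolute threshold.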
It's only by superposing these different teams and silos through packets, leveraging the same data set from different contexts, that you can put things in perspective. And by doing so, you'll focus on the things that matter first, because you understand what normal traffic is for a specific user and application in the context of the environment: a transactional database, and so on. Your average analyst, when they see that ticket and that IP, can say, "This is important, I'm going to start with this," instead of reactively starting a packet capture and only then understanding the consequences. This also gives you other benefits. By leveraging packets in real time and keeping the metadata, you'll always have the right information regarding reconnaissance, because you see everything at the session level. When was the first TCP connection attempt: three months ago, two months ago? Who was targeted? Where is it coming from? Do you have the capability of doing contact tracing? It's not enough to say, "I have a malicious IP that was talking to such-and-such." Who was patient zero? What level of credentials did they have? Who were they talking to six months ago? Can you go back in time and apply IOCs retroactively? If tomorrow there's a new version of Log4j, can you truly say "this was here six weeks ago too," or do you have no means of going back? By leveraging packets in real time and creating metadata through DPI, as we can do today, you get all those side benefits, which saves a lot of time in day-to-day security operations: streamlining, eliminating silos, but also incident response and investigations. And this allows you to get ahead of threats.
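The retroactive IOC sweep described above can be sketched as a simple query over stored session metadata. This is my own minimal illustration using SQLite; the `flows` table and its columns are assumptions about what a DPI pipeline might persist per flow, not any specific product's schema:

```python
import sqlite3

# Toy session-metadata store: what a DPI pipeline might persist per flow.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE flows (
    ts TEXT, src TEXT, dst TEXT, dport INTEGER, ja3 TEXT)""")
db.executemany("INSERT INTO flows VALUES (?,?,?,?,?)", [
    ("2024-01-05T10:00", "10.0.0.7",  "203.0.113.9",  443, "abc123"),
    ("2024-02-20T09:30", "10.0.0.42", "198.51.100.4", 443, "def456"),
    ("2024-03-01T14:10", "10.0.0.7",  "203.0.113.9",  443, "abc123"),
])

def retro_ioc_sweep(db, bad_dst):
    """Apply a newly published IOC (malicious destination) to history:
    who talked to it, and when did the contact start?"""
    return db.execute(
        "SELECT ts, src FROM flows WHERE dst = ? ORDER BY ts", (bad_dst,)
    ).fetchall()

hits = retro_ioc_sweep(db, "203.0.113.9")
print(hits[0])   # earliest contact: the patient-zero candidate
# → ('2024-01-05T10:00', '10.0.0.7')
```

The point is that the question "was this IOC here six weeks ago?" becomes a lookup over data you already retained, instead of a capture you can no longer take.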
So what are the final benefits of using DPI? It's scalable from an investment perspective: the benefit you get from network troubleshooting and availability can be layered with cybersecurity and DevOps use cases. You eliminate the silos. You know where the dependencies are in real time, and if you see changes, you can act on them. So it's very good from a TCO perspective. The data set you get is much richer: the full session, the conversation, DHCP, DNS, and all those sexy buzzwords. This gives you more data, and, again, the right data, not just more tools. It allows better integration within your security stack to put things in context. By using packets in real time, you can effectively measure whether you're deviating from your acceptable level of risk. So if a CISO asks you, "Where are we, risk-wise?", you have proper metrics to give those answers, and you can put things in context. Whether you're running a new service or an old one, you have comprehensive visibility and proper answers about what normal, healthy behavior is for that user, session, and application. So when you see changes that are not normal, you can't just put them aside and say it's just a performance issue; it could be security. A lot of the time, advanced threats stay hidden because we say it's performance, it's availability, but when you dig deep enough you realize it's a security issue that was just hidden well. We have to use the tools we already have, from a different perspective. That's it, folks. Thank you so much for your time.