
Cybersecurity, The Eras Tour

BSides Lisbon · 2025 · 1:00:08 · 155 views · Published 2025-12
Style: Keynote
About this talk
Jen Ellis works to reduce cyber risk for society. She partners with security experts, technology providers and operators, civil society, and governments to create greater understanding of cybersecurity challenges and strategies. Jen promotes better collaboration among these communities, more effective cybersecurity advocacy, and broader adoption of security best practices. She previously worked for the cybersecurity firm Rapid7 for 11 years, building the company's security research, advocacy, and community engagement functions, before founding her own company, NextJenSecurity. Jen serves on the UK Government Cyber Advisory Board and various UK government working groups. She is an associate fellow of the Royal United Services Institute (RUSI), co-chair of the Ransomware Task Force, co-host of the Distilling Cyber Policy podcast, and sits on various nonprofit and advisory boards. She has testified before the U.S. Congress and spoken at numerous security and business conferences.
Transcript [en]

Bom dia. This is the bit I'm most nervous about, by the way.

and I am now going to finish destroying the Portuguese language. Beautiful language, though. It is. Thank you. For those who are like me and obviously don't speak Portuguese, I basically just said, "Thank you for having me. It's very nice to be here." Okay. So, I was excited when I got asked to come and speak here. And I was even more excited when I heard what this year's theme was going to be, because this is a thing I think about a lot: the state of where we've come from, what's happening right now, how we move forward from here. And we do see a lot of challenges, a lot of gloom. So I

thought, okay, this is good. This is aligned. We can speak about this. But then I got distracted by shiny things, like crows do. And I thought about the fact that this is the 12th edition of BSides Lisbon, which also deserves a round of applause, because that's a really amazing thing. This is fantastic. And you're going to have to work with me here a little bit. I started thinking that 12 years is like an era. Let's pretend 12 years is genuinely an era. And I thought, what better way to talk about the journey we're on than to have a cyber security eras talk. So that's what we're going to do. We're going to pretend we're Taylor, and

we're going to talk about the eras that we've been through, and we're not going to focus too much on the idea that 12 years isn't really an era. It's fine. As an aside, don't worry: I won't be singing, I won't be dancing, and I won't be wearing a sparkly leotard. A quick disclaimer: I am not SwiftOnSecurity. Unless you hate the talk, in which case I'm absolutely SwiftOnSecurity, and you should send all feedback to SwiftOnSecurity. But otherwise, no, not SwiftOnSecurity. So, let's talk about where we've come from. To me, the story of human evolution is really the story of our ability to innovate, to adopt technology, and the way that that

technology then shapes our culture and helps us move forward. If we look through history, we think about ages in terms of our ability to better develop tools and to use those tools in different ways. When we talk about the Stone Age, the Bronze Age, the Iron Age, often what we're specifically talking about is the way those tools have been used to aid in our defense or in our attack. And I believe the same is true of the information age, which is what I'm going to focus on today. So here it is. This is the information age in all its glory. I picked all the most important things, like when Nyan Cat came about. And as we all know,

hopefully there are actually some Taylor Swift fans in the room, because if not, this talk is a disaster. As fans of Taylor Swift know, Taylor's version is always better. So, here's Taylor's version of our timeline, with her eras and a little bit more of a cyber security flavor. It's a bit of an eye chart, so we're not going to go through it here, because we're going to start with our eras. The first era we're going to talk about is Fearless. I like to call this the age of cyber exploration. So in the beginning there was the Cold War, and we had a situation where the democratic allies of the West were very worried

about what would happen if we had a catastrophic event, for example nuclear war. They were worried about the ability, or inability, to rely on circuit-switched telecommunications as a way of keeping in touch. And they were worried that we would all end up completely disjointed, disconnected, isolated, unable to communicate. So in the early 60s, a guy called Paul Baran at the RAND Corporation started researching systems that could sustain operation through partial destruction. In '66, a guy called Bob Taylor, who was working at the Advanced Research Projects Agency, ARPA as we call it, for the US government, started a project to look at how resources could be shared

between remote computers. He then recruited a bunch of people to work on this project: Larry Roberts, Donald Davies, Paul Baran, the guy I was talking about earlier. And the design for ARPANET was created, and this, as we know, is effectively the roots, the foundation, of the internet. So of course, once ARPANET was created, security was not far behind. And I do think that's the role security often plays: we see innovation and technology drive forward, and often it does so to answer risk, to answer opportunity, to answer threat. But in doing so, it potentially creates new threat or new opportunity or new risk, and we sit at that intersection trying

to help people manage that. It's a really important role. It's a very evergreen role, in my view. So in the 70s, another Bob, Bob Thomas, created a program called Creeper, and basically it moved across ARPANET and left a sort of breadcrumb trail. It's not really fair to call it the first piece of malware, because it wasn't maliciously intended; it was a thought experiment. But in response, a guy called Ray Tomlinson, the inventor of email, created Reaper in 1972. And Reaper would basically follow Creeper around and eat the breadcrumbs. And that is the first example we have of antivirus software. So of course, once we had that, commercialization followed, and more

antivirus. So the first two major cyber security companies were founded in 1982, which makes them a generation old now, right? We are the second generation in that regard. And where they led, many other antivirus companies followed, and the culture around the idea of security, and of the internet, and of this whole other dimension of virtualization, grew. We saw this sort of explosion in cyberpunk media, and we saw communities forming online. We saw hacker collectives start to form, sharing information, sharing tools. Of course, some of them didn't like each other very much, and we saw rivalries forming, and the form those rivalries took was to try and outdo each other with their hacks, right? Who could be the most

elite? And that meant we started to see widespread attacks against public systems. Taylor would say that we're happy, free, confused, and lonely at the same time. I told Joe I was going to sing. And it's miserable and magical. Oh yeah. So, as a result, we started to see arrests. In '86, Lloyd Blankenship, better known as The Mentor, who was part of the Legion of Doom, had been arrested for his activities, and he wrote a paper called "The Conscience of a Hacker," which is better known as the Hacker Manifesto. And it gets a shout-out in Hackers, which I love. I get very excited every time I see it. For me,

the number one thing in this is: yes, I am a criminal. My crime is that of curiosity. And I know so many people in the security industry for whom that statement, still to this day, 40 years on, represents their ethos. And I love that. This is also the first example of a major piece of writing that puts forward cyber security as a public good, that puts hacking forward as a public good. And that's a battle we're still fighting today. It's a conversation we're still trying to have. And part of the reason we're having that conversation is that in 1983, President Reagan was at Camp David and he watched the movie WarGames

and he called his aides and said, "I just watched this movie. Is this a real threat?" And the aides said, "Well, Mr. President, I'll go and find out." So they went off, and they came back, and they said, "We've spoken to the Joint Chiefs, sir, and they confirmed this is a real threat." And Reagan said, "Well, we need something to solve that. We need something to address that." And as a result, the first major international piece of anti-hacking legislation was born. In 1986, the Computer Fraud and Abuse Act, which is the US anti-hacking law and the blueprint for anti-hacking laws around the world, was created. And it was because the president watched WarGames.

And to me, this is kind of the end of the era of fearlessness, of just blanket exploration, because then the lawmakers came in and set down some rules. So we'll move on to our second era: Speak Now. I like to think of this as the infosec community developing a voice. So even though the law had come in, there were some people who were still willing to walk on the wild side, and sometimes it was for good purpose. So the AIDS Trojan, as you guys probably already know, is the first cited piece of ransomware. It was developed by a guy called Dr. Joseph Popp. And basically it required people to send him $189.

And what he did was give that money towards AIDS research, because he said it wasn't being funded enough. So in his case, his ransomware act was, in his mind, an act of public good. Of course, not everyone is so altruistic. There were a bunch of other things we saw in this same era that were much more self-serving, and they created a blueprint for how the industry would move forward. Of course, we continued to see representation in media. In fact, it got more mainstream. But at this time, hackers were really being viewed as somewhat subversive, nerdy, on the edges, the fringes of society, not mainstream, right? Not everybody we know. It's okay. Haters gonna hate, hate, hate,

hate, hate, and we're just going to shake it off. And we did. We didn't care. We were a booming industry, ready to go. So we started some conferences; we started meeting publicly. RSA was founded. DEF CON was founded. Obviously these conferences still continue today and are absolutely massive. And we started to speak out. We started to gain more of a voice and become more public. In 1998, the L0pht, which had started out as a hacker collective and then turned into a fledgling business, went and testified to Congress. Just a quick show of hands: how many people in the room are aware of this, or remember it happening at the time? It's hard for me

to see, but I do see some hands going up. Okay, great. So, I'm very embarrassingly going to age myself now. I was 17 at the time this happened, and to be honest with you, I had no interest whatsoever in security. But I liked the internet. The internet was cool. And I can remember sitting watching the evening news with my parents and seeing this story and the headline, this thing that Mudge had said: that he could take the internet down in 30 minutes or less. And I remember at the time being like, but the internet, it's everything. It's the future. I was completely blown away by the idea that somebody could take it

down. And I think that, to me, is the moment security started to go mainstream. So let's move on to our next era: Reputation. I think of this era as cyber security going mainstream, not surprisingly, as indicated. So by 2005 we have a billion users on the internet, and where the users go, industry follows. So businesses are starting to adopt the internet. People are getting websites. They're getting email. They're starting to look at other ways of doing things. We're starting to see cloud computing emerging as a capability. And of course, security started to mature. It started to expand. It started to develop specialisms. It started to get more complex. People started to make a lot of

money. This is the era of the IPO. And yes, if you look closely, I am in one of those photos. And so we made it rain, but we also saw massive consolidation during this period. This picture shows 200 different companies consolidating down to 11. But the great thing about security is that we're constantly seeing startups. We're constantly seeing innovation. We see a lot of regional small companies. So it's a very vibrant, mixed environment. It isn't just these 11 companies. And I think that's a really good, healthy thing for our industry. By the end of this era, the value of the cyber security industry was estimated at about $150 billion. By now, it's estimated to be closer to

$200 billion. And the projections forward are pretty impressive. Of course, media representation of us had got sexier. And you know what's sexy? Activism. That was on the rise. We heard a lot about activism. You know what else is sexy? Lamborghinis. So we heard a lot about organized crime groups and their big medallions and their fast cars, and the fact that they operate as businesses. We heard about the fact that they had developed marketplaces, complex marketplaces, just as complex as our own marketplace, with just as many specialisms. And of course, the reason they're able to do this, the reason they're able to thrive, is the safe havens

that they often operate in, the unofficial relationships they have with certain governments that give them free passes to operate against nation states they may see as adversarial in some way. But the nation states didn't stop there. This is the age when we started to hear about, and started to track, APTs, and we saw a huge amount of activity from them. In fact, we saw two of the biggest compromises of all time, catastrophic compromises. WannaCry closed 80 hospitals across the UK. I think that's all our hospitals. We're only a small country. It's not, really. And obviously NotPetya is still the most expensive cyber attack of all time, although JLR is trying to

give it a run for its money at the moment. And so this was a really interesting time, right? And of course, as we saw attacks against hospitals, we started to understand that the way we've traditionally thought about risk in security, around data protection, around the CIA triad, may not be all the risk we should think about. We now know that people have lost their lives because of cyber attacks against hospitals. But at this time we were just starting to register that there's software that runs cars, that the software in your car controls the brakes and the steering, that there's software that controls planes. We started to think about this much more, and think about the

concept of harm. And I remember very clearly, I was working with the US's Food and Drug Administration. To me, they are the regulator that is furthest ahead in cyber security, and I'm a big fan of theirs. They were working at the time on pre- and post-market guidance for connected medical devices, right? And the medical device industry did not want to know about security at this time, at all. They used the process of going through approvals with the FDA as an excuse, right, for why they couldn't patch things. And the FDA were not having any of it. They were like, "Nope, that's not the case. We're going to change all of this." So, they got a

group of experts together, and somehow I ended up in the room, at the kids' table with the crayons, and they started talking about what the guidance should look like and how it should be run. Very quickly it became clear that there was a disconnect in the room, because when the security people talked about harm, we traditionally were talking about loss of data. We'd gone into the room thinking, yeah, healthcare data is valuable, it's important. When they talk about harm, they're talking about loss of life. It's a huge delta to have in your conversation. A huge disconnect to have, and a really great reminder to me that when you are talking

to people, you do need to make sure that you're on the same page about what you're talking about, what the dimensions are. But in any case, at this time we started to see people focusing more on other areas where there was an intersection between virtual and physical. And just a quick show of hands: who was here two or three years ago when Josh Corman presented? So Josh likes to say "where bits and bytes meet flesh and blood." Of course, this is also the time when we start to see a lot of stunt hacking as a way of highlighting some of these issues, and also as a way of getting, you know, big headlines and main-stage talks at

DEF CON. This is the era of named vulnerabilities, which you may hate, but everybody remembers Heartbleed. And because of all of these things happening, this is the era when policy makers started to really pay attention. And I have to be honest, this is a photo of, I hope anyway, Portugal's Assembly. I have no idea how engaged the Portuguese policy makers are. You guys will tell me. I hope that they're engaged, and I hope it goes in the right direction. After seeing the intro video, I'm less confident. But if you guys have information on that, I'd love to hear about it during the rest of the day. Come talk to me. I'm very

interested. So, policy makers started to get more engaged. We saw a lot in the US, a lot across the EU. The UK is very engaged. And as a result, we started to see a lot more of the community responding and getting involved. There is still so much room for more people to have a voice. And it is so important for policy makers to hear from those who work on the front lines of these issues. So if you are interested in this, do think about getting involved. It really matters. We also got better at sharing information. This is the age of security research reports, right? So many research reports come out, and it's constant. And

I have a bit of a ranty, soapboxy moment around the fact that we don't have data in security. We have pockets of data. We just don't have good, joined-up, overall data. But we've got so much better at sharing information and sharing the data we do have, which gives us a really important starting point. And of course, another really important thing that happened during this era is that BSides Lisbon started. And you guys got (I've already slaughtered the Portuguese language enough, so I will not say the name) your own cyber security agency. So you could say that we got smarter and we got harder in the nick of time,

right? We did our jobs. But now we enter The Tortured Poets Department, which I like to call the age of cyber security burnout. So 2020 was a hard year, right? We entered the year with a global pandemic and people working remotely. The world changed really dramatically during the course of 2020. And in fact, we recorded 560 ransomware attacks against hospitals in the US alone in that year. 560, during a pandemic, in one country. It really set the agenda for where we are with cyber crime, and it highlighted how badly we're losing the battle. And we have not seen that improve, right? The level of ransomware continues to rise. There are no barriers to entry for

attackers. It's getting easier. AI is democratizing this process for them. We're not making the progress we need to make. We did, though, have a kind of weird moment where the president of the United States was on the nighttime news talking about cyber crime. He said the word "ransomware." I was like, that's weird. That's kind of a watershed moment for us, right? And then he talked to this guy, and he said, "You know, attacks against our critical infrastructure have to stop. Or else." I love the words "or else." Or else what? Just "or else." And he said it not once, but twice. They had two separate recorded conversations where he said this. And I thought, okay, this is

arguably the most powerful man in the world talking to one of the other most powerful men in the world, laying down the law and putting security forward. This is a watershed moment for us. Things are going to get better. And then this happened. And things did not get better, it turns out. And we saw the importance of cyber in the conflict. In the run-up to the conflict, we saw wide-scale attacks against Ukrainian critical infrastructure. We saw the Viasat attack, which actually had impact across Europe, not just Ukraine. But we also saw the importance of defense. We saw Ukraine rally. We saw them work with other people outside their nation. And we saw how important it was that they

were able to maintain their communications networks, which throws us back to that conversation about ARPANET, right? It proves how important that whole exercise was: that in times of conflict, being able to communicate and stay connected is the difference. It can make or break you. So then we hit typhoon season, which it kind of feels like it might be today. You're welcome. I brought the rain from the UK. It's fine. You can thank me later. We had Volt Typhoon. We had Salt Typhoon. We had a lot of little baby typhoons. It was just typhoon after typhoon after typhoon. And it highlighted that these folks we've been really focused on are not the only people we should be focused on. The

US is incredibly focused on China. It feels to me as though nobody else is, and it shouldn't be that way. I've literally had conversations with the UK government where I've said, "What are we doing about China?" And they've gone, "That's the US's problem." A quick show of hands: who agrees with that statement? Is it the US's problem and nobody else's? No? Okay, good. Excellent. I can't see all of you, but there are not a lot of hands going up in the front row. Okay, but that's okay, right? So we had those catastrophes, but everything else is improving, right? It's not like we're still facing a massive lack of leadership engagement. It's not like we're still facing continuous consumer apathy. It's

not like the attack surface continues to grow, and more and more opportunities for attackers emerge every day. It's not like we're looking ahead and saying there's going to be a huge explosion of AI-related vulnerabilities that will just continue to expand that attack surface. As a quick aside, this essay was written by Steve Christey Coley in 2007. Steve is, I think for most people, considered to be kind of the godfather of the CVE program and of security vulnerabilities, right? I don't know if he would love that statement, but I'm going to go with it. So he wrote this paper basically saying that we failed to learn from the mistakes that have gone

before, and that we continue to see the same vulnerabilities arising. We continue to see cross-site scripting. We continue to see buffer overflow problems. And, like, why? Now, sometimes there are legitimate reasons, you know, it is complex, but there are often cases where we could have done better. So what Steve put forward in 2007 was the idea of forgivable and unforgivable vulnerabilities. And this work, this proposal, lay dormant for almost 20 years. And I'm very happy to say that the community is now starting to talk about it again. Policy makers are starting to take a look at this. They're starting to think, given the conversations we've had around liability for software: should we view all vulnerabilities as

unforgivable? No. Should we view all vulnerabilities as forgivable? Maybe not. So there is a good conversation happening about this, and you guys should get involved and encourage your government to take a look at it. Of course, there is another challenge. So this is the law of triviality, which, for those who are not familiar, is a thought experiment developed by a guy called C. Northcote Parkinson in 1957. He basically came up with this idea of a group of people tasked with developing a power plant, and I believe it was a nuclear power plant. And rather than focusing on the really hard problems of how to do it efficiently, how to do it cleanly, how

to do it safely, instead they spend all of their time figuring out what the bike shed should look like. And this is the law of triviality. It's the idea that when faced with hard problems, we spend all of our time solving the easy problems. And as security professionals, we see the law of triviality at work every day. And what scares me is that as we look at the development of AI, which obviously has a lot of benefits to offer, that we continue to see the law of triviality at play, we continue to see people focusing on the little pretty shiny things and not the really hard problems, which we'll talk about a bit more. And of course, there is this

issue. We rarely agree with each other. I bet there are a bunch of you sitting here listening to this, disagreeing with everything I'm saying. That's all right. That's the point of talks, right? To stimulate discussion. But sometimes we really get in our own way. We're so busy focusing on this tiny, minute detail. And again, I'm not saying we should trivialize things. We don't want to trivialize. We need to make sure we focus on what's important. But sometimes we lose the big picture because we're so focused on this little widget of detail. We forget that our audience is not just ourselves, right? If this is being recorded, there's every chance that there will be people who watch this who might be

policy makers. And for them, the more we disagree with each other, the more confusing and complex it becomes, the harder it is for them to engage, and the easier it is for them to say, "Ah, it can't be that important. They can't even agree." Sometimes, I'm the problem. It's me. And we need to think about that. So, that was bleak. Let's look at what the future holds for us. I have imagined Taylor's next album for her, Nothing to Crow About, which I'm calling "what does our future look like?" So here we are. We face evolving attackers and threats. I'm very arachnophobic. I do not like this slide. But it's a

Scattered Spider reference, obviously. We face increasing complexity. It's not going to get easier. It's not going to get simpler. It's only going to get more complex. And we haven't figured out how to solve complexity at all, in any way, shape, or form. And what do we love to say in security? There are no silver bullets, no easy fixes. So, we know it's all hard. We know it's going to continue to be hard. And then we have the rise of AI. And, by the way, I love the T3 thing here, because of the threat/tool/target framing. I love it. Take it forward. I stole it from somebody else. It's my gift to you.

And thank you, that one person. I love you too. In security, we have no plan, right? We've started to use it a little bit as a tool, because companies have been working AI, or rather ML, into their solutions for a while, but we don't really have a plan for what to do about it as a threat. We have no idea what to do about it as a target. I sit on the CVE board, and I am slightly shocked by how little conversation there is on that topic. I keep being like, um, so, AI, are we worried at all about the ability to chain exploits? No. Oh, okay. Good, good. Are we worried at all about

the fact that we have no definition for what a vulnerability looks like in an AI system? No. Okay, cool. Cool. All right, I'm going to get back in my corner and be quiet. And this is the problem: we don't have a plan. And right now we have the CIA triad. But here's the thing. We've talked about it as a triad for a really long time, but for the first 20 years, what we really meant was confidentiality of data. That's really what we meant: don't steal our [ __ ]. Then ransomware came along, and we meant availability of systems and data. And that was very important. But again, we called it a triad, but we were

never really talking about integrity of data. And now we are, and we face a world in which people are building systems where the data integrity is poor. You know why? Have you been on the internet? So we already know the data has very little integrity. I mean, there are obviously examples of AI development happening with very good, clean, pure data, where they're not just pulling from the internet; that has good data integrity. But there's also a lot of AI development going on that has terrible data integrity. And that's before you introduce a malicious actor, which we have no plan for. But they have a plan. Okay. So then there's also the Quantum of Solace thing. I mean, like,

what's the worst that could happen with quantum? It's fine. It's not our problem. It's for governments to resolve, right? Cool.

And then we have the threats to democracy. I was asking somebody recently what the difference is between cyber war and cold war, because I don't know if we're in one again. And I don't have an answer to that. Again, come find me. Tell me your thoughts on that. I'm super interested. I don't know the answer, but I do know that we should be worried about these three people, and also about what's happening in the Middle East. We should be worried about the threat that faces us, because we are on the front lines of that threat. And you may not have signed up to be on the front lines. It's nice that you don't have to wear camo and do a lot of physical exercise,

but you're still on the front lines of it. And we need to be aware of that, because the attacks are happening. Volt Typhoon proved that. So, Xi Jinping has issued an edict telling his country to be ready by 2027. There is no guarantee they'll make their move in 2027, but this is the timeline he has set for when they should be ready to try and take back Taiwan. Again, show of hands: who thinks that will impact us here? Cool. If your hand's not up, get it up. We don't have a plan, and we have no idea how this will impact us. It's not the only uncertainty we face. This year has been interesting,

and I think Europe has had a huge wake-up call. And we don't know where that's going to go, but it will impact us in security. This is one of the things I find so fascinating about working in cyber security: I end up in conversations on topics I never thought would be related to what I do. But what we do sort of spreads its fingers out into everything. Because again, we're at that intersection, aren't we, of technology adoption and capability and risk, and everything feels risky right now. And then, as we saw in the opening video, surveillance is on the rise. Disinformation and deepfakes are on the

rise, and they're getting really good. Not this, but they're getting really good. So, we've got problems, and I don't think we can solve them. And thank you! That's great. No, I would not do that to you. This is meant to be cheery. So this brings us back to where we started. I was fascinated by this. And by the way, whoever did the artwork on this, really great job. It is absolutely gorgeous. I love it.

So, we talked a little bit about surveillance and about the other themes around government interference and some of the threats we face. What we haven't talked about, though, is crows. I love crows. I was really excited to find out that Lisbon's mascot is a crow. And I think crows are really intelligent. They're really curious. They're great birds. So, I asked the internet, "What do we know about crows?" And the internet said to me: they frequently symbolize intelligence, wisdom, and transformation, while also representing death, ill omens, and trickery across different traditions. And I thought to myself, you know what that sounds like? Security professionals. I mean, okay, maybe not the death part. I'm not suggesting that all of you

guys should go out and basically become vigilantes and try to avenge the world. Avenge the security, bro. I'm not suggesting that. But the story of St. Vincent paints crows as protectors, as guardians. And doesn't that resonate for security professionals? Isn't that what we're trying to do: to protect, to be that person on the gate, the virtual gate? So, I started thinking about this, crows and security pros. It's going to be one of my autobiography titles, Crows and Security Pros. And I started thinking about what we have in common. So: curious and intelligent, protectors and guardians, engaged in battle. Crows are the sign of the Celtic goddess the Morrígan. She is the goddess of battle. I was like

cool, that makes perfect sense, because we are locked in that whole defense-offense thing. Associated with predictions of doom. Yeah. Okay. That's us. I agree. We can be a little bit like that. I just spent 30 minutes talking about doom. And also attracted to shiny objects. I have a shiny MacBook. I'm very happy. So, given that we have these qualities, and they're pretty badass qualities, I think maybe there's a little bit more hope. Maybe we could reframe our album as Everything To Crow About, because there are some good things happening, right? There are good developments. Security has not stood still. We've just shown that there's been a huge amount of evolution since the

beginning of ARPANET and the first Reaper. So where are we? Well, we're reaching new audiences every day. The level of engagement with the concept of security has never been as high as it is now, and it will continue to grow. There are really no organizations that you talk to these days that aren't aware. They might not understand, and there's still work to be done, but they're aware today. And that is a foundation that we can build on really well. There is a ton of policy engagement. And look, I get it. There will be people in this room who will absolutely roll their eyes at that and say, "Is that a good thing?" And you're right. If policy

engagement is not done well, it is a harm, not a favor. That's why it's really important that we participate in the conversations, that we educate, that we bring people along, that we help them shape the right policy. But some of the policy, I think, will have very positive impacts. One of the things that we've seen is a major shift towards much greater vendor accountability. And I worked with software vendors for a long time. I understand it's a burden, but it's kind of an appropriate burden, right? I mean, in how many other industries would you be like, "Uh, I sold you a thing and it's broken. I sold you that food. It's gonna make you sick.

It's okay. I get away with it." I built a house for you. Oh, it fell down. But you know, you'll just figure it out. It's fine. This doesn't happen in other industries. And yet in tech, we go, "It's okay because we must protect the entrepreneurs. We must protect the innovators." No. We must hold people accountable and we must have better expectations. The UK government did a survey when they were looking at legislating around the security of consumer devices, and overwhelmingly respondents said: hey, if a product is available in a shop, it's secure. It's a reasonable thing for them to expect that. It's actually not a reasonable thing for a vendor to decide

to deprioritize and to say we accept it as an acceptable level of risk. Does the user accept that level of risk? Apparently they do. So I think this is a positive thing, but again, it needs to be done in the right way, which is why I sound like a broken record: engagement is really important. I'm not going to tell you this is just good. Right. So this is the Pall Mall Process. Quick show of hands. How many people in the room have heard of the Pall Mall Process? Oh, nobody. Perfect. Right. The Pall Mall Process is a collaboration between governments. It was started by the UK and French governments but has now got many other governments participating, and

the whole goal is to address the proliferation and abuse of what they call commercial cyber intrusion capabilities: spyware, digital forensics, my old friend Metasploit. They want to make sure that stuff's not being used in nasty ways against your average Joe. And look, I will say I participate in this because, again, you can probably spot me on the slide. I participate in it. I have questions. I don't know whether it will be successful. I suspect what will happen is we'll end up with a split marketplace. But that actually is a step forward. And I think this is important because, as security professionals, we can be quite purist. Policy works on incremental improvement. Policy makers are never looking for

revolution. They're looking for evolution. Security pros often want to get to the end result, and I get it. Like, you do, you want that end result, but sometimes jumping there breaks a bunch of stuff, and policy makers have to be wary of that. So one of the things that we need to get better at as we engage policy makers is thinking about how to accept those incremental steps. But you can hold people accountable to it being a pathway. So when the UK brought in the regulation for consumer products, I was like, when are we doing the next bit? Because that's just consumer products. But consumer products is a step. And the things they regulated are pretty basic,

to be completely honest. It's stuff like: you have to have a way of receiving vulnerability disclosures. You can't have universal default passwords. You have to tell people how long you'll support the product life cycle. That's all good stuff, but it's very entry level for security. And so, it's okay to go back to them and be like, "Have you considered updating the legislation to add a few other things?" You have to make a case for it, but it's okay to have those conversations and to hold policy makers' feet to the fire. They need that, but you have to accept that it's going to be baby steps to get there. So, for Pall Mall, for me, do I

think it's going to solve the problem? Do I think it's going to stop governments using surveillance on their citizens? I sure don't. But what I think it will do is help us to identify who are the people in the market that want to do the right thing and who are the people that don't. And that enables us to clear a lot of space to say, okay, those guys are engaged, they're doing the right thing; now let's focus on the real problem areas and we'll look at other things to do there. There's a bunch of other stuff that's happening out of this community, out of the security community

worldwide. We're seeing, as I said earlier on, a lot more intelligence sharing. There are all sorts of organizations all around the world doing this. I'm sure that you guys have a bunch in Portugal. There's volunteering: people giving up their time to basically be a helpline. There's education and engagement going on. I've seen it here. It's great to see this kind of stuff happening, and to see people getting really engaged in these particular areas and thinking about how they can add their expertise and capability. There are public private partnerships. I use the example of No More Ransom because I happen to love it, but there are so many examples of public

private partnerships happening now, particularly in the law enforcement arena, but not limited to that. And it is making a difference. It is making it more expensive for attackers. It's making it a little bit harder for attackers. It's making it a little bit easier for SMBs to engage, which is really important because most of them sit below the security poverty line, which, if you're not familiar with it, go check out Wendy Nather's research on it. It's really great. Security community policy engagement is increasing all the time. And obviously, I'm a broken record on this topic and I'm here to give you a great sales pitch on it. But get involved. You know, next year, why not invite

somebody from the Portuguese government to come and talk, or somebody from the EU to come and talk about what's happening with the CRA? It would be really awesome. You've probably done this in the past. In fact, there are probably people in this room who work for the Portuguese government, and I should shut up because you're already way ahead of me. There is so much nonprofit advocacy happening, and these people are carrying your water, your message, forward to governments, but you can get involved with them. You know, look at the ones... I'm sorry, I'm not an expert on the ones that are in Portugal, as you can tell, because these are not ones that are particularly active here. But EXO is

active in Europe and talks to the EU a lot, and you can totally get engaged with them if you're not. So, look for ways to do that, because it's a really good way of amplifying your point of view without taking all of your time. There are all sorts of corporate pledges. Please note that the US's Secure by Design pledge is not up here, because nobody followed through. But there are corporate pledges that are driven often by the private sector, potentially with some sort of government support. And the companies that drive them do normally adhere to them. It still means that we have this big disconnect between the companies that do that stuff, the kind of 1% that are good, and everybody

else who's struggling. But it's a start. It's a foundation and it shows will. And these companies, the companies that do these things, can afford to support messages that push security forward, that help more people learn about it, that help people engage. Companies like Microsoft and Google run huge programs to provide either free or heavily subsidized tools and provide a lot of advocacy. So quick question for you: how many people in this room participate in something that they do on a voluntary basis, in one of the many categories I just talked about? Okay, so I can't see everybody, but if you do, I think you should get a round of applause. Frankly,

the reality is security professionals are generally astonishingly hardworking and work in an environment that is very difficult. I mean, you know, you're constantly rolling rocks uphill. It's a very British saying, but hopefully it makes sense. You know, the rock rolls back down, flattens you. It's an exhausting thing to do, and to think about volunteering your time is hard, and I get that. It is not that I think you're all sitting around sunning yourselves waiting for some loudmouth British woman to tell you what to do. And I do understand that we all sort of feel like we're just small cogs in a much bigger machine. What difference can we individually make? So, I want to tell a little story. This

is my friend Colin Morgan. Colin Crow, we're gonna call him now. Colin worked for a long time for Johnson and Johnson. Johnson and Johnson is a Fortune 50 company. That means it is one of the biggest businesses in the entire world. And many years ago, Colin was sat in an auditorium not dissimilar to this, and he saw Josh Corman speak. Josh was giving his I Am The Cavalry talk, and Colin got inspired. Colin was like: I work for J&J. They make a lot of connected medical devices. This really resonates with me. I want to do something. So he went back to his employer. He was not a particularly senior person at Johnson and Johnson. He was not a VP. He was not

in the executive suite. He was an average Joe cyber security dude. He got inspired. He went to his company and he said, "This is what's happening in the world. This is the risk. This is the risk to us as an organization. This is the engagement of the FDA. We need to build a program that enables people to disclose vulnerabilities to us and then has an ability for us to triage them and do the right thing." And he sold that story internally for two years, because J&J is huge. It took him two years to get buy-in, to get to the point of having a vulnerability disclosure program. And a week before he went public with

it, some idiot British woman knocked on the door and said, "I've got a vulnerability I'd like to disclose." And because Colin had gone through that process for two years, it completely transformed the engagement. When I reached out to them, the reason that I did the outreach, rather than somebody in my team doing it, is because right at the time that we were making the outreach, Johnson and Johnson was in the middle of a large class action suit around talcum powder. People have heard about this. Johnson and Johnson were getting sued and they were in a very defensive stance. And I thought, we're going to go to them and I'm going to say, "Hey, we've done research on this

thing, which is the Animas OneTouch Ping. It's an insulin pump." So, part of it connects to your body, and the other part is the remote control, which goes in your pocket. And the remote control takes blood glucose readings, and it communicates with the pump, and it tells the pump when to release insulin. And we found that that communication was pretty trivial to either disrupt or to spoof. So you could either push a fatal dose or you could withhold insulin delivery. Right, now, I'm not a doctor. I don't even play one on television. But I am led to believe that that means death. And so I thought, well, when we go to Johnson and Johnson and tell them about

this, we're going to have a lot of lawyers who talk to us and they're not going to be open to hearing about this at all. But when we got on the phone, this guy, Colin, was on the phone, and he led the call, and there were lawyers on the call, and there were communications people, who were the worst. I'm a recovering communications person. And because Colin had spent two years getting buy-in, they totally let him lead. So he said, "Okay, we're going to go away and we're going to do our verification on this and we'll come back to you." In the end, he got Johnson and Johnson to agree that the bug was real, that

something needed to be done about it, and that patients and physicians needed to be notified. This was unheard of at the time. Johnson and Johnson decided to proactively issue a notification to physicians and patients telling them of this issue, and we worked really closely with them on it, so we were able to have coordinated communications. As a fun aside, this is when I learned that these kinds of notifications go out by post, and I was like, what year is it? What even is post? But that's how they do it. And the FDA, who is their regulator, came out and said: this is the example everybody should follow.

And because he did that, it was the first US connected medical device manufacturer ever to proactively put out an alert for a security bug without being told to by the FDA. It actually did change the shape of their industry. Not overnight. You can believe that, to begin with, all their competitors went around to their customers going, "Johnson and Johnson's got a vulnerability and we don't." Obviously, that stuff happens. But actually, it enabled the FDA to lean in and to point to what they were doing, and that put pressure on all of the other vendors. And over time, the industry completely changed. He's one person, and he created that change, and he did it in the quietest, most unassuming

way possible, which is why I come along like a loudmouth and tell his story as much as possible, because Colin is amazing and he's an unsung hero. The security industry is full of unsung heroes. I'm sure that all of you are unsung heroes in your own ways. One person can have huge impact. I'm going to tell you another quick story. So I joined Rapid7 in 2011. I don't work for them anymore, but I worked for them for almost 12 years. And one of the things that I did is I helped build the research department. I partnered with HD Moore and we built a research function. And a year in, we found out that HD was being

investigated for hacking offenses under the US anti-hacking laws for a security research project called Critical.IO, where he was scanning the internet. The investigation went on for three months, and at the end of those three months it was dropped. But not surprisingly, HD did not enjoy that experience very much, and he decided he would probably rather not do research for a bit. And I got really, really mad about it. Not at HD. I totally understood HD's point of view. But I got so mad, because there we were, every box ticked that you could tick to say this is a verified research project. Like, we had a WHOIS lookup explaining what we were. We had a web

page. We had done three or four disclosures already. You could opt out. Anything you could do to say this is legitimate good faith research, we had really done. And yet he went through this process. And in the process of that, I found out that researchers all over the place were getting threatened by companies. And I was really indignant about it. And an indignant Jen is not necessarily the best Jen. So I went and fell down this huge rabbit hole of learning about the law. I've stood up here and talked to you about policy a lot today. I didn't have a background in policy until this point. I got mad and I decided that something

needed to be done. And it's a long story that I'm not going to share, because it's a long story and we are behind on time. But basically, I ended up meeting with the US Department of Justice, the Computer Crime and Intellectual Property Section, and I expected them to fob me off, and they didn't. They were amazing, and they said, "We understand and we agree." And this was under the Obama administration, just in case people are confused. And they then decided to work with us. And in 2022, they issued charging guidance that said good faith security research should not be prosecuted. I don't have a law degree. I'm not a policy person. I'm not even American.

I had no thought that I would be able to create any impact or change when I started down this route. When I started, Aaron's Law, which was an attempt to reform the Computer Fraud and Abuse Act, had just failed. People were looking at the CFAA as if it was toxic, as if it was radioactive. Everybody told me I was not going to get anything done. And first, we got the DMCA exemption in. The Digital Millennium Copyright Act has an exemption encoded that says that security researchers will not be prosecuted. That was our first milestone. It wasn't me; it was a whole bunch of people that did that together. And then this happened. And again, this was not me. This was a bunch of people.

It takes a village. But it started at that meeting. If that meeting hadn't happened, we wouldn't have this charging guidance. And this started the ball rolling: other governments across Europe have started to look and see if they can do something. And now we have language that's come out from the EU saying, "Hey, vulnerability research is really important. We should look at how our laws impact researchers." Countries are starting to talk about safe harbor language now. One person can have a huge impact. I will say, by the way, it took 10 years. It wasn't quick. And if anybody at any point had said to me, what have you done? What have you achieved? I would

have gone and hidden under a table, and I probably would have cried, because it's hard to show what the progress looks like. It's hard to know what your outcome is going to be, because, again, progress is incremental. It's baby steps. But one person can have a huge impact. Okay, I've lectured you enough. I'm hoping that I've inspired some of you. We're at the end of our eras tour, pretty much. This is what it looks like. This is our consolidated eras. In business strategy thinking, there is this idea of the phases of the industry life cycle. You have the startup, you have growth, you have maturity, and then you have renewal or decline. In order to renew, you have to continue investing.

You have to continue developing. You basically have to go back to the start of the circle and start up again and continue to learn and grow. This is how I think it tracks against our eras that we've talked about today. I think that where we are right now is potentially on the precipice of renewal or decline. I think it's up to us to decide what we want. In the opening video, it said this is our time and I agree with that. It is our time. We have to act. We have to step in. There is so much against us and it is costing society so much and we stand on the front lines whether we like

it or not. I think you can also look at it this way, right? You can say we've had our exploration, we've had our formation, we've gone through consolidation, we all feel the exhaustion, but I'm hoping, I'm really hoping, that some of you will feel inspired and you will have the aspiration to go forward. So my last question to you is: do you want to be a groupie or do you want to be a badass? I think Taylor would tell us that we have to make our own sunshine, but if we do, the sky could be opaque. Maybe not today, though. Thank you.