
Hello everyone. My name is Ada, and I'm a cyber security associate consultant at Mor Kingsmith. On a daily basis I support clients in achieving Cyber Essentials certification, and I also help them strengthen their cyber security maturity by conducting security reviews. Today I'll be sharing the lessons I learned from building my own phishing simulation tool, and exploring the rise of Phishing-as-a-Service. So let's begin.

14. That's how many weeks I had to complete my dissertation. 14 weeks to build something completely new: a phishing simulation tool. I proposed this project because I knew about phishing in theory and it excited me, but I really wanted to explore what it actually takes to build a phishing simulation tool in practice. And as a computer science student, I expected the challenge to be technical: the code, the data, the setup, the usual. But this project turned out to be different. You see, what I built didn't just test my technical skills; it tested my understanding of people. And somewhere between writing scripts and analyzing clicks, I realized something unsettling: over 14 weeks, I had built something capable of deceiving people. So if I could design a convincing deception within a controlled, ethical environment as a student, imagine what someone with malicious intentions could do. But before we dive into that, let's discuss the journey I went through. Building the tool was like reverse
engineering trust. The platform itself demanded real engineering, because my tool was responsible for scheduling campaigns, personalizing emails, tracking clicks, and finally visualizing all the results in a nice dashboard. It was challenging work. But behind all that code, I was really learning how people respond to design, tone, and persuasion. The deception didn't rely on any complex algorithms; it came from small creative choices, like a convincing subject line, a spoofed sender name, or a touch of urgency. This wasn't complex social engineering. It was simple psychology: a cloned Google alert, the kind we're all used to, with a small add-on, a "view map" link that could lead to a malicious page.

Think about it. If you saw an alert that looked just like the ones you're used to, telling you that someone was trying to log in to your account from a different location, and you had a chance to see that location, wouldn't you want to check it? And to make sure you did, the email added one more line: "For your safety, your account will be locked in 24 hours if no action is taken." It turns curiosity into panic, and panic makes people click. As we can see, subjects one and four were a bit too curious. But you could argue that this was just my research and not real life, right? Well, what started as research became very real during a client
engagement. A company hired us to investigate a security incident, and we discovered that the original compromise started from a phishing link that harvested credentials. But the client was convinced that no one had ever clicked any link, because surely it would have been reported, right? Well, that same email was later circulating internally to other employees. The techniques were familiar: urgency, authority, curiosity, just like in my research. The only difference? The indicators of compromise showed that this time it was a Phishing-as-a-Service platform.

But what exactly is PhaaS? Let's go back to my project. When I was developing my phishing simulation tool, I started to think about the types of people who could create something similar, and how that's changed over time. First, you've got the builder: the classic malicious coder who builds everything from scratch. They understand the tech, build every detail, and maybe even sell the tools they create online. It's difficult and slow, but powerful. Then there's the vibe coder, the modern tinkerer. They don't start from zero. They build on top of existing tools like Gophish or Evilginx, and now, obviously, use large language models to generate templates or code for them. It's faster, not too expensive, and, even though error-prone, surprisingly effective. And finally, the subscriber: the one who doesn't build at all. They just subscribe to a Phishing-as-a-Service platform. Everything is pre-made: the templates, the infrastructure, even the evasion techniques.
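To make the earlier description concrete — scheduling campaigns, personalizing emails, tracking clicks — here is a minimal sketch of the kind of template personalization and per-recipient click tracking a simulation platform performs. Everything here is illustrative (the URL, the `make_tracking_link` helper, the campaign name are all assumptions, not code from the speaker's tool), and it is framed strictly as an authorized, consent-based simulation.

```python
# Minimal sketch of template personalization and click tracking for an
# authorized phishing *simulation*. All names here are hypothetical.
import uuid
from string import Template

BASE_URL = "https://simulation.example.org/click"  # researcher-controlled endpoint

EMAIL_TEMPLATE = Template(
    "Hi $first_name,\n"
    "Someone tried to sign in to your account from $location.\n"
    "View map: $tracking_link\n"
    "For your safety, your account will be locked in 24 hours if no action is taken."
)

def make_tracking_link(campaign_id: str) -> tuple[str, str]:
    """Create a unique token per recipient so clicks can be attributed later."""
    token = uuid.uuid4().hex
    return token, f"{BASE_URL}?c={campaign_id}&t={token}"

def personalize(first_name: str, location: str, campaign_id: str) -> tuple[str, str]:
    """Render one simulation email and return (token, body) for the dashboard."""
    token, link = make_tracking_link(campaign_id)
    body = EMAIL_TEMPLATE.substitute(
        first_name=first_name, location=location, tracking_link=link
    )
    return token, body

token, body = personalize("Sam", "Lisbon, Portugal", "campaign-01")
print(body)
```

The point of the sketch is how little engineering the deception itself needs: the persuasion lives in the template text, while the code only handles personalization and attribution.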
They just pay and launch. So what used to take skill and time is now a service, and that's what makes it so dangerous. The numbers show just how widespread PhaaS has become; now anyone with money can launch sophisticated attacks. Barracuda Networks recorded a massive spike in early 2025: of all the Phishing-as-a-Service attacks they observed, Tycoon 2FA accounted for 89%, EvilProxy for 8%, and Sneaky 2FA for 3%. Furthermore, a 2025 threat landscape report showed that Darcula, a platform, cloned login pages for more than 200 organizations across more than 100 countries, that Lucid expanded phishing to iMessage and RCS, and that FlowerStorm was used to bypass multi-factor authentication for Microsoft 365. The pattern is clear. Phishing kits are
now a service that anyone can buy to stay ahead in the ongoing race between attackers and defenders. So the big question is: if cyber crime is becoming plug-and-play, how do we defend against it? This question isn't just about how to block tools; it's about how to defend the most vulnerable point in the chain. The real battlefield isn't the servers, the code, or even multi-factor authentication. It's people. Phishing exploits emotion, pressure, and trust. When someone clicks a malicious link, it's not stupidity, it's humanity. But the good news is that same humanity is what we have the power to protect. And don't get me wrong, technical focus is important, and cyber security awareness training and testing help users spot suspicious links, mismatched sender names, or anything unusual in an email. But the real change comes from culture. The same human instincts that phishing exploits can also be our best defense.

Now, Google's Project Aristotle showed that the best teams aren't the smartest; they're the ones where people feel safe. And that truly matters for cyber security, because if people fear blame, they stay silent, and if they feel safe, they report early. And early reporting prevents disasters. So what could that look like in real life? For example: leaders thanking people for reporting suspicious emails, even if they turn out to be false alarms; a no-blame reporting policy being part of onboarding, so that everyone is familiar with it, everyone knows there is no penalty for incident reporting, and the whole process is confidential; or teams running quick "near miss" conversations, so that people can learn from close calls. And this is how the culture can change: not with fear, but with trust.

Another example is the kind of culture you see in companies that openly encourage curiosity. They teach people to pause for a second, question anything that feels off, and verify without worrying about bothering someone. So on a day-to-day basis, just make it normal to ask, "Does this look right?", or have a quick Slack or Teams channel for sharing suspicious messages. And definitely give people easy tools, like a one-click report button, because it's all about making curiosity the default and not the exception. These are just a couple of examples, but the principles are universal. If we want to strengthen the human layer of security, we can focus on three simple shifts. Pay attention to how you feel when something seems off, because that sense of unease is an early warning. Encourage leaders to stop shaming mistakes and to promote reporting when incidents occur. And empower individuals to question, explore, and support one another by sharing any suspicions they may have. Because cyber security isn't about paranoia. It's about awareness and responsibility, for ourselves and for others. If we build a culture that values care and
curiosity, we can turn human vulnerability into resilience. So yes, going back to my project: if I can do it, so can they. But we can also build resilience and culture together. And remember: you can't patch people, but you can empower and educate them. Let's make cultural change our best defense. Thank you.
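The red flags the talk says awareness training should teach — a display name that doesn't match the sender's domain, urgency language in the body — can be sketched as a toy heuristic. To be clear, this is an illustration only: the keyword list and matching rules are assumptions for the sketch, not part of the speaker's tool, and real detection is far more involved.

```python
# Toy heuristic for the red flags mentioned in the talk: mismatched sender
# names and urgency cues. Illustrative only; not the speaker's tool.
URGENCY_PHRASES = ("locked in 24 hours", "immediately", "final warning")  # assumed list

def red_flags(display_name: str, from_address: str, body: str) -> list[str]:
    """Return human-readable warnings for a single email."""
    flags = []
    domain = from_address.rsplit("@", 1)[-1].lower()
    # e.g. display name "Google Alerts" but no 'google' in the sender domain
    brand = display_name.split()[0].lower()
    if brand not in domain:
        flags.append(f"display name '{display_name}' does not match domain '{domain}'")
    lowered = body.lower()
    for phrase in URGENCY_PHRASES:
        if phrase in lowered:
            flags.append(f"urgency cue: '{phrase}'")
    return flags

print(red_flags(
    "Google Alerts",
    "security@accounts-verify.example",
    "For your safety, your account will be locked in 24 hours if no action is taken.",
))
```

Even a heuristic this crude flags the cloned Google alert from the talk twice, which is the point: the cues people are trained to notice are simple and teachable.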
>> Are you open to questions?
>> Yes. Here's a list of resources I used to prepare this presentation. Feel free to connect on LinkedIn if you'd like, and if you're scared of QR codes, my name will also be displayed on the last slide.
>> Any questions?
>> So would you say that a culture of safe reporting is more effective than technical controls?
>> That's a really good question. I don't think we can choose between the two; I just think we shouldn't forget the importance of cultural change. It should be an additional layer of security, rather than omitting technical controls or focusing only on cultural change. So I would say we need to focus on implementing both.
>> Anyone else? Oh, down the end, getting steps in.
>> The tool that you built, did you open-source it? Is it out there for others to use?
>> No, it's a private repo, but I am planning to prepare a paper on it, and that will probably be released. But feel free to talk to me about it in depth after the talk, because it's a long story.
>> Anyone else?
>> Gophish is quite similar, by the way. When I discovered Gophish and compared it to my tool, I realized that, without being aware of it, I had created something quite similar.
>> Thank you very much.
>> Thank you.