
Do Not Fire Your Project Managers!

BSides Göteborg 2026 · 22:03 · 3 views · Published 2026-03 · Watch on YouTube ↗
Category: Career
Difficulty: Intro
Style: Talk
About this talk
This talk reframes penetration testing from a final pass-fail exam into a learning-centered practice. It argues that treating vulnerabilities as failures incentivizes silence over discovery, and proposes a pentesting philosophy centered on early, continuous testing; honest communication; and treating findings as teaching tools rather than blame assignments.
Original YouTube description:
Do not fire your project managers! If penetration testing can cost people their jobs, the rational response is to stop doing it. Yet that is exactly how pentesting is treated in many organizations: as a final exam, a verdict, a quiet blame assignment. In that model, vulnerabilities are not learning opportunities; they are liabilities.

This talk challenges the idea that pentesting is meant to "prove" security at the end of a project. It questions zero-vulnerability expectations, late-stage testing, and common assumptions about how security expertise should be embedded in development teams. Introducing the concept of a pentesting philosophy, the session argues that the real danger is not insecure software, but cultures that discourage discovering uncomfortable truths. If your pentests optimize for fewer findings instead of earlier ones, you are optimizing for silence, not security.

Learning objectives:
- Question whether their current pentesting approach incentivizes learning or avoidance
- Recognize how fear and blame distort security outcomes
- Explain what a pentesting philosophy is and why it matters
- Reconsider what "success" in penetration testing actually means
Transcript [en]

Thank you, Jas, for the previous session. This one is a little ongoing. >> No problem.

Hello Jonas, I hope you're able to hear us. >> I can hear you. Are you able to hear me as well? >> Yes, loud and clear. Perfect. All right, we are running two minutes late, so I think let's start. You can just begin and proceed.

Um, I see eight attendees so far. >> Yes, we are live right now with the session. >> Yeah. >> Do you want to wait for one or two minutes? >> We could wait. Yeah, we can start at 11. Give them a shot.

So this is the last one before lunch if I understand correctly. Right?

Let's wait one more minute, then we start. >> Sounds good. >> The topic seems very interesting. >> I hope so. >> Yeah. So, is this about a project manager for cybersecurity, or a project manager for the whole program? >> Uh, we'll see. >> Yes, looking forward.

All right. Yes, I think the time is here now. So, please feel free. >> Perfect. Thank you. >> Okay. So: do not fire your project managers. A pentesting philosophy. I once attended a presentation, actually not too long ago, where the speaker said: if pentesters find vulnerabilities at the end of a project, fire the project manager. The presenter was arguing for shifting left, that is, handling security earlier in the development process. So early, in fact, that by the end of the project there should not be any vulnerabilities left, and if vulnerabilities are found, then the project manager has failed and should be fired. That single statement reveals so much about how some view pentesting: a final pass-fail gate just before the end of the project. A final digital ring of fire, you could say.

I took some offense to that, and I ended up asking: if discovering risk can cost someone their job, who will want to do it? This is not theoretical; this affects behavior. If pentesting creates fear, then findings get softened and risk gets downplayed. As a pentester, would you really want that on your conscience? Any pentesting philosophy that punishes truth will eventually stop hearing it. So if we view vulnerabilities as liabilities, we are actively incentivizing the wrong behaviors. We are creating a culture where discovering risks becomes a career-ending event. And if your job is on the line, how motivated would you be to do a pentest? If pentesting is a final exam, if vulnerabilities equal failure, if someone must take the blame, then the problem isn't the people; it's the whole philosophy. And we need a better pentesting philosophy.

A pentesting philosophy is not a methodology checklist or a compliance artifact. It's about defining the core principles that guide our security efforts. What do we expect pentesting to do? What is the ultimate goal? When in the development life cycle should it happen? How do we regard vulnerabilities: as weapons or as tools? What behavior does it encourage? Without answers to these, we're just running tools without a true purpose, because the way we frame pentesting determines whether it creates learning or fear. And this leads us directly to our first principle.

Pentesting brings insight, not judgment. Pentesting provides something development teams cannot fully generate themselves: pattern recognition across systems, and fresh eyes challenging assumptions. But it should not assign blame. It should not grade individuals. It should not become a performance review. A successful pentest tells you something you didn't already know. If it confirms what you knew, good. If it surprises you, even better. Surprise is not failure; it's learning.

Vulnerabilities are inevitable. No organization has perfect code or perfect tools. And how could you possibly know future exploit techniques? Static analysis helps. Secure coding helps. Automation helps. But none of them eliminate vulnerabilities. The mistake here is expecting zero findings. Expecting zero findings, or zero vulnerabilities, misunderstands both software and security. And if vulnerabilities are inevitable, punishing their discovery makes no sense. Silence, though, is far more dangerous than disclosure.

Pentesting should be continuous and incremental, not a final ring of fire. With end-of-project pentesting, we create high pressure, a high cost of change, and a defensive posture. At that point the architecture is already fixed, deadlines are real, and rewrites of the application are painful. So what happens? Findings get minimized: we call it an accepted risk, or it's deferred to later. Late pentests don't improve systems; they just reveal how much learning was delayed. And still, you can't test what doesn't exist. But you can test architecture decisions, early prototypes, and security-critical functionality like authentication flows or update mechanisms. Security reviews don't require finished products; they can grow alongside the system. When testing grows with development, findings are cheaper, learning builds over time, and surprises shrink.

Yes, I agree: shifting left is a positive thing. But what do we actually mean? What are we shifting left? If it's responsibility, then no, we're doing the wrong thing. But shifting expertise? Developers should absolutely understand security, but they are already experts in their own profession. They can't be experts at both programming and security, and that's why we have security experts. Expecting both puts too much responsibility on one role. Shifting security left without shifting security expertise just moves the blame earlier.

What about embedding one security expert per project or per team? It sounds ideal, but in reality there just aren't enough experts, and long implementation phases will leave them underutilized: there aren't that many security-critical decisions to be made early in a project. Also, expertise decays without exposure to variety. That's why I would argue for team-adjacent expertise: security experts who support multiple teams and see a variety of different systems. That allows the expertise to grow by seeing many systems fail, through variety, not isolation. You want security experts to work across systems to stay sharp and see patterns, but you also want them to know your architecture, understand your constraints, and return repeatedly: close enough to understand the system, but not embedded in the team.

Uh, just a question here. Jonas, are you available for questions now, or later? >> Uh, preferably later. >> Okay, perfect.

Vulnerabilities are teaching tools, and we can reinforce learning through targeted training. Pentesters find instances of vulnerabilities, but developers fix entire systems. One vulnerability, well explained, does more than fix the issue: a developer who understands the underlying problem can fix similar patterns that were not found in this particular instance, and can prevent reintroduction of the same vulnerability. So one finding gets fixed in multiple places and prevented from being reintroduced a hundredfold. The value of a vulnerability is proportional to what we learn from it. This is not about finding more bugs; it's about preventing classes of bugs. It's not turning developers into pentesters. It's not offloading responsibility. It is reinforcing real lessons using real findings, making repetition less likely. Long-term security comes from understanding, not memorization. That's defense in depth at the human level.

So what is a successful pentest? Well, the earlier we find vulnerabilities, the lower the cost of fixing them compared to finding them later. And if we have proper knowledge transfer in this process, if we are able to describe the vulnerabilities in a pedagogical way, then we will be able to prevent repeated mistakes. You also need honest communication. Even though it can be hard sometimes, and it can be uncomfortable, if we are able to communicate honestly, we will still win in the end. And continuous improvement, by doing trainings and by learning from our mistakes: this is how we will succeed.

So a pentesting philosophy is not about finding flaws. It's about creating a culture where we find early, explain clearly, learn together, and prevent recurrence. Thank you. >> Thank you, Jonas. I have some questions as well, but let's start with Francisco. His question is: what if the manager decides to leave by themselves? I'm unsure whether Francisco means at the end of project execution or after the pentesting, but I'm assuming he means the end of the project. If this happens, what do you think about that? >> If the project manager chooses to leave as a result of the pentest, then I think it's a problem of how we have communicated the results, if I understood the question correctly. >> Well, what would be the impact on the project in that case, on the security posture? Because he knows what should be acted upon in terms of findings and actions. So what do you recommend? Because at that point you will get a new project manager who might not know anything. >> Yes. And I guess that will reset the learning curve, right? But I think it is inevitable that people will switch jobs and do other things in life. So I think if you have a culture where you learn, and if you are able to maintain that within the organization, it shouldn't rely on one person leaving or staying. It should rely on the organization and its ability to learn from the results. >> Yeah, very well said. Francisco, do you want to add anything, or do you have an idea now about this?

I think you're not able to unmute for some reason. I'm trying to allow that as well, but it doesn't work. But feel free to write the question.

Uh, sorry, I think I managed to fix it. >> Thanks, it's good. Thanks a lot for the answer. >> Thank you. Anybody else have a question for Jonas?

Okay, I will take one question, and after that, if there are no more, we can close the session. So, Jonas, what do you think: when doing the pentesting, how much should the project manager be involved? Should they be informed about every activity, or only at the start? Because if they know, they will be able to be helpful in the execution of the pentest. What do you think? >> I think a tight integration between the pentest team and the project team offers a lot of wins from working close together. In this sense, where we have a direct line of communication with the developers, we are both able to get feedback and solve problems very early on, and we can immediately describe and communicate vulnerabilities as we find them. I think that's very useful. >> All right. So you believe there can be interactive roles during the execution of the pentest as well, not only at the end to deliver a shocker? >> Absolutely. >> Perfect. Yeah, seems reasonable. Okay, anybody else have any questions before we leave the session? All right then. Thanks a lot, Jonas, and wish you a nice weekend as well. We will take it up after the lunch now. We're looking forward to the next sessions. >> Thank you very much. >> Thank you. Have a nice day. >> You too.