
Ghost in the Machines: From Algorithms to AI

BSides Joburg · 2025 · 43:31 · 180 views · Published 2025-07 · Watch on YouTube ↗
Style: Keynote
About this talk
In a world increasingly mediated by digital systems, algorithms have become the unseen architects of our reality. From social media feeds to search engine results, these mechanisms operate quietly in the background: curating content, shaping narratives, and reinforcing personalised viewpoints. As machine learning systems evolve, their influence deepens, subtly guiding what we see, think, and believe. While this is happening all over the world, fragile countries such as South Africa are especially at risk. This talk explores the hidden power of algorithms and AI in shaping perception and "truth" by examining how recommendation engines amplify bias, how engagement-driven design warps public discourse, and how reality itself becomes subjective under the influence of machine-curated content. Who controls these systems? What values are encoded in their logic? And as artificial intelligence grows more sophisticated, are we seeing the world as it is, or only as it is being rewritten for us? Are we seeing South Africa as it is, or as the "Ghost in the Machine" portrays it?

We live in an age where unseen digital mechanisms guide our perception of the world, silently shaping what we know, believe, and value. This talk examines the subtle yet powerful influence of the algorithmic and AI-driven systems that curate the content we consume every day. Far from neutral, these systems are designed to prioritise engagement, and in doing so they often distort reality in ways that can mislead, radicalise, or isolate people, causing real harm to individuals and ultimately to society. South Africa is in a fragile state as it is, and this distortion often shows us not South Africa as it is, but something else entirely. The talk traces the evolution of algorithmic systems, from early algorithms like Google's PageRank and Facebook's EdgeRank to today's sophisticated machine learning systems that dynamically adapt to user behaviour.
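Since PageRank is mentioned as the archetypal early ranking algorithm, a minimal sketch of the idea may be useful: a page's score is the damped sum of rank flowing in from the pages that link to it. The three-page graph, iteration count, and output formatting below are illustrative assumptions; only the damping value of 0.85 is the commonly cited figure, and this is in no way Google's production system.

```python
# A minimal PageRank power iteration over a tiny, invented three-page link graph.
links = {
    "A": ["B", "C"],  # page A links to pages B and C
    "B": ["C"],
    "C": ["A"],
}
d = 0.85  # damping factor commonly cited for PageRank
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # start from a uniform distribution

for _ in range(50):  # power iteration: repeat until the ranks stabilise
    rank = {
        p: (1 - d) / len(pages)
        + d * sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        for p in pages
    }

for p, r in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(p, round(r, 3))
```

Page C, linked by both A and B, ends up on top: inbound links from well-ranked pages compound, which is the "rich get richer" dynamic later recommendation systems inherit.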
With each click, like, and share, we unknowingly teach these systems what to show us next, reinforcing biases and preferences in a self-perpetuating loop. The result is digital echo chambers in which dissenting perspectives are filtered out and personalised viewpoints are continuously reinforced. The talk examines how modern platforms use algorithmic curation to drive engagement, how recommendation engines amplify divisive content, how social media timelines are manipulated by opaque rules, and how even seemingly objective tools like search engines can present entirely different "truths" depending on the user. Case studies and real-world examples will demonstrate the real impact these systems have on public discourse and individual worldviews in a South African context.

But the implications extend beyond mere content delivery. As generative AI advances, it now creates not only what we consume but increasingly how we consume it, generating synthetic news, deepfakes, and even AI-generated influencers. These technologies blur the line between the authentic and the artificial, challenging our ability to distinguish fact from fabrication. Crucially, the talk raises questions of accountability and control. Who designs these systems? Who sets the parameters for what should be promoted or suppressed? And what ethical frameworks govern their decisions? With minimal transparency and limited oversight, the people and organisations behind these tools wield immense power over the narratives we encounter, often without our awareness or consent.

The talk concludes with a call to action. While the rise of AI-powered curation poses serious challenges, it also presents opportunities for reform: emerging approaches in algorithmic transparency, explainable AI, and digital literacy are all tools that can help users reclaim agency and resist manipulation.
As we peer into the mechanisms behind our digital experiences, we must ask ourselves, are we still seeing the world, and South Africa as it truly is, or only as it has been shaped for us by unseen forces?
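The engagement-first curation described above can be reduced to a very small sketch: if a feed is ordered purely by predicted engagement, accuracy never enters the ranking at all. The headlines and scores below are invented for illustration; no real platform's data or code is implied.

```python
# Hypothetical stories with made-up engagement predictions. Note that the
# ranking key below never looks at the "accurate" field at all.
stories = [
    {"headline": "Municipal budget passed on schedule", "accurate": True, "predicted_engagement": 0.11},
    {"headline": "Local clinic opens new wing", "accurate": True, "predicted_engagement": 0.34},
    {"headline": "'Country on the brink of collapse' rumour", "accurate": False, "predicted_engagement": 0.91},
]

# Engagement-driven curation: sort solely by how long the story will hold you.
feed = sorted(stories, key=lambda s: s["predicted_engagement"], reverse=True)

for story in feed:
    print(story["headline"])
```

The false but provocative rumour surfaces first and the accurate, mundane story sinks to the bottom, purely as a consequence of the objective being engagement rather than truth.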
Transcript [en]

[Music] Okay, good morning everybody. Welcome to BSides Joburg. It's the second time; I was here for the first one last year and I was absolutely blown away. I'd had experience with BSides Cape Town, but Joburg really did an exceptionally good job yesterday. It's kind of nice to see this little bit of rivalry between the inland and coastal areas, to see who's going to keep upping the game when it comes to this community. So, welcome to what I think is going to be a really exciting day. I've looked at all the talks being presented today and I'm actually waiting with bated breath for

a number of them. One of the things that I love about BSides, besides the fact that it is very community-driven... and who would phone me at this time of the morning? Okay, I probably should say I work in forensics, so my phone never stops ringing. But besides the community aspect of BSides, what I love about BSides is the fact that it is technical. I go to conferences around the country and around the world, and so often they end up being sales pitches or some kind of high-level thing. And I don't really like those. I'm not a salesperson.

I'm just a nerd. So when the theme for this year's BSides came up, I thought, you know, I've got an idea for a talk that I've been wanting to do for a while, and I've hesitated presenting it because in many ways what I'm going to talk about is somewhat controversial. But I think this is the kind of audience that can actually handle controversial. So I'm going to be talking today about ghosts in the machine, very much in line with the theme of this year's BSides, and talk about essentially how the algorithms that drive our narratives actually impact the way we see the world.

So, a little bit about who I am. This is my bad attempt at AI; I look a lot fatter here than I do in real life, which is really kind of disturbing. They say the camera puts on 10 pounds or whatever; I think this put on about 20 kgs, but anyway. I've been working in this field for a long time. Growing up as a child of the 80s, I was a complete nerd. My very first computer was a Sinclair ZX81, for those of you that might remember going that far back. Then I moved to a Commodore 64, and eventually I had an IBM XT that had two

floppy drives. No hard drive, but two floppy drives. I could run the OS on one floppy and other stuff on the other floppy, and man, I rocked. Growing up in that era in South Africa, I form part of that original group of OG hackers, essentially. If we wanted to get onto the internet, we needed to hack places like universities and German motor manufacturers and things along those lines, to basically get access to what was then the nascent internet. But that love for computers, and the community that I found within that early hacker community, has always stayed with me. So as I've moved through my

career, I moved into the working world, into law enforcement, working as a police detective, starting out at the early stages of digital forensics as a discipline, not just in South Africa but internationally. I carried on with that for a few years until 2014, when I left; I was the national head of cyber forensics at the Special Investigating Unit. Now I work in the private sector doing cool stuff. For my sins, I'm also a principal instructor at the SANS Institute, working in their DFIR faculty, and I'm privileged in the sense that I get to train law enforcement officers and

organizations from around the world, including organizations like the FBI and the Secret Service. Unfortunately, because of the work that I do, I've also been exposed to some of the, shall we say, less savory things that happen amongst the power players in society. And that's informing a lot of what we're going to talk about today. As I said, I'm a massive geek and nerd. You want to talk about Star Wars, Star Trek, Harry Potter, DC, Marvel: I'm your guy. But today's theme is very much around the Matrix. And how many Matrix movies are there? Just want to check. [Audience: "Two."] No, not two. There's only one, because

the others after the first one were all rubbish. So this is really going to focus on the concept of the Matrix, because it ties in with the whole ghost in the machine narrative. Now, in traditional Matrix fashion, before we start this presentation, like Morpheus, I need to give you a choice: whether or not you want to take the red pill or the blue pill with this talk. Because here's the thing. I'm going to talk about some very, very sensitive topics. Some very controversial topics. Topics that might make some people feel uncomfortable. I'm going to talk

about issues in contemporary South Africa and how the algorithmic world behind it drives our public narrative around our country and the people in it. So that's my warning. If you choose to stay here, you've all chosen to take, what was it, the red pill. You've all chosen the dangerous pill; we're going to show you reality now. If you leave, you've taken the blue pill, and basically you go and watch the other tracks and life goes on normally and everything is good. So when we talk about this kind of algorithmic world, the sort of shadow world if you want to call it that, I always think of that first part

in the first Matrix movie where Neo's fallen asleep at the computer and we see the iconic text on the terminal going, "Wake up, Neo." And that's really what part of this talk is about: for us to actually wake up, not just as individuals but also as a cybersecurity community, and to look at how we can have an impact on this. Now, as I mentioned, growing up in the 80s I was part of that hacker community. We didn't really think of cyber crime back then. Hackers back then were not really criminals; in fact, I hate to use the word hacker in the context of crime, because

they don't quite resonate in my world. But one of the things that has always stuck with me from that early hacker community is the Hacker Manifesto. If you haven't read it, it's a really good idea to go and read it. There are key parts of the Hacker Manifesto that have driven our community in many ways. One of the concepts is that you are defined by your skills, by your knowledge and what you can do; you're not defined by degrees and courses and things along those lines. Another is the concept that information should be free: we should all have access to information, we should all have access to knowledge, and there should

be no restrictions on us. And the belief that in our community it's not about race or gender or creed or religion or anything like that; it's simply about who we are as human beings and the skills that we have. These are great idealistic viewpoints and values to have. But the reality, I think, is that as a community we've been somewhat idealistic and naive. We grew up in this environment and we were going to change the world. We were going to free information. We were going to make this brave new world accessible to all. We were going to step away from the powers that be that controlled everything and essentially free everybody, in terms of making sure

information was available to all. But I don't think we really thought this through in many respects. We said everybody should have access to information, everybody should have access to this thing called the internet and be able to contribute to the world. But the mistaken assumption we made is that all human beings are innately good, that all human beings innately believe in society and in the best interests of their fellow human beings. We were very naive. So we now find ourselves in a situation where information is free. But is it really free? When you look back historically, information in general has never been

truly free. Going right the way back to the early days, before the printing press, information was literally hand-transcribed onto papyrus or chiseled into stone tablets. Only the very select had access to information, and it was always the people in power, whether the religious groupings, the monarchistic groupings, or the tribal groupings; only a select few had power. With the advent of the printing press, books became more commonly available and more people had access to information. In fact, there were great conflicts that arose from this: the schisms within religion between Protestants and Catholics, for example, largely had to do with this

kind of loss of control over who had the information. When we moved into the 20th century and saw mass media come about, television and radio, we had more information than we ever had, but again, it was still controlled. It was never truly free. So we have to ask ourselves: in this ideal world that we as cybersecurity professionals and hackers tried to create, have we really, truly achieved this concept of information being free? The other thing we need to remember is that information has never been about the truth. Information is just information. It can be true, but in many instances it's not. And if you think about the problems that

we have in modern society, truth has almost become a secondary issue. The reality, when we talk about information, is that information gives people power. If you have information, you have power. Look at us here in this room. We have deep technical knowledge, skills, and expertise, and while we might all be at different levels, the skill and the knowledge and the information that we possess gives us power. That's a reality of life. People with no access to information truly have no power. But because information and power are so closely intertwined, we need to go one step further, because one of the aspects

about people who have power is that they want control. And when people want control, one of the ways to achieve it is with curated information. Now, I know I'm standing here and people are saying, "Geez, this guy sounds like a certifiable paranoid nut job, a conspiracy theorist." And that couldn't be further from the truth. At the end of the day, I'm a scientist looking at scientific data. The reality is that whoever controls the information controls the people who consume and use that information. That's the real world we find ourselves in: a world where we have to seriously question who actually is in control. I love this line from the first

Matrix movie where Agent Smith is interrogating Morpheus and says that one of the aspects of the Matrix is that you have billions of people just living out their lives, oblivious to reality. How true is this of the society we find ourselves in these days, where we literally do have whole societies oblivious to the actual reality they find themselves in? So here's the thing. If you look at the development of social media and the internet, and the ability of everybody to engage online and share their opinions and thoughts and ideas, that influence has given people a certain level of power. They

feel that they have agency and control over what they post and what they share. In fact, I look at things these days and people say you can have a career as an influencer. Twenty years ago, that idea would have been totally bizarre. But there are literally people whose whole lives are built around the idea that "my whole purpose in life is to influence other people to do stuff." It gives those people power. So whether it's the crazy paranoid conspiracy-theory nut job who thinks lizard men run the world, now sharing that online and having other people follow it, or somebody with some other

conspiracy theory, they feel that they gain power by having some kind of influence, because if I have influence over just one more person, I have a level of power over that person; the power relationship actually exists there. So we've got to ask ourselves: who wants power? Well, this is going to sound very selfish, but we all do. We all want power. We wouldn't be sitting here if we didn't want some kind of power: power over our lives, power over the things that we do. Everybody wants power. But the problem is that we are a social species. We organize ourselves into social groups. We here are a social group; we are a tribe of hackers, for

lack of a better term. And when individuals want power, that power can start to be conflated to a group level as well. So we start to see people who want power, groups that want power, governments that want power, corporations that want power. And again, to get that power, I'm going to need some kind of influence. Now, I used to do a lot of financial investigations when I was in the police, and one of the things that was always drummed into me was: if you want to solve crime, you follow the money. Who's behind the money? And one of the reasons that

we have this problem with algorithmic influence these days comes down to who's paying whom. Who's making the money? And in many instances the money is actually held within the big corporations that control the social media landscape. That corporate environment is often manipulated by other power players, but follow the money if you really want to know who has the influence and the power. I love this quote from Morpheus, and I think it truly illustrates the topic at hand. Morpheus says to Neo, after he's first been awakened from the Matrix: the Matrix is the world that has been pulled over your eyes to blind you from

the truth. The truth that you are a slave. Now, I know none of us like to admit or hear that, but how many of us, if we look at our online activities objectively, exhibit slave-type behavior? How we consume media, how we engage with media, is very slave-orientated. The simple reality is that for most people out there, ignorance is totally bliss. We don't want to know what the real world looks like. We want to see the version of the world that we want to believe. And we see this over and over again. People ignore the actual issues that matter because they'd rather simply not know about them. So, let's talk about

the kind of environment we find ourselves in, and how algorithms and AI can actually start influencing our perceptions of reality. Again, I like this quote from Morpheus, where he says the Matrix is basically built to keep us under control. Now, here's the thing, and this is also taken from Agent Smith interrogating Morpheus in the first movie, where he says that as a species, human beings define their reality through misery and suffering. I just want that to settle for a moment. We define our reality through misery and suffering. Which news are you more likely to read in a newspaper: the good news or the bad news? If there's a car accident,

okay, this is Joburg, so the potholes and the traffic lights are really kind of crazy around here, but if there's an accident, I'm willing to bet that 99.9% of us in this room would slow down to have a really good look at what actually happened. Maybe there was a cash-in-transit robbery; how many of us are going to slow down to see if we can see something? That's humanity. For some strange, bizarre reason, whether it's genetics or society, humanity really does define how it looks at the world through a negative lens. So, in preparation for this presentation, I actually did a bit of an experiment. I wanted to see

how the algorithms actually affect a typical South African's perceptions of South Africa. What I did was take two brand-new phones and two brand-new SIM cards with brand-new phone numbers, and set up two sock puppet accounts: brand-new Gmail accounts, plus Facebook, TikTok, and X accounts. I know it's technically X now, not Twitter; I just can't quite get around the X thing, it's weird. I created these two personas: one was going to act as an optimistic South African and the other as a pessimistic South African, because I've found

in South Africa that there are a lot of us who are actually quite optimistic, and some of us who are like, "Oh, you know, this country has gone to the dogs, everything is so bad." So I thought, let's create these two personas and let the social media algorithms feed them content. The optimistic persona would click on content that is optimistic, and the pessimistic persona would click on content that is pessimistic. Now, firstly, all of these social media platforms, Facebook, TikTok, and X, started feeding content to these accounts. And the irony is that because these accounts were essentially based in South Africa, the algorithms started to feed content as an

aggregation of what they see within the South African region. Does anybody want to take a guess at what type of content we started to see? Corruption, crime, all these bad, negative things about the country. So it didn't matter whether you were the optimistic person or the pessimistic person; the algorithm was already skewed towards our entire society's data set. Over a period of a couple of weeks, I just had these personas click on stories and links that were fed to them by the social media platforms: the optimistic persona trying to stick to the optimistic content, and the pessimistic persona clicking on all the pessimistic links, because they had a predefined view

of South Africa. And the reality is that both of these personas ended up in a very dark place. A really, really dark place. These personas ended up presenting a picture of South Africa that's literally on the verge of civil war: crime is out of control, people hate each other, we're all ready to take up arms against each other. A dystopian society. But is that the truth? I don't think so. Look at everybody in this room. We're not representatives of a dystopian society. We're representatives of a community that works together, plays together, backs each other up, and a community that ultimately is

fighting the bad guys. We're a good community. But if people out there who don't understand this start to absorb these perceptions of the world, how do they view things? One of the reasons I said this would be controversial is that you have to look at the current narrative in South Africa: a narrative, unfortunately, in which South Africa is seen to be racially divided. The truth is that is very far from the actual truth. But if you look at the social media narrative that we are being fed, it seems completely different. So let's talk about the technology; let's talk about what's going on in the background. Here's the horrible, sad reality of

all of this: we are nothing more than data points. That's what we are. We are not people, not human beings, not creatures with ambitions and goals and dreams. We're simply data points in massive algorithms, driven by machine learning platforms built to seek engagement. At the end of the day, based on our likes and dislikes of social media content, and there are a lot of algorithms behind this, the algorithm suggests content. Based on our engagement with whatever social media platforms we use, the algorithms will feed us content they think we will like, also based on an

aggregation of the region and area that you are in. And that is really powerful, because we get caught in this loop, almost doom scrolling through material, going through this content and asking: okay, what does this mean? How does this work? What does the world look like today? But this is where it gets, for me, very concerning. The algorithms do not care about the truth. They don't care about facts. They just care about content. One of the things that I've been doing on social media, and I think it pisses people off quite a bit,

is that when somebody makes some rabid claim on X about whatever, my first comment will be: can you show me the evidence for this? Then usually there's some rambling, and it invariably ends with me being blocked by that person because I dared question their view of reality. The simple thing is, because of the algorithm, if I'm some crazy nut job who believes the earth is flat, and there are other crazy nut jobs who believe the earth is flat, the algorithm is going to feed that content to them. It doesn't care whether or not the earth is flat. But this creates a perception where the facts essentially

don't matter. I hate to call out a particular country, but look at what's happening in the United States at the moment. The facts definitely don't matter there anymore. You literally have a president, and okay, I'm probably shooting myself in the foot now; I'm probably going to get banned from the United States when this video hits YouTube. But you literally have a president who says one thing one day and something completely different the next, totally contradicting himself, and people don't care. They don't even notice, because social media is feeding them what they want to hear. So, effectively, the

algorithms care only about pushing out content based on likes, dislikes, and preferences. They don't care whether that content is accurate. All the algorithms are designed to do is maximize engagement. Because here's the thing: the longer I keep you engaged on Facebook, the more money Facebook makes. The longer I keep you engaged on X, the more money X makes. The longer I keep you engaged on TikTok, the more money TikTok makes. At the end of the day, it's all about money. The big companies driving these social media empires don't really care about you. They care about how much money they're making at the end of the day.
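The click-driven reinforcement loop described here can be sketched as a toy model: each click nudges the feed toward more of whatever was clicked. The prior, learning rate, and update rule below are invented for illustration; this is not any platform's real ranking code, and unlike the sock-puppet experiment described earlier, it models only the click feedback, not the skewed regional content pool.

```python
def run_persona(p_click_negative, rounds=200, regional_prior=0.7, lr=0.05):
    """Expected-value sketch of a click-driven feedback loop.

    w is the probability the feed shows a negative story next; clicks on
    negative stories pull w up, clicks on positive stories pull it down.
    All parameter values are illustrative assumptions.
    """
    w = regional_prior  # the feed starts skewed by the regional aggregate
    for _ in range(rounds):
        # expected reinforcement from clicks on the negative stories shown
        toward_negative = w * p_click_negative * (1.0 - w)
        # expected correction from clicks on the positive stories shown
        toward_positive = (1.0 - w) * (1.0 - p_click_negative) * w
        w += lr * (toward_negative - toward_positive)
    return w

# A pessimist (clicks negative 90% of the time) converges to an almost
# entirely negative feed; an optimist converges to an almost entirely
# positive one; a perfectly neutral clicker just stays at the regional prior.
print(round(run_persona(0.9), 3))
print(round(run_persona(0.1), 3))
```

The point of the sketch is the self-perpetuating loop: any consistent bias in clicking, however it starts, is amplified until the feed is saturated with it.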

Now, the reality is that the more data points these algorithms collect, the more accurate they become. They start to predict the things that you're most likely to be interested in. And this goes a lot beyond just what I click and like on a particular social media page. It's the cookies that you're accepting on websites, communicating with servers on the back end. It's a whole ecosystem of data being moved around the world to predict what you're most likely to be interested in. And these machine learning algorithms are getting more and more accurate over the years. Now, the sad

reality, whether we want to admit it or not, is that with these social media platforms, not every region or product actually gets the same algorithm, or the same weightings and biases in terms of how the data is presented. The classic example is TikTok. Do you think TikTok in China gets all the crap that we see in the rest of the world? Not a chance in hell. And you've got to think about this from the power perspective. China is a superpower. China is a civilization state. I want you to let that settle for a bit. Civilization states are something very different from, for

example, the United States, which is multicultural, dynamic, and diverse. China is essentially an ethnostate, a civilization state that's existed for thousands of years. It's now a superpower. They are competing on the world stage. They are a powerhouse. How do you compromise your opponents? You play the long game. What China is doing is dumbing down the West, dumbing down the rest of the world, through social media engagement. And they're doing it really effectively. When you look worldwide at mathematics, engineering, and science scores at school and university level, everywhere in the world they are fundamentally going down, except in China. Because what is fed to the community in China through the algorithm

is material designed to build them up, not distract them with dancing pandas and whatever other rubbish we get on social media. And that's another way of control, another issue of power. ByteDance, the company behind TikTok, is beholden to the Chinese government. And how long will it be, let's just say the situation changes in the United States and it becomes more fascist in a way, before Facebook and X start doing the bidding of the government of the day? We saw it in South Africa, ironically, before 1994, during the years of apartheid: the government controlled the media.

Most of us, especially as white South Africans, would portray the country very differently from its actual reality, because in those days the government controlled the information. But here's the thing: even though we have these algorithms, money and power can manipulate the algorithm. If you think about the Brexit vote in the United Kingdom, which is probably one of the most stupid things the UK could have done, it was driven by a particular political narrative. The money that was spent on manipulating social media engines to feed certain content to the middle ground of UK society is what influenced that whole

thing. The entire Brexit argument was a lie, but it was a lie perpetuated by the algorithms, by social media content, which effectively got the UK out of the European Union, and the UK has been reeling and suffering ever since. But again, it's all about how people with money and power can manipulate the algorithms. Who do you think manipulates our viewpoints in South Africa? Think about it: it's the people who want power, the people who want to retain power. My view of South Africa is that we live in a country where actually everybody gets along pretty well. I live in a country that I think is very optimistic, where everybody gets along

nicely and we engage with each other. Do we have problems? Yes. But the powers that be are driving the social media narrative for us to believe that this country is literally falling apart. That whites and blacks and coloureds and Indians and Asians all hate each other. To be honest, that's the biggest load of crap I've ever heard. But that's the narrative being portrayed by the social media environment. The people who have the power are controlling the algorithms. And the social media companies are quite happy to let this proceed, because at the end of the day they are profiting from it. So the simple fact is that those who can influence the algorithm

actually control the narrative. If I have the ability to control public perceptions by controlling social media engagement, I control the picture of what the world looks like. And that is very concerning, because we are in a country that should be democratic, a country where everybody has the right to vote and the right to influence policies and decisions. But we ourselves are being manipulated, and that is very scary. The problem with relying on content curated by these algorithms is that they create echo chambers. They create tribalism. They create small, separate groups, often antagonistic towards each other, that should never have existed in the first

place. So this content actually seeks to divide us rather than unite us, at a time, not just in South Africa but in society at large, when we need to start engaging with each other for a better world. And here's where the challenge really comes in. We're moving from the machine learning age, if I can put it that way, to the artificial intelligence age. Now, firstly, what we call artificial intelligence today is not yet true artificial intelligence, although I must be honest, I still say please and thank you to ChatGPT, just in case, you know, when they eventually take over the world they don't kill me, something like

that. But the reality is that artificial intelligence has changed the way content is fed to us: it happens at a much faster rate, the feeds we get are more accurate, and the predictors of our behaviour are more accurate. That is a huge concern, because as more and more of the content fed to us is driven by AI, you start to get to a situation where that content is generated by AI and not actually generated by people anymore. Now, what do you really know is true or not? That's really scary. You look at a Facebook feed

these days, and again, I go back to those two accounts that I created. The amount of AI-generated content presented as realistic stories is incredible. So how do we deal with this? The problem is that when we start to outsource our critical thinking, especially to artificial intelligence systems, we really have to ask ourselves: is it still our world anymore? Is it our world or is it theirs? I think that's a huge challenge we have to face as a society and as a community. So, what I want to do, okay, I'm actually running a little ahead of time, but what I want to do is show you, again, this is a

nice quote, also from the first Matrix movie, from Neo at the end of the film, after he's done his whole digital kung fu thing and turned Agent Smith into bits and bytes. He says, "I'm going to show you a world without you in it." So the reality is that if we look at our situation, at how algorithms influence the world and impact society and our perceptions of it, is it hopeless? Do we have to sit here and say, you know what, all hope is lost, woe is us, we should just give up? We should just go and

become the consumers of information and the data points we talk about. I mean, is it hopeless? I'm a very optimistic person. If anybody should be pessimistic, it's me; I literally see the worst of humanity on a daily basis with the cases that I investigate. But I don't believe it's hopeless. I don't believe we have to surrender our control to the algorithms and just become consumers. Okay, that was weird. The ghost in the machine; it's already starting. But the reality is that we don't have to be pessimistic. We can exert a

certain level of control. I think now, more than anything, it's time for us as a society to challenge the status quo, to not just blindly believe the information we get fed. When somebody says something to you, challenge them. Don't be, okay, I'm going to use the word because I'm a guy, it's easy, don't be a doos about it. But at the end of the day you can still be respectful. Challenge people's views. Disagree with each other. Have a conversation. Don't just accept what people tell you. Don't accept what the algorithm feeds you as reality. You know, I've

had engagements with people on X, for example, where they'll make a certain statement and I'll challenge them to provide me the evidence. Those who do provide evidence might give me a Facebook post or another X post or something along those lines. Then I'll challenge them: did you question that post? And if the person is willing to actually have an engagement with me, they'll turn around and say, "Well, actually, I never questioned that post, because that's the narrative I expect." Should we always expect the negative? Should we always expect the worst, when our own eyes show us something different? At some point we've

given away our critical thinking abilities to these algorithms that provide our content without us ever challenging it. And we should challenge it, and we should challenge each other, and we should disagree with each other, because you know what? Disagreements are good. I can have disagreements with my best friends that would make some people think, "Oh my word, you guys are having an argument; you're about to beat each other to death with your fists." But disagreements are how we drive society forward. Discussion and debate are how we make the world a better place. The reality is that if we want to change the world, if we want to change

our world, if we want to change our society, we actually need to start thinking critically. And unfortunately, the social media landscape, this algorithmically driven landscape we find ourselves in at the moment, is disengaging our critical thinking faculties. We need to start asking questions. We need to teach people to ask questions. We need to teach our children to ask questions. We need to verify everything, because at the end of the day you can't trust anything anymore just because somebody says so, or just because, hey, there was a post from some media company that said it. Where is the evidence? Now, I know I emphasise evidence a lot. I'm a

forensic scientist. I do digital forensics. I catch bad guys for a living. I really, really like catching bad guys; it's my little guilty pleasure. But the simple fact of the matter is that in all of these cases I work on, it's all about the evidence. If I say something, I need to be able to prove it and say, this is how things are. We need to start doing the same in life. Another thing we need to do, and I think this is what we need to do as a cyber security community, is better understand the maths behind these algorithms, specifically around AI. How many people use AI these days

without truly understanding the maths behind it, without truly understanding the randomisation element of AI systems? How many people have ever asked why, when I give ChatGPT the same thing to do twice, it gives me two different outputs? If you understand the maths, you understand why the AI does what it does. If you understand the maths, you understand how the machine learning algorithms work. And we can help other people to understand that. Then probably the last thing to talk about is the ability to actually take control. Take control of the narrative. Now, I'm not going to turn around and say, "Hey, you know what, guys? Don't use social media,"

because that would be the most hypocritical thing for me to do; social media is one of the ways we communicate as people these days. We're not going to get past it. But we can take control. We can influence the algorithm. When you see rubbish in your feed, literally take a second and block it or report it, train the algorithm that you're not interested in this rubbish, because it is rubbish. A machine cannot make judgement calls. A machine only works on the data points. And if you're providing data points that push the narrative in a direction more true to reality, that's where you can take control and change the narrative on those systems.
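To make the randomisation point from a moment ago concrete: here is a minimal sketch of temperature-based sampling, the standard mechanism behind why the same prompt can produce different outputs. This is not ChatGPT's actual code; the toy vocabulary, scores, and temperature values are invented for illustration. A language model assigns a score to every possible next token, converts those scores into a probability distribution, and then randomly draws from it.

```python
import math
import random

def softmax(logits, temperature=1.0):
    """Convert raw model scores (logits) into a probability distribution.
    Higher temperature flattens the distribution -> more randomness."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                        # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_token(tokens, logits, temperature, rng):
    """Pick the next token by sampling from the temperature-scaled distribution."""
    probs = softmax(logits, temperature)
    return rng.choices(tokens, weights=probs, k=1)[0]

# Toy vocabulary and the scores a model might assign for the next word.
tokens = ["good", "great", "fine", "terrible"]
logits = [2.0, 1.8, 1.0, -1.0]

# Same "prompt", same model, two different random draws -> outputs can differ.
run1 = [sample_token(tokens, logits, 0.8, random.Random(1)) for _ in range(5)]
run2 = [sample_token(tokens, logits, 0.8, random.Random(2)) for _ in range(5)]
print(run1, run2)

# Near zero temperature the distribution collapses onto the top-scoring token.
greedy = softmax(logits, temperature=0.01)
print(tokens[greedy.index(max(greedy))])  # always "good"
```

Lower the temperature and the distribution sharpens until the model almost always picks the top-scoring token; raise it and the draws spread out. That single dial accounts for most of the randomisation element in practice.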
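The "train the algorithm" idea can be sketched as a toy feedback loop. This is a hypothetical model, not any platform's real ranking code, and the signal names and weights are invented, but the principle holds: every block, report, like, and share is a data point that nudges what the system shows you next.

```python
# Hypothetical engagement signals and their effect on a topic's ranking score.
SIGNAL_WEIGHTS = {
    "like": +1.0,
    "share": +2.0,
    "hide": -3.0,    # "I'm not interested in this"
    "report": -5.0,  # strongest negative signal a user can send
}

def update_score(score, signal, learning_rate=0.1):
    """Nudge a topic's ranking score in the direction of the user's signal."""
    return score + learning_rate * SIGNAL_WEIGHTS[signal]

def rank_feed(topic_scores):
    """Order topics highest-scoring first, the way a feed would."""
    return sorted(topic_scores, key=topic_scores.get, reverse=True)

scores = {"outrage-bait": 1.0, "local-news": 0.8}

# The user keeps reporting the rubbish and sharing the real stuff...
for _ in range(5):
    scores["outrage-bait"] = update_score(scores["outrage-bait"], "report")
    scores["local-news"] = update_score(scores["local-news"], "share")

print(rank_feed(scores))  # the feed reorders around the user's signals
```

Run repeatedly, the negative data points drag the unwanted content below the things the user actually engages with, which is exactly the lever described above.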

So with that, I hope you all enjoy BSides, the second BSides. It's a lot of fun. There are some amazing speakers today, you've got the capture-the-flag ongoing, you've got the scavenger hunt. It's really going to be a fantastic day. I'm going to be here all day; if anyone's got any questions, or anyone wants to have a discussion, just give me a shout. Otherwise, enjoy the rest of the day, everybody, and thanks for listening. [Applause]