
Fabricated Realities: AI Personas and Simulacra and the Threats They Pose

BSides Edmonton · 2024 · 35:51 · Published 2025-10
About this talk
BSides Edmonton, September 23-24, 2024

Talk: Fabricated Realities: AI Personas and Simulacra and the Threats They Pose

Abstract: In the past few years there has been rapid advancement in the field of AI. The concept of manufactured entities that can replicate human behaviours, speech, and emotions has been the stuff of legend, literally: myths of the bronze protector Talos from Greek legend and of Egyptian automatons date back millennia. This presentation goes into how those legends have begun to come closer to reality in recent years, as improvements in AI technology have driven the rapid mainstream growth of AI personas and simulacra. The talk focuses on a high-level overview of the threats implicit in these AI personas and simulacra, with broad suggestions for mitigations that can lessen their impact.

Speaker: Sarah Hunt

2024 Slides: https://drive.google.com/drive/u/0/folders/1ess6fUZNd9BbWK7pPBrh8UVE-7GXtMyG
Transcript [en]

[inaudible]

This is also being helped by technology. Two days before the January presidential primary, a robocall went out using an AI-generated clone of President Biden's voice, urging American voters not to vote. [inaudible]

Security now: digital systems and services [inaudible] organizations are adopting information technologies that can create identities, and those same technologies pose risks to organizational security. [inaudible]

[inaudible] cloned audio is being used for phishing; it sounds convincing and is low cost. For example, this year a finance officer was convinced to transfer millions to cybercriminals. There has just been a massive increase in these scam cases. [inaudible]

[inaudible] synthetic [inaudible] technology to create convincing fake video or audio, which can be used to extract sensitive information. A cloned voice potentially allows a criminal [inaudible]

But the mere possibility of using AI to create simulated versions of [inaudible] a person. Which one was created by AI? And the answer is [inaudible]

[inaudible] whoever is responsible for creating the technology, and whatever their motivation, security and safety are concerns; it is important that governance includes considerations for protection. Obviously [inaudible] your face, your voice, what do you [inaudible]

One of the issues is the lack of attestation: the ability to prove that a source is legitimate. [inaudible]

One proposed partial solution for proving that recordings are authentic is to hash content as it is generated and anchor the hashes in a chain, so that later copies can be checked against the original. [inaudible]
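The hashing idea gestured at in the talk (fingerprinting media when it is created so later copies can be checked against the original) can be sketched in a few lines. This is a minimal illustration, not the speaker's actual proposal; the function names are my own.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Cryptographic fingerprint (SHA-256) of a media file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, published_digest: str) -> bool:
    """Check a copy against the digest published at creation time."""
    return fingerprint(data) == published_digest

# Publish the digest alongside the original recording.
original = b"frame-data-of-original-recording"
digest = fingerprint(original)

# A later copy verifies; a tampered copy does not.
verify(original, digest)
verify(original + b"tampered", digest)
```

A real provenance system would also need to bind the digest to a signer and a timestamp (for instance in an append-only log), since a hash alone proves only that the bytes are unchanged, not who made them.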

[inaudible] access and authorization systems where individuals' identities are verified [inaudible]

Another popular choice is detection: the analysis of patterns, which can then be tested for things such as

[Music]

unusual [inaudible] specific manipulation artifacts [inaudible]. However, these detection technologies are in a constant contest with generation, each changing as the other becomes better, and detection also helps improve defences to the point where these fakes no longer affect us.

And so I set up a video of my [inaudible]

"I disagreed with the message, so I hacked it."

[Music]

[inaudible] about ten hours for the face and [inaudible]

They would know it's not me, because they don't notice my [inaudible]

You made a comment at the beginning that surprised me. Do you remember the demonstration you did for our team a year ago? You were saying then that you needed 15 seconds of a recording of someone's voice; today you said 3 seconds. So a year later we're now at 3 seconds?

[inaudible]

Thank you. There's a question: what do you think about AI-based defences? If we fight fire with fire, the same will be done on the other side as well. [inaudible]

[inaudible] So, Sarah, has anyone thought about limiting the frequencies of someone's voice as a defence against this? For example, if I have a podcast, or I was putting out informational things, could I perhaps filter the audio and remove some of the frequencies in the human voice range, so it might make it harder for that voice to be cloned? I haven't seen [inaudible]
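The questioner's idea (stripping some frequencies out of published audio to make cloning harder) can be illustrated with a toy DFT filter on a synthetic signal. This is a hypothetical sketch, not a vetted defence, and it says nothing about whether modern voice cloning would actually be hindered by it; all names here are my own.

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform (fine for short demo frames)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(spec):
    """Inverse DFT, returning the real part of each sample."""
    n = len(spec)
    return [sum(spec[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def remove_bins(x, bins):
    """Zero out the chosen frequency bins (and their mirrors) of a real signal."""
    spec = dft(x)
    n = len(spec)
    for k in bins:
        spec[k] = 0
        spec[(n - k) % n] = 0  # mirror bin keeps the output real-valued
    return idft(spec)

# Synthetic "voice" frame with components at bins 3 and 9 of a 64-sample frame.
n = 64
sig = [math.sin(2 * math.pi * 3 * t / n) + 0.5 * math.sin(2 * math.pi * 9 * t / n)
       for t in range(n)]
filtered = remove_bins(sig, [9])  # strip the bin-9 component before publishing
```

In practice this is a band-stop filter; the trade-off the questioner raises is real, since removing voice-band frequencies also degrades the audio for legitimate listeners.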

Another thing as well: there was advice from federal agencies saying that during a video conversation, especially when a decision about money is to be made, make sure you actually move your hand with your fingers open in front of your face, because if it is a fake you will see another face between the fingers. That was one way of identifying whether or not it is real.

[Laughter]

[inaudible]